Search results for: mixed methods approach
20120 A POX Controller Module to Collect Web Traffic Statistics in SDN Environment
Authors: Wisam H. Muragaa, Kamaruzzaman Seman, Mohd Fadzli Marhusin
Abstract:
Software Defined Networking (SDN) is a new networking paradigm. It is designed to facilitate the management, measurement, debugging and dynamic control of the network, and to make it suitable for modern applications. Generally, measurement methods can be divided into two categories: active and passive. Active measurement injects test packets into the network in order to monitor their behaviour (the ping tool, for example), while passive measurement monitors existing traffic for the purpose of deriving measurement values. Both active and passive methods are useful for the collection of traffic statistics and the monitoring of network traffic. Although there has been work focusing on measuring traffic statistics in an SDN environment, it was only meant for measuring packet and byte rates for non-web traffic. In this study, a feasible method will be designed to measure the number of packets and bytes over a certain time, and to facilitate obtaining statistics for both web traffic and non-web traffic. Web traffic refers to HTTP requests at the application layer, while non-web traffic refers to ICMP and TCP requests. Thus, this work will be more comprehensive than previous works. With a module developed on the POX OpenFlow controller, information will be collected from each active flow in the OpenFlow switch, and presented on the Command Line Interface (CLI) and the Wireshark interface. The statistics displayed on the CLI and Wireshark interfaces will include the type of protocol, the number of bytes and the number of packets, among others. Besides, this module will show, in the same statistics list, the number of flows added to the switch whenever traffic is generated from and to hosts.
In order to carry out this work effectively, our Python module will send a statistics request message to the switch every five seconds, requesting its current port and flow statistics, and the switch will reply with the required information in a message called a statistics reply message. Thus, the POX controller will be notified and updated with any changes that occur in the entire network within a very short time. The aim of this study is therefore to prepare a list of the important statistics elements collected from the whole network, to be used in further research; particularly research dealing with the detection of network attacks that cause a sudden rise in the number of packets and bytes, such as Distributed Denial of Service (DDoS).
Keywords: mininet, OpenFlow, POX controller, SDN
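To illustrate the polling-and-aggregation logic this abstract describes, the following is a minimal standalone sketch. A real POX module would send OpenFlow stats-request messages every five seconds and parse the switch's stats-reply; here the reply is mocked with illustrative data so the per-protocol aggregation step can run on its own. All names and numbers are hypothetical.

```python
# Standalone sketch of the statistics-aggregation step: a real POX module
# would obtain FlowStats entries from the switch's OpenFlow stats-reply;
# here they are mocked so the aggregation logic can run by itself.
from dataclasses import dataclass

@dataclass
class FlowStats:
    protocol: str       # e.g. "HTTP" (web), "ICMP" or "TCP" (non-web)
    packet_count: int
    byte_count: int

def summarize(flows):
    """Aggregate packet and byte counts per protocol, as shown on the CLI."""
    summary = {}
    for f in flows:
        pkts, bts = summary.get(f.protocol, (0, 0))
        summary[f.protocol] = (pkts + f.packet_count, bts + f.byte_count)
    return {"flows_total": len(flows), "per_protocol": summary}

# Mocked stats-reply from the switch (illustrative values):
reply = [FlowStats("HTTP", 120, 90_000),
         FlowStats("HTTP", 30, 20_000),
         FlowStats("ICMP", 10, 980)]
stats = summarize(reply)
```

In the real module, a recurring timer would call `summarize` on each fresh stats-reply, so the controller's view of flow counts stays at most five seconds stale.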
Procedia PDF Downloads 235
20119 Influence of Urban Microclimates on Human Perceptions and Behavioral Patterns: A Relational Context of Human Parameters in Urban Design
Authors: Naveed Mazhar
Abstract:
Our cities are known to have significant modifying effects on the local climate. The nature of the modifications depends on a range of physical variables, usually assessed at a wide range of spatial scales. Physical spatial dimensions, such as measured parameters of microclimates and their significant influence on human sensations, are known to have far-reaching effects on human thermal comfort and, by corollary, are a force that influences human perception. Less scholarship has thrown light on the subjective dimension, and it insufficiently demonstrates a relational approach between human behavior and how it is affected by urban microclimates. Besides identifying gaps in the most recent scholarship and providing future research opportunities, the scope of this study will help improve urban design guidelines and raise the framework standards of socially responsive urban design. This study will help equip future professionals to ameliorate the effects of urban microclimates on participants' perceptions, enabling more frequent use of outdoor urban spaces. It is noted, however, that the physical parameters of an outdoor open space determine psychological human adaptations and measure the degree to which people are willing to adapt to their surroundings. A large amount of research is available on urban microclimates; however, very few studies focus on elucidating the critical factors influencing human perceptions of microclimates in urban spatial configurations. Based on the most recent scholarship, this study has evaluated the role urban microclimatic conditions play in the formation of human perceptions and, by extension, the behavioral patterns forming in outdoor open spaces. Furthermore, this study also defines, against the backdrop of the current scholarly literature, the socio-spatial interdependence of behavioral patterns with the built urban fabric and its resultant correlation with human perception.
A comprehensive review and analysis of the recent research conducted within the scope of the study will help frame gaps, issues, current research methods and future research opportunities.
Keywords: urban design, urban microclimate, human perception, human behavioral patterns
Procedia PDF Downloads 304
20118 Disaster Management Approach for Planning an Early Response to Earthquakes in Urban Areas
Authors: Luis Reynaldo Mota-Santiago, Angélica Lozano
Abstract:
Determining appropriate measures to face earthquakes is a challenge for practitioners. In the literature, some analyses consider disaster scenarios while disregarding some important field characteristics. Sometimes, software that allows estimating the number of victims and infrastructure damages is used. Other times, historical information from previous events is used, or the scenarios' information is assumed to be available even if this is not usual in practice. Humanitarian operations start immediately after an earthquake strikes, and the first hours of relief efforts are important; local efforts are critical to assess the situation and deliver relief supplies to the victims. A preparation action is prepositioning stockpiles, most of them at central warehouses placed away from damage-prone areas, which requires large facilities and budgets. Usually, the decisions in the first 12 hours (the standard relief time, SRT) after the disaster are the location of temporary depots and the design of distribution paths. The motivation for this research was the delay in the reaction time of the early relief efforts, which generated the late arrival of aid to some areas after the Mexico City 7.1 magnitude earthquake in 2017. Hence, a preparation approach for planning the immediate response to earthquake disasters is proposed. It is intended for local governments, considering their capabilities for planning and for responding during the SRT, in order to reduce the start-up time of immediate response operations in urban areas. The first steps are the generation and analysis of disaster scenarios, which allow estimating the relief demand before and in the early hours after an earthquake.
The scenarios can be based on historical data and/or the seismic hazard analysis of an Atlas of Natural Hazards and Risk, as a way to address the limited or null available information. The following steps include the decision processes for: a) locating local depots (places for prepositioning stockpiles) and aid-giving facilities as close as possible to the risk areas; and b) designing the vehicle paths for aid distribution (from local depots to the aid-giving facilities), which can be used at the beginning of the response actions. This approach allows speeding up the delivery of aid in the early moments of the emergency, which could reduce the suffering of the victims, allowing additional time to integrate a broader and more streamlined response (according to new information) from national and international organizations into these efforts. The proposed approach is applied to two case studies in Mexico City. These areas were affected by the 2017 earthquake and had a limited aid response. The approach generates disaster scenarios in an easy way and plans a faster early response with a small quantity of stockpiles, which can be managed in the early hours of the emergency by local governments. Considering long-term storage, the estimated quantities of stockpiles require a limited maintenance budget and a small storage space. These stockpiles are also useful for addressing other kinds of emergencies in the area.
Keywords: disaster logistics, early response, generation of disaster scenarios, preparation phase
Procedia PDF Downloads 110
20117 The Comparison Study of Human Microbiome in Chronic Rhinosinusitis between Adults and Children
Authors: Il Ho Park, Joong Seob Lee, Sung Hun Kang, Jae-Min Shin, Il Seok Park, Seok Min Hong, Seok Jin Hong
Abstract:
Introduction: The human microbiota is the aggregate of microorganisms, and the bacterial microbiome of the human digestive tract contributes to both health and disease. In health, bacteria are key components in the development of mucosal barrier function and in innate and adaptive immune responses, and they also work to suppress the establishment of pathogens. In the human upper airway, the sinonasal microbiota might play an important role in chronic rhinosinusitis (CRS). The purpose of this study is to investigate the human upper airway microbiome in CRS patients and to compare the sinonasal microbiome of adults with that of children. Materials and methods: A total of 19 samples from 19 patients (Group 1: 9 children with CRS, aged 5 to 14 years, versus Group 2: 10 adults with CRS, aged 21 to 59 years) were examined. Swabs were collected from the middle meatus and/or anterior ethmoid region under general anesthesia during endoscopic sinus surgery or tonsillectomy. After DNA extraction from the swab samples, we analysed the bacterial microbiome consortia using a 16S rRNA gene sequencing approach (the Illumina MiSeq platform). Results: In this study, six bacterial phyla and numerous genera and species were found in substantial relative abundance in the individual sinus swab samples, including Corynebacterium, Haemophilus, Moraxella, and Streptococcus species. Anaerobes like Fusobacterium and Bacteroides were abundantly present in the children's group, while Bacteroides and Propionibacterium were present in the adults' group. At the genus level, Haemophilus was the most common CRS microbiome member in children and Corynebacterium the most common in adults. Conclusions: Our results show the diversity of the human upper airway microbiome, and the findings suggest that CRS is a polymicrobial infection. Corynebacterium and Haemophilus may live as commensals on the mucosal surfaces of the sinuses in the upper respiratory tract.
Further study will be needed to analyse microbiome-human interactions in the upper airway and CRS.
Keywords: microbiome, upper airway, chronic rhinosinusitis, adults and children
Procedia PDF Downloads 126
20116 Effect of Dissolved Oxygen Concentration on Iron Dissolution by Liquid Sodium
Authors: Sami Meddeb, M. L Giorgi, J. L. Courouau
Abstract:
This work presents the progress of studies aiming to guarantee the lifetime of 316L(N) steel in a sodium-cooled fast reactor by determining the elementary corrosion mechanism, which is akin to a dissolution accelerated by dissolved oxygen. The mechanism involving iron, the main element of the steel, is studied in particular detail, from the viewpoint of the data available in the literature and the modeling of the various hypothesized mechanisms. Experiments performed in the CORRONa facility at controlled temperature and dissolved oxygen content are used to test both the literature data and the hypotheses. Current tests, performed at various temperatures and oxygen contents, focus on specifying the chemical reaction at play and determining its free enthalpy as well as its kinetic rate constants. A specific test configuration allows measuring the reaction kinetics and the chemical equilibrium state in the same test. In the current state of progress of these tests, the dissolution of iron accelerated by dissolved oxygen appears directly related to a chemical complexation reaction forming a mixed iron-sodium oxide (Na-Fe-O), a compound that is soluble in the liquid sodium solution. The results obtained demonstrate the presence in solution of this corrosion product, whose kinetics is the limiting step under the conditions of the test. This compound, the object of hypotheses dating back more than 50 years, is predominant in solution compared to atomic iron, presumably even at low oxygen concentrations, and cannot be neglected in the long-term corrosion modeling of any heat transfer system.
Keywords: corrosion, sodium fast reactors, iron, oxygen
Procedia PDF Downloads 179
20115 Optimization of Hate Speech and Abusive Language Detection on Indonesian-language Twitter using Genetic Algorithms
Authors: Rikson Gultom
Abstract:
Hate speech and abusive language on social media are difficult to detect; usually, they are detected only after they become viral in cyberspace, which is of course too late for prevention. An early detection system with fairly good accuracy is needed so that it can reduce the conflicts in society caused by social media posts that attack individuals, groups, and governments in Indonesia. The purpose of this study is to find an early detection model for Twitter social media, using the machine learning method with the highest accuracy among several methods studied. In this study, the Support Vector Machine (SVM), Naïve Bayes (NB), and Random Forest Decision Tree (RFDT) methods were compared with the Support Vector Machine with genetic algorithm (SVM-GA), Naïve Bayes with genetic algorithm (NB-GA), and Random Forest Decision Tree with genetic algorithm (RFDT-GA). The study produced a comparison table for the accuracy of the hate speech and abusive language detection models, presented it as a graph of the accuracy of the six algorithms developed on the Indonesian-language Twitter dataset, and concluded with the best model, i.e., the one with the highest accuracy.
Keywords: abusive language, hate speech, machine learning, optimization, social media
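The genetic-algorithm optimization this abstract pairs with each classifier can be sketched generically. The following minimal sketch evolves a binary chromosome by truncation selection, one-point crossover and bit-flip mutation; the fitness function here is a simple stand-in, whereas in the study it would be the cross-validated accuracy of the SVM, NB or RFDT model on the Twitter dataset. All parameter values are illustrative.

```python
# Minimal genetic-algorithm sketch. Fitness is a stand-in objective;
# in the study it would be a classifier's cross-validated accuracy.
import random

random.seed(42)

TARGET = [1, 0, 1, 1, 0, 1, 0, 1]   # hypothetical optimum for the stand-in

def fitness(chrom):
    # Stand-in objective: count genes matching the (unknown-to-GA) optimum.
    return sum(1 for g, t in zip(chrom, TARGET) if g == t)

def evolve(pop_size=20, length=8, generations=40, mutation_rate=0.05):
    pop = [[random.randint(0, 1) for _ in range(length)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]           # truncation selection (elitist)
        children = []
        while len(children) < pop_size - len(parents):
            p1, p2 = random.sample(parents, 2)
            cut = random.randrange(1, length)    # one-point crossover
            child = p1[:cut] + p2[cut:]
            child = [1 - g if random.random() < mutation_rate else g
                     for g in child]             # bit-flip mutation
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
```

The same loop optimizes a classifier once `fitness` is replaced by cross-validated accuracy, with the chromosome encoding hyperparameters or a feature-selection mask.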
Procedia PDF Downloads 128
20114 VIAN-DH: Computational Multimodal Conversation Analysis Software and Infrastructure
Authors: Teodora Vukovic, Christoph Hottiger, Noah Bubenhofer
Abstract:
The development of VIAN-DH aims at bridging two linguistic approaches: conversation analysis/interactional linguistics (IL), so far a dominantly qualitative field, and computational/corpus linguistics with its quantitative and automated methods. Contemporary IL investigates the systematic organization of conversations and interactions composed of speech, gaze, gestures, and body positioning, among others. This highly integrated multimodal behaviour is analysed based on video data, aiming to uncover so-called "multimodal gestalts": patterns of linguistic and embodied conduct that reoccur in specific sequential positions and are employed for specific purposes. Multimodal analyses (and other disciplines using video) have so far depended on time- and resource-intensive processes of manually transcribing each component from the video materials. Automating these tasks requires advanced programming skills, which are often not within the scope of IL. Moreover, the use of different tools makes the integration and analysis of different formats challenging. Consequently, IL research often deals with relatively small samples of annotated data, which are suitable for qualitative analysis but not sufficient for making generalized empirical claims derived quantitatively. VIAN-DH aims to create a workspace where the many annotation layers required for the multimodal analysis of videos can be created, processed, and correlated in one platform. VIAN-DH will provide a graphical interface that operates state-of-the-art tools for automating parts of the data processing. The integration of tools that already exist in computational linguistics and computer vision facilitates data processing for researchers lacking programming skills, speeds up the overall research process, and enables the processing of large amounts of data.
The main features to be introduced are automatic speech recognition for the transcription of language, automatic image recognition for the extraction of gestures and other visual cues, and grammatical annotation for adding morphological and syntactic information to the verbal content. In the ongoing instance of VIAN-DH, we focus on gesture extraction (pointing gestures, in particular), making use of existing models created for sign language and adapting them for this specific purpose. In order to view and search the data, VIAN-DH will provide a unified format and enable the import of the main existing formats of annotated video data and the export to other formats used in the field, while integrating different data source formats in a way that allows them to be combined in research. VIAN-DH will adapt querying methods from corpus linguistics to enable parallel search across many annotation levels, combining token-level and chronological search for various types of data. VIAN-DH strives to bring crucial and potentially revolutionary innovation to the field of IL (which can also extend to other fields using video materials). It will allow the automatic processing of large amounts of data and the implementation of quantitative analyses, combining these with the qualitative approach. It will facilitate the investigation of correlations between linguistic patterns (lexical or grammatical) and conversational aspects (turn-taking or gestures). Users will be able to automatically transcribe and annotate visual, spoken and grammatical information from videos, to correlate those different levels, and to perform queries and analyses.
Keywords: multimodal analysis, corpus linguistics, computational linguistics, image recognition, speech recognition
Procedia PDF Downloads 108
20113 The Functional-Engineered Product-Service System Model: An Extensive Review towards a Unified Approach
Authors: Nicolas Haber
Abstract:
The study addresses the design process of integrated product-service offerings as a means of answering environmental sustainability concerns by replacing stand-alone physical artefacts with comprehensive solutions relying on functional results rather than conventional product sales. However, views regarding this transformation are dissimilar and differentiated: the study discusses the importance and requirements of product-service systems before analysing the theoretical studies accomplished on their design and development processes. Based on this, a framework built on a design science approach is proposed, where the distinct approaches from the literature are merged into a unified structure serving as a generic methodology for designing product-service systems. Each stage of this model is then developed to present a holistic design proposal called the Functional Engineered Product-Service System (FEPSS) model. Product-service systems are portrayed as customisable solutions tailored to specific settings and defined circumstances, and the approaches adopted to guide the design process are diversified. A thorough analysis of the design strategies and development processes, however, allowed the extraction of a design backbone valid for varied situations and contexts, whether they are product-oriented, use-oriented or result-oriented. The goal is to guide manufacturers towards an eased adoption of these integrated offerings, given their inherent environmental benefits, by proposing a robust all-purpose design process.
Keywords: functional product, integrated product-service offerings, product-service systems, sustainable design
Procedia PDF Downloads 294
20112 Determination of Physical Properties of Crude Oil Distillates by Near-Infrared Spectroscopy and Multivariate Calibration
Authors: Ayten Ekin Meşe, Selahattin Şentürk, Melike Duvanoğlu
Abstract:
Petroleum refineries are a highly complex process industry with continuous production and high operating costs. The physical separation of crude oil starts with the crude oil distillation unit, continues with various conversion and purification units, and passes through many stages until the final product is obtained. To meet the desired product specifications, process parameters are strictly followed. To ensure the quality of the distillates, routine analyses are performed in quality control laboratories based on appropriate international standards such as American Society for Testing and Materials (ASTM) standard methods and European Standard (EN) methods. The cut point of the distillates in the crude distillation unit is crucial for the efficiency of the downstream processes. In order to maximize process efficiency, the determination of distillate quality should be as fast, reliable, and cost-effective as possible. In this sense, an alternative study was carried out on the crude oil distillation unit that serves the entire refinery process. In this work, studies were conducted with three different crude oil distillates: Light Straight Run Naphtha (LSRN), Heavy Straight Run Naphtha (HSRN), and Kerosene. These products are named after their separation by the number of carbons they contain: LSRN consists of hydrocarbons containing five to six carbons, HSRN of six to ten, and kerosene of sixteen to twenty-two. The physical properties of the three crude distillation unit products (LSRN, HSRN, and Kerosene) were determined using near-infrared spectroscopy with multivariate calibration. The absorbance spectra of the petroleum samples were obtained in the range from 10000 cm⁻¹ to 4000 cm⁻¹, employing a quartz transmittance flow-through cell with a 2 mm light path and a resolution of 2 cm⁻¹. A total of 400 samples were collected for each petroleum product over almost four years.
Several different crude oil grades were processed during the sample collection period. Extended Multiplicative Signal Correction (EMSC) and Savitzky-Golay (SG) preprocessing techniques were applied to the FT-NIR spectra of the samples to eliminate baseline shifts and suppress unwanted variation. Two different multivariate calibration approaches (Partial Least Squares Regression, PLS, and Genetic Inverse Least Squares, GILS) and an ensemble model were applied to the preprocessed FT-NIR spectra. The predictive performance of each multivariate calibration and preprocessing technique was compared, and the best models were chosen according to the reproducibility of the ASTM reference methods. This work demonstrates that the developed models can be used for routine analysis instead of conventional analytical methods, with over 90% accuracy.
Keywords: crude distillation unit, multivariate calibration, near infrared spectroscopy, data preprocessing, refinery
Procedia PDF Downloads 131
20111 Evaluation of Oxidative Changes in Soybean Oil During Shelf-Life by Physico-Chemical Methods and Headspace-Liquid Phase Microextraction (HS-LPME) Technique
Authors: Maryam Enteshari, Kooshan Nayebzadeh, Abdorreza Mohammadi
Abstract:
In this study, the oxidative stability of soybean oil under different storage temperatures (4 and 25˚C) during a 6-month shelf-life was investigated by various analytical methods and by headspace-liquid phase microextraction (HS-LPME) coupled to gas chromatography-mass spectrometry (GC-MS). Oxidation changes were monitored by analytical parameters consisting of acid value (AV), peroxide value (PV), p-anisidine value (p-AV), thiobarbituric acid value (TBA), fatty acid profile, iodine value (IV), and oxidative stability index (OSI). In addition, the concentrations of hexanal and heptanal as secondary volatile oxidation compounds were determined by the HS-LPME/GC-MS technique. The rate of oxidation in the soybean oil stored at 25˚C was considerably higher. The AV, p-AV, and TBA increased gradually during the 6 months, while the amount of unsaturated fatty acids, the IV, and the OSI decreased. The other parameters, including the concentrations of both hexanal and heptanal and the PV, exhibited an increasing trend during the first months of storage; then, at the end of the third and fourth months, a sudden decrease was observed simultaneously in the concentrations of hexanal and heptanal and in the PV. The latter parameters increased again until the end of the shelf-life. As a result, temperature and time were effective factors in the oxidative stability of soybean oil. Strong correlations were also found for soybean oil stored at 4˚C between AV and TBA (r²=0.96), PV and p-AV (r²=0.9), IV and TBA (negative, r²=0.9), and between p-AV and TBA (r²=0.99).
Keywords: headspace-liquid phase microextraction, oxidation, shelf-life, soybean oil
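The correlation step reported above (Pearson r, quoted as r²) can be reproduced with a few lines. The sketch below computes Pearson r between two monthly oxidation-parameter series; the numbers are illustrative placeholders, not the study's measurements.

```python
# Pearson correlation between two oxidation parameters over the storage
# period. The AV and TBA series below are illustrative, not measured data.
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

av  = [0.10, 0.14, 0.19, 0.25, 0.33, 0.40]   # monthly acid values (illustrative)
tba = [0.05, 0.08, 0.11, 0.16, 0.21, 0.26]   # monthly TBA values (illustrative)
r = pearson_r(av, tba)                       # r**2 is the quantity quoted above
```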
Procedia PDF Downloads 404
20110 Study of Large-Scale Atmospheric Convection over the Tropical Indian Ocean and Its Association with Oceanic Variables
Authors: Supriya Manikrao Ovhal
Abstract:
In India, the summer monsoon rainfall occurs owing to large-scale convection with reference to the continental ITCZ. It has been found that convection over the tropical ocean increases with SST from 26 to 28 degrees C, and that when SST is above 29 degrees C it decreases sharply for the warm pool areas of the Indian Ocean and for the monsoon areas of the West Pacific Ocean. The reduction in convection can be influenced by large-scale subsidence forced by nearby or remotely generated deep convection; it has thus been observed that under the influence of strong large-scale rising motion, convection does not decrease but increases monotonically with SST, even when SST is higher than 29.5 degrees C. Convection is related to the SST gradient, which helps to generate low-level moisture convergence and upward vertical motion in the atmosphere. Strong wind fields, such as the cross-equatorial low-level jet stream on the equatorward side of the warm pool, are produced by convection initiated by the SST gradient. Areas with maximum SST have a low SST gradient, which results in feeble convection. Hence, the oceanic role (other than SST) could be prominent in influencing large-scale atmospheric convection. Since a warm oceanic surface contributes to the penetration of heat radiation into the subsurface of the ocean, and as no studies have addressed the role of the ocean subsurface in large-scale atmospheric convection, the present study concentrates on the contribution of the ocean subsurface to large-scale atmospheric convection by considering the SST gradient, mixed layer depth (MLD), thermocline, and barrier layer. The present study examines the probable role of subsurface ocean parameters in influencing convection.
Procedia PDF Downloads 93
20109 Coarse-Grained Computational Fluid Dynamics-Discrete Element Method Modelling of the Multiphase Flow in Hydrocyclones
Authors: Li Ji, Kaiwei Chu, Shibo Kuang, Aibing Yu
Abstract:
Hydrocyclones are widely used to classify particles by size in industries such as mineral processing and chemical processing. The particles to be handled usually have a broad range of size distributions and sometimes density distributions, which has to be properly considered and causes challenges in the modelling of hydrocyclones. The combined approach of Computational Fluid Dynamics (CFD) and the Discrete Element Method (DEM) offers a convenient way to model particle size/density distributions. However, its direct application to hydrocyclones is computationally prohibitive because billions of particles are involved. In this work, a CFD-DEM model based on the concept of the coarse-grained (CG) model is developed to model the solid-fluid flow in a hydrocyclone. The DEM is used to model the motion of discrete particles by applying Newton's laws of motion. Here, a particle assembly containing a certain number of particles with the same properties is treated as one CG particle. The CFD is used to model the liquid flow by numerically solving the local-averaged Navier-Stokes equations, facilitated by the Volume of Fluid (VOF) model to capture the air core. The results are analyzed in terms of fluid and solid flow structures, and particle-fluid, particle-particle and particle-wall interaction forces. Furthermore, the calculated separation performance is compared with measurements. The results obtained from the present study indicate that this approach can offer an alternative way to examine the flow and performance of hydrocyclones.
Keywords: computational fluid dynamics, discrete element method, hydrocyclone, multiphase flow
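The coarse-graining idea described above has a simple geometric core: one CG particle stands in for k real particles of identical properties while conserving total solid mass and volume, so its diameter scales as k to the power 1/3. A minimal sketch, with illustrative particle values:

```python
# One CG particle represents k real particles of the same density,
# conserving total mass and volume, so d_cg = d_real * k**(1/3).
import math

def coarse_grain(d_real, rho, k):
    """Return (diameter, mass) of a CG particle representing k real particles."""
    d_cg = d_real * k ** (1.0 / 3.0)
    m_cg = rho * math.pi / 6.0 * d_cg ** 3   # equals k * (mass of one real particle)
    return d_cg, m_cg

# Illustrative values: 100-micron particles of density 2650 kg/m^3, k = 1000.
d_real, rho, k = 1e-4, 2650.0, 1000
d_cg, m_cg = coarse_grain(d_real, rho, k)
m_real = rho * math.pi / 6.0 * d_real ** 3
```

With k = 1000 the DEM tracks a thousand times fewer particles at the cost of resolving contacts at the coarser diameter, which is what makes the billions-of-particles problem tractable.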
Procedia PDF Downloads 408
20108 Lagrangian Approach for Modeling Marine Litter Transport
Authors: Sarra Zaied, Arthur Bonpain, Pierre Yves Fravallo
Abstract:
The permanent supply of marine litter implies its accumulation in the oceans, which causes the presence of more compact waste layers. Its spatio-temporal distribution is never homogeneous and depends mainly on the hydrodynamic characteristics of the environment and on the size and location of the wastes. As part of optimizing the collection of marine plastic wastes, it is important to measure and monitor their evolution over time. To this end, many research studies have been dedicated to describing the behavior of the wastes in order to identify their accumulation in ocean areas. Several models have therefore been developed to understand the mechanisms that cause the accumulation and displacement of marine litter. These models are able to accurately simulate the drift of the wastes in order to study their behavior and stranding. However, these works aim to study the behavior of the wastes over a long period of time, not at the time of waste collection. This work investigates the transport of floating marine litter (FML) to provide basic information that can help in optimizing waste collection, by proposing a model for predicting its behavior during collection. The proposed study is based on a Lagrangian modeling approach that uses the main factors influencing the dynamics of the waste. The performance of the proposed method was assessed on real data collected from the Copernicus Marine Environment Monitoring Service (CMEMS). Evaluation results in the Java Sea (Indonesia) show that the proposed model can effectively predict the position and velocity of marine wastes during collection.
Keywords: floating marine litter, Lagrangian transport, particle-tracking model, waste drift
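The core of a Lagrangian particle-tracking model like the one described above is the advection of each waste particle by the local surface current. The minimal sketch below uses a forward-Euler step and an analytic velocity field; the real model would interpolate CMEMS current data instead, and the field, positions and step size here are illustrative assumptions.

```python
# Minimal Lagrangian particle-tracking sketch: each floating-litter
# particle is advected by a surface-current field with forward-Euler
# steps. The analytic field below stands in for gridded CMEMS currents.

def velocity(x, y):
    """Illustrative steady surface current (m/s): uniform drift plus shear."""
    u = 0.2 + 0.05 * y
    v = -0.1
    return u, v

def advect(particles, dt, steps):
    """Advance (x, y) particle positions through `steps` Euler steps of size dt."""
    for _ in range(steps):
        particles = [(x + velocity(x, y)[0] * dt,
                      y + velocity(x, y)[1] * dt) for x, y in particles]
    return particles

# Track three litter particles for one hour with a 60 s time step.
positions = advect([(0.0, 0.0), (10.0, 5.0), (-3.0, 2.0)], dt=60.0, steps=60)
```

A production model would add windage and diffusion terms and a higher-order integrator (e.g. Runge-Kutta) on top of this basic advection loop.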
Procedia PDF Downloads 191
20107 Using Scanning Electron Microscope and Computed Tomography for Concrete Diagnostics of Airfield Pavements
Authors: M. Linek
Abstract:
This article presents a comparison of selected evaluation methods regarding the microstructure modification of hardened cement concrete intended for airfield pavements. Basic test results are presented for two pavement-quality concrete lots. The analysis included standard concrete used for airfield pavements and modern material solutions based on concrete composite modification. For the basic grain size distribution of the concrete, cement CEM I 42,5 HSR NA, fine aggregate and coarse aggregate fractions in the form of granite chippings, water and admixtures were considered. For the grain size distribution of the modified concrete, the use of a modern modifier as a substitute for the fine aggregate was suggested. The influence of the modification on the parameters of the internal concrete structure was determined using a scanning electron microscope. The obtained images were compared to the results obtained using computed tomography. The opportunity to use this type of equipment for the diagnostics of the internal concrete structure, and an attempt at evaluating its parameters, are presented. The obtained test results led to the conclusion that both methods can be applied to the diagnostics of pavement-quality concrete, particularly for airfield pavements.
Keywords: scanning electron microscope, computed tomography, cement concrete, airfield pavements
Procedia PDF Downloads 339
20106 Digital Design and Practice of The Problem Based Learning in College of Medicine, Qassim University, Saudi Arabia
Authors: Ahmed Elzainy, Abir El Sadik, Waleed Al Abdulmonem, Ahmad Alamro, Homaidan Al-Homaidan
Abstract:
Problem-based learning (PBL) is an educational modality which stimulates critical and creative thinking. PBL has been practiced in the College of Medicine, Qassim University, Saudi Arabia, since 2002 with offline, face-to-face activities. Therefore, crucial technological changes toward paperless work were needed. The aim of the present study was to design and implement the digitalization of the PBL activities and to evaluate its impact on students' and tutors' performance. This approach promoted the involvement of all stakeholders after raising their awareness of the techniques of using online tools. IT support, learning resource facilities, and the required multimedia were prepared. Students' and staff perception surveys reflected their satisfaction with these remarkable changes. The students were interested in the new digitalized materials and educational design, which facilitated the conduction of PBL sessions and provided sufficient time for discussion and peer sharing of knowledge. It enabled tutors to supervise and track students' activities on the Learning Management System. It could be concluded that introducing digitalization of the PBL activities promoted the students' performance and engagement and enabled a better evaluation of PBL materials and prompt feedback from students as well as staff. These positive findings encouraged the college to implement the digitalization approach in other educational activities, such as Team-Based Learning, as an additional opportunity for further development.Keywords: multimedia in PBL, online PBL, problem-based learning, PBL digitalization
Procedia PDF Downloads 120
20105 Feature Based Unsupervised Intrusion Detection
Authors: Deeman Yousif Mahmood, Mohammed Abdullah Hussein
Abstract:
The goal of a network-based intrusion detection system is to classify network traffic activities into two major categories: normal and attack (intrusive) activities. Nowadays, data mining and machine learning play an important role in many sciences, including intrusion detection systems (IDS), using both supervised and unsupervised techniques. One of the essential steps of data mining is feature selection, which helps improve the efficiency, performance, and prediction rate of the proposed approach. This paper applies the unsupervised K-means clustering algorithm with information gain (IG) for feature selection and reduction to build a network intrusion detection system. For our experimental analysis, we have used the new NSL-KDD dataset, which is a modified version of the KDDCup 1999 intrusion detection benchmark dataset. With a split of 60.0% for the training set and the remainder for the testing set, a two-class classification (Normal, Attack) was implemented. The Weka framework, a Java-based open-source collection of machine learning algorithms for data mining tasks, was used in the testing process. The experimental results show that the proposed approach is very accurate, with a low false positive rate and a high true positive rate, and it takes less learning time in comparison with using the full features of the dataset with the same algorithm.Keywords: information gain (IG), intrusion detection system (IDS), k-means clustering, Weka
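Information gain scores a feature by how much it reduces the entropy of the class labels, which is the principle behind the feature ranking described above. A minimal sketch using only the standard library (the paper itself relies on Weka's implementation; the function names here are illustrative):

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a label sequence, in bits."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(feature_values, labels):
    """IG(F) = H(labels) - sum over values v of p(v) * H(labels | F = v)."""
    n = len(labels)
    conditional = 0.0
    for v in set(feature_values):
        subset = [y for f, y in zip(feature_values, labels) if f == v]
        conditional += (len(subset) / n) * entropy(subset)
    return entropy(labels) - conditional
```

A feature that perfectly separates Normal from Attack records scores 1 bit on a balanced sample, an uninformative one scores 0, and features are ranked accordingly before clustering.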
Procedia PDF Downloads 296
20104 Lactobacillus sp. Isolates Slaughterhouse Waste as Probiotics for Broilers
Authors: Nourmalita Safitri Ningsih, Ridwan, Iqri Puspa Yunanda
Abstract:
The aim of this study was to utilize waste from slaughterhouses as a probiotic ingredient for chicken feed. Livestock waste is produced by livestock activities and includes feces, urine, feed remains, as well as water from livestock and cage cleaning. The process starts with the isolation of bacteria. Rumen fluid was taken at the Giwangan Slaughterhouse, Yogyakarta. Isolation of Lactobacillus ruminis was done using de Mann Rogosa Sharpe (MRS) medium. Samples showing rod-shaped bacteria were streaked onto agar plates, incubated at 37ºC for 48 hours, and then observed. Observation of these lactic acid bacteria showed a clear zone around the colonies. The bacterial colonies were white, round, small, and shiny on the agar plate. Microencapsulation was carried out by the freeze-drying method, using skim milk as the encapsulating material. The encapsulated bacteria were then mixed with feed for livestock, improving feed quality and thus providing a beneficial effect on the animals. Scanning electron microscope testing showed that the bacteria were coated in skim milk; this coating protects the bacteria and makes them more durable in use. Observation of the bacteria showed a sheath on Lactobacillus sp. Preserving the bacteria in this way makes them more durable in use, as the skim milk protects them and renders them resistant to the outside environment. The results of adding probiotics to chicken feed showed significant weight gain in chickens: ANOVA (P < 0.005) showed that the average weight of chickens given probiotics increased.Keywords: chicken, probiotics, waste, Lactobacillus sp, bacteria
Procedia PDF Downloads 319
20103 Multi-Criteria Decision-Making in Ranking Drinking Water Supply Options (Case Study: Tehran City)
Authors: Mohsen Akhlaghi, Tahereh Ebrahimi
Abstract:
Considering the increasing demand for water and limited resources, there is a possibility of a water crisis in the not-so-distant future. Therefore, to prevent this crisis, other options for drinking water supply should be examined. In this regard, the application of multi-criteria decision-making methods to various aspects of water resource management and planning has always been of great interest to researchers. In this report, six options for supplying drinking water to Tehran City were considered. Experts' opinions were then collected through matrices and questionnaires and analyzed using the TOPSIS method, one of the multi-criteria decision-making methods. In the TOPSIS method, the options were ranked by calculating their proximity to the ideal (Ci); the closer the numerical value of Ci is to one, the more desirable the option. On this basis, the option of optimizing the pattern of water consumption, with Ci = 0.9787, is the best of the proposed options for supplying drinking water to Tehran City. The other options, in order of priority, are rainwater harvesting, wastewater reuse, increasing current water supply sources, desalination and transfer, and transferring water from freshwater sources between basins. In conclusion, the findings of this study highlight the importance of exploring alternative drinking water supply options and utilizing multi-criteria decision-making approaches to address the potential water crisis.Keywords: multi-criteria decision, sustainable development, TOPSIS, water supply
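The Ci closeness coefficient that the ranking rests on can be sketched as follows; the decision matrix, weights, and benefit/cost flags in the sketch are placeholders, since the abstract does not publish its expert-judgment data:

```python
import math

def topsis(matrix, weights, benefit):
    """Rank options by TOPSIS closeness to the ideal solution.

    matrix:  one row per option, one column per criterion (raw scores)
    weights: criterion weights summing to 1
    benefit: True where higher is better, False for cost criteria
    Returns Ci in [0, 1] per option; closer to 1 is more desirable.
    """
    n_crit = len(matrix[0])
    # 1. Vector-normalize each column, then apply the weights.
    norms = [math.sqrt(sum(row[j] ** 2 for row in matrix)) for j in range(n_crit)]
    v = [[weights[j] * row[j] / norms[j] for j in range(n_crit)] for row in matrix]
    # 2. Ideal and anti-ideal value per criterion.
    cols = list(zip(*v))
    ideal = [max(c) if benefit[j] else min(c) for j, c in enumerate(cols)]
    anti = [min(c) if benefit[j] else max(c) for j, c in enumerate(cols)]
    # 3. Closeness Ci: distance to the anti-ideal over total distance.
    return [math.dist(row, anti) / (math.dist(row, ideal) + math.dist(row, anti))
            for row in v]
```

An option that dominates on every weighted criterion gets Ci = 1; one dominated on every criterion gets Ci = 0, matching the interpretation of Ci given above.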
Procedia PDF Downloads 70
20102 Adaptive Training Methods Designed to Improve a Shorter Resident Curriculum in Obstetrics and Gynecology
Authors: Philippe Judlin, Olivier Morel
Abstract:
Background: In France, the resident curriculum (RC) in Obstetrics and Gynecology (OBGYN) takes five years. Over the last 15 years, this RC has undergone major changes, characterized mainly by successive reductions of work hours. The program used to comprise long and frequent shifts, a huge workload, poor supervision, and erratic theoretical teaching. A decade ago, the French Ministry of Health recommended limiting shift duration to 24 hours, with a minimum of 11 hours off duty between shifts. Last year, in order to comply with European Union directives, new recommendations further limited residents' work hours to 48 hours per week. Methods: Assessment of the residency program adjustments recently made to accommodate the recommendations while improving training quality through new methods. Results: The challenge facing program directors was to provide an all-encompassing curriculum to OBGYN residents despite fewer work hours. The program has been dramatically redesigned, and several measures have been put in place: -The resident rotation system has been redesigned. Residents used to make 6-month rotations between 10-12 Departments of OBGYN or Surgery. Fewer Departments, those providing the best teaching, have been kept in the new RC. -Extensive in-house supervision has been implemented for all kinds of clinical activities. Effective supervision of residents has proved to be a powerful tool to improve the quality of training. -The tutorship system, with academic staff members individually overseeing residents during their curriculum, has been refined, allowing better follow-up of residents' progress during the 5-year program. -An extensive program of lectures encompassing all matters in Obstetrics and Gynecology has been set up. These mandatory lectures are available online on a dedicated website, so face-to-face lectures have been limited in order to fit within the 48-hour limit.
-The use of simulation has been significantly increased in obstetrics, materno-fetal medicine, and surgery (with particular emphasis on laparoscopic training). -Residents' feedback has been taken into account in the setup of the new RC. Conclusion: This extensive overhaul of the Obstetrics and Gynecology RC has only been in place since last year. Nevertheless, the new program seems to adequately take into account the new recommendations while providing better and more consistent teaching to the OBGYN residents.Keywords: education, laparoscopy, residency, simulation
Procedia PDF Downloads 186
20101 The Adaptation and Evaluation of a Psychoeducational Program for Patients with Depression in General Practices in Germany
Authors: Feyza Gökce, Jochen Gensichen, Antonius Schneider, Karolina de Valerio, Gabriele Pitschel-Walz
Abstract:
People with depressive symptoms often first consult a General Practitioner (GP) before making use of other treatment options. The present study describes the adaptation and evaluation of a psychoeducational program for patients with depressive symptoms treated by GPs in Bavaria, Germany. The adaptation of an existing psychoeducational program, used in inpatient psychiatric settings, was performed in exchange with experts (psychotherapists, general practitioners, and a patient representative). As a result, a program consisting of 4 psychoeducational sessions was developed, which is carried out in individual settings in GP practices by the practitioners themselves. This program will be compared to the treatment as usual that patients with depression receive from GPs. Data is collected at 3 measurement points (baseline, 3-month follow-up, 6-month follow-up) using different questionnaires (BDI-II, D-Lit-R German, FERUS, PAM13-D, PHQ-9, GAD-7, PHQ-15, PC-PTSD-5). In addition to the change in depressive symptoms, changes in depression knowledge, self-efficacy, and patient activation will be analyzed, and the feasibility of the program and the subjective benefit for GPs and patients will be assessed. By now, 84 patients have been recruited by 20 cluster-randomized GP practices, with 73.5% of the participants being female and 26.5% male. The average age was M = 50.1 (SD = 14.67) years. The change in depression symptoms over the 3-month period will be compared between the two study conditions using a linear mixed model at the end of data collection (December 2023). The subjective benefit for the patients and GPs will be assessed via feedback questionnaires. Results will be available by the beginning of 2024 and will provide indications for further development and barriers to the implementation of such a program in GP practices.Keywords: depression, general practice, psychoeducation, primary care
Procedia PDF Downloads 84
20100 Performative Acts Exhibited in Selected Ghanaian Newspaper Headlines
Authors: Charlotte Tetebea Asiamah
Abstract:
This paper sought to highlight the use of performative acts as exhibited in the headlines of a Ghanaian newspaper, the Daily Graphic. The study discusses and analyzes thirty headlines featuring performative acts, captured in June and July 2024. The paper draws on J. L. Austin and J. R. Searle's theory of speech acts. Although a lot has been done in the area of performative acts, there is still a gap as far as newspaper headlines are concerned. Establishing where performative acts stand in the domain of newspaper headlines will contribute to the discussion in the literature, thereby extending the scope of discourse on performative acts. Some of the questions for this study, among others, are: Do performative acts exhibited in newspaper headlines follow felicity conditions? Are the utterances explicitly stated or otherwise? A qualitative approach was used in gathering and analyzing data, chosen in order to gain in-depth insight into the study. The headlines were selected using the instrument of document analysis. From the numerous headlines, the researcher captured over 60, after which thirty (30) were carefully selected for the study. The 30 newspaper headlines were purposively selected based on the element of performativity in them and their relevance to the study. The findings show that performative acts are exhibited in the Ghanaian Daily Graphic newspaper headlines. They are expressed in all five categories of performative acts that J. R. Searle discussed in his writing, and were seen across all the categories of newspaper headlines, be it governance or politics, social, international news, or sports. It was also observed in the data that directives were the most used performative act.
The performative acts found in the newspaper headlines helped to grab readers' attention and also served as a way of influencing how readers perceive an utterance made by an individual in the headlines.Keywords: explicit, headline, illocutionary, newspaper, performative
Procedia PDF Downloads 19
20099 An ANOVA-based Sequential Forward Channel Selection Framework for Brain-Computer Interface Application based on EEG Signals Driven by Motor Imagery
Authors: Forouzan Salehi Fergeni
Abstract:
A brain-computer interface (BCI) system converts a person's movement intentions into commands for action using brain signals such as electroencephalogram (EEG) signals. When left- or right-hand motions are imagined, different patterns of brain activity appear, which can be employed as BCI signals for control. To improve BCI systems, effective and accurate techniques for increasing the classification precision of motor imagery (MI) based on EEG are greatly needed. Subject dependency and non-stationarity are two features of EEG signals, so EEG signals must be effectively processed before being used in BCI applications. In the present study, after applying an 8-30 Hz band-pass filter, a common average reference (CAR) spatial filter is applied for the purpose of denoising. An analysis of variance (ANOVA) method is then used to select the more appropriate and informative channels from a large number of candidates. After ordering channels by their efficiency, sequential forward channel selection is employed to choose just a few reliable ones. Features from the time and wavelet domains are extracted and shortlisted with the help of a statistical technique, namely the t-test. Finally, the selected features are classified with different machine learning and neural network classifiers, namely k-nearest neighbor, probabilistic neural network, support vector machine (SVM), extreme learning machine, decision tree, multi-layer perceptron, and linear discriminant analysis, in order to compare their performance in this application. Utilizing a ten-fold cross-validation approach, tests were performed on a motor imagery dataset from BCI Competition III. The outcomes demonstrated that the SVM classifier achieved the greatest classification precision of 97% compared to the other available approaches.
The overall findings confirm that the suggested framework is reliable and computationally efficient for the construction of BCI systems and surpasses the existing methods.Keywords: brain-computer interface, channel selection, motor imagery, support-vector-machine
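The sequential forward channel selection step described above is a greedy loop: start from an empty set and repeatedly add the channel that most improves a scoring function. A generic sketch follows; the `score` callback, which would wrap the paper's ANOVA-based channel-efficiency measure or a cross-validated classifier, is an assumption:

```python
def sequential_forward_selection(channels, score, k):
    """Greedy SFS: grow the selected set one channel at a time, always adding
    the channel whose inclusion maximizes score(subset), until k are chosen."""
    selected, remaining = [], list(channels)
    while remaining and len(selected) < k:
        best = max(remaining, key=lambda ch: score(selected + [ch]))
        selected.append(best)
        remaining.remove(best)
    return selected
```

With a toy score that simply sums channel indices, the loop picks the largest indices first; in practice each candidate subset would be scored by classification accuracy on held-out trials.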
Procedia PDF Downloads 51
20098 A Systematic Review on Orphan Drugs Pricing, and Prices Challenges
Authors: Seyran Naghdi
Abstract:
Background: Orphan drug development is limited by very high costs attributed to research and development and the small market size. How health policymakers address this challenge, considering both the supply and demand sides, needs to be explored in order to direct policies and plans in the right way. Price is an important signal both for pharmaceutical companies' profitability and for patients' accessibility. Objective: This study aims to find out the orphan drug price-setting patterns and approaches in health systems through a systematic review of the available evidence. Methods: The Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) approach was used. MedLine, Embase, and Web of Science were searched via appropriate search strategies. The Medical Subject Headings (MeSH) terms were 'cost and cost analysis' for pricing, and 'orphan drug production' and 'orphan drug' for orphan drugs. The critical appraisal was performed with the Joanna Briggs tool. A Cochrane data extraction form was used to obtain data about the studies' characteristics, results, and conclusions. Results: In total, 1,197 records were found: 640 hits from Embase, 327 from Web of Science, and 230 from MedLine. After removing duplicates, 1,056 studies remained; 924 of these were removed in the primary screening phase, and 26 studies were included for data extraction. The majority of the studies (>75%) are from developed countries; among them, approximately 80% are from European countries. Approximately 85% of the evidence has been produced in the recent decade. Conclusions: There is huge variation in price-setting among countries, related to the specific pharmaceutical market structure and the thresholds at which governments want to intervene in the pricing process.
On the other hand, there is some evidence that there is room to reduce the very high costs of orphan drug development through early agreements between pharmaceutical firms and governments. Further studies need to focus on how governments could incentivize companies to agree to provide the drugs at lower prices.Keywords: orphan drugs, orphan drug production, pricing, costs, cost analysis
Procedia PDF Downloads 163
20097 Augmented Reality: New Relations with the Architectural Heritage Education
Authors: Carla Maria Furuno Rimkus
Abstract:
The technologies related to virtual reality and augmented reality, in combination with mobile technologies, are becoming more consolidated and widely used each day. The increasing technological availability, along with the decrease in acquisition and maintenance costs, has favored the expansion of their use in the field of historic heritage. In this context, this article focuses on the potential of mobile applications for the dissemination of architectural heritage using Augmented Reality (AR) technology. From this perspective, it discusses the process of producing an application for mobile devices on the Android platform, which combines geometric modeling technologies with AR and access to interactive multimedia content carrying cultural, social, and historic information about the historic building taken as the object of study: a block with a set of buildings built in the XVIII century, known as "Quarteirão dos Trapiches", which was modeled in 3D, coated with the original texture of its facades, and displayed in AR. The paper discusses the methodological aspects of the development of this application with regard to the process and the project development tools, and presents considerations on developing an application for the Android system focused on the dissemination of architectural heritage, in order to encourage the tourist potential of the city in a sustainable way and to contribute to the digital documentation of the city's heritage, meeting a demand of tourists visiting the city and of the professionals who work in its preservation and restoration: architects, historians, archaeologists, museum specialists, among others.Keywords: augmented reality, architectural heritage, geometric modeling, mobile applications
Procedia PDF Downloads 478
20096 Functionalization of the Surface of Porous Titanium Nickel Alloy
Authors: Gulsharat A. Baigonakova, Ekaterina S. Marchenko, Venera R. Luchsheva
Abstract:
Titanium-nickel alloys are the preferred materials for bone grafting. They have a porous, permeable structure similar to that of bone tissue, can withstand long-term physiological stress in the body, and retain a scaffolding function for bone tissue ingrowth. Despite the excellent functional properties of these alloys, there is a possibility of post-operative infectious complications that prevent newly formed bone tissue from filling the created spaces in a timely manner and prolong the rehabilitation period of patients. To minimise such consequences, it is necessary to use biocompatible materials capable of simultaneously fulfilling the function of a long-term functioning implant and of an osteoreplacement carrier saturated with drugs. Methods to modify the surface by saturation with bioactive substances, in particular macrocyclic compounds, for the controlled release of drugs, biologically active substances, and cells are becoming increasingly important. This work is dedicated to the functionalisation of the surface of porous titanium nickelide by the deposition of macrocyclic compounds, in order to provide titanium nickelide with antibacterial activity and accelerated osteogenesis. The paper evaluates the effect of macrocyclic compound deposition methods on the continuity, structure, and cytocompatibility of the porous titanium nickelide surface. Macrocyclic compounds were deposited on the porous surface of titanium nickelide under various physical effects. Structural research methods allowed the evaluation of the surface morphology of titanium nickelide and of the distribution of these compounds. The method of surface functionalisation influences the size of the deposited bioactive molecules and the nature of their distribution.
The surface functionalisation method developed has enabled the macrocyclic compounds to be deposited uniformly on the inner and outer surfaces of the pores, which will subsequently enable the material to be uniformly saturated with various drugs, including antibiotics and inhibitors. The surface-modified porous titanium nickelide showed high biocompatibility and low cytotoxicity in in vitro studies. The research was carried out with financial support from the Russian Science Foundation under Grant No. 22-72-10037.Keywords: biocompatibility, NiTi, surface, porous structure
Procedia PDF Downloads 84
20095 A Comparative Study: Comparison of Two Different Fluorescent Stains -Auramine and Rhodamine- with Ehrlich-Ziehl-Neelsen, Kinyoun Staining, and Culture in the Determination of Acid Resistant Bacilli
Authors: Recep Keşli, Hayriye Tokay, Cengiz Demir, İsmail Ceyhan
Abstract:
Objective: In many countries, tuberculosis (TB) is still one of the most important diseases and is among the top 10 causes of death worldwide. The early diagnosis of active tuberculosis still depends on the presence of acid resistant bacilli (ARB) in stained smears. In this study, we aimed to investigate the diagnostic performances of Ehrlich-Ziehl-Neelsen (EZN), Kinyoun, and two different fluorescent stains. Methods: The specimens were obtained from patients who presented to the Chest Diseases Departments of Ankara Atatürk Chest Diseases and Thoracic Surgery Training and Research Hospital and of Afyon Kocatepe University, ANS Research and Practice Hospital. The study was carried out in the Medical Microbiology Laboratory, School of Medicine, Afyon Kocatepe University. All non-sterile specimens were homogenized and decontaminated according to the EUCAST instructions. Samples were inoculated onto Löwenstein-Jensen agar (bioMérieux, Marcy l'Étoile, France) and then incubated at 37˚C for 40 days. Four smears were prepared from each specimen. Slides were stained with commercial EZN (BD, Sparks, USA), Kinyoun (SALUBRIS, Istanbul, Turkey), Auramine (SALUBRIS, Istanbul, Turkey), and Rhodamine (SALUBRIS, Istanbul, Turkey) kits. While the EZN- and Kinyoun-stained slides were examined by light microscopy, the Auramine and Rhodamine slides were examined by fluorescence microscopy. Results: A total of 158 respiratory system samples (sputum, bronchoalveolar lavage fluid, etc.) were enrolled in the study. One hundred and two of the processed samples were found to be culture positive. The sensitivity, specificity, positive predictive value, and negative predictive value were 100%, 67.5%, 73.5%, and 100% for EZN; 100%, 70.9%, 77.4%, and 100% for Kinyoun; 100%, 77.8%, 84.3%, and 100% for Auramine; and 100%, 80%, 86.3%, and 100% for Rhodamine, respectively.
Conclusions: In our study, the auramine and rhodamine staining methods showed the best diagnostic performance among the four investigated staining methods. In conclusion, the fluorochrome staining method may be accepted as the most reliable, rapid, and useful method for the diagnosis of mycobacterial infections.Keywords: acid resistant bacilli (ARB), auramine, Ehrlich-Ziehl-Neelsen (EZN), Kinyoun, Rhodamine
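The reported sensitivity, specificity, and predictive values all follow from the 2x2 table of stain result against culture, the reference standard. A small helper makes the definitions explicit; the function name and the counts used in the example are illustrative, not the study's raw data:

```python
def diagnostic_metrics(tp, fp, tn, fn):
    """Standard screening-test metrics from true/false positives and negatives,
    with culture taken as the reference standard."""
    return {
        "sensitivity": tp / (tp + fn),          # culture-positives detected
        "specificity": tn / (tn + fp),          # culture-negatives detected
        "ppv": tp / (tp + fp),                  # positive predictive value
        "npv": tn / (tn + fn),                  # negative predictive value
        "accuracy": (tp + tn) / (tp + fp + tn + fn),
    }
```

A sensitivity of 100% together with an NPV of 100%, as seen for all four stains here, means that every culture-positive sample was smear positive (fn = 0); the stains differ only in how many false positives they produce.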
Procedia PDF Downloads 277
20094 New Environmental Culture in Algeria: Eco Design
Authors: S. Tireche, A. Tairi abdelaziz
Abstract:
Environmental damage has increased steadily in recent decades: depletion of natural resources, destruction of the ozone layer, the greenhouse effect, degradation of the quality of life, land use, etc. New terms have emerged, such as "prevention rather than cure" and "polluter pays"; while these fall within the principles of common sense, their practical implementation still remains fragmented. Among the avenues to be explored, one of the most promising is certainly the one that focuses on product design. Indeed, where better than during the design phase can the sources of future environmental impacts be reduced? Which choices, in particular design choices, most influence the environmental characteristics of products? The approach currently most recognized at the international level is life cycle assessment (LCA), which is subject to international standardization (ISO 14040-14043). LCA provides a scientific and objective assessment of the potential impacts of a product or service, considering its entire life cycle. This approach makes it possible to minimize impacts at the source, in the spirit of pollution prevention, and is widely preferable to the curative approach that currently dominates industrial practice, driven mostly by pollution reporting. The "product" approach aims to reduce the environmental impacts of a given product, taking into account all or part of its life cycle. Currently, emerging tools, known as eco-design tools, are intended to establish an environmental profile of the product in order to improve its environmental performance. They require a sufficient quantity of information on the product for each phase of its life cycle: raw material extraction, manufacturing, distribution, usage, end of life (recycling, incineration, or landfill), and all stages of transport. The assessment results indicate the sensitive points of the product studied, points on which the developer must act.Keywords: eco design, impact, life cycle analysis (LCA), sustainability
Procedia PDF Downloads 427
20093 Constructing a Semi-Supervised Model for Network Intrusion Detection
Authors: Tigabu Dagne Akal
Abstract:
While advances in computer and communications technology have made the network ubiquitous, they have also rendered networked systems vulnerable to malicious attacks devised from a distance. These attacks or intrusions start with attackers infiltrating a network through a vulnerable host and then launching further attacks on the local network or Intranet. Nowadays, system administrators and network professionals can attempt to prevent such attacks by developing intrusion detection tools and systems using data mining technology. In this study, the experiments were conducted following the Knowledge Discovery in Databases process model, which starts with the selection of datasets. The dataset used in this study was taken from the Massachusetts Institute of Technology Lincoln Laboratory. The data was then pre-processed; the major pre-processing activities carried out for this study included filling in missing values, removing outliers, resolving inconsistencies, integrating data containing both labelled and unlabelled records, dimensionality reduction, size reduction, and data transformation activities such as discretization. A total of 21,533 intrusion records are used for training the models. For validating the performance of the selected model, a separate set of 3,397 records is used for testing. For building a predictive model for intrusion detection, the J48 decision tree and Naïve Bayes algorithms were tested as classification approaches, both with and without feature selection. The model created with the J48 decision tree algorithm with default parameter values, evaluated using 10-fold cross-validation, showed the best classification accuracy. The model has a prediction accuracy of 96.11% on the training datasets and 93.2% on the test dataset for classifying new instances into the normal, DOS, U2R, R2L, and probe classes.
The findings of this study show that data mining methods generate interesting rules that are crucial for intrusion detection and prevention in the networking industry. Future research directions are proposed for developing an applicable system in the area of study.Keywords: intrusion detection, data mining, computer science
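The 10-fold cross-validation used for model selection splits the training records into ten folds and averages an evaluation score over ten train/test rotations. A generic sketch follows; the `evaluate` callback, which would stand in for training and testing the J48 model, is an assumption:

```python
def k_fold_indices(n, k=10):
    """Split indices 0..n-1 into k near-equal contiguous folds."""
    base, extra = divmod(n, k)
    folds, start = [], 0
    for i in range(k):
        size = base + (1 if i < extra else 0)
        folds.append(list(range(start, start + size)))
        start += size
    return folds

def cross_validate(n, k, evaluate):
    """Average evaluate(train_idx, test_idx) over the k held-out folds."""
    folds = k_fold_indices(n, k)
    scores = []
    for i, test in enumerate(folds):
        # Train on every fold except the i-th, which is held out for testing.
        train = [idx for j, fold in enumerate(folds) if j != i for idx in fold]
        scores.append(evaluate(train, test))
    return sum(scores) / k
```

Each record is held out exactly once, so the averaged score estimates how the classifier will generalize to unseen traffic before the final test on the separate 3,397-record set.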
Procedia PDF Downloads 296
20092 Role of von Willebrand Factor Antigen as Non-Invasive Biomarker for the Prediction of Portal Hypertensive Gastropathy in Patients with Liver Cirrhosis
Authors: Mohamed El Horri, Amine Mouden, Reda Messaoudi, Mohamed Chekkal, Driss Benlaldj, Malika Baghdadi, Lahcene Benmahdi, Fatima Seghier
Abstract:
Background/aim: Recently, the Von Willebrand factor antigen (vWF-Ag)has been identified as a new marker of portal hypertension (PH) and its complications. Few studies talked about its role in the prediction of esophageal varices. VWF-Ag is considered a non-invasive approach, In order to avoid the endoscopic burden, cost, drawbacks, unpleasant and repeated examinations to the patients. In our study, we aimed to evaluate the ability of this marker in the prediction of another complication of portal hypertension, which is portal hypertensive gastropathy (PHG), the one that is diagnosed also by endoscopic tools. Patients and methods: It is about a prospective study, which include 124 cirrhotic patients with no history of bleeding who underwent screening endoscopy for PH-related complications like esophageal varices (EVs) and PHG. Routine biological tests were performed as well as the VWF-Ag testing by both ELFA and Immunoturbidimetric techniques. The diagnostic performance of our marker was assessed using sensitivity, specificity, positive predictive value, negative predictive value, accuracy, and receiver operating characteristic curves. Results: 124 patients were enrolled in this study, with a mean age of 58 years [CI: 55 – 60 years] and a sex ratio of 1.17. Viral etiologies were found in 50% of patients. Screening endoscopy revealed the presence of PHG in 20.2% of cases, while for EVsthey were found in 83.1% of cases. VWF-Ag levels, were significantly increased in patients with PHG compared to those who have not: 441% [CI: 375 – 506], versus 279% [CI: 253 – 304], respectively (p <0.0001). Using the area under the receiver operating characteristic curve (AUC), vWF-Ag was a good predictor for the presence of PHG. With a value higher than 320% and an AUC of 0.824, VWF-Ag had an 84% sensitivity, 74% specificity, 44.7% positive predictive value, 94.8% negative predictive value, and 75.8% diagnostic accuracy. 
Conclusion: vWF-Ag is a good non-invasive, low-cost marker for excluding the presence of PHG in patients with liver cirrhosis. Using this marker as part of a selective screening strategy might reduce the need for endoscopic screening and the cost of managing these patients.Keywords: von Willebrand factor, portal hypertensive gastropathy, prediction, liver cirrhosis
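The predictive values reported in the abstract follow from its sensitivity, specificity, and PHG prevalence (20.2%) by Bayes' rule. A minimal Python sketch (illustrative only, not part of the study) reproduces the arithmetic; small discrepancies versus the reported 44.7%/94.8% come from rounding of the published inputs:

```python
def predictive_values(sens, spec, prev):
    """Compute PPV and NPV from sensitivity, specificity, and prevalence."""
    ppv = sens * prev / (sens * prev + (1 - spec) * (1 - prev))
    npv = spec * (1 - prev) / (spec * (1 - prev) + (1 - sens) * prev)
    return ppv, npv

# Values as reported in the abstract
ppv, npv = predictive_values(sens=0.84, spec=0.74, prev=0.202)
print(f"PPV = {ppv:.1%}, NPV = {npv:.1%}")  # close to the reported 44.7% and 94.8%
```

The high NPV is what supports the authors' conclusion: a vWF-Ag level below the 320% cut-off makes PHG unlikely, which is the property needed for a rule-out screening marker.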
Procedia PDF Downloads 205
20091 Restructuring the College Classroom: Scaffolding Student Learning and Engagement in Higher Education
Authors: Claire Griffin
Abstract:
Recent years have witnessed a surge in the use of innovative teaching approaches to support student engagement and higher-order learning within higher education. This paper explores the use of collaborative, interactive teaching and learning strategies to support student engagement in a final-year undergraduate Developmental Psychology module. In particular, the jigsaw method, in-class presentations, and online discussion fora were adopted in a ‘lectorial’ style teaching approach aimed at scaffolding learning, fostering social interdependence, and supporting various levels of student engagement in higher education. Using the ‘Student Course Engagement Questionnaire’, the impact of these teaching strategies on students’ college classroom experience was measured, with additional qualitative student feedback gathered. Results illustrate the positive impact of the teaching methodologies on students’ levels of engagement, with gains emerging across the four engagement factors: skills engagement, emotional engagement, participation/interaction engagement, and performance engagement. Thematic analysis of students’ qualitative comments also provided greater insight into the positive impact of the ‘lectorial’ teaching approach on students’ classroom experience within higher-level education. Implications of the findings are presented in terms of informing effective teaching practices within higher education. Additional avenues for future research and strategy usage are also discussed, in light of evolving practice and cutting-edge literature within the field.Keywords: learning, higher education, scaffolding, student engagement
Procedia PDF Downloads 378