Search results for: deep neural models
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 9152

6422 Adaptor Protein APPL2 Could Be a Therapeutic Target for Improving Hippocampal Neurogenesis and Attenuating Depressive Behaviors and Olfactory Dysfunctions in Chronic Corticosterone-Induced Depression

Authors: Jiangang Shen

Abstract:

Olfactory dysfunction is a common symptom accompanied by anxiety- and depressive-like behaviors in depressive patients. Chronic stress triggers hormone responses and inhibits the proliferation and differentiation of neural stem cells (NSCs) in the hippocampus and the subventricular zone (SVZ)-olfactory bulb (OB), contributing to depressive behaviors and olfactory dysfunction. However, the cellular signaling molecules that regulate chronic stress-mediated olfactory dysfunction remain largely unclear. Adaptor proteins containing the pleckstrin homology domain, phosphotyrosine binding domain, and leucine zipper motif (APPLs) are multifunctional adaptor proteins. Herein, we tested the hypothesis that APPL2 could inhibit hippocampal neurogenesis by affecting glucocorticoid receptor (GR) signaling, subsequently contributing to depressive and anxiety behaviors as well as olfactory dysfunctions. The major findings are as follows: (1) APPL2 Tg mice had enhanced GR phosphorylation under basal conditions but showed no difference in plasma corticosterone (CORT) levels or GR phosphorylation under stress stimulation. (2) APPL2 Tg mice had impaired hippocampal neurogenesis and displayed depressive and anxiety behaviors. (3) The GR antagonist RU486 reversed the impaired hippocampal neurogenesis in the APPL2 Tg mice. (4) APPL2 Tg mice displayed higher GR activity and less neurogenic capacity in the olfactory system, with lower olfactory sensitivity than WT mice. (5) APPL2 negatively regulates olfactory functions by switching the fate commitments of NSCs in adult olfactory bulbs via interaction with Notch1 signaling. Furthermore, baicalin, a natural medicinal compound, was found to be a promising agent targeting APPL2/GR signaling and promoting adult neurogenesis in APPL2 Tg mice and chronic corticosterone-induced depression mouse models. Behavioral tests revealed that baicalin had antidepressant and olfactory-improving effects. Taken together, APPL2 is a critical therapeutic target for antidepressant treatment.

Keywords: APPL2, hippocampal neurogenesis, depressive behaviors and olfactory dysfunction, stress

Procedia PDF Downloads 64
6421 Statistical Comparison of Ensemble Based Storm Surge Forecasting Models

Authors: Amin Salighehdar, Ziwen Ye, Mingzhe Liu, Ionut Florescu, Alan F. Blumberg

Abstract:

Storm surge is an abnormal rise in water level caused by a storm. Accurate prediction of storm surge is a challenging problem. Researchers have developed various ensemble modeling techniques that combine several individual forecasts to produce an overall, presumably better forecast. Some simple ensemble modeling techniques exist in the literature; for instance, Model Output Statistics (MOS) and running mean-bias removal are widely used in the storm surge prediction domain. However, these methods have drawbacks; for instance, MOS is based on multiple linear regression and needs a long period of training data. To overcome the shortcomings of these simple methods, researchers have proposed more advanced ones. For instance, ENSURF (Ensemble SURge Forecast) is a multi-model application for sea level forecasting that creates a better forecast of sea level using several instances of Bayesian Model Averaging (BMA). An ensemble dressing method is based on identifying the best member forecast and using it for prediction. Our contribution in this paper can be summarized as follows. First, we investigate whether ensemble models perform better than any single forecast; to do so, we need to identify the single best forecast, and we present a methodology based on a simple Bayesian selection method to select it. Second, we present several new and simple ways to construct ensemble models, using correlation and standard deviation as weights when combining different forecast models. Third, we use these ensembles and compare them with several existing models from the literature to forecast storm surge levels. We then investigate whether developing a complex ensemble model is indeed needed; to achieve this goal, we use the simple average (one of the simplest and most widely used ensemble models) as a benchmark. Predicting the peak surge level during a storm, as well as the precise time at which this peak occurs, is crucial; we therefore develop a statistical platform to compare the performance of the various ensemble methods. This statistical analysis is based on the root mean square error of the ensemble forecast during the testing period and on the magnitude and timing of the forecasted peak surge compared to the actual peak and its timing. In this work, we analyze four hurricanes: hurricanes Irene and Lee in 2011, hurricane Sandy in 2012, and hurricane Joaquin in 2015. Since hurricane Irene developed at the end of August 2011 and hurricane Lee started just after Irene at the beginning of September 2011, we treat them in this study as a single contiguous hurricane event. The data set used for this study was generated by the New York Harbor Observing and Prediction System (NYHOPS). We find that even the simplest possible way of creating an ensemble produces results superior to any single forecast. We also show that the ensemble models we propose generally perform better than the simple average ensemble technique.
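
As a minimal sketch of the weighting idea described above (not the authors' exact scheme), member forecasts can be combined with weights proportional to each member's correlation with observations over a training window and then benchmarked against the simple average; all data here are synthetic placeholders.

```python
import numpy as np

def corr_weights(train_forecasts, train_obs):
    # one weight per member, proportional to its (non-negative)
    # correlation with the observations over the training period
    corr = np.array([np.corrcoef(f, train_obs)[0, 1] for f in train_forecasts])
    w = np.clip(corr, 0.0, None)
    return w / w.sum()

def rmse(pred, obs):
    return np.sqrt(np.mean((pred - obs) ** 2))

# toy data: three members observing the same surge signal with different noise
rng = np.random.default_rng(0)
truth_tr, truth_te = rng.normal(0, 1, 200), rng.normal(0, 1, 100)
noise = np.array([[0.2], [0.5], [1.0]])
members_tr = truth_tr + rng.normal(0, 1, (3, 200)) * noise
members_te = truth_te + rng.normal(0, 1, (3, 100)) * noise

w = corr_weights(members_tr, truth_tr)
print("weighted-ensemble RMSE:", rmse(w @ members_te, truth_te))
print("simple-average RMSE:  ", rmse(members_te.mean(axis=0), truth_te))
```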

Keywords: Bayesian learning, ensemble model, statistical analysis, storm surge prediction

Procedia PDF Downloads 299
6420 Anti-Inflammatory, Analgesic and Antipyretic Activity of Terminalia arjuna Roxb. Extract in Animal Models

Authors: Linda Chularojmontri, Seewaboon Sireeratawong, Suvara Wattanapitayakul

Abstract:

Terminalia arjuna Roxb. (family Combretaceae) is commonly known as ‘Sa maw thet’ in Thai. The fruit is used in traditional medicine as a natural mild laxative, carminative, and expectorant. Aim of the study: This research aims to study the anti-inflammatory, analgesic and antipyretic activities of Terminalia arjuna extract in animal models in comparison to reference drugs. Materials and Methods: The anti-inflammatory study was conducted with two experimental animal models, namely ethyl phenylpropionate (EPP)-induced ear edema and carrageenan-induced paw edema. The study of analgesic activity used two methods of pain induction: acetic acid-induced and heat-induced pain. In addition, the antipyretic activity study was performed by inducing hyperthermia with yeast. Results: The results showed that oral administration of Terminalia arjuna extract possessed an acute anti-inflammatory effect in carrageenan-induced paw edema. Terminalia arjuna extract showed analgesic activity in the acetic acid-induced writhing response and heat-induced pain, indicating a peripheral effect through inhibition of the biosynthesis and/or release of some pain mediators, together with some mechanism acting through the central nervous system. Moreover, Terminalia arjuna extract at doses of 1000 and 1500 mg/kg body weight showed antipyretic activity, which might be due to the inhibition of prostaglandins. Conclusion: The findings of this study indicate that Terminalia arjuna extract possesses anti-inflammatory, analgesic and antipyretic activities in animals.

Keywords: analgesic activity, anti-inflammatory activity, antipyretic activity, Terminalia arjuna extract

Procedia PDF Downloads 254
6419 A CORDIC Based Design Technique for Efficient Computation of DCT

Authors: Deboraj Muchahary, Amlan Deep Borah, Abir J. Mondal, Alak Majumder

Abstract:

A discrete cosine transform (DCT) is described, and a technique to compute it using the fast Fourier transform (FFT) is developed. In this work, the DCT of a finite-length sequence is obtained by incorporating the CORDIC methodology into a radix-2 FFT algorithm. The proposed methodology is simple to comprehend and maintains a regular structure, thereby reducing computational complexity. DCTs are used extensively in digital signal processing for the purpose of pattern recognition, so efficient computation of the DCT with a transparent design flow is highly desirable.
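
The identity behind such FFT-based DCT computation can be sketched as follows: the DCT-II of a length-N sequence equals the FFT of an even-odd reordering of that sequence followed by one complex rotation per output bin, and it is precisely this rotation that CORDIC realizes with shift-and-add iterations in hardware. The snippet below is a numerical sketch of that identity (Makhoul's reordering), not the authors' CORDIC design.

```python
import numpy as np

def dct2_via_fft(x):
    # DCT-II via a single N-point FFT (Makhoul's reordering):
    # v = [x0, x2, x4, ..., x5, x3, x1]
    # C[k] = 2 * Re{ exp(-j*pi*k/(2N)) * FFT(v)[k] }
    x = np.asarray(x, dtype=float)
    N = x.size
    v = np.concatenate([x[0::2], x[1::2][::-1]])
    V = np.fft.fft(v)
    k = np.arange(N)
    return 2.0 * np.real(np.exp(-1j * np.pi * k / (2 * N)) * V)

# sanity check against the direct O(N^2) DCT-II definition
x = np.random.rand(8)
n, k = np.arange(8), np.arange(8)[:, None]
direct = 2.0 * (np.cos(np.pi * k * (2 * n + 1) / 16) @ x)
assert np.allclose(dct2_via_fft(x), direct)
```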

Keywords: DCT, DFT, CORDIC, FFT

Procedia PDF Downloads 460
6418 Generation of High-Quality Synthetic CT Images from Cone Beam CT Images Using A.I. Based Generative Networks

Authors: Heeba A. Gurku

Abstract:

Introduction: Cone Beam CT (CBCT) images play an integral part in proper patient positioning for cancer patients undergoing radiation therapy, but these images are of low quality. The purpose of this study is to generate high-quality synthetic CT images from CBCT using generative models. Material and Methods: This study utilized two datasets from The Cancer Imaging Archive (TCIA): 1) a lung cancer dataset of 20 patients (with full-view CBCT images) and 2) a pancreatic cancer dataset of 40 patients (only the 27 patients having limited-view images were included in the study). Cycle Generative Adversarial Networks (GAN) and its variant, Attention Guided Generative Adversarial Networks (AGGAN), were used to generate the synthetic CTs. Models were evaluated visually and on four metrics, Structural Similarity Index Measure (SSIM), Peak Signal-to-Noise Ratio (PSNR), Mean Absolute Error (MAE) and Root Mean Square Error (RMSE), to compare the synthetic CT and original CT images. Results: For the pancreatic dataset with limited-view CBCT images, our study showed that with the Cycle GAN model, MAE, RMSE, and PSNR improved from 12.57 to 8.49, 20.94 to 15.29, and 21.85 to 24.63, respectively, but structural similarity only marginally increased from 0.78 to 0.79. Similar results were achieved with AGGAN, with no improvement over Cycle GAN. However, for the lung dataset with full-view CBCT images, Cycle GAN was able to reduce MAE significantly from 89.44 to 15.11, and AGGAN was able to reduce it to 19.77. Similarly, RMSE decreased from 92.68 to 23.50 with Cycle GAN and to 29.02 with AGGAN. SSIM and PSNR also improved significantly, from 0.17 to 0.59 and from 8.81 to 21.06 with Cycle GAN, respectively, while with AGGAN SSIM increased to 0.52 and PSNR to 19.31. In both datasets, the GAN models reduced artifacts and noise and produced better resolution and contrast enhancement. Conclusion and Recommendation: Both Cycle GAN and AGGAN significantly reduced MAE and RMSE and improved PSNR in both datasets. However, the full-view lung dataset showed more improvement in SSIM and image quality than the limited-view pancreatic dataset.
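
The four image-similarity metrics reported above are standard and can be computed with scikit-image and NumPy; the sketch below assumes the synthetic and reference CT slices are already co-registered float arrays on a common intensity scale.

```python
import numpy as np
from skimage.metrics import structural_similarity, peak_signal_noise_ratio

def compare_ct(synthetic, reference):
    # MAE / RMSE on voxel intensities, plus SSIM and PSNR
    err = synthetic - reference
    mae = np.mean(np.abs(err))
    rmse = np.sqrt(np.mean(err ** 2))
    drange = reference.max() - reference.min()
    psnr = peak_signal_noise_ratio(reference, synthetic, data_range=drange)
    ssim = structural_similarity(reference, synthetic, data_range=drange)
    return {"MAE": mae, "RMSE": rmse, "PSNR": psnr, "SSIM": ssim}

# before = compare_ct(cbct, planning_ct)
# after = compare_ct(generator(cbct), planning_ct)
```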

Keywords: CT images, CBCT images, cycle GAN, AGGAN

Procedia PDF Downloads 70
6417 Enhancing Early Detection of Coronary Heart Disease Through Cloud-Based AI and Novel Simulation Techniques

Authors: Md. Abu Sufian, Robiqul Islam, Imam Hossain Shajid, Mahesh Hanumanthu, Jarasree Varadarajan, Md. Sipon Miah, Mingbo Niu

Abstract:

Coronary Heart Disease (CHD) remains a principal cause of global morbidity and mortality, characterized by atherosclerosis, the build-up of fatty deposits inside the arteries. The study introduces an innovative methodology that leverages cloud-based platforms like AWS Live Streaming and Artificial Intelligence (AI) to detect CHD symptoms early and support prevention in web applications. By employing novel simulation processes and AI algorithms, this research aims to significantly mitigate the health and societal impacts of CHD. Methodology: This study introduces a novel simulation process alongside a multi-phased model development strategy. Initially, health-related data, including heart rate variability, blood pressure, lipid profiles, and ECG readings, were collected through user interactions with web-based applications as well as API integration. The novel simulation process involved creating synthetic datasets that mimic early-stage CHD symptoms, allowing for the refinement and training of AI algorithms under controlled conditions without compromising patient privacy. AWS Live Streaming was utilized to capture real-time health data, which was then processed and analysed using advanced AI techniques. The novel aspect of our methodology lies in the simulation of CHD symptom progression, which provides a dynamic training environment for our AI models, enhancing their predictive accuracy and robustness. Model Development: We developed a machine learning model trained on both real and simulated datasets, incorporating a variety of algorithms, including neural networks and ensemble learning models, to identify early signs of CHD. The model's continuous learning mechanism allows it to evolve, adapting to new data inputs and improving its predictive performance over time. Results and Findings: The deployment of our model yielded promising results. In the validation phase, it achieved an accuracy of 92% in predicting early CHD symptoms, surpassing existing models. The precision and recall metrics stood at 89% and 91%, respectively, indicating a high level of reliability in identifying at-risk individuals. These results underscore the effectiveness of combining live data streaming with AI in the early detection of CHD. Societal Implications: The implementation of cloud-based AI for CHD symptom detection represents a significant step forward in preventive healthcare. By facilitating early intervention, this approach has the potential to reduce the incidence of CHD-related complications, decrease healthcare costs, and improve patient outcomes. Moreover, the accessibility and scalability of cloud-based solutions democratize advanced health monitoring, making it available to a broader population. This study illustrates the transformative potential of integrating technology and healthcare, setting a new standard for the early detection and management of chronic diseases.
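
As an illustration of the validation step only (a hedged sketch: the feature matrices, the gradient-boosting choice, and all values are placeholders, not the paper's pipeline), a model can be trained on pooled real and simulated records and scored on held-out real data with the three reported metrics:

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import accuracy_score, precision_score, recall_score
from sklearn.model_selection import train_test_split

# placeholder features (e.g., HRV, blood pressure, lipids, ECG-derived)
rng = np.random.default_rng(1)
X_real, y_real = rng.random((500, 8)), rng.integers(0, 2, 500)
X_sim, y_sim = rng.random((2000, 8)), rng.integers(0, 2, 2000)  # simulated

# train on real + simulated data, validate on held-out real data
X_tr, X_val, y_tr, y_val = train_test_split(X_real, y_real, test_size=0.3,
                                            random_state=42)
clf = GradientBoostingClassifier().fit(np.vstack([X_tr, X_sim]),
                                       np.concatenate([y_tr, y_sim]))
y_hat = clf.predict(X_val)
print("accuracy: ", accuracy_score(y_val, y_hat))
print("precision:", precision_score(y_val, y_hat))
print("recall:   ", recall_score(y_val, y_hat))
```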

Keywords: coronary heart disease, cloud-based AI, machine learning, novel simulation techniques, early detection, preventive healthcare

Procedia PDF Downloads 48
6416 Focus-Latent Dirichlet Allocation for Aspect-Level Opinion Mining

Authors: Mohsen Farhadloo, Majid Farhadloo

Abstract:

Aspect-level opinion mining, which aims at discovering aspects (aspect identification) and their corresponding ratings (sentiment identification) from customer reviews, has increasingly attracted the attention of researchers and practitioners, as it provides valuable insights about products/services from customers' points of view. Instead of addressing aspect identification and sentiment identification in two separate steps, it is possible to identify both aspects and sentiments simultaneously. In recent years, many graphical models based on Latent Dirichlet Allocation (LDA) have been proposed to solve both aspect and sentiment identification in a single step. Although LDA models have been effective tools for the statistical analysis of document collections, they also have shortcomings in addressing some unique characteristics of opinion mining. Our goal in this paper is to address one of the limitations of topic models to date: they fail to directly model the associations among topics. Indeed, in many text corpora it is natural to expect that subsets of the latent topics have higher probabilities. We propose a probabilistic graphical model, called focus-LDA, to better capture the associations among topics when applied to aspect-level opinion mining. Our experiments on real-life data sets demonstrate the improved effectiveness of the focus-LDA model in terms of the accuracy of the predictive distributions over held-out documents. Furthermore, we demonstrate qualitatively that the focus-LDA topic model provides a natural way of visualizing and exploring unstructured collections of textual data.
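
Since focus-LDA itself is the paper's contribution and is not specified here, the evaluation criterion can still be made concrete with a standard LDA baseline: fit on training reviews, then score the predictive distribution over held-out documents via perplexity (lower is better). A minimal scikit-learn sketch with toy reviews:

```python
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.feature_extraction.text import CountVectorizer

reviews_train = ["the battery life is great", "screen quality is poor",
                 "battery drains fast", "great screen and display"]
reviews_heldout = ["poor battery life", "display quality is great"]

vec = CountVectorizer()
X_train = vec.fit_transform(reviews_train)
X_heldout = vec.transform(reviews_heldout)

lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(X_train)
# held-out perplexity: the quantity a focus-LDA variant would aim to reduce
print("held-out perplexity:", lda.perplexity(X_heldout))
```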

Keywords: aspect-level opinion mining, document modeling, Latent Dirichlet Allocation, LDA, sentiment analysis

Procedia PDF Downloads 84
6415 Predictive Analytics Algorithms: Mitigating Elementary School Dropout Rates

Authors: Bongs Lainjo

Abstract:

Educational institutions and authorities that are mandated to run education systems in various countries need to implement a curriculum that considers the possibility and existence of elementary school dropouts. This research focuses on elementary school dropout rates and the ability to replicate various predictive models carried out globally on selected elementary schools. The study was carried out by comparing classical case studies in Africa, North America, South America, Asia and Europe. Some of the reasons put forward for children dropping out include the notion of being successful in life without necessarily going through the education process. Such a mentality is coupled with a tough curriculum that does not take care of all students. These factors lead to poor school attendance and truancy, which in turn lead to dropouts. In this study, the focus is on developing a model that can systematically be implemented by school administrations to prevent possible dropout scenarios. At the elementary level, especially in the lower grades, a child's perception of education can be easily changed so that they focus on the better future that their parents desire. To deal effectively with the elementary school dropout problem, the strategies that are put in place need to be studied, and predictive models need to be installed in every educational system with a view to helping prevent an imminent school dropout just before it happens. In the competency-based curriculum that most advanced nations are trying to implement, the education systems have wholesome ideas of learning that reduce the dropout rate.

Keywords: elementary school, predictive models, machine learning, risk factors, data mining, classifiers, dropout rates, education system, competency-based curriculum

Procedia PDF Downloads 157
6414 Assessment of Reservoir Quality and Heterogeneity in Middle Buntsandstein Sandstones of Southern Netherlands for Deep Geothermal Exploration

Authors: Husnain Yousaf, Rudy Swennen, Hannes Claes, Muhammad Amjad

Abstract:

In recent years, the Lower Triassic Main Buntsandstein sandstones in the southern Netherlands basins have become a point of interest for their deep geothermal potential. To identify the most suitable reservoir for geothermal exploration, the diagenesis and the factors affecting reservoir quality, such as porosity and permeability, are assessed. This is done by combining point-counted petrographic data with conventional core analysis. The depositional environments play a significant role in determining the distribution of lithofacies, cement, clays, and grain sizes. The position in the basin and the proximity to the source areas determine the lateral variability of the depositional environments, while their stratigraphic distribution is linked to both local topography and climate, where high humidity leads to fluvial deposition and highly arid periods lead to aeolian deposition. The Middle Buntsandstein sandstones in the southern part of the Netherlands show high porosity and permeability in most sandstone intervals. There are various controls on reservoir quality in the examined sandstone samples. Grain size and total quartz content are the primary factors affecting reservoir quality. Conversely, carbonate and anhydrite cement, clay clasts, and intergranular clay represent local controls and cannot be applied on a regional scale. Similarly, enhanced secondary porosity due to feldspar dissolution is locally restricted and minor. The analysis of textural, mineralogical, and petrophysical data indicates that the aeolian and fluvial sandstones represent a heterogeneous reservoir system. The ephemeral fluvial deposits have an average porosity and permeability of < 10% and < 1 mD, respectively, while the aeolian sandstones exhibit values of > 18% and > 100 mD.

Keywords: reservoir quality, diagenesis, porosity, permeability, depositional environments, Buntsandstein, Netherlands

Procedia PDF Downloads 54
6413 Prototype of an Interactive Toy from Lego Robotics Kits for Children with Autism

Authors: Ricardo A. Martins, Matheus S. da Silva, Gabriel H. F. Iarossi, Helen C. M. Senefonte, Cinthyan R. S. C. de Barbosa

Abstract:

This paper develops a concept for human/robot interaction, more specifically for the development of autistic children, who often have greater difficulty with interaction. It offers an efficient solution that, although simple, has been little studied for this population. The concept is based on code deployed through the Lego NXT kit and interpreted by the robot, which can thereby create this interaction in a constructive way for children with autism.

Keywords: Lego NXT, interaction, BricX, autism, ANN (Artificial Neural Network), MLP backpropagation, hidden layers

Procedia PDF Downloads 550
6412 Assessment of Image Databases Used for Human Skin Detection Methods

Authors: Saleh Alshehri

Abstract:

Human skin detection is a vital step in many applications, some of which are critical, especially those related to security. This underscores the importance of a high-performance detection algorithm. To validate the accuracy of such an algorithm, image databases are usually used. However, the suitability of these image databases is still questionable. It is suggested that suitability can be measured mainly by the span of the color space that the database covers. This research investigates the validity of three well-known image databases.
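
One simple way to quantify the 'span' criterion proposed above (an illustrative sketch, not the paper's exact measure) is the fraction of quantized color-space bins occupied by a database's skin pixels:

```python
import numpy as np

def color_space_coverage(skin_pixels, bins=32):
    # skin_pixels: (n, 3) uint8 RGB values collected from one database
    idx = skin_pixels.astype(int) // (256 // bins)   # quantize each channel
    occupied = {tuple(p) for p in idx}               # distinct occupied bins
    return len(occupied) / float(bins ** 3)          # fraction of the space

# a database whose skin pixels span more of the color space scores higher,
# suggesting better suitability for validating skin-detection algorithms
```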

Keywords: image databases, image processing, pattern recognition, neural networks

Procedia PDF Downloads 251
6411 Patient Care Needs Assessment: An Evidence-Based Process to Inform Quality Care and Decision Making

Authors: Wynne De Jong, Robert Miller, Ross Riggs

Abstract:

Beyond the number of nurses providing care for patients, having nurses with the right skills, experience and education is essential to ensure the best possible outcomes for patients. Research studies continue to link nurse staffing and skill mix with nurse-sensitive patient outcomes; numerous studies clearly show that superior patient outcomes are associated with higher levels of regulated staff. Because of the limited number of tools and processes available to assist nurse leaders with staffing models of care, nurse leaders constantly face the challenge of ensuring that their staffing models of care best suit their patient population. In 2009, several hospitals in Ontario, Canada, participated in a research study to develop and evaluate an RN/RPN utilization toolkit. The purpose of that study was to develop and evaluate a toolkit for Registered Nurse/Registered Practical Nurse staff-mix decision-making based on the College of Nurses of Ontario (Canada) practice standards for the utilization of RNs and RPNs. This paper will highlight how an organization has further developed the Patient Care Needs Assessment (PCNA) questionnaire, a major component of the toolkit. Moreover, it will demonstrate how the organization has utilized the information from the PCNA to clearly identify patient and family care needs, thus providing evidence-based results to assist leaders in matching the best staffing skill mix to their patients.

Keywords: nurse staffing models of care, skill mix, nursing health human resources, patient safety

Procedia PDF Downloads 298
6410 Classification of ECG Signal Based on Mixture of Linear and Non-Linear Features

Authors: Mohammad Karimi Moridani, Mohammad Abdi Zadeh, Zahra Shahiazar Mazraeh

Abstract:

In recent years, the use of intelligent systems in biomedical engineering has increased dramatically, especially in the diagnosis of various diseases. Also, due to the relatively simple recording of the electrocardiogram (ECG) signal, this signal is a good tool to show the function of the heart and the diseases associated with it. The aim of this paper is to design an intelligent system for automatically distinguishing a normal electrocardiogram signal from an abnormal one. Using this diagnostic system, it is possible to identify a person's heart condition in a very short time and with high accuracy. The data used in this article are from the Physionet database, made available in 2016 for researchers to find the best method for detecting normal signals from abnormal ones. The data are from both genders, the recording time varies from several seconds to several minutes, and all data are labeled normal or abnormal. Due to the limited recording time of the ECG signal and the similarity of the signal in some diseases to the normal signal, the heart rate variability (HRV) signal was used. Measuring and analyzing heart rate variability over time to evaluate the activity of the heart and to differentiate different types of heart failure from one another is of interest to experts. In the preprocessing stage, after noise cancelation by an adaptive Kalman filter and extraction of the R wave by the Pan-Tompkins algorithm, R-R intervals were extracted and the HRV signal was generated. In the processing stage of this paper, a new idea was presented: in addition to using the statistical characteristics of the signal, a return map was created and nonlinear characteristics of the HRV signal were extracted, owing to the nonlinear nature of the signal. Finally, artificial neural networks, widely used in the field of ECG signal processing, together with the distinctive features, were used to classify the normal signals from the abnormal ones. To evaluate the efficiency of the proposed classifiers, the area under the ROC curve (AUC) was used. The results of the simulation in the MATLAB environment showed that the AUC of the MLP neural network and the SVM classifier was 0.893 and 0.947, respectively. Moreover, the results of the proposed algorithm indicated that greater use of nonlinear characteristics in classifying normal and patient signals yielded better performance. Today, research aims at quantitatively analyzing the linear and nonlinear, or deterministic and random, nature of the heart rate variability signal, because it has been shown that the amount of these properties can be used to indicate the health status of an individual's heart. The study of the nonlinear behavior and dynamics of the heart's neural control system in the short and long term provides new information on how the cardiovascular system functions and has driven the development of research in this field. Given that the ECG signal contains important information and is one of the common tools used by physicians to diagnose heart disease, but that its recording time is limited and some of its information is hidden from the viewpoint of physicians, the intelligent system proposed in this paper can help physicians diagnose normal and patient individuals with greater speed and accuracy and can be used as a complementary system in treatment centers.
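
A condensed sketch of this processing chain (with a generic peak detector standing in for Pan-Tompkins, and placeholder data shapes) looks as follows:

```python
import numpy as np
from scipy.signal import find_peaks
from sklearn.metrics import roc_auc_score
from sklearn.neural_network import MLPClassifier

def hrv_features(ecg, fs):
    # crude R-peak detection stands in for the Pan-Tompkins algorithm here
    peaks, _ = find_peaks(ecg, distance=int(0.4 * fs), height=np.std(ecg))
    rr = np.diff(peaks) / fs                # R-R intervals (s): the HRV signal
    sd1 = np.std(np.diff(rr)) / np.sqrt(2)  # nonlinear feature from the
    return [np.mean(rr), np.std(rr), sd1]   # return (Poincare) map

# X = np.array([hrv_features(sig, fs) for sig in ecg_records]); y = labels
# clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000).fit(X_tr, y_tr)
# print("AUC:", roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1]))
```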

Keywords: heart rate variability, signal processing, linear and non-linear features, classification methods, ROC curve

Procedia PDF Downloads 247
6409 Electrophysiological Correlates of Statistical Learning in Children with and without Developmental Language Disorder

Authors: Ana Paula Soares, Alexandrina Lages, Helena Oliveira, Francisco-Javier Gutiérrez-Domínguez, Marisa Lousada

Abstract:

From an early age, exposure to a spoken language allows us to implicitly capture the structure underlying the succession of speech sounds in that language and to segment it into meaningful units (words). Statistical learning (SL), i.e., the ability to pick up patterns in the sensory environment even without the intention or consciousness of doing so, is thus assumed to play a central role in the acquisition of the rule-governed aspects of language, and possibly to lie behind the language difficulties exhibited by children with developmental language disorder (DLD). The research conducted so far has, however, led to inconsistent results, which might stem from the behavioral tasks used to test SL. In a classic SL experiment, participants are first exposed to a continuous stream (e.g., of syllables) in which, unbeknownst to the participants, stimuli are grouped into triplets that always appear together in the stream (e.g., ‘tokibu’, ‘tipolu’), with no pauses between them (e.g., ‘tokibutipolugopilatokibu’) and without any information regarding the task or the stimuli. Following exposure, SL is assessed by asking participants to discriminate triplets previously presented (‘tokibu’) from new sequences never presented together during exposure (‘kipopi’), i.e., to perform a two-alternative forced-choice (2-AFC) task. Despite the widespread use of the 2-AFC task to test SL, it has come under increasing criticism, as it is an offline post-learning task that only assesses the result of the learning that occurred during the previous exposure phase and might be affected by factors beyond the computation of the regularities embedded in the input, typically the likelihood of two syllables occurring together, a statistic known as transitional probability (TP). One solution to overcome these limitations is to assess SL as exposure to the stream unfolds, using online techniques such as event-related potentials (ERPs), which are highly sensitive to the time course of learning in the brain. Here we collected ERPs to examine the neurofunctional correlates of SL in preschool children with DLD and in chronological-age typical language development (TLD) controls, who were exposed to an auditory stream containing eight three-syllable nonsense words, four presenting high TPs and the other four low TPs, to further analyze whether the ability of DLD and TLD children to extract word-like units from the stream was modulated by the words' predictability. Moreover, to ascertain whether previous knowledge of the to-be-learned regularities affected the neural responses to high- and low-TP words, children performed the auditory SL task first under implicit and subsequently under explicit conditions. Although behavioral evidence of SL was not obtained in either group, the neural responses elicited during the exposure phases of the SL tasks differentiated children with DLD from children with TLD. Specifically, the results indicated that only children from the TLD group showed neural evidence of SL, particularly in the SL task performed under explicit conditions, first for the low-TP and subsequently for the high-TP ‘words’. Taken together, these findings support the view that children with DLD show deficits in the extraction of the regularities embedded in the auditory input, which might underlie their language difficulties.
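
The transitional-probability statistic at the heart of this design is easy to make concrete; the sketch below computes TPs over a syllabified stream (toy syllables modeled on the abstract's example):

```python
from collections import Counter

def transitional_probabilities(syllables):
    # TP(B|A) = count(AB) / count(A), computed over the continuous stream
    pair_counts = Counter(zip(syllables, syllables[1:]))
    first_counts = Counter(syllables[:-1])
    return {(a, b): c / first_counts[a] for (a, b), c in pair_counts.items()}

stream = "to ki bu ti po lu to ki bu go pi la".split()
tps = transitional_probabilities(stream)
print(tps[("to", "ki")])   # 1.0: within-word transition (high TP)
print(tps[("bu", "ti")])   # 0.5: across-word transition (lower TP)
```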

Keywords: development language disorder, statistical learning, transitional probabilities, word segmentation

Procedia PDF Downloads 181
6408 Revisionism in Literature: Deconstructing Patriarchal Ideals in Margaret Atwood's The Penelopiad

Authors: Essam Abdelhamid Hegazy

Abstract:

This paper aims to read Margaret Atwood's The Penelopiad (2005) via a revisionist and deconstructive approach. The novel is a postmodernist exploration of the grand-narrative myth The Odyssey (c. 800 BC) by Homer, who portrayed the heroic warrior and the faithful wife as the epitome of perfect male and female models, examples whom all must follow and mimic. In Atwood's narrative, the same two hero models are two great tricksters who are willing to perform any sort of obnoxious act to achieve their goals. This research examines how Atwood synthesized the change in the characters' narratives, leading to the humanization of the perfect hero and the ideal wife. The researcher has used a multidisciplinary approach in which feminist, revisionist and deconstructive theories were implemented to identify new interpretations of the myths that center the experiences and perspectives of women. The findings are that the revisionist approach was applied by giving the victimized and the voiceless an opportunity to speak out and retaliate against their persecution.

Keywords: Margaret Atwood, patriarchy, The Penelopiad, revisionism

Procedia PDF Downloads 67
6407 The Principle of Methodological Rationality and Security of Organisations

Authors: Jan Franciszek Jacko

Abstract:

This investigation presents the principle of the methodological rationality of decision-making and discusses the impact of organisation members' methodologically rational or irrational decisions on the organisation's security. The study formulates and partially justifies some research hypotheses regarding this impact. A thought experiment is used, following Max Weber's method of ideal types. Two idealised situations ("models") are compared: Model A, in which all decision-makers follow methodologically rational decision-making procedures, and Model B, in which these agents follow methodologically irrational decision-making practices. Analysing and comparing the two models allows the formulation of some research hypotheses regarding the impact of methodologically rational and irrational attitudes of members of an organisation on its security. In addition to this method, phenomenological analyses of rationality and irrationality are applied.

Keywords: methodological rationality, rational decisions, security of organisations, philosophy of economics

Procedia PDF Downloads 126
6406 Identification and Prioritisation of Students Requiring Literacy Intervention and Subsequent Communication with Key Stakeholders

Authors: Emilie Zimet

Abstract:

During networking and NCCD moderation meetings, best practices for identifying students who require literacy intervention are often discussed. Once these students are identified, consideration is given to the most effective process for prioritising those who have the greatest need for literacy support, along with the allocation of resources, the tracking of intervention effectiveness, and communication with teachers, external providers and parents. Through a workshop, the group will investigate best practices for identifying students who require literacy support and strategies for communicating and tracking their progress. In groups, participants will examine what they do in their settings and then compare this with other models, including the researcher's model, to decide the most effective path to identification and communication. Participants will complete a worksheet at the beginning of the session to consider their current approaches in depth. They will be asked to critically analyse their own identification processes for literacy intervention, ensuring students are not overlooked if they fall into the borderline category. A cut-off for students to access intervention will be considered, so as not to place strain on already stretched resources, along with the most effective allocation of those resources. Furthermore, communicating learning needs and differentiation strategies to staff is paramount to the success of an intervention, and participants will look at the frequency of communication used to share such strategies and updates. At the end of the session, the group will look at creating or evolving models that allow for best practice in the identification and communication of literacy interventions. The proposed outcome of this research is to develop a model for identifying students who require literacy intervention that incorporates the allocation of resources and communication with key stakeholders. This will be done by pooling information and discussing a variety of models used in the participants' school settings.

Keywords: identification, student selection, communication, special education, school policy, planning for intervention

Procedia PDF Downloads 38
6405 Double Row Taper Roller Bearing Wheel-end System in Rigid Rear Drive Axle in Heavy Duty SUV Passenger Vehicle

Authors: Mohd Imtiaz S, Saurabh Jain, Pothiraj K.

Abstract:

In today’s highly competitive passenger vehicle market, a comfortable driving experience is one of the key parameters significantly weighed by the customer. Smooth ride and handling of the vehicle with an exceptionally reliable wheel-end solution is a paramount requirement in passenger Sports Utility Vehicles (SUVs) subjected to challenging terrains and loads with a rigid rear drive axle configuration. Traditional wheel-end bearing systems in passenger-segment rigid rear drive axles utilize the semi-floating layout, which imparts vertical bending loads and torsion to the axle shafts. The wheel-end bearing is usually a Single or Double Row Deep-Groove Ball Bearing (DRDGBB) or a Double Row Angular Contact Ball Bearing (DRACBB). This solution is cost-effective and simple in architecture. However, it lacks effectiveness against the heavy loads to which an SUV is subjected, especially the axial thrust during high-speed cornering. This paper describes a Double Row Taper Roller Bearing (DRTRB) wheel-end solution for an SUV with a rigid rear drive axle and the improvement it brings in terms of maximized load-carrying capacity and better reliability against axial thrust in high-speed cornering. It describes the geometric advantage of the DRTRB over the DRDGBB and DRACBB, highlighting contact and load flow. The paper also highlights the vehicle-level considerations affecting the B10 life of the bearing system, for better selection of DRTRB wheel-end systems. Real-time vehicle-level results are also presented along with the theoretical improvements.
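
For reference, the B10 (basic rating) life mentioned above is conventionally estimated from the standard bearing-life relation, where C is the dynamic load rating, P the equivalent dynamic bearing load, and the exponent differs between roller and ball bearings; this is the textbook form, with the paper's vehicle-level load assumptions entering through P:

```latex
L_{10} = \left(\frac{C}{P}\right)^{p} \ \text{million revolutions},
\qquad p = \tfrac{10}{3} \ \text{(roller bearings)}, \quad
p = 3 \ \text{(ball bearings)}
```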

Keywords: axial thrust, B10 life, deep-groove ball bearing, taper roller bearing, semi-floating layout

Procedia PDF Downloads 59
6404 Special Case of Trip Distribution Model and Its Use for Estimation of Detailed Transport Demand in the Czech Republic

Authors: Jiri Dufek

Abstract:

The national transport model of the Czech Republic has been modified in a detailed way to obtain detailed travel demand at the municipality level (cities and villages with over 300 inhabitants). As the technique for this detailed modelling, a three-dimensional procedure for calibrating gravity models was used. Besides zone production and attraction, which are usual in gravity models, an additional parameter for trip distribution was introduced, usually called the 'third dimension'. In this model, the parameter is the demand between regions. The distribution procedure involved the calculation of appropriate skim matrices and their multiplication by three coefficients obtained by iterative balancing of production, attraction and the third dimension. This type of trip distribution was processed in R, and the results were used in the Czech Republic transport model created in PTV Vision. The process generated more precise results at the local level of the model (towns and villages).
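
A minimal sketch of such three-dimensional balancing (illustrative NumPy only; the paper's exact formulation lives in R and PTV Vision) scales a deterrence matrix by production, attraction, and region-to-region demand coefficients in turn until all three sets of targets are matched:

```python
import numpy as np

def balance_three_dim(skim, production, attraction, region_demand, region_of,
                      iters=50):
    # skim: (n, n) deterrence values; production/attraction: (n,) zone targets;
    # region_demand: (r, r) inter-regional targets (the 'third dimension');
    # region_of: (n,) integer region index per zone (NumPy array).
    a = np.ones(len(production))                  # production coefficients
    b = np.ones(len(attraction))                  # attraction coefficients
    c = np.ones_like(region_demand, dtype=float)  # third-dimension coefficients

    def trips():
        return a[:, None] * b[None, :] * c[region_of][:, region_of] * skim

    for _ in range(iters):
        a *= production / trips().sum(axis=1)     # balance zone productions
        b *= attraction / trips().sum(axis=0)     # balance zone attractions
        T = trips()                               # balance regional demand
        agg = np.zeros_like(c)
        np.add.at(agg, (region_of[:, None], region_of[None, :]), T)
        c *= region_demand / np.maximum(agg, 1e-12)
    return trips()
```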

Keywords: trip distribution, three dimension, transport model, municipalities

Procedia PDF Downloads 112
6403 Data-driven Decision-Making in Digital Entrepreneurship

Authors: Abeba Nigussie Turi, Xiangming Samuel Li

Abstract:

Data-driven business models are more typical for established businesses than for early-stage startups that strive to penetrate a market. This paper provides an extensive discussion of the principles of data analytics for early-stage digital entrepreneurial businesses. We develop a data-driven decision-making (DDDM) framework that applies to startups prone to multifaceted barriers in the form of poor data access and technical and financial constraints, to name a few. The startup DDDM framework proposed in this paper is novel in its form, encompassing startup data analytics enablers and metrics that align with startups' business models, ranging from customer-centric product development to servitization, which is the future of modern digital entrepreneurship.

Keywords: startup data analytics, data-driven decision-making, data acquisition, data generation, digital entrepreneurship

Procedia PDF Downloads 304
6402 Native Language Identification with Cross-Corpus Evaluation Using Social Media Data: ’Reddit’

Authors: Yasmeen Bassas, Sandra Kuebler, Allen Riddell

Abstract:

Native language identification (NLI) is one of the growing subfields of natural language processing (NLP). The task is mainly concerned with predicting the native language of an author writing in a second language. In this paper, we investigate the performance of two types of features, content-based features vs. content-independent features, when they are evaluated on a different corpus (using social media data from Reddit). In this NLI task, the predefined models are trained on one corpus (TOEFL), and the trained models are then evaluated on different data from an external corpus (Reddit). Three classifiers are used in this task: a baseline, a linear SVM, and logistic regression. Results show that content-based features are more accurate and robust than content-independent ones, both within-corpus and cross-corpus.
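
A hedged sketch of this train-on-TOEFL, test-on-Reddit setup (corpus variables are placeholders; character n-grams stand in here for the content-independent feature set, for which function-word or POS n-grams are common alternatives):

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.svm import LinearSVC

# texts_* are lists of strings, labels_* the authors' native languages

# content-based features: word n-grams (topical content words included)
content_clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LinearSVC())

# content-independent features: character n-grams as a rough proxy
indep_clf = make_pipeline(
    TfidfVectorizer(analyzer="char_wb", ngram_range=(2, 4)),
    LogisticRegression(max_iter=1000),
)

# content_clf.fit(texts_toefl, labels_toefl)
# print("cross-corpus accuracy:", content_clf.score(texts_reddit, labels_reddit))
```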

Keywords: NLI, NLP, content-based features, content independent features, social media corpus, ML

Procedia PDF Downloads 120
6401 Advantages of Neural Network Based Air Data Estimation for Unmanned Aerial Vehicles

Authors: Angelo Lerro, Manuela Battipede, Piero Gili, Alberto Brandl

Abstract:

Redundancy requirements for UAVs (Unmanned Aerial Vehicles) are hard to meet due to the generally restricted amount of available space and allowable weight for the aircraft systems, which limits their exploitation. Essential equipment such as the Air Data, Attitude and Heading Reference System (ADAHRS) requires several external probes to measure significant data such as the angle of attack or the sideslip angle. Previous research focused on the analysis of a patented technology named Smart-ADAHRS (Smart Air Data, Attitude and Heading Reference System) as an alternative method to obtain reliable and accurate estimates of the aerodynamic angles. This solution is based on an innovative sensor fusion algorithm implementing soft computing techniques, and it allows a simplified inertial and air data system to be obtained, reducing the number of external devices. In fact, only one external source of dynamic and static pressures is needed. This paper focuses on the benefits that would be gained by implementing this system in UAV applications. A simplification of the entire ADAHRS architecture would reduce the overall cost together with improved safety performance. Smart-ADAHRS has currently reached Technology Readiness Level (TRL) 6. Real flight tests took place on an ultralight aircraft equipped with suitable Flight Test Instrumentation (FTI). The output of the algorithm using the flight test measurements demonstrates the capability of this fusion algorithm to embed multiple physical and virtual sensors in a single device. Any source of dynamic and static pressure can be integrated with this system, gaining a significant improvement in terms of versatility.
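
In spirit, the virtual sensor replaces an external vane with a learned mapping from one pressure source plus inertial data to the aerodynamic angle. A minimal sketch follows (file names, feature columns and network size are assumptions, not the patented Smart-ADAHRS algorithm):

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# assumed FTI recordings: columns such as dynamic pressure, static pressure,
# body rates and accelerations; target is the reference angle of attack
X = np.load("flight_test_inputs.npy")
y = np.load("flight_test_aoa.npy")

model = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000,
                     random_state=0).fit(X, y)
aoa_estimate = model.predict(X[:10])   # virtual angle-of-attack sensor output
```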

Keywords: aerodynamic angles, air data system, flight test, neural network, unmanned aerial vehicle, virtual sensor

Procedia PDF Downloads 209
6400 Temperature Control Improvement of Membrane Reactor

Authors: Pornsiri Kaewpradit, Chalisa Pourneaw

Abstract:

Temperature control improvement of a membrane reactor with an exothermic and reversible esterification reaction is studied in this work. It is well known that a batch membrane reactor requires different control strategies from a continuous one due to the fact that it is operated dynamically. Because of the effect of the operating temperature, a suitable control scheme has to be designed based on a reliable predictive model to achieve the desired objective. In this study, an optimization framework was first formulated in order to determine an optimal temperature trajectory for maximizing the desired product. In the model predictive control scheme, a set of predictive models was initially developed corresponding to the possible operating points of the system. The multiple predictive control moves were then calculated on-line using the developed models corresponding to the current operating point. The simulation results show that the temperature control was improved compared to the performance obtained with a conventional predictive controller. Further robustness tests were also investigated in this study.
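
The receding-horizon logic can be sketched with a toy first-order temperature model standing in for the set of operating-point models (all coefficients below are illustrative, not the reactor's):

```python
import numpy as np
from scipy.optimize import minimize

a, b = 0.9, 0.1                      # toy model: T[k+1] = a*T[k] + b*u[k]
H, T_ref = 10, 1.0                   # horizon length and desired temperature

def cost(u, T0):
    T, J = T0, 0.0
    for uk in u:
        T = a * T + b * uk
        J += (T - T_ref) ** 2 + 0.01 * uk ** 2   # tracking + control effort
    return J

T = 0.0
for k in range(30):                  # receding-horizon loop
    u = minimize(cost, np.zeros(H), args=(T,)).x
    T = a * T + b * u[0]             # apply only the first computed move
    # in the paper's scheme, the model would be switched here to match
    # the current operating point before the next optimization
```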

Keywords: model predictive control, batch reactor, temperature control, membrane reactor

Procedia PDF Downloads 457
6399 Effect of Urea Deep Placement Technology Adoption on the Production Frontier: Evidence from Irrigation Rice Farmers in the Northern Region of Ghana

Authors: Shaibu Baanni Azumah, William Adzawla

Abstract:

Rice is an important staple crop in Ghana, with current demand higher than domestic supply. This has led to a high and unfavourable import bill. Therefore, recent policies and interventions in the agricultural sub-sector aim at promoting various improved agricultural technologies in order to improve domestic production and reduce the importation of rice. In this study, we examined the effect of rice farmers' adoption of Urea Deep Placement (UDP) technology on the position of the production frontier. The study involved 200 farmers selected through a multi-stage sampling technique in the Northern region of Ghana. A Cobb-Douglas stochastic frontier model was fitted. The results showed that the adoption of UDP technology shifts the output frontier outward and also moves farmers closer to the frontier. Farmers were also operating under diminishing returns to scale, which calls for redress. Other factors that significantly influenced rice production were farm size, labour, the use of certified seeds and NPK fertilizer. Although there was room for improvement, the farmers were highly efficient (92%) compared to previous studies. Farmers' efficiency was improved through increased education, household size, experience, access to credit, and the absence of extension service provision by the Ministry of Food and Agriculture (MoFA). The study recommends the revision of Ghana's agricultural policy to include the UDP technology. Agricultural extension officers of MoFA should be trained in the UDP technology to support IFDC's drive to improve adoption by rice farmers. Rice farmers are also encouraged to expand their farm lands, improve plant population, and increase fertilizer usage to improve yields. Mechanisms through which credit can be made easily accessible and effectively utilised should be identified and promoted.
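
The fitted model commonly takes the Aigner-Lovell-Schmidt form below, where v_i is symmetric noise, u_i a one-sided inefficiency term, and technical efficiency TE_i = exp(-u_i) (the 92% average reported above); the UDP adoption indicator can enter among the x's as a frontier shifter. This is the textbook specification, not a verbatim reproduction of the paper's equation:

```latex
\ln y_i = \beta_0 + \sum_{k} \beta_k \ln x_{ik} + v_i - u_i,
\qquad v_i \sim N(0, \sigma_v^2),\ u_i \ge 0,
\qquad \mathrm{TE}_i = \exp(-u_i)
```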

Keywords: efficiency, rice farmers, stochastic frontier, UDP technology

Procedia PDF Downloads 395
6398 Inhalable Lipid-Coated-Chitosan Nano-Embedded Microdroplets of an Antifungal Drug for Deep Lung Delivery

Authors: Ranjot Kaur, Om P. Katare, Anupama Sharma, Sarah R. Dennison, Kamalinder K. Singh, Bhupinder Singh

Abstract:

Respiratory microbial infections, which are among the leading causes of death worldwide, are difficult to treat because the microbes reside deep inside the airways, where only a small fraction of a drug can reach after administration by traditional oral or parenteral routes. As a result, high doses of drugs are required to maintain drug levels above the minimum inhibitory concentrations (MIC) at the infection site, unfortunately leading to severe systemic side-effects. Therefore, delivering antimicrobials directly to the respiratory tract provides an attractive way out in such situations. In this context, the current study embarks on the systematic development of lung-lipid-modified chitosan nanoparticles for inhalation of voriconazole. Following the principles of quality by design, the chitosan nanoparticles were prepared by the ionic gelation method and further coated with the major lung lipid by a precipitation method. The factor screening studies were performed using a fractional factorial design, followed by optimization of the nanoparticles using a Box-Behnken design. The optimized formulation has a particle size range of 170-180 nm, PDI 0.3-0.4, zeta potential 14-17, entrapment efficiency 45-50% and drug loading of 3-5%. The presence of a lipid coating was confirmed by FESEM, FTIR, and XRD. Furthermore, the nanoparticles were found to be safe up to 40 µg/ml on A549 and Calu-3 cell lines. Quantitative and qualitative uptake studies also revealed the uptake of the nanoparticles into lung epithelial cells. Moreover, data from Spraytec and next-generation impactor studies confirmed the deposition of the nanoparticles in the lower airways. The interaction of the nanoparticles with DPPC monolayers also signifies their biocompatibility with the lungs. Overall, the study describes the methodology and potential of lipid-coated chitosan nanoparticles as a future inhalation nanomedicine for the management of pulmonary aspergillosis.

Keywords: dipalmitoylphosphatidylcholine, nebulization, DPPC monolayers, quality-by-design

Procedia PDF Downloads 130
6397 Times Series Analysis of Depositing in Industrial Design in Brazil between 1996 and 2013

Authors: Jonas Pedro Fabris, Alberth Almeida Amorim Souza, Maria Emilia Camargo, Suzana Leitão Russo

Abstract:

With Law No. 9279 of May 14, 1996, the Brazilian government regulated the rights and obligations relating to industrial property, considering the economic development of the country, through the granting of patents, trademark registration, registration of industrial designs and other forms of copyright protection. In this study, we show the application of the Box and Jenkins methodology to the series of industrial design deposits at the National Institute of Industrial Property for the period from May 1996 to April 2013. First, a graphical analysis of the data was done by observing the behavior of the data and the autocorrelation function. Based on the analysis of the charts and the statistical tests suggested by the Box and Jenkins methodology, the best model found for industrial design deposits was a SARIMA(2,1,0)(2,0,0), with a MAPE equal to 9.88%.
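
Fitting the reported specification is direct in statsmodels; the sketch below assumes a 12-month seasonal period (consistent with monthly data, though not stated in the abstract) and a hypothetical input file:

```python
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

# monthly industrial design deposits, May 1996 - April 2013 (assumed file)
y = pd.read_csv("design_deposits.csv", index_col=0, parse_dates=True).squeeze()

model = SARIMAX(y, order=(2, 1, 0), seasonal_order=(2, 0, 0, 12)).fit(disp=False)
print(model.summary())

mape = (abs((y - model.fittedvalues) / y)).mean() * 100   # in-sample MAPE, %
forecast = model.forecast(steps=12)                       # next 12 months
```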

Keywords: ARIMA models, autocorrelation, Box and Jenkins Models, industrial design, MAPE, time series

Procedia PDF Downloads 534
6396 Evaluating Traffic Congestion Using the Bayesian Dirichlet Process Mixture of Generalized Linear Models

Authors: Ren Moses, Emmanuel Kidando, Eren Ozguven, Yassir Abdelrazig

Abstract:

This study applied traffic speed and occupancy to develop clustering models that identify different traffic conditions. In particular, these models are based on the Dirichlet Process Mixture of Generalized Linear regression (DML) and change-point regression (CR). The model frameworks were implemented using 2015 historical traffic data aggregated at a 15-minute interval from the Interstate 295 freeway in Jacksonville, Florida. Using the deviance information criterion (DIC) to identify the appropriate number of mixture components, three traffic states were identified: free-flow, transitional, and congested conditions. Results of the DML revealed that traffic occupancy is statistically significant in influencing the reduction of traffic speed in each of the identified states. Its influence on the free-flow and congested states was estimated to be higher than on the transitional flow condition in both the evening and morning peak periods. Estimation of the critical speed thresholds using CR revealed that 47 mph and 48 mph are the speed thresholds for the congested and transitional traffic conditions during the morning and evening peak hours, respectively. Free-flow speed thresholds for the morning and evening peak hours were estimated at 64 mph and 66 mph, respectively. The proposed approaches will facilitate accurate detection and prediction of traffic congestion for developing effective countermeasures.
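
As a simplified stand-in for the DML framework (a Dirichlet-process Gaussian mixture on speed-occupancy pairs rather than a mixture of GLMs; the data file is hypothetical), scikit-learn's truncated DP prior lets the data select the effective number of traffic states:

```python
import numpy as np
from sklearn.mixture import BayesianGaussianMixture

# speed (mph) and occupancy (%) per 15-min interval (assumed array)
X = np.load("i295_speed_occupancy.npy")   # shape (n_intervals, 2)

# the truncated Dirichlet-process prior shrinks unused component weights
# toward zero, so the data choose how many of the 10 components to keep
dpm = BayesianGaussianMixture(
    n_components=10,
    weight_concentration_prior_type="dirichlet_process",
    random_state=0,
).fit(X)

states = dpm.predict(X)   # e.g., free-flow / transitional / congested
```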

Keywords: traffic congestion, multistate speed distribution, traffic occupancy, Dirichlet process mixtures of generalized linear model, Bayesian change-point detection

Procedia PDF Downloads 280
6395 Analyzing the Impact of Spatio-Temporal Climate Variations on the Rice Crop Calendar in Pakistan

Authors: Muhammad Imran, Iqra Basit, Mobushir Riaz Khan, Sajid Rasheed Ahmad

Abstract:

The present study investigates the space-time impact of climate change on the rice crop calendar in tropical Gujranwala, Pakistan. The climate change impact was quantified through climatic variables, and the existing calendar of the rice crop was compared with the phenological stages of the crop, depicted through the time series of the Normalized Difference Vegetation Index (NDVI) derived from Landsat data for the decade 2005-2015. A local-maxima method was applied to the NDVI time series to compute the rice phenological stages. Panel models with fixed and cross-section fixed effects were used to establish the relation between the climatic parameters and the NDVI time series across villages and across rice growing periods. Results show that the climatic parameters have a significant impact on the rice crop calendar. Moreover, the fixed effect model is a significant improvement over the cross-sectional fixed effect model (R-squared equal to 0.673 vs. 0.0338). We conclude that high inter-annual variability of the climatic variables causes high variability of NDVI, and thus a shift in the rice crop calendar. Moreover, the inter-annual (temporal) variability of the rice crop calendar is high compared to the inter-village (spatial) variability. We suggest that local rice farmers adapt to this change in the rice crop calendar.
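
A sketch of the cross-section (village) fixed-effects specification via dummy variables follows (the column names and data file are assumptions, not the paper's):

```python
import pandas as pd
import statsmodels.formula.api as smf

# hypothetical panel: one row per village-period with NDVI and climate data
df = pd.read_csv("gujranwala_panel.csv")   # columns: village, period, ndvi,
                                           # temperature, rainfall

# cross-section (village) fixed effects enter as C(village) dummies
fe = smf.ols("ndvi ~ temperature + rainfall + C(village)", data=df).fit()
print(fe.rsquared)   # compare against the pooled model, as in the abstract
```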

Keywords: Landsat NDVI, panel models, temperature, rainfall

Procedia PDF Downloads 192
6394 A Novel Machine Learning Approach to Aid Agrammatism in Non-fluent Aphasia

Authors: Rohan Bhasin

Abstract:

Agrammatism in non-fluent aphasia cases can be defined as a language disorder wherein a patient can only use content words (nouns, verbs and adjectives) for communication, and their speech is devoid of functional word types like conjunctions and articles, generating speech with extremely rudimentary grammar. Past approaches involve speech therapy of some order, with conversation analysis used to analyse pre-therapy speech patterns and qualitative changes in conversational behaviour after therapy. We describe a novel method to generate functional words (prepositions, articles, conjunctions) around content words (nouns, verbs and adjectives) using a combination of natural language processing and deep learning algorithms. The applications of this approach can be used to assist communication. The approach the paper investigates is LSTMs or Seq2Seq: a sequence-to-sequence (seq2seq) or LSTM model would take in a sequence of inputs and output a sequence. This approach needs a significant amount of training data, with each training example containing pairs such as (content words, complete sentence). We generate such data by starting with complete sentences from a text source and removing the functional words to get just the content words; however, this approach would require a lot of training data to produce coherent output. The assumption of this approach is that the content words received in the inputs are to be preserved, i.e., they won't be altered after the functional grammar is slotted in. This is a potential limitation in cases of severe agrammatism, where such an order might not be inherently correct. This approach can be applied to assist communication in mild agrammatism in non-fluent aphasia cases. Thus, by generating these function words around the content words, we can provide meaningful sentence options to the patient for articulate conversations. Our project thereby translates the use case of generating sentences from content-specific words into an assistive technology for non-fluent aphasia patients.
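
Generating the (content words, complete sentence) training pairs described above is straightforward; in this sketch the function-word list is a small illustrative sample, not an exhaustive inventory:

```python
# build (content words -> complete sentence) pairs by stripping function
# words from a text source, as the abstract describes
FUNCTION_WORDS = {"a", "an", "the", "and", "but", "or", "in", "on", "at",
                  "to", "of", "with", "is", "are", "was", "were"}

def make_pair(sentence):
    tokens = sentence.lower().split()
    content = [t for t in tokens if t not in FUNCTION_WORDS]
    return " ".join(content), sentence.lower()

source, target = make_pair("The dog is running in the park")
print(source)   # "dog running park"
print(target)   # "the dog is running in the park"
# pairs like (source, target) then train the seq2seq/LSTM model
```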

Keywords: aphasia, expressive aphasia, assistive algorithms, neurology, machine learning, natural language processing, language disorder, behaviour disorder, sequence to sequence, LSTM

Procedia PDF Downloads 150
6393 A Numerical Hybrid Finite Element Model for Lattice Structures Using 3D/Beam Elements

Authors: Ahmadali Tahmasebimoradi, Chetra Mang, Xavier Lorang

Abstract:

Thanks to the additive manufacturing process, lattice structures are replacing traditional structures in the aeronautical and automobile industries. In order to evaluate the mechanical response of lattice structures, one has to resort to numerical techniques. Ansys is a globally well-known and trusted commercial software package that allows us to model lattice structures and analyze their mechanical responses using either solid or beam elements. In this software, a script may be used to systematically generate lattice structures of any size. On the one hand, solid elements allow us to correctly model the contact between the substrates (the supports of the lattice structure) and the lattice structure, the local plasticity, and the junctions of the microbeams. However, their computational cost increases rapidly with the size of the lattice structure. On the other hand, although beam elements reduce the computational cost drastically, they do not correctly model the contact between the lattice structure and the substrates, nor the junctions of the microbeams. Also, the notion of local plasticity is no longer valid. Moreover, the deformed shape of the lattice structure does not correspond to that obtained using 3D solid elements. In this work, motivated by the pros and cons of the 3D and beam models, a numerical hybrid model is presented for lattice structures to reduce the computational cost of the simulations while avoiding the aforementioned drawbacks of the beam elements. This approach consists of using solid elements for the junctions and beam elements for the microbeams connecting the corresponding junctions to each other. When the global response of the structure is linear, the results from the hybrid models are in good agreement with those from the 3D models for body-centered cubic lattice structures with z-struts (BCCZ) and without z-struts (BCC). However, the hybrid models have difficulty converging when the effects of large deformation and local plasticity are considerable in the BCCZ structures. Furthermore, the effect of the junction size on the results of the hybrid models is investigated. For BCCZ lattice structures, the results are not affected by the junction size. This is also valid for BCC lattice structures as long as the ratio of the junction size to the diameter of the microbeams is greater than 2. The hybrid model can also take geometric defects into account. As a demonstration, the point clouds of two lattice structures are parametrized in a platform called LATANA (LATtice ANAlysis) developed by IRT-SystemX. In this process, for each microbeam of the lattice structures, an ellipse is fitted to capture the effect of shape variation and roughness. Each ellipse is represented by three parameters: semi-major axis, semi-minor axis, and angle of rotation. Having the parameters of the ellipses, the lattice structures are constructed in SpaceClaim (ANSYS) using the geometrical hybrid approach. The results show a negligible discrepancy between the hybrid and 3D models, while the computational cost of the hybrid model is lower than that of the 3D model.

Keywords: additive manufacturing, Ansys, geometric defects, hybrid finite element model, lattice structure

Procedia PDF Downloads 105