Search results for: atomic models
6120 Proximate and Amino Acid Composition of Amaranthus hybridus (Spinach), Celosia argentea (Cock's Comb) and Solanum nigrum (Black nightshade)
Authors: S. O. Oladeji, I. Saleh, A. U. Adamu, S. A. Fowotade
Abstract:
The proximate composition, trace metal levels and amino acid composition of Amaranthus hybridus, Celosia argentea and Solanum nigrum were determined. These vegetables were high in ash content. The elements calcium, chromium, copper, iron, lead, magnesium, nickel, phosphorus, potassium, sodium and zinc were determined using a flame photometer and atomic absorption and UV-Visible spectrophotometers. Calcium levels were highest in all samples, ranging from 145.28±0.38 to 235.62±0.41 mg/100g, followed by phosphorus. Quantitative chromatographic analysis of the vegetable hydrolysates revealed seventeen amino acids, with the concentration of leucine (6.51 to 6.66±0.21 g/16gN) roughly double that of isoleucine (2.99 to 3.33±0.21 g/16gN) in all samples, while the limiting amino acids were cystine and methionine. The results showed that these vegetables are of high nutritive value and could be adequately used as dietary supplements.
Keywords: proximate, amino acids, Amaranthus hybridus, Celosia argentea, Solanum nigrum
Procedia PDF Downloads 398
6119 Performance of Fiber-Reinforced Polymer as an Alternative Reinforcement
Authors: Salah E. El-Metwally, Marwan Abdo, Basem Abdel Wahed
Abstract:
Fiber-reinforced polymer (FRP) bars have been proposed as an alternative to conventional steel bars; hence, the use of these non-corrosive, nonmetallic reinforcing bars has increased in various concrete projects. This composite material is lightweight, has a long lifespan, and needs little maintenance; however, its non-ductile nature and weak bond with the surrounding concrete create a significant challenge. The behavior of concrete elements reinforced with FRP bars has been the subject of several experimental investigations, despite their high cost. This study aims to numerically assess the viability of using FRP bars as longitudinal reinforcement, in comparison with traditional steel bars, and also as prestressing tendons instead of traditional prestressing steel. Nonlinear finite element analysis was utilized to carry out the study. Numerical models were developed to examine the behavior of concrete beams reinforced with FRP bars or tendons against similar models reinforced with either conventional steel or prestressing steel; these numerical models were verified against experimental test results available in the literature. The obtained results revealed that concrete beams reinforced with FRP bars, as passive reinforcement, exhibited less ductility and less stiffness than similar beams reinforced with steel bars. On the other hand, when FRP tendons are employed in prestressing concrete beams, the results show that the performance of these beams is similar to that of beams prestressed by conventional active reinforcement, with differences attributable to the two tendon materials’ moduli of elasticity.
Keywords: reinforced concrete, prestressed concrete, nonlinear finite element analysis, fiber-reinforced polymer, ductility
Procedia PDF Downloads 11
6118 Annual Water Level Simulation Using Support Vector Machine
Authors: Maryam Khalilzadeh Poshtegal, Seyed Ahmad Mirbagheri, Mojtaba Noury
Abstract:
In this paper, yearly rainfall, temperature and inflow data for Lake Urmia are used to simulate water level fluctuations. Given climate change, the fluctuation of lake water levels is of high interest. This study investigates data-driven models: support vector machines (SVM), a relatively new regression procedure in water resources, are applied to the yearly level data of Lake Urmia, the largest and a hypersaline lake in Iran. The estimated lake levels are found to be in good correlation with the observed values, and the SVM simulations show good accuracy. Mean square error, mean absolute relative error and the coefficient of determination are used as comparison criteria.
Keywords: simulation, water level fluctuation, Urmia Lake, support vector machine
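As a rough illustration of the regression approach this abstract describes, the sketch below fits a support vector regression to synthetic yearly predictors (rainfall, temperature, inflow) and scores it with mean squared error, one of the abstract's comparison criteria. All data, parameter values, and the underlying relationship are invented stand-ins, not the study's Lake Urmia records.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
n = 40  # hypothetical number of yearly records
# Standardized yearly predictors: rainfall, temperature, inflow.
X = rng.normal(size=(n, 3))
# Invented relationship: level varies around a base elevation (m).
y = 1272.0 + 0.8 * X[:, 0] + 0.5 * X[:, 2] + rng.normal(scale=0.1, size=n)

# Train on the first 30 years, evaluate on the last 10.
model = SVR(kernel="rbf", C=10.0, epsilon=0.05).fit(X[:30], y[:30])
pred = model.predict(X[30:])
mse = mean_squared_error(y[30:], pred)
print(round(mse, 3))
```

In practice the predictors would be scaled and the kernel and regularization parameters tuned by cross-validation before the test-period comparison.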
Procedia PDF Downloads 365
6117 Nanostructural Analysis of the Polylactic Acid (PLA) Fibers Functionalized by RF Plasma Treatment
Authors: J. H. O. Nascimento, F. R. Oliveira, K. K. O. S. Silva, J. Neves, V. Teixeira, J. Carneiro
Abstract:
Aliphatic polyesters such as polylactic acid (PLA), in the form of fibers, nanofibers or plastic films, generally possess chemically inert surfaces, a lack of porosity, and a surface free energy (ΔG) of less than 32 mN/m. PLA is therefore considered a low-surface-energy material and consequently has a low work of adhesion. For this reason, products manufactured from these polymers are often subjected to surface treatments in order to change their physico-chemical surface properties, improving their wettability and work of adhesion (WA). Low-pressure radio-frequency (RF) plasma treatment was performed in order to improve the work of adhesion of PLA fibers. Different parameters, such as power, working-gas ratio (argon/oxygen) and treatment time, were varied to optimize the plasma conditions for modifying the PLA surface properties. With plasma treatment, a significant increase in the work of adhesion on the PLA fiber surface was observed. XPS analysis showed an increase in polar functional groups, and SEM and AFM images revealed a considerable increase in roughness.
Keywords: RF plasma, surface modification, PLA fabric, atomic force microscopy, nanotechnology
Procedia PDF Downloads 535
6116 A Critical Discourse Analysis of Jamaican and Trinidadian News Articles about D/Deafness
Authors: Melissa Angus Baboun
Abstract:
Utilizing a Critical Discourse Analysis (CDA) methodology and a theoretical framework based on disability studies, this study examined how Jamaican and Trinidadian newspapers discussed issues relating to the Deaf community. The term deaf was entered into the search engines of the online websites of the Jamaica Observer and the Trinidad & Tobago Guardian. All 27 articles that contained the term deaf in their content and were written between August 1, 2017 and November 15, 2017 were chosen for the study. The data analysis was divided into three steps: (1) listing and analyzing instances of metaphorical deafness (e.g. fall on deaf ears), (2) categorizing the content of the articles according to models of disability discourse (the medical, socio-cultural, and supercrip models of disability narratives), and (3) analyzing any additional data found. A total of 42% of the articles pulled for this study did not deal with the Deaf community in any capacity, but rather contained idiomatic expressions that use deafness as a metaphor for a non-physical, undesirable trait; the most common such expression was fall on deaf ears. Regarding the models of disability discourse, eight articles were found to follow the socio-cultural model, two the medical model, and two the supercrip model. The additional data found in these articles include two instances of the term deaf and mute, an overwhelming use of the lowercase d in the term deaf, and the misuse of the term translator (to mean interpreter).
Keywords: deafness, disability, news coverage, Caribbean newspapers
Procedia PDF Downloads 232
6115 Molecular Migration in Polyvinyl Acetate Matrix: Impact of Compatibility, Number of Migrants and Stress on Surface and Internal Microstructure
Authors: O. Squillace, R. L. Thompson
Abstract:
Migration of small molecules to, and across, the surface of polymer matrices is a little-studied problem with important industrial applications. Tackifiers in adhesives, flavors in foods and binding agents in paints all present situations where the function of a product depends on the ability of small molecules to migrate through a polymer matrix to achieve desired properties such as softness or dispersion of fillers, and to deliver an effect that is felt (or tasted) on a surface. It has been shown that the chemical and molecular structure, surface free energies, phase behavior, close environment and compatibility of the system influence the migrants’ motion. When differences in behavior are observed, such as whether or not segregation to the surface occurs, it is of crucial importance to identify and better understand the driving forces involved in the process of molecular migration. To this end, experiment is to be allied with theory in order to deliver a validated theoretical and computational toolkit to describe and predict these phenomena. The systems chosen for this study address the effect of polarity mismatch between the migrants and the polymer matrix, and that of a second migrant on the first. Polyvinyl acetate, a non-polar resin polymer, is used as the matrix to which more or less polar migrants (sorbitol, carvone, octanoic acid (OA), triacetin) are added. Contact angle measurements show a surface excess for sorbitol (polar) mixed with PVAc, as the surface energy is lowered compared to that of pure PVAc; this effect is increased upon the addition of carvone or triacetin (non-polar). Surface microstructures are also evidenced by atomic force microscopy (AFM).
Ion beam analysis (nuclear reaction analysis), supplemented by neutron reflectometry, can accurately characterize the self-organization of surfactants, oligomers and aromatic molecules in polymer films, in order to relate the macroscopic behavior to the length scales that are amenable to simulation. The nuclear reaction analysis (NRA) data for 20% deuterated OA show evidence of a surface excess which is enhanced after annealing. The addition of 10% triacetin, as a second migrant, results in the formation of an underlying layer enriched in triacetin below the surface excess of OA. The results show that molecules in polarity mismatch with the matrix tend to segregate to the surface, and that this is favored by the addition of a second migrant of the same polarity as the matrix. As the studies have so far been restricted to model supported films under static conditions, we also wish to address the more challenging conditions of materials under controlled stress or strain. To achieve this, a simple rig and PDMS cell have been designed to stretch the material to a defined strain and to probe these mechanical effects by ion beam analysis and atomic force microscopy. This will be a significant step towards exploring the influence of extensional strain on surface segregation and flavor release in cross-linked rubbers.
Keywords: polymers, surface segregation, thin films, molecular migration
Procedia PDF Downloads 132
6114 LACGC: Business Sustainability Research Model for Generations Consumption, Creation, and Implementation of Knowledge: Academic and Non-Academic
Authors: Satpreet Singh
Abstract:
This paper introduces the new LACGC model to sustain academic and non-academic businesses for future educational and organizational generations. The consumption of knowledge and the creation of new knowledge are a strength and focal interest of all academic and non-academic organizations. Implementing newly created knowledge sustains a business into the next generation, enabling growth without detriment. Existing models, such as the scholar-practitioner model and organizational knowledge creation models, focus specifically on either academic or non-academic settings, not both; the LACGC model can be used for both, at the domestic or international level. Researchers and scholars play a substantial role in finding literature and practice gaps in academic and non-academic disciplines. The LACGC model places no restriction on the number of recurrences, because the consumption, creation, and implementation of new ideas, disciplines, systems, and knowledge is a never-ending process and must continue from one generation to the next.
Keywords: academics, consumption, creation, generations, non-academics, research, sustainability
Procedia PDF Downloads 196
6113 Probing Neuron Mechanics with a Micropipette Force Sensor
Authors: Madeleine Anthonisen, M. Hussain Sangji, G. Monserratt Lopez-Ayon, Margaret Magdesian, Peter Grutter
Abstract:
Advances in micromanipulation techniques and real-time particle tracking with nanometer resolution have enabled biological force measurements at scales relevant to neuron mechanics. An approach to precisely control and maneuver neurite-tethered polystyrene beads is presented. Analogous to an atomic force microscope (AFM), this multi-purpose platform is a force sensor with image acquisition and manipulation capabilities. A mechanical probe, composed of a micropipette with its tip fixed to a functionalized bead, is used to initiate the formation of a neurite in a sample of rat hippocampal neurons while simultaneously measuring the tension in that neurite as the sample is pulled away from the beaded tip. With optical imaging methods, a force resolution of 12 pN is achieved. Moreover, the advantages of this technique over alternatives such as AFM, namely ease of manipulation, which ultimately allows higher-throughput investigation of the mechanical properties of neurons, are demonstrated.
Keywords: axonal growth, axonal guidance, force probe, pipette micromanipulation, neurite tension, neuron mechanics
Procedia PDF Downloads 365
6112 Soap Film Enneper Minimal Surface Model
Authors: Yee Hooi Min, Mohdnasir Abdul Hadi
Abstract:
A tensioned membrane structure in the form of an Enneper minimal surface can be considered a sustainable development for the green environment and technology; it can also support the effective use of energy and of the structure itself. A soap film in the form of an Enneper minimal surface model has been studied. The combination of shape and internal forces for the purpose of stiffness and strength is an important feature of a membrane surface. For this purpose, form-finding using soap film models has been carried out for Enneper minimal surface models with variables u=v=0.6 and u=v=1.0. These soap film models provide an alternative choice for structural engineers considering tensioned membrane structures in the form of an Enneper minimal surface for the building industry, and the form is expected to become an alternative building solution for designers to consider.
Keywords: Enneper, minimal surface, soap film, tensioned membrane structure
Procedia PDF Downloads 552
6111 AI-Powered Models for Real-Time Fraud Detection in Financial Transactions to Improve Financial Security
Authors: Shanshan Zhu, Mohammad Nasim
Abstract:
Financial fraud continues to be a major threat to financial institutions across the world, causing colossal monetary losses and undermining public trust. Fraud prevention techniques based on hard-coded rules have become ineffective against recently evolving fraud patterns. Against this background, the present study probes distinct methodologies that exploit emergent AI-driven techniques to strengthen fraud detection. We compare the performance of generative adversarial networks and graph neural networks with other popular techniques, such as gradient boosting, random forests, and neural networks, and recommend integrating these state-of-the-art models into one robust, flexible, and smart system for real-time anomaly and fraud detection. We designed synthetic data and then conducted pattern recognition and unsupervised and supervised learning analyses on the transaction data to identify suspicious activities. Using actual financial statistics, we compare the accuracy, speed, and adaptability of our model against conventional models. The results illustrate a strong need to integrate state-of-the-art, AI-driven fraud detection solutions into frameworks relevant to the financial domain, and underline the urgency with which banks and related financial institutions must implement these advanced technologies to maintain a high level of security.
Keywords: AI-driven fraud detection, financial security, machine learning, anomaly detection, real-time fraud detection
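The unsupervised anomaly-detection step described above can be sketched in minimal form with an isolation forest over synthetic transactions. The features (amount, hour of day), their distributions, and the contamination rate below are illustrative assumptions, not the paper's actual pipeline, which combines GANs, GNNs, and tree ensembles.

```python
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(42)
# Hypothetical transactions: [amount, hour-of-day].
# Most are modest daytime purchases; a few are large night-time transfers.
normal = np.column_stack([rng.gamma(2.0, 30.0, 500), rng.normal(14, 3, 500)])
fraud = np.column_stack([rng.gamma(2.0, 600.0, 10), rng.normal(3, 1, 10)])
X = np.vstack([normal, fraud])

clf = IsolationForest(contamination=0.02, random_state=0).fit(X)
labels = clf.predict(X)  # -1 = anomaly, 1 = normal
n_flagged = int((labels == -1).sum())
print(n_flagged)
```

A real-time system would score each incoming transaction with `clf.decision_function` and route low scores to supervised models or human review.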
Procedia PDF Downloads 40
6110 Modeling Biomass and Biodiversity across Environmental and Management Gradients in Temperate Grasslands with Deep Learning and Sentinel-1 and -2
Authors: Javier Muro, Anja Linstadter, Florian Manner, Lisa Schwarz, Stephan Wollauer, Paul Magdon, Gohar Ghazaryan, Olena Dubovyk
Abstract:
Monitoring the trade-off between biomass production and biodiversity in grasslands is critical to evaluate the effects of management practices across environmental gradients. New generations of remote sensing sensors and machine learning approaches can model grasslands’ characteristics with varying accuracies. However, studies often fail to cover a sufficiently broad range of environmental conditions, and evidence suggests that prediction models might be case-specific. In this study, biomass production and biodiversity indices (species richness and Fisher’s α) are modeled in 150 grassland plots at three sites across Germany. These sites represent a North-South gradient and are characterized by distinct soil types, topographic properties, climatic conditions, and management intensities. The predictors used are derived from Sentinel-1 and -2 and a set of topoedaphic variables. The transferability of the models is tested by training and validating at different sites. The performance of feed-forward deep neural networks (DNN) is compared to a random forest algorithm. While biomass predictions across gradients and sites were acceptable (R² = 0.5), predictions of biodiversity indices were poor (R² = 0.14). DNN showed higher generalization capacity than random forest when predicting biomass across gradients and sites (relative root mean squared error of 0.5 for DNN vs. 0.85 for random forest). DNN also achieved high performance when using the Sentinel-2 surface reflectance data rather than different combinations of spectral indices, Sentinel-1 data, or topoedaphic variables, simplifying dimensionality. This study demonstrates the necessity of training biomass and biodiversity models over a broad range of environmental conditions, and of ensuring spatial independence, to obtain realistic and transferable models in which plot-level information can be upscaled to the landscape scale.
Keywords: ecosystem services, grassland management, machine learning, remote sensing
Procedia PDF Downloads 218
6109 Statistical Comparison of Ensemble Based Storm Surge Forecasting Models
Authors: Amin Salighehdar, Ziwen Ye, Mingzhe Liu, Ionut Florescu, Alan F. Blumberg
Abstract:
Storm surge is an abnormal water level caused by a storm, and accurate prediction of storm surge is a challenging problem. Researchers have developed various ensemble modeling techniques to combine several individual forecasts into an overall, presumably better, forecast. Some simple ensemble modeling techniques exist in the literature: for instance, Model Output Statistics (MOS) and running mean-bias removal are widely used in the storm surge prediction domain. However, these methods have drawbacks; MOS, for example, is based on multiple linear regression and needs a long period of training data. To overcome the shortcomings of these simple methods, researchers have proposed more advanced ones. For instance, ENSURF (Ensemble SURge Forecast) is a multi-model application for sea level forecasting that creates a better forecast using a combination of several instances of Bayesian Model Averaging (BMA), and ensemble dressing methods are based on identifying the best member forecast and using it for prediction. Our contribution in this paper can be summarized as follows. First, we investigate whether ensemble models perform better than any single forecast; to do so we need to identify the single best forecast, and we present a methodology based on a simple Bayesian selection method to select it. Second, we present several new and simple ways to construct ensemble models, using correlation and standard deviation as weights when combining different forecast models. Third, we use these ensembles, alongside several existing models from the literature, to forecast storm surge levels. We then investigate whether developing a complex ensemble model is indeed needed; to this end, we use a simple average (one of the simplest and most widely used ensemble models) as a benchmark.
Predicting the peak surge level during a storm, as well as the precise time at which this peak occurs, is crucial, so we develop a statistical platform to compare the performance of the various ensemble methods. This statistical analysis is based on the root mean square error of the ensemble forecast during the testing period and on the magnitude and timing of the forecasted peak surge compared to the actual peak and its time. In this work, we analyze four hurricanes: hurricanes Irene and Lee in 2011, hurricane Sandy in 2012, and hurricane Joaquin in 2015. Since hurricane Irene developed at the end of August 2011 and hurricane Lee started just after Irene at the beginning of September 2011, in this study we consider them a single contiguous hurricane event. The data set used for this study was generated by the New York Harbor Observing and Prediction System (NYHOPS). We find that even the simplest possible way of creating an ensemble produces results superior to any single forecast. We also show that the ensemble models we propose generally perform better than the simple average ensemble technique.
Keywords: Bayesian learning, ensemble model, statistical analysis, storm surge prediction
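One of the simple weighting schemes the abstract mentions, weighting member forecasts by an error statistic and comparing against the simple-average benchmark, can be sketched as follows. The "truth" series and the member error levels below are synthetic stand-ins, not NYHOPS data; inverse error variance is used here as one plausible weighting choice.

```python
import numpy as np

rng = np.random.default_rng(1)
truth = np.sin(np.linspace(0, 6, 100))  # stand-in for observed surge levels

# Three hypothetical member forecasts with different error levels.
members = [truth + rng.normal(0, s, 100) for s in (0.1, 0.3, 0.5)]

# Weight each member by inverse error variance on a training window.
train = slice(0, 60)
w = np.array([1.0 / np.var(m[train] - truth[train]) for m in members])
w /= w.sum()

ens_weighted = sum(wi * m for wi, m in zip(w, members))
ens_simple = np.mean(members, axis=0)  # simple-average benchmark

def rmse(a, b):
    return float(np.sqrt(np.mean((a - b) ** 2)))

rmse_weighted = rmse(ens_weighted[60:], truth[60:])
rmse_simple = rmse(ens_simple[60:], truth[60:])
print(rmse_weighted, rmse_simple)
```

Because the weights downweight the noisier members, the weighted ensemble's test-period RMSE beats the plain average here; peak magnitude and timing errors would be scored separately on the forecast maxima.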
Procedia PDF Downloads 307
6108 Anti-Inflammatory, Analgesic and Antipyretic Activity of Terminalia arjuna Roxb. Extract in Animal Models
Authors: Linda Chularojmontri, Seewaboon Sireeratawong, Suvara Wattanapitayakul
Abstract:
Terminalia arjuna Roxb. (family Combretaceae) is commonly known as ‘Sa maw thet’ in Thai. The fruit is used in traditional medicine as a natural mild laxative, carminative and expectorant. Aim of the study: This research aims to study the anti-inflammatory, analgesic and antipyretic activities of Terminalia arjuna extract in animal models, in comparison to reference drugs. Materials and Methods: The anti-inflammatory study was conducted with two experimental animal models, namely ethyl phenylpropionate (EPP)-induced ear edema and carrageenan-induced paw edema. The analgesic study used two methods of pain induction, acetic acid-induced and heat-induced pain. In addition, the antipyretic study was performed using yeast-induced hyperthermia. Results: The results showed that oral administration of Terminalia arjuna extract produced an acute anti-inflammatory effect in carrageenan-induced paw edema. The extract showed analgesic activity against the acetic acid-induced writhing response and heat-induced pain, indicating a peripheral effect, through inhibition of the biosynthesis and/or release of some pain mediators, alongside some mechanism acting through the central nervous system. Moreover, at doses of 1000 and 1500 mg/kg body weight, the extract showed antipyretic activity, possibly due to the inhibition of prostaglandins. Conclusion: The findings of this study indicate that Terminalia arjuna extract possesses anti-inflammatory, analgesic and antipyretic activities in animals.
Keywords: analgesic activity, anti-inflammatory activity, antipyretic activity, Terminalia arjuna extract
Procedia PDF Downloads 262
6107 Utilizing Federated Learning for Accurate Prediction of COVID-19 from CT Scan Images
Authors: Jinil Patel, Sarthak Patel, Sarthak Thakkar, Deepti Saraswat
Abstract:
Recently, the COVID-19 outbreak spread across the world, leading the World Health Organization to classify it as a global pandemic. To save a patient’s life, COVID-19 has to be identified quickly, but using an AI (Artificial Intelligence) model to identify COVID-19 within the allotted time is challenging, and the RT-PCR test has been found inadequate in determining the COVID status of a patient. A computed tomography scan (CT scan) of the patient is a better alternative for determining whether the patient has COVID-19. It is challenging, though, to compile and store the data from various hospitals on a central server; federated learning helps resolve this problem. Certain deep learning models help classify COVID-19. This paper details work on several such models, VGG19, ResNet50, MobileNetV2, and Deep Learning Aggregation (DLA), while maintaining privacy with encryption.
Keywords: federated learning, COVID-19, CT scan, homomorphic encryption, ResNet50, VGG-19, MobileNetV2, DLA
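The federated idea described above, local training at each hospital followed by server-side averaging of parameters weighted by local sample counts (FedAvg-style), can be sketched with a toy logistic-regression client. The data, model, client counts, and round numbers are illustrative assumptions; the paper's encrypted deep CNNs are not reproduced here.

```python
import numpy as np

def local_update(w, X, y, lr=0.1, epochs=20):
    # One client's local training: plain gradient descent on logistic loss.
    for _ in range(epochs):
        p = 1 / (1 + np.exp(-X @ w))
        w = w - lr * X.T @ (p - y) / len(y)
    return w

rng = np.random.default_rng(7)
true_w = np.array([1.5, -2.0])
clients = []
for n in (80, 120, 200):  # three hypothetical hospitals, different data volumes
    X = rng.normal(size=(n, 2))
    y = (X @ true_w + rng.normal(0, 0.3, n) > 0).astype(float)
    clients.append((X, y))

w_global = np.zeros(2)
for _ in range(5):  # communication rounds
    updates = [local_update(w_global, X, y) for X, y in clients]
    counts = np.array([len(y) for _, y in clients], dtype=float)
    # Server aggregates: sample-count-weighted average of client weights.
    w_global = np.average(updates, axis=0, weights=counts)

acc = float(np.mean([
    ((1 / (1 + np.exp(-X @ w_global)) > 0.5) == y).mean() for X, y in clients
]))
print(round(acc, 3))
```

In the homomorphic-encryption setting the paper targets, the client updates would be encrypted before upload so the server averages ciphertexts without seeing raw weights.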
Procedia PDF Downloads 71
6106 Generation of High-Quality Synthetic CT Images from Cone Beam CT Images Using A.I. Based Generative Networks
Authors: Heeba A. Gurku
Abstract:
Introduction: Cone beam CT (CBCT) images play an integral part in proper patient positioning for cancer patients undergoing radiation therapy, but these images are low in quality. The purpose of this study is to generate high-quality synthetic CT images from CBCT using generative models. Material and Methods: This study utilized two datasets from The Cancer Imaging Archive (TCIA): 1) a lung cancer dataset of 20 patients (with full-view CBCT images) and 2) a pancreatic cancer dataset of 40 patients (only the 27 patients with limited-view images were included in the study). Cycle Generative Adversarial Networks (GAN) and its variant Attention Guided Generative Adversarial Networks (AGGAN) were used to generate the synthetic CTs. The models were evaluated visually and on four metrics, Structural Similarity Index Measure (SSIM), Peak Signal-to-Noise Ratio (PSNR), Mean Absolute Error (MAE) and Root Mean Square Error (RMSE), comparing the synthetic CT and original CT images. Results: For the pancreatic dataset with limited-view CBCT images, our study showed that with the Cycle GAN model, MAE improved from 12.57 to 8.49, RMSE from 20.94 to 15.29, and PSNR from 21.85 to 24.63, but structural similarity only marginally increased, from 0.78 to 0.79. Similar results were achieved with AGGAN, with no improvement over Cycle GAN. However, for the lung dataset with full-view CBCT images, Cycle GAN was able to reduce MAE significantly, from 89.44 to 15.11, and AGGAN reduced it to 19.77. Similarly, RMSE decreased from 92.68 to 23.50 with Cycle GAN and to 29.02 with AGGAN. SSIM and PSNR also improved significantly with Cycle GAN, from 0.17 to 0.59 and from 8.81 to 21.06 respectively, while with AGGAN SSIM increased to 0.52 and PSNR to 19.31. In both datasets, the GAN models were able to reduce artifacts and noise and delivered better resolution and contrast enhancement.
Conclusion and Recommendation: Both Cycle GAN and AGGAN were able to significantly reduce MAE and RMSE and to improve PSNR in both datasets. However, the full-view lung dataset showed more improvement in SSIM and overall image quality than the limited-view pancreatic dataset.
Keywords: CT images, CBCT images, cycle GAN, AGGAN
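Three of the four evaluation metrics used above (MAE, RMSE, PSNR) are straightforward to compute directly; a minimal sketch on a synthetic image pair is shown below. SSIM involves local luminance/contrast/structure statistics and is typically taken from a library such as scikit-image, so it is omitted here; the image values and noise level are invented for illustration.

```python
import numpy as np

def mae(a, b):
    # Mean Absolute Error: average per-pixel intensity difference.
    return float(np.mean(np.abs(a - b)))

def rmse(a, b):
    # Root Mean Square Error: penalizes large per-pixel differences more.
    return float(np.sqrt(np.mean((a - b) ** 2)))

def psnr(a, b, data_range=255.0):
    # Peak Signal-to-Noise Ratio in dB; higher is better.
    mse = np.mean((a - b) ** 2)
    return float(10 * np.log10(data_range ** 2 / mse))

# Synthetic "planning CT" and a noisy "synthetic CT" stand-in.
ct = np.full((64, 64), 100.0)
synthetic = ct + np.random.default_rng(0).normal(0, 5, (64, 64))

print(mae(ct, synthetic), rmse(ct, synthetic), psnr(ct, synthetic))
```

With zero-mean Gaussian noise of standard deviation 5, MAE comes out near 4, RMSE near 5, and PSNR near 34 dB, which illustrates why PSNR rises as MAE/RMSE fall.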
Procedia PDF Downloads 83
6105 Statistical Models and Time Series Forecasting on Crime Data in Nepal
Authors: Dila Ram Bhandari
Abstract:
Throughout the 20th century, new governments were created in which identities such as ethnicity, religion, language, caste, community, and tribe played a part in the development of constitutions and of the legal systems of victim and criminal justice. South Asian nations have recently been plagued by acute issues of extremism, poverty, environmental degradation, cybercrime, human rights violations, and crime against, and victimization of, both individuals and groups. A massive number of crimes take place every day, and these frequent crimes have made the lives of ordinary citizens restless. Crime is one of the major threats to society and to civilization, and a bone of contention that can create societal disturbance; old-style crime-solving practices are unable to live up to the requirements of the existing crime situation. Crime analysis is one of the most important activities of the majority of intelligence and law enforcement organizations all over the world. Unlike Central Asia or the Asia-Pacific region, South Asia lacks a regional coordination mechanism to facilitate criminal intelligence sharing and operational coordination related to organized crime, including illicit drug trafficking and money laundering. There have been numerous conversations in recent years about using data mining technology to combat crime and terrorism; for example, the Data Detective program from the software company Sentient uses data mining techniques to support the police (Sentient, 2017). The goals of this internship are to test several predictive model solutions and choose the most effective and promising one. First, extensive literature reviews on data mining, crime analysis, and crime data mining were conducted. Sentient offered a seven-year archive of crime statistics, which was aggregated daily to produce a univariate dataset; a daily incidence-type aggregation was also performed to produce a multivariate dataset. Each solution's forecast period lasted seven days.
The experiments were split into two main groups: statistical models and neural network models. For the crime data, neural networks fared better than statistical models. This study gives a general review of the applied statistical and neural network models; a comparative analysis of all the models on a comparable dataset provides a detailed picture of each model's performance on the available data and of its generalizability. The studies demonstrated that, in comparison to other models, Gated Recurrent Units (GRU) produced better predictions. The crime records for 2005-2019 were collected from Nepal Police Headquarters and analysed in R. In conclusion, a gated recurrent unit implementation could benefit the police in predicting crime; hence, time series analysis using GRU could be a prospective additional feature in Data Detective.
Keywords: time series analysis, forecasting, ARIMA, machine learning
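A single GRU step can be written out explicitly to show the gating mechanism the study relies on. The toy sketch below implements the standard update/reset-gate equations (in the Cho et al. 2014 convention, biases omitted; some libraries swap the roles of z and 1-z) over a synthetic univariate series; the weights are random, not a trained crime-forecasting model.

```python
import numpy as np

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

def gru_step(x, h, params):
    # One GRU step: update gate z decides how much old state to keep,
    # reset gate r decides how much old state feeds the candidate.
    Wz, Uz, Wr, Ur, Wh, Uh = params
    z = sigmoid(x @ Wz + h @ Uz)
    r = sigmoid(x @ Wr + h @ Ur)
    h_tilde = np.tanh(x @ Wh + (r * h) @ Uh)
    return z * h + (1 - z) * h_tilde

rng = np.random.default_rng(0)
d_in, d_h = 1, 8  # univariate daily counts, 8 hidden units
params = [rng.normal(0, 0.5, shape) for shape in
          [(d_in, d_h), (d_h, d_h)] * 3]  # Wz, Uz, Wr, Ur, Wh, Uh

# Run the cell over a stand-in daily series; a forecasting head would
# then map the final hidden state to the next day's count.
h = np.zeros(d_h)
series = np.sin(np.linspace(0, 10, 50)).reshape(-1, 1)
for x in series:
    h = gru_step(x, h, params)
print(h.shape)
```

Because the new state is a convex combination of the old state and a tanh candidate, the hidden values stay bounded in (-1, 1), which is part of why GRUs train stably on long daily series.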
Procedia PDF Downloads 164
6104 Focus-Latent Dirichlet Allocation for Aspect-Level Opinion Mining
Authors: Mohsen Farhadloo, Majid Farhadloo
Abstract:
Aspect-level opinion mining, which aims at discovering aspects (aspect identification) and their corresponding ratings (sentiment identification) from customer reviews, has increasingly attracted the attention of researchers and practitioners, as it provides valuable insights about products and services from the customer's point of view. Instead of addressing aspect identification and sentiment identification in two separate steps, it is possible to identify both simultaneously. In recent years many graphical models based on Latent Dirichlet Allocation (LDA) have been proposed to solve both aspect and sentiment identification in a single step. Although LDA models have been effective tools for the statistical analysis of document collections, they also have shortcomings in addressing some unique characteristics of opinion mining. Our goal in this paper is to address one of the limitations of topic models to date: that they fail to directly model the associations among topics. Indeed, in many text corpora it is natural to expect that subsets of the latent topics have higher probabilities. We propose a probabilistic graphical model, called focus-LDA, to better capture the associations among topics when applied to aspect-level opinion mining. Our experiments on real-life data sets demonstrate the improved effectiveness of the focus-LDA model in terms of the accuracy of the predictive distributions over held-out documents. Furthermore, we demonstrate qualitatively that the focus-LDA topic model provides a natural way of visualizing and exploring unstructured collections of textual data.
Keywords: aspect-level opinion mining, document modeling, Latent Dirichlet Allocation, LDA, sentiment analysis
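A minimal plain-LDA baseline of the kind focus-LDA builds on can be sketched with scikit-learn: documents become bag-of-words counts, and the fitted model yields a per-document topic distribution. The toy reviews and the two-topic count below are invented for illustration, not the paper's data or its focus-LDA model.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

# Toy reviews mixing two latent aspects (food vs. service).
docs = [
    "the pasta was delicious and the sauce rich",
    "delicious food great sauce tasty pasta",
    "the waiter was rude and service slow",
    "slow service rude staff long wait",
    "tasty food but slow service",
]

X = CountVectorizer(stop_words="english").fit_transform(docs)
lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(X)
doc_topic = lda.transform(X)  # each row is a document's topic distribution
print(doc_topic.shape)
```

Each row of `doc_topic` sums to 1, so reviews straddling both aspects (like the last one) show mixed weights; focus-LDA's contribution is to model the associations among such topics rather than treating them as independent.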
Procedia PDF Downloads 93
6103 Predictive Analytics Algorithms: Mitigating Elementary School Drop Out Rates
Authors: Bongs Lainjo
Abstract:
Educational institutions and authorities mandated to run education systems in various countries need to implement a curriculum that considers the possibility and existence of elementary school dropouts. This research focuses on elementary school dropout rates and the ability to replicate various predictive models, carried out globally, on selected elementary schools. The study was carried out by comparing classical case studies in Africa, North America, South America, Asia and Europe. Some of the reasons put forward for children dropping out include the notion that one can be successful in life without necessarily going through the education process. This mentality is coupled with a tough curriculum that does not take care of all students. The system has led to poor school attendance (truancy), which continuously leads to dropouts. In this study, the focus is on developing a model that can systematically be implemented by school administrations to prevent possible dropout scenarios. At the elementary level, especially in the lower grades, a child's perception of education can be easily changed so that they focus on the better future that their parents desire. To deal effectively with the elementary school dropout problem, the strategies that are put in place need to be studied, and predictive models installed in every educational system with a view to helping prevent an imminent school dropout just before it happens. In a competency-based curriculum, which most advanced nations are trying to implement, education systems have wholesome ideas of learning that reduce the rate of dropout. Keywords: elementary school, predictive models, machine learning, risk factors, data mining, classifiers, dropout rates, education system, competency-based curriculum
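The abstract names no specific classifier, so purely as a hedged illustration of how a dropout-risk model might be fit, the sketch below trains a logistic regression by gradient descent over two hypothetical risk factors (absences and grade average); the data, features and labels are all invented.

```python
import numpy as np

def train_logreg(X, y, lr=0.1, epochs=500):
    """Fit logistic regression by batch gradient descent on the log-loss."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        p = 1 / (1 + np.exp(-(X @ w + b)))   # predicted dropout probability
        grad_w = X.T @ (p - y) / len(y)      # gradient of mean log-loss w.r.t. w
        grad_b = np.mean(p - y)
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

# Toy data: high absences and low grades correlate with dropout (label 1).
X = np.array([[30, 2.1], [25, 2.5], [28, 2.0],
              [3, 3.8], [5, 3.5], [2, 3.9]], dtype=float)
X = (X - X.mean(axis=0)) / X.std(axis=0)     # standardize features
y = np.array([1, 1, 1, 0, 0, 0], dtype=float)
w, b = train_logreg(X, y)
risk = 1 / (1 + np.exp(-(X @ w + b)))
print(np.round(risk, 2))
```

A school administration would threshold these risk scores to flag students for early intervention; real features and calibration would, of course, come from the local education system's own records.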
Procedia PDF Downloads 174
6102 3D Linear and Cyclic Homo-Peptide Crystals Forged by Supramolecular Swelling Self-Assembly
Authors: Wenliang Song, Yu Zhang, Hua Jin, Il Kim
Abstract:
The self-assembly of polypeptides (PPs) into well-defined structures at different length scales is both biomimetically relevant and fundamentally interesting. Although there are various reports of nanostructures fabricated by the self-assembly of various PPs, directed self-assembly of PPs into three-dimensional (3D) hierarchical structures has proven difficult, despite their importance for biological applications. Herein, an efficient method has been developed through living polymerization of phenylalanine N-carboxyanhydride (NCA) towards linear and cyclic polyphenylalanine, and the newly invented swelling methodology can form diverse hierarchical polypeptide crystals. The solvent-dependent self-assembly behaviors of these homopolymers were characterized by high-resolution imaging tools such as atomic force microscopy, transmission electron microscopy and scanning electron microscopy. The linear and cyclic polypeptides formed 3D hierarchical nanoshapes, such as spheres, cubes, stratiform structures and hexagonal stars, in different solvents. Notably, a crystalline packing model is proposed to explain the formation of the 3D nanostructures based on the various diffraction patterns, offering insight into their dissimilar shape evolution during the self-assembly process. Keywords: self-assembly, polypeptide, bio-polymer, crystalline polymer
Procedia PDF Downloads 238
6101 Patient Care Needs Assessment: An Evidence-Based Process to Inform Quality Care and Decision Making
Authors: Wynne De Jong, Robert Miller, Ross Riggs
Abstract:
Beyond the number of nurses providing care for patients, having nurses with the right skills, experience and education is essential to ensure the best possible outcomes for patients. Research studies continue to link nurse staffing and skill mix with nurse-sensitive patient outcomes; numerous studies clearly show that superior patient outcomes are associated with higher levels of regulated staff. Due to the limited number of tools and processes available to assist with staffing models of care, nurse leaders face an ongoing challenge to ensure their staffing models of care best suit their patient population. In 2009, several hospitals in Ontario, Canada participated in a research study to develop and evaluate a toolkit for Registered Nurse/Registered Practical Nurse (RN/RPN) staff-mix decision-making, based on the College of Nurses of Ontario practice standards for the utilization of RNs and RPNs. This paper will highlight how an organization has further developed the Patient Care Needs Assessment (PCNA) questionnaire, a major component of the toolkit. Moreover, it will demonstrate how the organization has utilized the information from the PCNA to clearly identify patient and family care needs, thus providing evidence-based results to assist leaders with matching the best staffing skill mix to their patients. Keywords: nurse staffing models of care, skill mix, nursing health human resources, patient safety
Procedia PDF Downloads 314
6100 Revisionism in Literature: Deconstructing Patriarchal Ideals in Margaret Atwood's The Penelopiad
Authors: Essam Abdelhamid Hegazy
Abstract:
This paper aims to read Margaret Atwood's The Penelopiad (2005) via a revisionist and deconstructive approach. This novel is a postmodernist exploration of the grand-narrative myth The Odyssey (800 BC) by Homer, who portrayed the heroic warrior and the faithful wife as the epitome of perfect male and female models, examples whom all must follow and mimic. In Atwood's narrative, the same two hero models are two great tricksters who are willing to perform any sort of obnoxious act to achieve their goals. This research examines how Atwood synthesized the change in the characters' narratives, leading to the humanization of the perfect hero and the ideal wife. The researcher has used a multidisciplinary approach in which feminist, revisionist and deconstructive theories were implemented to identify new interpretations of the myths that center the experiences and perspectives of women. The research finds that the revisionist approach was applied by giving the victimized and the voiceless an opportunity to speak out and retaliate against their persecution. Keywords: Margaret Atwood, patriarchal, Penelopiad, revisionism
Procedia PDF Downloads 78
6099 The Principle of Methodological Rationality and Security of Organisations
Authors: Jan Franciszek Jacko
Abstract:
This investigation presents the principle of methodological rationality of decision making and discusses the impact of methodologically rational or irrational decisions by an organisation's members on its security. The study formulates and partially justifies some research hypotheses regarding this impact. A thought experiment is used, following Max Weber's ideal-types method. Two idealised situations ("models") are compared: Model A, where all decision-makers follow methodologically rational decision-making procedures, and Model B, in which these agents follow methodologically irrational decision-making practices. Analysing and comparing the two models allows the formulation of some research hypotheses regarding the impact of methodologically rational and irrational attitudes of members of an organisation on its security. In addition to this method, phenomenological analyses of rationality and irrationality are applied. Keywords: methodological rationality, rational decisions, security of organisations, philosophy of economics
Procedia PDF Downloads 138
6098 Identification and Prioritisation of Students Requiring Literacy Intervention and Subsequent Communication with Key Stakeholders
Authors: Emilie Zimet
Abstract:
During networking and NCCD moderation meetings, best practices for identifying students who require Literacy Intervention are often discussed. Once these students are identified, consideration is given to the most effective process for prioritising those with the greatest need for Literacy Support, allocating resources, tracking intervention effectiveness and communicating with teachers, external providers and parents. Through a workshop, the group will investigate best practices for identifying students who require literacy support, along with strategies to communicate and track their progress. In groups, participants will examine what they do in their own settings and then compare with other models, including the researcher's model, to decide the most effective path to identification and communication. Participants will complete a worksheet at the beginning of the session to consider their current approaches in depth. The participants will be asked to critically analyse their own identification processes for Literacy Intervention, ensuring students are not overlooked if they fall into the borderline category. A cut-off for students to access intervention will be considered, so as not to place strain on already stretched resources, along with the most effective allocation of those resources. Furthermore, communicating learning needs and differentiation strategies to staff is paramount to the success of an intervention, and participants will look at the frequency of communication used to share such strategies and updates. At the end of the session, the group will look at creating or evolving models that allow for best practice in the identification and communication of Literacy Interventions. The proposed outcome of this research is to develop a model for identifying students who require Literacy Intervention that incorporates the allocation of resources and communication to key stakeholders.
This will be done by pooling information and discussing the variety of models used in the participants' school settings. Keywords: identification, student selection, communication, special education, school policy, planning for intervention
Procedia PDF Downloads 45
6097 Assessment of Chemical and Physical Properties of Surface Water Resources in Flood Affected Area
Authors: Siti Hajar Ya’acob, Nor Sayzwani Sukri, Farah Khaliz Kedri, Rozidaini Mohd Ghazi, Nik Raihan Nik Yusoff, Aweng A/L Eh Rak
Abstract:
The flood event that occurred in mid-December 2014 on the East Coast of Peninsular Malaysia drew attention from the public nationwide. Apart from the loss of and damage to properties and belongings, the massive flood introduced environmental disturbances to surface water resources in the affected area. A study was conducted to measure the physical and chemical composition of the Galas River and Pergau River in order to identify the flood's impact on environmental deterioration in the surrounding area. The samples collected were analyzed in-situ using a YSI portable instrument and also in the laboratory, where acid digestion and heavy metal analysis were performed using Atomic Absorption Spectroscopy (AAS). Results showed that the ranges of temperature (°C), DO (mg/L), EC (µS/cm), TDS (mg/L), turbidity (NTU), pH and salinity were 25.05-26.65, 1.51-5.85, 0.032-0.054, 0.022-0.035, 23.2-76.4, 3.46-7.31 and 0.01-0.02, respectively. The results from this study could be used as a primary database to evaluate the water quality status of the respective rivers after the massive flood. Keywords: flood, river, heavy metals, AAS
Procedia PDF Downloads 379
6096 Special Case of Trip Distribution Model and Its Use for Estimation of Detailed Transport Demand in the Czech Republic
Authors: Jiri Dufek
Abstract:
The national transport model of the Czech Republic has been modified in detail to obtain travel demand at the municipality level (cities and villages with over 300 inhabitants). As the technique for this detailed modelling, a three-dimensional procedure for calibrating gravity models was used. Besides zone production and attraction, which are usual in gravity models, an additional parameter for trip distribution was introduced; it is usually called the "third dimension". In the model, this parameter is the demand between regions. The distribution procedure involved calculating the appropriate skim matrices and multiplying them by three coefficients obtained by iterative balancing of production, attraction and the third dimension. This type of trip distribution was processed in R, and the results were used in the Czech Republic transport model, created in PTV Vision. This process generated more precise results at the local level of the model (towns and villages). Keywords: trip distribution, three dimension, transport model, municipalities
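The iterative balancing the abstract describes can be illustrated, for the classic two-constraint case, by a small Furness procedure; the authors' third scaling step (matching inter-regional demand) would slot in as one more multiplication per iteration. The zone totals, skim matrix and deterrence function below are invented for illustration.

```python
import numpy as np

def furness(skim, production, attraction, iters=50):
    """Doubly-constrained gravity trip distribution via iterative balancing.

    The paper's 'third dimension' (demand between regions) would add one
    further scaling step per iteration; only the two classic ones are shown.
    """
    f = 1.0 / skim                                # simple deterrence function (assumption)
    trips = np.outer(production, attraction) * f  # unbalanced seed matrix
    for _ in range(iters):
        trips *= (production / trips.sum(axis=1))[:, None]   # match zone productions
        trips *= (attraction / trips.sum(axis=0))[None, :]   # match zone attractions
    return trips

production = np.array([100.0, 200.0, 150.0])   # trips produced per zone
attraction = np.array([180.0, 120.0, 150.0])   # trips attracted per zone
skim = np.array([[1.0, 2.0, 3.0],
                 [2.0, 1.0, 2.0],
                 [3.0, 2.0, 1.0]])              # travel times between zones
trips = furness(skim, production, attraction)
print(np.round(trips, 1))
```

After convergence, the row sums reproduce the productions and the column sums the attractions, which is exactly the property the balancing coefficients are iterated to achieve.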
Procedia PDF Downloads 125
6095 Data-driven Decision-Making in Digital Entrepreneurship
Authors: Abeba Nigussie Turi, Xiangming Samuel Li
Abstract:
Data-driven business models are more typical for established businesses than for early-stage startups striving to penetrate a market. This paper provides an extensive discussion of the principles of data analytics for early-stage digital entrepreneurial businesses. We develop a data-driven decision-making (DDDM) framework that applies to startups prone to multifaceted barriers such as poor data access and technical and financial constraints, to name a few. The startup DDDM framework proposed in this paper is novel in that it encompasses startup data analytics enablers and metrics aligned with startups' business models, ranging from customer-centric product development to servitization, which is the future of modern digital entrepreneurship. Keywords: startup data analytics, data-driven decision-making, data acquisition, data generation, digital entrepreneurship
Procedia PDF Downloads 326
6094 Evaluation of Pollution in Underground Water from ODO-NLA and OGIJO Metropolis Industrial Areas in Ikorodu
Authors: Zaccheaus Olasupo Apotiola
Abstract:
This study evaluates the level of pollution in underground water from the Ogijo and Odo-nla areas in Ikorodu, Lagos State. Water samples were collected around various industries and transported in ice packs to the laboratory. Temperature and pH were determined on site, physicochemical parameters and total plate count were determined using standard methods, while heavy metal concentrations were determined using the Atomic Absorption Spectrophotometry method. Temperature was observed in the range 20-28 °C and pH in the range 5.64-6.91, and the samples were significantly different (P < 0.05) from one another. The chloride content was observed in the range 70.92-163.10 mg/L; there was no significant difference (P > 0.05) between samples 40 GAJ and ISUP, but there were significant differences (P < 0.05) between the other samples. The acidity value varied from 11.0 to 34.5 mg/L; the samples had no alkalinity. The total plate count was found to be 20-125 cfu/mL. Arsenic, lead, cadmium and mercury concentrations ranged between 0.03-0.09, 0.04-0.11, 0.00-0.00 and 0.00-0.00 mg/L, respectively. There were significant differences (P < 0.05) between all samples except samples 4OGA, 5OGAJ and 3SUTN, which were not significantly different (P > 0.05). The results revealed that none of the samples is safe for human consumption, as the levels of arsenic and lead are above the maximum value of 0.01 mg/L recommended by NIS 554 and the WHO. Keywords: arsenic, cadmium, lead, mercury, WHO
Procedia PDF Downloads 518
6093 Native Language Identification with Cross-Corpus Evaluation Using Social Media Data: ’Reddit’
Authors: Yasmeen Bassas, Sandra Kuebler, Allen Riddell
Abstract:
Native language identification (NLI) is one of the growing subfields in natural language processing (NLP). The task of NLI is mainly concerned with predicting the native language of an author's writing in a second language. In this paper, we investigate the performance of two types of features, content-based features vs. content-independent features, when they are evaluated on a different corpus (using social media data from Reddit). In this NLI task, the predefined models are trained on one corpus (TOEFL), and the trained models are then evaluated on different data from an external corpus (Reddit). Three classifiers are used in this task: the baseline, a linear SVM, and logistic regression. Results show that content-based features are more accurate and robust than content-independent ones when tested both within the corpus and across corpora. Keywords: NLI, NLP, content-based features, content-independent features, social media corpus, ML
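The cross-corpus setup above (train on TOEFL, evaluate on Reddit) can be sketched in miniature with content-independent features, i.e. function-word frequencies that carry no topic information. The toy sentences, the function-word list, and the nearest-centroid classifier standing in for the paper's SVM/logistic regression are all illustrative assumptions.

```python
from collections import Counter

# Content-independent features: relative frequencies of function words.
FUNCTION_WORDS = ["the", "a", "of", "is", "very", "much"]

def featurize(text):
    counts = Counter(text.lower().split())
    total = max(sum(counts.values()), 1)
    return [counts[w] / total for w in FUNCTION_WORDS]

def nearest_centroid_train(samples):
    """Average the feature vectors of each L1 class into a centroid."""
    centroids = {}
    for label, texts in samples.items():
        vecs = [featurize(t) for t in texts]
        centroids[label] = [sum(col) / len(vecs) for col in zip(*vecs)]
    return centroids

def predict(centroids, text):
    v = featurize(text)
    dist = lambda c: sum((a - b) ** 2 for a, b in zip(v, c))
    return min(centroids, key=lambda lbl: dist(centroids[lbl]))

train = {  # invented "TOEFL-like" training corpus, two L1 groups
    "L1_A": ["is very very good the the food", "the the weather is very nice"],
    "L1_B": ["a movie of much interest", "much of a good time a day"],
}
model = nearest_centroid_train(train)
# Evaluate on an out-of-corpus, "Reddit-like" sentence about a new topic:
print(predict(model, "the view is very very pretty"))
```

Because the features ignore content words entirely, the prediction transfers across topic domains; content-based features (full word counts) would be built the same way but over the whole vocabulary.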
Procedia PDF Downloads 136
6092 Temperature Control Improvement of Membrane Reactor
Authors: Pornsiri Kaewpradit, Chalisa Pourneaw
Abstract:
The improvement of temperature control for a membrane reactor with an exothermic and reversible esterification reaction is studied in this work. It is well known that a batch membrane reactor requires different control strategies from a continuous one due to the fact that it is operated dynamically. Because of the effect of the operating temperature, a suitable control scheme has to be designed based on a reliable predictive model to achieve the desired objective. In this study, an optimization framework was first formulated in order to determine an optimal temperature trajectory for maximizing the desired product. In the model predictive control scheme, a set of predictive models was developed corresponding to the possible operating points of the system. The multiple predictive control moves are then calculated on-line using the models corresponding to the current operating point. The simulation results clearly show that temperature control is improved compared to the performance obtained with a conventional predictive controller. Further robustness tests were also investigated in this study. Keywords: model predictive control, batch reactor, temperature control, membrane reactor
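The multiple-model idea described above, a bank of predictive models selected by operating point, can be sketched in its simplest form: two linear temperature models, a regime switch, and a one-step-ahead control move chosen by minimizing predicted error. All coefficients, regimes and setpoints here are illustrative assumptions, not the paper's reactor model.

```python
# Bank of linear models, one per operating regime: T[k+1] = a*T[k] + b*u[k].
MODELS = {
    "low":  (0.90, 0.50),   # assumed dynamics below 60 °C
    "high": (0.95, 0.30),   # assumed dynamics at 60 °C and above
}

def select_model(T):
    """Pick the predictive model matching the current operating point."""
    return MODELS["low"] if T < 60.0 else MODELS["high"]

def mpc_move(T, setpoint, u_grid):
    """Choose the input whose one-step prediction lands closest to the setpoint."""
    a, b = select_model(T)
    return min(u_grid, key=lambda u: abs(a * T + b * u - setpoint))

# Simulate tracking a 70 °C setpoint from 25 °C with a quantized heater input.
u_grid = [u * 5.0 for u in range(21)]   # candidate inputs 0, 5, ..., 100
T, setpoint = 25.0, 70.0
for _ in range(30):
    u = mpc_move(T, setpoint, u_grid)
    a, b = select_model(T)
    T = a * T + b * u                   # plant assumed to match the selected model
print(round(T, 1))
```

A full MPC would optimize a whole sequence of moves over a prediction horizon rather than a single step, but the regime-dependent model selection is the mechanism the abstract emphasizes.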
Procedia PDF Downloads 466
6091 Hybrid Model: An Integration of Machine Learning with Traditional Scorecards
Authors: Golnush Masghati-Amoli, Paul Chin
Abstract:
Over recent years, with rapid increases in data availability and computing power, Machine Learning (ML) techniques have been called on in a range of industries for their strong predictive capability. However, the use of Machine Learning in commercial banking has been limited due to a particular challenge imposed by numerous regulations that require lenders to be able to explain their analytic models, not only to regulators but often to consumers. In other words, although Machine Learning techniques enable better prediction with a higher level of accuracy, they are adopted less frequently in commercial banking than in other industries, especially for scoring purposes. This is because Machine Learning techniques are often considered a black box and fail to provide information on why a certain risk score is given to a customer. In order to bridge this gap between the explainability and performance of Machine Learning techniques, a Hybrid Model has been developed at Dun and Bradstreet that blends Machine Learning algorithms with traditional approaches such as scorecards. The Hybrid Model maximizes the efficiency of traditional scorecards by merging their practical benefits, such as explainability and the ability to input domain knowledge, with the deep insights of Machine Learning techniques, which can uncover patterns that scorecard approaches cannot. First, through the development of Machine Learning models, engineered features, latent variables and feature interactions that demonstrate high information value in the prediction of customer risk are identified. Then, these features are employed to introduce observed non-linear relationships between the explanatory and dependent variables into traditional scorecards.
Moreover, instead of directly computing the Weight of Evidence (WoE) from good and bad data points, the Hybrid Model tries to match the score distribution generated by a Machine Learning algorithm, which ends up providing an estimate of the WoE for each bin. This capability helps to build powerful scorecards for sparse cases that cannot be handled with traditional approaches. The proposed Hybrid Model is tested on different portfolios where a significant gap is observed between the performance of traditional scorecards and Machine Learning models. The results of the analysis show that the Hybrid Model can improve the performance of traditional scorecards by introducing non-linear relationships between explanatory and target variables from Machine Learning models into traditional scorecards. It is also observed that in some scenarios the Hybrid Model can be almost as predictive as the Machine Learning techniques while being as transparent as traditional scorecards. Therefore, it is concluded that, with the use of the Hybrid Model, Machine Learning algorithms can be used in the commercial banking industry without concerns about the difficulty of explaining the models for regulatory purposes. Keywords: machine learning algorithms, scorecard, commercial banking, consumer risk, feature engineering
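For reference, the classic Weight-of-Evidence that the Hybrid Model replaces with a distribution-matching estimate is computed directly from good/bad counts per score bin, as sketched below; the bin counts are invented for illustration.

```python
import math

def woe_by_bin(goods, bads):
    """Classic scorecard WoE per bin:
    WoE_i = ln( (good_i / total_goods) / (bad_i / total_bads) )."""
    total_goods, total_bads = sum(goods), sum(bads)
    return [math.log((g / total_goods) / (b / total_bads))
            for g, b in zip(goods, bads)]

goods = [400, 300, 200, 100]   # good accounts per risk bin (illustrative)
bads  = [ 20,  40,  60,  80]   # bad accounts per risk bin (illustrative)
woe = woe_by_bin(goods, bads)
print([round(w, 3) for w in woe])  # positive = safer than average, negative = riskier
```

When a bin has few or zero bad cases, this direct ratio becomes unstable or undefined, which is exactly the sparse-bin situation where estimating WoE from an ML model's score distribution, as the abstract describes, is useful.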
Procedia PDF Downloads 132