Search results for: computer-based testing
1587 Electrokinetic Application for the Improvement of Soft Clays
Authors: Abiola Ayopo Abiodun, Zalihe Nalbantoglu
Abstract:
The electrokinetic application (EKA), a relatively modern chemical treatment, has potential for in-situ ground improvement in an open field or under existing structures. It utilizes a low electrical gradient to transport electrolytic chemical ions between bespoke electrodes inserted in fine-grained, low-permeability soft soils. The paper investigates the efficacy of the EKA as a mitigation technique for soft clay beds. The laboratory model of the EKA comprises a rectangular plexiglass test tank, electrolyte compartments, geosynthetic electrodes and a direct electric current supply. Within this setup, the EK effects resulting from the exchange of ions between the anolyte (anodic) and catholyte (cathodic) ends through the tested soil were examined by basic laboratory testing methods. The treated soft soil properties were investigated as a function of the anode-to-cathode distance and curing period. The test results showed measurable changes in the physical and engineering properties of the treated soft soils. The significant changes in the physicochemical and electrical properties suggest that these changes can be utilized as a monitoring technique to evaluate the improvement in the engineering properties of EK-treated soft clay soils.
Keywords: electrokinetic, electrolytes, exchange ions, geosynthetic electrodes, soft soils
Procedia PDF Downloads 314
1586 Isothermal Solid-Phase Amplification System for Detection of Yersinia pestis
Authors: Olena Mayboroda, Angel Gonzalez Benito, Jonathan Sabate Del Rio, Marketa Svobodova, Sandra Julich, Herbert Tomaso, Ciara K. O'Sullivan, Ioanis Katakis
Abstract:
DNA amplification is required for most molecular diagnostic applications, but conventional PCR has disadvantages for field testing. Isothermal amplification techniques are being developed to respond to this problem. One of them is Recombinase Polymerase Amplification (RPA), which operates under isothermal conditions without sacrificing specificity and sensitivity in easy-to-use formats. In this work, RPA was used for the optical detection of solid-phase amplification of the potential biowarfare agent Yersinia pestis. Thiolated forward primers were immobilized on the surface of maleimide-activated microtitre plates for the quantitative detection of synthetic and genomic DNA, with elongation occurring only in the presence of the specific template DNA and solution-phase reverse primers. Quantitative detection was achieved via the use of biotinylated reverse primers and post-amplification addition of streptavidin-HRP conjugate. The overall time of amplification and detection was less than 1 hour at a constant temperature of 37 °C. Single-stranded and double-stranded DNA sequences were detected, achieving detection limits of 4.04 × 10⁻¹³ M and 3.14 × 10⁻¹⁶ M, respectively. The system demonstrated high specificity with negligible responses to non-specific targets.
Keywords: recombinase polymerase amplification, Yersinia pestis, solid-phase detection, ELONA
Procedia PDF Downloads 303
1585 A Follow-Up Study of Bachelor of Science Graduates in Applied Statistics from Suan Sunandha Rajabhat University during the 1999-2012 Academic Years
Authors: Somruedee Pongsena
Abstract:
The purpose of this study is to follow up on graduates of the Bachelor of Science in Applied Statistics programme at Suan Sunandha Rajabhat University (SSRU) during the 1999-2012 academic years and to provide a fundamental guideline for developing the current curriculum according to the Thai Qualifications Framework for Higher Education (TQF: HEd). The sample of 75 graduates was collected by interview and online questionnaire. The content covered five areas: ethics and morals, knowledge, cognitive skills, interpersonal skills and responsibility, numerical analysis, and communication and information technology skills. Data were analyzed using statistical methods such as percentiles, means, standard deviations, t-tests, and F-tests. The findings showed that most respondents were female and younger than 26 years old. The majority of graduates had income in the range of 10,001-20,000 Baht and 2-5 years of work experience. In addition, the overall opinion on applying the knowledge received to work was at the 'agree' level (mean = 3.97, standard deviation = 0.40). In terms of opinion differences, hypothesis testing indicated that only gender produced a significantly different opinion at the 0.05 level.
Keywords: follow-up, graduates, knowledge, opinion, work performance
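As an aside for readers unfamiliar with the reported hypothesis test, a minimal Python sketch of an independent-samples t-test is given below; the scores and group sizes are invented placeholders, not the survey data.

```python
import numpy as np
from scipy import stats

# Hypothetical opinion scores (1-5 Likert scale) for two gender groups;
# these values are illustrative only, not the study's data.
male_scores = np.array([3.8, 4.1, 3.5, 4.0, 3.9, 4.2, 3.7])
female_scores = np.array([4.2, 4.4, 4.0, 4.5, 4.1, 4.3, 4.6])

# Independent-samples t-test (Welch's version, no equal-variance assumption)
t_stat, p_value = stats.ttest_ind(male_scores, female_scores, equal_var=False)

print(f"t = {t_stat:.3f}, p = {p_value:.4f}")
if p_value < 0.05:
    print("Opinion differs significantly by gender at the 0.05 level.")
else:
    print("No significant difference in opinion by gender at the 0.05 level.")
```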
Procedia PDF Downloads 211
1584 A Computational Fluid Dynamics Study of Turbulence Flow and Parameterization of an Aerofoil
Authors: Mohamed Z. M. Duwahir, Shian Gao
Abstract:
The main objective of this project was to introduce and test a new scheme for the parameterization of subsonic aerofoils, using a function called the Shape Function. Python programming was used to create an interactive environment for aerofoil geometry generation using the NACA and Shape Function methodologies. Two aerofoils, NACA 0012 and NACA 1412, were generated using this function. The accuracy of the Shape Function scheme was tested by least-squares fitting in Python and by CFD modelling of the aerofoils in Fluent. NACA 0012 (a symmetrical aerofoil) was better approximated by the Shape Function than NACA 1412 (a cambered aerofoil). The second part of the project involved comparing two turbulence models, k-ε and Spalart-Allmaras (SA), in Fluent by modelling the aerofoils NACA 0012 and NACA 1412 at a Reynolds number of 3 × 10⁶. It was shown that SA modelling is better for aerodynamic purposes. The computed coefficient of lift (Cl) and coefficient of drag (Cd) were compared with empirical wind tunnel data over a range of angles of attack (AOA). As a further step, the project involved drawing and meshing 3D wings in Gambit. The 3D wing flow was solved and compared with 2D aerofoil section results and wind tunnel data.
Keywords: CFD simulation, shape function, turbulent modelling, aerofoil
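For context, a minimal Python sketch of standard NACA 4-digit coordinate generation (the baseline the Shape Function is compared against) is shown below; it uses the published NACA equations, not the authors' Shape Function, whose form is defined in the paper.

```python
import numpy as np

def naca4(code="0012", n=100):
    """Generate upper/lower surface coordinates of a NACA 4-digit aerofoil
    (unit chord) using the standard NACA thickness and camber equations."""
    m = int(code[0]) / 100.0      # maximum camber
    p = int(code[1]) / 10.0       # location of maximum camber
    t = int(code[2:]) / 100.0     # maximum thickness
    x = 0.5 * (1 - np.cos(np.linspace(0, np.pi, n)))   # cosine spacing

    # Thickness distribution
    yt = 5 * t * (0.2969 * np.sqrt(x) - 0.1260 * x - 0.3516 * x**2
                  + 0.2843 * x**3 - 0.1015 * x**4)

    if m == 0:                    # symmetric section, e.g. NACA 0012
        yc = np.zeros_like(x)
        dyc = np.zeros_like(x)
    else:                         # cambered section, e.g. NACA 1412
        yc = np.where(x < p,
                      m / p**2 * (2 * p * x - x**2),
                      m / (1 - p)**2 * ((1 - 2 * p) + 2 * p * x - x**2))
        dyc = np.where(x < p,
                       2 * m / p**2 * (p - x),
                       2 * m / (1 - p)**2 * (p - x))

    theta = np.arctan(dyc)
    xu, yu = x - yt * np.sin(theta), yc + yt * np.cos(theta)  # upper surface
    xl, yl = x + yt * np.sin(theta), yc - yt * np.cos(theta)  # lower surface
    return (xu, yu), (xl, yl)

upper, lower = naca4("0012")
print(upper[1][:5])  # first few upper-surface ordinates
```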
Procedia PDF Downloads 358
1583 Field Deployment of Corrosion Inhibitor Developed for Sour Oil and Gas Carbon Steel Pipelines
Authors: Jeremy Moloney
Abstract:
A major oil and gas operator in western Canada producing approximately 50,000 BOE per day of sour fluids was experiencing increased water production along with decreased oil production over several years. The higher water volumes being produced meant an increase in the operator’s incumbent corrosion inhibitor (CI) chemical requirements, but with reduced oil production revenues. Thus, a cost-effective corrosion inhibitor solution was sought to deliver enhanced corrosion mitigation of the carbon steel pipeline infrastructure at reduced chemical injection dose rates. This paper presents the laboratory work conducted on the development of a corrosion inhibitor under the operator’s simulated sour operating conditions and the subsequent field testing of the product. The new CI not only provided extremely good levels of general and localized corrosion inhibition, outperforming the incumbent CI under the laboratory test conditions, but did so at vastly lower concentrations. In turn, the novel CI product allowed field chemical injection rates to be optimized and reduced by 40% compared with the incumbent while maintaining superior corrosion protection, resulting in significant cost savings and associated sustainability benefits for the operator.
Keywords: carbon steel, sour gas, hydrogen sulphide, localized corrosion, pitting, corrosion inhibitor
Procedia PDF Downloads 85
1582 Perceptions of Educators on the Learners’ Youngest Age for the Introduction of ICTs in Schools: A Personality Theory Approach
Authors: Kayode E. Oyetade, Seraphin D. Eyono Obono
Abstract:
Age ratings are very helpful in providing parents with relevant information for the purchase and use of digital technologies by children; this is why the absence of defined age ratings for the use of ICTs by children in schools is a major concern. This problem motivates the present study, whose aim is to examine the factors affecting the perceptions of educators on the learners’ youngest age for the introduction of ICTs in schools. This aim is achieved through two types of research objectives: the identification and design of theories and models on age ratings, and the empirical testing of such theories and models in a survey of educators from the Camperdown district of the South African KwaZulu-Natal province. A questionnaire was used for data collection, and its validity and reliability were checked in SPSS prior to descriptive and correlational quantitative analysis. The main hypothesis supporting this research is the association between the demographics of educators, their personality, and their perceptions on the learners’ youngest age for the introduction of ICTs in schools, as claimed by existing research; the present study, however, looks at personality from three dimensions: self-actualized personalities, fully functioning personalities, and healthy personalities. This hypothesis was fully confirmed by the empirical study, except for the demographic factor, where only the educators’ grade or class was found to be associated with the personality of educators.
Keywords: age ratings, educators, e-learning, personality theories
Procedia PDF Downloads 237
1581 A Parallel Cellular Automaton Model of Tumor Growth for Multicore and GPU Programming
Authors: Manuel I. Capel, Antonio Tomeu, Alberto Salguero
Abstract:
Tumor growth from a transformed cancer cell up to a clinically apparent mass spans a range of spatial and temporal magnitudes. Through computer simulations, Cellular Automata (CA) can accurately describe the complexity of tumor development. Tumor development prognosis can be made, without making patients undergo unpleasant medical examinations or painful invasive procedures, if we develop appropriate CA-based software tools. In silico testing mainly refers to Computational Biology research studies with application to clinical actions in Medicine. Establishing sound computer-based models of cellular behavior certainly reduces costs and saves precious time with respect to carrying out experiments in vitro at labs or in vivo with living cells and organisms. Such models aim to produce scientifically relevant results compared to traditional in vitro testing, which is slow, expensive, and does not generally have acceptable reproducibility under the same conditions. For speeding up computer simulations of cellular models, the specific literature shows recent proposals based on the CA approach that include advanced techniques, such as the clever use of efficient supporting data structures when modeling with deterministic and stochastic cellular automata. Multiparadigm and multiscale simulation of tumor dynamics is just beginning to be developed by the concerned research community. The use of stochastic cellular automata (SCA), whose parallel programming implementations are open to yielding high computational performance, is of much interest to be explored up to its computational limits. There have been some approaches based on optimizations to advance multiparadigm models of tumor growth, which mainly pursue improved performance of these models by guaranteeing efficient memory accesses or by considering the dynamic evolution of the memory space (grids, trees, …) that holds crucial data in simulations. In our opinion, the different optimizations mentioned above are not decisive enough to achieve the high-performance computing power that cell-behavior simulation programs actually need. The possibility of using multicore and GPU parallelism as a promising multiplatform framework to develop new programming techniques to speed up the computation time of simulations has only started to be explored in the last few years. This paper presents a model that incorporates parallel processing, identifying the synchronization necessary for speeding up tumor growth simulations implemented in Java and C++ programming environments. The speed-up obtained from specific parallel syntactic constructs, such as executors (thread pools) in Java, is studied. The new parallel tumor growth model is demonstrated using implementations in Java and C++ on two different platforms: an Intel Core i-X chipset and an HPC cluster of processors at our university. The parallelization of the Polesczuk and Enderling model (commonly used by researchers in mathematical oncology) proposed here is analyzed with respect to performance gain. We intend to apply the model and overall parallelization technique presented here to solid tumors of specific affiliation such as prostate, breast, or colon.
Our final objective is to set up a multiparadigm model capable of modelling angiogenesis, or the growth inhibition induced by chemotaxis, as well as the effect of therapies based on the presence of cytotoxic/cytostatic drugs.
Keywords: cellular automaton, tumor growth model, simulation, multicore and manycore programming, parallel programming, high performance computing, speed up
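The paper's implementations are in Java and C++; purely as an illustration of the thread-pool idea behind such parallel updates, the sketch below advances a toy stochastic cellular automaton grid with a Python process pool. The grid size, division probability, and update rule are invented placeholders and do not reproduce the Polesczuk and Enderling model.

```python
import numpy as np
from concurrent.futures import ProcessPoolExecutor

def update_rows(args):
    """Compute the next state of one horizontal strip of the CA grid.
    'strip' carries one halo row above and below for neighbour counting."""
    strip, p_divide, seed = args
    rng = np.random.default_rng(seed)
    core = strip[1:-1]
    new = core.copy()
    ncols = core.shape[1]
    for i in range(core.shape[0]):
        for j in range(ncols):
            # occupied von Neumann neighbours, read from the old state
            neigh = (strip[i, j] + strip[i + 2, j]
                     + strip[i + 1, j - 1] + strip[i + 1, (j + 1) % ncols])
            # toy rule: an empty site next to a tumour cell may become occupied
            if core[i, j] == 0 and neigh > 0 and rng.random() < p_divide:
                new[i, j] = 1
    return new

def step(grid, p_divide=0.25, workers=4):
    """One synchronous parallel update: split the grid into strips with halos."""
    padded = np.pad(grid, ((1, 1), (0, 0)), mode="wrap")
    bounds = np.array_split(np.arange(grid.shape[0]), workers)
    tasks = [(padded[b[0]:b[-1] + 3], p_divide, k) for k, b in enumerate(bounds)]
    with ProcessPoolExecutor(max_workers=workers) as pool:
        strips = list(pool.map(update_rows, tasks))
    return np.vstack(strips)

if __name__ == "__main__":
    grid = np.zeros((200, 200), dtype=np.uint8)
    grid[100, 100] = 1                      # single transformed cell
    for _ in range(50):
        grid = step(grid)
    print("occupied sites:", int(grid.sum()))
```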
Procedia PDF Downloads 244
1580 Imp_hist-Si: Improved Hybrid Image Segmentation Technique for Satellite Imagery to Decrease the Segmentation Error Rate
Authors: Neetu Manocha
Abstract:
Image segmentation is a technique in which a picture is partitioned into distinct regions with similar features that belong to the same objects. Various segmentation strategies have been proposed in recent years by prominent researchers. However, thorough analysis shows that, in general, the older methods do not decrease the segmentation error rate. The authors therefore identified the HIST-SI technique, in which cluster-based and threshold-based segmentation are merged, to decrease segmentation error rates. To improve the results of HIST-SI, the authors then added filtering and linking steps, yielding a technique named Imp_HIST-SI that decreases segmentation error rates further. The goal of this research is to find a new technique that decreases segmentation error rates and produces much better results than the HIST-SI technique. For testing the proposed technique, a dataset from Bhuvan, a national geoportal developed and hosted by ISRO (Indian Space Research Organisation), is used. Experiments are conducted using the Scikit-image and OpenCV tools of Python, and performance is evaluated and compared against various existing image segmentation techniques for several metrics, i.e., Mean Square Error (MSE) and Peak Signal-to-Noise Ratio (PSNR).
Keywords: satellite image, image segmentation, edge detection, error rate, MSE, PSNR, HIST-SI, linking, filtering, imp_HIST-SI
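For reference, the evaluation metrics named above can be computed with the same Python tools; the sketch below applies a simplified threshold-plus-filtering stand-in (not the authors' full Imp_HIST-SI pipeline) and reports MSE and PSNR against a reference mask, with file names as placeholders.

```python
import cv2
import numpy as np
from skimage.metrics import mean_squared_error, peak_signal_noise_ratio

# Placeholder file names; replace with a Bhuvan satellite image and its reference mask.
image = cv2.imread("bhuvan_scene.png", cv2.IMREAD_GRAYSCALE)
reference = cv2.imread("reference_mask.png", cv2.IMREAD_GRAYSCALE)

# Simplified stand-in for a hybrid step: Otsu thresholding followed by
# median filtering to suppress speckle.
_, segmented = cv2.threshold(image, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
segmented = cv2.medianBlur(segmented, 5)

# Error-rate style metrics used in the paper
mse = mean_squared_error(reference, segmented)
psnr = peak_signal_noise_ratio(reference, segmented, data_range=255)
print(f"MSE = {mse:.2f}, PSNR = {psnr:.2f} dB")
```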
Procedia PDF Downloads 140
1579 A Slip Transmission through Alpha/Beta Boundaries in a Titanium Alloy (Ti-6Al-4V)
Authors: Rayan B. M. Ameen, Ian P. Jones, Yu Lung Chiu
Abstract:
Single alpha-beta colony micro-pillars were manufactured from a polycrystalline commercial Ti-6Al-4V sample using a Focused Ion Beam (FIB). Each pillar contained two alpha lamellae separated by a thin fillet of beta phase. A nano-indenter was then used to conduct uniaxial micro-compression tests on the Ti alloy single crystals, using a flat diamond tip as a compression platen. By controlling the crystal orientation along the micro-pillar using electron backscatter diffraction (EBSD), different slip systems were selectively activated. The advantage of the micro-compression method over conventional mechanical testing techniques is the ability to localize a single-crystal volume which is characterizable after deformation. By matching the stress-strain relations resulting from the micro-compression experiments to TEM (Transmission Electron Microscopy) studies of slip transmission mechanisms through the α-β interfaces, constitutive material parameters are suggested, including the role of these interfaces in determining yield, strain-hardening behaviour, initial dislocation density and the critical resolved shear stress.
Keywords: α/β-Ti alloy, focused ion beam, micro-mechanical test, nano-indentation, transmission electron diffraction, plastic flow
Procedia PDF Downloads 385
1578 Statistical Randomness Testing of Some Second Round Candidate Algorithms of CAESAR Competition
Authors: Fatih Sulak, Betül A. Özdemir, Beyza Bozdemir
Abstract:
In order to improve symmetric key research, several competitions have been arranged by organizations such as the National Institute of Standards and Technology (NIST) and the International Association for Cryptologic Research (IACR). In recent years, the importance of authenticated encryption has rapidly increased because of the necessity of simultaneously providing integrity, confidentiality and authenticity. Therefore, in January 2013, IACR announced the Competition for Authenticated Encryption: Security, Applicability, and Robustness (CAESAR Competition), which will select secure and efficient algorithms for authenticated encryption. Cryptographic algorithms are expected to behave like random mappings; hence, it is important to apply statistical randomness tests to the outputs of the algorithms. In this work, the statistical randomness tests in the NIST Test Suite and other recently designed randomness tests are applied to six second-round algorithms of the CAESAR Competition. It is observed that AEGIS achieves randomness after 3 rounds, the Ascon permutation function achieves randomness after 1 round, the Joltik encryption function achieves randomness after 9 rounds, the Morus state update function achieves randomness after 3 rounds, Pi-cipher achieves randomness after 1 round, and Tiaoxin achieves randomness after 1 round.
Keywords: authenticated encryption, CAESAR competition, NIST test suite, statistical randomness tests
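As a pointer to what such randomness testing involves, the sketch below implements the NIST frequency (monobit) test in Python; the input bit string is pseudo-randomly generated for illustration and is not output from any CAESAR candidate.

```python
import math
import random

def monobit_test(bits):
    """NIST SP 800-22 frequency (monobit) test.
    Returns the p-value; p >= 0.01 is conventionally taken as passing."""
    n = len(bits)
    s = sum(1 if b else -1 for b in bits)      # map 1 -> +1, 0 -> -1 and sum
    s_obs = abs(s) / math.sqrt(n)
    return math.erfc(s_obs / math.sqrt(2))

# Illustrative input: a pseudo-random bit string, not real cipher output.
bits = [random.getrandbits(1) for _ in range(10**6)]
p = monobit_test(bits)
print(f"p-value = {p:.4f} ->", "pass" if p >= 0.01 else "fail")
```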
Procedia PDF Downloads 315
1577 Influence of Silicon Carbide Particle Size and Thermo-Mechanical Processing on Dimensional Stability of Al 2124SiC Nanocomposite
Authors: Mohamed M. Emara, Heba Ashraf
Abstract:
This study investigates the effect of silicon carbide (SiC) particle size and thermo-mechanical processing on the dimensional stability of aluminum alloy 2124. Three SiC weight fractions, 2.5, 5, and 10 wt.%, combined with different SiC particle sizes (25 μm, 5 μm, and 100 nm), were investigated; the composite powders were produced using a mechanical ball mill. The standard testing samples were fabricated using a powder metallurgy technique. Samples, both prior to and after extrusion, were heated from room temperature up to 400 °C in a dilatometer at different heating rates, namely 10, 20, and 40 °C/min. The analysis showed that, for all materials, the length change increased as temperature increased, and the temperature sensitivity of the aluminum alloy decreased in the presence of both micro- and nano-sized silicon carbide. For all conditions, the nanocomposites showed better dimensional stability than conventional Al 2124/SiC composites. The post-extrusion samples showed better thermal stability and lower temperature sensitivity of the aluminum alloy for both micro- and nano-sized silicon carbide.
Keywords: aluminum 2124 metal matrix composite, SiC nano-sized reinforcements, powder metallurgy, extrusion, mechanical ball mill, dimensional stability
Procedia PDF Downloads 526
1576 Experimental Investigation on the Mechanical Behaviour of Three-Leaf Masonry Walls under In-Plane Loading
Authors: Osama Amer, Yaser Abdel-Aty, Mohamed Abd El Hady
Abstract:
The present paper illustrates an experimental approach to understanding the mechanical behavior and failure mechanisms of different typologies of unreinforced three-leaf masonry walls found in the historical Islamic architectural heritage of Egypt. The main objective of this study is to investigate the propagation of possible cracking, the ultimate load, deformations and failure mechanisms. Experimental data from interface-shear and compression tests on large-scale three-leaf masonry wallets are provided. The wallets were built mainly of Egyptian limestone and modified lime mortar. The external leaves were built of stone blocks, while the inner leaf was built of rubble limestone. Different loading conditions and core-layer dimensions for two types of collar joints (with and without shear keys) are considered in the tests. The mechanical properties of the constituent masonry materials were tested and a database of characteristic properties was created. The results of the experiments highlight the properties, force-displacement curves and stress distribution of multiple-leaf masonry walls, contributing to the derivation of rational design rules and the validation of numerical models.
Keywords: masonry, three-leaf walls, mechanical behavior, testing, architectural heritage
Procedia PDF Downloads 291
1575 Sensitivity and Specificity of Clinical Testing for Digital Nerve Injury
Authors: Guy Rubin, Ravit Shay, Nimrod Rozen
Abstract:
The accuracy of a diagnostic test used to classify a patient as having a disease or being disease-free is a valuable piece of information for the physician when making treatment decisions. A finger laceration with suspected nerve injury presents a challenging decision for the treating surgeon. The purpose of this study was to evaluate the sensitivity, specificity and predictive values of six clinical tests in the diagnosis of digital nerve injury. The six clinical tests included light touch, pin prick, static and dynamic 2-point discrimination, Semmes-Weinstein monofilament and the wrinkle test. Data comparing the pre-surgery examination with post-surgery results of 42 patients with 52 digital nerve injuries were evaluated. The subjective examinations, light touch, pin prick, static and dynamic 2-point discrimination and Semmes-Weinstein monofilament, were neither sensitive (57.6, 69.7, 42.4, 40 and 66.8%, respectively) nor specific (36.8, 36.8, 47.4, 42.1 and 31.6%, respectively). The wrinkle test, the only objective examination, was the most sensitive (78.1%) and specific (55.6%). These results indicate that no pre-operative examination can predict the result of exploratory surgery.
Keywords: digital nerve, injury, nerve examination, Semmes-Weinstein monofilament, sensitivity, specificity, two point discrimination, wrinkle test
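For reference, the sensitivity, specificity and predictive values reported above follow directly from a 2x2 confusion matrix, as in the Python sketch below; the counts are invented for illustration and are not the study data.

```python
def diagnostic_metrics(tp, fn, fp, tn):
    """Compute standard diagnostic accuracy measures from a 2x2 table."""
    sensitivity = tp / (tp + fn)          # true positive rate
    specificity = tn / (tn + fp)          # true negative rate
    ppv = tp / (tp + fp)                  # positive predictive value
    npv = tn / (tn + fn)                  # negative predictive value
    return sensitivity, specificity, ppv, npv

# Hypothetical counts for one clinical test against surgical exploration
# (the reference standard); these numbers are illustrative only.
sens, spec, ppv, npv = diagnostic_metrics(tp=25, fn=8, fp=12, tn=7)
print(f"sensitivity={sens:.1%}, specificity={spec:.1%}, PPV={ppv:.1%}, NPV={npv:.1%}")
```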
Procedia PDF Downloads 344
1574 An Update on Linezolid against Methicillin-Resistant Staphylococcus Aureus Clinical Isolates from Pakistan
Authors: Tayaba Dastgeer, Farhan Rasheed, Muhammad Saeed, Maqsood Ahmad, Zia Ashraf, Abdul Waheed, Muhammad Kamran, Mohsin Khurshid
Abstract:
Objectives: The study aimed to determine the efficacy of linezolid against clinical isolates of methicillin-resistant Staphylococcus aureus (MRSA). Methodology: This cross-sectional study was conducted in the microbiology department of Allama Iqbal Medical College, Lahore, from August 2017 to September 2019. Isolates were confirmed as MRSA via the presence of the mec-A gene. Confirmed MRSA isolates were processed for susceptibility testing against different antimicrobials, especially linezolid, via the disc diffusion method. Zone sizes were interpreted according to CLSI guidelines. Results: Various types of clinical samples were included in the study; however, the highest frequency of MRSA isolates was found in pus samples, followed by other clinical samples. Among hospitalized patients, most MRSA isolates were obtained from patients in the surgical ward. Of the 243 mec-A gene-positive isolates, vancomycin and linezolid showed 100% susceptibility, while chloramphenicol showed declining resistance (78 isolates, 32.09%) and emerging sensitivity (165 isolates, 67.90%) against MRSA. Conclusion: Linezolid is a very efficient drug against MRSA, but the use of this novel drug must be reserved for vancomycin-resistant Staphylococcus aureus or for when more resistant pathogens are suspected.
Keywords: MRSA, chloramphenicol, linezolid, nosocomial infections
Procedia PDF Downloads 97
1573 Solid-Liquid-Polymer Mixed Matrix Membrane Using Liquid Additive Adsorbed on Activated Carbon Dispersed in Polymeric Membrane for CO2/CH4 Separation
Authors: P. Chultheera, T. Rirksomboon, S. Kulprathipanja, C. Liu, W. Chinsirikul, N. Kerddonfag
Abstract:
Gas separation by selective transport through polymeric membranes is one of the most rapidly growing branches of membrane technology. However, the trade-off between permeability and selectivity is one of the critical challenges encountered by pure polymer membranes, which in turn limits their large-scale application. To enhance gas separation performance, mixed matrix membranes (MMMs) have been developed. In this study, MMMs were prepared by a solution-coating method and tested for CO2/CH4 separation through permeability and selectivity measurements using a membrane testing unit at room temperature and a pressure of 100 psig. The fabricated MMMs were composed of silicone rubber dispersed with activated carbon individually adsorbed with polyethylene glycol (PEG) as a liquid additive. PEG-emulsified silicone rubber MMMs on a cellulose acetate support showed superior gas separation, with both high permeability and high selectivity, compared with the silicone rubber membrane and the support membrane alone. However, the MMMs showed limited stability resulting from undesirable PEG leakage. To stabilize the MMMs, PEG was then incorporated into the activated carbon by adsorption. It was found that this incorporation of solid and liquid was effective in improving the separation performance of the MMMs.
Keywords: mixed matrix membrane, membrane, CO₂/CH₄ separation, activated carbon
Procedia PDF Downloads 342
1572 Numinous Luminosity: A Mixed Methods Study of Mystical Light Experiences
Authors: J. R. Dinsmore, R. W. Hood
Abstract:
Experiences of a divine or mystical light are frequently reported in religious/spiritual experiences today, most notably in the context of mystical and near-death experiences. Light of a transcendental nature, and experiences of it, is also widely present and highly valued in many religious and mystical traditions. Despite the significance of this luminosity to the topic of religious experience, efforts to study the phenomenon empirically have been minimal and scattered. This mixed methods study developed and validated a questionnaire for the measurement of numinous luminosity experience and investigated the dimensions and effects of this novel construct using both quantitative and qualitative methodologies. A sequential explanatory design (participant selection model) was used, which involved a scale development phase, followed by a correlational study testing hypotheses about its effects on beliefs and well-being derived from the literature, and lastly a phenomenological study of a sample selected from the correlational-phase results. The outcomes of the study are a unified theoretical model of numinous luminosity experience across multiple experiential contexts, initial correlational findings regarding the possible mechanism of its reported positive transformational effects, and a valid and reliable instrument for its further empirical study.
Keywords: religious experience, mystical experience, near-death experience, scale development, questionnaire, divine light, mystical light, mystical luminosity
Procedia PDF Downloads 95
1571 Analysis of Fine Motor Skills in Chronic Neurodegenerative Models of Huntington’s Disease and Amyotrophic Lateral Sclerosis
Authors: T. Heikkinen, J. Oksman, T. Bragge, A. Nurmi, O. Kontkanen, T. Ahtoniemi
Abstract:
Motor impairment is an inherent phenotypic feature of several chronic neurodegenerative diseases, and pharmacological therapies aimed at counterbalancing the motor disability have great market potential. Animal models of chronic neurodegenerative diseases display a number of deteriorating motor phenotypes during disease progression. There is a wide array of behavioral tools to evaluate motor functions in rodents. However, currently existing methods to study motor functions in rodents are often limited to evaluating gross motor functions only at advanced stages of the disease phenotype. The most commonly applied traditional motor assays used in CNS rodent models lack the sensitivity to capture fine motor impairments or improvements. Fine motor skill characterization in rodents provides a more sensitive tool to capture subtler motor dysfunctions and therapeutic effects. Importantly, a similar approach, kinematic movement analysis, is also used in the clinic and applied both in diagnosis and in the determination of therapeutic response to pharmacological interventions. The aim of this study was to apply kinematic gait analysis, a novel and automated high-precision movement analysis system, to characterize phenotypic deficits in three different chronic neurodegenerative animal models: a transgenic mouse model (SOD1 G93A) of amyotrophic lateral sclerosis (ALS), and the R6/2 and Q175KI mouse models of Huntington’s disease (HD). The readouts from walking behavior included gait properties with kinematic data, and body movement trajectories including analysis of various points of interest, such as the movement and position of landmarks on the torso, tail and joints. Mice (transgenic and wild-type) from each model were analyzed for fine motor kinematic properties at young ages, prior to the age at which gross motor deficits are clearly pronounced. Fine motor kinematic evaluation was continued in the same animals until clear motor dysfunction was evident with conventional motor assays. Time-course analysis revealed clear fine motor skill impairments in each transgenic model earlier than what is seen with conventional gross motor tests. Motor changes were quantitatively analyzed for up to ~80 parameters, and the largest data sets, from the HD models, were further processed with principal component analysis (PCA) to transform the pool of individual parameters into a smaller, focused set of mutually uncorrelated gait parameters showing a strong genotype difference. The kinematic fine motor analysis of transgenic animal models described in this presentation shows that this method is a sensitive, objective and fully automated tool that allows earlier and more sensitive detection of progressive neuromuscular and CNS disease phenotypes. As a result of the analysis, a comprehensive set of fine motor parameters for each model is created; these parameters provide a better understanding of disease progression and enhanced sensitivity of this assay for therapeutic testing compared to classical motor behavior tests. In the SOD1 G93A, R6/2, and Q175KI mice, the alterations in gait were evident several weeks earlier than with traditional gross motor assays.
Kinematic testing can be applied to a wider set of motor readouts beyond gait in order to study whole-body movement patterns, such as those relating to joints and various body parts, longitudinally, providing a sophisticated and translatable method for dissecting motor components in rodent disease models and evaluating therapeutic interventions.
Keywords: gait analysis, kinematic, motor impairment, inherent feature
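A minimal sketch of the PCA step described above is given below, reducing a matrix of gait parameters to a small set of uncorrelated components with scikit-learn; the data are random placeholders rather than the recorded kinematic readouts.

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA

# Placeholder data: 40 animals x 80 gait parameters (random values for illustration).
rng = np.random.default_rng(0)
gait_parameters = rng.normal(size=(40, 80))

# Standardize parameters so PCA is not dominated by differences in units/scales.
scaled = StandardScaler().fit_transform(gait_parameters)

# Keep enough components to explain 90% of the variance.
pca = PCA(n_components=0.90)
components = pca.fit_transform(scaled)

print("components retained:", pca.n_components_)
print("explained variance ratios:", np.round(pca.explained_variance_ratio_, 3))
```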
Procedia PDF Downloads 355
1570 Consumer Behaviour Model for Apparel E-Tailers Using Structural Equation Modelling
Authors: Halima Akhtar, Abhijeet Chandra
Abstract:
The paper attempts to analyze the factors that influence consumer behavior when purchasing apparel through the internet. The intention to buy apparel online was examined in terms of user style orientation, size and reputation of the merchant, social influence, perceived information utility, perceived ease of use, perceived pleasure and attractiveness, and perceived trust and risk. The basic framework used was the Technology Acceptance Model, applied to explain apparel acceptance. A survey was conducted to gather data from 200 people. The measures and hypotheses were analyzed using correlation testing and are to be further validated using Structural Equation Modelling. The implications of the findings for theory and practice could be used by marketers of online apparel websites. Based on the values obtained, we can conclude that factors such as social influence, perceived information utility, attractiveness and trust influence a user's decision to buy apparel online. The major factors found to influence an online apparel buying decision are ease of use, the attractiveness that a website can offer, and the trust that a user shares with the website.
Keywords: E-tailers, consumer behaviour, technology acceptance model, structural modelling
Procedia PDF Downloads 186
1569 Determination of the Shear Strength of Wastes Using Back-Analyses from Observed Failures
Authors: Sadek Salah
Abstract:
The determination of the strength characteristics of waste materials is essential when evaluating the stability of waste fills during initial placement and at the time of closure and rehabilitation of the landfill. Significant efforts, mostly experimental, have been deployed to date in attempts to quantify the mechanical properties of municipal wastes at various stages of decomposition. Even though the studies and work done so far have helped in setting baseline parameters and characteristics for waste materials, inherent concerns remain as to the scalability of the findings between the laboratory and the field, along with questions as to the suitability of the actual test conditions. These concerns are compounded by the complexity of the problem itself, with significant variability in composition, placement conditions, and levels of decay of the various constituents of the waste fills. A complementary, if not necessarily alternative, approach is to rely on field observations of the behavior and instability of such materials. This paper describes an effort at obtaining relevant shear strength parameters from back-analyses of failures which have been observed at a major un-engineered waste fill along the Mediterranean shoreline. Results from the limit-equilibrium failure back-analyses are presented and compared to results from laboratory-scale testing on comparable waste materials.
Keywords: solid waste, shear strength, landfills, slope stability
Procedia PDF Downloads 242
1568 Importance of Standards in Engineering and Technology Education
Authors: Ahmed S. Khan, Amin Karim
Abstract:
During the past several decades, the economy of each nation has been significantly affected by globalization and technology. Government regulations and private sector standards affect a majority of world trade. Countries have been working together to establish international standards in almost every field. As a result, workers in all sectors need to have an understanding of standards. Engineering and technology students must not only possess an understanding of engineering standards and applicable government codes, but also learn to apply them in designing, developing, testing and servicing products, processes and systems. Accreditation Board for Engineering & Technology (ABET) criteria for engineering and technology education require students to learn and apply standards in their class projects. This paper is a follow-up to a 2006-2009 NSF initiative awarded to IEEE to help develop tutorials and case study modules for students and encourage standards education at college campuses. It presents the findings of a faculty/institution survey conducted through various U.S.-based listservs representing the major engineering and technology disciplines. The intent of the survey was to gauge the status of the use of standards and regulations in engineering and technology coursework and to identify benchmark practices. In light of the survey findings, recommendations are made to standards development organizations, industry, and academia to help enhance the use of standards in engineering and technology curricula.
Keywords: standards, regulations, ABET, IEEE, engineering, technology curricula
Procedia PDF Downloads 288
1567 Data Science-Based Key Factor Analysis and Risk Prediction of Diabetic
Authors: Fei Gao, Rodolfo C. Raga Jr.
Abstract:
This research proposal aims to ascertain the major risk factors for diabetes and to design a predictive model for risk assessment. The project aims to improve early detection and management of diabetes by utilizing data science techniques, which may improve patient outcomes and healthcare efficiency. The relation values of each attribute were used to analyze and choose the attributes that might influence the predicted outcome, using the Diabetes Health Indicators Dataset from Kaggle as the research data. We compare and evaluate eight machine learning algorithms. Our investigation begins with comprehensive data preprocessing, including feature engineering and dimensionality reduction, aimed at enhancing data quality. The dataset, comprising health indicators and medical data, serves as a foundation for training and testing these algorithms. A rigorous cross-validation process is applied, and we assess performance using five key metrics: accuracy, precision, recall, F1-score, and area under the receiver operating characteristic curve (AUC-ROC). After analyzing the data characteristics, we investigate their impact on the likelihood of diabetes and develop corresponding risk indicators.
Keywords: diabetes, risk factors, predictive model, risk assessment, data science techniques, early detection, data analysis, Kaggle
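A minimal sketch of the evaluation protocol described above is shown below, cross-validating one candidate algorithm over the five metrics with scikit-learn; the feature and label arrays are placeholders standing in for the Kaggle Diabetes Health Indicators data.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_validate

# Placeholder arrays standing in for the preprocessed Kaggle dataset:
# X = health-indicator features, y = diabetes label (0/1).
rng = np.random.default_rng(42)
X = rng.normal(size=(1000, 21))
y = rng.integers(0, 2, size=1000)

model = RandomForestClassifier(n_estimators=200, random_state=42)
scoring = ["accuracy", "precision", "recall", "f1", "roc_auc"]

# 5-fold cross-validation over the five metrics used in the study.
results = cross_validate(model, X, y, cv=5, scoring=scoring)
for metric in scoring:
    scores = results[f"test_{metric}"]
    print(f"{metric:>9}: {scores.mean():.3f} +/- {scores.std():.3f}")
```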
Procedia PDF Downloads 75
1566 Application of Deep Neural Networks to Assess Corporate Credit Rating
Authors: Parisa Golbayani, Dan Wang, Ionuţ Florescu
Abstract:
In this work, we apply machine learning techniques to financial statement reports in order to assess a company’s credit rating. Specifically, the work analyzes the performance of four neural network architectures (MLP, CNN, CNN2D, LSTM) in predicting corporate credit ratings as issued by Standard and Poor’s. The paper focuses on companies from the energy, financial, and healthcare sectors in the US. The goal of this analysis is to improve the application of machine learning algorithms to credit assessment. To accomplish this, the study investigates three questions. First, we investigate whether the algorithms perform better when using a selected subset of important features or whether better performance is obtained by allowing the algorithms to select features themselves. Second, we address the temporal aspect inherent in financial data and study whether it is important for the results obtained by a machine learning algorithm. Third, we aim to answer whether one of the four neural network architectures considered consistently outperforms the others, and if so, under which conditions. The work frames the problem as several case studies to answer these questions and analyzes the results using ANOVA and multiple comparison testing procedures.
Keywords: convolutional neural network, long short term memory, multilayer perceptron, credit rating
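As an illustration of the ANOVA and multiple-comparison step mentioned above, the sketch below compares per-fold accuracies of the four architectures with a one-way ANOVA and Bonferroni-corrected pairwise t-tests; the score values are invented for demonstration, not results from the paper.

```python
from itertools import combinations
from scipy import stats

# Hypothetical cross-validation accuracies for the four architectures
# (values are illustrative, not results from the paper).
scores = {
    "MLP":   [0.71, 0.73, 0.70, 0.72, 0.74],
    "CNN":   [0.76, 0.78, 0.75, 0.77, 0.79],
    "CNN2D": [0.77, 0.79, 0.78, 0.76, 0.80],
    "LSTM":  [0.74, 0.75, 0.73, 0.76, 0.75],
}

# One-way ANOVA: do mean accuracies differ across architectures?
f_stat, p_value = stats.f_oneway(*scores.values())
print(f"ANOVA: F = {f_stat:.2f}, p = {p_value:.4f}")

# Simple pairwise follow-up with Bonferroni correction.
pairs = list(combinations(scores, 2))
alpha = 0.05 / len(pairs)
for a, b in pairs:
    t, p = stats.ttest_ind(scores[a], scores[b])
    verdict = "significant" if p < alpha else "n.s."
    print(f"{a} vs {b}: p = {p:.4f} ({verdict} at corrected alpha)")
```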
Procedia PDF Downloads 235
1565 Demographic Characteristics as a Determinant of the use of Health Care Services: Case of Nsukka, Southwest Nigeria
Authors: Beatrice Adeoye
Abstract:
Studies have identified social and demographic characteristics as strong determinants of the utilization of health care services; however, not much has been done to explore the dynamics of these variables in Nigeria. This empirical study explores the link between demographic factors and the future use of health care services in Nsukka, southeast Nigeria. A total of 543 respondents were selected using a multi-stage sampling technique. The findings of the study showed that the majority (56.9%) of the respondents were female while 43.1% were male. Just over half of the respondents were married (50.3%), while 41.8% of the respondents were between the ages of 26 and 35. When the demographic characteristics were tested with multiple regression against where people would prefer to go first for treatment, only sex indicated a positive association with future preference, at a significance level of 0.08. Age and education indicated no association given their levels of significance. This result shows that sex is one of the determinant factors of where and when people will go for treatment, pointing to the realities of African society, where, in the family setting, it is the father who dictates the course of action. To buttress these findings, cross-tabulating age with who determines where and when to go for treatment showed that the majority (58.9%) of those aged 26-35 said their spouses decide where and when to go for treatment. The findings showed that patriarchy still plays an important role in the utilization of health care delivery among the people studied.
Keywords: demographic characteristics, determinants, health care, treatment, self-medication, symptoms
Procedia PDF Downloads 385
1564 Identification System for Grading Banana in Food Processing Industry
Authors: Ebenezer O. Olaniyi, Oyebade K. Oyedotun, Khashman Adnan
Abstract:
In the food industry, high-quality production is required within a limited time to meet demand. In this research work, we have developed a model which can be used to replace the human operator, whose output in production is low and whose decision-making is slow as a result of individual differences in deciding whether a banana is defective or healthy. This model can perform the visual judgement of human operators in deciding whether a banana is defective or healthy for food production. The research work is divided into two phases. The first phase is image processing, in which several techniques such as colour conversion, edge detection, thresholding and morphological operations were employed to extract features for training and testing the network in the second phase. The features extracted in the first phase were used in the second phase, the classification phase, in which a multilayer perceptron using a backpropagation neural network was employed to train the network. After the network had learned and converged, it was tested in feedforward mode to determine its performance. From this experiment, a recognition rate of 97% was obtained, and the processing time was short, which makes the system suitable for use in the food industry.
Keywords: banana, food processing, identification system, neural network
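The two phases described above can be sketched with OpenCV and scikit-learn as below; the chosen features, file names, and network size are illustrative assumptions rather than the authors' exact configuration.

```python
import cv2
import numpy as np
from sklearn.neural_network import MLPClassifier

def extract_features(path):
    """Phase 1 (illustrative): colour conversion, thresholding, morphology and
    edge detection, reduced to a few summary statistics used as features."""
    image = cv2.imread(path)
    hsv = cv2.cvtColor(image, cv2.COLOR_BGR2HSV)
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    _, mask = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    edges = cv2.Canny(gray, 100, 200)
    return [
        hsv[..., 0].mean(), hsv[..., 1].mean(), hsv[..., 2].mean(),  # mean H, S, V
        mask.mean() / 255.0,                                         # foreground ratio
        edges.mean() / 255.0,                                        # edge density
    ]

# Placeholder file lists; labels: 0 = healthy, 1 = defective.
train_files = ["banana_01.jpg", "banana_02.jpg", "banana_03.jpg", "banana_04.jpg"]
train_labels = [0, 0, 1, 1]
X_train = np.array([extract_features(f) for f in train_files])

# Phase 2: multilayer perceptron trained with backpropagation.
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
clf.fit(X_train, train_labels)

print("prediction:", clf.predict([extract_features("banana_new.jpg")]))
```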
Procedia PDF Downloads 470
1563 CVOIP-FRU: Comprehensive VoIP Forensics Report Utility
Authors: Alejandro Villegas, Cihan Varol
Abstract:
Voice over Internet Protocol (VoIP) products are an emerging technology that can contain forensically important information about criminal activity. Even without the user names and passwords, this forensically important information can still be gathered by investigators. Although there are a few VoIP forensic investigative applications available in the literature, most of them are designed specifically to collect evidence from the Skype product. Therefore, in order to assist law enforcement in collecting forensically important information from a variety of Betamax VoIP tools, the CVOIP-FRU framework was developed. CVOIP-FRU provides a data gathering solution that retrieves usernames, contact lists, as well as call and SMS logs from Betamax VoIP products. It is a scripting utility that searches for data within the registry, logs and user roaming profiles in the Windows and Mac OSX operating systems. Subsequently, it parses the output into readable text and HTML formats. One advantage of CVOIP-FRU over other applications is that, owing to its intelligent data filtering capabilities and cross-platform scripting back end, it is expandable to include other VoIP solutions as well. Overall, this paper reveals the exploratory analysis performed in order to find the key data paths and locations, the development stages of the framework, and the empirical testing and quality assurance of CVOIP-FRU.
Keywords: betamax, digital forensics, report utility, VoIP, VoIPBuster, VoIPWise
Procedia PDF Downloads 297
1562 Capacity Building of Extension Agents for Sustainable Dissemination of Agricultural Information and Technologies in Developing Countries
Authors: Michael T. Ajayi, Oluwakemi E. Fapojuwo
Abstract:
Farmers need regular and relevant information relating to new technologies. The production of extension materials has been found useful in facilitating this process. Extension materials help to provide information that reaches large numbers of farmers quickly and economically. However, as good as extension materials are, previously produced materials are often not used by farmers. The reasons for this include the lack of involvement of farmers in the production of the extension materials, the limited relevance of most extension materials to farmers’ environments, the limited capacity of agricultural extension agents to prepare the materials, and the lack of commitment of many extension agents. These problems led to this innovative approach to capacity building of extension agents. The approach involves five stages. The first stage is a diagnostic survey of the farmers’ environment to collect useful information. The second stage is the development and production of draft extension materials. The third stage is field testing and evaluation of the draft materials by the same farmers who were involved at the diagnostic stage. The fourth stage is the revision of the draft extension materials by incorporating suggestions from farmers. The fifth stage is the preparation of action plans. This process improves the capacity of agricultural extension agents in the preparation of extension materials and also promotes the engagement of farmers and beneficiaries in the process. The process also makes farmers assume some level of ownership of the exercise and the extension materials.
Keywords: capacity building, extension agents, dissemination, information/technologies
Procedia PDF Downloads 360
1561 Modification Encryption Time and Permutation in Advanced Encryption Standard Algorithm
Authors: Dalal N. Hammod, Ekhlas K. Gbashi
Abstract:
Today, cryptography is used in many applications to achieve high security in data transmission and in real-time communications. AES has long gained global acceptance and is used for securing sensitive data in various industries, but it suffers from slow processing and takes a long time to transfer data. This paper suggests a method to enhance the Advanced Encryption Standard (AES) algorithm based on time and permutation. The suggested method (MAES) is based on modifying the SubBytes and ShiftRows steps in the encryption part and modifying the InvSubBytes and InvShiftRows steps in the decryption part. After implementing the proposal and testing the results, the modified AES achieved good results, accomplishing communication with high performance in terms of randomness, encryption time, storage space, and avalanche effect. The proposed method produces ciphertext with good randomness, as it passed the NIST statistical tests against attacks; MAES also reduced the encryption time by 10% compared with the original AES, and the modified AES is therefore faster than the original AES. The proposed method also showed good results in memory utilization, with a value of 54.36 for MAES versus 66.23 for the original AES. The avalanche effect, used for calculating the diffusion property, is 52.08% for the modified AES and 51.82% for the original AES.
Keywords: modified AES, randomness test, encryption time, avalanche effects
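The avalanche-effect figure quoted above can be reproduced for the standard AES with a short script like the one below (using PyCryptodome); the MAES modifications themselves are described in the paper and are not implemented here.

```python
import os
from Crypto.Cipher import AES  # PyCryptodome

def avalanche_effect(key, plaintext, trials=1000):
    """Average fraction of ciphertext bits that flip when a single
    plaintext bit is flipped (standard AES-128, one ECB block)."""
    cipher = AES.new(key, AES.MODE_ECB)
    base_ct = cipher.encrypt(plaintext)
    total = 0.0
    for t in range(trials):
        bit = (t * 7) % 128                     # which plaintext bit to flip
        flipped = bytearray(plaintext)
        flipped[bit // 8] ^= 1 << (bit % 8)
        ct = cipher.encrypt(bytes(flipped))
        diff = sum(bin(a ^ b).count("1") for a, b in zip(base_ct, ct))
        total += diff / 128.0
    return total / trials

key = os.urandom(16)
plaintext = os.urandom(16)
print(f"avalanche effect: {avalanche_effect(key, plaintext):.2%}")
```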
Procedia PDF Downloads 248
1560 Amplifying Sine Unit-Convolutional Neural Network: An Efficient Deep Architecture for Image Classification and Feature Visualizations
Authors: Jamshaid Ul Rahman, Faiza Makhdoom, Dianchen Lu
Abstract:
Activation functions play a decisive role in determining the capacity of Deep Neural Networks (DNNs), as they enable neural networks to capture the inherent nonlinearities present in the data fed to them. Prior research on activation functions primarily focused on the utility of monotonic or non-oscillatory functions, until the Growing Cosine Unit (GCU) broke this taboo for a number of applications. In this paper, a Convolutional Neural Network (CNN) model named ASU-CNN is proposed, which utilizes the recently designed Amplifying Sine Unit (ASU) activation function across its layers. The effect of this non-monotonic and oscillatory function is inspected through feature map visualizations from different convolutional layers. The optimization of the proposed network is performed by Adam with a fine-tuned adjustment of the learning rate. The network achieved promising results on both training and testing data for the classification of CIFAR-10. The experimental results affirm the computational feasibility and efficacy of the proposed model for performing tasks related to the field of computer vision.
Keywords: amplifying sine unit, activation function, convolutional neural networks, oscillatory activation, image classification, CIFAR-10
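A minimal sketch of wiring an oscillatory activation into a small CNN for CIFAR-10 is shown below with TensorFlow/Keras; the functional form used, f(x) = x*sin(x), is an assumed stand-in for illustration only, since the exact ASU definition is given in the original ASU paper, and the layer sizes are not the authors' architecture.

```python
import tensorflow as tf

def asu(x):
    # Assumed oscillatory form for illustration only; the exact Amplifying
    # Sine Unit definition should be taken from the original ASU paper.
    return x * tf.sin(x)

model = tf.keras.Sequential([
    tf.keras.layers.Conv2D(32, 3, padding="same", activation=asu,
                           input_shape=(32, 32, 3)),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(64, 3, padding="same", activation=asu),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(128, activation=asu),
    tf.keras.layers.Dense(10, activation="softmax"),
])

# Adam with a tuned learning rate, as described in the abstract.
model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=1e-3),
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

(x_train, y_train), (x_test, y_test) = tf.keras.datasets.cifar10.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0
model.fit(x_train, y_train, epochs=5, validation_data=(x_test, y_test))
```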
Procedia PDF Downloads 111
1559 The Long-Term Leaching Behaviour of 137Cs, 60Co and 152Eu Radionuclides Incorporated in Mortar Matrices Made from Natural Aggregates and Recycled Aggregates
Authors: R. Deju, M. Mincu, D. Gurau
Abstract:
During the interim storage or final disposal of low-level waste, migration/diffusion of radionuclides can occur when the waste comes in contact with water. The long-term leaching behaviour into a surrounding fluid (demineralized water) of 137Cs, 60Co and 152Eu radionuclides, artificially incorporated in mortar matrices made from natural aggregates (river sand) and recycled radioactive concrete, was studied. The results presented in this work were obtained over two years of mortar testing and will be used to increase safety in the storage of low-level radioactive waste. The study examined the influence of curing time and of the type and size distribution of the aggregates on leaching behaviour. The mortar samples were immersed in distilled water for 30 days. The leached activity of the mortar samples was measured on samples of the immersion water and analyzed by a gamma-ray spectrometry method using an HPGe detector, with the GESPECOR code used for efficiency evaluation. The long-term leaching behaviour of the radionuclides was evaluated from the leaching data by calculating the apparent diffusion coefficient.
Keywords: gamma spectrometry, leaching behavior, reuse and recycling of radioactive concrete, waste management
Procedia PDF Downloads 248
1558 Electroencephalogram Based Approach for Mental Stress Detection during Gameplay with Level Prediction
Authors: Priyadarsini Samal, Rajesh Singla
Abstract:
Many mobile games provide entertainment while also introducing stress to the human brain. In recognizing this mental stress, the brain-computer interface (BCI) plays an important role. It offers various neuroimaging approaches which help in analyzing brain signals. Electroencephalography (EEG) is the most commonly used method among them, as it is non-invasive, portable, and economical. This paper investigates the patterns in brain signals when mental stress is introduced. Two healthy volunteers played a game whose aim was to search for hidden words in a grid, and the levels were chosen randomly. The EEG signals during gameplay were recorded to investigate the impact of stress as the levels changed from easy to medium to hard. A total of 16 EEG features were analyzed for this experiment, including band power features with relative powers and event-related desynchronization, along with statistical features. A support vector machine was used as the classifier, which resulted in an accuracy of 93.9% for three-level stress analysis; for two levels, accuracies of 92% and 98% were achieved. In addition, another game that was similar in nature was played by the volunteers. A suitable regression model was designed for prediction, in which the feature sets of the first and second games were used for testing and training purposes, respectively, and an accuracy of 73% was found.
Keywords: brain computer interface, electroencephalogram, regression model, stress, word search
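A minimal sketch of the classification stage is given below with scikit-learn; the band-power feature matrix and stress labels are random placeholders rather than the recorded EEG data.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score

# Placeholder feature matrix: trials x 16 EEG features (band powers, relative
# powers, ERD, statistical features); three stress levels (0=easy, 1=medium, 2=hard).
rng = np.random.default_rng(1)
X = rng.normal(size=(300, 16))
y = rng.integers(0, 3, size=300)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=1)

scaler = StandardScaler().fit(X_train)
clf = SVC(kernel="rbf", C=1.0, gamma="scale")
clf.fit(scaler.transform(X_train), y_train)

y_pred = clf.predict(scaler.transform(X_test))
print(f"three-level stress classification accuracy: {accuracy_score(y_test, y_pred):.1%}")
```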
Procedia PDF Downloads 187