Search results for: de-noising techniques
5443 Implementation and Challenges of Assessment Methods in the Case of Physical Education Class in Some Selected Preparatory Schools of Kirkos Sub-City
Authors: Kibreab Alene Fenite
Abstract:
The purpose of this study is to investigate the implementation and challenges of different assessment methods for physical education classes in selected preparatory schools of Kirkos sub-city. The participants in this study are teachers, students, department heads, and school principals from 4 selected schools. Of the 8 schools in Kirkos sub-city offering preparatory classes, 4 schools (Dandi Boru, Abiyot Kirse, Assay, and Adey Ababa) were selected using a simple random sampling technique, and from these schools all (100%) of the teachers, department heads, and school principals were taken as a sample, as their number is manageable. From the total of 2520 students, 252 (10%) were selected using simple random sampling. Accordingly, 13 teachers, 252 students, 4 department heads, and 4 school principals were taken as a sample from the 4 selected schools. Questionnaires and interviews were employed as data-gathering tools. Both quantitative and qualitative methods were used to analyze the collected data. The results of the study revealed that assessment in physical education is not implemented properly: lack of sufficient materials, inadequate time allotment, large class sizes, lack of collaboration among teachers in assessing student performance, absence of guidelines for assessing the physical education subject, and the absence of assessment methods adapted to students with disabilities in line with their special needs were found to be the major challenges in implementing the current assessment method of physical education. To overcome these problems, the following recommendations have been forwarded.
These are: the necessary facilities and equipment should be made available; in order to make reliable, accurate, objective, and relevant assessments, teachers of physical education should be familiarized with different assessment techniques; physical education assessment guidelines should be prepared and should include different types of assessment methods; qualified teachers should be employed; and separate teaching rooms must be built.
Keywords: assessment, challenges, equipment, guidelines, implementation, performance
Procedia PDF Downloads 280
5442 Structural Properties of Surface Modified PVA: Zn97Pr3O Polymer Nanocomposite Free Standing Films
Authors: Pandiyarajan Thangaraj, Mangalaraja Ramalinga Viswanathan, Karthikeyan Balasubramanian, Héctor D. Mansilla, José Ruiz
Abstract:
Rare earth ion doped semiconductor nanostructures have gained much attention due to their novel physical and chemical properties, which lead to potential applications in laser technology as inexpensive luminescent materials. Doping of rare earth ions into ZnO semiconductors alters the electronic structure and emission properties. Surface modification (polymer covering) is one of the simplest techniques to modify the emission characteristics of host materials. The present work reports the synthesis and structural properties of PVA:Zn97Pr3O polymer nanocomposite free-standing films. To prepare Pr3+ doped ZnO nanostructures and PVA:Zn97Pr3O polymer nanocomposite free-standing films, the colloidal chemical and solution casting techniques were adopted, respectively. The formation of PVA:Zn97Pr3O films was confirmed through X-ray diffraction (XRD), absorption, and Fourier transform infrared (FTIR) spectroscopy analyses. XRD measurements confirm that the prepared materials are crystalline with a hexagonal wurtzite structure. The polymer composite film exhibits the diffraction peaks of both PVA and ZnO structures. TEM images reveal that the pure and Pr3+ doped ZnO nanostructures exhibit a sheet-like morphology. Optical absorption spectra show the free excitonic absorption band of ZnO at 370 nm; the PVA:Zn97Pr3O polymer film shows absorption bands at ~282 and 368 nm, arising from carbonyl-containing structures connected to the PVA polymeric chains (mainly at the ends) and from the free excitonic absorption of ZnO nanostructures, respectively. The transmission spectrum of the as-prepared film shows 57 to 69% transparency in the visible and near-IR regions. FTIR spectral studies confirm the presence of A1 (TO) and E1 (TO) modes of Zn-O bond vibration and the formation of polymer composite materials.
Keywords: rare earth doped ZnO, polymer composites, structural characterization, surface modification
Procedia PDF Downloads 362
5441 Indian Premier League (IPL) Score Prediction: Comparative Analysis of Machine Learning Models
Authors: Rohini Hariharan, Yazhini R, Bhamidipati Naga Shrikarti
Abstract:
In the realm of cricket, particularly within the context of the Indian Premier League (IPL), the ability to predict team scores accurately holds significant importance for cricket enthusiasts and stakeholders alike. This paper presents a comprehensive study on IPL score prediction utilizing various machine learning algorithms, including Support Vector Machines (SVM), XGBoost, Multiple Regression, Linear Regression, K-nearest neighbors (KNN), and Random Forest. Through meticulous data preprocessing, feature engineering, and model selection, we aimed to develop a robust predictive framework capable of forecasting team scores with high precision. Our experimentation involved the analysis of historical IPL match data encompassing diverse match and player statistics. Leveraging this data, we employed state-of-the-art machine learning techniques to train and evaluate the performance of each model. Notably, Multiple Regression emerged as the top-performing algorithm, achieving an accuracy of 77.19% and a precision of 54.05% (within a threshold of +/- 10 runs). This research contributes to the advancement of sports analytics by demonstrating the efficacy of machine learning in predicting IPL team scores. The findings underscore the potential of advanced predictive modeling techniques to provide valuable insights for cricket enthusiasts, team management, and betting agencies. Additionally, this study serves as a benchmark for future research endeavors aimed at enhancing the accuracy and interpretability of IPL score prediction models.
Keywords: indian premier league (IPL), cricket, score prediction, machine learning, support vector machines (SVM), xgboost, multiple regression, linear regression, k-nearest neighbors (KNN), random forest, sports analytics
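The "accuracy within +/- 10 runs" figure quoted above is a simple hit-rate over predictions; a minimal sketch of that metric (the scores and threshold below are invented for illustration, not taken from the study):

```python
def within_threshold_accuracy(actual, predicted, threshold=10):
    """Fraction of score predictions within +/- `threshold` runs of the truth."""
    if len(actual) != len(predicted) or not actual:
        raise ValueError("actual and predicted must be equal-length and non-empty")
    hits = sum(1 for a, p in zip(actual, predicted) if abs(a - p) <= threshold)
    return hits / len(actual)

# Illustrative innings totals vs. hypothetical model predictions.
actual = [165, 180, 142, 201, 158]
predicted = [160, 195, 149, 199, 171]
accuracy = within_threshold_accuracy(actual, predicted)  # 3 of 5 within 10 runs
```

Any of the regressors named in the abstract could supply `predicted`; the metric itself is model-agnostic.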
Procedia PDF Downloads 53
5440 Revalidation and Harmonization of Existing IFCC Standardized Hepatic, Cardiac, and Thyroid Function Tests by Precision Optimization and External Quality Assurance Programs
Authors: Junaid Mahmood Alam
Abstract:
Revalidating and harmonizing clinical chemistry analytical principles and optimizing methods through quality control programs and assessments is the preeminent means of attaining optimal outcomes within clinical laboratory services. The present study reports revalidation of our existing IFCC-standardized analytical methods, particularly hepatic and thyroid function tests, by optimization of precision analyses and processing through external and internal quality assessments and regression determination. Parametric components of hepatic (bilirubin, ALT, γGT, ALP), cardiac (LDH, AST, Trop I), and thyroid/pituitary (T3, T4, TSH, FT3, FT4) function tests were used to validate analytical techniques on automated chemistry and immunology analyzers, namely Hitachi 912, Cobas 6000 e601, Cobas c501, and Cobas e411, with UV kinetic and colorimetric dry chemistry principles and electrochemiluminescence immunoassay (ECLi) techniques. The process of validation and revalidation was completed by evaluating the precision-analyzed PreciControl data of the various instruments, plotted against each other with regression analysis (R2). Results showed that revalidation and optimization of the respective parameters, accredited through CAP, CLSI, and NEQAPP assessments, depicted 99.0% to 99.8% optimization, in addition to the methodology and instruments used for the analyses. The regression R2 for BilT was 0.996, whereas ALT, ALP, γGT, LDH, AST, Trop I, T3, T4, TSH, FT3, and FT4 exhibited R2 of 0.998, 0.997, 0.993, 0.967, 0.970, 0.980, 0.976, 0.996, 0.997, 0.997, and 0.990, respectively. This confirmed marked harmonization of analytical methods and instrumentation, thus revalidating optimized precision standardization as per IFCC-recommended guidelines. It is concluded that the practice of revalidating and harmonizing existing or any new services should be followed by all clinical laboratories, especially those associated with tertiary care hospitals.
This will ensure the delivery of standardized, proficiency-tested, optimized services for prompt and better patient care, which will strengthen patients' confidence.
Keywords: revalidation, standardized, IFCC, CAP, harmonized
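The instrument-to-instrument agreement reported above can be reproduced with the standard coefficient of determination for a simple linear fit, where R2 equals the squared Pearson correlation of the paired readings; a minimal sketch with invented control values (not the study's data):

```python
def r_squared(x, y):
    """R^2 of a simple linear regression of y on x (squared Pearson correlation)."""
    n = len(x)
    mean_x = sum(x) / n
    mean_y = sum(y) / n
    sxy = sum((a - mean_x) * (b - mean_y) for a, b in zip(x, y))
    sxx = sum((a - mean_x) ** 2 for a in x)
    syy = sum((b - mean_y) ** 2 for b in y)
    return (sxy * sxy) / (sxx * syy)

# Hypothetical paired quality-control readings from two analyzers.
hitachi = [0.52, 1.10, 2.05, 4.90, 9.80]
cobas = [0.50, 1.08, 2.10, 5.00, 10.00]
agreement = r_squared(hitachi, cobas)  # near 1.0 for well-harmonized methods
```

Values approaching 1.0, as in the table of results above, indicate that the two methods track each other closely.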
Procedia PDF Downloads 269
5439 Integration of Educational Data Mining Models to a Web-Based Support System for Predicting High School Student Performance
Authors: Sokkhey Phauk, Takeo Okazaki
Abstract:
The challenging task in educational institutions is to maximize the high performance of students and minimize the failure rate of poor-performing students. An effective way to approach this task is to identify student learning patterns and their highly influential factors, and to obtain early predictions of student learning outcomes at a timely stage for setting up policies for improvement. Educational data mining (EDM) is an emerging disciplinary field of data mining, statistics, and machine learning concerned with extracting useful knowledge and information for the sake of improvement and development in the education environment. The aim of this work is to propose techniques in EDM and integrate them into a web-based system for predicting poor-performing students. A comparative study of prediction models is conducted. Subsequently, high-performing models are developed to obtain higher performance. The hybrid random forest (Hybrid RF) produces the most successful classification. For the context of intervention and improving learning outcomes, a feature selection method, MICHI, which is a combination of the mutual information (MI) and chi-square (CHI) algorithms based on ranked feature scores, is introduced to select a dominant feature set that improves the performance of prediction; the obtained dominant set is used as information for intervention. Using the proposed EDM techniques, an academic performance prediction system (APPS) is subsequently developed for educational stakeholders to obtain an early prediction of student learning outcomes for timely intervention. Experimental outcomes and evaluation surveys report the effectiveness and usefulness of the developed system. The system is used to help educational stakeholders and related individuals intervene and improve student performance.
Keywords: academic performance prediction system, educational data mining, dominant factors, feature selection method, prediction model, student performance
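The MICHI method above combines MI and chi-square rankings, but the abstract does not give the exact combination rule; the sketch below therefore assumes a simple averaged-rank scheme over per-feature scores, with invented feature names and scores:

```python
def michi_select(mi_scores, chi_scores, k):
    """Select k features by averaging mutual-information and chi-square ranks.

    mi_scores / chi_scores: dicts mapping feature name -> score (higher = better).
    The averaged-rank combination is an illustrative assumption; the paper's
    exact MICHI rule may differ.
    """
    def ranks(scores):
        ordered = sorted(scores, key=scores.get, reverse=True)
        return {feat: r for r, feat in enumerate(ordered, start=1)}

    mi_rank, chi_rank = ranks(mi_scores), ranks(chi_scores)
    combined = {f: (mi_rank[f] + chi_rank[f]) / 2 for f in mi_scores}
    return sorted(combined, key=combined.get)[:k]  # lowest averaged rank wins

# Hypothetical per-feature scores for a student-performance dataset.
mi = {"study_time": 0.9, "absences": 0.2, "parent_edu": 0.5}
chi = {"study_time": 10.0, "absences": 30.0, "parent_edu": 5.0}
dominant = michi_select(mi, chi, k=2)
```

The selected dominant set would then feed both the classifier and the intervention report described in the abstract.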
Procedia PDF Downloads 106
5438 Comparative Analysis of the Performance Between Public and Private Companies: Explanatory Factors
Authors: Atziri Moreno Vite, David Silva Gutiérrez
Abstract:
Oil companies have become the key players in the world energy scenario thanks to their strong control of the level of hydrocarbon reserves and production. The present research aims to identify the main factors that explain the results of these companies through an in-depth review of the specialized literature, and to analyze their results by means of econometric analysis with techniques such as Data Envelopment Analysis (DEA). The results show the relevance and impact of factors such as the company's level of employment or investment.
Keywords: oil companies, performance, determinants, productive
Procedia PDF Downloads 124
5437 Study of Land Use Changes around an Archaeological Site Using Satellite Imagery Analysis: A Case Study of Hathnora, Madhya Pradesh, India
Authors: Pranita Shivankar, Arun Suryawanshi, Prabodhachandra Deshmukh, S. V. C. Kameswara Rao
Abstract:
Many undesirable and significant changes occur in landscapes and in the regions surrounding historically important structures as impacts of anthropogenic activities over time. A better understanding of such influences using recently developed satellite remote sensing techniques helps in planning strategies for minimizing the negative impacts on the existing environment. In 1982, a fossilized hominid skull cap was discovered at a site located along the northern bank of the east-west flowing river Narmada in the village of Hathnora. Close to the same site, Late Acheulian and Middle Palaeolithic tools have been discovered in the immediately overlying pebbly gravel, suggesting that the 'Narmada skull' may be from the Middle Pleistocene age. Reviews of recent research on hominid remains from Late Acheulian and Middle Palaeolithic sites all over the world suggest succession and contemporaneity of cultures there, enhancing the importance of Hathnora as a rare and precious site. In this context, maximum likelihood classification using digital interpretation techniques was carried out for this study area using satellite imagery from Landsat ETM+ for the year 2006 and Landsat 8 (OLI and TIRS) for the year 2016. The overall accuracy of the Land Use Land Cover (LULC) classification of the 2016 imagery was around 77.27% based on ground truth data. The significant reduction in the main river course and agricultural activities and the increase in built-up area observed in the remote sensing data analysis are undoubtedly the outcome of human encroachment in the vicinity of this eminent heritage site.
Keywords: cultural succession, digital interpretation, Hathnora, Homo Sapiens, Late Acheulian, Middle Palaeolithic
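The overall accuracy figure quoted above is the diagonal sum of the classification's confusion (error) matrix against ground-truth samples, divided by the total; a minimal sketch with an invented matrix (not the study's data):

```python
def overall_accuracy(confusion):
    """Overall accuracy = correctly classified samples / total samples.

    `confusion` is a square error matrix: rows = reference (ground truth)
    classes, columns = classified classes.
    """
    correct = sum(confusion[i][i] for i in range(len(confusion)))
    total = sum(sum(row) for row in confusion)
    return correct / total

# Hypothetical 3-class LULC error matrix (e.g. water / agriculture / built-up).
matrix = [
    [8, 1, 0],
    [2, 6, 1],
    [0, 1, 3],
]
acc = overall_accuracy(matrix)  # 17 / 22, roughly 0.77
```

A full accuracy assessment would also report per-class producer's and user's accuracies from the same matrix.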
Procedia PDF Downloads 172
5436 Examining Statistical Monitoring Approach against Traditional Monitoring Techniques in Detecting Data Anomalies during Conduct of Clinical Trials
Authors: Sheikh Omar Sillah
Abstract:
Introduction: Monitoring is an important means of ensuring the smooth implementation and quality of clinical trials. For many years, traditional site monitoring approaches have been critical in detecting data errors but not optimal in identifying fabricated and implanted data as well as non-random data distributions that may significantly invalidate study results. The objective of this paper was to provide recommendations based on best statistical monitoring practices for detecting data-integrity issues suggestive of fabrication and implantation early in the study conduct to allow implementation of meaningful corrective and preventive actions. Methodology: Electronic bibliographic databases (Medline, Embase, PubMed, Scopus, and Web of Science) were used for the literature search, and both qualitative and quantitative studies were sought. Search results were uploaded into Eppi-Reviewer Software, and only publications written in the English language from 2012 were included in the review. Gray literature not considered to present reproducible methods was excluded. Results: A total of 18 peer-reviewed publications were included in the review. The publications demonstrated that traditional site monitoring techniques are not efficient in detecting data anomalies. By specifying project-specific parameters such as laboratory reference range values, visit schedules, etc., with appropriate interactive data monitoring, statistical monitoring can offer early signals of data anomalies to study teams. The review further revealed that statistical monitoring is useful to identify unusual data patterns that might be revealing issues that could impact data integrity or may potentially impact study participants' safety. However, subjective measures may not be good candidates for statistical monitoring. 
Conclusion: The statistical monitoring approach requires a combination of education, training, and experience sufficient to implement its principles in detecting data anomalies for the statistical aspects of a clinical trial.
Keywords: statistical monitoring, data anomalies, clinical trials, traditional monitoring
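One simple statistical-monitoring primitive in the spirit of the review's findings is flagging sites whose summary statistics deviate strongly from the study-wide distribution; the z-score scheme, threshold, variable, and data below are all invented for illustration and are not from the reviewed publications:

```python
def flag_unusual_sites(site_means, z_threshold=1.5):
    """Return site IDs whose mean deviates from the cross-site mean by more
    than z_threshold standard deviations.

    A real system would work on patient-level data across many variables;
    this is a one-variable sketch with an illustrative threshold.
    """
    values = list(site_means.values())
    n = len(values)
    mean = sum(values) / n
    sd = (sum((v - mean) ** 2 for v in values) / (n - 1)) ** 0.5
    if sd == 0:
        return []  # all sites identical; nothing to flag
    return [s for s, v in site_means.items() if abs(v - mean) / sd > z_threshold]

# Hypothetical per-site mean systolic blood pressure: site D looks anomalous.
sites = {"A": 128.0, "B": 131.0, "C": 126.0, "D": 160.0, "E": 129.0}
suspicious = flag_unusual_sites(sites)
```

Flags from such checks are signals for follow-up, not proof of fabrication; subjective measures, as the review notes, are poor candidates for this treatment.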
Procedia PDF Downloads 75
5435 Aerial Photogrammetry-Based Techniques to Rebuild the 30-Year Landform Changes of a Landslide-Dominated Watershed in Taiwan
Authors: Yichin Chen
Abstract:
Taiwan is an island characterized by active tectonics and high erosion rates. Monitoring the dynamic landscape of Taiwan is an important issue for disaster mitigation, geomorphological research, and watershed management. Long-term landform data at high spatiotemporal resolution is essential for quantifying and simulating geomorphological processes and developing warning systems. Recently, advances in unmanned aerial vehicle (UAV) and computational photogrammetry technology have provided an effective way to rebuild and monitor topographic change at high spatiotemporal resolution. This study reconstructs 30 years of landform change (1986-2017) in the Aiyuzi watershed using aerial photogrammetry-based techniques. The Aiyuzi watershed, located in central Taiwan with an area of 3.99 km², is known for its frequent landslide and debris flow disasters. Aerial photos were taken by UAV, and multi-temporal historical stereo photographs taken by the Aerial Survey Office of Taiwan's Forestry Bureau were collected. To rebuild the orthoimages and digital surface models (DSMs), the photogrammetry software Pix4DMapper was used. Furthermore, to control model accuracy, a set of ground control points was surveyed using eGPS. The results show that the generated DSMs have ground sampling distances (GSD) of ~10 cm and ~0.3 cm from the UAV and historical photographs, respectively, and a vertical error of ~1 m. Comparison of the DSMs shows that many deep-seated landslides (over 20 m deep) occurred upstream in the Aiyuzi watershed. Even though a large amount of sediment is delivered from the landslides, the steep main channel has sufficient capacity to transport the sediment and to erode the river bed to a depth of ~20 m. Most of the sediment is transported to the watershed outlet and deposited in the downstream channel.
This case study shows that UAV and photogrammetry technology are effective tools for monitoring topographic change.
Keywords: aerial photogrammetry, landslide, landform change, Taiwan
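Landform change between two epochs is obtained by differencing the co-registered DSM grids cell by cell; a minimal sketch using plain nested lists and invented elevations (a real workflow would operate on georeferenced raster arrays):

```python
def dsm_difference(dsm_old, dsm_new):
    """Per-cell elevation change (new - old) for two co-registered DSM grids.

    Negative values indicate erosion or landslide scars; positive values
    indicate deposition. Grids must share shape and georeference.
    """
    return [
        [new - old for old, new in zip(row_old, row_new)]
        for row_old, row_new in zip(dsm_old, dsm_new)
    ]

# Toy 2x3 elevation grids in metres (illustrative values only).
dsm_1986 = [[120.0, 118.5, 117.0],
            [121.0, 119.0, 116.5]]
dsm_2017 = [[118.0, 118.5, 119.0],
            [101.0, 119.5, 116.0]]
change = dsm_difference(dsm_1986, dsm_2017)  # a -20.0 cell marks a deep scar
```

Summing negative and positive cells (times cell area) yields the eroded and deposited volumes discussed above; the ~1 m vertical error bounds what changes are detectable.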
Procedia PDF Downloads 157
5434 Comparison of Regional and Local Indwelling Catheter Techniques to Prolong Analgesia in Total Knee Arthroplasty Procedures: Continuous Peripheral Nerve Block and Continuous Periarticular Infiltration
Authors: Jared Cheves, Amanda DeChent, Joyce Pan
Abstract:
Total knee arthroplasties (TKAs) are among the most common but painful surgical procedures performed in the United States. Currently, the gold standard for postoperative pain management is the use of opioids. However, in the wake of the opioid epidemic, the healthcare system is attempting to reduce opioid consumption by trialing innovative opioid-sparing analgesic techniques such as continuous peripheral nerve blocks (CPNB) and continuous periarticular infiltration (CPAI). The alleviation of pain, particularly during the first 72 hours postoperatively, is of utmost importance due to its association with delayed recovery, impaired rehabilitation, immunosuppression, the development of chronic pain, the development of rebound pain, and decreased patient satisfaction. While both CPNB and CPAI are in use today, there is limited evidence comparing the two to the current standard of care or to each other. An extensive literature review was performed to explore the safety profiles and effectiveness of CPNB and CPAI in reducing reported pain scores and decreasing opioid consumption. The literature revealed that the use of CPNB contributed to lower pain scores and decreased opioid use when compared to opioid-only control groups. Additionally, CPAI did not improve pain scores or decrease opioid consumption when combined with a multimodal analgesic (MMA) regimen. When comparing CPNB and CPAI to each other, neither unanimously lowered pain scores to a greater degree, but the literature indicates that CPNB decreased opioid consumption more than CPAI. More research is needed to further cement the efficacy of CPNB and CPAI as standard components of MMA in TKA procedures. In addition, future research can also focus on novel catheter-free applications to reduce the complications of continuous catheter analgesics.
Keywords: total knee arthroplasty, continuous peripheral nerve blocks, continuous periarticular infiltration, opioid, multimodal analgesia
Procedia PDF Downloads 96
5433 Long-Term Results of Coronary Bifurcation Stenting with Drug Eluting Stents
Authors: Piotr Muzyk, Beata Morawiec, Mariusz Opara, Andrzej Tomasik, Brygida Przywara-Chowaniec, Wojciech Jachec, Ewa Nowalany-Kozielska, Damian Kawecki
Abstract:
Background: Coronary bifurcation is one of the most complex lesions in patients with coronary artery disease. Provisional T-stenting is currently one of the recommended techniques. The aim was to assess optimal methods of treatment in the era of drug-eluting stents (DES). Methods: The registry consisted of data from 1916 patients treated with percutaneous coronary interventions (PCI) using either first- or second-generation DES. Patients with bifurcation lesions entered the analysis. Major adverse cardiac and cerebrovascular events (MACCE) were assessed at one year of follow-up and comprised death, acute myocardial infarction (AMI), repeated PCI (re-PCI) of the target vessel, and stroke. Results: Of 1916 registry patients, 204 patients (11%) were diagnosed with a bifurcation lesion >50% and entered the analysis. The most commonly used technique was provisional T-stenting (141 patients, 69%). Optimization with the kissing-balloon technique was performed in 45 patients (22%). In 59 patients (29%), second-generation DES were implanted, while in 112 patients (55%), first-generation DES were used. In 33 patients (16%), both types of DES were used. Procedural success (TIMI 3 flow) was achieved in 98% of patients. At one-year follow-up, there were 39 MACCE (19%) (9 deaths, 17 AMI, 16 re-PCI, and 5 strokes). Provisional T-stenting resulted in a rate of MACCE similar to other techniques (16% vs. 5%, p=0.27) and a similar occurrence of re-PCI (6% vs. 2%, p=0.78). The post-PCI kissing-balloon technique gave equal outcomes, with 3% vs. 16% MACCE in patients in whom no optimization technique was used (p=0.39). The type of implanted DES (second- vs. first-generation) had no influence on MACCE (4% vs. 14%, respectively, p=0.12) or re-PCI (1.7% vs. 51% of patients, respectively, p=0.28). Conclusions: The treatment of bifurcation lesions with PCI represents high-risk procedures with a high rate of MACCE.
The stenting technique, optimization of PCI, and the generation of the implanted stent should be personalized for each case to balance the risk of the procedure. In this setting, operator experience might be a factor in better outcomes, which should be further investigated.
Keywords: coronary bifurcation, drug eluting stents, long-term follow-up, percutaneous coronary interventions
Procedia PDF Downloads 204
5432 Comparison of Inexpensive Cell Disruption Techniques for an Oleaginous Yeast
Authors: Scott Nielsen, Luca Longanesi, Chris Chuck
Abstract:
Palm oil is obtained from the flesh and kernel of the fruit of oil palms and is the most productive and inexpensive oil crop. The global demand for palm oil is approximately 75 million metric tonnes, a 29% increase in global production since 2016. This expansion of oil palm cultivation has resulted in mass deforestation, vast biodiversity destruction, and increasing net greenhouse gas emissions. One possible alternative is to produce a saturated oil, similar to palm, from microbes such as oleaginous yeast. The yeasts can be cultured on sugars derived from second-generation sources and do not compete with tropical forests for land. One highly promising oleaginous yeast for this application is Metschnikowia pulcherrima. However, recent techno-economic modeling has shown that cell lysis and standard lipid extraction are major contributors to the cost of the oil. Typical cell disruption techniques to extract either single cell oils or proteins have been based around bead-beating, homogenization, and acid lysis. However, these can have a detrimental effect on lipid quality and are energy-intensive. In this study, a vortex separator, which produces high shear with minimal energy input, was investigated as a potential low-energy method of lysing cells. This was compared to four more traditional methods (thermal lysis, acid lysis, alkaline lysis, and osmotic lysis). For each method, the yeast loading was also examined at 1 g/L, 10 g/L, and 100 g/L. The quality of the cell disruption was measured by optical cell density, cell counting, and comparison of particle size distribution profiles over a 2-hour period. This study demonstrates that the vortex separator is highly effective at lysing the cells and could potentially be used as a simple apparatus for lipid recovery in an oleaginous yeast process.
The further development of this technology could potentially reduce the overall cost of microbial lipids in the future.
Keywords: palm oil substitute, metschnikowia pulcherrima, cell disruption, cell lysis
Procedia PDF Downloads 205
5431 The Strategy for Detection of Catecholamines in Body Fluids: Optical Sensor
Authors: Joanna Cabaj, Sylwia Baluta, Karol Malecha, Kamila Drzozga
Abstract:
Catecholamines are the principal neurotransmitters that mediate a variety of central nervous system functions, such as motor control, cognition, emotion, memory processing, and endocrine modulation. Dysfunctions in catecholamine neurotransmission occur in some neurologic and neuropsychiatric diseases. Changing neurotransmitter levels in biological fluids can be markers of several neurological disorders. Because of its significance for analytical techniques and diagnostics, the sensitive and selective detection of neurotransmitters is attracting increasing attention in different areas of bioanalysis and biomedical research. Recently, fluorescent techniques for the detection of catecholamines have attracted interest due to their reasonable cost, convenient control, and maneuverability in biological environments. Nevertheless, despite the evident need for a sensitive and selective catecholamine sensor, the development of a convenient method for these neurotransmitters is still at a basic level. The manipulation of nanostructured materials in conjunction with biological molecules has led to the development of a new class of hybrid modified biosensors in which both enhancement of charge transport and preservation of biological activity may be obtained. Immobilization of biomaterials on electrode surfaces is the crucial step in fabricating electrochemical as well as optical biosensors and bioelectronic devices. Continuing a systematic investigation into the manufacture of enzyme-conducting sensing systems, a convenient fluorescence sensing strategy for catecholamine detection is presented here, based on FRET (fluorescence resonance energy transfer) phenomena observed for, e.g., complexes of Fe²⁺ and epinephrine. The biosensor was constructed using low temperature co-fired ceramics (LTCC) technology. The sensing system uses the catalytic oxidation of catecholamines and the quenching of the strong luminescence of the resulting complexes due to FRET.
The detection process was based on the oxidation of the substrate in the presence of the enzymes laccase/tyrosinase.
Keywords: biosensor, conducting polymer, enzyme, FRET, LTCC
Procedia PDF Downloads 257
5430 A Comparative Analysis of Clustering Approaches for Understanding Patterns in Health Insurance Uptake: Evidence from Sociodemographic Kenyan Data
Authors: Nelson Kimeli Kemboi Yego, Juma Kasozi, Joseph Nkruzinza, Francis Kipkogei
Abstract:
The study investigated the low uptake of health insurance in Kenya despite efforts to achieve universal health coverage through various health insurance schemes. Unsupervised machine learning techniques were employed to identify patterns in health insurance uptake based on sociodemographic factors among Kenyan households. The aim was to identify key demographic groups that are underinsured and to provide insights for the development of effective policies and outreach programs. Using the 2021 FinAccess Survey, the study clustered Kenyan households based on their health insurance uptake and sociodemographic features to reveal patterns in health insurance uptake across the country. The effectiveness of k-prototypes clustering, hierarchical clustering, and agglomerative hierarchical clustering in clustering based on sociodemographic factors was compared. The k-prototypes approach was found to be the most effective at uncovering distinct and well-separated clusters in the Kenyan sociodemographic data related to health insurance uptake based on silhouette, Calinski-Harabasz, Davies-Bouldin, and Rand indices. Hence, it was utilized in uncovering the patterns in uptake. The results of the analysis indicate that inclusivity in health insurance is greatly related to affordability. The findings suggest that targeted policy interventions and outreach programs are necessary to increase health insurance uptake in Kenya, with the ultimate goal of achieving universal health coverage. The study provides important insights for policymakers and stakeholders in the health insurance sector to address the low uptake of health insurance and to ensure that healthcare services are accessible and affordable to all Kenyans, regardless of their socio-demographic status. 
The study highlights the potential of unsupervised machine learning techniques to provide insights into complex health policy issues and improve decision-making in the health sector.
Keywords: health insurance, unsupervised learning, clustering algorithms, machine learning
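One of the agreement measures named above, the Rand index, scores how often two clusterings agree on whether a pair of households belongs together; a minimal pure-Python sketch with invented cluster labels:

```python
def rand_index(labels_a, labels_b):
    """Rand index between two clusterings of the same items (1.0 = identical
    partitions). Counts item pairs grouped together, or kept apart, in both
    labelings, over all pairs.
    """
    n = len(labels_a)
    agree = pairs = 0
    for i in range(n):
        for j in range(i + 1, n):
            pairs += 1
            same_a = labels_a[i] == labels_a[j]
            same_b = labels_b[i] == labels_b[j]
            agree += same_a == same_b  # True counts as 1
    return agree / pairs

# Hypothetical assignments of six households by two clustering algorithms.
kproto = [0, 0, 1, 1, 2, 2]
hier = [0, 0, 1, 2, 2, 2]
similarity = rand_index(kproto, hier)
```

In practice the adjusted Rand index, which corrects for chance agreement, is usually preferred for comparing algorithms such as k-prototypes and hierarchical clustering.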
Procedia PDF Downloads 138
5429 Genetically Modified Organisms
Authors: Mudrika Singhal
Abstract:
The research paper is about how genetically modified organisms evolved and their significance in today's world. It also highlights the various pros and cons of genetically modified organisms and the progress of India in this field. A genetically modified organism is one whose genetic material has been altered using genetic engineering techniques. They have a wide range of uses, such as transgenic plants, genetically modified mammals such as mice, and also insects and aquatic life. Their use dates back to around 12,000 B.C., when humans domesticated plants and animals. At that time, humans produced genetically modified organisms by selective breeding, not by genetic engineering techniques. Selective breeding is the procedure in which selected traits are bred into plants and animals, which are then domesticated. The domestication of wild plants into suitable cultigens is a well-known example of this technique. GMOs have uses in varied fields, ranging from biological and medical research and the production of pharmaceutical drugs to agriculture. The first organisms to be genetically modified were microbes, because of their simpler genetics. At present, the genetically modified protein insulin is used to treat diabetes. In the case of plants, transgenic plants, genetically modified crops, and cisgenic plants are examples of genetic modification. In the case of mammals, transgenic animals such as mice and rats serve various purposes, such as researching human diseases and improving animal health. Turning to the pros and cons of genetically modified organisms: the pros include crops with higher yield, shorter growth time, and more predictability in comparison to traditional breeding; the cons include claims that they are dangerous to mammals such as rats and that these products contain proteins that could trigger allergic reactions.
In India presently, group of GMOs include GM microorganisms, transgenic crops and animals. There are varied applications in the field of healthcare and agriculture. In the nutshell, the research paper is about the progress in the field of genetic modification, taking along the effects in today’s world.Keywords: applications, mammals, transgenic, engineering and technology
Procedia PDF Downloads 597
5428 Rapid Algorithm for GPS Signal Acquisition
Authors: Fabricio Costa Silva, Samuel Xavier de Souza
Abstract:
A Global Positioning System (GPS) receiver determines position, velocity, and timing information from satellite signals. To recover this information, the incoming signal must be combined with a locally generated replica. The procedure, called acquisition, must find two parameters of the incoming signal: its carrier frequency and its code phase. This search is very time consuming, so several techniques exist to reduce the computational complexity, but each of them puts project requirements in conflict. In this paper we present a method that reduces the computational complexity by shrinking the search space and parallelizing the search.
Keywords: GPS, acquisition, complexity, parallelism
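A widely used way to parallelize the code-phase part of the search is the FFT-based parallel code phase search, which evaluates the correlation over every code phase at one Doppler bin with a single FFT/IFFT pair. The sketch below is illustrative only, not the authors' specific algorithm; the sampling rate, code, and Doppler bins are invented for the example:

```python
import numpy as np

def parallel_code_phase_search(signal, code, doppler_bins, fs):
    """For each candidate Doppler frequency, wipe off the carrier and
    compute the circular correlation over all code phases at once via
    FFT, instead of searching code phases serially."""
    n = len(signal)
    t = np.arange(n) / fs
    code_fft_conj = np.conj(np.fft.fft(code))
    best = (0.0, None, None)  # (peak power, doppler, code phase)
    for fd in doppler_bins:
        wiped = signal * np.exp(-2j * np.pi * fd * t)   # carrier wipe-off
        corr = np.fft.ifft(np.fft.fft(wiped) * code_fft_conj)
        power = np.abs(corr) ** 2
        k = int(np.argmax(power))
        if power[k] > best[0]:
            best = (power[k], fd, k)
    return best

# Demo: a +/-1 code delayed by 200 samples with a 1 kHz Doppler shift.
rng = np.random.default_rng(0)
code = rng.choice([-1.0, 1.0], size=1024)
fs = 4000.0
t = np.arange(1024) / fs
incoming = np.roll(code, 200) * np.exp(2j * np.pi * 1000.0 * t)
peak, doppler, phase = parallel_code_phase_search(
    incoming, code, [0.0, 500.0, 1000.0, 1500.0], fs)
print(doppler, phase)  # recovers the injected Doppler bin and code phase
```

The serial search costs one correlation per (frequency, phase) pair; the FFT version reduces the phase dimension to O(n log n) per frequency bin, which is the kind of complexity reduction the abstract refers to.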
Procedia PDF Downloads 538
5427 Advanced Stability Criterion for Time-Delayed Systems of Neutral Type and Its Application
Authors: M. J. Park, S. H. Lee, C. H. Lee, O. M. Kwon
Abstract:
This paper investigates the stability problem for linear systems of neutral type with time-varying delay. By constructing various Lyapunov-Krasovskii functionals and utilizing some mathematical techniques, sufficient stability conditions for the systems are established in terms of linear matrix inequalities (LMIs), which can be solved efficiently by various optimization algorithms. Finally, illustrative examples are given to show the effectiveness of the proposed criterion.
Keywords: neutral systems, time-delay, stability, Lyapunov method, LMI
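As a much simpler illustration of how an LMI certifies stability of a delayed system (a classic delay-independent condition, not the delay-dependent neutral-type criterion of the paper; the matrices A, Ad, P, Q below are invented for the example), one can check negative definiteness of the block LMI for x'(t) = A x(t) + Ad x(t - h):

```python
import numpy as np

# Hypothetical example system x'(t) = A x(t) + Ad x(t - h).
A = -2.0 * np.eye(2)
Ad = 0.5 * np.eye(2)

# Candidate Lyapunov-Krasovskii matrices (P, Q > 0); identity for simplicity.
P = np.eye(2)
Q = np.eye(2)

# Delay-independent stability LMI:
# [[A'P + PA + Q,  P Ad],
#  [Ad' P,          -Q ]]  <  0
M = np.block([[A.T @ P + P @ A + Q, P @ Ad],
              [Ad.T @ P, -Q]])

eigs = np.linalg.eigvalsh(M)  # M is symmetric by construction
stable = eigs.max() < 0
print(stable)
```

If the largest eigenvalue of M is negative for some P, Q > 0, the system is asymptotically stable for any delay; LMI solvers search over P and Q automatically, which is what "easily solved by various effective optimization algorithms" refers to.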
Procedia PDF Downloads 348
5426 Effects of Plyometric Exercises on Agility, Power and Speed Improvement of U-17 Female Sprinters in Case of Burayu Athletics Project, Oromia, Ethiopia
Authors: Abdeta Bayissa Mekessa
Abstract:
The purpose of this study was to examine the effects of plyometric exercises on the agility, power, and speed of U-17 female sprinters in the Burayu Athletics project. A true experimental research design was employed. The total population of the study was 14 U-17 female sprinters from the Burayu Athletics project. Because the population was small, the researcher included all of it using a comprehensive sampling technique. The subjects were then assigned to an experimental group (N=7) and a control group (N=7) by simple random sampling. The experimental group participated in plyometric training for 8 weeks, 3 days per week and 60 minutes per day, in addition to their regular training, while the control group followed only their regular training program. The variables selected were agility, power, and speed, measured with the Illinois agility test, the standing long jump test, and the 30 m sprint test, respectively. Both groups were tested before (pre-test) and after (post-test) the 8 weeks of plyometric training. Data were analyzed with SPSS version 26.0 using a paired sample t-test to compare the pre-test and post-test results, with significance set at p<0.05. After 8 weeks of plyometric training, significant improvements were found in agility (MD=0.45, p<0.05), power (MD=-1.157, p<0.05), and speed (MD=0.37, p<0.05) for the experimental group, whereas there was no significant change (p>0.05) in those variables in the control group. The findings show that eight (8) weeks of plyometric exercise had a positive effect on the agility, power, and speed of female sprinters.
Therefore, athletics coaches and athletes are highly recommended to include plyometric exercise in their training programs.
Keywords: plyometric exercise, speed, power, agility, female sprinter
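The paired (dependent) t-test used here compares each athlete's pre-test and post-test scores on the same variable. A minimal sketch of the computation, with invented 30 m sprint times for seven athletes (not the study's data):

```python
import math
import statistics

# Hypothetical pre/post 30 m sprint times (seconds) for 7 athletes.
pre = [5.2, 5.4, 5.1, 5.3, 5.5, 5.2, 5.4]
post = [4.9, 5.0, 4.8, 5.0, 5.1, 4.9, 5.1]

diffs = [a - b for a, b in zip(pre, post)]   # positive = faster after training
n = len(diffs)
mean_d = statistics.mean(diffs)
sd_d = statistics.stdev(diffs)               # sample standard deviation
t_stat = mean_d / (sd_d / math.sqrt(n))      # paired t statistic, df = n - 1

# Two-tailed critical value for df = 6 at alpha = 0.05 is 2.447.
significant = abs(t_stat) > 2.447
print(round(t_stat, 2), significant)
```

SPSS reports the same statistic along with an exact p-value; the mean difference (MD) quoted in the abstract is `mean_d` here.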
Procedia PDF Downloads 38
5425 Basic Evaluation for Polyetherimide Membrane Using Spectroscopy Techniques
Authors: Hanan Alenezi
Abstract:
Membrane performance depends on the solvent used in preparation. A membrane made of polyetherimide (PEI) was evaluated for gas separation using X-ray diffraction (XRD), scanning electron microscopy (SEM), and energy-dispersive X-ray spectroscopy (EDS). The purity and thickness of the membrane were determined in order to optimize PEI membrane preparation.
Keywords: Energy-Dispersive X-Ray Spectroscopy (EDS), membrane, polyetherimide (PEI), Scanning Electron Microscopy (SEM), solvent, X-Ray Diffraction (XRD)
Procedia PDF Downloads 183
5424 Pedagogical Tools In The 21st Century
Authors: M. Aherrahrou
Abstract:
Moroccan education currently faces many difficulties and problems due to traditional methods of teaching. Neuro-linguistic programming (NLP) appears to hold much potential for education at all levels. The major aim of this paper is to explore the effect of certain neuro-linguistic programming techniques in one educational institution in Morocco. Quantitative and qualitative methods are used. The findings demonstrate the effectiveness of this new approach in Moroccan education and show it to be a promising tool for improving the quality of learning.
Keywords: learning and teaching environment, neuro-linguistic programming, education, quality of learning
Procedia PDF Downloads 355
5423 Personalizing Human Physical Life Routines Recognition over Cloud-based Sensor Data via AI and Machine Learning
Authors: Kaushik Sathupadi, Sandesh Achar
Abstract:
Pervasive computing is a growing research field that aims to recognize human physical life routines (HPLR) using body-worn sensors such as MEMS-based technologies. The use of these technologies for human activity recognition is steadily increasing. Personalizing human life routines with machine-learning techniques has long been an intriguing topic, and various methods have demonstrated the ability to recognize basic movement patterns; however, anticipating the dynamics of human living patterns still needs improvement. This study introduces state-of-the-art techniques for recognizing static and dynamic patterns and for forecasting challenging activities from multi-fused sensors. Numerous MEMS signals are extracted from one self-annotated IM-WSHA dataset and two benchmark datasets. First, the acquired raw data are filtered with z-normalization and de-noising methods. Then, statistical, local binary pattern, auto-regressive model, and intrinsic time-scale decomposition features are extracted from different domains. Next, the acquired features are optimized using maximum relevance and minimum redundancy (mRMR). Finally, an artificial neural network is applied to analyze the whole system's performance. As a result, we attained a 90.27% recognition rate on the self-annotated dataset, while HARTH and KU-HAR achieved 83% on nine living activities and 90.94% on 18 static and dynamic routines. Thus, the proposed HPLR system outperformed other state-of-the-art systems when evaluated against other methods in the literature.
Keywords: artificial intelligence, machine learning, gait analysis, local binary pattern (LBP), statistical features, micro-electro-mechanical systems (MEMS), maximum relevance and minimum redundancy (mRMR)
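The mRMR step can be illustrated with a greedy selection that trades relevance to the label against redundancy with already-selected features. The sketch below uses absolute Pearson correlation as a stand-in for mutual information (a common simplification; the demo data and scoring are invented for illustration, not the study's features):

```python
import numpy as np

def mrmr_select(X, y, k):
    """Greedy mRMR: pick k feature indices maximizing
    |corr(feature, y)| - mean |corr(feature, already-selected features)|."""
    def corr(a, b):
        return np.corrcoef(a, b)[0, 1]

    n_feat = X.shape[1]
    relevance = np.array([abs(corr(X[:, j], y)) for j in range(n_feat)])
    selected = []
    while len(selected) < k:
        best_j, best_score = None, -np.inf
        for j in range(n_feat):
            if j in selected:
                continue
            redundancy = (np.mean([abs(corr(X[:, j], X[:, s])) for s in selected])
                          if selected else 0.0)
            score = relevance[j] - redundancy
            if score > best_score:
                best_j, best_score = j, score
        selected.append(best_j)
    return selected

# Demo: f0 predicts y, f1 is a near-duplicate of f0, f2 is weakly relevant.
y = np.array([0, 1, 0, 1, 0, 1, 0, 1], dtype=float)
f0 = np.array([0.1, 0.9, 0.0, 1.0, 0.1, 0.9, 0.0, 1.0])
f1 = np.array([0.2, 0.8, 0.1, 0.9, 0.2, 0.8, 0.1, 0.9])
f2 = np.array([0, 1, 1, 1, 0, 1, 0, 0], dtype=float)
X = np.column_stack([f0, f1, f2])
print(mrmr_select(X, y, 2))  # the redundant near-duplicate f1 is skipped
```

True mRMR uses mutual information estimates rather than correlation, but the greedy relevance-minus-redundancy structure is the same.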
Procedia PDF Downloads 20
5422 A Review on Investigating the Relations between Water Harvesting and Water Conflicts
Authors: B. Laurita
Abstract:
The importance of water harvesting (WH) as an effective means of dealing with water scarcity is universally recognized. The collection and storage of rainwater, floodwater, or quick runoff, and their conversion to productive uses, can ensure water availability for domestic and agricultural use, enabling lower exploitation of the aquifer, preventing erosion, and providing significant ecosystem services. It has also been shown that water harvesting can reduce the emergence of water conflicts when supported by a cooperative process of planning and management. On the other hand, the construction of water harvesting structures changes the hydrological regime, affecting upstream-downstream dynamics and altering water allocation, often causing contention. Moreover, the dynamics between water harvesting and water conflict have not yet been properly investigated. The objective of this study is therefore to analyze the relations between water harvesting and the emergence of water conflicts, providing a solid theoretical basis and a foundation for future studies. Two search engines, Google Scholar and Scopus, were selected for the study, with separate searches on the mutual influences between water conflicts and the four main water harvesting techniques: rooftop harvesting, surface harvesting, underground harvesting, and runoff harvesting. Some of these techniques have been developed and implemented on scales ranging from small household systems to gargantuan dam systems. Instead of focusing on the collisions related to large-scale systems, this review collects examples of the effects that the implementation of small water harvesting systems has had on access to water and on water governance.
The research highlights that in the studies conducted to date, water harvesting, and in particular structures that collect and store water for domestic use, is usually recognized as a positive, palliative element during contention. On the other hand, water harvesting can worsen and, in some cases, even generate conflicts over water management. This shows the need for studies that consider both the benefits and the negative influences of water harvesting, analyzing its role as a triggering or a mitigating factor in conflict situations.
Keywords: arid areas, governance, water conflicts, water harvesting
Procedia PDF Downloads 203
5421 Experimental Modeling of Spray and Water Sheet Formation Due to Wave Interactions with Vertical and Slant Bow-Shaped Model
Authors: Armin Bodaghkhani, Bruce Colbourne, Yuri S. Muzychka
Abstract:
The process of spray-cloud formation and the flow kinematics produced by breaking-wave impact on vertical and slanted lab-scale bow-shaped models were experimentally investigated. Bubble image velocimetry (BIV) and image processing (IP) techniques were applied to study the various types of wave-model impact. Different wave characteristics were generated in a tow tank to investigate the effects of wave characteristics, such as wave phase velocity and wave steepness, on droplet velocities and on the process of spray-cloud formation. The phase ensemble-averaged vertical velocity and turbulent intensity were computed. A high-speed camera and diffused LED backlights were used to capture images for post-processing. Pressure sensors and capacitive wave probes measured the wave impact pressure and the free-surface profile at different locations on the model and wave tank, respectively. Droplet sizes and velocities were measured with the BIV and IP techniques, which trace bubbles and droplets by correlating the texture in successive images. The impact pressure and droplet size distributions were compared with several previous experimental models, with satisfactory agreement. The distributions of droplets in front of both models are presented. Because spray formation is a highly transient process, the drag coefficient for several stages of this transient displacement, for various droplet size ranges and Reynolds numbers, was calculated using the ensemble-average method. The experimental results show that the slanted model produces less spray than the vertical model, and the droplet velocities generated by wave impact with the slanted model are lower than those for the vertical model.
Keywords: spray characteristics, droplet size and velocity, wave-body interactions, bubble image velocimetry, image processing
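For a single droplet, the drag coefficient at a given Reynolds number is often estimated from a standard sphere-drag correlation. A small sketch using the Schiller-Naumann correlation (an illustrative choice; the abstract does not state which correlation the authors used, and the droplet values below are invented):

```python
import math

def droplet_drag_coefficient(re):
    """Schiller-Naumann drag correlation for a sphere, valid for Re < 1000:
    Cd = (24 / Re) * (1 + 0.15 * Re**0.687)."""
    if re <= 0:
        raise ValueError("Reynolds number must be positive")
    return (24.0 / re) * (1.0 + 0.15 * re ** 0.687)

def droplet_reynolds(diameter_m, rel_velocity_ms, nu_air=1.5e-5):
    """Droplet Reynolds number based on relative air-droplet velocity
    and the kinematic viscosity of air."""
    return rel_velocity_ms * diameter_m / nu_air

# Example: a 1 mm droplet moving at 1.5 m/s relative to air.
re = droplet_reynolds(1e-3, 1.5)
cd = droplet_drag_coefficient(re)
print(round(re), round(cd, 2))
```

In the low-Re limit the correlation recovers Stokes drag Cd = 24/Re; applying it bin-by-bin over the measured droplet size ranges mirrors the ensemble-average approach described above.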
Procedia PDF Downloads 300
5420 Ancient Iran Water Technologies
Authors: Akbar Khodavirdizadeh, Ali Nemati Babaylou, Hassan Moomivand
Abstract:
Human techniques for reaching water have been one of the factors in the formation of civilizations in the ancient world; making surface water and groundwater accessible on the ground has been an ingenious part of human life. In this study, the water techniques of ancient Iran, centered on the qanat, were examined and compared with the water supply systems of other regions of the ancient world. Six ancient regions were studied: ancient Greece (Archaic, 750-480 BC, and Classical, 480-323 BC), Urartu at Tuspa (850-600 BC), Petra (168-106 BC), ancient Rome (265 BC), and the ancient United States (1450 BC), alongside ancient Iranian water technologies. Past water technologies in these areas included water transmission systems in early urban centers, water structures for water control, bridges used in water transfer, waterways built for water conveyance, storage of rainfall, and pipes of pottery, ceramic, lead, wood, and stone used in water transfer, as well as flood control, water reservoirs, dams, channels, wells, and qanats. The central plateau of Iran is an arid, desert region. Archaeological, geomorphological, and paleontological studies of the central Iranian plateau show that without qanats, urban civilization in this region would have been difficult or even impossible. The Zarch qanat is the most important qanat in the Yazd region: a plain qanat with a gallery length of 80 km, a mother well 85 m deep, and 2,115 well shafts. The main purpose of building the Zarch qanat was to reach a groundwater source and bring its water to the surface.
The structure of the qanat and its technique of transferring water from a groundwater source to the surface set it clearly apart from the other water techniques of the ancient world. The results show that studying ancient water technologies is very important for understanding the history of humanity's use of hydraulic techniques.
Keywords: ancient water technologies, groundwater, qanat, human history, ancient Iran
Procedia PDF Downloads 112
5419 Brain-Computer Interfaces That Use Electroencephalography
Authors: Arda Ozkurt, Ozlem Bozkurt
Abstract:
Brain-computer interfaces (BCIs) are devices that output commands by interpreting data collected from the brain. Electroencephalography (EEG) is a non-invasive method of measuring the brain's electrical activity. Since it was invented by Hans Berger in 1929, it has led to many neurological discoveries and has become one of the essential non-invasive measurement methods. Although it has low spatial resolution, meaning it can only detect when a group of neurons fires at the same time, it is non-invasive and therefore easy to use without posing risks. In EEG, electrodes are placed on the scalp, and the voltage difference between at least two electrodes is recorded and then used to accomplish the intended task. EEG recordings include, but are not limited to, the currents along dendrites from synapses to the soma, the action potentials along the axons connecting neurons, and the currents through the synaptic clefts connecting axons with dendrites. As a non-invasive method, however, EEG is subject to sources of noise that may affect signal reliability: noise from the EEG equipment and leads, and signals originating in the subject, such as heart activity or muscle movements. New techniques have been developed to separate these artifacts from the intended signals. Furthermore, an EEG device alone is not enough to make brain data usable in a BCI. Because the EEG signal is very complex, artificial intelligence algorithms are required to analyze it; these algorithms convert complex data into meaningful and useful information from which neuroscientists can design BCI devices.
Even though invasive BCIs are needed for neurological applications that require highly precise data, non-invasive BCIs such as EEG-based systems are used in many cases to assist disabled people or to ease everyday life by helping with basic tasks. For example, EEG is used to detect an oncoming seizure in epilepsy patients, which a BCI device can then help prevent. Overall, EEG is a widely used non-invasive BCI technique that has helped develop BCIs and will continue to be used to collect data that eases people's lives as more BCI techniques are developed.
Keywords: BCI, EEG, non-invasive, spatial resolution
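A common first step when cleaning EEG of the noise sources mentioned above is removing power-line interference. A minimal sketch of a frequency-domain notch at 50 Hz (synthetic data; real pipelines typically use proper IIR/FIR notch and band-pass filters):

```python
import numpy as np

def notch_50hz(signal, fs):
    """Zero out spectral components near 50 Hz (power-line hum).

    A crude frequency-domain notch: FFT, zero the bins within +/- 1 Hz
    of 50 Hz, inverse FFT.
    """
    spectrum = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    spectrum[np.abs(freqs - 50.0) <= 1.0] = 0.0
    return np.fft.irfft(spectrum, n=len(signal))

# Synthetic 1 s "EEG": a 10 Hz alpha-band sine plus 50 Hz mains hum.
fs = 250.0
t = np.arange(250) / fs
raw = np.sin(2 * np.pi * 10 * t) + 0.8 * np.sin(2 * np.pi * 50 * t)
clean = notch_50hz(raw, fs)

amp = np.abs(np.fft.rfft(clean)) / len(clean) * 2  # single-sided amplitudes
print(round(amp[10], 2), round(amp[50], 6))  # 10 Hz kept, 50 Hz removed
```

Subject-generated artifacts such as eye blinks and muscle activity overlap the EEG band itself, which is why the more elaborate separation techniques (e.g. independent component analysis) mentioned in the abstract are needed on top of simple filtering.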
Procedia PDF Downloads 71
5418 Tuning of Fixed Wing Micro Aerial Vehicles Using Tethered Setup
Authors: Shoeb Ahmed Adeel, Vivek Paul, K. Prajwal, Michael Fenelon
Abstract:
Techniques exist to tether and stabilize a multi-rotor MAV, but applying the same process to a fixed-wing MAV is a novel method that can reduce the damage occurring to fixed-wing MAVs during flight test trials and PID tuning. A few sensors and an on-board controller are required to carry out this experiment in the horizontal and vertical planes of the vehicle. We discuss issues such as the sensitivity of the air vehicle, endurance, and the external load of the tether string acting on the vehicle.
Keywords: MAV, PID tuning, tethered flight, UAV
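PID tuning of the kind performed on the tethered vehicle adjusts three gains in the classic control loop. A minimal sketch of a discrete PID regulating a simple first-order plant (the gains and plant are invented for illustration, not the MAV's actual dynamics):

```python
def simulate_pid(kp, ki, kd, setpoint=1.0, dt=0.01, steps=2000):
    """Discrete PID driving the toy first-order plant x' = -x + u."""
    x = 0.0                      # plant state (e.g., pitch attitude)
    integral = 0.0
    prev_error = setpoint - x
    for _ in range(steps):
        error = setpoint - x
        integral += error * dt
        derivative = (error - prev_error) / dt
        u = kp * error + ki * integral + kd * derivative
        x += (-x + u) * dt       # Euler step of the plant
        prev_error = error
    return x

# With reasonable gains the state settles at the setpoint.
final = simulate_pid(kp=2.0, ki=1.0, kd=0.1)
print(round(final, 2))
```

On the tethered rig, the same three gains are adjusted iteratively while the string limits how far an unstable response can take the airframe, which is the damage-reduction benefit the abstract describes.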
Procedia PDF Downloads 635
5417 Human Factors Interventions for Risk and Reliability Management of Defence Systems
Authors: Chitra Rajagopal, Indra Deo Kumar, Ila Chauhan, Ruchi Joshi, Binoy Bhargavan
Abstract:
Reliability and safety are essential for the success of mission-critical and safety-critical defense systems. Humans are part of the entire life cycle of defense system development and deployment, and the majority of industrial accidents or disasters are attributed to human error. Considerations of human performance and human reliability are therefore critical in all complex systems, including defense systems. Defense systems operating from ground, naval, and aerial platforms in diverse conditions impose unique physical and psychological challenges on their human operators. Some of the safety- and mission-critical defense systems with human-machine interaction are fighter planes, submarines, warships, combat vehicles, and missiles launched from aerial and naval platforms. Human roles and responsibilities are also in transition due to the infusion of artificial intelligence and cyber technologies, and human operators not accustomed to such challenges are more likely to commit errors, which may lead to accidents or loss events. In such a scenario, it is imperative to understand the human factors in defense systems for better system performance, safety, and cost-effectiveness. A case study using a Task Analysis (TA) based methodology for the assessment and reduction of human errors in an air and missile defense system, in the context of emerging technologies, is presented. Action-oriented task analysis techniques such as Hierarchical Task Analysis (HTA) and the Operator Action Event Tree (OAET) were used, along with the Critical Action and Decision Event Tree (CADET) for cognitive task analysis. Human factors assessment based on task analysis helps in realizing safe and reliable defense systems. These techniques helped identify human errors during different phases of air and missile defense operations, contributing to a safe, reliable, and cost-effective mission.
Keywords: defence systems, reliability, risk, safety
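An operator action event tree of the kind mentioned above enumerates the success/failure branches of a sequence of operator actions and propagates their probabilities to end states. A minimal sketch with invented action names and failure probabilities (a real OAET would also prune branches once a failure ends the sequence):

```python
from itertools import product

# Hypothetical operator actions with assumed failure probabilities.
actions = [("detect threat", 0.01),
           ("classify threat", 0.02),
           ("launch interceptor", 0.05)]

def operator_action_event_tree(actions):
    """Enumerate every success (True) / failure (False) branch combination
    and compute the probability of each end state."""
    end_states = {}
    for outcome in product([True, False], repeat=len(actions)):
        p = 1.0
        for (_, p_fail), success in zip(actions, outcome):
            p *= (1.0 - p_fail) if success else p_fail
        end_states[outcome] = p
    return end_states

tree = operator_action_event_tree(actions)
p_mission_success = tree[(True, True, True)]   # 0.99 * 0.98 * 0.95
print(round(p_mission_success, 4))
```

Ranking the failure branches by probability points to the operator actions whose error reduction buys the most mission reliability, which is the purpose of the assessment described above.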
Procedia PDF Downloads 135
5416 Determining Optimum Locations for Runoff Water Harvesting in W. Watir, South Sinai, Using RS, GIS, and WMS Techniques
Authors: H. H. Elewa, E. M. Ramadan, A. M. Nosair
Abstract:
Rainfall water harvesting is considered an important tool for overcoming water scarcity in arid and semi-arid regions. Wadi Watir, in the southeastern part of the Sinai Peninsula, is one of the main active basins in the Gulf of Aqaba drainage system. It is characterized by steep hills consisting mainly of impermeable rocks, whereas the streambeds are covered by a highly permeable mixture of gravel and sand. A comprehensive approach integrating geographic information systems, remote sensing, and watershed modeling was followed to identify the RWH capability of this area. Eight thematic layers, viz. volume of annual flood, overland flow distance, maximum flow distance, rock or soil infiltration, drainage frequency density, basin area, basin slope, and basin length, were used as a multi-parametric decision support system for running weighted spatial probability models (WSPMs) to determine the potential areas for RWH. The WSPM maps classify the area into five RWH potentiality classes, ranging from very low to very high. The three WSPM scenarios run for W. Watir produced identical results for the high and very high RWH potentiality classes, which are the most suitable for surface water harvesting techniques, and a reasonable match for the moderate, low, and very low classes. The WSPM results show that the high and very high classes, the most suitable for RWH, represent approximately 40.23% of the total basin area. Accordingly, several locations were selected for the establishment of water harvesting dams and cisterns to improve the water conditions and living environment in the study area.
Keywords: Sinai, Wadi Watir, remote sensing, geographic information systems, watershed modeling, runoff water harvesting
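The WSPM step is essentially a weighted overlay: each thematic layer is rated, the ratings are combined with layer weights, and the result is sliced into potentiality classes. A toy sketch on a 2x2 grid with invented weights and ratings for three of the eight layers (not the study's actual values):

```python
import numpy as np

# Hypothetical rank rasters (0-100) for three of the eight thematic layers.
layers = {
    "volume_of_annual_flood": np.array([[80, 20], [60, 40]], float),
    "soil_infiltration":      np.array([[90, 30], [50, 50]], float),
    "basin_slope":            np.array([[70, 10], [80, 20]], float),
}
# Hypothetical layer weights (must sum to 1).
weights = {"volume_of_annual_flood": 0.5,
           "soil_infiltration": 0.3,
           "basin_slope": 0.2}

# Weighted overlay: cell-wise weighted sum of the rank rasters.
score = sum(weights[name] * raster for name, raster in layers.items())

# Slice the 0-100 score into the five RWH potentiality classes.
bins = [20, 40, 60, 80]                 # class boundaries
labels = ["very low", "low", "moderate", "high", "very high"]
classes = np.digitize(score, bins)      # index 0..4 into labels

print(score)
print([[labels[i] for i in row] for row in classes])
```

The same arithmetic, applied per-pixel to full-resolution rasters in a GIS, produces the five-class potentiality maps described in the abstract.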
Procedia PDF Downloads 357
5415 Landfill Site Selection Using Multi-Criteria Decision Analysis A Case Study for Gulshan-e-Iqbal Town, Karachi
Authors: Javeria Arain, Saad Malik
Abstract:
The management of solid waste is a crucial aspect of urban environmental management, especially in a city with an ever-increasing population such as Karachi. The municipal solid waste generated in Gulshan-e-Iqbal Town averages 444.48 tons per day, and landfill sites are a widely accepted solution for the final disposal of this waste. However, an improperly selected site can have immense environmental, economic, and ecological impacts. To select an appropriate landfill site, a number of factors should be considered in order to minimize the potential hazards of solid waste. The purpose of this research is to analyze the study area for the construction of an appropriate landfill site for the municipal solid waste generated in Gulshan-e-Iqbal Town, using geospatial techniques and considering hydrological, geological, social, and geomorphological factors. This was achieved using the analytical hierarchy process and fuzzy analysis as decision support tools, integrated with geographic information science techniques. The eight parameters most critical and relevant to the study area were selected. After generating a thematic layer for each parameter, overlay analysis was performed in ArcGIS 10.0. The results of the two methods were then compared. The final suitability map using AHP shows that 19% of the total area is least suitable, 6% suitable but avoided, 46% moderately suitable, 26% suitable, 2% most suitable, and 1% restricted. In comparison, the output map from fuzzy set theory is not in crisp logic; rather, it provides a suitability range of 0-1, where 0 indicates the least suitable and 1 the most suitable sites.
From these results it is deduced that the northern part of the city is appropriate for constructing the landfill site, although a final decision on an optimal site should be made after a field survey and consideration of economic and political factors.
Keywords: Analytical Hierarchy Process (AHP), fuzzy set theory, Geographic Information Sciences (GIS), Multi-Criteria Decision Analysis (MCDA)
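The AHP portion of such an analysis derives criterion weights from a pairwise comparison matrix via its principal eigenvector and checks the judgments for consistency. A minimal sketch with an invented 3-criterion comparison matrix (the study itself weighted eight parameters):

```python
import numpy as np

# Hypothetical pairwise comparisons (Saaty 1-9 scale) for three criteria,
# e.g. distance to water > land slope > distance to roads.
A = np.array([[1.0, 2.0, 4.0],
              [0.5, 1.0, 2.0],
              [0.25, 0.5, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)                  # principal eigenvalue
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()                     # normalized priority weights

n = A.shape[0]
lambda_max = eigvals[k].real
ci = (lambda_max - n) / (n - 1)              # consistency index
ri = 0.58                                    # random index for n = 3
cr = ci / ri                                 # consistency ratio (< 0.1 is acceptable)

print(np.round(weights, 3), round(cr, 3))
```

The weights then multiply the rated thematic layers in the GIS overlay; a consistency ratio above 0.1 would signal that the pairwise judgments should be revisited before mapping.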
Procedia PDF Downloads 504
5414 Use of Transportation Networks to Optimize The Profit Dynamics of the Product Distribution
Authors: S. Jayasinghe, R. B. N. Dissanayake
Abstract:
Optimization modelling, together with network models and linear programming techniques, is a powerful tool for problem solving and decision making in real-world applications. This study developed a mathematical model to optimize net profit by minimizing transportation cost. The model covers transportation from decentralized production plants to a centralized distribution centre and then distribution to island-wide agencies, treating customer satisfaction as a requirement. The company produces 9 types of food items in 82 varieties and 4 types of non-food items in 34 varieties. Of its 6 production plants, 4 are located near the city of Mawanella and the other 2 in Galewala and Anuradhapura, which are 80 km and 150 km from Mawanella, respectively. The warehouse located in Mawanella is the main production plant and also the only distribution centre, from which manufactured products are distributed to 39 agencies island-wide. The average values and quantities of goods for 6 consecutive months, from May 2013 to October 2013, were collected, and average demand values were calculated. The following constraints were used as necessary conditions for the optimum: there is one source and 39 destinations, and supply and demand are equal across all agencies. Using the transport cost per kilometre, the total transport cost was calculated, and the model was formulated from the distances and flows of the distribution. Network optimization and linear programming techniques were used to build the model, and Excel Solver was used to solve it. The results show that the company requires a total transport cost of Rs. 146,943,034.50 to fulfil customer requirements for a month, which is much less than the cost incurred without the model. The model also showed that the company can reduce its transportation cost by 6% when distributing to island-wide customers.
The company currently satisfies its customers' requirements by 85%; this satisfaction can be increased to 97% by using the model. This model can therefore be used by other similar companies to reduce their transportation costs.
Keywords: mathematical model, network optimization, linear programming
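A balanced transportation problem of the kind formulated here minimizes the sum of unit cost times flow subject to supply and demand constraints. A tiny brute-force sketch with invented plants, agencies, and unit costs (the study itself used a linear programming solver on real distance and demand data):

```python
from itertools import product

# Hypothetical balanced instance: 2 plants, 3 agencies (units: truckloads).
supply = {"Mawanella": 30, "Galewala": 20}
demand = [20, 15, 15]                        # three agencies; total = 50 = total supply
cost = {"Mawanella": [4, 6, 8],              # unit cost plant -> agency
        "Galewala":  [5, 5, 6]}

best_cost, best_plan = float("inf"), None
# Enumerate Mawanella's shipments; Galewala covers the remainder of each demand.
for a1, a2 in product(range(demand[0] + 1), range(demand[1] + 1)):
    a3 = supply["Mawanella"] - a1 - a2
    if not 0 <= a3 <= demand[2]:
        continue
    m = [a1, a2, a3]
    g = [d - x for d, x in zip(demand, m)]   # non-negative by construction
    total = (sum(c * x for c, x in zip(cost["Mawanella"], m))
             + sum(c * x for c, x in zip(cost["Galewala"], g)))
    if total < best_cost:
        best_cost, best_plan = total, (m, g)

print(best_cost, best_plan)
```

At realistic scale (one source, 39 destinations, many products) the same objective and balance constraints are handed to a simplex-based solver such as Excel Solver rather than enumerated.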
Procedia PDF Downloads 346