Search results for: permittivity measurement techniques
6578 Study of Land Use Changes around an Archaeological Site Using Satellite Imagery Analysis: A Case Study of Hathnora, Madhya Pradesh, India
Authors: Pranita Shivankar, Arun Suryawanshi, Prabodhachandra Deshmukh, S. V. C. Kameswara Rao
Abstract:
Many undesirable changes occur in landscapes and regions in the vicinity of historically important structures as a result of anthropogenic activities over time. A better understanding of such influences, using recently developed satellite remote sensing techniques, helps in planning strategies to minimize negative impacts on the existing environment. In 1982, a fossilized hominid skull cap was discovered at a site on the northern bank of the east-west flowing river Narmada in the village of Hathnora. Close to the same site, Late Acheulian and Middle Palaeolithic tools have been discovered in the immediately overlying pebbly gravel, suggesting that the ‘Narmada skull’ may date from the Middle Pleistocene. Reviews of recent research on hominid remains from Late Acheulian and Middle Palaeolithic sites worldwide suggest succession and contemporaneity of cultures there, enhancing the importance of Hathnora as a rare and precious site. In this context, maximum likelihood classification using digital interpretation techniques was carried out for the study area using satellite imagery from Landsat ETM+ for the year 2006 and Landsat 8 (OLI and TIRS) for the year 2016. The overall accuracy of the Land Use Land Cover (LULC) classification of the 2016 imagery was around 77.27% based on ground truth data. The significant reduction in the main river course and agricultural activities and the increase in built-up area observed in the remote sensing data analysis are undoubtedly the outcome of human encroachment in the vicinity of this eminent heritage site.
Keywords: cultural succession, digital interpretation, Hathnora, Homo sapiens, Late Acheulian, Middle Palaeolithic
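The overall accuracy figure quoted above is conventionally computed from an error (confusion) matrix built from ground-truth samples. As a minimal sketch, the function below computes overall accuracy as the proportion of correctly classified reference pixels; the 3x3 matrix in the example is hypothetical, chosen only to land near the reported 77.27%.

```python
def overall_accuracy(confusion):
    """Overall accuracy of a LULC classification: correctly classified
    reference pixels (the matrix diagonal) divided by all reference pixels."""
    correct = sum(confusion[i][i] for i in range(len(confusion)))
    total = sum(sum(row) for row in confusion)
    return correct / total

# Hypothetical 3-class error matrix (rows: reference class, columns: mapped class).
matrix = [
    [18, 2, 1],   # e.g. water
    [2, 11, 2],   # e.g. agriculture
    [1, 2, 5],    # e.g. built-up
]
acc = overall_accuracy(matrix)  # 34 / 44, about 0.7727
```

Per-class producer's and user's accuracies follow the same pattern, dividing each diagonal entry by its row or column sum.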
Procedia PDF Downloads 176
6577 Statistical Quality Control on Assignable Causes of Variation on Cement Production in Ashaka Cement PLC Gombe State
Authors: Hamisu Idi
Abstract:
The present study focuses on the impact of assignable causes of variation on the quality of cement production. Exploratory research was carried out on a monthly basis, with data obtained from a secondary source, i.e., the records kept by an automated recompilation machine. The machine keeps all the records of mill downtime, which the process manager checks for validation, referring any fault to the department responsible for maintenance or measurement so as to prevent future occurrences. The findings indicated that the products of Ashaka Cement Plc. met quality specifications, since all the production processes were found to be in control (within preset specifications), with the exception of natural causes of variation, which are normal in a production process and do not affect the outcome of the product. Such variation is reduced to the barest minimum, since it cannot be totally eliminated. It is hoped that the findings of this study will be of great assistance to the management of the Ashaka cement factory, and to the process manager in particular, at various levels in the monitoring and implementation of statistical process control. The study thereby contributes to knowledge in this area and should open further research in this direction.
Keywords: cement, quality, variation, assignable cause, common cause
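The in-control judgment described above can be illustrated with a standard Shewhart-style check: observations beyond three standard deviations from the process mean are flagged as potential assignable causes, while everything inside the limits is attributed to common-cause variation. This is a generic sketch, not the plant's actual monitoring code, and the sample values are invented.

```python
def control_limits(samples):
    """Shewhart-style individuals chart: mean +/- 3 sample standard deviations."""
    n = len(samples)
    mean = sum(samples) / n
    sd = (sum((x - mean) ** 2 for x in samples) / (n - 1)) ** 0.5
    return mean - 3 * sd, mean + 3 * sd

def out_of_control(samples):
    """Values beyond the control limits: candidates for assignable causes."""
    lcl, ucl = control_limits(samples)
    return [x for x in samples if x < lcl or x > ucl]
```

In practice the limits would be set from an in-control reference period and then applied to new observations; here they are estimated from the data itself purely for illustration.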
Procedia PDF Downloads 268
6576 Optimized Preprocessing for Accurate and Efficient Bioassay Prediction with Machine Learning Algorithms
Authors: Jeff Clarine, Chang-Shyh Peng, Daisy Sang
Abstract:
Bioassay is the measurement of the potency of a chemical substance by its effect on living animal or plant tissue. Bioassay data and chemical structures from pharmacokinetic and drug metabolism screening are mined from, and housed in, multiple databases. Bioassay predictions are then calculated to determine further advancement. This paper proposes a four-step preprocessing of datasets for improving bioassay predictions. The first step is instance selection, in which the dataset is partitioned into training, testing, and validation sets. The second step is discretization, which partitions the data with a trade-off between accuracy and precision. The third step is normalization, in which data are scaled to the range of 0 to 1 for subsequent machine learning processing. The fourth step is feature selection, in which key chemical properties and attributes are identified. The streamlined results are then analyzed for prediction effectiveness with various machine learning tools, including Pipeline Pilot, R, Weka, and Excel. Experiments and evaluations reveal the effectiveness of various combinations of preprocessing steps and machine learning algorithms in producing more consistent and accurate predictions.
Keywords: bioassay, machine learning, preprocessing, virtual screen
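The four steps above can be sketched in plain Python; the split ratios, bin count, and variance threshold below are illustrative assumptions, not values from the paper.

```python
import random

def minmax_normalize(column):
    """Step 3: scale one feature column to the [0, 1] range."""
    lo, hi = min(column), max(column)
    span = (hi - lo) or 1.0          # guard against constant columns
    return [(x - lo) / span for x in column]

def discretize(column, n_bins=4):
    """Step 2: partition a numeric column into equal-width bins."""
    norm = minmax_normalize(column)
    return [min(int(x * n_bins), n_bins - 1) for x in norm]

def split_instances(rows, seed=0):
    """Step 1: shuffle and split into training / testing / validation sets."""
    rows = list(rows)
    random.Random(seed).shuffle(rows)
    n = len(rows)
    return rows[:int(0.6 * n)], rows[int(0.6 * n):int(0.8 * n)], rows[int(0.8 * n):]

def select_features(columns, min_variance=0.01):
    """Step 4: keep column indices whose normalized variance exceeds a threshold."""
    kept = []
    for idx, col in enumerate(columns):
        norm = minmax_normalize(col)
        mean = sum(norm) / len(norm)
        var = sum((x - mean) ** 2 for x in norm) / len(norm)
        if var > min_variance:
            kept.append(idx)
    return kept
```

A real pipeline would replace the variance filter with a chemistry-aware attribute generator, but the data flow is the same.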
Procedia PDF Downloads 278
6575 Examining Statistical Monitoring Approach against Traditional Monitoring Techniques in Detecting Data Anomalies during Conduct of Clinical Trials
Authors: Sheikh Omar Sillah
Abstract:
Introduction: Monitoring is an important means of ensuring the smooth implementation and quality of clinical trials. For many years, traditional site monitoring approaches have been critical in detecting data errors but not optimal in identifying fabricated and implanted data as well as non-random data distributions that may significantly invalidate study results. The objective of this paper was to provide recommendations based on best statistical monitoring practices for detecting data-integrity issues suggestive of fabrication and implantation early in the study conduct to allow implementation of meaningful corrective and preventive actions. Methodology: Electronic bibliographic databases (Medline, Embase, PubMed, Scopus, and Web of Science) were used for the literature search, and both qualitative and quantitative studies were sought. Search results were uploaded into Eppi-Reviewer Software, and only publications written in the English language from 2012 were included in the review. Gray literature not considered to present reproducible methods was excluded. Results: A total of 18 peer-reviewed publications were included in the review. The publications demonstrated that traditional site monitoring techniques are not efficient in detecting data anomalies. By specifying project-specific parameters such as laboratory reference range values, visit schedules, etc., with appropriate interactive data monitoring, statistical monitoring can offer early signals of data anomalies to study teams. The review further revealed that statistical monitoring is useful to identify unusual data patterns that might be revealing issues that could impact data integrity or may potentially impact study participants' safety. However, subjective measures may not be good candidates for statistical monitoring. 
Conclusion: The statistical monitoring approach requires a combination of education, training, and experience sufficient to implement its principles in detecting data anomalies for the statistical aspects of a clinical trial.
Keywords: statistical monitoring, data anomalies, clinical trials, traditional monitoring
Procedia PDF Downloads 85
6574 Aerial Photogrammetry-Based Techniques to Rebuild the 30-Year Landform Changes of a Landslide-Dominated Watershed in Taiwan
Authors: Yichin Chen
Abstract:
Taiwan is an island characterized by active tectonics and high erosion rates. Monitoring the dynamic landscape of Taiwan is an important issue for disaster mitigation, geomorphological research, and watershed management. Long-term, high-spatiotemporal-resolution landform data are essential for quantifying and simulating geomorphological processes and developing warning systems. Recently, advances in unmanned aerial vehicle (UAV) and computational photogrammetry technology have provided an effective way to rebuild and monitor topographic change at high spatiotemporal resolution. This study rebuilds the 30-year landform change in the Aiyuzi watershed over 1986-2017 using aerial photogrammetry-based techniques. The Aiyuzi watershed, located in central Taiwan with an area of 3.99 km², is known for its frequent landslide and debris flow disasters. This study took aerial photos using a UAV and collected multi-temporal historical stereo photographs taken by the Aerial Survey Office of Taiwan’s Forestry Bureau. Pix4DMapper, a photogrammetry software package, was used to rebuild the orthoimages and digital surface models (DSMs). Furthermore, to control model accuracy, a set of ground control points was surveyed using eGPS. The results show that the generated DSMs have ground sampling distances (GSD) of ~10 cm and ~0.3 cm from the UAV and historical photographs, respectively, and a vertical error of ~1 m. Comparison of the DSMs shows that many deep-seated landslides (with depths over 20 m) occurred upstream in the Aiyuzi watershed. Even though a large amount of sediment is delivered from the landslides, the steep main channel has sufficient capacity to transport sediment out of the channel and to erode the river bed to a depth of ~20 m. Most sediment is transported to the outlet of the watershed and deposited in the downstream channel. This case study shows that UAV and photogrammetry technology are effective tools for monitoring topographic change.
Keywords: aerial photogrammetry, landslide, landform change, Taiwan
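The DSM comparison described above reduces to subtracting co-registered elevation grids: negative cells indicate erosion, positive cells deposition, and multiplying by the cell area gives volumes. A minimal sketch with toy 2x2 grids and a hypothetical 1 m² cell size:

```python
def dsm_difference(dsm_old, dsm_new, cell_area=1.0):
    """Cell-by-cell elevation change between two co-registered DSM grids.

    Returns the change grid plus eroded and deposited volumes
    (elevation change times cell area, summed over cells)."""
    change = [
        [new - old for old, new in zip(row_old, row_new)]
        for row_old, row_new in zip(dsm_old, dsm_new)
    ]
    eroded = sum(-c * cell_area for row in change for c in row if c < 0)
    deposited = sum(c * cell_area for row in change for c in row if c > 0)
    return change, eroded, deposited

# Toy grids (elevations in m): one cell eroded by 2 m, one aggraded by 2 m.
change, eroded, deposited = dsm_difference(
    [[10.0, 10.0], [10.0, 10.0]],
    [[8.0, 10.0], [10.0, 12.0]],
)
```

In a real DSM-of-difference workflow the grids come from the photogrammetric reconstruction, and cells with change below the vertical error (~1 m here) would be masked before summing.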
Procedia PDF Downloads 159
6573 Evaluation of Soil Thermal-Entropy Properties with a Single-Probe Heat-Pulse Technique
Authors: Abdull Halim Abdull, Nasiman Sapari, Mohammad Haikal Asyraf Bin Anuar
Abstract:
Although soil thermal properties are required in many areas, including efforts to improve oil recovery, they are seldom measured on a routine basis. The reasons for this are unclear but may be related to a lack of suitable instrumentation and entropy theory. We integrate the single-probe thermal gradient for the radial conduction of a short-duration heat pulse away from a single electrode source and compare it with the theory for an instantaneously heated line source. By measuring the temperature response at a short distance from the line source and applying short-duration heat-pulse theory, we can extract all the entropy properties (thermal diffusivity, heat capacity, and conductivity) from a single heat-pulse measurement. Results of initial experiments carried out on air-dry sand and clay materials indicate that this heat-pulse method yields soil thermal properties that compare well with those measured by a single electrode.
Keywords: entropy, single probe thermal gradient, soil thermal, probe heat
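For the instantaneously heated line source invoked above, the textbook relations give the diffusivity from the time of the temperature maximum and the volumetric heat capacity from its amplitude; conductivity follows as their product. The sketch below applies those relations; the probe spacing, heat input, and readings are hypothetical.

```python
import math

def heat_pulse_properties(r, t_m, dT_m, q):
    """Soil thermal properties from the response to a short heat pulse,
    using the classic instantaneous-line-source relations.

    r    : radial distance from heater to sensor (m)
    t_m  : time at which the temperature maximum occurs (s)
    dT_m : peak temperature rise (K)
    q    : heat input per unit length of the line heater (J/m)
    """
    kappa = r ** 2 / (4.0 * t_m)                  # thermal diffusivity (m^2/s)
    C = q / (math.e * math.pi * r ** 2 * dT_m)    # volumetric heat capacity (J m^-3 K^-1)
    lam = kappa * C                               # thermal conductivity (W m^-1 K^-1)
    return kappa, C, lam

# Hypothetical readings: 6 mm spacing, peak at 18 s with a 0.976 K rise from 600 J/m.
kappa, C, lam = heat_pulse_properties(r=0.006, t_m=18.0, dT_m=0.976, q=600.0)
# kappa = 5e-7 m^2/s exactly for these inputs; C and lam fall in typical soil ranges.
```

Short-duration (rather than truly instantaneous) pulses require a small correction to these formulas, which is omitted here.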
Procedia PDF Downloads 451
6572 The Analysis of Noise Harmfulness in Public Utility Facilities
Authors: Monika Sobolewska, Aleksandra Majchrzak, Bartlomiej Chojnacki, Katarzyna Baruch, Adam Pilch
Abstract:
The main purpose of the study is to perform measurement and analysis of noise harmfulness in public utility facilities. The World Health Organization reports that the number of people suffering from hearing impairment is constantly increasing, and the number of young people in the statistics is the most alarming. The majority of scientific research in the field of hearing protection and noise prevention concerns industrial and road traffic noise as the source of health problems, and corresponding standards and regulations defining noise level limits are enforced as a result. However, there is another field not covered by profound research: leisure time. Public utility facilities such as clubs, shopping malls, sports facilities, and concert halls all generate high-level noise, outside proper juridical control. Among European Union Member States, the highest legislative act concerning noise prevention is the Environmental Noise Directive 2002/49/EC. However, it omits the problem discussed above, and even for traffic, railway, and aircraft noise it does not set limits or target values, leaving these issues to the discretion of the Member State authorities. Without explicit and uniform regulations, noise level control at places designed for relaxation and entertainment is often the responsibility of people with little knowledge of hearing protection, unaware of the risk that noise pollution poses. Exposure to high sound levels in clubs, cinemas, and at concerts and sports events may result in progressive hearing loss, especially among young people, the main target group of such facilities and events. The first step towards changing this situation and raising general awareness is to perform reliable measurements whose results will emphasize the significance of the problem. This project presents the results of more than a hundred measurements, performed in most types of public utility facilities in Poland.
As the most suitable measuring instruments for such research, personal noise dosimeters were used to collect the data. Each measurement is presented in the form of numerical results, including equivalent and peak sound pressure levels, and a detailed description considering the type of the sound source, the size and furnishing of the room, and a subjective sound level evaluation. In the absence of a direct reference point for the interpretation of the data, the limits specified in EU Directive 2003/10/EC, which sets maximum sound level values for workers in relation to the length of their working time, were used for comparison. The analysis of the examined problem leads to the conclusion that during leisure time, people are exposed to noise levels significantly exceeding safe values. As hearing problems progress gradually, most people downplay them, ignoring the first symptoms. Therefore, an effort has to be made to specify noise regulations for public utility facilities. Without any action, in the foreseeable future a majority of Europeans will be dealing with serious hearing damage, which will have a negative impact on whole societies.
Keywords: hearing protection, noise level limits, noise prevention, noise regulations, public utility facilities
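The equivalent sound pressure level reported by such dosimeters is an energy average, not an arithmetic one, and Directive 2003/10/EC compares exposure normalized to an 8-hour day against its limit values. A minimal sketch (the sample levels and exposure time are invented):

```python
import math

def leq(levels_db):
    """Equivalent continuous sound level: the energy mean (not the arithmetic
    mean) of equally spaced sound pressure level samples, in dB."""
    mean_power = sum(10 ** (level / 10.0) for level in levels_db) / len(levels_db)
    return 10.0 * math.log10(mean_power)

def lex_8h(leq_db, exposure_hours):
    """Daily noise exposure level normalized to an 8-hour day (as in ISO 9612),
    comparable with the 87 dB(A) exposure limit of Directive 2003/10/EC."""
    return leq_db + 10.0 * math.log10(exposure_hours / 8.0)

# Invented example: four hours in a club at Leq = 91 dB(A).
daily = lex_8h(91.0, 4.0)   # about 88 dB(A), above the 87 dB(A) limit value
```

Note how the energy average is dominated by the loudest intervals: a mix of 80 dB and 90 dB samples averages to roughly 87.4 dB, not 85 dB.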
Procedia PDF Downloads 227
6571 Comparison of Regional and Local Indwelling Catheter Techniques to Prolong Analgesia in Total Knee Arthroplasty Procedures: Continuous Peripheral Nerve Block and Continuous Periarticular Infiltration
Authors: Jared Cheves, Amanda DeChent, Joyce Pan
Abstract:
Total knee arthroplasties (TKAs) are among the most common, yet most painful, surgical procedures performed in the United States. Currently, the gold standard for postoperative pain management is the use of opioids. However, in the wake of the opioid epidemic, the healthcare system is attempting to reduce opioid consumption by trialing innovative opioid-sparing analgesic techniques such as continuous peripheral nerve blocks (CPNB) and continuous periarticular infiltration (CPAI). The alleviation of pain, particularly during the first 72 hours postoperatively, is of utmost importance due to its association with delayed recovery, impaired rehabilitation, immunosuppression, the development of chronic pain, the development of rebound pain, and decreased patient satisfaction. While both CPNB and CPAI are used today, there is limited evidence comparing the two to the current standard of care or to each other. An extensive literature review was performed to explore the safety profiles and effectiveness of CPNB and CPAI in reducing reported pain scores and decreasing opioid consumption. The literature revealed that the use of CPNB contributed to lower pain scores and decreased opioid use when compared to opioid-only control groups. Additionally, CPAI did not improve pain scores or decrease opioid consumption when combined with a multimodal analgesia (MMA) regimen. When CPNB and CPAI were compared to each other, neither consistently lowered pain scores to a greater degree, but the literature indicates that CPNB decreased opioid consumption more than CPAI. More research is needed to further establish the efficacy of CPNB and CPAI as standard components of MMA in TKA procedures. In addition, future research could focus on novel catheter-free applications that reduce the complications of continuous catheter analgesia.
Keywords: total knee arthroplasty, continuous peripheral nerve blocks, continuous periarticular infiltration, opioid, multimodal analgesia
Procedia PDF Downloads 101
6570 Long-Term Results of Coronary Bifurcation Stenting with Drug Eluting Stents
Authors: Piotr Muzyk, Beata Morawiec, Mariusz Opara, Andrzej Tomasik, Brygida Przywara-Chowaniec, Wojciech Jachec, Ewa Nowalany-Kozielska, Damian Kawecki
Abstract:
Background: Coronary bifurcation is one of the most complex lesions in patients with coronary artery disease. Provisional T-stenting is currently one of the recommended techniques. The aim was to assess the optimal methods of treatment in the era of drug-eluting stents (DES). Methods: The registry consisted of data from 1916 patients treated with percutaneous coronary interventions (PCI) using either first- or second-generation DES. Patients with bifurcation lesions entered the analysis. Major adverse cardiac and cerebrovascular events (MACCE) were assessed at one year of follow-up and comprised death, acute myocardial infarction (AMI), repeated PCI (re-PCI) of the target vessel, and stroke. Results: Of 1916 registry patients, 204 patients (11%) were diagnosed with a bifurcation lesion >50% and entered the analysis. The most commonly used technique was provisional T-stenting (141 patients, 69%). Optimization with the kissing-balloon technique was performed in 45 patients (22%). In 59 patients (29%) a second-generation DES was implanted, while in 112 patients (55%) a first-generation DES was used. In 33 patients (16%) both types of DES were used. Procedural success (TIMI 3 flow) was achieved in 98% of patients. At one-year follow-up, there were 39 MACCE (19%) (9 deaths, 17 AMI, 16 re-PCI, and 5 strokes). Provisional T-stenting resulted in a rate of MACCE similar to that of other techniques (16% vs. 5%, p=0.27) and a similar occurrence of re-PCI (6% vs. 2%, p=0.78). Post-PCI kissing-balloon optimization gave outcomes equal to those in patients in whom no optimization technique was used (3% vs. 16% MACCE, p=0.39). The type of implanted DES (second- vs. first-generation) had no influence on MACCE (4% vs. 14%, respectively, p=0.12) or re-PCI (1.7% vs. 5.1% of patients, respectively, p=0.28). Conclusions: The treatment of bifurcation lesions with PCI represents a high-risk procedure with a high rate of MACCE. The stenting technique, optimization of PCI, and the generation of implanted stent should be personalized for each case to balance the risk of the procedure. In this setting, operator experience might be a factor in better outcomes, which should be further investigated.
Keywords: coronary bifurcation, drug eluting stents, long-term follow-up, percutaneous coronary interventions
Procedia PDF Downloads 206
6569 Laser-Ultrasonic Method for Measuring the Local Elastic Moduli of Porosity Isotropic Composite Materials
Authors: Alexander A. Karabutov, Natalia B. Podymova, Elena B. Cherepetskaya, Vladimir A. Makarov, Yulia G. Sokolovskaya
Abstract:
The laser-ultrasonic method is applied to quantify the influence of porosity on the local Young’s modulus of isotropic composite materials. The method is based on laser generation of ultrasound pulses combined with measurement of the phase velocities of longitudinal and shear acoustic waves in samples. The main advantage of this method over traditional ultrasonic research methods is the efficient generation of the short and powerful probing acoustic pulses required for reliable testing of ultrasound-absorbing and -scattering heterogeneous materials. Using samples of a metal matrix composite with reinforcing silicon carbide microparticles in various concentrations as an example, it is shown that, to achieve an effective increase in Young’s modulus with increasing concentration of microparticles, the porosity of the final sample should not exceed 2%.
Keywords: laser ultrasonics, longitudinal and shear ultrasonic waves, porosity, composite, local elastic moduli
Procedia PDF Downloads 350
6568 Comparison of Inexpensive Cell Disruption Techniques for an Oleaginous Yeast
Authors: Scott Nielsen, Luca Longanesi, Chris Chuck
Abstract:
Palm oil is obtained from the flesh and kernel of the fruit of oil palms and is the most productive and inexpensive oil crop. Global demand for palm oil is approximately 75 million metric tonnes, a 29% increase in global production since 2016. This expansion of oil palm cultivation has resulted in mass deforestation, vast biodiversity destruction, and increasing net greenhouse gas emissions. One possible alternative is to produce a saturated oil, similar to palm oil, from microbes such as oleaginous yeast. The yeasts can be cultured on sugars derived from second-generation sources and do not compete with tropical forests for land. One highly promising oleaginous yeast for this application is Metschnikowia pulcherrima. However, recent techno-economic modeling has shown that cell lysis and standard lipid extraction are major contributors to the cost of the oil. Typical cell disruption techniques for extracting either single-cell oils or proteins have been based on bead beating, homogenization, and acid lysis. However, these can have a detrimental effect on lipid quality and are energy-intensive. In this study, a vortex separator, which produces high shear with minimal energy input, was investigated as a potential low-energy method of lysing cells. It was compared to four more traditional methods (thermal lysis, acid lysis, alkaline lysis, and osmotic lysis). For each method, yeast loadings of 1 g/L, 10 g/L, and 100 g/L were also examined. The quality of the cell disruption was measured by optical cell density, cell counting, and comparison of the particle size distribution profiles over a 2-hour period. This study demonstrates that the vortex separator is highly effective at lysing the cells and could potentially be used as a simple apparatus for lipid recovery in an oleaginous yeast process. Further development of this technology could reduce the overall cost of microbial lipids in the future.
Keywords: palm oil substitute, Metschnikowia pulcherrima, cell disruption, cell lysis
Procedia PDF Downloads 210
6567 The Strategy for Detection of Catecholamines in Body Fluids: Optical Sensor
Authors: Joanna Cabaj, Sylwia Baluta, Karol Malecha, Kamila Drzozga
Abstract:
Catecholamines are the principal neurotransmitters that mediate a variety of central nervous system functions, such as motor control, cognition, emotion, memory processing, and endocrine modulation. Dysfunctions in catecholamine neurotransmission occur in some neurologic and neuropsychiatric diseases. Variable neurotransmitter levels in biological fluids can be a marker of several neurological disorders. Because of its significance in analytical techniques and diagnostics, sensitive and selective detection of neurotransmitters is attracting increasing attention in different areas of bioanalysis and biomedical research. Recently, fluorescence techniques for the detection of catecholamines have attracted interest due to their reasonable cost, convenient control, and maneuverability in biological environments. Nevertheless, despite the evident need for a sensitive and selective catecholamine sensor, the development of a convenient method for this neurotransmitter is still at a basic level. The manipulation of nanostructured materials in conjunction with biological molecules has led to the development of a new class of hybrid modified biosensors in which both enhancement of charge transport and preservation of biological activity may be obtained. Immobilization of biomaterials on electrode surfaces is the crucial step in fabricating electrochemical as well as optical biosensors and bioelectronic devices. Continuing systematic investigation of the manufacture of enzyme-based sensitive conducting systems, we present here a convenient fluorescence sensing strategy for catecholamine detection based on the FRET (fluorescence resonance energy transfer) phenomenon observed for, e.g., complexes of Fe²⁺ and epinephrine. The biosensor was constructed using low temperature co-fired ceramics (LTCC) technology. The sensing system exploits the catalytic oxidation of catecholamines and the quenching of the strong luminescence of the resulting complexes due to FRET. The detection process was based on the oxidation of the substrate in the presence of the enzyme (laccase/tyrosinase).
Keywords: biosensor, conducting polymer, enzyme, FRET, LTCC
Procedia PDF Downloads 262
6566 A Comparative Analysis of Clustering Approaches for Understanding Patterns in Health Insurance Uptake: Evidence from Sociodemographic Kenyan Data
Authors: Nelson Kimeli Kemboi Yego, Juma Kasozi, Joseph Nkruzinza, Francis Kipkogei
Abstract:
The study investigated the low uptake of health insurance in Kenya despite efforts to achieve universal health coverage through various health insurance schemes. Unsupervised machine learning techniques were employed to identify patterns in health insurance uptake based on sociodemographic factors among Kenyan households. The aim was to identify key demographic groups that are underinsured and to provide insights for the development of effective policies and outreach programs. Using the 2021 FinAccess Survey, the study clustered Kenyan households based on their health insurance uptake and sociodemographic features to reveal patterns in health insurance uptake across the country. The effectiveness of k-prototypes clustering, hierarchical clustering, and agglomerative hierarchical clustering in clustering based on sociodemographic factors was compared. The k-prototypes approach was found to be the most effective at uncovering distinct and well-separated clusters in the Kenyan sociodemographic data related to health insurance uptake based on silhouette, Calinski-Harabasz, Davies-Bouldin, and Rand indices. Hence, it was utilized in uncovering the patterns in uptake. The results of the analysis indicate that inclusivity in health insurance is greatly related to affordability. The findings suggest that targeted policy interventions and outreach programs are necessary to increase health insurance uptake in Kenya, with the ultimate goal of achieving universal health coverage. The study provides important insights for policymakers and stakeholders in the health insurance sector to address the low uptake of health insurance and to ensure that healthcare services are accessible and affordable to all Kenyans, regardless of their socio-demographic status. 
The study highlights the potential of unsupervised machine learning techniques to provide insights into complex health policy issues and improve decision-making in the health sector.
Keywords: health insurance, unsupervised learning, clustering algorithms, machine learning
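Cluster-validity comparisons like the one described above score each candidate clustering with internal indices such as the silhouette coefficient. As a toy sketch, the silhouette implementation below rates a well-separated labeling higher than a scrambled one; the data points are invented, and the k-prototypes algorithm itself (e.g. from the kmodes package) is not shown.

```python
import math

def silhouette(points, labels):
    """Mean silhouette coefficient: (b - a) / max(a, b) per point, where a is
    the mean distance to the point's own cluster and b the mean distance to
    the nearest other cluster."""
    n = len(points)
    scores = []
    for i in range(n):
        own = [math.dist(points[i], points[j])
               for j in range(n) if j != i and labels[j] == labels[i]]
        a = sum(own) / len(own) if own else 0.0
        b = min(
            sum(math.dist(points[i], points[j]) for j in range(n) if labels[j] == c)
            / sum(1 for j in range(n) if labels[j] == c)
            for c in set(labels) if c != labels[i]
        )
        scores.append((b - a) / max(a, b))
    return sum(scores) / n

# Two well-separated groups of invented 2-D feature vectors.
points = [(0, 0), (0, 1), (1, 0), (10, 10), (10, 11), (11, 10)]
good = [0, 0, 0, 1, 1, 1]   # separates the groups
bad = [0, 1, 0, 1, 0, 1]    # scrambles them
```

For mixed numeric and categorical survey data, the Euclidean distance here would be replaced by the mixed dissimilarity k-prototypes uses, but the index logic is unchanged.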
Procedia PDF Downloads 148
6565 Relation of Radar and Hail Parameters in the Continental Part of Croatia
Authors: Damir Počakal
Abstract:
The continental part of Croatia is exposed, mainly in the summer months, to frequent severe thunderstorms and hail. In the 1960s, aiming to provide protection and reduce damage, an operational hail suppression system was introduced in the area. The currently protected area is 26,800 km² and has about 580 hail suppression stations (rockets and ground generators), which are managed from 8 radar centres (S-band radars). In order to obtain objective and precise hailstone measurements for different research studies, hailpads were installed at all of these stations in 2001. In addition, a dense hailpad network with dimensions of 20 km x 30 km (1 hailpad per 4 km²) was established in 2002 in the area with the highest average number of hail days in Croatia. This paper presents an analysis of the relation between radar-measured parameters of Cb cells at the time of hailfall and physical parameters of hail (maximum diameter, number of hailstones, and kinetic energy) measured on hailpads in the period 2002-2014. In addition, radar parameters of Cb cells with and without hail on the ground, located at the same time over the polygon area, are compared.
Keywords: Cb cell, hail, radar, hailpad
Procedia PDF Downloads 298
6564 Determination of the Local Elastic Moduli of Shungite by Laser Ultrasonic Spectroscopy
Authors: Elena B. Cherepetskaya, Alexander A.Karabutov, Vladimir A. Makarov, Elena A. Mironova, Ivan A. Shibaev
Abstract:
In our study, the object of laser-ultrasonic testing was a plane-parallel plate of shungite (length 41 mm, width 31 mm, height 15 mm, average density 2247 kg/m³). We used a laser-ultrasonic defectoscope with a wideband opto-acoustic transducer in our investigation of the velocities of longitudinal and shear elastic ultrasound waves. The duration of the excited elastic pulses was less than 100 ns. With the material thickness known, the velocities were determined from the time delay of the pulses reflected from the bottom surface of the sample with respect to reference pulses. The accuracy of measurement was 0.3% for the longitudinal wave velocity and 0.5% for the shear wave velocity (the scanning pitch along the surface was 2 mm). On the basis of the measured elastic wave velocities, the local elastic moduli of shungite (Young’s modulus, shear modulus, and Poisson’s ratio) were uniquely determined.
Keywords: laser ultrasonic testing, local elastic moduli, shear wave velocity, shungite
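Given the longitudinal (vp) and shear (vs) velocities and the density, the isotropic elastic moduli follow from the standard relations G = rho*vs², nu = (vp² - 2vs²) / (2(vp² - vs²)), E = 2G(1 + nu). In the sketch below only the density comes from the text; the velocity values are assumed purely for illustration.

```python
def elastic_moduli(rho, vp, vs):
    """Isotropic elastic moduli from density (kg/m^3) and the longitudinal (vp)
    and shear (vs) wave velocities (m/s)."""
    G = rho * vs ** 2                                          # shear modulus (Pa)
    nu = (vp ** 2 - 2 * vs ** 2) / (2 * (vp ** 2 - vs ** 2))   # Poisson's ratio
    E = 2 * G * (1 + nu)                                       # Young's modulus (Pa)
    return E, G, nu

# Density from the text; velocities are hypothetical example values.
E, G, nu = elastic_moduli(rho=2247.0, vp=6000.0, vs=3500.0)
```

Because the moduli depend on the squared velocities, the quoted 0.3% and 0.5% velocity accuracies roughly double when propagated into the moduli.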
Procedia PDF Downloads 313
6563 An Investigation of the Weak Localization, Electron-Electron Interaction and the Superconducting Fluctuations in a Weakly Disordered Granular Aluminum Film
Authors: Rukshana Pervin
Abstract:
We report a detailed study of the transport properties of a 40 nm thick granular aluminum film. As measured by the temperature-dependent resistance R(T), a resistance peak is observed before the transition to superconductivity, which indicates that the diffusion channel is subject to weak localization (WL) and electron-electron interaction (EEI), while the superconducting channel is subject to superconducting fluctuations (SCFs). The zero-magnetic-field transport measurements demonstrated that EEI, WL, and SCFs are closely related in this granular aluminum film. The characteristic temperature at which SCFs emerge in the sample is determined by measuring R(T) during cooling. The SCFs of the film are studied in terms of the direct contribution of the Aslamazov-Larkin fluctuation Cooper pair density and the indirect contribution of the Maki-Thompson quasiparticle pair density. In this sample, the rise in R(T) above the SCF characteristic temperature indicates WL and/or EEI. Comparative analyses are conducted on how the EEI and WL contribute to the upturn in R(T).
Keywords: fluctuation superconductivity, weak localization, thermal deposition, electron-electron interaction
Procedia PDF Downloads 58
6562 Genetically Modified Organisms
Authors: Mudrika Singhal
Abstract:
The research paper is basically about how the genetically modified organisms evolved and their significance in today’s world. It also highlights about the various pros and cons of the genetically modified organisms and the progress of India in this field. A genetically modified organism is the one whose genetic material has been altered using genetic engineering techniques. They have a wide range of uses such as transgenic plants, genetically modified mammals such as mouse and also in insects and aquatic life. Their use is rooted back to the time around 12,000 B.C. when humans domesticated plants and animals. At that humans used genetically modified organisms produced by the procedure of selective breeding and not by genetic engineering techniques. Selective breeding is the procedure in which selective traits are bred in plants and animals and then are domesticated. Domestication of wild plants into a suitable cultigen is a well known example of this technique. GMOs have uses in varied fields ranging from biological and medical research, production of pharmaceutical drugs to agricultural fields. The first organisms to be genetically modified were the microbes because of their simpler genetics. At present the genetically modified protein insulin is used to treat diabetes. In the case of plants transgenic plants, genetically modified crops and cisgenic plants are the examples of genetic modification. In the case of mammals, transgenic animals such as mice, rats etc. serve various purposes such as researching human diseases, improvement in animal health etc. Now coming upon the pros and cons related to the genetically modified organisms, pros include crops with higher yield, less growth time and more predictable in comparison to traditional breeding. Cons include that they are dangerous to mammals such as rats, these products contain protein which would trigger allergic reactions. 
In India at present, GMOs include GM microorganisms, transgenic crops, and transgenic animals, with varied applications in the fields of healthcare and agriculture. In a nutshell, the research paper is about the progress in the field of genetic modification, along with its effects in today’s world.
Keywords: applications, mammals, transgenic, engineering and technology
Procedia PDF Downloads 599
6561 Understanding Beginning Writers' Narrative Writing with a Multidimensional Assessment Approach
Authors: Huijing Wen, Daibao Guo
Abstract:
Writing is thought to be the most complex facet of language arts. Assessing writing is difficult and subjective, and few scientifically validated assessments exist. Research has proposed evaluating writing using a multidimensional approach, including both qualitative and quantitative measures of handwriting, spelling, and prose. Given that narrative writing has historically been a staple of literacy instruction in primary grades and is one of the three major genres the Common Core State Standards require students to acquire starting in kindergarten, it is essential for teachers to understand how to measure beginning writers' writing development and sources of writing difficulties through narrative writing. Guided by theoretical models of early written expression and using empirical data, this study examines ways teachers can enact a comprehensive approach to understanding beginning writers’ narrative writing through three writing rubrics developed for a Curriculum-Based Measurement (CBM). The goal is to help classroom teachers structure a framework for assessing early writing in primary classrooms. Participants in this study included 380 first-grade students from 50 classrooms in 13 schools in three school districts in a Mid-Atlantic state. Three writing tests were used to assess first graders’ writing skills in relation to both transcription (i.e., handwriting fluency and spelling tests) and translational skills (i.e., a narrative prompt). First graders were asked to respond to a narrative prompt in 20 minutes. Grounded in theoretical models of early written expression and empirical evidence of key contributors to early writing, all written responses to the narrative prompt were coded three ways for different dimensions of writing: length, quality, and genre elements. To measure the quality of the narrative writing, a traditional holistic rating rubric was developed by the researchers based on the CCSS and the general traits of good writing. 
Students' genre knowledge was measured using a separate analytic rubric for narrative writing. Findings showed that first graders had emerging and limited transcriptional and translational skills with a nascent knowledge of genre conventions. The findings of the study provide support for the Not-So-Simple View of Writing in that fluent written expression, measured by length, and other important linguistic resources, measured by the overall quality and genre knowledge rubrics, are fundamental in early writing development. Our study echoed previous research findings on children's narrative development. The study has practical classroom application as it informs writing instruction and assessment. It offers practical guidelines for classroom instruction by providing teachers with a better understanding of first graders' narrative writing skills and knowledge of genre conventions. Understanding students’ narrative writing provides teachers with more insight into the specific strategies students might use during writing and their understanding of good narrative writing. Additionally, it is important for teachers to differentiate writing instruction given the individual differences shown by our multiple writing measures. Overall, the study sheds light on beginning writers’ narrative writing, indicating the complexity of early writing development.
Keywords: writing assessment, early writing, beginning writers, transcriptional skills, translational skills, primary grades, simple view of writing, writing rubrics, curriculum-based measurement
Procedia PDF Downloads 82
6560 Surface Geodesic Derivative Pattern for Deformable Textured 3D Object Comparison: Application to Expression and Pose Invariant 3D Face Recognition
Authors: Farshid Hajati, Soheila Gheisari, Ali Cheraghian, Yongsheng Gao
Abstract:
This paper presents a new Surface Geodesic Derivative Pattern (SGDP) for matching textured deformable 3D surfaces. SGDP encodes micro-pattern features based on local surface higher-order derivative variation. It extracts local information by encoding various distinctive textural relationships contained in a geodesic neighborhood, hence fusing texture and range information of a surface at the data level. Geodesic texture rings are encoded into local patterns for similarity measurement between non-rigid 3D surfaces. The performance of the proposed method is evaluated extensively on the Bosphorus and FRGC v2 face databases. Compared to existing benchmarks, experimental results show the effectiveness and superiority of combining the texture and 3D shape data at the earliest level in recognizing typical deformable faces under expression, illumination, and pose variations.
Keywords: 3D face recognition, pose, expression, surface matching, texture
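The abstract does not spell out SGDP's ring encoding, but its planar ancestor, the local binary pattern, conveys the idea: compare each sample on a ring with the center value and pack the comparisons into one integer code. A minimal sketch (our illustration, using a flat 3x3 neighborhood in place of a geodesic ring of derivative values):

```python
import numpy as np

def lbp_code(patch):
    """Encode the 8-neighbor ring of a 3x3 patch as a local binary pattern:
    bit i is set when the i-th ring pixel exceeds the center value. SGDP
    applies the same encoding idea to higher-order derivative values sampled
    on geodesic rings of a textured 3D surface rather than on a pixel grid."""
    center = patch[1, 1]
    ring = [patch[0, 0], patch[0, 1], patch[0, 2], patch[1, 2],
            patch[2, 2], patch[2, 1], patch[2, 0], patch[1, 0]]
    return sum(int(v > center) << i for i, v in enumerate(ring))

# Surfaces are then compared by matching histograms of such codes
# collected over corresponding neighborhoods.
patch = np.array([[5, 5, 5],
                  [5, 3, 1],
                  [1, 1, 1]])
code = lbp_code(patch)  # bits 0-2 and 7 set -> 1 + 2 + 4 + 128 = 135
```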
Procedia PDF Downloads 395
6559 A Multi Sensor Monochrome Video Fusion Using Image Quality Assessment
Authors: M. Prema Kumar, P. Rajesh Kumar
Abstract:
The increasing interest in image fusion (combining images of two or more modalities, such as infrared and visible light radiation) has led to a need for accurate and reliable image assessment methods. This paper gives a novel approach for merging the information content of several videos taken of the same scene in order to build a combined video that contains the finest information coming from the different source videos. This process, known as video fusion, helps in providing a superior-quality image compared with the source images (here, quality connotes a measurement relative to the particular application). In this technique, different sensors are used with the various cameras that capture the required images, and the redundant information among them is reduced during fusion. In this paper, an image fusion technique based on multi-resolution singular value decomposition (MSVD) has been used. Image fusion by MSVD is conceptually similar to wavelet-based fusion: the idea behind MSVD is to replace the FIR filters in the wavelet transform with singular value decomposition (SVD). It is computationally very simple and well suited for real-time applications such as remote sensing and astronomy.
Keywords: multi sensor image fusion, MSVD, image processing, monochrome video
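The MSVD idea can be sketched in a few lines of NumPy. This is a one-level, illustrative version only: the function names and the fusion rule (average the approximation band, keep the larger-magnitude detail coefficients, reconstruct with an averaged basis) are our simplifications, not the authors' exact algorithm.

```python
import numpy as np

def msvd_level(x):
    """One MSVD analysis level: stack 2x2 blocks as columns of a 4 x N
    matrix, then project onto the eigenvectors of the block covariance.
    Row 0 of t is the approximation band; rows 1-3 are detail bands
    (analogous to LL/LH/HL/HH in a wavelet transform, but with
    data-adaptive filters instead of fixed FIR filters)."""
    m, n = x.shape
    a = x.reshape(m // 2, 2, n // 2, 2).transpose(1, 3, 0, 2).reshape(4, -1)
    u, _, _ = np.linalg.svd(a @ a.T)   # 4x4 orthogonal basis
    return u, u.T @ a                  # basis and transform coefficients

def msvd_fuse(x1, x2):
    """Fuse two registered images: average the approximation coefficients,
    keep the larger-magnitude detail coefficients, and reconstruct with
    the averaged basis."""
    u1, t1 = msvd_level(x1)
    u2, t2 = msvd_level(x2)
    tf = np.where(np.abs(t1) >= np.abs(t2), t1, t2)  # max-abs detail rule
    tf[0] = 0.5 * (t1[0] + t2[0])                    # average approximation
    af = (0.5 * (u1 + u2)) @ tf
    m, n = x1.shape
    return af.reshape(2, 2, m // 2, n // 2).transpose(2, 0, 3, 1).reshape(m, n)
```

Fusing an image with itself reconstructs it exactly, which is a quick sanity check that the block reshaping and the orthogonal basis are consistent.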
Procedia PDF Downloads 578
6558 Italian Speech Vowels Landmark Detection through the Legacy Tool 'xkl' with Integration of Combined CNNs and RNNs
Authors: Kaleem Kashif, Tayyaba Anam, Yizhi Wu
Abstract:
This paper introduces a methodology for advancing Italian speech vowel landmark detection within the distinctive feature-based speech recognition domain. Leveraging the legacy tool 'xkl' by integrating combined convolutional neural networks (CNNs) and recurrent neural networks (RNNs), the study presents a comprehensive enhancement to the 'xkl' legacy software. This integration incorporates reassigned spectrogram methodologies, enabling meticulous acoustic analysis. Simultaneously, our proposed model, integrating combined CNNs and RNNs, demonstrates unprecedented precision and robustness in landmark detection. The augmentation of reassigned spectrogram fusion within the 'xkl' software signifies a meticulous advancement, particularly enhancing the precision of vowel formant estimation. This augmentation catalyzes unparalleled accuracy in landmark detection, resulting in a substantial performance leap compared to conventional methods. The proposed model emerges as a state-of-the-art solution in the domain of distinctive feature-based speech recognition systems. In the realm of deep learning, a synergistic integration of combined CNNs and RNNs is introduced, endowed with specialized temporal embeddings, self-attention mechanisms, and positional embeddings. This design allows the model to excel in capturing intricate dependencies within Italian speech vowels, rendering it highly adaptable and sophisticated in the distinctive feature domain. Furthermore, our advanced temporal modeling approach employs Bayesian temporal encoding, refining the measurement of inter-landmark intervals. Comparative analysis against state-of-the-art models reveals a substantial improvement in accuracy, highlighting the robustness and efficacy of the proposed methodology. 
Upon rigorous testing on the LaMIT database, consisting of speech recorded in a silent room by four native Italian speakers, the landmark detector demonstrates exceptional performance, achieving a 95% true detection rate and a 10% false detection rate. A majority of missed landmarks were observed in proximity to reduced vowels. These promising results underscore the robust identifiability of landmarks within the speech waveform, establishing the feasibility of employing a landmark detector as a front end in a speech recognition system. The synergistic integration of reassigned spectrogram fusion, CNNs, RNNs, and Bayesian temporal encoding not only signifies a significant advancement in Italian speech vowel landmark detection but also positions the proposed model as a leader in the field. The model offers distinct advantages, including unparalleled accuracy, adaptability, and sophistication, marking a milestone in the intersection of deep learning and distinctive feature-based speech recognition. This work contributes to the broader scientific community by presenting a methodologically rigorous framework for enhancing landmark detection accuracy in Italian speech vowels. The integration of cutting-edge techniques establishes a foundation for future advancements in speech signal processing, emphasizing the potential of the proposed model in practical applications across various domains requiring robust speech recognition systems.
Keywords: landmark detection, acoustic analysis, convolutional neural network, recurrent neural network
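The reported true and false detection rates follow from matching predicted landmark times against a reference annotation within a tolerance window. A minimal scoring sketch (the 20 ms tolerance and greedy one-to-one matching are our assumptions, not taken from the paper):

```python
def landmark_rates(ref, pred, tol=0.02):
    """Greedily match predicted landmark times (seconds) to reference
    times within +/- tol. Returns (true_detection_rate,
    false_detection_rate): the fraction of reference landmarks detected,
    and the fraction of predictions matching no reference landmark."""
    ref_left = sorted(ref)
    hits = 0
    for p in sorted(pred):
        for r in ref_left:
            if abs(p - r) <= tol:
                ref_left.remove(r)   # each reference matched at most once
                hits += 1
                break
    tdr = hits / len(ref) if ref else 0.0
    fdr = (len(pred) - hits) / len(pred) if pred else 0.0
    return tdr, fdr
```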
Procedia PDF Downloads 67
6557 Using the Technology Acceptance Model to Examine Seniors’ Attitudes toward Facebook
Authors: Chien-Jen Liu, Shu Ching Yang
Abstract:
Using the technology acceptance model (TAM), this study examined the external variables of technological complexity (TC) to acquire a better understanding of the factors that influence the acceptance of computer application courses by learners at Active Aging Universities. After the learners in this study had completed a 27-hour Facebook course, 44 learners responded to a modified TAM survey. Data were collected to examine the path relationships among the variables that influence the acceptance of Facebook-mediated community learning. The partial least squares (PLS) method was used to test the measurement and the structural model. The study results demonstrated that attitudes toward Facebook use directly influence behavioral intentions (BI) with respect to Facebook use, evincing a high prediction rate of 58.3%. In addition to the perceived usefulness (PU) and perceived ease of use (PEOU) measures that are proposed in the TAM, other external variables, such as TC, also indirectly influence BI. These four variables can explain 88% of the variance in BI and demonstrate a high level of predictive ability. Finally, limitations of this investigation and implications for further research are discussed.
Keywords: technology acceptance model (TAM), technological complexity, partial least squares (PLS), perceived usefulness
Procedia PDF Downloads 349
6556 Rapid Algorithm for GPS Signal Acquisition
Authors: Fabricio Costa Silva, Samuel Xavier de Souza
Abstract:
A Global Positioning System (GPS) receiver is responsible for determining position, velocity, and timing information by using satellite signals. To obtain this information, it is necessary to correlate the incoming signal with a locally generated one. The procedure, called acquisition, must find two parameters: the frequency and the code phase of the incoming signal. This is very time-consuming, so several techniques exist to reduce the computational complexity, but each of them brings project requirements into conflict. In this paper, we present a method that can reduce the computational complexity by reducing the search space and parallelizing the search.
Keywords: GPS, acquisition, complexity, parallelism
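The search-space-reduction idea can be illustrated with the standard FFT-based parallel code-phase search: for each Doppler hypothesis, one FFT-based circular correlation evaluates every code phase at once, collapsing an entire search dimension. A hedged NumPy sketch (the abstract does not detail the paper's specific reduction or parallelization; this shows only the classic baseline):

```python
import numpy as np

def acquire(signal, code, doppler_bins, fs):
    """Parallel code-phase search: for each Doppler hypothesis, wipe off
    the carrier and correlate against all code phases at once via the FFT.
    Returns (best_doppler_hz, best_code_phase, peak_metric)."""
    n = len(code)
    t = np.arange(n) / fs
    code_fft = np.conj(np.fft.fft(code))
    best = (0.0, 0, -1.0)
    for fd in doppler_bins:
        wiped = signal * np.exp(-2j * np.pi * fd * t)          # carrier wipe-off
        corr = np.abs(np.fft.ifft(np.fft.fft(wiped) * code_fft))  # all phases
        k = int(np.argmax(corr))
        if corr[k] > best[2]:
            best = (fd, k, float(corr[k]))
    return best
```

With a known spreading code, a synthetic signal shifted by 100 chips and offset by 10 Hz is recovered at the correct Doppler bin and code phase.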
Procedia PDF Downloads 542
6555 Advanced Stability Criterion for Time-Delayed Systems of Neutral Type and Its Application
Authors: M. J. Park, S. H. Lee, C. H. Lee, O. M. Kwon
Abstract:
This paper investigates the stability problem for linear systems of neutral type with time-varying delay. By constructing various Lyapunov-Krasovskii functionals and utilizing some mathematical techniques, sufficient stability conditions for the systems are established in terms of linear matrix inequalities (LMIs), which can be easily solved by various effective optimization algorithms. Finally, some illustrative examples are given to show the effectiveness of the proposed criterion.
Keywords: neutral systems, time-delay, stability, Lyapunov method, LMI
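In the delay-free special case, the LMI machinery reduces to the classical Lyapunov inequality A^T P + P A < 0 with P > 0, which can be checked numerically. A minimal SciPy sketch of that solution approach (an illustration only, not the paper's neutral-system criterion):

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

def lyapunov_stable(a):
    """Certify asymptotic stability of x' = A x: solve A^T P + P A = -I
    and test whether the (symmetrized) solution P is positive definite.
    The neutral time-delay criterion in the paper replaces this single
    equation with LMIs built from a Lyapunov-Krasovskii functional,
    solved by the same kind of numerical machinery."""
    p = solve_continuous_lyapunov(a.T, -np.eye(a.shape[0]))
    p = 0.5 * (p + p.T)                      # enforce symmetry numerically
    return bool(np.all(np.linalg.eigvalsh(p) > 0))
```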
Procedia PDF Downloads 352
6554 Effects of Plyometric Exercises on Agility, Power and Speed Improvement of U-17 Female Sprinters in Case of Burayu Athletics Project, Oromia, Ethiopia
Authors: Abdeta Bayissa Mekessa
Abstract:
The purpose of this study was to examine the effects of plyometric exercises on the agility, power, and speed improvement of U-17 female sprinters in the case of the Burayu Athletics project. A true experimental research design was employed. The total population of the study was 14 U-17 female sprinters from the Burayu Athletics project. Because the population was small, the researcher took all of them as the sample, using a comprehensive sampling technique. The subjects were assigned to an experimental group (N=7) and a control group (N=7) using simple random sampling. The experimental group participated in plyometric training for 8 weeks, 3 days per week, with a duration of 60 minutes per day, in addition to their regular training, while the control group followed only their regular training program. The variables selected for this study were agility, power, and speed, measured with the Illinois agility test, the standing long jump test, and the 30 m sprint test, respectively. Both groups were tested before (pre-test) and after (post-test) the 8 weeks of plyometric training. For data analysis, the researcher used SPSS version 26.0. The collected data were analyzed using a paired-sample t-test to observe the difference between the pre-test and post-test results, with a significance level of p<0.05. The results show that after 8 weeks of plyometric training, significant improvements were found in agility (MD=0.45, p<0.05), power (MD=-1.157, p<0.05), and speed (MD=0.37, p<0.05) for the experimental group. On the other hand, there was no significant change (p>0.05) in those variables in the control group. Finally, the findings of the study showed that eight (8) weeks of plyometric exercises had a positive effect on the agility, power, and speed improvement of female sprinters. 
Therefore, athletics coaches and athletes are highly recommended to include plyometric exercises in their training programs.
Keywords: plyometric exercise, speed, power, agility, female sprinters
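The paired-sample t-test used in the analysis is straightforward to reproduce. A sketch with hypothetical 30 m sprint times (illustrative numbers only, not the study's data):

```python
import numpy as np
from scipy import stats

# Hypothetical pre/post 30 m sprint times (seconds) for n = 7 athletes.
pre = np.array([5.10, 4.95, 5.30, 5.05, 5.20, 4.88, 5.15])
post = np.array([4.78, 4.60, 4.95, 4.70, 4.79, 4.55, 4.81])

t_stat, p_value = stats.ttest_rel(pre, post)  # paired-sample t-test
mean_diff = float(np.mean(pre - post))        # the MD value, as reported
significant = bool(p_value < 0.05)            # two-tailed, alpha = 0.05
```

A consistent per-athlete improvement yields a small p-value even with n = 7, which is why the paired design suits a pre/post comparison on the same subjects.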
Procedia PDF Downloads 43
6553 Heuristic of Style Transfer for Real-Time Detection or Classification of Weather Conditions from Camera Images
Authors: Hamed Ouattara, Pierre Duthon, Frédéric Bernardin, Omar Ait Aider, Pascal Salmane
Abstract:
In this article, we present three neural network architectures for real-time classification of weather conditions (sunny, rainy, snowy, foggy) from images. Inspired by recent advances in style transfer, two of these architectures, Truncated ResNet50 and Truncated ResNet50 with Gram Matrix and Attention, surpass the state of the art and demonstrate remarkable generalization capability on several public databases, including Kaggle (2000 images), Kaggle (850 images), MWI (1996 images) [1], and Image2Weather [2]. Although developed for weather detection, these architectures are also suitable for other appearance-based classification tasks, such as animal species recognition, texture classification, disease detection in medical images, and industrial defect identification. We illustrate these applications in the section “Applications of Our Models to Other Tasks” with the “SIIM-ISIC Melanoma Classification Challenge 2020” [3].
Keywords: weather simulation, weather measurement, weather classification, weather detection, style transfer, Pix2Pix, CycleGAN, CUT, neural style transfer
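The Gram-matrix branch borrows the style-transfer observation that channel-wise feature correlations capture texture statistics (rain streaks, fog haze) largely independently of spatial layout. A minimal sketch of the descriptor (the normalization constant is our choice):

```python
import numpy as np

def gram_matrix(features):
    """Gram matrix of a C x H x W feature map, as used in style transfer:
    the C x C matrix of channel-wise inner products, which summarizes
    texture/'style' statistics while discarding spatial arrangement."""
    c, h, w = features.shape
    f = features.reshape(c, h * w)
    return (f @ f.T) / (c * h * w)   # normalized channel-correlation matrix
```

In the classification setting, such a matrix (flattened) can be fed to a small head on top of the truncated backbone's feature maps.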
Procedia PDF Downloads 17
6552 Influence of Chemical Processing Treatment on Handle Properties of Worsted Suiting Fabric
Authors: Priyanka Lokhande, Ram P. Sawant, Ganesh Kakad, Avinash Kolhatkar
Abstract:
In order to evaluate the influence of chemical processing on the low-stress mechanical properties and fabric hand of worsted cloth, eight worsted suiting fabric samples of balanced plain and twill weave were studied. The Kawabata KES-FB system was used to measure the low-stress mechanical properties of the worsted suiting fabrics before and after chemical processing. Primary hand values and Total Hand Values (THV) of the fabrics before and after chemical processing were calculated using the KES-FB test data. Upon statistical analysis, it is observed that chemical processing has a considerable influence on the low-stress mechanical properties and thereby on the handle properties of worsted suiting fabrics. An improvement in the Total Hand Value (THV) after chemical processing was experienced in most of the fabric samples.
Keywords: low stress mechanical properties, plain and twill weave, total hand value (THV), worsted suiting fabric
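Kawabata-style conversion from primary hand values Y_i to a Total Hand Value takes a quadratic regression form, roughly THV = C0 + Σ_i (C1_i·Y_i + C2_i·Y_i²). A sketch with placeholder coefficients (a simplified form: the published KES regression constants are category-specific and include standardization terms, which are omitted here):

```python
def total_hand_value(primary, c0, c1, c2):
    """Simplified Kawabata-style THV: a quadratic combination of primary
    hand values (e.g. koshi/stiffness, numeri/smoothness, fukurami/fullness).
    All coefficients here are hypothetical placeholders."""
    return c0 + sum(a * y + b * y * y for y, a, b in zip(primary, c1, c2))
```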
Procedia PDF Downloads 286
6551 Basic Evaluation for Polyetherimide Membrane Using Spectroscopy Techniques
Authors: Hanan Alenezi
Abstract:
Membrane performance depends on the kind of solvent used in preparation. A membrane made from polyetherimide (PEI) was evaluated for gas separation using X-ray diffraction (XRD), scanning electron microscopy (SEM), and energy-dispersive X-ray spectroscopy (EDS). The purity and thickness were determined in order to evaluate the membrane and optimize PEI membrane preparation.
Keywords: Energy Dispersive X-Ray Spectroscopy (EDS), Membrane, Polyetherimide PEI, Scanning electron microscope (SEM), Solvent, X-Ray Diffraction (XRD)
Procedia PDF Downloads 187
6550 Pedagogical Tools In The 21st Century
Authors: M. Aherrahrou
Abstract:
Moroccan education is currently facing many difficulties and problems due to traditional methods of teaching. Neuro-Linguistic Programming (NLP) appears to hold much potential for education at all levels. In this paper, the major aim is to explore the effect of certain Neuro-Linguistic Programming techniques in one educational institution in Morocco. Quantitative and qualitative methods are used. The findings demonstrate the effectiveness of this new approach in Moroccan education, and it is a promising tool for improving the quality of learning.
Keywords: learning and teaching environment, Neuro-Linguistic Programming, education, quality of learning
Procedia PDF Downloads 358
6549 Personalizing Human Physical Life Routines Recognition over Cloud-based Sensor Data via AI and Machine Learning
Authors: Kaushik Sathupadi, Sandesh Achar
Abstract:
Pervasive computing is a growing research field that aims to recognize human physical life routines (HPLR) based on body-worn sensors such as MEMS-based technologies. The use of these technologies for human activity recognition is progressively increasing. On the other hand, personalizing human life routines using numerous machine-learning techniques has always been an intriguing topic. Various methods have demonstrated the ability to recognize basic movement patterns, but they still need improvement to anticipate the dynamics of human living patterns. This study introduces state-of-the-art techniques for recognizing static and dynamic patterns and forecasting those challenging activities from multi-fused sensors. Furthermore, numerous MEMS signals are extracted from one self-annotated IM-WSHA dataset and two benchmarked datasets. First, the acquired raw data are filtered with z-normalization and denoising methods. Then, we adopt statistical, local binary pattern, auto-regressive model, and intrinsic time-scale decomposition features for feature extraction from different domains. Next, the acquired features are optimized using maximum relevance and minimum redundancy (mRMR). Finally, an artificial neural network is applied to analyze the whole system's performance. As a result, we attained a 90.27% recognition rate for the self-annotated dataset, while HARTH and KU-HAR achieved 83% on nine living activities and 90.94% on 18 static and dynamic routines. Thus, the proposed HPLR system outperformed other state-of-the-art systems when evaluated against other methods in the literature.
Keywords: artificial intelligence, machine learning, gait analysis, local binary pattern (LBP), statistical features, micro-electro-mechanical systems (MEMS), maximum relevance and minimum redundancy (mRMR)
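Two steps of the pipeline above, z-normalization and mRMR feature selection, can be sketched compactly. In this illustration, absolute Pearson correlation stands in for mutual information (a common simplification; the paper does not specify its estimator), and all names are ours:

```python
import numpy as np

def zscore(signal):
    """Z-normalization used in preprocessing: zero mean, unit variance."""
    return (signal - signal.mean()) / signal.std()

def mrmr(x, y, k):
    """Greedy mRMR selection of k columns of x: at each step pick the
    feature maximizing relevance to the label y minus its mean redundancy
    (correlation) with the features already selected."""
    n_feat = x.shape[1]
    rel = np.array([abs(np.corrcoef(x[:, j], y)[0, 1]) for j in range(n_feat)])
    selected = [int(np.argmax(rel))]          # most relevant feature first
    while len(selected) < k:
        best, best_score = None, -np.inf
        for j in range(n_feat):
            if j in selected:
                continue
            red = np.mean([abs(np.corrcoef(x[:, j], x[:, s])[0, 1])
                           for s in selected])
            if rel[j] - red > best_score:
                best, best_score = j, rel[j] - red
        selected.append(best)
    return selected
```

Given a duplicated relevant feature, the redundancy penalty steers the second pick toward an informative but non-redundant column rather than the duplicate.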
Procedia PDF Downloads 25