Search results for: data mining techniques
26906 Addressing Oral Sensory Issues and Possible Remediation in Children with Autism Spectrum Disorders: Illustrated with a Case Study
Authors: A. K. Aswathy, Asha Manoharan, Arya Manoharan
Abstract:
The purposes of this study are to define the nature of oral sensory issues in children with autism spectrum disorder (ASD), identify important components of the assessment and treatment of these issues specific to this population, and delineate specific therapeutic techniques designed to improve assessment and treatment within therapeutic settings. A literature review and a case example are used to define the predominant nature of the oral sensory issues experienced by some children on the autism spectrum. Characteristics of this complex disorder that can have an impact on feeding skill and behavior are also identified. These factors are then integrated to create assessment and intervention techniques that can be used in conjunction with traditional feeding approaches to facilitate improvements in eating as well as to reduce the oral apraxic component in this unique population. The complex nature of ASD and its many influences on feeding skills and behavior create the need for modifications to both assessment and treatment approaches. Additional research is needed to create therapeutic protocols that speech-language pathologists can use to effectively assess and treat the feeding and oromotor apraxic difficulties commonly encountered in children with ASD. Keywords: autism, assessment, feeding, intervention, oral sensory issues, oral apraxia
Procedia PDF Downloads 309
26905 Impact of Natural Degradation of Low Density Polyethylene on Its Morphology
Authors: Meryem Imane Babaghayou, Asma Abdelhafidi, Salem Fouad Chabira, Mohammed Sebaa
Abstract:
A challenge for the plastics industry is to produce materials that resist degradation in their application environment, guaranteeing a longer lifetime and therefore an optimal time of use. Blown-extruded films of low-density polyethylene (LDPE), supplied by SABIC (Saudi Arabia) and processed at the SOFIPLAST company in Setif, Algeria, were subjected to climatic ageing at a sub-Saharan facility in Laghouat (Algeria) with direct exposure to the sun. Samples were characterized by X-ray diffraction (XRD) and differential scanning calorimetry (DSC) after prescribed amounts of time up to 8 months. These two techniques reveal the impact of UV irradiation on the morphological development of a plastic material, especially the degree of crystallinity, which increases with exposure time. These morphological changes result from photooxidative reactions leading to crosslinking at the beginning of ageing and to chain scissions at an advanced stage, the latter being chiefly responsible. The change in the degree of crystallinity is essentially controlled by the secondary crystallization of the amorphous chains, whose mobility is enhanced by the chain scission processes. The diffusion of these short segments to the surface of the lamellae increases their thickness. The results presented highlight the complexity of the phenomena involved. Keywords: low-density polyethylene (LDPE), crystallinity, ageing, XRD, DSC
Procedia PDF Downloads 408
26904 The Potential in the Use of Building Information Modelling and Life-Cycle Assessment for Retrofitting Buildings: A Study Based on Interviews with Experts in Both Fields
Authors: Alex Gonzalez Caceres, Jan Karlshøj, Tor Arvid Vik
Abstract:
The life cycle of residential buildings is expected to span several decades, and 40% of European residential buildings have inefficient energy conservation measures. Existing buildings account for 20-40% of energy use and CO₂ emissions. Since net zero energy buildings are a short-term goal (to be achieved by EU countries after 2020), it is necessary to plan the next logical step, which is to retrofit the existing, outdated building stock into energy-efficient buildings. To accomplish this, two specialized and widespread tools can be used: Building Information Modelling (BIM) and life-cycle assessment (LCA). BIM and LCA are used by a variety of disciplines; both are able to represent and analyze constructions at different stages. The combination of these technologies could greatly improve retrofitting techniques by incorporating the carbon footprint and introducing a single database source for different material analyses, with the added possibility of considering different analysis approaches such as costs and energy savings. These measures are expected to enrich decision-making. The methodology is based on two main activities. The first task involved the collection of data, accomplished through a literature review and interviews with experts in the retrofitting field and BIM technologies; the results of this task are presented as an evaluation checklist of BIM's ability to manage data and improve decision-making in retrofitting projects. The last activity involves an evaluation, using the results of the previous tasks, of how far the IFC format can support the requirements of each specialist and its use by third-party software. The results indicate that BIM/LCA have great potential to improve the retrofitting process in existing buildings, but some modifications must be made in order to meet the requirements of the specialists for both retrofitting and LCA evaluation. Keywords: retrofitting, BIM, LCA, energy efficiency
Procedia PDF Downloads 220
26903 A DEA Model in a Multi-Objective Optimization with Fuzzy Environment
Authors: Michael Gidey Gebru
Abstract:
Most DEA models operate in a static environment with input and output parameters given by deterministic data. However, due to ambiguity brought on by shifting market conditions, input and output data are not always precisely gathered in real-world scenarios. Fuzzy numbers can be used to address this kind of ambiguity in input and output data. Therefore, this work aims to extend crisp DEA to DEA in a fuzzy environment. In this study, the input and output data are regarded as triangular fuzzy numbers. The DEA model with a fuzzy environment is then solved using a multi-objective method to gauge the efficiency of the Decision Making Units. Finally, the developed DEA model is illustrated with an application to real data from 50 educational institutions. Keywords: efficiency, DEA, fuzzy, decision making units, higher education institutions
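As a rough illustration of the modelling idea (not the authors' multi-objective formulation), the Python sketch below defuzzifies triangular fuzzy inputs and outputs by their centroids and then scores each Decision Making Unit with the standard input-oriented CCR linear program; all data values are hypothetical.

```python
import numpy as np
from scipy.optimize import linprog

def defuzzify(tri):
    """Centroid of a triangular fuzzy number (low, mid, high) -- one common simplification."""
    l, m, u = tri
    return (l + m + u) / 3.0

def ccr_efficiency(X, Y):
    """Input-oriented CCR efficiencies for crisp (defuzzified) data.
    X: (n_dmu, n_inputs), Y: (n_dmu, n_outputs)."""
    n, m = X.shape
    s = Y.shape[1]
    eff = np.zeros(n)
    for o in range(n):
        c = np.concatenate([-Y[o], np.zeros(m)])             # maximise u . y_o
        A_eq = np.concatenate([np.zeros(s), X[o]])[None, :]  # v . x_o = 1
        A_ub = np.hstack([Y, -X])                            # u . y_j - v . x_j <= 0 for all j
        res = linprog(c, A_ub=A_ub, b_ub=np.zeros(n),
                      A_eq=A_eq, b_eq=[1.0],
                      bounds=[(1e-6, None)] * (s + m), method="highs")
        eff[o] = -res.fun
    return eff

# Toy fuzzy data: each entry is a triangular fuzzy number (low, mid, high)
X_fuzzy = [[(9, 10, 11)], [(14, 15, 17)], [(7, 8, 9)]]   # one input per DMU
Y_fuzzy = [[(4, 5, 6)], [(9, 10, 11)], [(5, 6, 7)]]      # one output per DMU
X = np.array([[defuzzify(t) for t in row] for row in X_fuzzy])
Y = np.array([[defuzzify(t) for t in row] for row in Y_fuzzy])
print(ccr_efficiency(X, Y))
```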
Procedia PDF Downloads 52
26902 Industrial Rock Characterization using Nuclear Magnetic Resonance (NMR): A Case Study of Ewekoro Quarry
Authors: Olawale Babatunde Olatinsu, Deborah Oluwaseun Olorode
Abstract:
Industrial rocks were collected from a quarry site at Ewekoro in south-western Nigeria and analysed using Nuclear Magnetic Resonance (NMR) technique. NMR measurement was conducted on the samples in partial water-saturated and full brine-saturated conditions. Raw NMR data were analysed with the aid of T2 curves and T2 spectra generated by inversion of raw NMR data using conventional regularized least-squares inversion routine. Results show that NMR transverse relaxation (T2) signatures fairly adequately distinguish between the rock types. Similar T2 curve trend and rates at partial saturation suggests that the relaxation is mainly due to adsorption of water on micropores of similar sizes while T2 curves at full saturation depict relaxation decay rate as: 1/T2(shale)>1/ T2(glauconite)>1/ T2(limestone) and 1/T2(sandstone). NMR T2 distributions at full brine-saturation show: unimodal distribution in shale; bimodal distribution in sandstone and glauconite; and trimodal distribution in limestone. Full saturation T2 distributions revealed the presence of well-developed and more abundant micropores in all the samples with T2 in the range, 402-504 μs. Mesopores with amplitudes much lower than those of micropores are present in limestone, sandstone and glauconite with T2 range: 8.45-26.10 ms, 6.02-10.55 ms, and 9.45-13.26 ms respectively. Very low amplitude macropores of T2 values, 90.26-312.16 ms, are only recognizable in limestone samples. Samples with multiple peaks showed well-connected pore systems with sandstone having the highest degree of connectivity. The difference in T2 curves and distributions for the rocks at full saturation can be utilised as a potent diagnostic tool for discrimination of these rock types found at Ewekoro.Keywords: Ewekoro, NMR techniques, industrial rocks, characterization, relaxation
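The inversion of raw NMR decays into a T2 spectrum mentioned above can be sketched as a regularized non-negative least-squares problem. The Python snippet below uses a synthetic bimodal decay and an assumed exponential kernel; it is not the authors' data or their exact inversion routine.

```python
import numpy as np
from scipy.optimize import nnls

def invert_t2(t, decay, t2_grid, lam=1.0):
    """Regularized inversion of a CPMG-style decay into a T2 amplitude spectrum."""
    K = np.exp(-t[:, None] / t2_grid[None, :])          # relaxation kernel
    # Tikhonov regularization folded into a non-negative least-squares problem
    A = np.vstack([K, np.sqrt(lam) * np.eye(len(t2_grid))])
    b = np.concatenate([decay, np.zeros(len(t2_grid))])
    f, _ = nnls(A, b)
    return f                                            # amplitudes on t2_grid

# Synthetic bimodal example (micropore + mesopore signal, times in seconds)
t = np.linspace(1e-4, 1.0, 2000)
decay = 0.7 * np.exp(-t / 5e-4) + 0.3 * np.exp(-t / 2e-2) + 0.01 * np.random.randn(t.size)
t2_grid = np.logspace(-4, 0, 100)
spectrum = invert_t2(t, decay, t2_grid, lam=0.5)        # plot against t2_grid for the T2 distribution
```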
Procedia PDF Downloads 297
26901 Early Detection of Lymphedema in Post-Surgery Oncology Patients
Authors: Sneha Noble, Rahul Krishnan, Uma G., D. K. Vijaykumar
Abstract:
Breast-Cancer related Lymphedema is a major problem that affects many women. Lymphedema is the swelling that generally occurs in the arms or legs caused by the removal of or damage to lymph nodes as a part of cancer treatment. Treating it at the earliest possible stage is the best way to manage the condition and prevent it from leading to pain, recurrent infection, reduced mobility, and impaired function. So, this project aims to focus on the multi-modal approaches to identify the risks of Lymphedema in post-surgical oncology patients and prevent it at the earliest. The Kinect IR Sensor is utilized to capture the images of the body and after image processing techniques, the region of interest is obtained. Then, performing the voxelization method will provide volume measurements in pre-operative and post-operative periods in patients. The formation of a mathematical model will help in the comparison of values. Clinical pathological data of patients will be investigated to assess the factors responsible for the development of lymphedema and its risks.Keywords: Kinect IR sensor, Lymphedema, voxelization, lymph nodes
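A minimal sketch of the voxelization step, assuming a segmented (x, y, z) point cloud of the limb region captured by the Kinect IR sensor; a real pipeline would also close and fill the surface before counting voxels. All names and values below are hypothetical.

```python
import numpy as np

def voxel_volume(points, voxel_size=0.005):
    """Estimate limb volume (m^3) by counting occupied voxels in a 3D point cloud."""
    idx = np.floor(points / voxel_size).astype(np.int64)   # voxel index of each point
    occupied = np.unique(idx, axis=0).shape[0]              # distinct occupied voxels
    return occupied * voxel_size ** 3

# pre_op, post_op: (N, 3) arrays of x, y, z coordinates of the segmented arm region
pre_op = np.random.rand(50000, 3) * [0.08, 0.08, 0.40]      # stand-in data
post_op = pre_op * [1.05, 1.05, 1.0]                        # simulated swelling
print(voxel_volume(post_op) - voxel_volume(pre_op))          # volume change estimate
```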
Procedia PDF Downloads 138
26900 Spatial Interpolation of Intermediate Soil Properties to Enhance Geotechnical Surveying for Foundation Design
Authors: Yelbek B. Utepov, Assel T. Mukhamejanova, Aliya K. Aldungarova, Aida G. Nazarova, Sabit A. Karaulov, Nurgul T. Alibekova, Aigul K. Kozhas, Dias Kazhimkanuly, Akmaral K. Tleubayeva
Abstract:
This research focuses on enhancing geotechnical surveying for foundation design through the spatial interpolation of intermediate soil properties. Traditional geotechnical practices rely on discrete data from borehole drilling, soil sampling, and laboratory analyses, often neglecting the continuous nature of soil properties and disregarding values at intermediate locations. This study addresses these omissions by employing interpolation techniques such as Kriging, Inverse Distance Weighting, and spline interpolation to capture the nuanced spatial variations in soil properties. The methodology is applied to geotechnical survey data from two construction sites in Astana, Kazakhstan, yielding continuous representations of Young's modulus, cohesion, and friction angle. The spatial heatmaps generated through interpolation offer valuable insights into the subsurface environment, highlighting heterogeneity and aiding more informed foundation design decisions for the considered sites. Moreover, intriguing patterns of heterogeneity, as well as visual clusters and transitions between soil classes, were explored within seemingly uniform layers. The study bridges the gap between discrete borehole samples and the continuous subsurface, contributing to the evolution of geotechnical engineering practices. The proposed approach, utilizing open-source geographic information system software, provides a practical tool for visualizing soil characteristics and may pave the way for future advancements in geotechnical surveying and foundation design. Keywords: soil mechanical properties, spatial interpolation, inverse distance weighting, heatmaps
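As an illustration of one of the interpolators named above, the short Python sketch below applies Inverse Distance Weighting to a handful of hypothetical borehole values of Young's modulus to produce a heatmap-ready grid; it is not the project's GIS workflow.

```python
import numpy as np

def idw(xy_known, values, xy_query, power=2.0, eps=1e-12):
    """Inverse Distance Weighting of borehole-derived soil properties onto query points."""
    d = np.linalg.norm(xy_query[:, None, :] - xy_known[None, :, :], axis=-1)
    w = 1.0 / (d ** power + eps)                   # closer boreholes weigh more
    return (w @ values) / w.sum(axis=1)

# Boreholes: x, y in metres and a measured Young's modulus in MPa (illustrative numbers)
boreholes = np.array([[0, 0], [30, 5], [12, 40], [45, 35]], dtype=float)
E = np.array([18.0, 22.5, 15.2, 27.8])
gx, gy = np.meshgrid(np.linspace(0, 50, 100), np.linspace(0, 50, 100))
grid = np.column_stack([gx.ravel(), gy.ravel()])
E_map = idw(boreholes, E, grid).reshape(gx.shape)  # heatmap-ready surface
```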
Procedia PDF Downloads 85
26899 Data-Driven Decision Making: Justification of Not Leaving Class without It
Authors: Denise Hexom, Judith Menoher
Abstract:
Teachers and administrators across America are being asked to use data and hard evidence to inform practice as they begin the task of implementing Common Core State Standards. Yet, the courses they are taking in schools of education are not preparing teachers or principals to understand the data-driven decision making (DDDM) process nor to utilize data in a much more sophisticated fashion. DDDM has been around for quite some time, however, it has only recently become systematically and consistently applied in the field of education. This paper discusses the theoretical framework of DDDM; empirical evidence supporting the effectiveness of DDDM; a process a department in a school of education has utilized to implement DDDM; and recommendations to other schools of education who attempt to implement DDDM in their decision-making processes and in their students’ coursework.Keywords: data-driven decision making, institute of higher education, special education, continuous improvement
Procedia PDF Downloads 387
26898 Quantile Coherence Analysis: Application to Precipitation Data
Authors: Yaeji Lim, Hee-Seok Oh
Abstract:
Coherence analysis measures the linear time-invariant relationship between two data sets and has been studied in various fields such as signal processing, engineering, and medical science. However, classical coherence analysis tends to be sensitive to outliers and focuses only on the mean relationship. In this paper, we generalize the cross periodogram to the quantile cross periodogram, which provides a richer picture of the inter-relationship between two data sets. This is a general version of the Laplace cross periodogram. We prove its asymptotic distribution under long-range processes and compare it with ordinary coherence through numerical examples. We also present a real data example to confirm the usefulness of quantile coherence analysis. Keywords: coherence, cross periodogram, spectrum, quantile
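A minimal sketch of the idea, assuming the quantile cross periodogram is formed from the tau-quantile indicator (clipped) series of each record; the smoothing scheme and estimator details here are simplifications, not the paper's formulation.

```python
import numpy as np

def quantile_coherence(x, y, tau=0.5, smooth=5):
    """Rough sketch: coherence between the tau-quantile indicator series of x and y."""
    n = len(x)
    # Clip each series at its tau-quantile (the quantile-crossing indicator process)
    ix = (x <= np.quantile(x, tau)).astype(float) - tau
    iy = (y <= np.quantile(y, tau)).astype(float) - tau
    fx, fy = np.fft.rfft(ix), np.fft.rfft(iy)
    # Raw (cross-)periodograms
    pxx, pyy, pxy = np.abs(fx) ** 2, np.abs(fy) ** 2, fx * np.conj(fy)
    # Simple moving-average smoothing so the coherence is not identically 1
    k = np.ones(smooth) / smooth
    sxx = np.convolve(pxx, k, mode="same")
    syy = np.convolve(pyy, k, mode="same")
    sxy = np.convolve(pxy, k, mode="same")
    coh = np.abs(sxy) ** 2 / (sxx * syy)
    return np.fft.rfftfreq(n), coh

# Example with two correlated synthetic "precipitation" series
rng = np.random.default_rng(0)
x = rng.gamma(2.0, 1.0, 1024)
y = 0.6 * x + rng.gamma(2.0, 1.0, 1024)
freqs, coh = quantile_coherence(x, y, tau=0.9)   # coherence at the upper tail
```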
Procedia PDF Downloads 390
26897 Material Parameter Identification of Modified AbdelKarim-Ohno Model
Authors: Martin Cermak, Tomas Karasek, Jaroslav Rojicek
Abstract:
The key role in phenomenological modelling of cyclic plasticity is good understanding of stress-strain behaviour of given material. There are many models describing behaviour of materials using numerous parameters and constants. Combination of individual parameters in those material models significantly determines whether observed and predicted results are in compliance. Parameter identification techniques such as random gradient, genetic algorithm, and sensitivity analysis are used for identification of parameters using numerical modelling and simulation. In this paper genetic algorithm and sensitivity analysis are used to study effect of 4 parameters of modified AbdelKarim-Ohno cyclic plasticity model. Results predicted by Finite Element (FE) simulation are compared with experimental data from biaxial ratcheting test with semi-elliptical loading path.Keywords: genetic algorithm, sensitivity analysis, inverse approach, finite element method, cyclic plasticity, ratcheting
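The genetic-algorithm identification loop can be sketched as below. The constitutive model here is a hypothetical stand-in for the modified AbdelKarim-Ohno response, which in the study is evaluated by FE simulation of the biaxial ratcheting test; the bounds and data are illustrative only.

```python
import numpy as np

rng = np.random.default_rng(0)

def model(params, strain):
    """Stand-in for the FE/constitutive response; the real study runs an FE simulation."""
    a, b, c, d = params
    return a * strain + b * (1 - np.exp(-c * strain)) + d * strain ** 2

def fitness(params, strain, measured):
    return -np.sum((model(params, strain) - measured) ** 2)      # maximise negative SSE

def genetic_search(strain, measured, bounds, pop=60, gens=200, mut=0.1):
    lo, hi = np.array(bounds).T
    P = rng.uniform(lo, hi, size=(pop, len(bounds)))
    for _ in range(gens):
        scores = np.array([fitness(p, strain, measured) for p in P])
        elite = P[np.argsort(scores)[-pop // 2:]]                 # selection
        mates = elite[rng.integers(0, len(elite), pop // 2)]
        alpha = rng.random((pop // 2, 1))
        children = alpha * mates + (1 - alpha) * elite[rng.integers(0, len(elite), pop // 2)]  # crossover
        children += rng.normal(0, mut * (hi - lo), children.shape)  # mutation
        P = np.vstack([elite, np.clip(children, lo, hi)])
    scores = np.array([fitness(p, strain, measured) for p in P])
    return P[np.argmax(scores)]

# Synthetic "experiment" generated with known parameters, then recovered
strain = np.linspace(0, 0.05, 50)
measured = model((2000.0, 150.0, 80.0, 500.0), strain) + rng.normal(0, 1.0, strain.size)
print(genetic_search(strain, measured, bounds=[(0, 5000), (0, 500), (0, 200), (0, 2000)]))
```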
Procedia PDF Downloads 453
26896 Enhancing Code Security with AI-Powered Vulnerability Detection
Authors: Zzibu Mark Brian
Abstract:
As software systems become increasingly complex, ensuring code security is a growing concern. Traditional vulnerability detection methods often rely on manual code reviews or static analysis tools, which can be time-consuming and prone to errors. This paper presents a distinct approach to enhancing code security by leveraging artificial intelligence (AI) and machine learning (ML) techniques. Our proposed system utilizes a combination of natural language processing (NLP) and deep learning algorithms to identify and classify vulnerabilities in real-world codebases. By analyzing vast amounts of open-source code data, our AI-powered tool learns to recognize patterns and anomalies indicative of security weaknesses. We evaluated our system on a dataset of over 10,000 open-source projects, achieving an accuracy rate of 92% in detecting known vulnerabilities. Furthermore, our tool identified previously unknown vulnerabilities in popular libraries and frameworks, demonstrating its potential for improving software security. Keywords: AI, machine learning, code security
Procedia PDF Downloads 36
26895 Recognizing an Individual, Their Topic of Conversation and Cultural Background from 3D Body Movement
Authors: Gheida J. Shahrour, Martin J. Russell
Abstract:
The 3D body movement signals captured during human-human conversation include clues not only to the content of people’s communication but also to their culture and personality. This paper is concerned with automatic extraction of this information from body movement signals. For the purpose of this research, we collected a novel corpus from 27 subjects, arranged them into groups according to their culture. We arranged each group into pairs and each pair communicated with each other about different topics. A state-of-art recognition system is applied to the problems of person, culture, and topic recognition. We borrowed modeling, classification, and normalization techniques from speech recognition. We used Gaussian Mixture Modeling (GMM) as the main technique for building our three systems, obtaining 77.78%, 55.47%, and 39.06% from the person, culture, and topic recognition systems respectively. In addition, we combined the above GMM systems with Support Vector Machines (SVM) to obtain 85.42%, 62.50%, and 40.63% accuracy for person, culture, and topic recognition respectively. Although direct comparison among these three recognition systems is difficult, it seems that our person recognition system performs best for both GMM and GMM-SVM, suggesting that inter-subject differences (i.e. subject’s personality traits) are a major source of variation. When removing these traits from culture and topic recognition systems using the Nuisance Attribute Projection (NAP) and the Intersession Variability Compensation (ISVC) techniques, we obtained 73.44% and 46.09% accuracy from culture and topic recognition systems respectively.Keywords: person recognition, topic recognition, culture recognition, 3D body movement signals, variability compensation
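A minimal sketch of the GMM classification stage, assuming per-class feature matrices extracted from the 3D body movement signals; it fits one Gaussian mixture per class and assigns a test sequence to the maximum-likelihood class. The SVM fusion and NAP/ISVC compensation steps are not shown, and all data below are synthetic.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def train_gmms(features_by_class, n_components=8):
    """Fit one GMM per class (person / culture / topic) on movement feature vectors."""
    return {label: GaussianMixture(n_components=n_components,
                                   covariance_type="diag",
                                   random_state=0).fit(X)
            for label, X in features_by_class.items()}

def classify(gmms, X_test):
    """Assign a test feature matrix to the class whose GMM gives the highest likelihood."""
    labels = list(gmms)
    scores = np.array([gmms[label].score(X_test) for label in labels])  # mean log-likelihood
    return labels[int(np.argmax(scores))]

# Toy example: 30-dimensional movement features, two "subjects"
rng = np.random.default_rng(1)
train = {"subject_A": rng.normal(0.0, 1.0, (500, 30)),
         "subject_B": rng.normal(0.5, 1.2, (500, 30))}
models = train_gmms(train)
print(classify(models, rng.normal(0.5, 1.2, (120, 30))))                # expected: subject_B
```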
Procedia PDF Downloads 541
26894 Conception of a Predictive Maintenance System for Forest Harvesters from Multiple Data Sources
Authors: Lazlo Fauth, Andreas Ligocki
Abstract:
For cost-effective use of harvesters, expensive repairs and unplanned downtimes must be reduced as far as possible. The predictive detection of failing systems and the calculation of intelligent service intervals, necessary to avoid these factors, require in-depth knowledge of the machines' behavior. Such know-how needs permanent monitoring of the machine state from different technical perspectives. In this paper, three approaches will be presented as they are currently pursued in the publicly funded project PreForst at Ostfalia University of Applied Sciences. These include the intelligent linking of workshop and service data, sensors on the harvester, and a special online hydraulic oil condition monitoring system. Furthermore the paper shows potentials as well as challenges for the use of these data in the conception of a predictive maintenance system.Keywords: predictive maintenance, condition monitoring, forest harvesting, forest engineering, oil data, hydraulic data
Procedia PDF Downloads 145
26893 Sampled-Data Control for Fuel Cell Systems
Authors: H. Y. Jung, Ju H. Park, S. M. Lee
Abstract:
A sampled-data controller is presented for solid oxide fuel cell systems expressed by a sector-bounded nonlinear model. Sector-bounded nonlinear systems consist of a feedback connection between a linear dynamical system and a nonlinearity satisfying certain sector-type constraints. The sampled-data control scheme is also very useful since it makes it possible to handle digital controllers, and increasing research effort has been devoted to sampled-data control systems with the development of modern high-speed computers. The proposed control law is obtained by solving a convex problem subject to several linear matrix inequalities. Simulation results are given to show the effectiveness of the proposed design method. Keywords: sampled-data control, fuel cell, linear matrix inequalities, nonlinear control
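As a hedged illustration of the LMI machinery only (not the paper's sampled-data, sector-bounded design), the snippet below solves a basic Lyapunov feasibility LMI with CVXPY for a hypothetical linearized plant matrix.

```python
import numpy as np
import cvxpy as cp

# Hypothetical linearized plant matrix standing in for the fuel cell model
A = np.array([[-0.5,  0.2],
              [ 0.1, -0.8]])
n = A.shape[0]

# Feasibility LMI: find P > 0 with A^T P + P A < 0 (a basic Lyapunov certificate).
P = cp.Variable((n, n), symmetric=True)
eps = 1e-3
constraints = [P >> eps * np.eye(n),
               A.T @ P + P @ A << -eps * np.eye(n)]
prob = cp.Problem(cp.Minimize(0), constraints)
prob.solve(solver=cp.SCS)
print(prob.status, P.value)
```

The actual controller synthesis adds further LMI blocks for the sector bound and the sampling interval; the pattern of declaring symmetric matrix variables and imposing semidefinite constraints is the same.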
Procedia PDF Downloads 565
26892 Performance Comparison of AODV and Soft AODV Routing Protocol
Authors: Abhishek, Seema Devi, Jyoti Ohri
Abstract:
A mobile ad hoc network (MANET) is a system of wireless mobile nodes that can self-organize freely and dynamically into an arbitrary and temporary network topology. Unlike a wired network, a wireless network interface has a limited transmission range. Routing is the task of forwarding data packets from a source to a given destination. The Ad-hoc On-Demand Distance Vector (AODV) routing protocol creates a path to a destination only when it is required. This paper describes the implementation of the AODV routing protocol using the MATLAB-based TrueTime simulator. In MANETs, node movements are not fixed but random in nature; hence, intelligent techniques, i.e., fuzzy logic and ANFIS, are used to optimize the transmission range. In this paper, we compare the transmission ranges of AODV, fuzzy AODV, and ANFIS AODV. For the soft computing AODV, we take transmitted power and received threshold as inputs and transmission range as output. ANFIS gives better results than fuzzy AODV. Keywords: ANFIS, AODV, fuzzy, MANET, reactive routing protocol, routing protocol, TrueTime
Procedia PDF Downloads 498
26891 From Parchment to Pixels: Digital Preservation for the Future
Authors: Abida Khatoon
Abstract:
This study provides an overview of ancient manuscripts, including their historical significance, current digital preservation methods, and the challenges we face in safeguarding these invaluable resources. India has a long-standing tradition of manuscript preservation, with texts that span a wide range of subjects, from religious scriptures to scientific treatises. These manuscripts were written on various materials, including palm leaves, parchment, metal, bark, wood, animal skin, and paper. These manuscripts offer a deep insight into India's cultural and intellectual history. Ancient manuscripts are crucial historical records, providing valuable insights into past civilizations and knowledge systems. As these physical documents become increasingly fragile, digital preservation methods have become essential to ensure their continued accessibility. Digital preservation involves several key techniques. Scanning and digitization create high-resolution digital images of manuscripts, while reprography produces copies to reduce wear on originals. Digital archiving ensures proper storage and management of these digital files, and preservation of electronic data addresses modern formats like web pages and emails. Despite its benefits, digital preservation faces several challenges. Technological obsolescence, data integrity issues, and the resource-intensive nature of the process are significant hurdles. Securing adequate funding is particularly challenging due to high initial costs and ongoing expenses. Looking ahead, the future of digital preservation is promising. Advancements in technology, increased collaboration among institutions, and the development of sustainable funding models will enhance the preservation and accessibility of these important historical documents.Keywords: preservation strategies, Indian manuscript, cultural heritage, archiving
Procedia PDF Downloads 18
26890 Repair and Strengthening of Plain and FRC Shear Deficient Beams Using Externally Bonded CFRP Sheets
Authors: H. S. S. Abou El-Mal, H. E. M. Sallam
Abstract:
This paper presents experimental and analytical study on the behavior of repaired and strengthened shear critical RC beams using externally bonded CFRP bi-directional fabrics. The use of CFRP sheets to repair or strengthen RC beams has been repetitively studied and proven feasible. However, the use of combined repair techniques and applying that method to both plain and FRC beams can maximize the shear capacity of RC shear deficient beams. A total of twelve slender beams were tested under four-point bending. The test parameters included CFRP layout, number of layers and fiber direction, injecting cracks before applying repairing sheets, enhancing the flexural capacity to differentiate between shear repair and strengthening techniques, and concrete matrix types. The findings revealed that applying CFRP sheets increased the overall shear capacity, the amount and orientation of wrapping is of prime importance in both repairing and strengthening, CFRP wrapping could change the failure mode from shear to flexural shear, the use of crack injection combined to CFRP wrapping further improved the shear capacity while, applying the previous method to FRC beams enhanced both shear capacity and failure ductility. Acceptable agreement was found between predicted shear capacities using the Canadian code and the experimental results of the current study.Keywords: CFRP, FRC, repair, shear strengthening
Procedia PDF Downloads 349
26889 Compressed Suffix Arrays to Self-Indexes Based on Partitioned Elias-Fano
Abstract:
A practical and simple self-indexing data structure, the Partitioned Elias-Fano (PEF) Compressed Suffix Array (CSA), is built in linear time for the CSA based on PEF indexes. The PEF-CSA is compared with two classical compressed indexing methods, the Ferragina and Manzini implementation (FMI) and Sad-CSA, on files of different types and sizes from the Pizza & Chili corpus. The PEF-CSA performs better on the existing data in terms of compression ratio, count time, and locate time, except for evenly distributed data such as the proteins data. The experiments show that the distribution of φ has a greater effect on the compression ratio than the alphabet size: unevenly distributed φ yields better compression, and the larger the number of hits, the longer the count and locate times. Keywords: compressed suffix array, self-indexing, partitioned Elias-Fano, PEF-CSA
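A toy sketch of plain (unpartitioned) Elias-Fano encoding, the building block behind PEF: the low bits of each value are stored explicitly and the high bits as unary-coded gaps. The partitioning and the suffix-array integration of PEF-CSA are not shown, and the bit-level layout here is a readable simplification rather than a space-efficient implementation.

```python
def elias_fano_encode(values, universe):
    """Encode a sorted list of integers in [0, universe) with plain Elias-Fano."""
    n = len(values)
    low_bits = max((universe // n).bit_length() - 1, 0)   # roughly log2(universe / n)
    low_mask = (1 << low_bits) - 1
    lows = [v & low_mask for v in values]
    # Upper part: unary-coded gaps of the high bits ("1"-terminated runs of "0")
    upper, prev_high = [], 0
    for v in values:
        high = v >> low_bits
        upper.append("0" * (high - prev_high) + "1")
        prev_high = high
    return lows, "".join(upper), low_bits

def elias_fano_decode(lows, upper, low_bits):
    out, high, it = [], 0, iter(lows)
    for bit in upper:
        if bit == "0":
            high += 1
        else:
            out.append((high << low_bits) | next(it))
    return out

seq = [3, 4, 7, 13, 14, 15, 21, 43]
enc = elias_fano_encode(seq, universe=64)
assert elias_fano_decode(*enc) == seq
```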
Procedia PDF Downloads 252
26888 Data, Digital Identity and Antitrust Law: An Exploratory Study of Facebook’s Novi Digital Wallet
Authors: Wanjiku Karanja
Abstract:
Facebook has monopoly power in the social networking market. It has grown and entrenched its monopoly power through the capture of its users’ data value chains. However, antitrust law’s consumer welfare roots have prevented it from effectively addressing the role of data capture in Facebook’s market dominance. These regulatory blind spots are augmented in Facebook’s proposed Diem cryptocurrency project and its Novi digital wallet. Novi, which is Diem’s digital identity component, shall enable Facebook to collect an unprecedented volume of consumer data. Consequently, Novi has seismic implications for internet identity, as the network effects of Facebook’s large user base could establish it as the de facto internet identity layer. Moreover, the large tracts of data Facebook shall collect through Novi shall further entrench Facebook's market power. As such, the attendant lock-in effects of this project shall be very difficult to reverse. Urgent regulatory action is therefore required to prevent this expansion of Facebook’s data resources and monopoly power. This research thus highlights the importance of data capture to competition and market health in the social networking industry. It utilizes interviews with key experts to empirically interrogate the impact of Facebook’s data capture and control of its users’ data value chains on its market power. This inquiry is contextualized against Novi’s expansive effect on Facebook’s data value chains. It thus addresses the novel antitrust issues arising at the nexus of Facebook’s monopoly power and the privacy of its users’ data. It also explores the impact of platform design principles, specifically data interoperability and data portability, in mitigating Facebook’s anti-competitive practices. As such, this study finds that Facebook is a powerful monopoly that dominates the social media industry to the detriment of potential competitors. Facebook derives its power from its size, annexure of the consumer data value chain, and control of its users’ social graphs. Additionally, the platform design principles of data interoperability and data portability are not a panacea for restoring competition in the social networking market. Their success depends on the establishment of robust technical standards and regulatory frameworks. Keywords: antitrust law, data protection law, data portability, data interoperability, digital identity, Facebook
Procedia PDF Downloads 123
26887 A Randomized, Controlled Trial to Test Behavior Change Techniques to Improve Low Intensity Physical Activity in Older Adults
Authors: Ciaran Friel, Jerry Suls, Mark Butler, Patrick Robles, Samantha Gordon, Frank Vicari, Karina W. Davidson
Abstract:
Physical activity guidelines focus on increasing moderate-intensity activity for older adults, but adherence to recommendations remains low. This is despite the fact that scientific evidence supports that any increase in physical activity is positively correlated with health benefits. Behavior change techniques (BCTs) have demonstrated effectiveness in reducing sedentary behavior and promoting physical activity. This pilot study uses a Personalized Trials (N-of-1) design to evaluate the efficacy of using four BCTs to promote an increase in low-intensity physical activity (2,000 steps of walking per day) in adults aged 45-75 years old. The 4 BCTs tested were goal setting, action planning, feedback, and self-monitoring. BCTs were tested in random order and delivered by text message prompts requiring participant engagement. The study recruited health system employees in the target age range, without mobility restrictions and demonstrating interest in increasing their daily activity by a minimum of 2,000 steps per day for a minimum of five days per week. Participants were sent a Fitbit® fitness tracker with an established study account and password. Participants were recommended to wear the Fitbit device 24/7 but were required to wear it for a minimum of ten hours per day. Baseline physical activity was measured by Fitbit for two weeks. In the 8-week intervention phase of the study, participants received each of the four BCTs, in random order, for a two-week period. Text message prompts were delivered daily each morning at a consistent time. All prompts required participant engagement to acknowledge receipt of the BCT message. Engagement is dependent upon the BCT message and may have included recording that a detailed plan for walking has been made or confirmed a daily step goal (action planning, goal setting). Additionally, participants may have been directed to a study dashboard to view their step counts or compare themselves to their baseline average step count (self-monitoring, feedback). At the end of each two-week testing interval, participants were asked to complete the Self-Efficacy for Walking Scale (SEW_Dur), a validated measure that assesses the participant’s confidence in walking incremental distances, and a survey measuring their satisfaction with the individual BCT that they tested. At the end of their trial, participants received a personalized summary of their step data in response to each individual BCT. The analysis will examine the novel individual-level heterogeneity of treatment effect made possible by N-of-1 design and pool results across participants to efficiently estimate the overall efficacy of the selected behavioral change techniques in increasing low-intensity walking by 2,000 steps, five days per week. Self-efficacy will be explored as the likely mechanism of action prompting behavior change. This study will inform the providers and demonstrate the feasibility of an N-of-1 study design to effectively promote physical activity as a component of healthy aging.Keywords: aging, exercise, habit, walking
Procedia PDF Downloads 92
26886 Recommendations for Data Quality Filtering of Opportunistic Species Occurrence Data
Authors: Camille Van Eupen, Dirk Maes, Marc Herremans, Kristijn R. R. Swinnen, Ben Somers, Stijn Luca
Abstract:
In ecology, species distribution models are commonly implemented to study species-environment relationships. These models increasingly rely on opportunistic citizen science data when high-quality species records collected through standardized recording protocols are unavailable. While these opportunistic data are abundant, uncertainty is usually high, e.g., due to observer effects or a lack of metadata. Data quality filtering is often used to reduce these types of uncertainty in an attempt to increase the value of studies relying on opportunistic data. However, filtering should not be performed blindly. In this study, recommendations are built for data quality filtering of opportunistic species occurrence data that are used as input for species distribution models. Using an extensive database of 5.7 million citizen science records from 255 species in Flanders, the impact on model performance was quantified by applying three data quality filters, and these results were linked to species traits. More specifically, presence records were filtered based on record attributes that provide information on the observation process or post-entry data validation, and changes in the area under the receiver operating characteristic (AUC), sensitivity, and specificity were analyzed using the Maxent algorithm with and without filtering. Controlling for sample size enabled us to study the combined impact of data quality filtering, i.e., the simultaneous impact of an increase in data quality and a decrease in sample size. Further, the variation among species in their response to data quality filtering was explored by clustering species based on four traits often related to data quality: commonness, popularity, difficulty, and body size. Findings show that model performance is affected by i) the quality of the filtered data, ii) the proportional reduction in sample size caused by filtering and the remaining absolute sample size, and iii) a species ‘quality profile’, resulting from a species classification based on the four traits related to data quality. The findings resulted in recommendations on when and how to filter volunteer generated and opportunistically collected data. This study confirms that correctly processed citizen science data can make a valuable contribution to ecological research and species conservation.Keywords: citizen science, data quality filtering, species distribution models, trait profiles
Procedia PDF Downloads 203
26885 Preservation and Packaging Techniques for Extending the Shelf Life of Cucumbers: A Review of Methods and Factors Affecting Quality
Authors: Abdul Umaro Tholley
Abstract:
The preservation and packaging of cucumbers are essential to maintain their shelf life and quality. Cucumbers are a perishable food item that is highly susceptible to spoilage due to their high-water content and delicate nature. Therefore, proper preservation and packaging techniques are crucial to extend their shelf life and prevent economic loss. There are several methods of preserving cucumbers, including refrigeration, canning, pickling, and dehydration. Refrigeration is the most used preservation method, as it slows down the rate of deterioration and maintains the freshness and quality of the cucumbers. Canning and pickling are also popular preservation methods that use heat treatment and acidic solutions, respectively, to prevent microbial growth and increase shelf life. Dehydration involves removing the water content from cucumbers to increase their shelf life, but it may affect their texture and taste. Packaging also plays a vital role in preserving cucumbers. The packaging materials should be selected based on their ability to maintain the quality and freshness of the cucumbers. The most used packaging materials for cucumbers are polyethylene bags, which prevent moisture loss and protect the cucumbers from physical damage. Other packaging materials, such as corrugated boxes and wooden crates, may also be used, but they offer less protection against moisture loss and damage. The quality of cucumbers is affected by several factors, including storage temperature, humidity, and exposure to light. Cucumbers should be stored at temperatures between 7 and 10 °C, with a relative humidity of 90-95%, to maintain their freshness and quality. Exposure to light should also be minimized to prevent the formation of yellowing and decay. In conclusion, the preservation and packaging of cucumbers are essential to maintain their quality and extend their shelf life. Refrigeration, canning, pickling, and dehydration are common preservation methods that can be used to preserve cucumbers. The packaging materials used should be carefully selected to prevent moisture loss and physical damage. Proper storage conditions, such as temperature, humidity, and light exposure, should also be maintained to ensure the quality and freshness of cucumbers. Overall, proper preservation and packaging techniques can help reduce economic loss and provide consumers with high-quality cucumbers.Keywords: cucumbers, preservation, packaging, shelf life
Procedia PDF Downloads 96
26884 Data Quality Enhancement with String Length Distribution
Authors: Qi Xiu, Hiromu Hota, Yohsuke Ishii, Takuya Oda
Abstract:
Recently, the amount of collectable manufacturing data has been increasing rapidly. At the same time, large-scale recalls are becoming a serious social problem. Under such circumstances, there is a growing need to prevent such recalls through defect analysis, such as root cause analysis and anomaly detection, utilizing manufacturing data. However, the time needed to classify strings in manufacturing data with traditional methods is too long to meet the requirements of quick defect analysis. Therefore, we present the String Length Distribution Classification (SLDC) method to classify strings correctly in a short time. This method learns character features, especially the string length distribution, from Product IDs and Machine IDs in the BOM and asset list. By applying the proposal to strings in actual manufacturing data, we verified that the classification time can be reduced by 80%. As a result, the requirement of quick defect analysis can be fulfilled. Keywords: string classification, data quality, feature selection, probability distribution, string length
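A toy sketch of the classification idea, assuming classes are separated purely by their string length distributions (the real SLDC also learns character features); the example strings, labels, and class names are hypothetical.

```python
import numpy as np
from collections import Counter, defaultdict

class LengthDistributionClassifier:
    """Classify ID-like strings by a per-class distribution over string lengths."""
    def __init__(self, smoothing=1.0, max_len=64):
        self.smoothing, self.max_len = smoothing, max_len
        self.log_probs = {}

    def fit(self, strings, labels):
        by_class = defaultdict(Counter)
        for s, y in zip(strings, labels):
            by_class[y][min(len(s), self.max_len)] += 1
        for y, counts in by_class.items():
            freq = np.array([counts[k] for k in range(self.max_len + 1)], float)
            freq += self.smoothing                      # Laplace smoothing for unseen lengths
            self.log_probs[y] = np.log(freq / freq.sum())
        return self

    def predict(self, strings):
        classes = list(self.log_probs)
        return [max(classes, key=lambda y: self.log_probs[y][min(len(s), self.max_len)])
                for s in strings]

clf = LengthDistributionClassifier().fit(
    ["P-000123", "P-004567", "MCH_12", "MCH_07", "P-009999", "MCH_33"],
    ["product_id", "product_id", "machine_id", "machine_id", "product_id", "machine_id"])
print(clf.predict(["P-005555", "MCH_91"]))              # expected: product_id, machine_id
```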
Procedia PDF Downloads 318
26883 Geospatial Curve Fitting Methods for Disease Mapping of Tuberculosis in Eastern Cape Province, South Africa
Authors: Davies Obaromi, Qin Yongsong, James Ndege
Abstract:
To interpolate scattered or regularly distributed data, there are approximate and exact methods; some of these methods can be used to interpolate data on a regular grid and others on an irregular grid. In spatial epidemiology, it is important to examine how disease prevalence rates are distributed in space and how they relate to each other within a defined distance and direction. In this study, for the geographic and graphic representation of disease prevalence, linear and biharmonic spline methods were implemented in MATLAB and used to identify, localize, and compare smoothing in the distribution patterns of tuberculosis (TB) in Eastern Cape Province. The aim of this study is to produce a smoother graphical disease map of TB prevalence patterns using 3D curve-fitting techniques, especially biharmonic splines, which can easily suppress noise by seeking a least-squares fit rather than exact interpolation. The datasets are generally represented as XYZ triplets, where X and Y are the spatial coordinates and Z is the variable of interest, in this case TB counts in the province. The smoothing spline fits a smooth curve to a set of noisy observations using a spline function and has become the conventional method owing to its high precision, simplicity, and flexibility. Surface and contour plots are produced for TB prevalence at the provincial level for 2012-2015. From the results, the general outlook of all the fittings shows a systematic pattern in the distribution of TB cases in the province, and this is consistent with some spatial statistical analyses carried out in the province. This method is rarely used in disease mapping applications, but it has the advantage that it can be assessed at arbitrary locations rather than only on a rectangular grid, as in most traditional GIS methods of geospatial analysis. Keywords: linear, biharmonic splines, tuberculosis, South Africa
Procedia PDF Downloads 239
26882 Accessibility of Social Justice through Social Security in Indian Organisations: Analysis Based on Workforce
Authors: Neelima Rashmi Lakra
Abstract:
India had one of the most developed economies up to 1850, owing to its cottage industries. Towards the end of the 18th century, modern industrial enterprises began with the first cotton mill in Bombay, the jute mill near Calcutta, and the coal mine in Raniganj. This is counted as the real beginning of industry in India, around 1854. Prior to this period, people were engaged only in agriculture, menial service, or handicrafts, and the introduction of industries exposed them to the discipline of the factory, which was very tedious for them. With an increasing number of factories being set up, in addition to mining and the introduction of the railway, the World War period (1914-19), the Second World War period (1939-45), and the Great Depression (1929-33), there were visible changes in the nature of work, which resulted in outbursts of strikes in these factories for various reasons. With India’s independence, public sector industries emerged and labour legislation was introduced. Meanwhile, trade unions came to the rescue of the oppressed but failed to continue for long. Soon after, with the New Economic Policy, organisations came to face challenges to perform their best, and social justice for the workmen was in question. Against this backdrop, studies were found discussing the central human capabilities that could be addressed through social security schemes. Therefore, this study was taken up to look at the reforms and legislation mainly meant for the welfare of labour. This paper will contribute to the large number of Indians who have been serving in public sectors in India since the introduction of industries and will complement the issue of social justice through social security measures among this huge crowd serving the nation. The objectives of the study include: to find out what labour legislation already exists in India; the role of the trade union movement; to look at the effects of the New Economic Policy on these reforms and the measures taken for the workforce employed in the public sectors; and, finally, whether these measures fulfil the social justice aspects for the larger society as a whole. The methodology followed collection of data from books, journal articles, reports, company reports, and manuals, focusing mainly on Indian studies, and the data were analysed following the content analysis method. The findings showed the measures taken for social security, but also reflected very few additions or amendments to these Acts and provisions with the onset of the New Liberalisation Policy. Therefore, the study concluded by examining the social justice aspects in the context of a developing economy and discussing recommendations. Keywords: public sectors, social justice, social security schemes, trade union movement
Procedia PDF Downloads 450
26881 Development of Loop Mediated Isothermal Amplification (LAMP) Assay for the Diagnosis of Ovine Theileriosis
Authors: Muhammad Fiaz Qamar, Uzma Mehreen, Muhammad Arfan Zaman, Kazim Ali
Abstract:
Ovine theileriosis is a worldwide concern, especially in tropical and subtropical areas with abundant ticks, but it has received little attention in many developed and developing regions because of the low economic value of sheep and the low to middle level of infection in small ruminant herds. Across Asia, prevalence studies have been conducted to provide comparable estimates of flock- and animal-level prevalence of theileriosis. Timely diagnosis and control of theileriosis is a challenge for veterinarians and farmers because of the nature of the organism and the inadequacy of existing control plans. Most work is therefore based on the development of a technique that is farmer-friendly, inexpensive, and easy to perform in the field; timely diagnosis of this disease will decrease the irrational use of drugs. A further aim was to determine the prevalence of theileriosis in District Jhang using the conventional method, PCR, qPCR, and LAMP. We quantified the molecular epidemiology of T. lestoquardi in sheep from Jhang district, Punjab, Pakistan. In this study, we concluded that the overall prevalence of theileriosis in sheep was 9.1% (32/350) using the Giemsa staining technique, whereas 13% (48/350) was observed using PCR, 16% (56/350) using qPCR, and 17.1% (60/350) using LAMP. The specificity and sensitivity were also calculated by comparing PCR and LAMP: more positive results were obtained when the diagnosis was made with LAMP, there was little difference between the positive results of PCR and qPCR, and the fewest positive animals were detected with the Giemsa staining/conventional method. Cross-tabulation showed that the sensitivity of LAMP was 94.4% and its specificity 78%. Advances in science must be based on realistic ideas that can reduce the gaps and hurdles in the way of research; LAMP is one such technique and has added great value. It is a powerful biological diagnostic tool and has helped considerably in the proper diagnosis and treatment of certain diseases. Other diagnostic methods, such as culture and serological techniques, expose humans to considerable risk, whereas molecular diagnostic techniques like LAMP avoid exposure to such pathogens. A prompt and presumptive diagnosis can be made using LAMP. Compared with LAMP, PCR has many disadvantages: it is relatively expensive, time-consuming, and complicated, while LAMP is relatively cheap, easy to perform, less time-consuming, and more accurate. The LAMP technique has removed hurdles in the way of scientific research and molecular diagnostics, making it accessible to poor and developing countries. Keywords: distribution, Theileria, LAMP, primer sequences, PCR
Procedia PDF Downloads 103
26880 Temporally Coherent 3D Animation Reconstruction from RGB-D Video Data
Authors: Salam Khalifa, Naveed Ahmed
Abstract:
We present a new method to reconstruct a temporally coherent 3D animation from single- or multi-view RGB-D video data using unbiased feature point sampling. Given RGB-D video data in the form of a 3D point cloud sequence, our method first extracts feature points using both color and depth information. In the subsequent steps, these feature points are used to match two 3D point clouds in consecutive frames independently of their resolution. Our new motion-vector-based dynamic alignment method then fully reconstructs a spatio-temporally coherent 3D animation. We perform extensive quantitative validation using novel error functions to analyze the results. We show that, despite the limiting factors of temporal and spatial noise associated with RGB-D data, it is possible to exploit temporal coherence to faithfully reconstruct a temporally coherent 3D animation from RGB-D video data. Keywords: 3D video, 3D animation, RGB-D video, temporally coherent 3D animation
Procedia PDF Downloads 167
26879 Determining Abnormal Behaviors in UAV Robots for Trajectory Control in Teleoperation
Authors: Kiwon Yeom
Abstract:
Change points are abrupt variations in a data sequence. Detection of change points is useful in modeling, analyzing, and predicting time series in application areas such as robotics and teleoperation. In this paper, a change point is defined to be a discontinuity in one of its derivatives. This paper presents a reliable method for detecting discontinuities within a three-dimensional trajectory data. The problem of determining one or more discontinuities is considered in regular and irregular trajectory data from teleoperation. We examine the geometric detection algorithm and illustrate the use of the method on real data examples.Keywords: change point, discontinuity, teleoperation, abrupt variation
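A minimal sketch of the definition used here, flagging samples where a numerical derivative of a 1D slice of the trajectory jumps abnormally (apply it per coordinate for 3D data); the threshold rule is an assumed robust MAD criterion, not the paper's method.

```python
import numpy as np

def derivative_discontinuities(t, x, order=1, k=4.0):
    """Flag samples where the order-th numerical derivative of the trajectory jumps abnormally."""
    d = x.copy()
    for _ in range(order):
        d = np.gradient(d, t)                   # handles irregular sampling as well
    jumps = np.abs(np.diff(d))
    med = np.median(jumps)
    thresh = med + k * np.median(np.abs(jumps - med))   # robust MAD-style threshold
    return np.where(jumps > thresh)[0] + 1      # indices of candidate change points

# Illustrative 1D slice of a teleoperated trajectory with a kink at t = 5 s
t = np.linspace(0, 10, 1001)
x = np.where(t < 5, 0.4 * t, 2.0 + 1.5 * (t - 5))
print(derivative_discontinuities(t, x, order=1))        # indices near the kink
```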
Procedia PDF Downloads 124
26878 Analysis of Real Time Seismic Signal Dataset Using Machine Learning
Authors: Sujata Kulkarni, Udhav Bhosle, Vijaykumar T.
Abstract:
Because seismic and non-seismic signals are so similar, it is difficult to detect earthquakes reliably using conventional methods. In order to distinguish between seismic and non-seismic events depending on their amplitude, our study processes the data that come from seismic sensors. The authors suggest a robust noise suppression technique that makes use of a bandpass filter, an IIR Wiener filter, recursive short-term average/long-term average (STA/LTA), and the Carl STA/LTA trigger for event identification. The trigger ratio used in the proposed study to differentiate between seismic and non-seismic activity is determined. The proposed work focuses on significant feature extraction for machine learning-based seismic event detection. This serves as motivation for compiling a dataset of all features for the identification and forecasting of seismic signals. We place a focus on feature vector dimension reduction techniques due to the temporal complexity. The proposed notable features were experimentally tested using a machine learning model, and the results on unseen data are optimal. Finally, a presentation using a hybrid dataset (captured by different sensors) demonstrates how this model may also be employed in a real-time setting while lowering false alarm rates. The planned study is based on the examination of seismic signals obtained from both individual sensors and sensor networks (SN). A wideband seismic signal from the BSVK and CUKG station sensors, located near Basavakalyan, Karnataka, and the Central University of Karnataka, respectively, makes up the experimental dataset. Keywords: Carl STA/LTA, feature extraction, real time, dataset, machine learning, seismic detection
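A minimal sketch of a classic (non-recursive) STA/LTA trigger on a synthetic record; the recursive and Carl STA/LTA variants and the filtering chain used in the study are not reproduced here, and the window lengths and thresholds are illustrative.

```python
import numpy as np

def sta_lta(signal, fs, sta_win=0.5, lta_win=10.0):
    """Short-term / long-term average ratio of the squared (characteristic) signal."""
    nsta, nlta = int(sta_win * fs), int(lta_win * fs)
    cf = signal.astype(float) ** 2                      # characteristic function
    csum = np.cumsum(cf)
    sta = (csum[nsta:] - csum[:-nsta]) / nsta
    lta = (csum[nlta:] - csum[:-nlta]) / nlta
    m = min(len(sta), len(lta))                         # align both windows at the record end
    return sta[-m:] / (lta[-m:] + 1e-12)

def trigger(ratio, on=3.5, off=1.5):
    """Return (on, off) sample pairs where the STA/LTA ratio declares an event."""
    events, active, start = [], False, 0
    for i, r in enumerate(ratio):
        if not active and r > on:
            active, start = True, i
        elif active and r < off:
            events.append((start, i))
            active = False
    return events

# Synthetic record: background noise plus a burst standing in for a local event
fs = 100.0
rng = np.random.default_rng(2)
x = rng.normal(0, 1.0, 60 * int(fs))
x[3000:3400] += rng.normal(0, 8.0, 400)
print(trigger(sta_lta(x, fs)))
```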
Procedia PDF Downloads 443
26877 Multidimensional Item Response Theory Models for Practical Application in Large Tests Designed to Measure Multiple Constructs
Authors: Maria Fernanda Ordoñez Martinez, Alvaro Mauricio Montenegro
Abstract:
This work presents a statistical methodology for measuring and establishing constructs in latent semantic analysis. The approach combines the qualities of factor analysis for binary data with the interpretations available in item response theory. More precisely, we propose first reducing dimensionality with principal component analysis applied to the linguistic data and then producing axes for groups obtained from a clustering analysis of the semantic data. This approach allows the user to give meaning to the previously obtained clusters and to uncover the real latent structure present in the data. The methodology is applied to a set of real semantic data, yielding impressive results in terms of coherence, speed, and precision. Keywords: semantic analysis, factorial analysis, dimension reduction, penalized logistic regression
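A minimal sketch of the reduce-then-cluster pattern described above, using PCA on a hypothetical binary response matrix and k-means on the item loadings; the paper's penalized logistic regression and IRT interpretation steps are not shown.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

# X: binary item-response matrix (examinees x items), 0/1 scored -- toy stand-in here
rng = np.random.default_rng(3)
X = (rng.random((500, 40)) < 0.5).astype(float)

# Step 1: reduce dimensionality of the (binary) response data
pca = PCA(n_components=5, random_state=0)
scores = pca.fit_transform(X)            # examinee scores on the principal axes
loadings = pca.components_.T             # item loadings, one column per axis

# Step 2: cluster items by their loading profiles to form interpretable construct groups
item_clusters = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(loadings)
for k in range(3):
    print(f"construct {k}: items {np.where(item_clusters == k)[0]}")
```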
Procedia PDF Downloads 443