Search results for: continuous speed profile data
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 30074

27134 Complex Fuzzy Evolution Equation with Nonlocal Conditions

Authors: Abdelati El Allaoui, Said Melliani, Lalla Saadia Chadli

Abstract:

The objective of this paper is to study the existence and uniqueness of mild solutions for a complex fuzzy evolution equation with nonlocal conditions, accommodating the notion of fuzzy sets defined by complex-valued membership functions. We first propose a definition of complex fuzzy strongly continuous semigroups. We then establish an existence and uniqueness result for the complex fuzzy evolution equation.
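For context, the classical (crisp) notion being generalized here is standard: for an evolution equation u'(t) = Au(t) + f(t, u(t)) with nonlocal condition u(0) + g(u) = u_0, where A generates a strongly continuous semigroup {T(t)}, a mild solution satisfies the variation-of-constants formula below. This is the familiar crisp prototype, not the paper's own complex fuzzy statement, in which T(t) is replaced by the proposed complex fuzzy semigroup.

```latex
u(t) \;=\; T(t)\bigl(u_0 - g(u)\bigr) \;+\; \int_0^t T(t-s)\, f\bigl(s, u(s)\bigr)\,\mathrm{d}s
```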

Keywords: Complex fuzzy evolution equations, nonlocal conditions, mild solution, complex fuzzy semigroups

Procedia PDF Downloads 282
27133 An Efficient Propensity Score Method for Causal Analysis With Application to Case-Control Study in Breast Cancer Research

Authors: Ms Azam Najafkouchak, David Todem, Dorothy Pathak, Pramod Pathak, Joseph Gardiner

Abstract:

Propensity score (PS) methods have become a standard tool for causal inference in observational studies, where exposure is not randomly assigned and confounding can therefore bias the estimated effect of treatment on the outcome. For a binary outcome, the treatment effect can be estimated by odds ratios, relative risks, or risk differences; however, different PS methods may yield different estimates. The main PS approaches in use are matching, inverse probability weighting, stratification, and covariate adjustment on the PS. Because discretizing continuous variables (exposure, covariates) carries well-known dangers, this paper focuses on how variation in cut-points or boundaries affects the average treatment effect (ATE) estimated by PS stratification. Rather than choosing arbitrary cut-points, we continuously discretize the PS and accumulate information across all cut-points for inference. We use Monte Carlo simulation to evaluate the ATE, focusing on two PS methods: stratification and covariate adjustment on the PS. We then illustrate the approach on data from a case-control study of breast cancer, the Polish Women’s Health Study.
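To make the stratification estimator concrete, here is a minimal sketch, not the authors' continuous-discretization procedure: units are ordered by fitted propensity score, split into strata, and the within-stratum treated-minus-control mean differences are averaged with stratum-size weights. The function name, variable names, and the five-strata default are illustrative.

```python
def ate_by_stratification(ps, t, y, n_strata=5):
    """Estimate the ATE by stratifying units on the propensity score.

    ps : fitted propensity scores, t : 0/1 treatment flags, y : outcomes.
    """
    order = sorted(range(len(ps)), key=lambda i: ps[i])  # rank by score
    size = len(order) // n_strata
    ate, total = 0.0, 0
    for s in range(n_strata):
        # contiguous score-ordered stratum (last one absorbs the remainder)
        idx = order[s * size:(s + 1) * size] if s < n_strata - 1 else order[s * size:]
        treated = [y[i] for i in idx if t[i] == 1]
        control = [y[i] for i in idx if t[i] == 0]
        if not treated or not control:
            continue  # a stratum without overlap contributes nothing
        diff = sum(treated) / len(treated) - sum(control) / len(control)
        ate += diff * len(idx)
        total += len(idx)
    return ate / total
```

With a constant true effect and overlap in every stratum, the weighted average recovers that effect; the sensitivity the paper studies comes precisely from moving the stratum boundaries.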

Keywords: average treatment effect, propensity score, stratification, covariate adjustment, Monte Carlo estimation, breast cancer, case-control study

Procedia PDF Downloads 105
27132 Legal Regulation of Personal Information Data Transmission Risk Assessment: A Case Study of the EU’s DPIA

Authors: Cai Qianyi

Abstract:

In the midst of the global digital revolution, the flow of data poses security threats that call China's existing legislative framework for protecting personal information into question. As a preliminary procedure for risk analysis and prevention, the risk assessment of personal data transmission lacks detailed supporting guidelines. Existing provisions leave network operators' responsibilities unclear and weaken data subjects' rights. Furthermore, the regulatory system's weak operability and the lack of industry self-regulation heighten data transmission hazards. This paper compares the regulatory pathways for data transmission risks in China and Europe from the perspective of legal framework and content. It draws on the EU's Data Protection Impact Assessment (DPIA) guidelines to empower multiple stakeholders, including data processors, controllers, and subjects, while also defining their obligations. In conclusion, this paper seeks to remedy China's digital security shortcomings by developing a more mature regulatory framework and industry self-regulation mechanisms, resulting in a win-win for personal data protection and the development of the digital economy.

Keywords: personal information data transmission, risk assessment, DPIA, internet service provider

Procedia PDF Downloads 61
27131 Wavelets Contribution on Textual Data Analysis

Authors: Habiba Ben Abdessalem

Abstract:

The emergence of giant sets of textual data has encouraged researchers to invest in this field. Textual data analysis methods aim to facilitate access to such data by providing various graphic visualizations. Applying these methods requires a corpus pretreatment step, whose standards are set according to the objective of the problem studied. This step determines the list of forms contained in the contingency table, keeping only the forms that carry information. It may, however, yield noisy contingency tables, hence the use of a wavelet denoising function. The validity of the proposed approach is tested on a text database covering economic and political events in Tunisia over a well-defined period.
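A minimal sketch of the denoising idea, applied to a single row of a contingency table: a one-level Haar transform, soft thresholding of the detail coefficients, and inversion. The abstract does not specify the wavelet basis or thresholding rule, so Haar and soft thresholding are assumptions made for illustration.

```python
import math

def haar_denoise(signal, threshold):
    """One-level Haar wavelet denoising (signal length must be even):
    transform, soft-threshold the detail coefficients, invert."""
    half = len(signal) // 2
    approx = [(signal[2 * i] + signal[2 * i + 1]) / math.sqrt(2) for i in range(half)]
    detail = [(signal[2 * i] - signal[2 * i + 1]) / math.sqrt(2) for i in range(half)]
    # soft thresholding: shrink each detail coefficient toward zero
    soft = [math.copysign(max(abs(d) - threshold, 0.0), d) for d in detail]
    out = []
    for a, d in zip(approx, soft):
        out.append((a + d) / math.sqrt(2))
        out.append((a - d) / math.sqrt(2))
    return out
```

With threshold 0 the signal is reconstructed exactly; with a large threshold each pair collapses to its local mean, which is the smoothing effect exploited on noisy count tables.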

Keywords: textual data, wavelet, denoising, contingency table

Procedia PDF Downloads 277
27130 Airliner-UAV Flight Formation in Climb Regime

Authors: Pavel Zikmund, Robert Popela

Abstract:

Extreme formation is a theoretical concept of self-sustained flight in which a large airliner is followed by a small UAV glider flying in the airliner's wake vortex. The paper presents results of a climb analysis whose goal is to lift the gliding UAV to the airliner's cruise altitude. Wake vortex models, the UAV's drag polar and basic parameters, and the airliner's climb profile are introduced first. Flight performance of the UAV in the wake vortex is then evaluated by analytical methods, and the time history of the optimal distance between the airliner and the UAV during the climb is determined. The results are encouraging, so the UAV drag margin available for electricity generation is determined for different vortex models.

Keywords: flight in formation, self-sustained flight, UAV, wake vortex

Procedia PDF Downloads 440
27129 AutoML: Comprehensive Review and Application to Engineering Datasets

Authors: Parsa Mahdavi, M. Amin Hariri-Ardebili

Abstract:

The development of accurate machine learning and deep learning models traditionally demands hands-on expertise and a solid background to fine-tune hyperparameters. With the continuous expansion of datasets in various scientific and engineering domains, researchers increasingly turn to machine learning methods to unveil hidden insights that may elude classic regression techniques. This surge in adoption raises concerns about the adequacy of the resultant meta-models and, consequently, the interpretation of the findings. In response to these challenges, automated machine learning (AutoML) emerges as a promising solution, aiming to construct machine learning models with minimal intervention or guidance from human experts. AutoML encompasses crucial stages such as data preparation, feature engineering, hyperparameter optimization, and neural architecture search. This paper provides a comprehensive overview of the principles underpinning AutoML, surveying several widely-used AutoML platforms. Additionally, the paper offers a glimpse into the application of AutoML on various engineering datasets. By comparing these results with those obtained through classical machine learning methods, the paper quantifies the uncertainties inherent in the application of a single ML model versus the holistic approach provided by AutoML. These examples showcase the efficacy of AutoML in extracting meaningful patterns and insights, emphasizing its potential to revolutionize the way we approach and analyze complex datasets.
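The hyperparameter-optimization stage at the core of AutoML can be pictured as a search loop over candidate configurations scored by cross-validation. The toy sketch below is not any AutoML platform's API; the 1-D ridge model, the log-uniform search range, and the fold count are all illustrative assumptions.

```python
import random

def fit_ridge_1d(xs, ys, lam):
    """Closed-form ridge regression y ~ w*x (no intercept)."""
    return sum(x * y for x, y in zip(xs, ys)) / (sum(x * x for x in xs) + lam)

def cv_error(xs, ys, lam, k=5):
    """k-fold cross-validated mean squared error for one penalty value."""
    n = len(xs)
    total = 0.0
    for fold in range(k):
        test = set(range(fold, n, k))
        tr_x = [xs[i] for i in range(n) if i not in test]
        tr_y = [ys[i] for i in range(n) if i not in test]
        w = fit_ridge_1d(tr_x, tr_y, lam)
        total += sum((ys[i] - w * xs[i]) ** 2 for i in test)
    return total / n

def auto_search(xs, ys, n_trials=50, seed=0):
    """Random search over the penalty: the essence of the
    hyperparameter-optimization stage described above."""
    rng = random.Random(seed)
    best_lam, best_err = None, float("inf")
    for _ in range(n_trials):
        lam = 10 ** rng.uniform(-3, 3)  # log-uniform candidate
        err = cv_error(xs, ys, lam)
        if err < best_err:
            best_lam, best_err = lam, err
    return best_lam, best_err
```

Real AutoML systems wrap the same loop around full pipelines (preprocessing, feature engineering, model family, architecture) and use smarter samplers than uniform random draws, but the select-by-validated-score structure is the same.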

Keywords: automated machine learning, uncertainty, engineering dataset, regression

Procedia PDF Downloads 61
27128 Management of Diabetics on Hemodialysis

Authors: Souheila Zemmouchi

Abstract:

Introduction: Diabetes is currently the leading cause of end-stage chronic kidney disease and dialysis, adding complexity to the management of chronic hemodialysis patients, who are extremely fragile because of their multiple cardiovascular and metabolic comorbidities. The management of a diabetic on hemodialysis is particularly difficult owing to frequent hypoglycemia and significant, hard-to-predict inter- and per-dialytic glycemic variability. The aim of our study is to describe the clinical-biological profile of diabetics undergoing chronic hemodialysis, assess their cardiovascular risk, and compare them with non-diabetic hemodialysis patients. Methods: This cross-sectional, descriptive, and analytical study was carried out between January 1 and December 31, 2018, involving 309 hemodialysis patients across 4 centers. The data were collected prospectively, then compiled and analyzed with SPSS version 10. The Framingham risk score was used to assess cardiovascular risk in all hemodialysis patients. Results: Of the 309 hemodialysis patients, 83 were diabetic, for a prevalence of 27%. The average age was 53 ± 10.2 years, and the sex ratio was 1.5. Residual diuresis was retained by 50% of diabetic hemodialysis patients versus 32% of non-diabetics. In the diabetic group, we noted more hypertension (70% versus 38% in non-diabetics, P = 0.004) and more intradialytic hypoglycemia (15% versus 3%, P = 0.007); at baseline, vascular exhaustion was found in 4 diabetics versus 2 non-diabetics. Post-dialytic hyperglycemia occurred in 70% of diabetics with anuria. The study found a statistically significant difference between cardiovascular risk levels according to diabetic status.
Conclusion: There are many challenges in the management of diabetics on hemodialysis, both to optimize glycemic control according to an individualized target and to coordinate comprehensive and effective care.

Keywords: hemodialysis, diabetes, chronic renal failure, glycemic control

Procedia PDF Downloads 160
27127 Uptake of Hepatitis B Vaccine among Hepatitis C Positive Patients and Their Vaccine Response in Myanmar

Authors: Zaw Z Aung

Abstract:

Background: High-risk groups for hepatitis B virus (HBV) infection include people who inject drugs (PWID), men who have sex with men (MSM), people living with HIV (PLHIV), and persons with hepatitis C (HCV). HBV/HCV-coinfected patients are at increased risk of cirrhosis, hepatic decompensation, and hepatocellular carcinoma. To the best of the author's knowledge, there are currently no data on hepatitis B vaccine uptake among HCV-positive patients or on their antibody response. Methodology: From February 2018 to May 2018, consenting participants aged 18 years or older who came to the clinic in Mandalay were tested with an anti-HCV rapid test. Those who tested HCV-positive (n=168) were further tested with a hepatitis B profile and asked about their previous hepatitis B vaccination history and risk factors. Results: Of the 168 HCV-positive participants, three were excluded for active HBV infection. The remaining 165 were categorized as previously vaccinated, 64% (n=106), or unvaccinated, 36% (n=59). There were three characteristic groups: PWID monoinfected (n=77), general population (GP) monoinfected (n=22), and HIV/HCV-coinfected participants (n=66). The proportion unvaccinated was highest among HIV/HCV participants at 68% (n=45), followed by GP (23%, n=5) and PWID (12%, n=9). Among previously vaccinated participants, the highest percentage was PWID (88%, n=68), followed by GP (77%, n=17) and HIV/HCV patients (32%, n=21). Sixty-three participants completed three doses of vaccination (PWID=36, GP=13, HIV/HCV=14). Of participants who completed three doses of hepatitis B vaccine, 53% were non-responders (n=34): HIV/HCV (86%, n=12), PWID (44%, n=16), and GP (46%, n=6). Conclusion: Even with an effective and safe hepatitis B vaccine available, uptake is low among high-risk groups, especially PLHIV, and needs to be improved. Integration or collaboration between hepatitis B vaccination programs and HIV/AIDS and hepatitis C treatment centers is desirable. About half of vaccinated participants were non-responders, so optimal doses, schedules, and follow-up testing need to be addressed carefully for these groups.

Keywords: Hepatitis B vaccine, Hepatitis C, HIV, Myanmar

Procedia PDF Downloads 146
27126 Analyzing On-Line Process Data for Industrial Production Quality Control

Authors: Hyun-Woo Cho

Abstract:

Industrial production quality monitoring must provide early warning of unusual operating conditions, and identification of their assignable causes is necessary for quality control. Many multivariate statistical techniques have been applied to such tasks and shown to be quite effective. This work presents a process data-based monitoring scheme for production processes. For more reliable results, additional steps of noise filtering and preprocessing are considered; eliminating unwanted variation in the data may enhance performance. The performance evaluation is executed using data sets from test processes, and the proposed method is shown to provide reliable quality control results, making it more effective for quality monitoring in the example. For practical implementation, an on-line data system must be available to gather historical and on-line data. Since large amounts of data are now collected on-line in most processes, implementation of the scheme is feasible and does not impose additional burdens on users.
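The filter-then-monitor pipeline can be sketched as below. The abstract does not name its filtering or monitoring statistics, so the EWMA filter and the k-sigma control limits here are illustrative assumptions, not the paper's method.

```python
def ewma(series, alpha=0.3):
    """Exponentially weighted moving average: a simple noise filter."""
    out, s = [], series[0]
    for x in series:
        s = alpha * x + (1 - alpha) * s
        out.append(s)
    return out

def monitor(series, mean, sigma, k=3.0):
    """Flag indices where the filtered signal leaves the
    mean +/- k*sigma control band (the early-warning alarm)."""
    filt = ewma(series)
    return [i for i, x in enumerate(filt) if abs(x - mean) > k * sigma]
```

Filtering first suppresses measurement noise so that the alarm list reflects genuine shifts in operating conditions rather than isolated noisy samples.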

Keywords: detection, filtering, monitoring, process data

Procedia PDF Downloads 559
27125 A Review of Travel Data Collection Methods

Authors: Muhammad Awais Shafique, Eiji Hato

Abstract:

Household trip data is of crucial importance for managing present transportation infrastructure as well as for planning and designing future facilities. It also provides a basis for new policies implemented under Transportation Demand Management. The methods used for household trip data collection have changed over time, starting with conventional face-to-face or paper-and-pencil interviews and reaching the recent approach of employing smartphones. This study summarizes the step-wise evolution of travel data collection methods, providing a comprehensive review for readers interested in the changing trends of the data collection field.

Keywords: computer, smartphone, telephone, travel survey

Procedia PDF Downloads 313
27124 A Business-to-Business Collaboration System That Promotes Data Utilization While Encrypting Information on the Blockchain

Authors: Hiroaki Nasu, Ryota Miyamoto, Yuta Kodera, Yasuyuki Nogami

Abstract:

To promote initiatives such as Industry 4.0 and Society 5.0, it is important to connect and share data in a way that every member can trust. Blockchain (BC) technology is attracting attention as a leading tool for this and has already been used in fields such as finance. However, data collaboration using BC has not progressed sufficiently among companies in the manufacturing supply chain, which handle sensitive data such as product quality and manufacturing conditions. There are two main reasons. First, manufacturing information is top secret and a source of profit, so it is difficult to disclose data even between companies that do business with each other in the supply chain; in a blockchain mechanism such as Bitcoin, which uses a Public Key Infrastructure (PKI), the plaintext must be shared between companies in order to confirm the identity of the company that sent the data. Second, the benefits (scenarios) of data collaboration between companies have not been concretely specified for the industrial supply chain. To address these problems, this paper proposes a business-to-business (B2B) collaboration system using homomorphic encryption and BC techniques. With the proposed system, each company in the supply chain can exchange confidential information as encrypted data and still utilize the data for its own business. In addition, the paper considers a scenario focusing on quality data, which has been difficult to share because it is top secret. For this scenario, we show an implementation scheme and the benefit of concrete data collaboration by proposing a comparison protocol that can track changes in quality while hiding the numerical values of the quality data.
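A toy demonstration of the additively homomorphic property such a system relies on, using the Paillier cryptosystem. The abstract does not state which homomorphic scheme the authors use, so Paillier is an assumption for illustration, and the primes below are far too small to be secure.

```python
import math
import random

def paillier_keygen(p, q):
    """Toy Paillier key generation; p and q must be distinct primes."""
    n = p * q
    lam = (p - 1) * (q - 1) // math.gcd(p - 1, q - 1)  # lcm(p-1, q-1)
    mu = pow(lam, -1, n)  # valid because we fix the generator g = n + 1
    return (n, n + 1), (lam, mu)

def encrypt(pub, m, rng=random.Random(1)):
    """Enc(m) = g^m * r^n mod n^2 with a fresh random r coprime to n."""
    n, g = pub
    n2 = n * n
    r = rng.randrange(1, n)
    while math.gcd(r, n) != 1:
        r = rng.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(pub, priv, c):
    """Dec(c) = L(c^lam mod n^2) * mu mod n, with L(u) = (u - 1) / n."""
    n, _ = pub
    lam, mu = priv
    u = pow(c, lam, n * n)
    return ((u - 1) // n) * mu % n
```

Because Enc(a) * Enc(b) mod n^2 decrypts to a + b, a party can aggregate or difference encrypted quality values without ever seeing the plaintexts; a comparison protocol like the one proposed would then operate on such encrypted differences.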

Keywords: business to business data collaboration, industrial supply chain, blockchain, homomorphic encryption

Procedia PDF Downloads 136
27123 Effects of Sublethal Concentrations of Parkia biglobosa Pod on Weight Gain in the African Catfish, Clarias gariepinus Juveniles

Authors: M. I. Oshimagye, V. O. Ayuba, P. A. Annune

Abstract:

The effects of sublethal concentrations of Parkia biglobosa pod extract on the growth and survival of Clarias gariepinus juveniles (mean weight 32.73 g ± 0.0) were investigated under laboratory conditions for 8 weeks using a static renewal and continuous aeration system. Statistical analysis showed that fish exposed to the various concentrations had significantly lower (P<0.05) growth rates than the control groups, and the reduction in growth was directly proportional to the increase in concentration. At 50 mg/L, however, no significant depression in weight was observed.

Keywords: Clarias gariepinus, Parkia biglobosa, pod, weight

Procedia PDF Downloads 499
27122 Multivariate Assessment of Mathematics Test Scores of Students in Qatar

Authors: Ali Rashash Alzahrani, Elizabeth Stojanovski

Abstract:

Data on various aspects of education are collected regularly at the institutional and government levels. In Australia, for example, students at various levels of schooling undertake examinations in numeracy and literacy as part of NAPLAN testing, enabling longitudinal assessment of such data as well as comparisons between schools and states within Australia. Another source of educational data collected internationally is the PISA study, which gathers data from several countries when students are approximately 15 years of age and enables comparisons of performance in science, mathematics, and reading between countries, as well as rankings of countries based on performance in these standardised tests. Beyond the student and school outcomes from the PISA tests themselves, the study collects a wealth of other data, including parental demographics and data on the teaching strategies used by educators. Overall, an abundance of educational data is available with the potential to improve educational attainment and the teaching of content, and thereby learning outcomes. A multivariate assessment of such data enables multiple variables to be considered simultaneously and is used in the present study to help develop profiles of students based on performance in mathematics, using data obtained from the PISA study.
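Profiling students by cluster analysis, as proposed, can be sketched with a plain k-means loop. The two-dimensional feature tuples, the deterministic seeding, and k = 2 are illustrative assumptions; PISA work would cluster on many standardized score and background dimensions.

```python
def kmeans(points, k, iters=20):
    """Lloyd's algorithm: assign each point to its nearest centroid,
    then recompute centroids as cluster means."""
    centroids = list(points[:k])  # naive deterministic seeding for the sketch
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            j = min(range(k),
                    key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centroids[c])))
            clusters[j].append(p)
        # empty clusters keep their previous centroid
        centroids = [tuple(sum(vals) / len(cl) for vals in zip(*cl)) if cl else centroids[j]
                     for j, cl in enumerate(clusters)]
    return centroids, clusters
```

Each resulting centroid is a "profile": the average feature vector of one group of students, which can then be interpreted against the background variables.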

Keywords: cluster analysis, education, mathematics, profiles

Procedia PDF Downloads 126
27121 Effects of Cannabis and Cocaine on Driving Related Tasks of Perception, Cognition, and Action

Authors: Michelle V. Tomczak, Reyhaneh Bakhtiari, Aaron Granley, Anthony Singhal

Abstract:

Objective: Cannabis and cocaine are associated with a range of mental and physical effects that can impair aspects of human behavior. Driving is a complex cognitive behavior that is an essential part of everyday life and can be broken down into many subcomponents, each of which can uniquely impact road safety. With the growing movement of jurisdictions to legalize cannabis, there is an increased focus on impairment and driving. The purpose of this study was to identify driving-related cognitive-performance deficits that are impacted by recreational drug use. Design and Methods: With the assistance of law enforcement agencies, we recruited over 300 participants under the influence of various drugs including cannabis and cocaine. These individuals performed a battery of computer-based tasks scientifically proven to be related to on-road driving performance and designed to test response speed, memory processes, perceptual-motor skills, and decision making. Data from a control group of healthy non-drug-using adults was collected as well. Results: Compared to controls, the drug group showed deficits in all tasks. The data also showed clear differences between the cannabis and cocaine groups, where cannabis users were faster and performed better on some aspects of the decision-making and perceptual-motor tasks. Memory performance was better in the cocaine group for simple tasks but not more complex tasks. Finally, the participants who consumed both drugs performed most similarly to the cannabis group. Conclusions: Our results show distinct and combined effects of cannabis and cocaine on human performance relating to driving. These differential effects are likely related to the unique effects of each drug on the human brain and how they distinctly contribute to mental states. Our results have important implications for road safety associated with driver impairment.

Keywords: driving, cognitive impairment, recreational drug use, cannabis and cocaine

Procedia PDF Downloads 126
27120 Case Study of Obstructive Sleep Apnea and Methods of Treatment for a Professional Driver

Authors: R. Pääkkönen, L. Korpinen, T. Kava, I. Salmi

Abstract:

This study evaluates obstructive sleep apnea treatment through a case study involving a 67-year-old male driver who had a successful continuous positive airway pressure (CPAP) treatment at home but experienced difficulties with traveling and dental care. There are many cheap sleep apnea and snoring devices available, but there is little professional advice on what kind of devices can help. Professional drivers receive yearly specialized medical care follow-up.

Keywords: sleep, apnea patient, CPAP, professional driver

Procedia PDF Downloads 199
27119 System Response of a Variable-Rate Aerial Application System

Authors: Daniel E. Martin, Chenghai Yang

Abstract:

Variable-rate aerial application systems are becoming more readily available; however, aerial applicators typically use them only for constant-rate application of materials, letting the systems compensate for upwind and downwind ground speed variations. Much of the resistance to variable-rate aerial application system adoption in the U.S. pertains to applicators' trust in the systems to turn on and off automatically as desired. The objectives of this study were to evaluate a commercially available variable-rate aerial application system under field conditions and to demonstrate both the response and the accuracy of the system with respect to desired application rate inputs. The study involved planting oats in a 35-acre fallow field during the winter months to establish a uniform green backdrop in early spring. A binary (on/off) prescription application map was generated, and a variable-rate aerial application of glyphosate was made to the field. Airborne multispectral imagery taken before and two weeks after the application documented actual field deposition and efficacy of the glyphosate. Compared with the prescription application map, these data provided information on application system response and accuracy. The results of this study will be useful for quantifying and documenting the response and accuracy of a commercially available variable-rate aerial application system, so that aerial applicators can be more confident in these systems' capabilities, their use can increase, and the full advantages of aerial variable-rate technologies can be realized.

Keywords: variable-rate, aerial application, remote sensing, precision application

Procedia PDF Downloads 474
27118 Non-Parametric Changepoint Approximation for Road Devices

Authors: Loïc Warscotte, Jehan Boreux

Abstract:

The scientific literature on changepoint detection is vast. Many methods are available today to detect abrupt changes or slight drift in a signal, based for example on CUSUM or EWMA charts. However, these methods rely on strong assumptions, such as stationarity of the underlying stochastic process, or even independent, Gaussian-distributed noise at each time step. Recently, breakthrough research on locally stationary processes has widened the class of studied stochastic processes, with almost no assumptions on the signals or on the nature of the changepoint. Despite its accurate mathematical description, this methodology quickly suffers from impractical time and space complexity for signals with high-rate data collection when the characteristics of the process are completely unknown. In this paper, we address the problem of making this theory usable for our purpose, which is monitoring a high-speed weigh-in-motion (HS-WIM) system towards direct enforcement without supervision. To this end, we first compute bounded approximations of the initial detection theory. Secondly, these approximating bounds are empirically validated by generating many independent long-run stochastic processes; both abrupt changes and drift are tested. Finally, the relaxed methodology is tested on real signals from an HS-WIM device in Belgium, collected over several months.
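For reference, the classical CUSUM chart the authors contrast with fits in a few lines; the allowance k and decision interval h below are illustrative parameters. The paper's point is precisely that its locally stationary approach drops the i.i.d.-Gaussian assumptions a chart like this needs.

```python
def cusum(signal, target, k, h):
    """Two-sided CUSUM chart: return the index of the first alarm, or None.

    target : in-control mean, k : allowance (slack), h : decision threshold.
    """
    hi = lo = 0.0
    for i, x in enumerate(signal):
        hi = max(0.0, hi + (x - target - k))  # accumulates upward shifts
        lo = max(0.0, lo + (target - x - k))  # accumulates downward shifts
        if hi > h or lo > h:
            return i
    return None
```

Deviations smaller than the allowance are absorbed, so the statistic only grows under a sustained shift; the detection delay after the change is roughly h divided by the shift size minus k.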

Keywords: changepoint, weigh-in-motion, process, non-parametric

Procedia PDF Downloads 78
27117 Dataset Quality Index: Development of Composite Indicator Based on Standard Data Quality Indicators

Authors: Sakda Loetpiparwanich, Preecha Vichitthamaros

Abstract:

Nowadays, poor data quality is considered one of the major costs of a data project. A project with data quality awareness devotes almost as much time to data quality processes, while a project without it suffers negative impacts on financial resources, efficiency, productivity, and credibility. One of the most time-consuming processes is defining data quality expectations and measurements, because expectations differ according to the purpose of each data project. In particular, big data projects that involve many datasets and stakeholders require long discussions to define quality expectations and measurements. This study therefore aimed at developing meaningful indicators that describe the overall data quality of each dataset, enabling quick comparison and prioritization. The objectives were to: (1) develop practical data quality indicators and measurements, (2) develop data quality dimensions based on statistical characteristics, and (3) develop a composite indicator that describes the overall data quality of each dataset. The sample consisted of more than 500 datasets from public sources obtained by random sampling. After the datasets were collected, the Dataset Quality Index (SDQI) was developed in five steps. First, we defined standard data quality expectations. Second, we identified indicators that can be measured directly against the data within datasets. Third, the indicators were aggregated into dimensions using factor analysis. Next, the indicators and dimensions were weighted by the effort of the data preparation process and by usability. Finally, the dimensions were aggregated into the composite indicator. The results showed that: (1) ten useful indicators and measurements were developed; (2) based on statistical characteristics, the ten indicators could be reduced to four dimensions; and (3) the resulting composite indicator, the SDQI, describes the overall quality of each dataset and separates datasets into three levels: Good Quality, Acceptable Quality, and Poor Quality. In conclusion, the SDQI provides an overall, meaningful description of data quality within datasets. It can be used to assess all data in a data project, for effort estimation, and for prioritization. The SDQI also works well with agile methods, by assessing data quality in the first sprint; after passing this initial evaluation, more specific data quality indicators can be added in subsequent sprints.
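The aggregation step can be sketched as min-max normalization of each indicator followed by a weighted mean per dataset. The weights and the Good/Acceptable/Poor cut-offs of 0.7 and 0.4 below are hypothetical; the paper derives its weights from preparation effort and usability, and its dimensions from factor analysis.

```python
def minmax(col):
    """Scale one indicator column to [0, 1] across all datasets."""
    lo, hi = min(col), max(col)
    return [(v - lo) / (hi - lo) if hi > lo else 0.0 for v in col]

def composite_index(table, weights):
    """Rows = datasets, columns = indicators. Normalize each indicator,
    take the weighted mean per dataset, then band into quality levels.
    The band cut-offs (0.7, 0.4) are illustrative, not from the paper."""
    cols = [minmax(list(col)) for col in zip(*table)]
    rows = list(zip(*cols))
    total = sum(weights)
    scores = [sum(w * v for w, v in zip(weights, row)) / total for row in rows]

    def band(s):
        return "Good" if s >= 0.7 else "Acceptable" if s >= 0.4 else "Poor"

    return [(round(s, 3), band(s)) for s in scores]
```

Normalizing before weighting keeps indicators with large raw scales from dominating the composite, which is what makes the single score comparable across datasets.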

Keywords: data quality, dataset quality, data quality management, composite indicator, factor analysis, principal component analysis

Procedia PDF Downloads 139
27116 Predictive Analysis for Big Data: Extension of Classification and Regression Trees Algorithm

Authors: Ameur Abdelkader, Abed Bouarfa Hafida

Abstract:

Since its inception, predictive analysis has revolutionized the IT industry through its robustness and decision-making facilities. It involves applying a set of data processing techniques and algorithms to create predictive models, based on finding relationships between explanatory variables and predicted variables: past occurrences are exploited to predict an unknown outcome. With the advent of big data, many studies have suggested using predictive analytics to process and analyze big data, but they have been curbed by the limits of classical predictive analysis methods when data volumes are large. Because of their volume, their nature (semi-structured or unstructured), and their variety, big data cannot be analyzed efficiently by classical methods of predictive analysis. The authors attribute this weakness to the fact that classical predictive analysis algorithms do not allow calculations to be parallelized and distributed. In this paper, we propose to extend a predictive analysis algorithm, Classification and Regression Trees (CART), to adapt it for big data analysis. The major changes to the algorithm are presented, and a version of the extended algorithm is then defined to make it applicable to huge quantities of data.
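The operation CART repeats recursively, and the natural target for parallelization, is the search for the best split of a node. A minimal single-feature regression sketch (variable names are illustrative; CART also handles classification and multiple features):

```python
def sse(ys):
    """Sum of squared errors around the mean: CART's regression impurity."""
    if not ys:
        return 0.0
    m = sum(ys) / len(ys)
    return sum((y - m) ** 2 for y in ys)

def best_split(xs, ys):
    """Exhaustive search for the cut-point minimizing the summed
    impurity of the two children."""
    best_cost, best_cut = float("inf"), None
    for cut in sorted(set(xs))[1:]:  # candidate thresholds between values
        left = [y for x, y in zip(xs, ys) if x < cut]
        right = [y for x, y in zip(xs, ys) if x >= cut]
        cost = sse(left) + sse(right)
        if cost < best_cost:
            best_cost, best_cut = cost, cut
    return best_cut
```

Each candidate cut is scored independently of the others, which is exactly the property an extension for big data can exploit: candidate evaluations can be distributed across workers and only the per-cut costs reduced to find the minimum.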

Keywords: predictive analysis, big data, predictive analysis algorithms, CART algorithm

Procedia PDF Downloads 142
27115 Canopy Temperature Acquired from Daytime and Nighttime Aerial Data as an Indicator of Trees’ Health Status

Authors: Agata Zakrzewska, Dominik Kopeć, Adrian Ochtyra

Abstract:

The growing number of new cameras, sensors, and research methods allows a broader application of thermal data in remote sensing vegetation studies. The aim of this research was to check whether thermal infrared data in the 3.6-4.9 μm spectral range, obtained during the day and at night, can be used to assess the health condition of selected deciduous tree species in an urban environment. For this purpose, research was carried out in the city center of Warsaw (Poland) in 2020. During the airborne data acquisition, thermal data, laser scanning, and orthophoto map images were collected. Synchronously with the airborne data, ground reference data were obtained for 617 studied trees (Acer platanoides, Acer pseudoplatanus, Aesculus hippocastanum, Tilia cordata, and Tilia × euchlora) in different states of health. The results were as follows: (i) healthy trees are cooler than trees in poor condition and dying trees in both the daytime and nighttime data; (ii) the mean difference in canopy temperature between healthy and dying trees was 1.06 °C in the nighttime data and 3.28 °C in the daytime data; (iii) condition classes differed significantly on both daytime and nighttime thermal data, but only in the daytime data did all condition classes differ statistically significantly from each other. In conclusion, aerial thermal data, especially data obtained during the day, can be considered an alternative to hyperspectral data for assessing the health condition of trees in an urban environment. A method based on the fusion of thermal infrared and laser scanning data could be a quick and efficient solution for identifying trees in poor health that should be visually checked in the field.

Keywords: mid-wave infrared, thermal imagery, tree discoloration, urban trees

Procedia PDF Downloads 115
27114 Myosin-Driven Movement of Nanoparticles – An Approach to High-Speed Tracking

Authors: Sneha Kumari, Ravi Krishnan Elangovan

Abstract:

This abstract describes the development of a high-speed tracking method based on the modification of motor components for nanoparticle attachment. Myosin motors are nano-sized protein machines powering the movement that defines life. These miniature molecular devices serve as engines that convert the chemical energy stored in ATP into useful mechanical energy in the form of displacement events of a few nanometres, leading to the force generation required for cargo transport, cell division, and cell locomotion, and, at the macroscopic scale, movements such as running. With the advent of the in vitro motility assay (IVMA), detailed functional studies of the actomyosin system could be performed. The major challenge with the currently available IVMA for tracking actin filaments is a resolution limitation of ± 50 nm. To overcome this, we are developing a single-molecule IVMA in which a nanoparticle (GNP/QD) will be attached along, or on the barbed end of, actin filaments using the CapZ protein, with visualization by a compact TIRF module called ‘cTIRF’. The waveguide-based illumination of cTIRF offers a unique separation of excitation and collection optics, enabling imaging by scattering without emission filters. This technology is therefore well equipped to perform tracking with high precision at a temporal resolution of 2 ms, with a signal-to-noise ratio improved roughly 100-fold compared to conventional TIRF. In addition, the nanoparticles (QD/GNP) attached to the actin filament act as a point source of light, conferring ease in filament tracking compared to conventional manual tracking. Moreover, the attachment of cargo (QD/GNP) to the thin filament paves the way for various nanotechnological applications through its transportation to different predetermined locations on the chip.

Keywords: actin, cargo, IVMA, myosin motors, single-molecule system

Procedia PDF Downloads 87
27113 Foundation Phase Teachers' Experiences of School Based Support Teams: A Case of Selected Schools in Johannesburg

Authors: Ambeck Celyne Tebid, Harry S. Rampa

Abstract:

The South African education system recognises the need for all learners, including those experiencing learning difficulties, to have access to a single unified system of education. Expecting teachers to be pedagogically responsive to an increasingly diverse learner population without appropriate support has proven to be unrealistic. This has considerably hampered interest amongst teachers, especially those at the foundation phase, in working within an inclusive education (IE) and training system. This qualitative study aimed at investigating foundation phase teachers’ experiences of school-based support teams (SBSTs) in two full-service (inclusive) schools and one mainstream public primary school in the Gauteng province of South Africa, with particular emphasis on finding ways to support them, since teachers claimed they were not empowered in their initial training to teach learners experiencing learning difficulties. SBSTs were created at school level to fill this gap, thereby supporting teaching and learning by identifying and addressing learners’, teachers’ and schools’ needs. With the notion that IE may be failing for systemic reasons, this study uses Bronfenbrenner’s (1979) ecosystemic theory as well as Piaget’s (1980) maturational theory to examine the nature of support and the experiences amongst teachers, taking individual and systemic factors into consideration. Data was collected using in-depth, face-to-face interviews, document analysis and observation with 6 foundation phase teachers drawn from 3 different schools, 3 SBST coordinators, and 3 school principals, and was analysed using the phenomenological data analysis method. Amongst the findings of the study is that South African full-service and mainstream schools have functional SBSTs which render formal and informal support to teachers; this support varies in quality depending on the socio-economic status of the community where the school is situated.
This paper, however, argues that what foundation phase teachers settled for as ‘support’ is flawed, and that how they perceive the SBST and its role is problematic. The paper concludes by recommending that the SBST consider other approaches to foundation phase teacher support, such as empowering teachers with continuous practical experience of dealing with real classroom scenarios, and ensuring that all support, be it on academic or non-academic issues, is provided within a learning community framework where the teacher, family, SBST and, where necessary, community organisations harness their skills towards a common goal.

Keywords: foundation phase, full-service schools, inclusive education, learning difficulties, school-based support teams, teacher support

Procedia PDF Downloads 234
27112 Reclamation of Molding Sand: A Chemical Approach to Recycle Waste Foundry Sand

Authors: Mohd Moiz Khan, S. M. Mahajani, G. N. Jadhav

Abstract:

Waste foundry sand (WFS, total clay content 15%) contains toxic heavy metals and particulate matter, which make dumping of waste sand an environmental and health hazard. Disposal of WFS remains one of the substantial challenges faced by Indian foundries nowadays. To cope with this issue, a chemical method was used to reclaim WFS. A stirred tank reactor was used for chemical reclamation, and experiments reduced the total clay content from 15% to as low as 0.9%. This method, although found to be effective for WFS reclamation, may face a challenge due to its possibly high operating cost. Reclaimed sand was found to be satisfactory in terms of sand quality measures such as total clay (0.9%), active clay (0.3%), acid demand value (ADV) (2.6%), loss on ignition (LOI) (3%), grain fineness number (GFN) (56), and compressive strength (60 kPa). The experimental data generated in the chemical reactor under different conditions is further used to optimize the design and operating parameters (rotation speed, sand to acidic solution ratio, acid concentration, temperature, and time) for the best performance. The use of reclaimed sand within the foundry would improve the economics and efficiency of the process and reduce environmental concerns.

Keywords: chemical reclamation, clay content, environmental concerns, recycle, waste foundry sand

Procedia PDF Downloads 147
27111 Identification of Knee Dynamic Profiles in High Performance Athletes with the Use of Motion Tracking

Authors: G. Espriú-Pérez, F. A. Vargas-Oviedo, I. Zenteno-Aguirrezábal, M. D. Moya-Bencomo

Abstract:

One of the injuries with the highest incidence among university-level athletes in the north of Mexico occurs in the knee. This injury causes absenteeism from training and competitions for at least 8 weeks. There is currently no quantitative methodology or protocol that directly supports the clinical evaluation performed by medical personnel for prevalent knee injuries. The main objective is to contribute a quantitative tool that allows further development of preventive and corrective measures for these injuries. The study analyzed 55 athletes over 6 weeks, belonging to the disciplines of basketball, volleyball, soccer, and swimming. Using a motion capture system (Nexus®, Vicon®), a three-dimensional analysis was developed that allows measurement of the joint's range of movement. To focus on the performance of the lower limb, eleven different movements were chosen from the Functional Performance Test, the Functional Movement Screen, and the Cincinnati Jump Test. The research identifies the profile of the natural movement of a healthy knee, with the use of medical guidance, and its differences between sports. The data recorded during the single-leg crossover hop differentiated the type of knee movement among athletes: a maximum difference of 60° of offset was found in the adduction movement between male and female athletes of the same discipline. The research also seeks to serve as a guideline for the implementation of protocols that help identify the recovery level after such injuries.

Keywords: Cincinnati jump test, functional movement screen, functional performance test, knee, motion capture system

Procedia PDF Downloads 125
27110 Orbit Determination from Two Position Vectors Using Finite Difference Method

Authors: Akhilesh Kumar, Sathyanarayan G., Nirmala S.

Abstract:

A novel approach is developed to determine the orbit of satellites/space objects. The determination of orbits is formulated as a boundary value problem and solved using the finite difference method (FDM). Only the positions of the satellites/space objects at two end times, taken as boundary conditions, are known; the finite difference technique is used to calculate the orbit between the end times. In this approach, the governing equation is the satellite's equation of motion with a perturbed acceleration. Using the finite difference method, the governing equations and boundary conditions are discretized, and the resulting system of algebraic equations is solved using the Tri-Diagonal Matrix Algorithm (TDMA) until convergence is achieved. The methodology has been tested and evaluated using all GPS satellite orbits from the National Geospatial-Intelligence Agency (NGA) precise product for day of year (DOY) 125, 2023. Towards this, twelve sets of two hours each have been taken into consideration, with only the positions at the end times of each of the twelve sets considered as boundary conditions. The algorithm was applied to all GPS satellites, and the results achieved using FDM were compared with the NGA precise orbits: the maximum RSS error is 0.48 m for position and 0.43 mm/s for velocity. The algorithm was also applied to the IRNSS satellites for DOY 220, 2023, giving a maximum RSS error of 0.49 m for position and 0.28 mm/s for velocity. Next, a simulation was done for a highly elliptical orbit for DOY 63, 2023, for a duration of 6 hours. The RSS of the difference in position is 0.92 m and in velocity 1.58 mm/s for orbital speeds of more than 5 km/s, whereas the RSS of the difference in position is 0.13 m and in velocity 0.12 mm/s for orbital speeds of less than 5 km/s. The results show that the newly created method is reliable and accurate.
Further applications of the developed methodology include missile and spacecraft targeting, orbit design (mission planning), space rendezvous and interception, space debris correlation, and navigation solutions.
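The core solver step named above, the Tri-Diagonal Matrix Algorithm, can be sketched generically. This is the textbook Thomas algorithm, not the authors' implementation, and the small system at the end is an invented check:

```python
import numpy as np

def thomas_solve(a, b, c, d):
    """Solve a tridiagonal system A x = d with sub-diagonal a, main diagonal b,
    and super-diagonal c (a[0] and c[-1] are unused). Classic TDMA."""
    n = len(d)
    cp = np.empty(n)
    dp = np.empty(n)
    cp[0] = c[0] / b[0]
    dp[0] = d[0] / b[0]
    for i in range(1, n):                 # forward elimination
        m = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / m if i < n - 1 else 0.0
        dp[i] = (d[i] - a[i] * dp[i - 1]) / m
    x = np.empty(n)
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):        # back substitution
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

# Invented 4x4 test system whose exact solution is all ones:
a = np.array([0., 1., 1., 1.])
b = np.array([4., 4., 4., 4.])
c = np.array([1., 1., 1., 0.])
d = np.array([5., 6., 6., 5.])
print(thomas_solve(a, b, c, d))  # prints [1. 1. 1. 1.]
```

In the boundary value setting of the paper, a central-difference discretization of the equation of motion between the two known end positions yields exactly this kind of tridiagonal structure in each coordinate.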

Keywords: finite difference method, grid generation, NavIC system, orbit perturbation

Procedia PDF Downloads 84
27109 The Community Structure of Fish and its Correlation with Mangrove Forest Litter Production in Panjang Island, Banten Bay, Indonesia

Authors: Meilisha Putri Pertiwi, Mufti Petala Patria

Abstract:

Mangrove forests are often categorized as productive ecosystems in tropical waters and have the highest carbon storage among all forest types. Mangrove-derived organic matter shapes the food web of fish and invertebrates. In Indonesian tropical waters, 80% of the commercial fish caught in coastal areas are closely related to the food web of the mangrove forest ecosystem. Based on previous research in Panjang Island, Bojonegara, Banten, Indonesia, the mangrove litterfall removed to the sea water totalled 9.023 g/m³/s for two stations (west station, 5.169 g/m³/s, and north station, 3.854 g/m³/s). The vegetation was dominated by Rhizophora apiculata and Rhizophora stylosa. Carbon content was the highest (27.303% and 30.373%), compared with nitrogen (0.427% and 0.35%) and phosphorus (0.19% and 0.143%). The research also aims to determine the diversity of fish inhabiting the mangrove forest. Fish are sampled by push net; the fish caught are collected into plastic bags, measured for total length and weight, and counted individually and in total. Meanwhile, three modified pipes (1 m long, 5 inches in diameter, with the one hole facing the river closed by nylon cloth) are set in the water channel connecting the mangrove forest and the sea at each station. They are placed for 1 hour at low tide, after which the speed of the water flow and the volume collected in the pipes are calculated. The fish and mangrove litter will be weighed (wet and dry weight) and analyzed for C, N, and P content. Sampling will be conducted three times per month at full moon. The salinity, temperature, turbidity, pH, DO, and the sediment of the mangrove forest will also be measured. This research will give information about fish diversity in the mangrove forest, the mangrove litterfall removed to the sea water, the composition of the sediment, the total element content (C, N, P) of fish and mangrove litter, and the correlation of element content absorption between fish and mangrove litter. The data will be used for fish and mangrove ecosystem conservation.

Keywords: fish diversity, mangrove forest, mangrove litter, carbon, nitrogen, phosphorus, conservation

Procedia PDF Downloads 485
27108 Hierarchical Clustering Algorithms in Data Mining

Authors: Z. Abdullah, A. R. Hamdan

Abstract:

Clustering is a process of grouping objects and data into clusters so that data objects from the same cluster are similar to each other. Clustering is one of the main areas in data mining, and its algorithms can be classified into partitioning, hierarchical, density-based, and grid-based methods. In this paper, we survey and review four major hierarchical clustering algorithms: CURE, ROCK, CHAMELEON, and BIRCH. The resulting state of the art of these algorithms will help in eliminating their current problems, as well as in deriving more robust and scalable clustering algorithms.
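As a minimal illustration of the agglomerative principle these four algorithms build on (not an implementation of CURE, ROCK, CHAMELEON, or BIRCH themselves), a hierarchical clustering run on synthetic data might look like this:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

# Two well-separated 2-D blobs (synthetic data for illustration).
rng = np.random.default_rng(1)
pts = np.vstack([rng.normal(0, 0.3, (20, 2)),
                 rng.normal(5, 0.3, (20, 2))])

# Agglomerative clustering: start with every point as its own cluster and
# repeatedly merge the closest pair. 'ward' merges so as to minimize the
# increase in within-cluster variance at each step.
Z = linkage(pts, method="ward")

# Cut the resulting dendrogram to obtain exactly two flat clusters.
labels = fcluster(Z, t=2, criterion="maxclust")
print(labels)
```

The surveyed algorithms differ mainly in how they represent clusters and measure inter-cluster similarity during these merges, which is what gives them their differing robustness and scalability.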

Keywords: clustering, unsupervised learning, algorithms, hierarchical

Procedia PDF Downloads 885
27107 Mitigating Acid Mine Drainage Pollution: A Case Study In the Witwatersrand Area of South Africa

Authors: Elkington Sibusiso Mnguni

Abstract:

In South Africa, mining has been a key economic sector since the discovery of gold in 1886 in the Witwatersrand region, where the city of Johannesburg is located. However, some mines have since been decommissioned, and the continuous pumping of acid mine drainage (AMD) has also stopped, causing the AMD to rise towards the ground surface. This posed a serious environmental risk to the groundwater resources and river systems in the region. This paper documents the development and extent of the environmental damage as well as the measures implemented by the government to alleviate it, and adds to the body of knowledge on AMD treatment for the prevention of environmental degradation. The method used to gather and collate the relevant data and information was a desktop study. The key findings include the social and environmental impacts of the AMD, among them the pollution of water sources used for domestic purposes, leading to skin and other health problems, and the loss of biodiversity in some areas. It was also found that the technical intervention of constructing a plant to pump and treat the AMD using high-density sludge technology was the most effective short-term solution available while a long-term solution was being explored. Some successes and challenges experienced during the implementation of the project are also highlighted. The study is a useful record of the current status of AMD treatment interventions in the region.

Keywords: acid mine drainage, groundwater resources, pollution, river systems, technical intervention, high density sludge

Procedia PDF Downloads 186
27106 End to End Monitoring in Oracle Fusion Middleware for Data Verification

Authors: Syed Kashif Ali, Usman Javaid, Abdullah Chohan

Abstract:

In large enterprises, multiple departments use different sorts of information systems and databases according to their needs. These systems are independent and heterogeneous in nature, and sharing information/data between them is not an easy task. The usage of middleware technologies has made data sharing between systems much easier. However, monitoring the exchange of data/information between target and source systems for verification purposes is often complex or impossible for the maintenance department due to security/access privileges on the target and source systems. In this paper, we present our experience with an end-to-end data monitoring approach at the middleware level, implemented in Oracle BPEL, for data verification without the help of any monitoring tool.
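The data-verification idea, comparing what left the source system with what arrived at the target, can be illustrated independently of BPEL as a simple reconciliation over record keys. The record keys and values here are invented, and this sketch stands in for, rather than reproduces, the middleware-level monitoring described above:

```python
# Hypothetical snapshots of the same logical records on the source and
# target systems, keyed by a primary key (all values invented).
source_rows = {101: "order-A", 102: "order-B", 103: "order-C"}
target_rows = {101: "order-A", 103: "order-C", 104: "order-D"}

# Set arithmetic on the key views finds transfer gaps in either direction.
missing_in_target = source_rows.keys() - target_rows.keys()    # never arrived
unexpected_in_target = target_rows.keys() - source_rows.keys() # no known origin
matched = source_rows.keys() & target_rows.keys()              # present in both

print(f"matched={sorted(matched)}, "
      f"missing={sorted(missing_in_target)}, "
      f"unexpected={sorted(unexpected_in_target)}")
```

A middleware-level monitor has an advantage over this naive approach: it can observe the messages in flight, so it does not need query access to either endpoint, which is precisely the privilege problem the paper describes.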

Keywords: service level agreement, SOA, BPEL, oracle fusion middleware, web service monitoring

Procedia PDF Downloads 480
27105 Response Surface Methodology to Obtain Disopyramide Phosphate Loaded Controlled Release Ethyl Cellulose Microspheres

Authors: Krutika K. Sawant, Anil Solanki

Abstract:

The present study deals with the preparation and optimization of ethyl cellulose disopyramide phosphate loaded microspheres using the solvent evaporation technique. A central composite design, consisting of a two-level full factorial design superimposed on a star design, was employed for optimizing the preparation of the microspheres. The drug:polymer ratio (X1) and the stirrer speed (X2) were chosen as the independent variables, and the cumulative release of the drug at different times (2, 6, 10, 14, and 18 hr) was selected as the dependent variable. An optimum polynomial equation was generated for the prediction of the response variable at 10 hr. Based on the results of multiple linear regression analysis and F statistics, it was concluded that sustained action can be obtained when X1 and X2 are kept at high levels. The X1X2 interaction was found to be statistically significant. The drug release pattern fitted the Higuchi model well. The data of a selected batch were subjected to an optimization study using a Box-Behnken design, and an optimal formulation was fabricated. Good agreement was observed between the predicted and observed dissolution profiles of the optimal formulation.
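The Higuchi model mentioned above states that cumulative release Q grows with the square root of time, Q = k_H·√t. A fit of the kind used to check such a model can be sketched as follows; the time points match the abstract, but the release values and the resulting k_H are invented for illustration:

```python
import numpy as np

# Synthetic cumulative-release data (%) at the sampling times in the abstract;
# values are invented, generated around a Higuchi constant of ~18 %/hr^0.5.
t = np.array([2., 6., 10., 14., 18.])          # hours
q = np.array([26.0, 44.5, 56.2, 67.8, 76.1])   # % released

# Higuchi: Q = k_H * sqrt(t). Least-squares slope through the origin is
# k_H = sum(sqrt(t) * Q) / sum(t).
k_h = np.sum(np.sqrt(t) * q) / np.sum(t)
pred = k_h * np.sqrt(t)
r2 = 1 - np.sum((q - pred) ** 2) / np.sum((q - q.mean()) ** 2)
print(f"k_H = {k_h:.1f} %/hr^0.5, R^2 = {r2:.3f}")
```

An R² close to 1 in a regression of Q against √t is the usual evidence that a release profile "fits the Higuchi model well".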

Keywords: disopyramide phosphate, ethyl cellulose, microspheres, controlled release, Box-Behnken design, factorial design

Procedia PDF Downloads 457