Search results for: data mining techniques
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 29328


25848 The Staphylococcus aureus Exotoxin Recognition Using Nanobiosensor Designed by an Antibody-Attached Nanosilica Method

Authors: Hamed Ahari, Behrouz Akbari Adreghani, Vadood Razavilar, Amirali Anvar, Sima Moradi, Hourieh Shalchi

Abstract:

Considering the ever-increasing population and the industrialization of humankind's development, we are no longer able to detect the toxins produced in food products using traditional techniques. This is because the isolation time for food products is not cost-effective, and in most cases the precision of practical techniques such as bacterial cultivation suffers from operator errors or errors in the mixtures used. Hence, with the advent of nanotechnology, the design of selective and smart sensors is one of the greatest industrial revolutions in the quality control of food products: within a few minutes, and with very high precision, they can identify the amount and toxicity of the bacteria. Methods and Materials: In this technique, a sensor based on the attachment of a bacterial antibody to nanoparticles was used. As the absorption basis for recognizing the bacterial toxin, silica nanoparticles of about 10 nm (Notrino brand), in the form of a solid powder, were utilized. The suspension produced from the agent-linked nanosilica, connected to the bacterial antibody, was then placed near samples of distilled water contaminated with Staphylococcus aureus toxin at a density of 10⁻³, so that if any toxin were present in the sample, a bond would form between the toxin antigen and the antibody. Finally, the light absorption associated with the binding of the antigen to the particle-attached antibody was measured by spectrophotometry. The 23S rRNA gene, which is conserved in all Staphylococcus spp., was also used as a control. The accuracy of the test was monitored using a serial dilution (10⁻⁶) of an overnight cell culture of Staphylococcus spp. (OD600 = 0.02, i.e. about 10⁷ cells). This showed that the sensitivity of the PCR is 10 bacteria per ml within a few hours.
Results: The results indicate that the sensor detects toxin down to a density of 10⁻⁴. Additionally, the sensitivity of the sensor was examined over 60 days: it gave confirmatory results up to day 56, after which its performance began to decline. Conclusions: Compared with conventional methods such as culture-based and molecular techniques (e.g., the polymerase chain reaction), the nanobiosensor offers greater accuracy, sensitivity, and specificity. Moreover, it reduces the detection time from hours to about 30 minutes.

Keywords: exotoxin, nanobiosensor, recognition, Staphylococcus aureus

Procedia PDF Downloads 381
25847 Assessment of Spectral Indices for Soil Salinity Estimation in Irrigated Land

Authors: R. Lhissou , A. El Harti , K. Chokmani, E. Bachaoui, A. El Ghmari

Abstract:

Soil salinity is a serious environmental hazard in many countries around the world, especially arid and semi-arid countries such as Morocco. Salinization has negative effects on the ground: it affects agricultural production, infrastructure, water resources, and biodiversity. Remote sensing can provide soil salinity information for large areas in a relatively short time, and it is not limited by extreme terrain or hazardous conditions. By contrast, experimental methods that monitor soil salinity by direct in situ measurements are very demanding in time and resources, and very limited in spatial coverage. In the irrigated perimeter of the Tadla plain in central Morocco, the increased use of saline groundwater and surface water, coupled with agricultural intensification, is leading to a deterioration of soil quality, especially through salinization. In this study, we assessed several spectral indices of soil salinity cited in the literature, using Landsat TM satellite images and field measurements of electrical conductivity (EC). Three Landsat TM images were acquired during three months of the dry season (September, October, and November 2011). Based on EC measurements collected in three field campaigns simultaneous with the image acquisition dates, two techniques were used to validate the soil salinity spectral indices. First, the spectral indices were validated pixel by pixel; second, validation was performed over a 3x3-pixel window. The results indicated that the second technique provides a more accurate validation, whereas per-pixel assessment showed its limits. In addition, the EC values measured in the field correlate well with several spectral indices derived from the Landsat TM data; the best results give an r² of 0.88, 0.79, and 0.65 for the Salinity Index (SI) on the three dates, respectively.
The results demonstrate the usefulness of spectral indices as an auxiliary variable in the spatial estimation and mapping of soil salinity in irrigated land.
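As an illustrative sketch of the validation workflow, the snippet below computes one salinity index cited in the literature (SI = √(red × green)), a 3x3-window average, and the r² between index values and field EC. The reflectance and EC values are hypothetical, not the study's data, and the study's exact index may differ.

```python
import numpy as np

def salinity_index(red, green):
    """One salinity index from the literature: SI = sqrt(red * green).
    (Illustrative choice; the study assessed several indices.)"""
    return np.sqrt(red * green)

def window_mean(img, row, col):
    """3x3-window validation: average the index over a pixel's neighbourhood."""
    return img[row - 1:row + 2, col - 1:col + 2].mean()

def r_squared(index_vals, ec_vals):
    """Coefficient of determination between index values and field EC."""
    r = np.corrcoef(index_vals, ec_vals)[0, 1]
    return r ** 2

# Hypothetical per-site reflectances and field EC (dS/m), not the study's data
red = np.array([0.30, 0.35, 0.42, 0.50, 0.55])
green = np.array([0.25, 0.28, 0.33, 0.40, 0.46])
ec = np.array([2.1, 3.0, 4.8, 7.2, 9.0])

si = salinity_index(red, green)
print(round(r_squared(si, ec), 2))
```

With real imagery, `window_mean` would be evaluated at each sampled site before correlating against EC, which is the second (3x3-window) validation technique.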

Keywords: remote sensing, spectral indices, soil salinity, irrigated land

Procedia PDF Downloads 385
25846 Dissecting Big Trajectory Data to Analyse Road Network Travel Efficiency

Authors: Rania Alshikhe, Vinita Jindal

Abstract:

Digital innovation has played a crucial role in managing smart transportation. For this, big trajectory data collected from traveling vehicles such as taxis, through installed global positioning system (GPS)-enabled devices, can be utilized. Such data offer an unprecedented opportunity to trace the movements of vehicles at fine spatiotemporal granularity. This paper explores big trajectory data to measure the travel efficiency of road networks across an entire city using the proposed statistical travel efficiency measure (STEM). It then identifies the causes of low travel efficiency with the proposed least-squares approximation network-based causality exploration (LANCE). The resulting data analysis reveals the causes of low travel efficiency, along with the road segments that need to be optimized to improve traffic conditions and thus minimize the average travel time from a given point A to a point B in the road network. The obtained results show that the proposed approach outperforms the baseline algorithms for measuring the travel efficiency of the road network.
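The abstract does not give the STEM formula, so as a minimal stand-in the sketch below scores each road segment by the ratio of observed taxi speed to an assumed free-flow speed, then flags low-efficiency segments as optimization candidates. Segment names, speed limits, and GPS samples are all hypothetical.

```python
from collections import defaultdict

def segment_efficiency(trips, free_flow):
    """Aggregate taxi GPS samples into a per-segment efficiency score.

    trips: iterable of (segment_id, observed_speed_kmh) samples.
    Efficiency here = mean observed speed / free-flow speed, a stand-in
    for the paper's STEM measure (its exact formula is not given)."""
    speeds = defaultdict(list)
    for seg, v in trips:
        speeds[seg].append(v)
    return {seg: (sum(vs) / len(vs)) / free_flow[seg] for seg, vs in speeds.items()}

free_flow = {"seg_a": 60.0, "seg_b": 50.0}          # assumed free-flow speeds
trips = [("seg_a", 45.0), ("seg_a", 30.0), ("seg_b", 50.0)]
eff = segment_efficiency(trips, free_flow)
bottlenecks = [s for s, e in eff.items() if e < 0.8]  # segments to optimize
print(bottlenecks)
```

A causality step in the spirit of LANCE would then regress each segment's efficiency against its neighbours' to locate the upstream cause of congestion.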

Keywords: GPS trajectory, road network, taxi trips, digital map, big data, STEM, LANCE

Procedia PDF Downloads 153
25845 Mitigating Supply Chain Risk for Sustainability Using Big Data Knowledge: Evidence from the Manufacturing Supply Chain

Authors: Mani Venkatesh, Catarina Delgado, Purvishkumar Patel

Abstract:

The sustainable supply chain is gaining popularity among practitioners because of increased environmental degradation and stakeholder awareness. On the other hand, supply chain risk management is crucial for practitioners, as risks can disrupt supply chain operations. Predicting and addressing the risks caused by social issues in the supply chain is of paramount importance to the sustainable enterprise. More recently, the use of big data analytics for forecasting business trends has been gaining momentum among professionals. The aim of this research is to explore the application of big data predictive analytics to successfully mitigating supply chain social risk, and to demonstrate how such mitigation can help in achieving sustainability (environmental, economic, and social). The method involves the identification and validation of social issues in the supply chain by an expert panel and a survey. A case study then illustrates the application of big data in the successful identification and mitigation of social issues in the supply chain. Our results show that a company can predict various social issues through big data predictive analytics and thereby mitigate its social risk. We also discuss the implications of this research for the body of knowledge and for practice.

Keywords: big data, sustainability, supply chain social sustainability, social risk, case study

Procedia PDF Downloads 401
25844 Effects of Waist-to-Hip Ratio and Visceral Fat Measurements Improvement on Offshore Petrochemical Company Shift Employees' Work Efficiency

Authors: Essam Amerian

Abstract:

The aim of this study was to investigate the effects of improving waist-to-hip ratio (WHR) and visceral fat components on the health of shift workers in an offshore petrochemical company. A total of 100 male shift workers participated in the study, with an average age of 40.5 years and an average BMI of 28.2 kg/m². The study employed a randomized controlled trial design, with participants assigned to either an intervention group or a control group. The intervention group received a 12-week program that included dietary counseling, physical activity recommendations, and stress management techniques. The control group received no intervention. The outcomes measured were changes in WHR, visceral fat components, blood pressure, and lipid profile. The results showed that the intervention group had a statistically significant improvement in WHR (p<0.001) and visceral fat components (p<0.001) compared to the control group. Furthermore, there were statistically significant improvements in systolic blood pressure (p=0.015) and total cholesterol (p=0.034) in the intervention group compared to the control group. These findings suggest that implementing a 12-week program that includes dietary counseling, physical activity recommendations, and stress management techniques can effectively improve WHR, visceral fat components, and cardiovascular health among shift workers in an offshore petrochemical company.

Keywords: body composition, waist-hip-ratio, visceral fat, shift worker, work efficiency

Procedia PDF Downloads 73
25843 Different Processing Methods to Obtain a Carbon Composite Element for Cycling

Authors: Maria Fonseca, Ana Branco, Joao Graca, Rui Mendes, Pedro Mimoso

Abstract:

The present work is focused on the production of a carbon composite element for cycling through different techniques, namely blow-molding and high-pressure resin transfer moulding (HP-RTM). The main objective of this work is to compare both processes for producing carbon composite elements for the cycling industry. It is well known that carbon composite components for cycling are produced mainly through blow-molding; however, this technique depends strongly on manual labour, resulting in a time-consuming production process. Comparatively, HP-RTM offers a more automated process, which should lead to higher production rates. Nevertheless, the elements produced through both techniques must be compared in order to assess whether the final products comply with the required standards of the industry. The main difference between these techniques lies in the material used. Blow-moulding uses carbon prepreg (carbon fibres pre-impregnated with a resin system), and the material is laid up by hand, piece by piece, on a mould or on a hard male tool. After that, the material is cured at high temperature. In the HP-RTM technique, by contrast, dry carbon fibres are placed in a mould, and resin is then injected at high pressure. After some research regarding the best material systems (prepregs and braids) and suppliers, an element (similar to a handlebar) was designed for construction. The next step was to perform FEM simulations in order to determine the best layup of the composite material. The simulations were done for the prepreg material, and the obtained layup was transposed to the braids. For the blow-molding technique, the selected material was a prepreg with T700 carbon fibre (24K) and an epoxy resin system. For HP-RTM, carbon fibre elastic UD tubes and ±45° braids were used, with both 3K and 6K filaments per tow, and the resin system was an epoxy as well.
After the simulations for the prepreg material, the optimized layup was [45°, -45°, 45°, -45°, 0°, 0°]. For HP-RTM, the transposed layup was [±45° (6k); 0° (6k); partial ±45° (6k); partial ±45° (6k); ±45° (3k); ±45° (3k)]. The mechanical tests showed that both elements can withstand the required maximum load (in this case, 1000 N); however, the one produced through blow-molding supports higher loads (≈1300 N against ≈1100 N for HP-RTM). Regarding the fibre volume fraction (FVF), the HP-RTM element has a slightly higher value (>61%, compared to 59% for the blow-molding technique). Optical microscopy showed that both elements have low void content. In conclusion, the elements produced using HP-RTM are comparable to the ones produced through blow-molding, both in mechanical testing and in visual aspect. Nevertheless, there is still room for improvement in the HP-RTM elements, since the layup of the braids and UD tubes could be optimized.

Keywords: HP-RTM, carbon composites, cycling, FEM

Procedia PDF Downloads 127
25842 Improving the Analytical Power of Dynamic DEA Models, by the Consideration of the Shape of the Distribution of Inputs/Outputs Data: A Linear Piecewise Decomposition Approach

Authors: Elias K. Maragos, Petros E. Maravelakis

Abstract:

In Dynamic Data Envelopment Analysis (DDEA), a subfield of Data Envelopment Analysis (DEA), the productivity of Decision Making Units (DMUs) is considered in relation to time. In this setting, as most researchers accept, there are outputs produced by a DMU in one period that are used as inputs in a future period; these outputs are known as intermediates. The common DDEA models do not take into account the shape of the distribution of the input, output, or intermediate data, assuming that their virtual values do not deviate from linearity. This weakness limits the accuracy and analytical power of the traditional DDEA models. In this paper, the authors, using the concept of piecewise linear inputs and outputs, propose an extended DDEA model. The proposed model increases the flexibility of the traditional DDEA models and improves the measurement of the dynamic performance of DMUs.
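As background for the extension described above, a minimal sketch of the static input-oriented CCR envelopment model (the base on which dynamic and piecewise-linear DEA variants build) can be written as a linear program; the two-DMU data below are purely illustrative.

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, o):
    """Input-oriented CCR efficiency of DMU o (static DEA sketch).

    X: (m, n) input matrix, Y: (s, n) output matrix for n DMUs.
    Solves: min theta  s.t.  sum_j lam_j x_ij <= theta * x_io,
                             sum_j lam_j y_rj >= y_ro,  lam >= 0."""
    m, n = X.shape
    s = Y.shape[0]
    c = np.zeros(n + 1)
    c[0] = 1.0                                   # minimise theta
    A_in = np.hstack([-X[:, [o]], X])            # inputs: lam.X - theta*x_o <= 0
    b_in = np.zeros(m)
    A_out = np.hstack([np.zeros((s, 1)), -Y])    # outputs: -lam.Y <= -y_o
    b_out = -Y[:, o]
    res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.concatenate([b_in, b_out]),
                  bounds=[(0, None)] * (n + 1), method="highs")
    return res.fun

# Two DMUs, one input, one output: DMU 1 dominates DMU 0
X = np.array([[2.0, 1.0]])
Y = np.array([[1.0, 1.0]])
print(round(ccr_efficiency(X, Y, 0), 3), round(ccr_efficiency(X, Y, 1), 3))
```

Here DMU 0 scores θ = 0.5 (it uses twice the input of DMU 1 for the same output), while DMU 1 is efficient with θ = 1. The paper's extension would replace each single input/output value with a piecewise-linear segmentation carrying intermediates across periods.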

Keywords: Dynamic Data Envelopment Analysis, DDEA, piecewise linear inputs, piecewise linear outputs

Procedia PDF Downloads 156
25841 Task Validity in Neuroimaging Studies: Perspectives from Applied Linguistics

Authors: L. Freeborn

Abstract:

Recent years have seen an increasing number of neuroimaging studies related to language learning as imaging techniques such as fMRI and EEG have become more widely accessible to researchers. By using a variety of structural and functional neuroimaging techniques, these studies have already made considerable progress in terms of our understanding of neural networks and processing related to first and second language acquisition. However, the methodological designs employed in neuroimaging studies to test language learning have been questioned by applied linguists working within the field of second language acquisition (SLA). One of the major criticisms is that tasks designed to measure language learning gains rarely have a communicative function, and seldom assess learners’ ability to use the language in authentic situations. This brings the validity of many neuroimaging tasks into question. The fundamental reason why people learn a language is to communicate, and it is well-known that both first and second language proficiency are developed through meaningful social interaction. With this in mind, the SLA field is in agreement that second language acquisition and proficiency should be measured through learners’ ability to communicate in authentic real-life situations. Whilst authenticity is not always possible to achieve in a classroom environment, the importance of task authenticity should be reflected in the design of language assessments, teaching materials, and curricula. Tasks that bear little relation to how language is used in real-life situations can be considered to lack construct validity. This paper first describes the typical tasks used in neuroimaging studies to measure language gains and proficiency, then analyses to what extent these tasks can validly assess these constructs.

Keywords: neuroimaging studies, research design, second language acquisition, task validity

Procedia PDF Downloads 131
25840 A Study of Growth Factors on Sustainable Manufacturing in Small and Medium-Sized Enterprises: Case Study of Japan Manufacturing

Authors: Tadayuki Kyoutani, Shigeyuki Haruyama, Ken Kaminishi, Zefry Darmawan

Abstract:

Japan’s semiconductor industries have developed greatly in recent years. Many started as small and medium-sized enterprises (SMEs) that encountered favourable circumstances and have since become prosperous industries worldwide. The sustainable growth factors that support the creation of value inside Japanese companies are strongly embedded in their performance, but these factors have not been clearly defined by each company. A literature review was conducted, using quantitative text mining, to explore the definition of sustainable growth factors. Sustainability criteria were developed from previous research to verify that definition. A typical framework is proposed as a systematic approach for developing sustainable growth factors in a specific company. A review of the approach over a certain period shows that the factors influencing sustainable growth are important for a company to achieve its goals.

Keywords: SME, manufacture, sustainable, growth factor

Procedia PDF Downloads 243
25839 A Fuzzy TOPSIS Based Model for Safety Risk Assessment of Operational Flight Data

Authors: N. Borjalilu, P. Rabiei, A. Enjoo

Abstract:

A Flight Data Monitoring (FDM) program assists an operator in the aviation industry to identify, quantify, assess, and address operational safety risks in order to improve the safety of flight operations. FDM is a powerful tool for an aircraft operator when integrated into the operator’s Safety Management System (SMS), allowing it to detect, confirm, and assess safety issues and to check the effectiveness of corrective actions associated with human errors. This article proposes a model for assessing the safety risk level of flight data across different event categories, based on fuzzy set values. It permits evaluation of the operational safety level from the point of view of flight activities. The main advantage of this method is that it provides a qualitative safety analysis of flight data. The research gathers the opinions of aviation experts through questionnaires related to flight data in four categories of occurrence that can take place during an accident or an incident: Runway Excursion (RE), Controlled Flight Into Terrain (CFIT), Mid-Air Collision (MAC), and Loss of Control in Flight (LOC-I). By weighting each category using fuzzy TOPSIS (F-TOPSIS) and applying the weights to the number of risk events, the safety risk of each event category can be obtained.
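As a sketch of the ranking step, the snippet below runs classical (crisp) TOPSIS, the backbone that F-TOPSIS extends with fuzzy expert scores. The score matrix, weights, and criteria are hypothetical, not the study's questionnaire data.

```python
import numpy as np

def topsis(scores, weights, benefit):
    """Classical (crisp) TOPSIS: rank alternatives by closeness to the ideal.
    scores: (alternatives x criteria); benefit[j] is True if larger is better."""
    norm = scores / np.linalg.norm(scores, axis=0)   # vector normalisation
    v = norm * weights                               # weighted normalised matrix
    ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
    anti = np.where(benefit, v.min(axis=0), v.max(axis=0))
    d_pos = np.linalg.norm(v - ideal, axis=1)
    d_neg = np.linalg.norm(v - anti, axis=1)
    return d_neg / (d_pos + d_neg)   # closeness coefficient, higher = riskier here

# Hypothetical expert risk scores for RE, CFIT, MAC, LOC-I on three criteria
scores = np.array([[7.0, 5.0, 6.0],    # RE
                   [9.0, 8.0, 7.0],    # CFIT
                   [4.0, 6.0, 5.0],    # MAC
                   [9.0, 9.0, 9.0]])   # LOC-I
weights = np.array([0.5, 0.3, 0.2])
cc = topsis(scores, weights, benefit=np.array([True, True, True]))
print(cc.argmax())   # index of the highest-risk category (here LOC-I)
```

In the fuzzy variant, each score would be a triangular fuzzy number aggregated across experts before the distance step.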

Keywords: F-TOPSIS, fuzzy set, flight data monitoring (FDM), flight safety

Procedia PDF Downloads 161
25838 From Modeling of Data Structures towards Automatic Programs Generating

Authors: Valentin P. Velikov

Abstract:

Automatic program generation saves time and human resources and yields syntactically clear and logically correct modules. Fourth-generation programming languages are oriented towards drawing the data and processes of the subject area and obtaining a frame of the respective information system. The application can be separated into interface and business logic. This means that, for interactive generation of the needed system, either an already existing toolkit can be used or a new one created.

Keywords: computer science, graphical user interface, user dialog interface, dialog frames, data modeling, subject area modeling

Procedia PDF Downloads 298
25837 Optimized Weight Selection of Control Data Based on Quotient Space of Multi-Geometric Features

Authors: Bo Wang

Abstract:

The geometric processing of multi-source remote sensing data using control data of different scales and accuracies is an important research direction for multi-platform Earth observation systems. In existing block bundle adjustment methods, the control information is used at a single observation scale and precision; this approach cannot screen the control information or assign reasonable and effective weights, which reduces the convergence and reliability of the adjustment results. Drawing on the theory and technology of quotient spaces, this project researches several subjects. A multi-layer quotient space of multi-geometric features is constructed to describe and filter control data. A normalized granularity merging mechanism for multi-layer control information is studied, and based on the normalized scale factor, a strategy is realized for optimizing the weight selection of control data that is less relevant to the adjustment system. At the same time, geometric positioning experiments are conducted using multi-source remote sensing data, aerial images, and multiple classes of control data to verify the theoretical results. This research is expected to break through the limitation of single-scale, single-accuracy control data in the adjustment process and to expand the theory and technology of photogrammetry, so that the problem of processing multi-source remote sensing data can be solved both theoretically and practically.

Keywords: multi-source image geometric process, high precision geometric positioning, quotient space of multi-geometric features, optimized weight selection

Procedia PDF Downloads 279
25836 Consortium Blockchain-based Model for Data Management Applications in the Healthcare Sector

Authors: Teo Hao Jing, Shane Ho Ken Wae, Lee Jin Yu, Burra Venkata Durga Kumar

Abstract:

Current distributed healthcare systems face the challenge of interoperability of health data. Storing electronic health records (EHRs) in local databases causes them to be fragmented, a problem aggravated as patients visit multiple healthcare providers in their lifetime. Existing solutions are unable to solve this issue and have burdened healthcare specialists and patients alike. Blockchain technology was found to increase the interoperability of health data by implementing digital access rules, enabling a uniform patient identity, and providing data aggregation. Consortium blockchains were found to have high read throughput, to be more trustworthy and more secure against external disruption, and to accommodate transactions without fees. Therefore, this paper proposes a blockchain-based model for data management applications. In this model, a consortium blockchain is implemented using delegated proof of stake (DPoS) as its consensus mechanism. The blockchain allows collaboration between users from different organizations, such as hospitals and medical bureaus. Patients are the owners of their information, and users from other parties require authorization from the patient to view it. Hospitals upload the hash value of patients’ generated data to the blockchain, whereas the encrypted information is stored in distributed cloud storage.
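The hash-on-chain, data-off-chain split described above can be sketched as follows; the record fields and the JSON canonicalization are illustrative choices, not the paper's exact scheme.

```python
import hashlib
import json

def record_fingerprint(ehr_record):
    """Hash an EHR record for on-chain anchoring: only this digest goes to the
    consortium blockchain; the encrypted record stays in cloud storage."""
    canonical = json.dumps(ehr_record, sort_keys=True).encode("utf-8")
    return hashlib.sha256(canonical).hexdigest()

def verify(ehr_record, on_chain_digest):
    """An authorized reader who fetched the record from cloud storage
    re-hashes it and compares against the digest anchored on the chain."""
    return record_fingerprint(ehr_record) == on_chain_digest

record = {"patient_id": "P-001", "visit": "2021-03-04", "dx": "J45.9"}
digest = record_fingerprint(record)
print(verify(record, digest))                      # untampered record
print(verify({**record, "dx": "J45.0"}, digest))   # tampered record fails
```

Because only the 32-byte digest is replicated across the consortium nodes, storage stays cheap while any tampering with the off-chain record is detectable.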

Keywords: blockchain technology, data management applications, healthcare, interoperability, delegated proof of stake

Procedia PDF Downloads 131
25835 Long-Term Climate Patterns in Eastern and Southeastern Ethiopia

Authors: Messay Mulugeta, Degefa Tolossa

Abstract:

The purpose of this paper is to scrutinize trends in climate risks in the eastern and southeastern parts of Ethiopia. This part of the country is severely affected by recurrent droughts, erratic rainfall, and increasing temperatures. Erratic rains and moisture stress in particular have threatened the population over many decades, compounded by unproductive policy frameworks and weak institutional setups. These menaces have been more severe in the dry lowlands, where rainfall is more erratic and scarce. Long-term climate data from nine weather stations in eastern and southeastern Ethiopia were obtained from the National Meteorological Agency of Ethiopia (NMA). As issues related to climate risks are very intricate, different techniques and indices were applied to address the objectives of the study. It is concluded that erratic rainfall, moisture scarcity, and increasing temperatures have been the main challenges in eastern and southeastern Ethiopia. These risks can be eased by putting in place efficient and integrated rural development strategies, environmental rehabilitation plans of action in overexploited areas, proper irrigation and water harvesting practices, and well-thought-out, genuine resettlement schemes.
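One of the indices named in the keywords, the precipitation concentration index (PCI, after Oliver), can be computed from a station's twelve monthly rainfall totals; the sample series below are illustrative.

```python
def precipitation_concentration_index(monthly_mm):
    """PCI = 100 * sum(p_i^2) / (sum(p_i))^2 over the 12 monthly totals.
    ~8.3 indicates uniform rainfall; values above ~20 indicate strong
    seasonal concentration (i.e., erratic, concentrated rains)."""
    total = sum(monthly_mm)
    return 100.0 * sum(p * p for p in monthly_mm) / (total * total)

uniform = [50.0] * 12                       # evenly spread rainfall
concentrated = [0.0] * 10 + [300.0, 300.0]  # all rain in two months
print(round(precipitation_concentration_index(uniform), 2))       # 8.33
print(round(precipitation_concentration_index(concentrated), 2))  # 50.0
```

Applied to the nine stations' long-term records, rising PCI over time would quantify the increasingly erratic rainfall the paper describes.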

Keywords: rainfall variability, erratic rains, precipitation concentration index (PCI), climatic pattern, Ethiopia

Procedia PDF Downloads 229
25834 Sustainable Production of Tin Oxide Nanoparticles: Exploring Synthesis Techniques, Formation Mechanisms, and Versatile Applications

Authors: Yemane Tadesse Gebreslassie, Henok Gidey Gebretnsae

Abstract:

Nanotechnology has emerged as a highly promising field of research with wide-ranging applications across various scientific disciplines. In recent years, tin oxide has garnered significant attention due to its intriguing properties, particularly when synthesized in the nanoscale range. While numerous physical and chemical methods exist for producing tin oxide nanoparticles, these approaches tend to be costly, energy-intensive, and involve the use of toxic chemicals. Given the growing concerns regarding human health and environmental impact, there has been a shift towards developing cost-effective and environmentally friendly processes for tin oxide nanoparticle synthesis. Green synthesis methods utilizing biological entities such as plant extracts, bacteria, and natural biomolecules have shown promise in successfully producing tin oxide nanoparticles. However, scaling up the production to an industrial level using green synthesis approaches remains challenging due to the complexity of biological substrates, which hinders the elucidation of reaction mechanisms and formation processes. Thus, this review aims to provide an overview of the various sources of biological entities and methodologies employed in the green synthesis of tin oxide nanoparticles, as well as their impact on nanoparticle properties. Furthermore, this research delves into the strides made in comprehending the mechanisms behind the formation of nanoparticles as documented in existing literature. It also sheds light on the array of analytical techniques employed to investigate and elucidate the characteristics of these minuscule particles.

Keywords: nanotechnology, tin oxide, green synthesis, formation mechanisms

Procedia PDF Downloads 43
25833 Finding the Free Stream Velocity Using Flow Generated Sound

Authors: Saeed Hosseini, Ali Reza Tahavvor

Abstract:

Sound processing is one of the subjects that has recently attracted many researchers. It is efficient and usually less expensive than other methods. In this paper, flow-generated sound is used to estimate the flow speed of free streams. Many sound samples were gathered. After analyzing the data, a parameter named wave power was chosen. For all samples, the wave power was calculated and averaged for each flow speed. A curve was fitted to the averaged data, and a correlation between wave power and flow speed was found. Test data were used to validate the method, and the errors for all test data were under 10 percent. The speed of the flow can thus be estimated by calculating the wave power of the flow-generated sound and using the proposed correlation.
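A minimal sketch of the procedure, assuming wave power means the mean squared amplitude of a clip and a linear calibration curve (the abstract specifies neither), might look like this; the calibration numbers are invented.

```python
import numpy as np

def wave_power(samples):
    """Wave power of a sound clip, taken here as the mean squared amplitude
    (the abstract names the parameter but not its formula)."""
    x = np.asarray(samples, dtype=float)
    return float(np.mean(x ** 2))

def fit_speed_curve(powers, speeds):
    """Fit speed = a*power + b to averaged calibration data; the paper fits
    a curve of unspecified form."""
    a, b = np.polyfit(powers, speeds, 1)
    return lambda p: a * p + b

# Hypothetical calibration: averaged wave power at four known flow speeds (m/s)
powers = np.array([0.10, 0.22, 0.35, 0.48])
speeds = np.array([2.0, 4.0, 6.0, 8.0])
estimate = fit_speed_curve(powers, speeds)
print(round(estimate(0.29), 1))   # speed estimate for a new recording
```

In use, a new recording's wave power would be fed to `estimate` to recover the free-stream speed without any intrusive probe.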

Keywords: the flow generated sound, free stream, sound processing, speed, wave power

Procedia PDF Downloads 407
25832 Numerical Investigation of Pressure Drop and Erosion Wear by Computational Fluid Dynamics Simulation

Authors: Praveen Kumar, Nitin Kumar, Hemant Kumar

Abstract:

The modernization of computer technology and commercial computational fluid dynamics (CFD) simulation gives more detailed results than experimental investigation techniques. CFD techniques are widely used in different fields due to their flexibility and performance. Evaluating pipeline erosion is a complex phenomenon to solve by numerical arithmetic techniques, whereas CFD simulation is an accessible tool for resolving this type of problem. Erosion wear behaviour due to a solid-liquid mixture in a slurry pipeline has been investigated using the commercial CFD code FLUENT. A multiphase Euler-Lagrange model was adopted to predict solid-particle erosion wear in a 22.5° pipe bend for the flow of a bottom ash-water suspension. The present study addresses erosion prediction in a three-dimensional 22.5° pipe bend for two-phase (solid and liquid) flow using the finite volume method with the standard k-ε turbulence model and a discrete phase model, and evaluates the erosion wear rate with velocity varying from 2 to 4 m/s. The results show that the velocity of the solid-liquid mixture is the dominant parameter compared to solid concentration, density, and particle size. At low velocity, settling takes place in the pipe bend due to the low inertia and the gravitational effect on the solid particulates, which leads to high erosion on the bottom side of the pipeline.
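CFD itself cannot be reproduced in a few lines, but the strong velocity dependence reported above can be illustrated with a generic impact-erosion correlation of the kind erosion models in CFD codes use (rate ∝ mass flux × f(angle) × vⁿ). The constants, the exponent, and the angle function below are toy values, not the study's calibrated model.

```python
import math

def erosion_rate(mass_flux, velocity, angle_deg, K=1.8e-9, n=2.6):
    """Illustrative impact-erosion estimate:
    rate = K * mass_flux * f(theta) * v**n  (arbitrary units).
    n ~ 2.6 is a typical literature velocity exponent; K and
    f(theta) = sin(2*theta) are toy choices for this sketch."""
    f = math.sin(2.0 * math.radians(angle_deg))
    return K * mass_flux * f * velocity ** n

# Doubling velocity from 2 to 4 m/s multiplies the rate by 2**2.6 (~6x),
# which is why velocity dominates concentration, density, and particle size
low = erosion_rate(mass_flux=5.0, velocity=2.0, angle_deg=30.0)
high = erosion_rate(mass_flux=5.0, velocity=4.0, angle_deg=30.0)
print(round(high / low, 2))
```

In a full simulation, the discrete phase model supplies the per-face particle mass flux and impact angle to such a correlation at every wall cell of the bend.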

Keywords: computational fluid dynamics (CFD), erosion, slurry transportation, k-ε Model

Procedia PDF Downloads 403
25831 Efficiency of DMUs in Presence of New Inputs and Outputs in DEA

Authors: Esmat Noroozi, Elahe Sarfi, Farha Hosseinzadeh Lotfi

Abstract:

Examining the impacts of data modification is known as sensitivity analysis. Many studies have considered the modification of input and output data in DEA. An issue that has not heretofore been considered in DEA sensitivity analysis is modification of the number of inputs and/or outputs and the impact of this modification on the efficiency status of DMUs. This paper presents systems that show the impact of adding one or multiple inputs or outputs on the efficiency status of DMUs; furthermore, a model is presented for recognizing the minimum number of inputs and/or outputs, from among specified inputs and outputs, that can be added so that an inefficient DMU becomes efficient. Finally, the presented systems and model are applied to a set of real data, and the results are reported.

Keywords: data envelopment analysis, efficiency, sensitivity analysis, input, output

Procedia PDF Downloads 442
25830 Assessment of the Impact of Social Compliance Certification on Abolition of Forced Labour and Discrimination in the Garment Manufacturing Units in Bengaluru: A Perspective of Women Sewing Operators

Authors: Jonalee Das Bajpai, Sandeep Shastri

Abstract:

The Indian textile and garment industry is one of the major contributors to the country’s economy. It is also one of the largest labour-intensive industries after agriculture and livestock, and it caters to both the domestic and international markets. Although the industry comes under the purview of Indian labour laws and other voluntary workplace standards, it is often criticized for undue exploitation of workers. This paper explores the status of forced labour and discrimination in the workplace in the garment manufacturing units of Bengaluru, from the perspective of women sewing operators, as the majority of operators in Bengaluru are women. The research also studies the impact of social compliance certification in abolishing forced labour and workplace discrimination. Objectives of the Research: 1. To study the impact of 'Social Compliance Certification' on the abolition of forced labour among the women workforce. 2. To study the impact of 'Social Compliance Certification' on the abolition of workplace discrimination among the women workforce. Sample Size and Data Collection Techniques: The primary data, the backbone of the study, were collected through a structured questionnaire, which explored the extent of forced labour and discrimination against women workers from the perspective of the workers themselves. The sample size was 600 (n) women sewing operators from the garment industry with a minimum of one year of work experience. Three hundred samples were selected from units with social compliance certification such as SA8000, WRAP, BSCI, and ETI; the other three hundred were selected from units without such certification.
Out of these three hundred samples, one hundred and fifty samples were selected from units with Buyer’s Code of Conduct and another one hundred and fifty were from domestic units that do not come under the purview of any such certification. The responses of the survey were further authenticated through on sight visit and personal interactions. Comparative analysis of the workplace environment between units with Social Compliance certification, units with Buyer’s Code of Conduct and domestic units that do not come under the purview of any such voluntary workplace environment enabled to analyze the impact of Social Compliance certification on abolition of workplace environment and discrimination at workplace. Correlation analysis has been conducted to measure the relationship between impact of forced labour and discrimination at workplace on the level of job satisfaction. The result displayed that abolition of forced labour and abolition of discrimination at workplace have a higher level of job satisfaction among the women workers.

Keywords: discrimination, garment industry, forced labour, social compliance certification

Procedia PDF Downloads 191
25829 A Study to Examine the Use of Traditional Agricultural Practices to Fight the Effects of Climate Change

Authors: Rushva Parihar, Anushka Barua

Abstract:

The negative repercussions of a warming planet are already visible, with biodiversity loss, water scarcity, and extreme weather events becoming ever more frequent. The agriculture sector is perhaps the most affected, and modern agriculture has failed to shield farmers from the effects of climate change. This, coupled with the added pressure of higher food-production demands driven by population growth, has only compounded the impact. Traditional agricultural practices that are rooted in indigenous knowledge have long safeguarded the delicate balance of the ecosystem through sustainable production techniques. This paper uses secondary data to explore these traditional practices (such as Beejamrita, Jeevamrita, sheep penning, earthen bunding, and others) from around the world that have been developed over centuries, and focuses on how they can be used to tackle contemporary issues arising from climate change (such as nutrient and water loss, soil degradation, and increased incidence of pests). Finally, the resulting framework is applied to the context of Indian agriculture as a means to combat climate change and improve food security, while encouraging the documentation and transfer of local knowledge as a shared resource among farmers.

Keywords: sustainable food systems, traditional agricultural practices, climate smart agriculture, climate change, indigenous knowledge

Procedia PDF Downloads 122
25828 WebAppShield: An Approach Exploiting Machine Learning to Detect SQLi Attacks in an Application Layer in Run-time

Authors: Ahmed Abdulla Ashlam, Atta Badii, Frederic Stahl

Abstract:

In recent years, SQL injection (SQLi) attacks have been identified as prevalent against web applications. They affect network security and user data, leading to a considerable loss of money and data every year. This paper presents the use of machine learning classification algorithms to classify login inputs into "SQLi" or "Non-SQLi", thus increasing the reliability and accuracy of results when deciding whether an operation is an attack or a valid operation. A method, Web-App auto-generated twin data structure replication shielding against SQLi attacks (WebAppShield), has been developed that verifies all users and prevents attackers (SQLi attacks) from entering or accessing the database, admitting only inputs that the machine learning module predicts as "Non-SQLi". A special login form with a dedicated data-validation instance has been developed; this verification process secures the web application from its earliest stages. The system has been tested and validated, and up to 99% of SQLi attacks have been prevented.
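The "SQLi" / "Non-SQLi" classification step can be sketched as a minimal feature-based binary classifier. The regular-expression features, the perceptron trainer, and the sample inputs below are illustrative assumptions only; the abstract does not specify which classification algorithm or features WebAppShield actually uses.

```python
import re

# Hypothetical features flagging common SQLi indicators in a login input.
FEATURES = [
    r"('|--|#|;)",                                    # quote / comment / terminator
    r"\b(or|and)\b\s+\S+\s*=\s*\S+",                  # tautologies like "or 1=1"
    r"\b(union|select|insert|update|delete|drop)\b",  # SQL keywords
]

def extract(text):
    """Map an input string to a binary feature vector."""
    t = text.lower()
    return [1 if re.search(p, t) else 0 for p in FEATURES]

def classify(text, weights, bias):
    score = sum(w * x for w, x in zip(weights, extract(text))) + bias
    return "SQLi" if score > 0 else "Non-SQLi"

def train(samples, epochs=20):
    """Tiny perceptron: update weights on each misclassified sample."""
    weights, bias = [0.0] * len(FEATURES), 0.0
    for _ in range(epochs):
        for text, label in samples:
            y = 1 if label == "SQLi" else -1
            feats = extract(text)
            pred = 1 if sum(w * x for w, x in zip(weights, feats)) + bias > 0 else -1
            if pred != y:
                for i, x in enumerate(feats):
                    weights[i] += y * x
                bias += y
    return weights, bias

# Illustrative labeled login inputs (not from the paper's dataset).
samples = [
    ("alice", "Non-SQLi"),
    ("bob123", "Non-SQLi"),
    ("' OR 1=1 --", "SQLi"),
    ("admin'; DROP TABLE users; --", "SQLi"),
]
weights, bias = train(samples)
print(classify("' OR 1=1 --", weights, bias))  # → SQLi
print(classify("carol", weights, bias))        # → Non-SQLi
```

A production system of the kind the paper describes would use a trained model over a large labeled corpus; this sketch only shows the shape of the filtering decision.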

Keywords: SQL injection, attacks, web application, accuracy, database

Procedia PDF Downloads 144
25827 Application of Deep Neural Networks to Assess Corporate Credit Rating

Authors: Parisa Golbayani, Dan Wang, Ionuţ Florescu

Abstract:

In this work, we apply machine learning techniques to financial statement reports in order to assess a company’s credit rating. Specifically, the work analyzes the performance of four neural network architectures (MLP, CNN, CNN2D, LSTM) in predicting corporate credit ratings as issued by Standard and Poor’s. The paper focuses on companies from the energy, financial, and healthcare sectors in the US. The goal of this analysis is to improve the application of machine learning algorithms to credit assessment. To accomplish this, the study investigates three questions. First, do the algorithms perform better when using a selected subset of important features, or is better performance obtained by allowing the algorithms to select features themselves? Second, is the temporal aspect inherent in financial data important for the results obtained by a machine learning algorithm? Third, does one of the four neural network architectures considered consistently outperform the others, and if so, under which conditions? The work frames the problem as several case studies to answer these questions and analyzes the results using ANOVA and multiple comparison testing procedures.
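The ANOVA step used to compare the architectures can be illustrated with a one-way F-test over per-fold accuracies. The architecture names come from the abstract, but the accuracy figures and fold counts below are invented purely for demonstration; the paper's actual results will differ.

```python
# Hypothetical per-fold accuracies for the four architectures (illustrative only).
scores = {
    "MLP":   [0.71, 0.73, 0.70, 0.72],
    "CNN":   [0.78, 0.80, 0.79, 0.77],
    "CNN2D": [0.76, 0.75, 0.77, 0.78],
    "LSTM":  [0.74, 0.72, 0.75, 0.73],
}

def one_way_anova(groups):
    """Return the F statistic testing equality of group means."""
    all_vals = [v for g in groups for v in g]
    grand = sum(all_vals) / len(all_vals)
    # Between-group sum of squares: one term per architecture
    ssb = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    # Within-group sum of squares: spread of folds around each group mean
    ssw = sum(sum((v - sum(g) / len(g)) ** 2 for v in g) for g in groups)
    df_b, df_w = len(groups) - 1, len(all_vals) - len(groups)
    return (ssb / df_b) / (ssw / df_w)

f_stat = one_way_anova(list(scores.values()))
print(round(f_stat, 2))  # → 23.2 for the figures above
```

A large F statistic (compared against the F distribution with the stated degrees of freedom) would justify the paper's follow-up multiple comparison tests to locate which architectures differ.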

Keywords: convolutional neural network, long short term memory, multilayer perceptron, credit rating

Procedia PDF Downloads 232
25826 From Theory to Practice: Harnessing Mathematical and Statistical Sciences in Data Analytics

Authors: Zahid Ullah, Atlas Khan

Abstract:

The rapid growth of data in diverse domains has created an urgent need for effective utilization of mathematical and statistical sciences in data analytics. This abstract explores the journey from theory to practice, emphasizing the importance of harnessing mathematical and statistical innovations to unlock the full potential of data analytics. Drawing on a comprehensive review of existing literature and research, this study investigates the fundamental theories and principles underpinning mathematical and statistical sciences in the context of data analytics. It delves into key mathematical concepts such as optimization, probability theory, statistical modeling, and machine learning algorithms, highlighting their significance in analyzing and extracting insights from complex datasets. Moreover, this abstract sheds light on the practical applications of mathematical and statistical sciences in real-world data analytics scenarios. Through case studies and examples, it showcases how mathematical and statistical innovations are being applied to tackle challenges in various fields such as finance, healthcare, marketing, and social sciences. These applications demonstrate the transformative power of mathematical and statistical sciences in data-driven decision-making. The abstract also emphasizes the importance of interdisciplinary collaboration, as it recognizes the synergy between mathematical and statistical sciences and other domains such as computer science, information technology, and domain-specific knowledge. Collaborative efforts enable the development of innovative methodologies and tools that bridge the gap between theory and practice, ultimately enhancing the effectiveness of data analytics. Furthermore, ethical considerations surrounding data analytics, including privacy, bias, and fairness, are addressed within the abstract. 
It underscores the need for responsible and transparent practices in data analytics, and highlights the role of mathematical and statistical sciences in ensuring ethical data handling and analysis. In conclusion, this abstract highlights the journey from theory to practice in harnessing mathematical and statistical sciences in data analytics. It showcases the practical applications of these sciences, the importance of interdisciplinary collaboration, and the need for ethical considerations. By bridging the gap between theory and practice, mathematical and statistical sciences contribute to unlocking the full potential of data analytics, empowering organizations and decision-makers with valuable insights for informed decision-making.

Keywords: data analytics, mathematical sciences, optimization, machine learning, interdisciplinary collaboration, practical applications

Procedia PDF Downloads 87
25825 Regression for Doubly Inflated Multivariate Poisson Distributions

Authors: Ishapathik Das, Sumen Sen, N. Rao Chaganty, Pooja Sengupta

Abstract:

Dependent multivariate count data occur in several research studies. These data can be modeled by a multivariate Poisson or negative binomial distribution constructed using copulas. However, when some of the counts are inflated, that is, when the number of observations in some cells is much larger than in other cells, the copula-based multivariate Poisson (or negative binomial) distribution may not fit well and is not an appropriate statistical model for the data. There is a need to modify or adjust the multivariate distribution to account for the inflated frequencies. In this article, we consider the situation where the frequencies of two cells are higher than those of the other cells, and develop a doubly inflated multivariate Poisson distribution function using a multivariate Gaussian copula. We also discuss procedures for regression on covariates for the doubly inflated multivariate count data. To illustrate the proposed methodologies, we present a real data set containing bivariate count observations with inflation in two cells. Several models and linear predictors with log link functions are considered, and we discuss maximum likelihood estimation of the models' unknown parameters.
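A doubly inflated count distribution of this kind is typically written as a three-component mixture. The parameterization below is a generic sketch, not necessarily the authors' exact formulation: p_1 and p_2 are the extra masses on the two inflated cells y_1 and y_2, and f is the copula-based multivariate Poisson pmf with rate vector λ.

```latex
P(\mathbf{Y} = \mathbf{y}) =
\begin{cases}
p_1 + (1 - p_1 - p_2)\, f(\mathbf{y}_1; \boldsymbol{\lambda}), & \mathbf{y} = \mathbf{y}_1, \\
p_2 + (1 - p_1 - p_2)\, f(\mathbf{y}_2; \boldsymbol{\lambda}), & \mathbf{y} = \mathbf{y}_2, \\
(1 - p_1 - p_2)\, f(\mathbf{y}; \boldsymbol{\lambda}), & \text{otherwise,}
\end{cases}
```

with 0 ≤ p_1 + p_2 < 1. Regression on covariates then links λ (and possibly p_1, p_2) to linear predictors, e.g. via a log link as mentioned in the abstract.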

Keywords: copula, Gaussian copula, multivariate distributions, inflated distributions

Procedia PDF Downloads 153
25824 An Exploratory Research of Human Character Analysis Based on Smart Watch Data: Distinguish the Drinking State from Normal State

Authors: Lu Zhao, Yanrong Kang, Lili Guo, Yuan Long, Guidong Xing

Abstract:

Smart watches, as handy devices with rich functionality, have become among the most popular wearable devices in the world. Among their various functions, the most basic is health monitoring. The monitoring data can provide effective evidence or clues for the investigation of criminal cases. For instance, step-count data can help determine whether the watch wearer was stationary or moving during a given time period. There is, however, still little research on the analysis of human characteristics based on these data. The purpose of this research is to analyze health monitoring data to distinguish a drinking state from the normal state. The result of such analysis may play a role in cases involving drinking, such as drunk driving. The experiment focused on identifying which smart watch health monitoring measures change with drinking and on quantifying the extent of the change. The chosen subjects were mostly in their 20s, and each wore the same smart watch for a week. Each subject drank several times during the week and noted down the start and end times of each drinking session. The researchers then extracted and analyzed the health monitoring data from the watch. The descriptive statistics show that heart rate changes when drinking: the average heart rate is about 10% higher than normal, and the coefficient of variation is less than about 30% of that in the normal state. Though more research needs to be carried out, this experiment and analysis suggest possible applications of the data collected by smart watches.
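The descriptive rule suggested by these findings can be sketched with stdlib statistics: compute the mean and coefficient of variation of heart-rate samples and flag a possible drinking state when the mean is at least 10% above baseline. The heart-rate figures below are hypothetical; only the 10% threshold comes from the abstract.

```python
import statistics

def summarize(heart_rates):
    """Return (mean, coefficient of variation) for a list of bpm samples."""
    mean = statistics.fmean(heart_rates)
    cv = statistics.stdev(heart_rates) / mean
    return mean, cv

# Hypothetical per-minute heart-rate samples (bpm), illustrative only.
baseline = [68, 70, 72, 69, 74, 71, 66, 73]
episode  = [78, 80, 77, 82, 79, 81, 78, 80]

base_mean, base_cv = summarize(baseline)
ep_mean, ep_cv = summarize(episode)

# Flag a possible drinking state when mean heart rate is >= 10% above baseline,
# mirroring the abstract's descriptive finding.
flagged = ep_mean >= 1.10 * base_mean
print(flagged)  # → True for the illustrative samples above
```

In these made-up samples the episode also shows a smaller coefficient of variation than baseline, consistent with the abstract's observation that variability drops while drinking.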

Keywords: character analysis, descriptive statistics analysis, drinking state, heart rate, smart watch

Procedia PDF Downloads 162
25823 Repeated Reuse of Insulin Injection Syringes and Incidence of Bacterial Contamination among Diabetic Patients in Jimma University Specialized Hospital, Jimma, Ethiopia

Authors: Muluneh Ademe, Zeleke Mekonnen

Abstract:

Objective: To determine the level of bacterial contamination of reused insulin syringes among diabetic patients. Method: A facility-based cross-sectional study was conducted among diabetic patients. Data on socio-demographic variables, history of injection syringe reuse, and frequency of syringe reuse were collected using a predesigned questionnaire. Finally, samples from the syringes were cultured according to standard microbiological techniques. Result: Eighteen diabetic patients at Jimma University Hospital participated. A total of 83.3% of participants reused a single injection syringe for up to 30 consecutive injections, while 16.7% reused one for more than 30 injections. Our results showed that 22.2% of syringes were contaminated with methicillin-resistant Staphylococcus aureus. Conclusion: We conclude that syringe reuse is associated with microbial contamination. The finding that 4 of 18 syringes were contaminated with bacteria is alarming. A mechanism should be designed for patients to obtain injection syringes at an affordable price. If reuse is unavoidable, reducing the number of injections per syringe and avoiding touching the needle with the hand or other non-sterile material may reduce the risk of contamination.

Keywords: diabetes mellitus, Ethiopia, subcutaneous insulin injection, syringe reuse

Procedia PDF Downloads 377
25822 An Approach to Practical Determination of Fair Premium Rates in Crop Hail Insurance Using Short-Term Insurance Data

Authors: Necati Içer

Abstract:

Crop-hail insurance plays a vital role in managing risks and reducing the financial consequences of hail damage to crop production. Predicting insurance premium rates from short-term data is a major difficulty in numerous nations because of the unique characteristics of hailstorms. This study suggests a feasible approach to establishing equitable premium rates in crop-hail insurance for nations with only short-term insurance data. The primary goal of the rate-making process is to determine premium rates for villages with high and with zero loss costs and to enhance the credibility of those rates. To do this, a technique was created using the author's practical knowledge of crop-hail insurance. With this approach, the rate-making method was developed using a range of temporal and spatial factor combinations with both hypothetical and real data, including extreme cases. This article shows how to incorporate temporal and spatial elements into the determination of fair premium rates using short-term insurance data, and it ends with a suggestion on the ultimate premium rates for insurance contracts.
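The credibility problem for villages with extreme (high or zero) loss costs can be illustrated with a standard actuarial credibility weighting: blend a village's own short, volatile experience with the regional average. The formula and the constant k below are generic Bühlmann-style assumptions, not necessarily the author's actual method.

```python
# Credibility-weighted premium rate: a village's own loss cost is blended with
# the regional loss cost, with weight growing as more years of data accrue.
# The constant k (how quickly credibility accrues) is an assumed parameter.

def credibility_rate(village_loss_cost, years_of_data, regional_loss_cost, k=10):
    z = years_of_data / (years_of_data + k)   # credibility factor, in [0, 1)
    return z * village_loss_cost + (1 - z) * regional_loss_cost

# A village with only 5 years of data and zero observed losses is not given a
# zero premium; its rate is pulled toward the regional rate.
rate = credibility_rate(village_loss_cost=0.0, years_of_data=5, regional_loss_cost=4.0)
print(rate)  # → (10/15) * 4.0 ≈ 2.67
```

The same weighting caps the rate for a village with one extreme hail year, which is the behaviour the abstract describes for "high and zero loss costs" under short-term data.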

Keywords: crop-hail insurance, premium rate, short-term insurance data, spatial and temporal parameters

Procedia PDF Downloads 42
25821 Electricity Generation from Renewables and Targets: An Application of Multivariate Statistical Techniques

Authors: Filiz Ersoz, Taner Ersoz, Tugrul Bayraktar

Abstract:

Renewable energy is often referred to as "clean energy", and a common argument in popular support of renewable energy (RE) is that it provides electricity with zero carbon dioxide emissions. This study provides useful insight into RE in the European Union (EU), especially into electricity generation from renewables and the related targets. The objective of this study is to identify groups of European countries using multivariate statistical analysis and selected indicators. The hierarchical clustering method is used to decide on the number of clusters for the EU countries. The statistical hierarchical cluster analysis, based on Ward’s clustering method and squared Euclidean distances, identified eight distinct clusters of European countries. Then, the non-hierarchical (k-means) clustering method was applied. Discriminant analysis was used to check the validity of the results, with the data normalized by Z-score transformation. To explore the relationships among the selected indicators, correlation coefficients were computed. The results of the study reveal the current situation of RE in the European Union Member States.
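The normalization and k-means steps can be sketched in miniature. The country labels and RE shares below are invented, and the sketch is one-dimensional with k=2, whereas the study clusters on multiple indicators with k decided by the hierarchical step.

```python
import statistics

# Hypothetical shares of electricity from renewables (%), illustrative only.
shares = {"A": 12.0, "B": 15.0, "C": 14.0, "D": 55.0, "E": 60.0, "F": 58.0}

# Z-score normalization, as applied before the discriminant analysis.
mean = statistics.fmean(shares.values())
sd = statistics.stdev(shares.values())
z = {c: (v - mean) / sd for c, v in shares.items()}

def kmeans_1d(values, centers, iters=20):
    """Minimal 1-D k-means: assign to nearest center, then recompute centers."""
    for _ in range(iters):
        groups = [[] for _ in centers]
        for v in values:
            nearest = min(range(len(centers)), key=lambda i: abs(v - centers[i]))
            groups[nearest].append(v)
        centers = [statistics.fmean(g) if g else centers[i]
                   for i, g in enumerate(groups)]
    return centers

centers = kmeans_1d(list(z.values()), [min(z.values()), max(z.values())])
clusters = {c: min(range(2), key=lambda i: abs(v - centers[i])) for c, v in z.items()}
print(clusters)  # low-share and high-share countries fall into separate clusters
```

Standardizing first matters: without it, indicators on large scales (e.g. absolute generation in GWh) would dominate the Euclidean distances used by both Ward's method and k-means.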

Keywords: share of electricity generation, k-means clustering, discriminant analysis, CO2 emissions

Procedia PDF Downloads 409
25820 Awareness in the Code of Ethics for Nurse Educators among Nurse Educators, Nursing Students and Professional Nurses at the Royal Thai Army, Thailand

Authors: Wallapa Boonrod

Abstract:

The Thai National Education Act of 1999 required all educational institutions to receive an external quality evaluation at least once every five years. The purpose of this study was to compare the awareness of the code of ethics for nurse educators among nurse educators, professional nurses, and nursing students under the Royal Thai Army Nursing College. The sample, selected by stratified random sampling, consisted of 51 nurse educators, 200 nursing students, and 340 professional nurses from the Army nursing college and hospital. The descriptive statistics indicated that the nurse educators, nursing students, and professional nurses had different levels of awareness of the 9 roles of nurse educators: Nurse, Reliable Sacrifice, Intelligence, Giver, Nursing Skills, Teaching Responsibility, Unbiased Care, Tie to Organization, and Role Model. The code of ethics for nurse educators (CENE) measurement models derived from the awareness of nurse educators, professional nurses, and nursing students fitted the empirical data well. The CENE models across the three groups were invariant in form but varied in factor loadings. Thai Army nurse educators strive to create a learning environment that nurtures the highest nursing potential and standards in their nursing students.

Keywords: awareness of the code of ethics for nurse educators, nursing college and hospital under The Royal Thai Army, Thai Army nurse educators, professional nurses

Procedia PDF Downloads 445
25819 The Teacher’s Role in Generating and Maintaining the Motivation of Adult Learners of English: A Mixed Methods Study in Hungarian Corporate Contexts

Authors: Csaba Kalman

Abstract:

In spite of the existence of numerous second language (L2) motivation theories, the teacher’s role in motivating learners has remained an under-researched niche to this day. If we narrow our focus to the teacher’s role in motivating adult learners of English in an English as a Foreign Language (EFL) context in corporate environments, empirical research is practically non-existent. This study fills the above research niche by exploring the most motivating aspects of the teacher’s personality, behaviour, and teaching practices that affect adult learners’ L2 motivation in corporate contexts in Hungary. The study was conducted across a wide range of industries in 18 organisations that employ over 250 people in Hungary. In order to triangulate the research, 21 human resources managers, 18 language teachers, and 466 adult learners of English participated in interview studies and in quantitative questionnaire studies that measured ten scales related to the teacher’s role, as well as two criterion measure scales of intrinsic and extrinsic motivation. The qualitative data were analysed using a template organising style, while descriptive and inferential statistics, as well as multivariate statistical techniques such as correlation and regression analyses, were used for analysing the quantitative data. The results showed that certain aspects of the teacher’s personality (thoroughness, enthusiasm, credibility, and flexibility), as well as preparedness, incorporating English for Specific Purposes (ESP) in the syllabus, and focusing on the present, proved to be the most salient aspects of the teacher’s motivating influence.
The regression analyses conducted with the criterion measure scales revealed that 22% of the variance in learners’ intrinsic motivation could be explained by the teacher’s preparedness and appearance, and 23% of the variance in learners’ extrinsic motivation could be attributed to the teacher’s personal branding and incorporating ESP in the syllabus. The findings confirm the pivotal role teachers play in motivating L2 learners independent of the context they teach in; and, at the same time, call for further research so that we can better conceptualise the motivating influence of L2 teachers.

Keywords: adult learners, corporate contexts, motivation, teacher’s role

Procedia PDF Downloads 97