Search results for: component prediction
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 4660

2440 Time and Cost Efficiency Analysis of Quick Die Change System on Metal Stamping Industry

Authors: Rudi Kurniawan Arief

Abstract:

Manufacturing cost and setup time are key improvement targets in the metal stamping industry: material and component prices keep rising while customers demand year-on-year reductions in component price. The Single Minute Exchange of Die (SMED) methodology is one of several approaches to reducing waste in the stamping industry, and the Japanese Quick Die Change (QDC) die system is one SMED implementation that can reduce both setup time and manufacturing cost. However, this system is rarely used in stamping plants. This paper analyzes how much the QDC die system can reduce setup time and manufacturing cost. The research was conducted by direct observation and by simulating and comparing the QDC die system with a conventional die system. We found that the QDC die system can save up to 35% of manufacturing cost and reduce setup time by 70%. The simulation showed that the QDC die system is effective for cost reduction but must be applied across several parallel production processes.

Keywords: press die, metal stamping, QDC system, single minute exchange die, manufacturing cost saving, SMED

Procedia PDF Downloads 166
2439 A Particle Filter-Based Data Assimilation Method for Discrete Event Simulation

Authors: Zhi Zhu, Boquan Zhang, Tian Jing, Jingjing Li, Tao Wang

Abstract:

Data assimilation is a hybrid model- and data-driven method that dynamically fuses new observations with a numerical model to iteratively approach the real system state. It is widely used for state prediction and parameter inference in continuous systems. Because discrete event systems are non-linear and non-Gaussian, the traditional Kalman filter, which rests on linearity and Gaussian assumptions, cannot perform data assimilation for such systems, so the particle filter has gradually become the technical approach of choice for discrete event simulation data assimilation. We therefore propose a particle filter-based data assimilation method for discrete event simulation and take an unmanned aerial vehicle (UAV) maintenance service system as a proof of concept for simulation experiments. The experimental results show that the filtered state data are closer to the real state of the system, which verifies the effectiveness of the proposed method. This research can provide a reference framework for the data assimilation process of other complex nonlinear systems, such as discrete-time and agent-based simulations.
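The predict-reweight-resample cycle that the abstract relies on can be sketched in a few lines. The toy scalar system below is an illustrative assumption, not the authors' UAV maintenance model:

```python
import numpy as np

rng = np.random.default_rng(0)

def particle_filter_step(particles, weights, transition, likelihood, observation):
    """One predict-update-resample cycle of a bootstrap particle filter."""
    # Predict: propagate each particle through the (stochastic) system model.
    particles = transition(particles)
    # Update: reweight particles by how well they explain the observation.
    weights = weights * likelihood(observation, particles)
    weights /= weights.sum()
    # Resample when the effective sample size collapses.
    ess = 1.0 / np.sum(weights ** 2)
    if ess < 0.5 * len(particles):
        idx = rng.choice(len(particles), size=len(particles), p=weights)
        particles = particles[idx]
        weights = np.full(len(particles), 1.0 / len(particles))
    return particles, weights

# Toy example: scalar random-walk state observed with Gaussian noise.
particles = rng.normal(0.0, 1.0, size=1000)
weights = np.full(1000, 1e-3)
transition = lambda p: p + rng.normal(0.0, 0.1, size=p.shape)
likelihood = lambda y, p: np.exp(-0.5 * ((y - p) / 0.2) ** 2)
for y in [0.1, 0.2, 0.3]:
    particles, weights = particle_filter_step(particles, weights, transition, likelihood, y)
estimate = np.sum(weights * particles)  # filtered state estimate
```

Because the update step works on weighted samples rather than Gaussian moments, the same loop applies unchanged to non-linear, non-Gaussian discrete event models.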

Keywords: discrete event simulation, data assimilation, particle filter, model and data-driven

Procedia PDF Downloads 0
2438 Forecasting the Sea Level Change in the Strait of Hormuz

Authors: Hamid Goharnejad, Amir Hossein Eghbali

Abstract:

Recent investigations have demonstrated global sea level rise due to climate change. This study examines the effects of rising water levels in the Strait of Hormuz; the probable sea level rise must be quantified before adaptation strategies can be employed. Climatic output data of a General Circulation Model (GCM) named CGCM3 under the climate change scenarios A1B and A2 were used. Among the variables simulated by this model, those with maximum correlation with sea level change in the study region and least redundancy among themselves were selected for sea level rise prediction using stepwise regression. A Discrete Wavelet artificial Neural Network (DWNN) model was developed to explore the relationship between climatic variables and sea level change: wavelets disaggregate the input and output time series into components, and an ANN then relates the disaggregated components of the predictors and predictands to each other. The results for the Shahid Rajaee Station show a sea level rise of 64 to 75 cm under scenario A1B and of 90 to 105 cm under scenario A2. These results indicate a significant increase in sea level at the study region under climate change, which should be incorporated into coastal area management.
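The screening step, selecting predictors with maximum correlation to the target and least redundancy among themselves, can be illustrated with a greedy stand-in for stepwise regression. The synthetic data and the 0.8 redundancy ceiling below are assumptions for the sketch, not values from the study:

```python
import numpy as np

def select_predictors(X, y, k, redundancy_limit=0.8):
    """Greedy forward selection: pick up to k columns of X that correlate
    most with y while staying below a pairwise-correlation ceiling with
    the columns already chosen (a simple proxy for 'least redundancy')."""
    n_vars = X.shape[1]
    target_corr = np.array([abs(np.corrcoef(X[:, j], y)[0, 1]) for j in range(n_vars)])
    selected = []
    for j in np.argsort(-target_corr):          # strongest predictors first
        if all(abs(np.corrcoef(X[:, j], X[:, s])[0, 1]) < redundancy_limit
               for s in selected):
            selected.append(j)
        if len(selected) == k:
            break
    return selected

# Toy data: column 0 drives y, column 1 is a near-duplicate of 0, column 2 is noise.
rng = np.random.default_rng(1)
x0 = rng.normal(size=200)
X = np.column_stack([x0, x0 + rng.normal(scale=0.05, size=200), rng.normal(size=200)])
y = 2.0 * x0 + rng.normal(scale=0.1, size=200)
chosen = select_predictors(X, y, k=2)  # the near-duplicate column should be skipped
```

True stepwise regression adds and removes variables based on fit statistics rather than raw correlations, but the redundancy-versus-relevance trade-off is the same.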

Keywords: climate change scenarios, sea-level rise, strait of Hormuz, forecasting

Procedia PDF Downloads 267
2437 Identification of Landslide Features Using Back-Propagation Neural Network on LiDAR Digital Elevation Model

Authors: Chia-Hao Chang, Geng-Gui Wang, Jee-Cheng Wu

Abstract:

Predicting a landslide is difficult because it requires a detailed study of past activity using a complete range of investigative methods to determine the changing conditions. In this research, a LiDAR digital elevation model (DEM) with 1-meter by 1-meter resolution was first used to generate six environmental landslide factors. A back-propagation neural network (BPNN) was then adopted to identify scarps, landslide areas, and non-landslide areas. The BPNN takes the six environmental factors as its input layer and has one output layer; six landslide areas were used for training and four for testing. Configurations with one and two hidden layers were tried, with 4, 5, 6, 7, or 8 hidden neurons and learning rates of 0.01, 0.1, and 0.5. With one hidden layer of 7 neurons and a learning rate of 0.5, the training root mean square error was 0.001388. Finally, evaluation of the BPNN classification accuracy with a confusion matrix shows an overall accuracy of 94.4% and a Kappa value of 0.7464.
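The best configuration reported (6 inputs, one hidden layer of 7 neurons, 1 output, learning rate 0.5) can be written out as a minimal back-propagation loop. The training data here is synthetic, standing in for the LiDAR-derived factors:

```python
import numpy as np

rng = np.random.default_rng(0)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

# Synthetic stand-in for the six environmental factors and a toy label.
X = rng.normal(size=(200, 6))
y = (X[:, 0] + X[:, 1] > 0).astype(float)[:, None]

W1 = rng.normal(scale=0.5, size=(6, 7)); b1 = np.zeros(7)   # 6 -> 7 hidden
W2 = rng.normal(scale=0.5, size=(7, 1)); b2 = np.zeros(1)   # 7 -> 1 output
lr = 0.5                                                    # learning rate

for _ in range(5000):
    h = sigmoid(X @ W1 + b1)             # forward pass, hidden layer
    out = sigmoid(h @ W2 + b2)           # forward pass, output layer
    d_out = (out - y) * out * (1 - out)  # output delta for MSE loss
    d_h = (d_out @ W2.T) * h * (1 - h)   # back-propagated hidden delta
    W2 -= lr * h.T @ d_out / len(X); b2 -= lr * d_out.mean(axis=0)
    W1 -= lr * X.T @ d_h / len(X);   b1 -= lr * d_h.mean(axis=0)

accuracy = float(((out > 0.5) == (y > 0.5)).mean())
```

The same loop extends to more output classes (e.g., scarp / landslide / non-landslide) by widening the output layer.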

Keywords: digital elevation model, DEM, environmental factors, back-propagation neural network, BPNN, LiDAR

Procedia PDF Downloads 137
2436 An Approach to Make an Adaptive Immunoassay to Detect an Unknown Disease

Authors: Josselyn Mata Calidonio, Arianna I. Maddox, Kimberly Hamad-Schifferli

Abstract:

Rapid diagnostics are critical infectious disease tools designed to detect a known biomarker using antibodies specific to that biomarker. However, detecting unknown viruses has not yet been achieved in a paper test format. We describe here a route to an adaptable paper immunoassay that can detect an unknown biomarker, demonstrated on SARS-CoV-2 variants. The immunoassay repurposes cross-reactive antibodies raised against the alpha variant. Gold nanoparticles of two different colors conjugated to two different antibodies create a colorimetric signal, and machine learning on the resulting colorimetric patterns trains the assay to discriminate between the alpha and Omicron BA.5 variants. Using principal component analysis, the colorimetric test patterns can pick up and discriminate a variant the assay has not encountered before, Omicron BA.1. The test has an accuracy of 100% and a calculated potential discriminatory power of 900. We show that the assay can be used adaptively to pick up emerging variants without the need to raise new antibodies.
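The principal component analysis step, projecting each test's colorimetric pattern into a low-dimensional space where known variants form clusters and an unseen variant falls elsewhere, can be sketched as follows. The three-feature patterns and cluster centers are invented for illustration:

```python
import numpy as np

def pca_fit_transform(X, n_components=2):
    """Project feature vectors onto their top principal components;
    points from an unseen class should fall outside the clusters
    formed by the known classes."""
    Xc = X - X.mean(axis=0)                  # center each feature
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T          # scores in PC space

# Toy colorimetric patterns: two known variants plus an 'unknown' cluster.
rng = np.random.default_rng(0)
alpha = rng.normal([1.0, 0.2, 0.1], 0.05, size=(20, 3))
ba5 = rng.normal([0.2, 1.0, 0.1], 0.05, size=(20, 3))
unknown = rng.normal([0.6, 0.6, 0.9], 0.05, size=(20, 3))
scores = pca_fit_transform(np.vstack([alpha, ba5, unknown]))
```

In the PC space, distance of a new sample from the known-variant centroids is one simple way to flag it as "not previously encountered."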

Keywords: adaptive immunoassay, detecting unknown viruses, gold nanoparticles, paper immunoassay, repurposing antibodies

Procedia PDF Downloads 108
2435 The Effect of Gibberellic Acid on Gamma-Aminobutyric Acid (GABA) Metabolism in Phaseolus Vulgaris L. Plant Exposed to Drought and Salt Stresses

Authors: Fazilet Özlem Çekiç, Seyda Yılmaz

Abstract:

Salinity and drought are major environmental problems worldwide and adversely affect plant metabolism. Gamma-aminobutyric acid (GABA), a four-carbon non-protein amino acid, is a significant component of the free amino acid pool and is widely distributed in prokaryotic and eukaryotic organisms. Environmental stress increases GABA accumulation in plants. Our aim was to evaluate the effect of gibberellic acid (GA) on the GABA metabolic system under drought and salt stress in Phaseolus vulgaris L. plants. GABA content, glutamate dehydrogenase (GDH) activity, chlorophyll, and lipid peroxidation (MDA) were analyzed. Our results suggest that GA plays a role in GABA metabolism during salt and drought stress in bean plants, and that the GABA shunt is an important metabolic and signaling pathway for adaptation to drought and salt stress.

Keywords: gibberellic acid, GABA, Phaseolus vulgaris L., salinity, drought

Procedia PDF Downloads 418
2434 Progression Rate, Prevalence, Incidence of Black Band Disease on Stony Coral (Scleractinia) in Barranglompo Island, South Sulawesi

Authors: Baso Hamdani, Arniati Massinai, Jamaluddin Jompa

Abstract:

Coral diseases are among the factors contributing to reef degradation. This research analyzed the progression rate, incidence, and prevalence of Black Band Disease (BBD) on the stony coral Pachyseris sp. in relation to environmental parameters (pH, nitrate, phosphate, dissolved organic matter (DOM), and turbidity). Disease incidence was measured weekly for six weeks using the belt transect method, the progression rate of BBD was measured manually, and prevalence and incidence were calculated for each infected colony. The relationship between the environmental parameters and the progression rate, prevalence, and incidence of BBD was analyzed by principal component analysis (PCA). The results show an average progression rate of 0.07 ± 0.02 cm/day. BBD prevalence increased from 0.92% to 19.73% over seven weeks of observation, with an average of 0.2 to 0.65 newly infected colonies per day. The most influential environmental factors were pH, nitrate, phosphate, DOM, and turbidity.

Keywords: progression rate, incidence, prevalence, Black Band Disease, Barranglompo

Procedia PDF Downloads 643
2433 Machine Learning in Momentum Strategies

Authors: Yi-Min Lan, Hung-Wen Cheng, Hsuan-Ling Chang, Jou-Ping Yu

Abstract:

The study applies machine learning models to construct momentum strategies and uses the information coefficient as an indicator for selecting stocks with strong and weak momentum characteristics. Through this approach, the study builds investment portfolios capable of generating superior returns and conducts a thorough analysis. Compared to existing research on momentum strategies, machine learning is incorporated to capture non-linear interactions, enhancing a conventional stock selection process often impeded by problems of timeliness, accuracy, and efficiency arising from market risk factors. The study finds that bidirectional momentum strategies outperform unidirectional ones and that momentum factors with longer observation periods exhibit stronger correlations with returns. Optimizing the number of stocks in the portfolio, while staying within a certain threshold, yields the highest excess returns. The study presents a novel framework for momentum strategies that improves the operational aspects of asset management, demonstrating the effectiveness of applying financial technology to traditional investment strategies.
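The information coefficient used for stock selection is conventionally computed as the rank (Spearman) correlation between a factor's cross-sectional scores and next-period returns. A minimal sketch, with an invented cross-section of 100 stocks whose returns partly follow the factor:

```python
import numpy as np

def information_coefficient(factor, forward_returns):
    """Information coefficient: Spearman rank correlation between a
    factor's cross-sectional scores and the subsequent returns."""
    rank = lambda a: np.argsort(np.argsort(a)).astype(float)  # 0..n-1 ranks
    return np.corrcoef(rank(factor), rank(forward_returns))[0, 1]

# Toy cross-section: 100 stocks, returns = 0.5 * factor + noise.
rng = np.random.default_rng(0)
factor = rng.normal(size=100)
returns = 0.5 * factor + rng.normal(scale=1.0, size=100)
ic = information_coefficient(factor, returns)
```

Rank correlation is preferred over Pearson here because portfolio construction cares about the ordering of stocks, not the scale of the factor.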

Keywords: information coefficient, machine learning, momentum, portfolio, return prediction

Procedia PDF Downloads 51
2432 Gendered Water Insecurity: A Structural Equation Approach for Female-Headed Households in South Africa

Authors: Saul Ngarava, Leocadia Zhou, Nomakhaya Monde

Abstract:

Water crises have the fourth most significant societal impact after weapons of mass destruction, climate change, and extreme weather conditions, ahead of natural disasters. The intricacies between women and water are central to achieving the 2030 Sustainable Development Goals (SDGs). The majority of the 1.2 billion poor people worldwide, two-thirds of them women, mostly located in Sub-Saharan Africa (SSA) and South Asia, do not have access to safe and reliable sources of water. There are gendered differences in water security based on the division of labour associating women with water. Globally, women and girls are responsible for water collection in 80% of the households that have no water on their premises. Women spend 16 million hours a day collecting water, while men and children spend 6 million and 4 million hours per day, respectively, time foregone in the pursuit of other livelihood activities. Because of their proximity to and activities concerning water, women are vulnerable to water insecurity through exposure to water-borne diseases, fatigue from physically carrying water, and exposure to sexual and physical harassment, amongst others. Proximity to treated water and their wellbeing also affect their sensitivity and adaptive capacity to water insecurity; great distances, difficult terrain, and heavy lifting further expose women to these vulnerabilities. However, few studies have quantified the vulnerabilities and burdens on women, and those few have taken a phenomenological qualitative approach. Vulnerability studies have also been scarce in the water security realm, with most studies taking linear forms that quantify only exposure, sensitivity, or adaptive capacity in climate change contexts. The current study argues for including a water insecurity vulnerability assessment, especially for women, in research agendas as well as in policy interventions, monitoring, and evaluation.
The study sought to identify the pathways through which female-headed households are water insecure in South Africa, the 30th driest country in the world, by linking the drinking-water decision framework with the vulnerability framework. Secondary data collected during the 2016 General Household Survey (GHS) were utilised, with a sample of 5,928 female-headed households, and analysed with Principal Component Analysis and Structural Equation Modelling. The results show dynamic relationships between water characteristics and water treatment, as well as associations between water access and the wealth status of female-headed households, between water access and water treatment, and between wealth status and water treatment. The study concludes that there are dynamic relationships among the components of water insecurity (exposure, sensitivity, and adaptive capacity) for female-headed households in South Africa. It recommends a multi-pronged approach to tackling exposure, sensitivity, and adaptive capacity: capacitating and empowering women for wealth generation, improving access to water treatment equipment, and prioritising infrastructure that brings piped, safe water to female-headed households.

Keywords: gender, principal component analysis, structural equation modelling, vulnerability, water insecurity

Procedia PDF Downloads 117
2431 Spatial Integrity of Seismic Data for Oil and Gas Exploration

Authors: Afiq Juazer Rizal, Siti Zaleha Misnan, M. Zairi M. Yusof

Abstract:

Seismic data is the fundamental tool exploration companies use to identify potential hydrocarbons. However, the value of seismic trace data is undermined unless the geospatial component of the data is understood: deriving a proposed well location from data with positional ambiguity jeopardizes the business decision and the multi-million-dollar investment every oil and gas company wants to protect. A spatial integrity QC workflow has been introduced in PETRONAS to ensure positional errors within seismic data are recognized throughout the exploration lifecycle, from acquisition through processing to seismic interpretation. This includes, amongst other tests, verifying that the data is referenced to the appropriate coordinate reference system, validating the survey configuration, and checking geometry loading. Implementing the workflow improves the reliability and integrity of the subsurface geological models produced by geoscientists and provides important input to potential hazard assessment, where positional accuracy is crucial. The workflow's development is part of a larger geospatial integrity management effort, since nearly eighty percent of oil and gas data are location-dependent.

Keywords: oil and gas exploration, PETRONAS, seismic data, spatial integrity QC workflow

Procedia PDF Downloads 217
2430 Complex Decision Rules in Quality Assurance Processes for Quick Service Restaurant Industry: Human Factors Determining Acceptability

Authors: Brandon Takahashi, Marielle Hanley, Gerry Hanley

Abstract:

The large-scale quick-service restaurant industry is a complex business to manage optimally. With over 40 suppliers providing ingredients for food preparation and thousands of restaurants serving over 50 unique food offerings across a wide range of regions, the company must implement a quality assurance process. Businesses want to deliver quality food efficiently, reliably, and successfully at a cost the public is willing to pay, and they must ensure their food offerings are never unsafe to eat or of poor quality. A good reputation (and a profitable business) built over years can be gone in an instant if customers fall ill eating the food; poor quality also produces food waste, and the cost of corrective actions compounds the lost revenue. Product compliance evaluation assesses whether a supplier's ingredients comply with specifications on several attributes (physical, chemical, organoleptic) that the company tests to ensure a quality, safe-to-eat food reaches the consumer and delivers the same eating experience in all parts of the country. The technical component of the evaluation comprises the chemical and physical tests that produce numerical results relating to shelf life, food safety, and organoleptic qualities; the psychological component covers the organoleptic attributes, those perceived through the sense organs. The rubric for product compliance evaluation has four levels: (1) Ideal: meets or exceeds all technical (physical and chemical), organoleptic, and psychological specifications. (2) Deviation from ideal with no impact on quality: misses some technical and organoleptic/psychological specifications without affecting consumer quality, while meeting all food safety requirements. (3) Acceptable: misses some technical and organoleptic/psychological specifications, reducing consumer quality but not enough to lessen demand, while meeting all food safety requirements. (4) Unacceptable: fails food safety requirements regardless of other specifications, or meets all food safety requirements but the product quality leads consumers to reject the food offering. Sampling of products and consumer tastings within the distribution network is the second critical element of the quality assurance process and provides the data for the statistical analyses. Findings are not assessed against the rubric independently; for example, chemical data are used to support inferences about the sensory profiles of the ingredients, and because certain flavor profiles may be less apparent when mixed with other ingredients, specifications are weighed differentially in the acceptability decision. Quality assurance processes are essential to balancing quality and profitability: they make sure the food is safe and tastes good while identifying and remediating product quality issues before they reach the stores. Comprehensive quality assurance procedures implement human factors methodologies, and this report provides recommendations for the systemic application of quality assurance processes in quick-service restaurant operations. The case study reviews the complex decision rubric and evaluates the processes needed to achieve the right balance of cost, quality, and safety.

Keywords: decision making, food safety, organoleptics, product compliance, quality assurance

Procedia PDF Downloads 187
2429 Language Factor in the Formation of National and Cultural Identity of Kazakhstan

Authors: Andabayeva Dina, Avakova Raushangul, Kortabayeva Gulzhamal, Rakhymbay Bauyrzhan

Abstract:

This article gives an overview of the language situation and language planning in Kazakhstan. Statistical data are presented, along with a brief history of languages in Kazakhstan. Particular emphasis is placed on the national-cultural component of the Kazakh people, namely the impact of the specificity of the Kazakh language on ethnic identity. Language is one of the basic aspects of national identity, and the Republic of Kazakhstan has recently conducted purposeful work on language development. An optimal solution of language problems is a factor in harmonizing interethnic relations and in strengthening and consolidating the peoples and public consent; language development is one of the important directions of state policy in the Republic of Kazakhstan. The question of the state language, as part of national (civic) identification, plays a huge role in the successful integration of Kazakh society, and one can reasonably assume that a foundation of the new civic identity is knowledge of the Kazakh language by all citizens of Kazakhstan. The article analyzes the language situation in Kazakhstan in close connection with the peculiarities of cultural identity.

Keywords: Kazakhstan, mentality, language policy, ethnolinguistics, language planning, language personality

Procedia PDF Downloads 633
2428 Optimizing the Scanning Time with Radiation Prediction Using a Machine Learning Technique

Authors: Saeed Eskandari, Seyed Rasoul Mehdikhani

Abstract:

Radiation sources are used in many industries; gamma sources, for example, are used in medical imaging. Their radiation is harmful to humans and the environment, and because the sources cannot be seen by eye, detecting and locating them is very important. A portable robot was designed and built to reveal radiation sources: it can scan a site from 5 to 20 meters away and shows the location of the sources, according to the intensity of the radiation, on a two-dimensional digital image. The robot operates by measuring pixels individually. Increasing the measurement resolution yields a more accurate scan of the environment and detects more points, but it also greatly increases the scanning time. In this paper, we design a method to optimize this time: only a small number of important points in the environment are measured, and the remaining pixels are predicted by regression algorithms from machine learning. The method was evaluated by comparing the estimates against the actual values of all pixels, and the experiments were repeated with several other radiation sources. The results show that the values estimated by the regression method are very close to the real values.
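The measure-few, predict-many idea can be sketched with an ordinary least-squares surface fit. The Gaussian source field, 20x20 grid, and quadratic model below are assumptions for illustration, not the paper's regression algorithm:

```python
import numpy as np

def predict_unmeasured(coords_meas, values_meas, coords_all, degree=2):
    """Fit a 2-D polynomial surface to the sparsely measured pixels and
    estimate the radiation intensity at every remaining pixel."""
    def features(c):
        x, y = c[:, 0], c[:, 1]
        cols = [x**i * y**j for i in range(degree + 1)
                            for j in range(degree + 1 - i)]
        return np.column_stack(cols)
    beta, *_ = np.linalg.lstsq(features(coords_meas), values_meas, rcond=None)
    return features(coords_all) @ beta

# Toy field: one source near (5, 5) on a 20x20 grid, 10% of pixels measured.
rng = np.random.default_rng(0)
grid = np.array([(i, j) for i in range(20) for j in range(20)], dtype=float)
field = np.exp(-((grid[:, 0] - 5)**2 + (grid[:, 1] - 5)**2) / 20.0)
meas_idx = rng.choice(len(grid), size=40, replace=False)
est = predict_unmeasured(grid[meas_idx], field[meas_idx], grid)
```

Measuring 40 pixels instead of 400 cuts the scan time by an order of magnitude, at the cost of interpolation error away from the measured points.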

Keywords: regression, machine learning, scan radiation, robot

Procedia PDF Downloads 73
2427 Pavement Roughness Prediction Systems: A Bump Integrator Approach

Authors: Manish Pal, Rumi Sutradhar

Abstract:

Pavement surface unevenness plays a pivotal role in the roughness index (RI) of a road, which in turn affects riding comfort, the degree of protection offered to vehicle occupants from uneven elements in the road surface. A lower roughness index value therefore means better riding quality for road users. Roughness is generally defined as an expression of the irregularities in the pavement surface and can be measured with equipment such as MERLIN, the bump integrator, or a profilometer; among these, the bump integrator is simple to use and saves time on long road sections. A case study was conducted on low-volume roads in West District, Tripura, to determine the roughness index using a bump integrator at the standard speed of 32 km/h. Because it is difficult to maintain the standard speed throughout a road section, the speed of the bump integrator (BI) must sometimes be lowered or raised, and the roughness index values obtained at other speeds must then be converted to the standard speed of 32 km/h. This paper presents that conversion model: using SPSS (Statistical Package for the Social Sciences), a generalized equation is derived relating the RI value at the standard speed of 32 km/h to the RI values at other speeds.
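The conversion step amounts to regressing the standard-speed RI on the observed RI and the survey speed. The synthetic data, the power-law speed effect, and the linear model form below are illustrative assumptions; the paper derives its actual equation from field data with SPSS:

```python
import numpy as np

rng = np.random.default_rng(0)
speed = rng.uniform(20, 45, size=60)      # actual survey speeds, km/h
ri_32 = rng.uniform(2000, 6000, size=60)  # 'true' RI at 32 km/h, mm/km
# Assumed speed effect: observed RI grows with survey speed.
ri_obs = ri_32 * (speed / 32.0) ** 0.6 + rng.normal(scale=50, size=60)

# Model: RI_32 = a * RI_obs + b * speed + c, fit by ordinary least squares.
A = np.column_stack([ri_obs, speed, np.ones_like(speed)])
coef, *_ = np.linalg.lstsq(A, ri_32, rcond=None)
ri_converted = A @ coef  # RI values standardized to 32 km/h
```

Once fitted, the same coefficients convert any survey run to its 32 km/h equivalent without re-driving the section at the standard speed.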

Keywords: bump integrator, pavement distresses, roughness index, SPSS

Procedia PDF Downloads 245
2426 Challenges and Lessons of Mentoring Processes for Novice Principals: An Exploratory Case Study of Induction Programs in Chile

Authors: Carolina Cuéllar, Paz González

Abstract:

Research has shown that school leadership has a significant indirect effect on students' achievement. In Chile, evidence has also revealed that this impact is stronger in vulnerable schools. With the aim of strengthening school leadership, public policy has taken up the challenge of enhancing the capabilities of novice principals by implementing induction programs that include a mentoring component, entrusting delivery of these programs to universities. The international literature emphasizes the importance of mentoring or coaching models in the preparation of novice school leaders: building leadership capacity through partnership facilitates the cognitive and affective support required in the initial phase of the principal career, helps with role clarification and socialization in context, and stimulates reflective leadership practice, among other benefits. In Chile, mentoring is a recent phenomenon in the field of school leadership and is even newer in the preparation of principals who work in public schools. This study, funded by the Chilean Ministry of Education, explored the challenges and lessons arising from the design and implementation of the mentoring processes within the induction programs, according to the perceptions of the different actors involved: ministerial agents, university coordinators, mentors, and novice principals. The investigation used a qualitative design based on a study of three cases (three induction programs). The sources of information were 46 semi-structured interviews, conducted at two moments (at the beginning and end of mentoring), and the content analysis technique was employed. The data capture both the uniqueness of each case and the commonalities across cases. Five main challenges and lessons emerged in the design and implementation of mentoring within the induction programs for new principals of Chilean public schools.
They comprise the need to (i) develop a shared conceptual framework on mentoring among the institutions and actors involved, which helps align expectations for the mentoring component within the induction programs and assists in establishing a theory of action for mentoring relevant to the public school context; (ii) recognize, through actions and decisions at different levels, that the role of a mentor differs from the role of a principal, challenging the idea that an effective principal will always be an effective mentor; (iii) improve mentor selection and preparation processes through the definition of common guiding criteria, so that a mentor takes responsibility for developing the critical judgment of novice principals rather than merely assisting compliance with prescriptive practices and standards; (iv) generate common evaluative models with goals, instruments, and indicators consistent with the characteristics of mentoring processes, which helps assess expected results and impact; and (v) include the design of a mentoring structure as an outcome of the induction programs, which helps sustain mentoring within schools as a collective professional development practice. The results showcase interwoven elements that entail continuous negotiation at different levels. Acting on them will contribute to policy efforts aimed at professionalizing the leadership role in public schools.

Keywords: induction programs, mentoring, novice principals, school leadership preparation

Procedia PDF Downloads 123
2425 Neural Network Analysis Applied to Risk Prediction of Early Neonatal Death

Authors: Amanda R. R. Oliveira, Caio F. F. C. Cunha, Juan C. L. Junior, Amorim H. P. Junior

Abstract:

Child deaths are traumatic events that can most often be prevented. The technology for prevention and intervention in cases of infant death is available at low cost, with solid evidence of favorable results, yet coverage remains low. Weight is one of the main factors related to death in the neonatal period, so newborns of low birth weight are a population at high risk of death, especially in the early neonatal period. This paper describes the development of a neural network model to predict the mortality risk rating in the early neonatal period for low-birth-weight newborns, in order to identify the individuals in this population at increased risk of death. The neural network was trained with a set of newborn data obtained from the Brazilian health system. The resulting network showed a high success rate in identifying newborns with a high chance of death, which demonstrates the potential for using this tool, integrated with the health system, to direct specific actions toward improving the prognosis of newborns.

Keywords: low birth weight, neonatal death risk, neural network, newborn

Procedia PDF Downloads 443
2424 Separating Permanent and Induced Magnetic Signature: A Simple Approach

Authors: O. J. G. Somsen, G. P. M. Wagemakers

Abstract:

Magnetic signature detection provides sensitive detection of metal objects, especially in the natural environment. Our group is developing a tabletop setup for measuring the magnetic signatures of various small and model objects. A particular issue is the separation of permanent and induced magnetization: while the induced component depends only on the composition and shape of the object, the permanent component also depends on the magnetization history. Even with common deperming techniques, a significant permanent signature may remain, which confuses measurements of the induced component. We investigate a basic technique for separating the two. Measurements were made by moving the object along an aluminum rail while the three field components were recorded by a detector mounted near the center, first with the rail parallel to the Earth's magnetic field and then in the anti-parallel orientation. The reversal changes the sign of the induced, but not the permanent, magnetization, so the two can be separated. Our preliminary results on a small iron block show excellent reproducibility. A considerable permanent magnetization was indeed present, producing a complex asymmetric signature; after separation, a much more symmetric induced signature was obtained that can be studied in detail and compared with theoretical calculations.
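The sign-reversal separation reduces to simple arithmetic: the parallel run records permanent plus induced, the anti-parallel run records permanent minus induced, so the half-sum and half-difference recover each part. The field values below are invented for illustration:

```python
import numpy as np

def separate(signature_parallel, signature_antiparallel):
    """Split two rail recordings into permanent and induced parts:
    reversing the object relative to the Earth field flips the induced
    component but leaves the permanent component unchanged."""
    permanent = 0.5 * (signature_parallel + signature_antiparallel)
    induced = 0.5 * (signature_parallel - signature_antiparallel)
    return permanent, induced

# Toy recordings at two rail positions: permanent parts 30 and 5 (arbitrary
# field units), induced parts 12 and 2.
parallel = np.array([30.0 + 12.0, 5.0 + 2.0])
antiparallel = np.array([30.0 - 12.0, 5.0 - 2.0])
perm, ind = separate(parallel, antiparallel)
```

Applied pointwise along the rail (and per field component), the same two lines turn the asymmetric raw signature into separate permanent and induced profiles.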

Keywords: magnetic signature, data analysis, magnetization, deperming techniques

Procedia PDF Downloads 448
2423 Video-Based Psychoeducation for Caregivers of Persons with Schizophrenia

Authors: Jilu David

Abstract:

Background: Schizophrenia is one of the most misunderstood mental illnesses across the globe. Lack of understanding about mental illness often delays treatment, severely affects the person's functioning, and causes distress to the family. The study, Video-Based Psychoeducation for Caregivers of Persons with Schizophrenia, consisted of developing a psychoeducational video about schizophrenia covering its symptoms, causes, treatment, and the importance of family support. Methodology: A quasi-experimental pre-post design was used to assess the feasibility of the study, strengthened by qualitative analysis. The Knowledge About Schizophrenia Interview was used to assess the knowledge level of 10 participants before and after the screening of the video. Results: Themes of usefulness, length, content, educational component, format of the intervention, and language emerged from the qualitative analysis, and there was a statistically significant difference in participants' knowledge before and after the video screening. Conclusion: The statistical and qualitative analyses revealed that the video-based psychoeducation program was feasible and facilitated a general improvement in participants' knowledge.

Keywords: Schizophrenia, mental illness, psychoeducation, video-based psychoeducation, family support

Procedia PDF Downloads 127
2422 Prediction of Binding Free Energies for Dyes Removal Using Computational Chemistry

Authors: R. Chanajaree, D. Luanwiset, K. Pongpratea

Abstract:

Dye removal is an environmental concern because textile production has been growing with world population and industrialization. Adsorption, in which adsorbents capture dyes from wastewater, is a low-cost and effective removal technique. This work aims to develop effective adsorbents using a computational approach, which can predict the suitability of an adsorbent for a specific dye in terms of binding free energy. The computational approach is faster and cheaper than the experimental approach when screening for the best adsorbents. All starting structures of dyes and adsorbents are optimized by quantum-chemical calculation. The complexes between dyes and adsorbents are generated by molecular docking. The binding free energies obtained from docking are compared with binding free energies from experimental data; the calculated energies rank in the same order as the experimental results. In addition, this work also shows the probable orientations of the complexes. Two experimental groups of dye-adsorbent complexes were used. The first group comprises chitosan as the adsorbent and two dyes, reactive red (RR) and direct sun yellow (DY). The second group comprises poly(1,2-epoxy-3-phenoxy)propane (PEPP) as the adsorbent and two dyes, bromocresol green (BCG) and alizarin yellow (AY).
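The rank-agreement check between docking and experiment can be sketched in a few lines. The energies below are invented for illustration only, not the computed or experimental values of this work:

```python
import numpy as np

# Hypothetical binding free energies (kcal/mol); more negative = stronger binding.
complexes     = ["chitosan-RR", "chitosan-DY", "PEPP-BCG", "PEPP-AY"]
dg_docking    = np.array([-7.2, -6.1, -5.8, -4.9])
dg_experiment = np.array([-8.0, -6.5, -6.0, -5.2])

def ranks(x):
    """Rank values from strongest (most negative) to weakest binding."""
    order = np.argsort(x)
    r = np.empty_like(order)
    r[order] = np.arange(len(x))
    return r

# Identical rank vectors mean docking reproduces the experimental ordering.
same_order = np.array_equal(ranks(dg_docking), ranks(dg_experiment))
```

For larger screening sets, a rank correlation coefficient (e.g. Spearman's rho) would quantify partial agreement instead of this all-or-nothing comparison.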

Keywords: dyes removal, binding free energies, quantum calculation, docking

Procedia PDF Downloads 148
2421 Design and Construction of an Impulse Current Generator for Lightning Strike Experiments

Authors: Kamran Yousefpour, Mojtaba Rostaghi-Chalaki, Jason Warden, Chanyeop Park

Abstract:

There has been a rising trend in using impulse current generators to investigate the lightning strike protection of materials, including aluminum and composites, in structures such as wind turbine blades and aircraft bodies. The focus of this research is to present a new impulse current generator built in the High Voltage Lab at Mississippi State University. The generator is capable of producing components A and D of natural lightning discharges in accordance with the Society of Automotive Engineers (SAE) standard, which is widely used in the aerospace industry. The generator can supply lightning impulse energy up to 400 kJ and produce impulse currents with magnitudes greater than 200 kA. The electrical circuit and physical components of the improved impulse current generator are described, and several lightning strike waveforms with different amplitudes are presented for comparison with the standard waveform. The results of this study contribute to the fundamental understanding of the functionality of impulse current generators.

Keywords: impulse current generator, lightning, society of automotive engineers, capacitor

Procedia PDF Downloads 163
2420 A Medical Resource Forecasting Model for Emergency Room Patients with Acute Hepatitis

Authors: R. J. Kuo, W. C. Cheng, W. C. Lien, T. J. Yang

Abstract:

Taiwan is a hyperendemic area for the hepatitis B virus (HBV): the estimated number of HBsAg carriers in the general population over 20 years of age exceeds 3 million. A case record review was therefore conducted, covering January 2003 to June 2007, of all patients with a diagnosis of acute hepatitis admitted to the Emergency Department (ED) of a well-known teaching hospital. The cost of medical resource use is defined as the total medical fee. In this study, principal component analysis (PCA) is first employed to reduce the number of dimensions. Support vector regression (SVR) and an artificial neural network (ANN) are then used to develop the forecasting model. A total of 117 patients met the inclusion criteria; 61% of the cases were hepatitis B related. The computational results show that the proposed PCA-SVR model outperforms the other algorithms compared. In conclusion, the Child-Pugh score and echogram can both be used to predict the cost of medical resources for patients with acute hepatitis in the ED.
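The PCA-SVR idea (dimensionality reduction followed by kernel regression of the total fee) can be sketched with scikit-learn. The synthetic data, feature count, and hyperparameters below are illustrative assumptions, not the study's actual variables:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

rng = np.random.default_rng(0)

# Synthetic stand-in for the 117 patient records: 10 clinical features
# (e.g. Child-Pugh components, echogram findings) and a total medical fee.
X = rng.normal(size=(117, 10))
y = 2.0 * X[:, 0] - 1.5 * X[:, 3] + rng.normal(scale=0.1, size=117)

# Standardize, project onto the leading principal components, then regress.
model = make_pipeline(StandardScaler(), PCA(n_components=5), SVR(kernel="rbf", C=10.0))
model.fit(X[:100], y[:100])       # train on the first 100 cases
pred = model.predict(X[100:])     # predict fees for the held-out 17 cases
```

In the study itself, the number of retained components and the SVR hyperparameters would be chosen by cross-validation rather than fixed as here.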

Keywords: acute hepatitis, medical resource cost, artificial neural network, support vector regression

Procedia PDF Downloads 420
2419 Modeling by Application of the Nernst-Planck Equation and Film Theory for Predicting of Chromium Salts through Nanofiltration Membrane

Authors: Aimad Oulebsir, Toufik Chaabane, Sivasankar Venkatramann, Andre Darchen, Rachida Maachi

Abstract:

The objective of this study is to propose a model for predicting the transfer mechanism of trivalent ions through a nanofiltration (NF) membrane by introducing the concentration polarization phenomenon and to study its influence on salt retention. The model combines the Nernst-Planck equation with the equations of film theory and is characterized by two transfer parameters, the reflection coefficient σ and the solute permeability Ps, which are estimated numerically. The thickness of the boundary layer, δ, the solute concentration at the membrane surface, Cm, and the concentration profile in the polarization layer have also been estimated. The retentions of trivalent salts are calculated and compared with experimental results. A comparison between the results with and without concentration polarization is made, and the boundary-layer thickness on the feed side is given. Experimental and calculated results are shown to be in good agreement. The model is then successfully extended to experimental data reported in the literature.
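A transport model characterized by the same two parameters (σ, Ps) is commonly written in the Spiegler-Kedem form, with the film-theory correction converting real retention into observed retention. The sketch below assumes that form; the parameter values are illustrative, not this study's fitted values:

```python
import numpy as np

def real_retention(jv, sigma, ps):
    """Spiegler-Kedem intrinsic (real) retention at volume flux jv (m/s):
    R = sigma*(1-F)/(1-sigma*F) with F = exp(-jv*(1-sigma)/Ps)."""
    f = np.exp(-jv * (1.0 - sigma) / ps)
    return sigma * (1.0 - f) / (1.0 - sigma * f)

def observed_retention(jv, sigma, ps, k):
    """Film-theory correction with mass-transfer coefficient k = D/delta:
    ln((1-Robs)/Robs) = ln((1-Rreal)/Rreal) + jv/k."""
    r = real_retention(jv, sigma, ps)
    x = np.log((1.0 - r) / r) + jv / k
    return 1.0 / (1.0 + np.exp(x))

# Illustrative parameters for a trivalent chromium salt:
jv = 1.0e-5    # permeate flux, m/s
sigma = 0.95   # reflection coefficient
ps = 2.0e-6    # solute permeability, m/s
k = 3.0e-5     # boundary-layer mass-transfer coefficient, m/s

r_real = real_retention(jv, sigma, ps)
r_obs = observed_retention(jv, sigma, ps, k)
```

Because concentration polarization raises Cm above the bulk feed concentration, the observed retention is always below the real retention, which in turn stays below σ at finite flux.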

Keywords: nanofiltration, concentration polarisation, chromium salts, mass transfer

Procedia PDF Downloads 277
2418 A Numerical Study of the Tidal Currents in the Persian Gulf and Oman Sea

Authors: Fatemeh Sadat Sharifi, A. A. Bidokhti, M. Ezam, F. Ahmadi Givi

Abstract:

This study focuses on tidal oscillations and tidal current speeds in order to establish a general circulation pattern in these seas. The purpose of the analysis is to determine the amplitude and phase of several important tidal components. The Regional Ocean Modeling System (ROMS) was therefore employed to assess the correlation and accuracy of this pattern. Determining the tidal harmonic components allows us to predict the tide in this region; better tide prediction supports platform standardization, breakwater design, coastal construction, navigation, fisheries, port management, and tsunami research. The results show fair accuracy in the sea surface height (SSH) and reveal that tidal currents are highest in the Strait of Hormuz and in the narrow, shallow region near Kish Island. To investigate the flow patterns of the region, the results of a limited-area FVCOM model were utilized. Many features of the present-day view of ocean circulation have precedents in tidal and long-wave studies: tidal waves are categorized among the long waves, so studies of tidal currents indeed inform subsequent studies of sea and ocean circulation.
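Extracting the amplitude and phase of tidal constituents from a sea-level record reduces to a least-squares harmonic fit. A minimal sketch on synthetic data (the constituent set, record length, and amplitudes are illustrative, not the model output):

```python
import numpy as np

# Angular frequencies (rad/hour) of two principal semidiurnal constituents.
OMEGA = {"M2": 2 * np.pi / 12.42, "S2": 2 * np.pi / 12.00}

def harmonic_fit(t, h, omegas):
    """Least-squares fit of h(t) = a0 + sum_i [Ai*cos(wi*t) + Bi*sin(wi*t)];
    returns amplitude sqrt(Ai^2+Bi^2) and phase atan2(Bi, Ai) per constituent."""
    cols = [np.ones_like(t)]
    for w in omegas:
        cols += [np.cos(w * t), np.sin(w * t)]
    coef, *_ = np.linalg.lstsq(np.column_stack(cols), h, rcond=None)
    out = {}
    for i, w in enumerate(omegas):
        a, b = coef[1 + 2 * i], coef[2 + 2 * i]
        out[w] = (np.hypot(a, b), np.arctan2(b, a))
    return out

# Synthetic 30-day hourly record: M2 amplitude 0.8 m, S2 amplitude 0.3 m.
t = np.arange(0.0, 24 * 30, 1.0)
h = 0.8 * np.cos(OMEGA["M2"] * t - 0.5) + 0.3 * np.cos(OMEGA["S2"] * t - 1.2)
fit = harmonic_fit(t, h, list(OMEGA.values()))
```

The 30-day record is long enough to separate M2 from S2 (their beat period is about 14.8 days); shorter records would alias neighboring constituents into each other.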

Keywords: barotropic tide, FVCOM, numerical model, OTPS, ROMS

Procedia PDF Downloads 225
2417 Prediction of Trailing-Edge Noise under Adverse-Pressure Gradient Effect

Authors: Li Chen

Abstract:

For an aerofoil or hydrofoil in high-Reynolds-number flow, broadband noise is generated efficiently as turbulence convects over the trailing edge. This noise can be related to the surface pressure fluctuations, which can be predicted by either CFD or empirical models. In practice, however, the aerofoil or hydrofoil often operates at an angle of attack; the flow is then subjected to an adverse pressure gradient (APG), and flow separation may occur. This study assesses trailing-edge noise models for such flows. In the present work, the trailing-edge noise from a 2D airfoil at a 6-degree angle of attack is investigated. Under this condition the flow experiences a strong APG and separation occurs. The flow over the airfoil, with a chord of 300 mm corresponding to a Reynolds number of 4x10⁵, is simulated using RANS with the SST k-ω turbulence model. The predicted surface pressure fluctuations are compared with published experimental data and empirical models, showing good agreement with the experimental data. The effect of the APG on the trailing-edge noise is discussed, and the associated noise is calculated.

Keywords: aero-acoustics, adverse-pressure gradient, computational fluid dynamics, trailing-edge noise

Procedia PDF Downloads 333
2416 Impact of Tablet Based Learning on Continuous Assessment (ESPRIT Smart School Framework)

Authors: Mehdi Attia, Sana Ben Fadhel, Lamjed Bettaieb

Abstract:

Mobile technology has become part of our daily lives and assists learners, whatever their level and age, in their learning process through various mobile devices (laptops, tablets, etc.). This paper presents a new tablet-based learning framework. The solution has been developed and tested at ESPRIT (Ecole Supérieure Privée d'Ingénierie et de Technologies), a Tunisian school of engineering, and is named ESSF: Esprit Smart School Framework. In this work, the main features of the proposed solution are listed, particularly its impact on the learner evaluation process. Learner assessment has always been a critical component of the learning process, as it measures students' knowledge. However, traditional evaluation methods, in which the learner is evaluated once or twice a year, cannot reflect his or her real level. This is why a continuous assessment (CA) process becomes necessary. In this context, we have shown that ESSF offers many important features that enhance and facilitate the implementation of the CA process.

Keywords: continuous assessment, mobile learning, tablet based learning, smart school, ESSF

Procedia PDF Downloads 332
2415 Establishment of Standardized Bill of Material for Korean Urban Rail Transit System

Authors: J. E. Jung, J. M. Yang, J. W. Kim

Abstract:

The railway market across the world has been standardizing under Europe's globalization strategy. The Korean urban railway system, by contrast, is operated by 10 operators, each of which has established its own standards and independently manages its own BOM. When operators manage different BOMs, the lack of system compatibility prevents them from sharing information and hinders work linkage and efficiency. Europe launched a large-scale railway project in 1993, when the European Union came into effect. In particular, the recent standardization efforts of the EU-funded MODTRAIN project are similar to the approach of the urban rail system standardization research underway in Korea. This paper examines the BOMs of Korean urban rail transit operators and, by reviewing rail vehicle technologies and Europe's MODTRAIN project, proposes a standard BOM for the rail transit system in Korea. The standard BOM is structured down to the key device or module level, allowing vehicle manufacturers and component manufacturers to manage their lower-level BOMs and share them with each other and with operators.

Keywords: BOM, Korean rail, urban rail, standardization

Procedia PDF Downloads 311
2414 Digital Holographic Interferometric Microscopy for the Testing of Micro-Optics

Authors: Varun Kumar, Chandra Shakher

Abstract:

Micro-optical components such as microlenses and microlens arrays have numerous engineering and industrial applications: collimation of laser diodes, imaging devices for sensor systems (CCD/CMOS, document copiers, etc.), beam homogenization for high-power lasers, the critical element of the Shack-Hartmann sensor, and fiber-optic coupling and optical switching in communication technology. Micro-optical components have also become an alternative for applications where miniaturization and reduction of alignment and packaging costs are necessary. Compliance with high quality standards in the manufacturing of micro-optical components is a precondition for competitiveness in worldwide markets, so high demands are placed on quality assurance. For the quality assurance of these lenses, an economical measurement technique is needed. For reasons of cost and time, the technique should be fast, simple (for production reasons), and robust, with high resolution, and it should provide non-contact, non-invasive, full-field information about the shape of the micro-optical component under test. Interferometric techniques meet these requirements. Conventional techniques such as holographic interferometry or Mach-Zehnder interferometry are available for characterizing microlenses, but they require considerable experimental effort and are time consuming. Digital holography (DH) overcomes these problems. Digital holographic microscopy (DHM) allows one to extract both the amplitude and the phase of a wavefront transmitted through a transparent object (a microlens or microlens array) from a single recorded digital hologram by numerical methods, and the complex object wavefront can be reconstructed numerically at different depths.
Digital holography provides axial resolution in the nanometer range, while lateral resolution is limited by diffraction and the size of the sensor. In this paper, a Mach-Zehnder-based digital holographic interferometric microscope (DHIM) is used for testing transparent microlenses. The advantage of the DHIM is that distortions due to aberrations in the optical system are avoided by interferometric comparison of the reconstructed phase with and without the object (microlens array). In the experiment, a digital hologram is first recorded in the absence of the sample (microlens array) as a reference hologram; a second hologram is recorded in the presence of the microlens array, which induces a phase change in the transmitted laser light. The complex amplitude of the object wavefront in the presence and absence of the microlens array is reconstructed using the Fresnel reconstruction method, from which the phase of the object wave in each state is evaluated. The phase difference between the two states gives the optical path length change due to the shape of the microlens. Knowing the refractive indices of the microlens array material and of air, the surface profile of the microlens array is evaluated, and the sag and radius of curvature of the microlenses are reported. The measured sag agrees with the manufacturer's specification within experimental limits.
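The conversion from phase difference to surface profile can be sketched in two steps: optical path length to physical thickness, then a spherical-cap fit for the radius of curvature. The wavelength, refractive index, and phase values below are illustrative assumptions, not the paper's measurements:

```python
import numpy as np

def lens_thickness(delta_phi, wavelength, n_lens, n_medium=1.0):
    """Physical thickness from the unwrapped phase difference between the
    with-lens and without-lens reconstructions: t = lambda*dphi/(2*pi*(n1-n0))."""
    return wavelength * delta_phi / (2 * np.pi * (n_lens - n_medium))

def radius_of_curvature(sag, semi_aperture):
    """Spherical-cap relation: R = (r^2 + s^2) / (2*s)."""
    return (semi_aperture**2 + sag**2) / (2 * sag)

# Illustrative numbers: He-Ne wavelength, fused-silica-like refractive index.
wavelength = 632.8e-9     # m
n_lens = 1.46
delta_phi_peak = 20.0     # rad, peak unwrapped phase change at the lens center
sag = lens_thickness(delta_phi_peak, wavelength, n_lens)
roc = radius_of_curvature(sag, semi_aperture=50e-6)   # 100 um lens diameter
```

Applying `lens_thickness` pixel by pixel to the unwrapped phase map yields the full surface profile; the sag is its peak value over the lens aperture.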

Keywords: micro-optics, microlens array, phase map, digital holographic interferometric microscopy

Procedia PDF Downloads 495
2413 Facility Anomaly Detection with Gaussian Mixture Model

Authors: Sunghoon Park, Hank Kim, Jinwon An, Sungzoon Cho

Abstract:

The Internet of Things allows one to collect data from facilities, which are then used to monitor them and even predict malfunctions in advance. Conventional quality control methods focus on setting a normal range for a single sensor value, defined by a lower and an upper control limit, and declaring anything falling outside it an anomaly. However, interactions among sensor values are ignored, leading to suboptimal performance. We propose a multivariate approach that takes many sensor values into account at the same time. In particular, a Gaussian Mixture Model is used, trained to maximize the likelihood using the Expectation-Maximization algorithm. The number of Gaussian component distributions is determined by the Bayesian Information Criterion, and the negative log-likelihood is used as the anomaly score. The usage scenario is as follows: for each vector of sensor values from a facility, an anomaly score is computed; if it exceeds a threshold, an alarm goes off and a human expert intervenes to check the system. Real-world data from a building energy system were used to test the model.
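The scheme described (BIC model selection, negative log-likelihood as anomaly score, threshold alarm) can be sketched with scikit-learn. The synthetic sensor data and the 1% false-positive threshold are illustrative assumptions, not the building energy dataset:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(1)

# Synthetic multivariate sensor readings: two normal operating modes.
normal = np.vstack([
    rng.normal([0, 0, 0], 0.3, size=(500, 3)),
    rng.normal([2, 2, 2], 0.3, size=(500, 3)),
])

# Choose the number of Gaussian components by BIC (EM fitting happens in .fit).
models = [GaussianMixture(k, random_state=0).fit(normal) for k in range(1, 5)]
gmm = min(models, key=lambda m: m.bic(normal))

# Anomaly score = negative log-likelihood; alarm when it exceeds a threshold
# set so that ~1% of the known-normal data would trigger a false alarm.
threshold = -np.percentile(gmm.score_samples(normal), 1)

def anomaly_score(x):
    return -gmm.score_samples(np.atleast_2d(x))[0]
```

A reading near a normal mode (e.g. `[0.1, 0.0, 0.1]`) scores below the threshold, while one far from both modes (e.g. `[5.0, -3.0, 4.0]`) scores above it and would trip the alarm.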

Keywords: facility anomaly detection, gaussian mixture model, anomaly score, expectation maximization algorithm

Procedia PDF Downloads 267
2412 The Assessment of Some Biological Parameters With Dynamic Energy Budget of Mussels in Agadir Bay

Authors: Zahra Okba, Hassan El Ouizgani

Abstract:

Anticipating an individual's response to environmental factors allows for relevant ecological forecasts. The Dynamic Energy Budget (DEB) model facilitates such prediction: it links an organism's biology mechanistically to abiotic factors, but it is generally field-verified under relatively stable physical conditions. DEB theory is a robust framework that can link the individual state to environmental factors, and in our work we tested its ability to account for variability by examining model predictions in Agadir Bay, which has a semi-arid climate in which temperature is strongly influenced by the trade-wind front and nutritional availability varies. From previous work in our laboratory, we collected the DEB model parameters of the mussel Mytilus galloprovincialis in Agadir Bay. We formulated the equations that make up the DEB model mathematically and then fitted our analytical functions to the observed biological data for our local species. We first assumed constant immersion and then integrated the details of the tidal cycles to calculate metabolic depression at low tide. Our results are quite satisfactory for shell length and shape on the one hand and for the gonadosomatic index on the other.
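Under constant food and temperature (the constant-immersion case above), the standard DEB model's length dynamics reduce to von Bertalanffy growth. A minimal sketch with illustrative parameters, not our laboratory's fitted values:

```python
import numpy as np

def vb_length(t, L_inf, L0, rB):
    """Von Bertalanffy growth, the constant-environment limit of standard DEB:
    L(t) = Linf - (Linf - L0) * exp(-rB * t)."""
    return L_inf - (L_inf - L0) * np.exp(-rB * t)

# Illustrative parameters for Mytilus galloprovincialis shell length (cm):
# ultimate length Linf, initial length L0, von Bertalanffy rate rB (1/day).
t = np.linspace(0, 365 * 4, 200)                 # four years, in days
L = vb_length(t, L_inf=7.5, L0=0.1, rB=0.002)    # monotonic approach to Linf
```

Accounting for tidal emersion would modulate rB (and food intake) over each tidal cycle rather than leaving it constant as here.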

Keywords: dynamic energy budget, mussels, mytilus galloprovincialis, agadir bay, DEB model

Procedia PDF Downloads 108
2411 Case Study Analysis for Driver's Company in the Transport Sector with the Help of Data Mining

Authors: Diana Katherine Gonzalez Galindo, David Rolando Suarez Mora

Abstract:

In this study, we used data mining as an alternative way of evaluating customer comments in order to find patterns that help us identify behaviors that reduce the deactivation of partners of the LEVEL app. In one of the largest businesses created in recent times, partner drivers are being affected by an internal process that compensates the customer for a bad experience; however, these comments may be false accusations against the driver, which is why we conducted an investigation and collected information to restructure this process. Many partners have been deactivated through this internal process, and many of them dispute the comments given by the customer. The main methodology used in this case study is observation: we collected information in real time, which allowed us to see the most common issues and arrive at the most accurate solution. With this new process, supported by data mining, we can obtain a prediction based on customer behavior and basic data such as age and gender; this could also help us improve other processes in the future. This investigation gives the partner more opportunity to keep his account active even if a customer writes a complaint through the app. The aim is to avoid a decline in drivers in the future by offering process improvements, while establishing a strategy that benefits both the app's managers and the associated drivers.

Keywords: agent, driver, deactivation, rider

Procedia PDF Downloads 275