Search results for: stock movement prediction
3646 Additive Weibull Model Using Warranty Claim and Finite Element Analysis Fatigue Analysis
Authors: Kanchan Mondal, Dasharath Koulage, Dattatray Manerikar, Asmita Ghate
Abstract:
This paper presents an additive reliability model using warranty data and Finite Element Analysis (FEA) data. Warranty data for any product gives insight into its underlying issues and is often used by reliability engineers to build prediction models that forecast the failure rate of parts. However, there is one major limitation in using warranty data for prediction: warranty periods constitute only a small fraction of the total lifetime of a product, typically covering only the infant-mortality and useful-life zones of the bathtub curve. Predicting with warranty data alone therefore does not generally provide results with the desired accuracy. The failure rate of a mechanical part is driven by random issues initially and by wear-out or usage-related issues at later stages of its lifetime. For better predictability of the failure rate, one needs to explore the failure-rate behavior in the wear-out zone of the bathtub curve. Due to cost and time constraints, it is not always possible to test samples until failure, but FEA fatigue analysis can provide the failure-rate behavior of a part well beyond the warranty period, more quickly and at lower cost. In this work, the authors propose an Additive Weibull Model, which makes use of both warranty and FEA fatigue analysis data for predicting failure rates. It involves modeling two data sets for a part, one with existing warranty claims and the other with fatigue life data. Hazard-rate-based Weibull estimation is used for modeling the warranty data, whereas S-N-curve-based Weibull parameter estimation is used for the FEA data. The two separate Weibull models' parameters are estimated and combined to form the proposed Additive Weibull Model for prediction.
Keywords: bathtub curve, fatigue, FEA, reliability, warranty, Weibull
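The additive construction can be sketched in a few lines: the total hazard is simply the sum of two Weibull hazards, one fitted to warranty claims and one to FEA fatigue life. The parameter values below are hypothetical, purely for illustration; the paper estimates them from real warranty and S-N data.

```python
import math

def weibull_hazard(t, beta, eta):
    """Weibull hazard rate h(t) = (beta/eta) * (t/eta)**(beta-1)."""
    return (beta / eta) * (t / eta) ** (beta - 1)

def additive_hazard(t, warranty_params, fatigue_params):
    """Additive Weibull model: total hazard is the sum of the
    warranty-data hazard and the FEA-fatigue hazard."""
    return (weibull_hazard(t, *warranty_params)
            + weibull_hazard(t, *fatigue_params))

# Illustrative (hypothetical) parameters: warranty data dominated by
# early/random failures (shape near 1), fatigue data by wear-out (shape > 1).
warranty = (0.9, 5000.0)    # (shape, scale in hours)
fatigue = (3.5, 20000.0)

h_total = additive_hazard(10000.0, warranty, fatigue)
```

The wear-out component makes the combined hazard increase late in life, which a warranty-only Weibull fit would miss.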
Procedia PDF Downloads 73
3645 Evaluation of the CRISP-DM Business Understanding Step: An Approach for Assessing the Predictive Power of Regression versus Classification for the Quality Prediction of Hydraulic Test Results
Authors: Christian Neunzig, Simon Fahle, Jürgen Schulz, Matthias Möller, Bernd Kuhlenkötter
Abstract:
Digitalisation in production technology is a driver for the application of machine learning methods. Through predictive quality, the data-based prediction of product quality and states can exploit great potential for saving on necessary quality control. However, the serial use of machine learning applications is often prevented by various problems. Fluctuations occur in real production data sets, which are reflected in trends and systematic shifts over time. To counteract these problems, data preprocessing includes rule-based data cleaning, the application of dimensionality reduction techniques, and the identification of comparable data subsets to extract stable features. Successful process control of the target variables aims to centre the measured values around a mean and minimise variance. Competitive leaders claim to have mastered their processes; as a result, much of the real data has a relatively low variance. For the training of prediction models, the highest possible generalisability is required, which is made more difficult by this data availability. The implementation of a machine learning application can itself be interpreted as a production process. The CRoss Industry Standard Process for Data Mining (CRISP-DM) is a process model with six phases that describes the life cycle of a data science project. As in any process, the cost of eliminating errors increases significantly with each advancing process phase. For the quality prediction of hydraulic test steps of directional control valves, the question arises in the initial phase whether a regression or a classification is more suitable. In the context of this work, the initial phase of CRISP-DM, the business understanding, is critically compared for the use case at Bosch Rexroth with regard to regression and classification.
The use of cross-process production data along the value chain of hydraulic valves is a promising approach to predicting the quality characteristics of workpieces. Suitable methods for leakage volume flow regression and for classification of the inspection decision are applied. Impressively, classification is clearly superior to regression and achieves promising accuracies.
Keywords: classification, CRISP-DM, machine learning, predictive quality, regression
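The business-understanding question (regression on the continuous leakage volume flow versus classification of the pass/fail inspection decision) can be illustrated with a minimal sketch. The spec limit and all values below are assumptions for illustration, not Bosch Rexroth data:

```python
SPEC_LIMIT = 0.8  # hypothetical leakage volume flow limit (assumed units)

def to_class(leakage, limit=SPEC_LIMIT):
    """Derive the inspection decision from the continuous target."""
    return "fail" if leakage > limit else "pass"

def accuracy(y_true, y_pred):
    return sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)

measured = [0.2, 0.5, 0.9, 1.1, 0.4, 0.75, 0.85]
labels = [to_class(v) for v in measured]

# a hypothetical regression model's continuous predictions; regression
# only answers the inspection question after thresholding
predicted_flow = [0.25, 0.45, 0.95, 1.0, 0.5, 0.7, 0.6]
pred_labels = [to_class(v) for v in predicted_flow]

acc = accuracy(labels, pred_labels)
```

The sketch shows why the two framings are comparable at all: a regression can always be reduced to a classification via the spec limit, so the business-understanding phase can evaluate both against the same inspection decision.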
Procedia PDF Downloads 144
3644 COVID-19 Analysis with Deep Learning Model Using Chest X-Ray Images
Authors: Uma Maheshwari V., Rajanikanth Aluvalu, Kumar Gautam
Abstract:
The COVID-19 disease is a highly contagious viral infection with major worldwide health implications, and the global economy suffers as a result. The spread of this pandemic disease can be slowed if positive patients are found early. COVID-19 prediction is beneficial for identifying patients whose health problems put them at risk. Deep learning and machine learning algorithms for COVID prediction using X-rays have the potential to be extremely useful in addressing the scarcity of doctors and clinicians in remote places. In this paper, a convolutional neural network (CNN) with deep layers is presented for recognizing COVID-19 patients using real-world datasets. We gathered around 6000 X-ray scan images from various sources and split them into two categories: normal and COVID-impacted. Our model examines chest X-ray images to recognize such patients. Because X-rays are commonly available and affordable, our findings show that X-ray analysis is effective in COVID diagnosis. The predictions performed well, with an average accuracy of 99% on training images and 88% on X-ray test images.
Keywords: deep CNN, COVID-19 analysis, feature extraction, feature map, accuracy
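The feature maps at the heart of such a CNN come from convolutions. A minimal pure-Python sketch of one valid 2-D convolution follows; the kernel and the tiny synthetic "X-ray" patch are made-up values, not the paper's trained filters:

```python
def conv2d(image, kernel):
    """Valid 2-D convolution (cross-correlation) producing one feature map."""
    kh, kw = len(kernel), len(kernel[0])
    out_h = len(image) - kh + 1
    out_w = len(image[0]) - kw + 1
    out = []
    for i in range(out_h):
        row = []
        for j in range(out_w):
            s = sum(image[i + di][j + dj] * kernel[di][dj]
                    for di in range(kh) for dj in range(kw))
            row.append(s)
        out.append(row)
    return out

# a vertical-edge kernel applied to a tiny synthetic intensity patch
patch = [
    [0, 0, 9, 9],
    [0, 0, 9, 9],
    [0, 0, 9, 9],
    [0, 0, 9, 9],
]
edge_kernel = [
    [-1, 0, 1],
    [-1, 0, 1],
    [-1, 0, 1],
]
feature_map = conv2d(patch, edge_kernel)
```

A deep CNN stacks many such learned kernels, so each feature map responds to a different local pattern in the chest X-ray.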
Procedia PDF Downloads 80
3643 Characteristics of Elastic Tracked-Crawler Based on Worm-Rack Mechanism
Authors: Jun-ya Nagase
Abstract:
There are many pipes, such as water and gas pipes, in chemical plants and houses. Regular inspection of these pipes can prevent accidents. However, many pipes are very narrow, and it is difficult for people to inspect them directly. Therefore, the development of a robot that can move in narrow pipes is necessary. Wheel-movement robots, snake-like robots, and multi-leg robots are all described in the relevant literature as pipe inspection robots currently being studied. Among them, the tracked crawler robot can travel by traversing uneven ground flexibly, with a crawler belt attached firmly to the ground surface. Although conventional crawler robots have high efficiency and/or high ground-covering ability, they require a comparatively large space to move. In this study, a cylindrical crawler robot based on a worm-rack mechanism, which does not need a large space to move and which has high ground-covering ability, is proposed. Experiments have demonstrated smooth operation and forward movement of the robot upon application of voltage to the motor. In addition, performance tests show that it can propel itself in confined spaces. This paper reports the structure, drive mechanism, prototype, and experimental evaluation.
Keywords: tracked-crawler, pipe inspection robot, worm-rack mechanism, amoeba locomotion
Procedia PDF Downloads 431
3642 Enhancing Athlete Training using Real Time Pose Estimation with Neural Networks
Authors: Jeh Patel, Chandrahas Paidi, Ahmed Hambaba
Abstract:
Traditional methods for analyzing athlete movement often lack the detail and immediacy required for optimal training. This project aims to address this limitation by developing a real-time human pose estimation system specifically designed to enhance athlete training across various sports. The system leverages convolutional neural networks (CNNs) to provide a comprehensive and immediate analysis of an athlete's movement patterns during training sessions. The core architecture utilizes dilated convolutions to capture crucial long-range dependencies within video frames, combined with a robust encoder-decoder architecture to further refine pose estimation accuracy. This capability is essential for precise joint localization across the diverse range of athletic poses encountered in different sports. Furthermore, by quantifying movement efficiency, power output, and range of motion, the system provides data-driven insights that can be used to optimize training programs. Pose estimation data analysis can also be used to develop personalized training plans that target specific weaknesses identified in an athlete's movement patterns. To overcome the limitations posed by outdoor environments, the project employs strategies such as multi-camera configurations and depth-sensing techniques, which can enhance pose estimation accuracy in challenging lighting and occlusion scenarios. A dataset was collected from the Martin Luther King labs at San Jose State University. The system is evaluated through a series of tests that measure its efficiency and accuracy in real-world scenarios. Results indicate a high level of precision in recognizing different poses, substantiating the potential of this technology in practical applications.
Challenges such as enhancing the system's ability to operate in varied environmental conditions and further expanding the training dataset were identified and discussed. Future work will refine the model's adaptability and incorporate haptic feedback to enhance the interactivity and richness of the user experience. This project demonstrates the feasibility of an advanced pose detection model and lays the groundwork for future innovations in assistive enhancement technologies.
Keywords: computer vision, deep learning, human pose estimation, U-NET, CNN
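The value of dilated convolutions for long-range dependencies can be quantified by the receptive field of a stacked network. A small sketch, assuming stride-1 convolutions; the layer configurations below are illustrative, not the project's actual architecture:

```python
def receptive_field(layers):
    """Receptive field of stacked (kernel_size, dilation) conv layers,
    assuming stride 1: rf = 1 + sum((k - 1) * d)."""
    rf = 1
    for k, d in layers:
        rf += (k - 1) * d
    return rf

plain = [(3, 1)] * 4                        # four plain 3x3 layers
dilated = [(3, 1), (3, 2), (3, 4), (3, 8)]  # exponentially dilated stack
```

With the same depth and parameter count, the dilated stack sees a far wider context per output pixel, which is why it helps relate distant joints in a frame.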
Procedia PDF Downloads 56
3641 An Artificial Intelligence Supported QUAL2K Model for the Simulation of Various Physiochemical Parameters of Water
Authors: Mehvish Bilal, Navneet Singh, Jasir Mushtaq
Abstract:
Water pollution puts people's health at risk, and it can also impact the ecology. For practitioners of integrated water resources management (IWRM), water quality modelling may be useful for informing decisions about pollution control (such as discharge permitting) or demand management (such as abstraction permitting). Mathematical simulation, which relates pollutant sources, the movement of contaminants, and the resulting water quality, is regarded as one of the best tools for estimating the current pollutant load. The current study involves the QUAL2K model, which relies on manual simulation of the various physiochemical characteristics of water. To this end, various sensors could be installed for automatic simulation of these characteristics, and an artificial intelligence model has been proposed for the automatic simulation of water quality parameters. Water quality models have become an effective tool for identifying worldwide water contamination, as well as the ultimate fate and behavior of contaminants in the water environment. Water quality model research is primarily conducted in Europe and other industrialized countries, where theoretical underpinnings and practical research are prioritized.
Keywords: artificial intelligence, QUAL2K, simulation, physiochemical parameters
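QUAL2K-style water quality models build on first-order kinetics of the kind found in the classical Streeter-Phelps dissolved-oxygen sag relation. A minimal sketch of that relation; all rate constants and loads below are hypothetical illustration values, not calibrated QUAL2K parameters:

```python
import math

def do_deficit(t, kd, ka, L0, D0):
    """Streeter-Phelps dissolved-oxygen deficit D(t) downstream of a
    pollutant load: first-order BOD decay (kd) consumes oxygen while
    reaeration (ka) replenishes it."""
    if abs(ka - kd) < 1e-12:           # degenerate equal-rate case
        return (kd * t * L0 + D0) * math.exp(-kd * t)
    return (kd * L0 / (ka - kd)) * (math.exp(-kd * t) - math.exp(-ka * t)) \
           + D0 * math.exp(-ka * t)

# hypothetical values: kd=0.3/day, ka=0.6/day, BOD L0=10 mg/L, initial D0=1 mg/L
deficit_day1 = do_deficit(1.0, 0.3, 0.6, 10.0, 1.0)
```

The deficit first grows as BOD is oxidized and then recovers as reaeration dominates, the "sag" that river quality models simulate reach by reach.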
Procedia PDF Downloads 106
3640 Understanding Children’s Visual Attention to Personal Protective Equipment Using Eye-Tracking
Authors: Vanessa Cho, Janet Hsiao, Nigel King, Robert Anthonappa
Abstract:
Background: The personal protective equipment (PPE) requirements for health care workers (HCWs) have changed significantly during the COVID-19 pandemic. Aim: To ascertain, using eye-tracking technology, what children notice most when seeing HCWs in various PPE. Design: A Tobii Nano Pro eye-tracking camera tracked 156 children's visual attention while they viewed photographs of HCWs in various PPE. Eye movement analysis with hidden Markov models (EMHMM) was employed to analyse 624 recordings using two approaches, namely (i) data-driven, where children's fixations determined the regions of interest (ROIs), and (ii) fixed ROIs, where the investigators predefined the ROIs. Results: Two significant eye movement patterns, namely distributed (85.2%) and selective (14.7%), were identified (P<0.05). Most children fixated primarily on the face regardless of the PPE worn. Children fixated equally on all PPE images in the distributed pattern, while a strong preference for unmasked faces was evident in the selective pattern (P<0.01). Conclusion: Children as young as 2.5 years used a top-down visual search behaviour and demonstrated face-processing ability. Most children did not show a strong visual preference for a specific PPE, while a minority preferred PPE with distinct facial features, namely without masks and loupes.
Keywords: COVID-19, PPE, dentistry, pediatric
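EMHMM-style analysis scores how well a child's fixation sequence over ROIs fits a hidden Markov model. A toy forward-algorithm sketch follows; the two hidden states, the ROIs, and every probability below are invented for illustration, not fitted to the study's recordings:

```python
def forward_likelihood(obs, start, trans, emit):
    """Likelihood of an ROI fixation sequence under an HMM
    (forward algorithm)."""
    n = len(start)
    alpha = [start[s] * emit[s][obs[0]] for s in range(n)]
    for o in obs[1:]:
        alpha = [sum(alpha[s] * trans[s][s2] for s in range(n)) * emit[s2][o]
                 for s2 in range(n)]
    return sum(alpha)

# two hypothetical states ("face-focused", "distributed"); ROIs: 0=face, 1=mask
start = [0.6, 0.4]
trans = [[0.8, 0.2], [0.3, 0.7]]
emit = [[0.9, 0.1], [0.5, 0.5]]

p_face_seq = forward_likelihood([0, 0, 0], start, trans, emit)
p_mask_seq = forward_likelihood([1, 1, 1], start, trans, emit)
```

Comparing such likelihoods across candidate models is how fixation sequences get assigned to the distributed or selective pattern.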
Procedia PDF Downloads 90
3639 The Creation of Micromedia on Social Networking Sites as a Social Movement Strategy: The Case of Migration Aid, a Hungarian Refugee Relief Group
Authors: Zsofia Nagy, Tibor Dessewffy
Abstract:
The relationship between social movements and the media that represents them comprises both the media representation of movements and the media strategies employed by movements. A third possible approach is to connect the two and look at the interactions between them. This relationship has been affected by the emergence of social networking sites (SNSs), which have a transformative effect on both actors. However, the extent and direction of these changes need to be investigated. Empirical case studies that focus on newly enabled forms of social movements can contribute to these debates in an analytically fruitful way. Therefore, in our study, we use the case of Migration Aid, a Hungarian Facebook-based grassroots relief organization that gained prominence during the refugee crisis that unfolded in Hungary in 2015. Migration Aid formed without the use of traditional mobilizational agents and took over roles traditionally occupied by formal NGOs or the state. Analyzing different movement strategies towards the media, we find evidence that, while effectively combining these strategies, SNSs also create affordances for movements to shift their strategy towards creating alternatives: their own micromedia. Beyond the practical significance of this, the ability to disseminate alternative information independently of traditional media, it also allowed the group to frame the issue in its own terms and to replace vertical modes of communication with horizontal ones. The creation of micromedia also shifts the relationship between social movements and the media away from an asymmetrical and towards a more symbiotic co-existence. We provide four central factors that explain this shift: project identity, the mobilization potential of SNSs, the disruptiveness of the event, and selectivity in the construction of social knowledge.
Finally, we look at the specific processes that contribute to the creation of the movement’s own micromedia. We posit that these processes were made possible by the rhizomatic structure of the group and a function of SNSs we coin the Social Information Thermostat function. We conclude our study by positioning our findings in relation to the broader context.
Keywords: social networking sites, social movements, micromedia, media strategies
Procedia PDF Downloads 264
3638 Pattern Recognition Using Feature Based Die-Map Clustering in the Semiconductor Manufacturing Process
Authors: Seung Hwan Park, Cheng-Sool Park, Jun Seok Kim, Youngji Yoo, Daewoong An, Jun-Geol Baek
Abstract:
As big data analysis becomes more important, yield prediction using data from the semiconductor process is essential. In general, yield prediction and analysis of the causes of failure are closely related. The purpose of this study is to analyze the patterns that affect final test results using die-map-based clustering. Many studies have been conducted using die data from the semiconductor test process. However, such analysis has limitations, as the test data are less directly related to the final test results. Therefore, this study proposes a framework for analysis through clustering using more detailed data than existing die data. The study consists of three phases. In the first phase, a die map is created from fail-bit data in each sub-area of the die. In the second phase, clustering using the map data is performed. In the third phase, patterns that affect the final test result are identified. Finally, the proposed three steps are applied to actual industrial data, and experimental results show the potential for field application.
Keywords: die-map clustering, feature extraction, pattern recognition, semiconductor manufacturing process
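The first phase, turning per-sub-area fail-bit counts into a die map and extracting features for the clustering phase, can be sketched minimally. The 3x3 die, its fail counts, and the edge-ratio feature are invented for illustration, not the paper's actual feature set:

```python
def die_map_features(fail_bits, rows, cols):
    """Arrange sub-area fail-bit counts into a die map, then extract
    simple summary features that a clustering step could consume."""
    dmap = [fail_bits[r * cols:(r + 1) * cols] for r in range(rows)]
    total = sum(fail_bits)
    edge = sum(dmap[r][c] for r in range(rows) for c in range(cols)
               if r in (0, rows - 1) or c in (0, cols - 1))
    return {
        "total_fails": total,
        "edge_ratio": edge / total if total else 0.0,  # edge-concentration
    }

# hypothetical 3x3 die with fails concentrated at the edge
features = die_map_features([5, 4, 6, 3, 0, 2, 7, 1, 8], 3, 3)
```

Features like an edge-concentration ratio let a clustering algorithm separate, say, ring-shaped failure patterns from random ones before relating clusters to final test results.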
Procedia PDF Downloads 402
3637 Passport Bros: Exploring Neocolonial Masculinity and Sex Tourism as a Response to Shifting Gender Dynamics
Authors: Kellen Sharp
Abstract:
This study explores the phenomenon of ‘Passport Bros’, a subset within the manosphere responding to perceived crises in masculinity amidst changing gender dynamics. Focusing on a computational analysis of the passport bro community, the research addresses normative beliefs, deviations from MGTOW ideology, and discussions on nationality, race, and gender. Originating from the MGTOW movement, passport bros engage in a neocolonial approach by seeking traditional, non-Western women, attributing this pursuit to dissatisfaction with modern Western women. The paper examines how hetero pessimism within MGTOW shapes the emergence of passport bros, leading to the adoption of red pill ideologies and ultimately manifesting in the form of sex tourism. Analyzing data collected from passport bro forums through computer-assisted content analysis, the study identifies key discourses such as questions and answers, money, attitudes towards Western and traditional women, and discussions about the movement itself. The findings highlight the nuanced intersection of gender, race, and global power dynamics within the passport bro community, shedding light on their motivations and impact on neocolonial legacies.
Keywords: toxic online community, manosphere, gender and media, neocolonialism
Procedia PDF Downloads 75
3636 Application of Artificial Neural Network for Prediction of Load-Haul-Dump Machine Performance Characteristics
Authors: J. Balaraju, M. Govinda Raj, C. S. N. Murthy
Abstract:
Every industry is constantly looking to enhance its day-to-day production and productivity. This is possible only by maintaining the men and machinery at an adequate level. Prediction of performance characteristics plays an important role in the performance evaluation of equipment. Analytical and statistical approaches take longer to solve complex problems such as performance estimation compared with software-based approaches. Keeping this in view, the present study deals with Artificial Neural Network (ANN) modelling of a Load-Haul-Dump (LHD) machine to predict performance characteristics such as reliability, availability, and preventive maintenance (PM). A feed-forward back-propagation ANN trained with the Levenberg-Marquardt (LM) algorithm has been used. The performance characteristics were computed using Isograph Reliability Workbench 13.0 software, and these computed values were validated against the predicted output responses of the ANN models. Further, recommendations are given to the industry, based on the performed analysis, for improvement of equipment performance.
Keywords: load-haul-dump, LHD, artificial neural network, ANN, performance, reliability, availability, preventive maintenance
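The target quantities themselves can be made concrete: steady-state availability follows from MTBF and MTTR, and mission reliability follows from an exponential failure assumption. A sketch with hypothetical LHD figures; the study computes these quantities in Isograph Reliability Workbench rather than by hand:

```python
import math

def availability(mtbf, mttr):
    """Steady-state availability from mean time between failures (MTBF)
    and mean time to repair (MTTR)."""
    return mtbf / (mtbf + mttr)

def reliability(t, mtbf):
    """Reliability over mission time t, assuming an exponential
    (constant failure rate) model."""
    return math.exp(-t / mtbf)

# hypothetical LHD figures, in hours
A = availability(120.0, 8.0)   # machine up 120 h between failures, 8 h repairs
R = reliability(24.0, 120.0)   # chance of surviving a 24 h shift
```

An ANN trained on operational data predicts how these values evolve, which is cheaper than recomputing a full reliability model for each scenario.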
Procedia PDF Downloads 150
3635 Clinical Prediction Rules for Using Open Kinetic Chain Exercise in Treatment of Knee Osteoarthritis
Authors: Mohamed Aly, Aliaa Rehan Youssef, Emad Sawerees, Mounir Guirgis
Abstract:
Relevance: Osteoarthritis (OA) is the most common degenerative disease seen in all populations. It causes disability and a substantial socioeconomic burden. Evidence supports exercise as the most effective conservative treatment for patients with OA. Therapists' experience and clinical judgment play a major role in exercise prescription, and scientific evidence in this regard is lacking. The development of clinical prediction rules to identify patients who are most likely to benefit from exercise may help solve this dilemma. Purpose: This study investigated whether body mass index (BMI) and functional ability at baseline can predict patients' response to a selected exercise program. Approach: Fifty-six patients, aged 35 to 65 years, completed an exercise program consisting of open kinetic chain strengthening and passive stretching exercises. The program was given for 3 sessions per week, 45 minutes per session, for 6 weeks. Evaluation: At baseline and post treatment, pain severity was assessed using the numerical pain rating scale, whereas functional ability was assessed by the step test (ST), the timed up-and-go test (TUG), and the 50-foot timed walk test (50 FTW). After completing the program, a global rating of change (GROC) score greater than 4 was used to categorize patients as successful or non-successful. Thirty-eight patients (68%) had a successful response to the intervention. Logistic regression showed that BMI and the 50 FTW test were the only significant predictors. Based on the results, patients with a BMI less than 34.71 kg/m2 and a 50 FTW time less than 25.64 sec are 68% to 89% more likely to benefit from the exercise program. Conclusions: Clinicians should consider the described strengthening and flexibility exercise program for patients with a BMI less than 34.71 kg/m2 and a 50 FTW time faster than 25.64 seconds. The validity of these predictors should be investigated for other exercises.
Keywords: clinical prediction rule, knee osteoarthritis, physical therapy exercises, validity
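The reported rule reduces to two cutoffs and is trivial to encode. This sketch applies the BMI and 50 FTW thresholds exactly as stated in the abstract; the function name is our own:

```python
def likely_responder(bmi, walk_50ft_sec):
    """Apply the reported clinical prediction rule: patients with
    BMI < 34.71 kg/m^2 AND a 50-foot timed walk faster than 25.64 s
    were 68-89% more likely to benefit from the exercise program."""
    return bmi < 34.71 and walk_50ft_sec < 25.64
```

Both conditions must hold; a patient failing either cutoff falls outside the subgroup in which the reported benefit probability applies.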
Procedia PDF Downloads 423
3634 The Application of Artificial Neural Networks for the Performance Prediction of Evacuated Tube Solar Air Collector with Phase Change Material
Authors: Sukhbir Singh
Abstract:
This paper describes the modeling of a novel solar air collector (NSAC) system using an artificial neural network (ANN) model. The objective of the study is to demonstrate the application of the ANN model to predict the performance of the NSAC with acetamide as a phase change material (PCM) storage. The input data set consists of time, solar intensity, and ambient temperature, whereas the outlet air temperature of the NSAC is the output. Experiments were conducted between 9.00 and 24.00 h in June and July 2014 under the prevailing atmospheric conditions of Kurukshetra (a city in India). The experimental results were then used to train a back-propagation neural network (BPNN) to predict the outlet air temperature of the NSAC. The results show that the BPNN is an effective tool for the prediction of responses; the BPNN-predicted results are in 99% agreement with the experimental results.
Keywords: evacuated tube solar air collector, artificial neural network, phase change material, solar air collector
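At prediction time, the trained BPNN is just a forward pass from (time, solar intensity, ambient temperature) to outlet temperature. A one-hidden-layer sketch with made-up, untrained weights; a real BPNN would fit these by back-propagation on the experimental data:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def forward(inputs, w_hidden, b_hidden, w_out, b_out):
    """Forward pass of a one-hidden-layer feed-forward network mapping
    (time, solar intensity, ambient temperature) to outlet temperature."""
    hidden = [sigmoid(sum(w * x for w, x in zip(ws, inputs)) + b)
              for ws, b in zip(w_hidden, b_hidden)]
    return sum(w * h for w, h in zip(w_out, hidden)) + b_out

# hypothetical inputs and weights, purely illustrative
x = [12.0, 0.85, 0.6]                          # time (h), intensity, ambient (scaled)
w_h = [[0.2, 1.5, -0.4], [-0.1, 0.8, 0.3]]     # hidden-layer weights
b_h = [0.0, 0.1]
w_o = [30.0, 25.0]                             # output weights (to deg C)
t_out = forward(x, w_h, b_h, w_o, 10.0)
```

Training adjusts the weights so this mapping reproduces the measured outlet temperatures; the forward pass itself stays this cheap.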
Procedia PDF Downloads 120
3633 The Theory behind Logistic Regression
Authors: Jan Henrik Wosnitza
Abstract:
Logistic regression has developed into a standard approach for estimating conditional probabilities in a wide range of applications, including credit risk prediction. The article at hand contributes to the current literature on logistic regression fourfold: First, it is demonstrated that the binary logistic regression automatically meets its model assumptions under very general conditions. This result explains, at least in part, the logistic regression's popularity. Second, the requirement of homoscedasticity in the context of binary logistic regression is theoretically substantiated. The variances among the groups of defaulted and non-defaulted obligors have to be the same across the level of the aggregated default indicators in order to achieve linear logits. Third, this article sheds some light on the question of why nonlinear logits might be superior to linear logits in case of a small amount of data. Fourth, an innovative methodology for estimating correlations between obligor-specific log-odds is proposed. In order to crystallize the key ideas, this paper focuses on the example of credit risk prediction. However, the results presented in this paper can easily be transferred to any other field of application.
Keywords: correlation, credit risk estimation, default correlation, homoscedasticity, logistic regression, nonlinear logistic regression
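The link between linear logits and conditional probabilities can be made concrete: logistic regression models the log-odds as a linear function of the predictors, and the logistic (sigmoid) function inverts the logit. A short sketch; the coefficients below are arbitrary, not estimated from data:

```python
import math

def sigmoid(z):
    """Inverse of the logit: maps a log-odds value to a probability."""
    return 1.0 / (1.0 + math.exp(-z))

def logit(p):
    """Log-odds: the quantity that logistic regression models linearly."""
    return math.log(p / (1.0 - p))

# linear logit: logit(P(default | x)) = b0 + b1 * x
b0, b1 = -2.0, 0.5
p = sigmoid(b0 + b1 * 3.0)   # P(default) at x = 3
```

A "nonlinear logit" in the article's sense replaces the linear expression b0 + b1 * x with a nonlinear function of x while keeping the same sigmoid link.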
Procedia PDF Downloads 426
3632 Runoff Simulation by Using WetSpa Model in Garmabrood Watershed of Mazandaran Province, Iran
Authors: Mohammad Reza Dahmardeh Ghaleno, Mohammad Nohtani, Saeedeh Khaledi
Abstract:
Hydrological models are applied to the simulation and prediction of floods in watersheds. WetSpa is a distributed, continuous, and physically based model with a daily or hourly time step that describes precipitation, runoff, and evapotranspiration processes for both simple and complex contexts. The model uses a modified rational method for runoff calculation, and runoff is routed along the flow path using the diffusion-wave equation, which depends on the slope, velocity, and flow route characteristics. The Garmabrood watershed is located in Mazandaran province, Iran, spanning coordinates 53° 10´ 55" to 53° 38´ 20" E and 36° 06´ 45" to 36° 25´ 30" N. The area of the catchment is about 1133 km2, and elevations range from 213 m at the outlet to 3136 m, with an average slope of 25.77%. Results of the simulations show a good agreement between calculated and measured hydrographs at the outlet of the basin. Based on the Nash-Sutcliffe model efficiency coefficient for the calibration period, the model estimated daily hydrographs and maximum flow rate with accuracies of up to 61% and 83.17%, respectively.
Keywords: watershed simulation, WetSpa, runoff, flood prediction
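The Nash-Sutcliffe efficiency used for calibration compares the model's squared errors to the variance of the observations. A short sketch; the hydrograph values below are invented:

```python
def nash_sutcliffe(observed, simulated):
    """Nash-Sutcliffe model efficiency: 1 - SSE / variance of observations.
    1.0 is a perfect fit; 0 means no better than the observed mean."""
    mean_obs = sum(observed) / len(observed)
    sse = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    var = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - sse / var

# invented daily-discharge hydrographs (e.g. m3/s)
obs = [2.0, 4.0, 6.0, 8.0, 6.0, 4.0]
sim = [2.5, 3.5, 6.5, 7.0, 6.0, 4.5]
nse = nash_sutcliffe(obs, sim)
```

Because the denominator is the observed variance, NSE rewards a model only for explaining more than a constant-mean baseline would, which makes it a stricter calibration score than a raw error sum.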
Procedia PDF Downloads 337
3631 Analyzing the Sound of Space - The Glissando of the Planets and the Spiral Movement on the Sound of Earth, Saturn and Jupiter
Authors: L. Tonia, I. Daglis, W. Kurth
Abstract:
The sound of the universe creates an affinity with the sounds of music. This analysis of the sound of space focuses on the existence of tone material, the microstructure and macrostructure, and the form of the sound, through signals recorded during the flights of the Van Allen Probes and the Cassini mission. The sound is derived from frequencies belonging to electromagnetic waves: the Plasma Wave Science instrument and the Electric and Magnetic Field Instrument Suite and Integrated Science (EMFISIS) recorded the signals from space, and a transformation of those signals to audio gave the opportunity to study and analyze the sound. Because a musical pitch corresponds to a frequency, and every electromagnetic wave likewise has a frequency, the creation of a musical score that presents the sound of space can give information about the form, the symmetry, and the harmony of the sound. The conversion of space radio emissions to audio provides a number of tone pitches corresponding to the original frequencies. Through processing these sounds, we can present a music score "composed" by space. In this score, we can see basic features associated with musical form: the structure, the tone center of the musical material, and the construction and deconstruction of the sound. The structure, built through a harmonic world, includes tone centers, major and minor scales, sequences of chords, and types of cadences. The form of the sound represents the symmetry of a spiral movement in both its microstructural and macrostructural shape. Multiple glissando sounds appear in linear and polyphonic processes of the sound found in the magnetic fields around Earth, Saturn, and Jupiter, and a spiral movement appears on the spectrogram of the sound.
Whistlers, auroral kilometric radiation, and chorus emissions reveal movements similar to musical excerpts from works by contemporary composers such as Sofia Gubaidulina, Iannis Xenakis, and Einojuhani Rautavaara.
Keywords: space sound analysis, spiral, space music, analysis
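The conversion from wave frequencies to tone pitches can be sketched with the standard equal-temperament mapping. The frequency sweep below is a hypothetical whistler-like descending glissando, not actual probe data:

```python
import math

NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def freq_to_midi(f):
    """Map a frequency (Hz) to the nearest equal-tempered MIDI note
    (A4 = 440 Hz = note 69)."""
    return round(69 + 12 * math.log2(f / 440.0))

def midi_to_name(m):
    return NOTE_NAMES[m % 12] + str(m // 12 - 1)

# a hypothetical whistler sweeping down one octave per step
sweep = [4000, 2000, 1000, 500]   # Hz
gliss = [midi_to_name(freq_to_midi(f)) for f in sweep]
```

Each halving of frequency drops the pitch by exactly one octave, which is why a smoothly descending whistler renders as a glissando in the resulting score.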
Procedia PDF Downloads 177
3630 Virtual Metrology for Copper Clad Laminate Manufacturing
Authors: Misuk Kim, Seokho Kang, Jehyuk Lee, Hyunchang Cho, Sungzoon Cho
Abstract:
In semiconductor manufacturing, virtual metrology (VM) refers to methods that predict properties of a wafer based on machine parameters and sensor data of the production equipment, without performing the (costly) physical measurement of the wafer properties (Wikipedia). Additional benefits include avoidance of human bias and identification of important factors affecting the quality of the process, which allow improving process quality in the future. It is, however, rare to find VM applied to other areas of manufacturing. In this work, we propose to apply VM to copper clad laminate (CCL) manufacturing. CCL is a core element of a printed circuit board (PCB), which is used in smartphones, tablets, digital cameras, and laptop computers. The manufacturing of CCL consists of three processes: treating, lay-up, and pressing. Treating, the most important of the three, applies resin to glass cloth, heats it in a drying oven, and produces prepreg for the lay-up process. In this process, three important quality factors are inspected: treated weight (T/W), minimum viscosity (M/V), and gel time (G/T). They are inspected manually, incurring heavy costs in time and money, which makes the process a good candidate for VM application. We developed prediction models for the three quality factors T/W, M/V, and G/T, respectively, using process variables, raw material variables, and environment variables. The actual process data were obtained from a CCL manufacturer. A variety of variable selection methods and learning algorithms were employed to find the best prediction model. We obtained prediction models of M/V and G/T with high enough accuracy. They also provided information on "important" predictor variables, some of which the process engineers had already been aware of, and the rest of which they had not. The engineers were excited to find new insights that the model revealed and set out to do further analysis on them to gain process control implications.
T/W turned out not to be predictable with reasonable accuracy from the given factors. This very fact indicates that the factors currently monitored may not affect T/W; thus, an effort has to be made to find other factors that are not currently monitored in order to understand the process better and improve its quality. In conclusion, the VM application to CCL's treating process was quite successful. The newly built quality prediction model allowed one to reduce the cost associated with actual metrology, as well as reveal insights on the factors affecting the important quality factors and on the level of our less-than-perfect understanding of the treating process.
Keywords: copper clad laminate, predictive modeling, quality control, virtual metrology
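The variable-selection step can be sketched with a simple correlation ranking of process variables against a quality factor. The treating-process variable names and values below are invented, and the paper's actual models used a variety of selection methods and learning algorithms rather than this single criterion:

```python
import math

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def rank_predictors(variables, target):
    """Rank process variables by |correlation| with a quality factor
    (e.g. M/V or G/T) as a first-pass variable selection."""
    scores = {name: abs(pearson(vals, target)) for name, vals in variables.items()}
    return sorted(scores, key=scores.get, reverse=True)

# invented treating-process variables vs an M/V-like quality factor
mv = [10.0, 12.0, 11.0, 15.0, 14.0]
proc = {
    "oven_temp": [150.0, 160.0, 155.0, 175.0, 170.0],  # tracks M/V exactly here
    "line_speed": [5.0, 4.0, 6.0, 5.5, 4.5],           # unrelated noise
}
ranked = rank_predictors(proc, mv)
```

Rankings like this are how a VM model surfaces "important" predictor variables for engineers to investigate, including ones they had not previously monitored closely.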
Procedia PDF Downloads 350
3629 Crude Oil and Stocks Markets: Prices and Uncertainty Transmission Analysis
Authors: Kamel Malik Bensafta, Gervasio Semedo
Abstract:
The purpose of this paper is to investigate the relationship between oil prices and stock markets. The empirical analysis is conducted within the context of multivariate GARCH models, using a transformed version of the so-called BEKK parameterization. We show that the mean and uncertainty of the US market are transmitted to the oil market and the European market. We also identify an important transmission from WTI prices to Brent prices.
Keywords: oil volatility, stock markets, MGARCH, transmission, structural break
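The volatility-transmission machinery rests on GARCH recursions. A univariate GARCH(1,1) sketch follows, the scalar special case underlying BEKK-type multivariate models; the parameter values are illustrative only:

```python
def garch11_variance(returns, omega, alpha, beta):
    """Conditional variance recursion of a univariate GARCH(1,1):
    h_t = omega + alpha * r_{t-1}^2 + beta * h_{t-1}.
    Starts at the unconditional variance omega / (1 - alpha - beta)."""
    h = [omega / (1.0 - alpha - beta)]
    for r in returns[:-1]:
        h.append(omega + alpha * r * r + beta * h[-1])
    return h

# a single return shock at t=0, then quiet: variance jumps, then decays
h = garch11_variance([2.0, 0.0, 0.0], omega=0.1, alpha=0.1, beta=0.8)
```

In the BEKK setting, h becomes a full covariance matrix and the alpha/beta terms become matrices, which is what lets a shock in one market (e.g. the US) raise the conditional variance of another (e.g. oil or Europe).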
Procedia PDF Downloads 525
3628 Advance Hybrid Manufacturing Supply Chain System to Get Benefits of Push and Pull Systems
Authors: Akhtar Nawaz, Sahar Noor, Iftikhar Hussain
Abstract:
This paper considers advanced hybrid manufacturing planning that combines push and pull systems, in which each customer order has a due date determined by demand forecasts and customer orders. We present a modeling tool that works from absolute due dates and customer orders in a manufacturing supply chain. Manufacturing companies face the problem of variations in demand and increases in variety; they must maintain safety stock while minimizing component obsolescence and waste. A push system is expected to give high inventory cost and low delivery lead time; conversely, a pull system gives high delivery lead time and low inventory cost. The tool therefore requires an MRP system spanning both environments, controlling inventories in the push part and lead times in the pull part. To capture process data quickly and completely, and to improve responsiveness while minimizing inventory cost, a tool is required that can deal with high product variance and short-cycle parts. In practice, planning and scheduling are interrelated and should be solved simultaneously with the supply chain to ensure that the due dates of customer orders are met. The proposed model considers alternative process plans for job types, with precedence constraints on job operations. Such a model has not been treated in the literature. To solve the model, a tool was developed; a new technique was required to deal with high product variance and short life cycles in assemble-to-order production. Keywords: hybrid manufacturing system, supply chain system, make to order, make to stock, assemble to order
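The push/pull trade-off the abstract describes can be illustrated with a toy simulation (numbers are purely hypothetical, and this is not the proposed tool): make-to-stock serves orders immediately but accumulates inventory, while make-to-order holds no finished-goods stock at the price of a waiting period.

```python
import random

random.seed(42)
demand = [random.randint(5, 15) for _ in range(100)]   # units per period

# Push (make-to-stock): produce a fixed forecast quantity every period and
# serve demand from stock immediately (short lead time, inventory builds up).
forecast = 12
stock = 0
push_inventory = []
for d in demand:
    stock += forecast
    stock -= min(stock, d)        # serve what we can from stock
    push_inventory.append(stock)
avg_push_stock = sum(push_inventory) / len(push_inventory)

# Pull (make-to-order): nothing is produced until an order arrives, so no
# finished-goods stock is held, but every order waits one production period.
avg_pull_stock = 0.0
push_lead_time, pull_lead_time = 0, 1   # periods a customer waits
```

A hybrid plan places the push/pull boundary so that high-runner components stay in the push part while customer-specific assembly stays in the pull part.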
Procedia PDF Downloads 564
3627 Geophysical Methods and Machine Learning Algorithms for Stuck Pipe Prediction and Avoidance
Authors: Ammar Alali, Mahmoud Abughaban
Abstract:
Cost reduction and drilling optimization are the goals of many drilling operators. Historically, stuck pipe incidents have been a major segment of the costs associated with non-productive time (NPT). Traditionally, stuck pipe problems are treated as part of operations and solved after sticking occurs. However, the real key to savings and success is predicting stuck pipe incidents and avoiding the conditions leading to their occurrence. Previous attempts at stuck-pipe prediction have neglected the local geology of the problem. The proposed predictive tool utilizes geophysical data processing techniques and machine learning (ML) algorithms to predict drilling events in real time from surface drilling data with minimum computational power. The method combines two types of analysis: (1) real-time prediction and (2) cause analysis. Real-time prediction aggregates the input data, including historical drilling surface data, geological formation tops, and petrophysical data, from wells within the same field. The input data are then flattened per geological formation and stacked per stuck-pipe incident. The algorithm uses these two physical operations (stacking and flattening) to filter any noise in the signature and create a robust pre-determined pilot that adheres to the local geology. Once the drilling operation starts, the Wellsite Information Transfer Standard Markup Language (WITSML) live surface data are fed into a matrix and aggregated at a similar frequency to the pre-determined signature. The matrix is then correlated, in real time, with the pre-determined stuck-pipe signature for the field. The correlation uses a machine-learning Correlation-based Feature Selection (CFS) algorithm, which selects features relevant to the class and identifies redundant ones. The correlation output is interpreted as a probability curve for real-time stuck-pipe prediction.
Once this probability passes a fixed threshold defined by the user, the other component, cause analysis, alerts the user to the expected incident based on the set of pre-determined signatures, and a set of recommendations is provided to reduce the associated risk. The validation process involved feeding historical drilling data as a live stream, mimicking actual drilling conditions, for an onshore oil field. Pre-determined signatures had been created beforehand for three problematic geological formations in this field. Three wells were processed as case studies, and the stuck-pipe incidents were predicted successfully with an accuracy of 76%. This detection accuracy could have resulted in around a 50% reduction in NPT, equivalent to a 9% cost saving in comparison with offset wells. Predicting stuck pipe problems requires a method that captures geological, geophysical, and drilling data and recognizes the indicators of the issue at the field and geological-formation level. This paper illustrates the efficiency and robustness of the proposed cross-disciplinary approach in producing such signatures and predicting this NPT event. Keywords: drilling optimization, hazard prediction, machine learning, stuck pipe
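A simplified sketch of correlation-based feature selection of the kind CFS performs, run on synthetic drilling channels. The channel names and the redundancy structure below are invented for illustration; this is not real WITSML data or the authors' pipeline.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic channels: "torque" is nearly redundant with "hookload", and the
# stuck-pipe flag depends on hookload and pressure.
n = 300
hookload = rng.normal(size=n)
torque = 0.95 * hookload + 0.05 * rng.normal(size=n)
pressure = rng.normal(size=n)
stuck = (hookload + pressure + 0.3 * rng.normal(size=n) > 1.0).astype(float)

X = np.column_stack([hookload, torque, pressure])
names = ["hookload", "torque", "pressure"]

# CFS-style heuristic: rank features by correlation with the class, then
# drop any feature highly correlated with one already kept (redundancy).
class_corr = [abs(np.corrcoef(X[:, j], stuck)[0, 1]) for j in range(3)]
kept = []
for j in np.argsort(class_corr)[::-1]:
    if all(abs(np.corrcoef(X[:, j], X[:, k])[0, 1]) < 0.9 for k in kept):
        kept.append(j)
selected = [names[j] for j in kept]
```

The heuristic keeps one of the two redundant channels and the independent one, mirroring CFS's "relevant but not redundant" criterion.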
Procedia PDF Downloads 229
3626 Cooling Profile Analysis of Hot Strip Coil Using Finite Volume Method
Authors: Subhamita Chakraborty, Shubhabrata Datta, Sujay Kumar Mukherjea, Partha Protim Chattopadhyay
Abstract:
Manufacturing of multiphase high-strength steel in a hot strip mill has drawn significant attention due to the possibility of forming low-temperature transformation products of austenite under continuous cooling conditions. In such an endeavor, reliable prediction of the temperature profile of the hot strip coil is essential in order to assess the evolution of microstructure at different locations of the coil on the basis of the corresponding Continuous Cooling Transformation (CCT) diagram. The temperature distribution profile of the hot strip coil has been determined using the finite volume method (FVM) vis-à-vis the finite difference method (FDM). It has been demonstrated that FVM offers greater computational reliability in the estimation of contact pressure distribution, and hence of temperature distribution, for curved and irregular profiles, owing to the flexibility in the selection of grid geometry and discrete point positions. Moreover, the finite volume concept allows enforcing the conservation of mass, momentum, and energy, leading to enhanced prediction accuracy. Keywords: simulation, modeling, thermal analysis, coil cooling, contact pressure, finite volume method
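The conservative face-flux bookkeeping that distinguishes FVM can be shown on a one-dimensional transient conduction problem. Each control volume is updated from the difference of the fluxes across its two faces, so energy exchanged between neighbours cancels exactly. Geometry, material properties, and the boundary treatment are illustrative stand-ins, not the paper's coil model.

```python
import numpy as np

# 1-D transient conduction across a strip thickness, finite-volume scheme.
L, N = 0.02, 50                 # thickness [m], number of control volumes
alpha = 1e-5                    # thermal diffusivity [m^2/s]
dx = L / N
dt = 0.2 * dx**2 / alpha        # explicit step, stable (Fourier number 0.2)

T = np.full(N, 900.0)           # initial strip temperature [deg C]
T_env = 30.0                    # ambient temperature
g = alpha / dx                  # conductance between adjacent volumes

for _ in range(2000):
    flux = np.zeros(N + 1)                  # flux across each face
    flux[1:-1] = g * (T[:-1] - T[1:])       # interior faces (conservative)
    flux[-1] = g * (T[-1] - T_env)          # cooled outer face
    # flux[0] stays 0: insulated inner face
    T += (flux[:-1] - flux[1:]) * dt / dx   # per-volume energy balance
```

Because every interior flux appears once as an inflow and once as an outflow, the scheme conserves energy by construction, which is the property the abstract credits to FVM.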
Procedia PDF Downloads 473
3625 Artificial Neural Network Based Approach in Prediction of Potential Water Pollution Across Different Land-Use Patterns
Authors: M.Rüştü Karaman, İsmail İşeri, Kadir Saltalı, A.Reşit Brohi, Ayhan Horuz, Mümin Dizman
Abstract:
Considerable attention has recently been given to the environmental hazards caused by agricultural chemicals such as excess fertilizers. In this study, a neural network approach was investigated for the prediction of potential nitrate pollution across different land-use patterns, using a properly trained feedforward multilayered artificial neural network (ANN) model. Periodic concentrations of some anions, especially nitrate (NO3-), and cations were detected in drainage waters collected from drain pipes placed in an irrigated tomato field, an unirrigated wheat field, fallow land, and pasture land. Soil samples were collected from the irrigated tomato field and the unirrigated wheat field on a grid system with 20 m x 20 m intervals. Site-specific nitrate concentrations in the soil samples were measured for ANN-based simulation of the nitrate leaching potential of the land profiles. In the application of the ANN model, a multilayered feedforward network was evaluated, and data sets for training, validation, and testing containing the measured soil nitrate values were constructed based on spatial variability. On the testing data, the optimal structure was 2-15-1 (R2 = 0.96, P < 0.01) for the unirrigated field and 2-10-1 (R2 = 0.96, P < 0.01) for the irrigated field. The results showed that the ANN model can be used successfully to predict potential nitrate leaching levels for different land-use patterns. For the most suitable results, however, the model should be calibrated by training with different network structures depending on site-specific soil parameters and varied agricultural managements. Keywords: artificial intelligence, ANN, drainage water, nitrate pollution
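A toy version of the 2-15-1 feedforward architecture reported above, trained by plain gradient descent. The inputs and target below merely stand in for the site-specific measurements; this is an illustration of the architecture, not the study's calibrated model.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic stand-ins for two site inputs (e.g. grid coordinates) and a
# measured nitrate response.
X = rng.uniform(-1, 1, size=(100, 2))
y = (np.sin(np.pi * X[:, 0]) + 0.5 * X[:, 1])[:, None]

W1 = rng.normal(scale=0.5, size=(2, 15)); b1 = np.zeros(15)   # 2 -> 15
W2 = rng.normal(scale=0.5, size=(15, 1)); b2 = np.zeros(1)    # 15 -> 1

def forward(X):
    h = np.tanh(X @ W1 + b1)      # hidden layer (15 tanh units)
    return h, h @ W2 + b2         # linear output

_, pred = forward(X)
loss0 = np.mean((pred - y) ** 2)  # MSE before training

lr = 0.05
for _ in range(2000):             # plain full-batch gradient descent
    h, pred = forward(X)
    err = (pred - y) / len(X)     # output-layer error term for MSE
    gW2, gb2 = h.T @ err, err.sum(0)
    dh = err @ W2.T * (1 - h**2)  # backprop through tanh
    gW1, gb1 = X.T @ dh, dh.sum(0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

_, pred = forward(X)
loss = np.mean((pred - y) ** 2)   # MSE after training
```

The study's point about re-training per site corresponds here to refitting W1, W2 (and possibly the hidden-layer width) for each field's data.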
Procedia PDF Downloads 310
3624 Wireless FPGA-Based Motion Controller Design by Implementing 3-Axis Linear Trajectory
Authors: Kiana Zeighami, Morteza Ozlati Moghadam
Abstract:
Designing a high-accuracy and high-precision motion controller is one of the important issues in today's industry. Effective solutions are available in the industry, but the real-time performance, smoothness, and accuracy of the movement can be further improved. This paper discusses a complete solution for carrying out the movement of three stepper motors in three dimensions. The objective is to provide a method for designing a fully integrated System-on-Chip (SoC) based motion controller that reduces the cost and complexity of production by incorporating a Field Programmable Gate Array (FPGA) into the design. In the proposed method, the FPGA receives its commands from a host computer via wireless internet communication and calculates the motion trajectory for the three axes. A profile generator module is designed to realize the interpolation algorithm by translating position data into real-time pulses. This paper discusses an approach to implementing the linear interpolation algorithm, since it is one of the fundamentals of robot movement and is highly applicable in motion control industries. Along with the full profile trajectory, a triangular drive is implemented to eliminate error at small distances. To integrate the parallelism and real-time performance of the FPGA with the power of a Central Processing Unit (CPU) in executing complex and sequential algorithms, the NIOS II soft-core processor was added to the design. Different operating modes, such as absolute positioning, relative positioning, reset, and velocity modes, are presented to fulfill user requirements. The proposed approach was evaluated by designing a custom-made FPGA board along with a mechanical structure. As a result, precise and smooth movement of the stepper motors was observed, which demonstrated the effectiveness of this approach. Keywords: 3-axis linear interpolation, FPGA, motion controller, micro-stepping
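A software sketch of the linear interpolation idea, in the DDA/Bresenham style a profile generator commonly realizes: the dominant axis steps every tick while the other axes accumulate error terms. An FPGA would emit these per-axis pulses in hardware; this Python version is only illustrative, not the authors' HDL.

```python
def linear_steps(target):
    """Generate per-tick step pulses for a straight 3-axis move.

    `target` is given in signed motor steps per axis. Returns the final
    position and the list of per-tick pulse tuples.
    """
    d = [abs(t) for t in target]               # step counts per axis
    s = [1 if t >= 0 else -1 for t in target]  # step directions
    n = max(d)                                 # dominant-axis step count
    err = [n // 2] * 3                         # Bresenham error accumulators
    pos = [0, 0, 0]
    pulses = []
    for _ in range(n):                         # one iteration = one tick
        tick = [0, 0, 0]
        for axis in range(3):
            err[axis] -= d[axis]
            if err[axis] < 0:                  # this axis steps now
                err[axis] += n
                pos[axis] += s[axis]
                tick[axis] = s[axis]
        pulses.append(tuple(tick))
    return pos, pulses
```

Over the move, each axis emits exactly its required number of steps, evenly spread along the dominant axis, which is what keeps the interpolated path straight.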
Procedia PDF Downloads 208
3623 Enhancing Learning Ability among Deaf Students by Using Photographic Images
Authors: Aidah Alias, Mustaffa Halabi Azahari, Adzrool Idzwan Ismail, Salasiah Ahmad
Abstract:
Education is one of the most important elements in human life. Education helps us learn and achieve new things. The ability to hear gives us the chance to perceive voices, which is important in communication: hearing stories told by others, hearing news and music that shape our creativity and senses, and seeing and hearing together let us understand directly the message being delivered. But what happens if we are born deaf, or suffer hearing loss while growing up? The objectives of this paper are to identify current practice in teaching and learning among deaf students and to analyse an appropriate method for enhancing their learning process. A case study method was employed, using observation and interviews with selected deaf students and teachers. The findings indicated that a suitable method of teaching for deaf students is the use of pictures and body movement. In other words, combining these two media of images and body movement, the best medium the study suggests is video, or motion pictures. The study concluded and recommended that video or motion pictures be used in teaching and learning for deaf students. Keywords: deaf, photographic images, visual communication, education, learning ability
Procedia PDF Downloads 284
3622 Statistical Comparison of Ensemble Based Storm Surge Forecasting Models
Authors: Amin Salighehdar, Ziwen Ye, Mingzhe Liu, Ionut Florescu, Alan F. Blumberg
Abstract:
Storm surge is an abnormal water level caused by a storm. Accurate prediction of storm surge is a challenging problem. Researchers have developed various ensemble modeling techniques that combine several individual forecasts to produce an overall, presumably better, forecast. Some simple ensemble modeling techniques exist in the literature; for instance, Model Output Statistics (MOS) and running mean-bias removal are widely used in the storm surge prediction domain. However, these methods have drawbacks. MOS, for instance, is based on multiple linear regression and needs a long period of training data. To overcome the shortcomings of these simple methods, researchers have proposed more advanced ones. For instance, ENSURF (Ensemble SURge Forecast) is a multi-model application for sea level forecasting, which creates a better forecast of sea level using a combination of several instances of Bayesian Model Averaging (BMA). An ensemble dressing method is based on identifying the best member forecast and using it for prediction. Our contribution in this paper can be summarized as follows. First, we investigate whether ensemble models perform better than any single forecast; to do so, we need to identify the single best forecast, and we present a methodology based on a simple Bayesian selection method to select it. Second, we present several new and simple ways to construct ensemble models, using correlation and standard deviation as weights in combining different forecast models. Third, we use these ensembles, alongside several existing models from the literature, to forecast storm surge levels. We then investigate whether developing a complex ensemble model is indeed needed; to this end, we use a simple average (one of the simplest and most widely used ensemble models) as a benchmark.
Predicting the peak surge level during a storm, as well as the precise time at which this peak occurs, is crucial; thus we develop a statistical platform to compare the performance of the various ensemble methods. This statistical analysis is based on the root mean square error of the ensemble forecast during the testing period and on the magnitude and timing of the forecasted peak surge compared to the actual peak and its time. In this work, we analyze four hurricanes: hurricanes Irene and Lee in 2011, hurricane Sandy in 2012, and hurricane Joaquin in 2015. Since hurricane Irene developed at the end of August 2011 and hurricane Lee started just after Irene at the beginning of September 2011, we consider them as a single contiguous hurricane event. The data set used for this study is generated by the New York Harbor Observing and Prediction System (NYHOPS). We find that even the simplest possible way of creating an ensemble produces results superior to any single forecast. We also show that the ensemble models we propose generally perform better than the simple average ensemble technique. Keywords: Bayesian learning, ensemble model, statistical analysis, storm surge prediction
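One of the simple weighting ideas described above, correlation weights versus a simple-average benchmark, can be sketched on synthetic data. The "observations" and member "forecasts" below are invented stand-ins, not NYHOPS output.

```python
import numpy as np

rng = np.random.default_rng(4)

# Invented observed surge series and three imperfect member forecasts.
t = np.linspace(0, 4 * np.pi, 200)
obs = np.sin(t) + 0.2 * np.sin(3 * t)
forecasts = np.stack([
    obs + rng.normal(0, 0.10, t.size),   # good member
    obs + rng.normal(0, 0.30, t.size),   # mediocre member
    rng.normal(0, 0.50, t.size),         # uninformative member
])

def rmse(f):
    return np.sqrt(np.mean((f - obs) ** 2))

# Benchmark: simple average of all members.
simple = forecasts.mean(axis=0)

# Correlation-weighted ensemble: weight each member by its (non-negative)
# correlation with the observations over a training window.
w = np.array([np.corrcoef(f, obs)[0, 1] for f in forecasts])
w = np.clip(w, 0.0, None)
w /= w.sum()
weighted = w @ forecasts
```

Skill-based weights down-weight the uninformative member, which is why they can beat the simple average when member quality varies widely; in practice the weights would be estimated on held-out data, not the scoring period.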
Procedia PDF Downloads 309
3621 Challenges of Carbon Trading Schemes in Africa
Authors: Bengan Simbarashe Manwere
Abstract:
The entire African continent, comprising 55 countries, holds a 2% share of the global carbon market. The World Bank attributes the continent's insignificant share and participation to limited access to electricity: approximately 800 million people spread across 47 African countries generate as much power as Spain, with its population of 45 million. Only South Africa and North Africa have carbon-reduction investment opportunities on the continent, and they dominate the 2% share of the global carbon market. On the back of the 2015 Paris Agreement, South Africa signed into law the Carbon Tax Act 15 of 2019 and the Customs and Excise Amendment Act 13 of 2019 (Gazette No. 4280) on 1 June 2019. By these laws, South Africa was ushered into the league of active global carbon market players. By increasing the cost of production at a rate of R120/tCO2e, the tax intentionally compels the internalization of pollution as a cost of production and, relatedly, stimulates investment in clean technologies. The first phase covered the 1 June 2019 – 31 December 2022 period, during which the tax was meant to escalate at CPI + 2% for Scope 1 emitters. In the second phase, which stretches from 2023 to 2030, the tax will escalate at the inflation rate only, as measured by the consumer price index (CPI). The Carbon Tax Act provides for carbon allowances as mitigation strategies that limit an agent's carbon tax liability by up to 95% for fugitive and process emissions. Although the June 2019 Carbon Tax Act explicitly makes provision for a carbon trading scheme (CTS), the associated carbon trading regulations were only finalised in December 2020. This points to a delay in the establishment of the scheme.
Relatedly, emitters in South Africa are not yet able to benefit from the 95% reduction in the effective carbon tax rate, from R120/tCO2e to R6/tCO2e, as the Johannesburg Stock Exchange (JSE) has not yet finalized the market for trading carbon credits. Whereas most carbon trading schemes have been designed and constructed from the beginning as new, tailor-made systems, in countries such as France, Australia, and Romania, which treat carbon as a financial product, South Africa intends, on the contrary, to leverage the existing trading infrastructure of the JSE and the clearing and settlement platforms of Strate, among others, in the interest of the Paris Agreement timelines. The carbon trading scheme will therefore not be constructed from scratch. At the same time, carbon will be treated as a commodity in order to align with existing institutional and infrastructural capacity. This explains why the Carbon Tax Act is silent about the involvement of the Financial Sector Conduct Authority (FSCA). For South Africa, there is a need to establish the equilibrium stability of the CTS. This is important because South Africa is an innovator in carbon trading, and successful trading of carbon credits on the JSE will lead to imitation, by early adopters first, followed by the middle majority thereafter. Keywords: carbon trading scheme (CTS), Johannesburg stock exchange (JSE), carbon tax act 15 of 2019, South Africa
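The effective-rate arithmetic implied by the Act's allowance cap can be laid out directly. The rates and the 95% cap come from the text above; the CPI value below is purely illustrative.

```python
# Headline rate R120/tCO2e; allowances cap the liability reduction at 95%
# for fugitive and process emissions; phase 1 escalates at CPI + 2%,
# phase 2 at CPI only.
headline_rate = 120.0                 # rand per tonne CO2e
max_allowance = 0.95                  # maximum liability reduction

effective_rate = headline_rate * (1 - max_allowance)   # the R6/tCO2e floor

cpi = 0.05                            # assumed inflation rate (illustrative)
phase1_next = headline_rate * (1 + cpi + 0.02)         # phase-1 escalation
phase2_next = headline_rate * (1 + cpi)                # phase-2 escalation
```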
Procedia PDF Downloads 70
3620 Understanding the Social Movements around the ‘Rohingya Crisis’ within the Political Process Model
Authors: Aklima Jesmin, Ubaidur Rob, M. Ashrafur Rahman
Abstract:
The Rohingya population of Arakan state in Myanmar is one of the most persecuted ethnic minorities of this 21st century. According to the Universal Declaration of Human Rights (UDHR), all human beings are born free and equal in dignity and rights. However, this population is systematically excluded from this universal proclamation of human rights because they are Rohingya, which signifies ‘other’. Based on the accessible and available literature on the Rohingya issue, this study first found a chronological pattern of human rights violations against the ethnic Rohingya that follows the pathology of the Holocaust, in this 21st century of human civilization. These violations have been made possible by modern technology and bureaucracy, performed through authorization, routinization, and dehumanization, not only in formal institutions but in society as a whole. This apparently never-ending situation confronts any author with a scarcity of scientific articles; the most important sources are therefore international daily newspapers, social media, and the official webpages of non-state actors, for their nitty-gritty day-to-day updates. Although this challenges the validity and objectivity of the information, addressing the critical ongoing human rights violations against the Rohingya population can become a base for further work on this issue. One aspect of this paper is to document all the social movements from August 2017 to date. The paper finds that although historically only human rights violations against the Rohingya seemed to occur, a process of social movements had simultaneously started, which can be traced more clearly after the military campaign of 2017. Therefore, the Rohingya crisis can be conceptualized within one ‘campaign’ movement for justice, not as episodic events, and especially within the Political Process Model rather than other social movement theories.
This model identifies that the roles of international political movements and of non-state actors are more powerful than any single episode of violence conducted against the Rohingya in reframing the issue, in blaming and shaming the Myanmar government, and in creating strategic opportunities for social change. The lack of empowerment of the affected Rohingya population has been found to be the gap preventing use of this strategic opportunity; it also affects their capacity to reframe their rights and to manage the campaign for their justice. This should therefore be placed at the heart of the international policy agenda within the broader socio-political movement for the justice of the Rohingya population. Without ensuring the human rights of the Rohingya population, achieving the promise of the United Nations' Sustainable Development Goals, that no one be excluded, will be impossible. Keywords: civilization, holocaust, human rights violation, military campaign, political process model, Rohingya population, sustainable development goal, social justice, social movement, strategic opportunity
Procedia PDF Downloads 283
3619 New Experiences into Pancreatic Disease Science
Authors: Nadia Akbarpour
Abstract:
Pancreatic ductal adenocarcinoma is an aggressive and devastating disease characterized by invasiveness, rapid progression, and profound resistance to treatment. Advances in pathological classification and cancer genetics have improved our descriptive understanding of this disease; however, important aspects of pancreatic cancer biology remain poorly understood. A better understanding of pancreatic cancer biology should pave the way to more effective treatments. Over the last few years, there have been significant advances in the molecular and biological understanding of pancreatic cancer, including the genomic complexity of the disease, the role of pancreatic cancer stem cells, the importance of the tumor microenvironment, and the unique metabolic adaptation of pancreatic cancer cells to acquire nutrients under a hypoxic environment. Efforts towards a practical treatment have met with constrained success owing to the disease's complicated biology. It is well established that pancreatic cancer stem cells (CSCs), though present in small numbers, contribute greatly to pancreatic cancer initiation, progression, and metastasis. Standard chemo- and radiotherapeutic options extend overall survival, but the associated side effects are a significant concern. During the last decade, our understanding of the molecular and cellular pathways involved in pancreatic cancer, and of the role of CSCs in its progression, has increased enormously; the focus now is on targeting CSCs. Natural products have gained much attention recently as they, in general, sensitize CSCs to chemotherapy and target the molecular signaling involved in various tumors, including pancreatic cancer. Some planned investigations have shown promising results, suggesting that work in this direction has much to offer for the treatment of pancreatic cancer.
Although preclinical studies have revealed the importance of natural products in reducing pancreatic carcinoma, limited studies have been conducted to evaluate their role in the clinic. The present review provides new insight into recent advances in pancreatic cancer biology and treatment, and into the current status of natural products in its prevention. Keywords: pancreatic, genomic, organic, cancer
Procedia PDF Downloads 138
3618 The Ability of Forecasting the Term Structure of Interest Rates Based on Nelson-Siegel and Svensson Model
Authors: Tea Poklepović, Zdravka Aljinović, Branka Marasović
Abstract:
Due to the importance of the yield curve and its estimation, it is essential to have valid methods for yield curve forecasting in cases where there are scarce issues of securities and/or weak trading on a secondary market. Therefore in this paper, after the estimation of weekly yield curves on the Croatian financial market from October 2011 to August 2012 using the Nelson-Siegel and Svensson models, the yield curves are forecasted using a vector autoregressive model and neural networks. In general, both forecasting methods have good prediction abilities, and forecasting of yield curves based on the Nelson-Siegel estimation model gives better results, in the sense of lower mean squared error, than forecasting based on the Svensson model. In this case, neural networks also provide slightly better results. Finally, it can be concluded that the most appropriate way of yield curve prediction is neural networks applied to the Nelson-Siegel estimation of yield curves. Keywords: Nelson-Siegel Model, neural networks, Svensson Model, vector autoregressive model, yield curve
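The Nelson-Siegel spot-rate function itself is compact enough to state directly; the Svensson model adds a second curvature term of the same form. The parameter values below are illustrative, not the Croatian-market estimates.

```python
import numpy as np

def nelson_siegel(tau, beta0, beta1, beta2, lam):
    """Nelson-Siegel spot rate y(tau) for maturities tau > 0 (in years).

    y(tau) = b0 + b1*(1 - e^{-tau/lam})/(tau/lam)
                + b2*((1 - e^{-tau/lam})/(tau/lam) - e^{-tau/lam})

    beta0 is the long-run level, beta1 the slope, beta2 the curvature.
    """
    x = np.asarray(tau, dtype=float) / lam
    slope = (1.0 - np.exp(-x)) / x
    return beta0 + beta1 * slope + beta2 * (slope - np.exp(-x))

# Illustrative parameters only.
maturities = np.array([0.25, 1.0, 2.0, 5.0, 10.0, 30.0])
curve = nelson_siegel(maturities, beta0=0.05, beta1=-0.02, beta2=0.01, lam=1.5)
```

Forecasting in the paper's setup then reduces to modeling the time series of the fitted (beta0, beta1, beta2) factors, e.g. with a VAR or a neural network.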
Procedia PDF Downloads 334
3617 Photo-Fenton Decolorization of Methylene Blue Adsolubilized on Co2+ -Embedded Alumina Surface: Comparison of Process Modeling through Response Surface Methodology and Artificial Neural Network
Authors: Prateeksha Mahamallik, Anjali Pal
Abstract:
In the present study, Co(II)-adsolubilized surfactant-modified alumina (SMA) was prepared, and methylene blue (MB) degradation was carried out on the Co-SMA surface by a visible-light photo-Fenton process. The entire reaction proceeded on the solid surface, as MB was embedded on the Co-SMA surface. The reaction followed zero-order kinetics. Response surface methodology (RSM) and an artificial neural network (ANN) were used to model the decolorization of MB by the photo-Fenton process as a function of the dose of Co-SMA (10, 20 and 30 g/L), the initial concentration of MB (10, 20 and 30 mg/L), the concentration of H2O2 (174.4, 348.8 and 523.2 mM), and the reaction time (30, 45 and 60 min). The prediction capabilities of the two methodologies were compared on the basis of the correlation coefficient (R2), root mean square error (RMSE), standard error of prediction (SEP), and relative percent deviation (RPD). Due to the lower values of RMSE (1.27), SEP (2.06), and RPD (1.17) and the higher value of R2 (0.9966), ANN proved to be more accurate than RSM in predicting decolorization efficiency. Keywords: adsolubilization, artificial neural network, methylene blue, photo-fenton process, response surface methodology
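The comparison metrics named above can be computed as follows. The abstract does not spell out its exact formulas, so these are standard forms; in particular, RPD is taken here as mean relative percent deviation, matching the name given in the text.

```python
import numpy as np

def prediction_metrics(y_true, y_pred):
    """R2, RMSE, SEP and RPD for comparing prediction models (e.g. RSM vs ANN)."""
    y_true = np.asarray(y_true, float)
    y_pred = np.asarray(y_pred, float)
    resid = y_true - y_pred
    rmse = np.sqrt(np.mean(resid ** 2))
    r2 = 1.0 - np.sum(resid ** 2) / np.sum((y_true - y_true.mean()) ** 2)
    sep = 100.0 * rmse / y_true.mean()                    # % of mean response
    rpd = 100.0 * np.mean(np.abs(resid) / np.abs(y_true)) # mean relative % dev.
    return {"R2": r2, "RMSE": rmse, "SEP": sep, "RPD": rpd}
```

A model is preferred, as in the abstract, when its R2 is higher and its RMSE, SEP, and RPD are lower on the same validation data.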
Procedia PDF Downloads 254