Search results for: modeling techniques
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 9788

9668 Threat Modeling Methodology for Supporting Industrial Control Systems Device Manufacturers and System Integrators

Authors: Raluca Ana Maria Viziteu, Anna Prudnikova

Abstract:

Industrial control systems (ICS) have received much attention in recent years due to the convergence of information technology (IT) and operational technology (OT), which has increased the interdependence of the safety and security issues to be considered. These issues require ICS-tailored solutions, which led to the need to create a methodology for supporting ICS device manufacturers and system integrators in carrying out threat modeling of embedded ICS devices in a way that guarantees the quality of the identified threats and minimizes subjectivity in the threat identification process. To investigate the possibility of creating such a methodology, a set of existing standards, regulations, papers, and publications related to threat modeling in the ICS sector and other sectors was reviewed to identify the various methodologies and methods used in threat modeling. The most popular ones were then tested in an exploratory phase on a specific PLC device. The outcome of this exploratory phase was used as a basis for defining the specific characteristics of ICS embedded devices and their deployment scenarios, identifying the factors that introduce subjectivity into the threat modeling process of such devices, and defining metrics for evaluating the minimum quality requirements of identified threats associated with the deployment of the devices in existing infrastructures. The threat modeling methodology was then created based on the results of the previous steps. The usability of the methodology was evaluated through a set of standardized threat modeling requirements and a standardized comparison method for threat modeling methodologies. The outcomes of these verification methods confirm that the methodology is effective. The full paper includes the outcome of research on different threat modeling methodologies that can be used in OT, their comparison, and the results of implementing each of them in practice on a PLC device. This research is further used to build a threat modeling methodology tailored to OT environments; a detailed description is included. Moreover, the paper includes results of the evaluation of the created methodology based on a set of parameters specifically created to rate threat modeling methodologies.

Keywords: device manufacturers, embedded devices, industrial control systems, threat modeling

Procedia PDF Downloads 54
9667 Digital Control Techniques for Power Electronic Devices

Authors: Rakesh Krishna, Abhishek Poddar

Abstract:

The paper discusses the work carried out on the implementation of control techniques such as digital pulse width modulation (PWM) and digital pulse-fired control (PFC). These techniques are often used in devices like inverters, battery chargers, and DC-to-DC converters, and can also be implemented in household devices such as heaters; the advantages are better control and an improved device life span. In the case of batteries, these techniques are known to increase battery life in mobiles and other hand-held devices. An 8051 microcontroller is used to implement these methods, and thyristors are used for the switching operations.

Keywords: PWM, SVM, PFC, bidirectional inverters, snubber

Procedia PDF Downloads 535
9666 Factors Affecting Sustainability of a 3D Printed Object

Authors: Kadrefi Athanasia, Fronimaki Evgenia, Mavri Maria

Abstract:

3D Printing (3DP) is a distinct, disruptive technology that belongs to a wider group of manufacturing technologies, Additive Manufacturing (AM). In 3DP, a custom digital file turns into a solid object using a single computer and a 3D printer. Among multiple advantages, 3DP offers production with fewer steps compared to conventional manufacturing, lower production costs, and customizable designs. 3DP can be performed by several techniques, while the most common is Fused Deposition Modeling (FDM). FDM belongs to a wider group of AM techniques, material extrusion, where a digital file converts into a solid object using raw material (called filament) melted in high temperatures. As in most manufacturing procedures, environmental issues have been raised here, too. This study aims to review the literature on issues that determine technical and mechanical factors that affect the sustainability and resilience of a final 3D-printed object. The research focuses on the collection of papers that deal with 3D printing techniques and use keywords or phrases like ‘3D printed objects’, ‘factors of 3DP sustainability’, ‘waste materials,’ ‘infill patterns,’ and ‘support structures.’ After determining factors, a pilot survey will be conducted at the 3D Printing Lab in order to define the significance of each factor in the final 3D printed object.

Keywords: additive manufacturing, 3D printing, sustainable manufacturing, sustainable production

Procedia PDF Downloads 22
9665 Application of Electrochemical Impedance Spectroscopy to Monitor the Steel/Soil Interface During Cathodic Protection of Steel in Simulated Soil Solution

Authors: Mandlenkosi George Robert Mahlobo, Tumelo Seadira, Major Melusi Mabuza, Peter Apata Olubambi

Abstract:

Cathodic protection (CP) has been widely considered a suitable technique for mitigating corrosion of buried metal structures. Considerable effort has been devoted to developing techniques, in particular non-destructive techniques, for monitoring and quantifying the effectiveness of CP to ensure the sustainability and performance of buried steel structures. The aim of this study was to investigate the evolution of the electrochemical processes at the steel/soil interface during the application of CP on steel in simulated soil. Carbon steel was subjected to electrochemical tests in NS4 solution, used to simulate soil conditions, for 4 days before applying CP for a further 11 days. A previously modified non-destructive voltammetry technique was applied before and after the application of CP to measure the corrosion rate. Electrochemical impedance spectroscopy (EIS), in combination with mathematical modeling through equivalent electric circuits, was applied to determine the electrochemical behavior at the steel/soil interface. The measured corrosion rate was found to have decreased from 410 µm/yr to 8 µm/yr between days 5 and 14 because of the applied CP. Equivalent electrical circuits were successfully constructed and used to adequately model the EIS results. The modeling of the obtained EIS results revealed the formation of corrosion products via a mixed activation-diffusion mechanism during the first 4 days, while the activation mechanism prevailed in the presence of CP, resulting in a protective film. X-ray diffraction analysis confirmed the presence of corrosion products and the predominant protective film corresponding to the calcareous deposit.
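
As an illustration of the equivalent-circuit idea used for the EIS interpretation, the short Python sketch below computes the impedance spectrum of a generic Randles-type circuit with a Warburg diffusion element, a common representation of a mixed activation-diffusion mechanism. The circuit topology and parameter values are illustrative assumptions, not the paper's fitted circuit.

    import numpy as np

    # Illustrative Randles-type equivalent circuit (not the paper's fitted parameters):
    # solution resistance Rs in series with [double-layer capacitance Cdl in parallel with
    # (charge-transfer resistance Rct + Warburg diffusion element)].
    Rs, Rct, Cdl, sigma = 50.0, 800.0, 2e-5, 150.0     # ohm, ohm, F, ohm*s^-0.5

    freq = np.logspace(-2, 4, 200)                      # Hz
    omega = 2 * np.pi * freq
    Zw = sigma * (1 - 1j) / np.sqrt(omega)              # Warburg impedance (diffusion)
    Z = Rs + 1.0 / (1j * omega * Cdl + 1.0 / (Rct + Zw))

    # Nyquist-plot coordinates would be Z.real versus -Z.imag.
    print("Low-frequency |Z| = %.0f ohm, high-frequency |Z| = %.0f ohm" % (abs(Z[0]), abs(Z[-1])))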

Keywords: carbon steel, cathodic protection, NS4 solution, voltammetry, EIS

Procedia PDF Downloads 24
9664 Modeling of Maximum Rainfall Using Poisson-Generalized Pareto Distribution in Kigali, Rwanda

Authors: Emmanuel Iyamuremye

Abstract:

Extreme rainfall events have caused significant damage to agriculture, ecology, and infrastructure, disruption of human activities, injury, and loss of life. They also have significant social, economic, and environmental consequences because they considerably damage urban as well as rural areas. Early detection of extreme maximum rainfall helps to implement strategies and measures before such events occur, hence mitigating their consequences. Extreme value theory has been widely used in modeling extreme rainfall and in various disciplines, such as financial markets, the insurance industry, and failure cases. Climatic extremes have been analyzed using either the generalized extreme value (GEV) or the generalized Pareto (GP) distribution, which provides evidence of the importance of modeling extreme rainfall in different regions of the world. In this paper, we focus on the peaks-over-threshold approach, where the Poisson-generalized Pareto distribution is considered the proper distribution for the study of the exceedances. The research thus uses the generalized Pareto (GP) distribution with a Poisson model for arrivals to describe peaks over a threshold, and statistical techniques were used to fit models for predicting extreme rainfall in Kigali. The results indicate that the proposed Poisson-GP distribution provides a better fit to maximum monthly rainfall data. Further, the Poisson-GP models are able to estimate various return levels. The research also found a slow increase in the return levels of maximum monthly rainfall for higher return periods, with the corresponding intervals becoming increasingly wide as the return period increases.
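
The peaks-over-threshold calculation described here can be sketched in a few lines. The Python example below fits a generalized Pareto distribution to threshold exceedances and combines it with the Poisson exceedance rate to estimate return levels; the data are synthetic and the threshold choice is an illustrative assumption, not the Kigali series or the paper's fitted parameters.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    monthly_rainfall = rng.gamma(shape=2.0, scale=40.0, size=360)   # 30 years of synthetic monthly maxima (mm)

    threshold = np.quantile(monthly_rainfall, 0.90)                 # threshold choice is illustrative
    exceedances = monthly_rainfall[monthly_rainfall > threshold] - threshold

    # Fit the generalized Pareto distribution to the exceedances (location fixed at 0).
    shape, _, scale = stats.genpareto.fit(exceedances, floc=0)

    # Poisson rate of exceedances per year.
    years = len(monthly_rainfall) / 12.0
    rate = len(exceedances) / years

    def return_level(T):
        """Approximate T-year return level under the Poisson-GP model."""
        if abs(shape) > 1e-6:
            return threshold + (scale / shape) * ((rate * T) ** shape - 1.0)
        return threshold + scale * np.log(rate * T)

    for T in (10, 50, 100):
        print(f"{T}-year return level: {return_level(T):.1f} mm")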

Keywords: exceedances, extreme value theory, generalized Pareto distribution, Poisson generalized Pareto distribution

Procedia PDF Downloads 106
9663 [Keynote Speech]: Feature Selection and Predictive Modeling of Housing Data Using Random Forest

Authors: Bharatendra Rai

Abstract:

Predictive data analysis and modeling involving machine learning techniques become challenging in the presence of too many explanatory variables or features. Too many features are known not only to slow algorithms down but also to decrease model prediction accuracy. This study involves a housing dataset with 79 quantitative and qualitative features that describe various aspects people consider while buying a new house. The Boruta algorithm, which supports feature selection using a wrapper approach built around random forest, is used in this study. This feature selection process leads to 49 confirmed features, which are then used for developing predictive random forest models. The study also explores five different data partitioning ratios, and their impact on model accuracy is captured using the coefficient of determination (R-squared) and the root mean square error (RMSE).
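
A minimal sketch of the modeling step is shown below, assuming a synthetic stand-in for the 49 Boruta-confirmed housing features: a random forest regressor is trained under several data partitioning ratios, and model accuracy is captured with R-squared and RMSE, as in the study. The partition ratios and model settings are illustrative only.

    from sklearn.datasets import make_regression
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.metrics import mean_squared_error, r2_score
    from sklearn.model_selection import train_test_split

    # Synthetic stand-in for the housing data (the study retains 49 Boruta-confirmed features).
    X, y = make_regression(n_samples=1460, n_features=49, noise=10.0, random_state=42)

    for train_frac in (0.5, 0.6, 0.7, 0.8, 0.9):   # different data partitioning ratios
        X_tr, X_te, y_tr, y_te = train_test_split(X, y, train_size=train_frac, random_state=42)
        model = RandomForestRegressor(n_estimators=300, random_state=42).fit(X_tr, y_tr)
        pred = model.predict(X_te)
        rmse = mean_squared_error(y_te, pred) ** 0.5
        print(f"train={train_frac:.0%}  R2={r2_score(y_te, pred):.3f}  RMSE={rmse:.1f}")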

Keywords: housing data, feature selection, random forest, Boruta algorithm, root mean square error

Procedia PDF Downloads 287
9662 From Modeling of Data Structures towards Automatic Programs Generating

Authors: Valentin P. Velikov

Abstract:

Automatic program generation saves time and human resources and yields syntactically clear and logically correct modules. Fourth-generation programming languages are related to drawing the data and the processes of the subject area and to obtaining a frame of the respective information system. The application can be separated into an interface and business logic, which means that, for an interactive generation of the needed system, either an already existing toolkit is used or a new one is created.
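
A toy illustration of the idea, under the assumption of a very simple entity specification (hypothetical, not the paper's toolkit), is sketched below in Python: a data-structure description is turned into syntactically correct, immediately usable source code.

    # Hypothetical field specification for one entity of the subject area.
    entity = {"name": "Customer", "fields": [("id", "int"), ("name", "str"), ("balance", "float")]}

    def generate_class(spec):
        """Generate Python source code for a simple data class from the specification."""
        args = ", ".join(f"{f}: {t}" for f, t in spec["fields"])
        body = "\n".join(f"        self.{f} = {f}" for f, _ in spec["fields"])
        return f"class {spec['name']}:\n    def __init__(self, {args}):\n{body}\n"

    source = generate_class(entity)
    print(source)            # syntactically correct module text, ready to be written to a file
    exec(source)             # the generated class is immediately usable
    print(Customer(1, "Ada", 99.5).name)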

Keywords: computer science, graphical user interface, user dialog interface, dialog frames, data modeling, subject area modeling

Procedia PDF Downloads 282
9661 Hybrid Artificial Bee Colony and Least Squares Method for Rule-Based Systems Learning

Authors: Ahcene Habbi, Yassine Boudouaoui

Abstract:

This paper deals with the problem of automatic rule generation for fuzzy systems design. The proposed approach is based on hybrid artificial bee colony (ABC) optimization and the weighted least squares (LS) method and aims to find the structure and parameters of fuzzy systems simultaneously. More precisely, two ABC-based fuzzy modeling strategies are presented and compared. The first strategy uses global optimization to learn fuzzy models, while the second hybridizes ABC and the weighted least squares estimation method. The performances of the proposed ABC and ABC-LS fuzzy modeling strategies are evaluated on complex modeling problems and compared to other advanced modeling methods.
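
A minimal sketch of the least squares step is given below, assuming a first-order TSK rule base with Gaussian membership functions whose centers and widths would, in the paper's approach, be provided by the ABC search; here they are fixed by hand and the data are synthetic, so the sketch only illustrates how consequent parameters can be estimated by least squares once the firing strengths are known.

    import numpy as np

    rng = np.random.default_rng(1)
    x = rng.uniform(-1, 1, size=(200, 1))
    y = np.sin(3 * x[:, 0]) + 0.05 * rng.standard_normal(200)

    # Hypothetical Gaussian membership functions for 3 fuzzy rules (centers/widths would come from ABC).
    centers, width = np.array([-0.7, 0.0, 0.7]), 0.4
    firing = np.exp(-((x - centers) ** 2) / (2 * width ** 2))       # shape (200, 3)
    weights = firing / firing.sum(axis=1, keepdims=True)            # normalized firing strengths

    # First-order (TSK) consequents y_r = a_r * x + b_r; stack weighted regressors and solve least squares.
    design = np.hstack([weights * x, weights])                      # shape (200, 6)
    theta, *_ = np.linalg.lstsq(design, y, rcond=None)

    y_hat = design @ theta
    print("RMSE:", np.sqrt(np.mean((y - y_hat) ** 2)))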

Keywords: automatic design, learning, fuzzy rules, hybrid, swarm optimization

Procedia PDF Downloads 409
9660 Supersonic Flow around a Dihedral Airfoil: Modeling and Experimentation Investigation

Authors: A. Naamane, M. Hasnaoui

Abstract:

Numerical modeling of fluid flows, whether compressible or incompressible, laminar or turbulent, makes a considerable contribution to the scientific and industrial fields. However, the development of an approximate model of a supersonic flow requires the introduction of specific and more precise techniques and methods. For this purpose, the objective of this paper is to model a supersonic flow of an inviscid fluid around a dihedral airfoil. Based on thin airfoil theory and the non-dimensional stationary Steichen equation of a two-dimensional supersonic flow in isentropic evolution, we obtained a solution for the downstream velocity potential of the oblique shock at the second order of the relative thickness that characterizes a perturbation parameter. This result was obtained using asymptotic analysis and the method of characteristics. In order to validate our model, the results are discussed in comparison with theoretical and experimental results. The comparison shows that the results of our model are quantitatively acceptable compared to the existing theoretical results. Finally, an experimental study was conducted using the AF300 supersonic wind tunnel, in which we considered the incident upstream Mach number over a symmetrical dihedral airfoil wing. The comparison of the different downstream Mach number results of our model with the existing theoretical data (relative margin between 0.07% and 4%) and with the experimental results (concordance for a deflection angle between 1° and 11°) supports the validity of our model.
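
For readers who wish to check downstream Mach numbers against classical theory, the sketch below solves the standard oblique shock relations (the theta-beta-Mach relation plus the normal-shock jump) for the weak solution. It is a textbook calculation, not the paper's asymptotic potential solution, and the upstream Mach number and deflection angle used are illustrative.

    import numpy as np

    def oblique_shock_m2(M1, theta_deg, gamma=1.4):
        """Weak-solution downstream Mach number for upstream Mach M1 and deflection angle theta."""
        theta = np.radians(theta_deg)
        mu = np.arcsin(1.0 / M1)                     # Mach angle, lower bound for the shock angle
        betas = np.linspace(mu + 1e-6, np.radians(89.9), 20000)
        # theta-beta-Mach relation: deflection produced by a shock at angle beta
        thetas = np.arctan(2.0 / np.tan(betas)
                           * (M1**2 * np.sin(betas)**2 - 1.0)
                           / (M1**2 * (gamma + np.cos(2.0 * betas)) + 2.0))
        if thetas.max() < theta:
            raise ValueError("Shock is detached for this Mach number and deflection angle.")
        beta = betas[np.argmax(thetas >= theta)]     # first (weak-branch) crossing of the target deflection
        M1n = M1 * np.sin(beta)                      # normal Mach number ahead of the shock
        M2n = np.sqrt((1.0 + 0.5 * (gamma - 1.0) * M1n**2)
                      / (gamma * M1n**2 - 0.5 * (gamma - 1.0)))
        return M2n / np.sin(beta - theta)

    print(round(oblique_shock_m2(M1=2.0, theta_deg=10.0), 3))   # about 1.64 for a 10 degree deflection at Mach 2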

Keywords: asymptotic modelling, dihedral airfoil, supersonic flow, supersonic wind tunnel

Procedia PDF Downloads 109
9659 Causal Modeling of the Glucose-Insulin System in Type-I Diabetic Patients

Authors: J. Fernandez, N. Aguilar, R. Fernandez de Canete, J. C. Ramos-Diaz

Abstract:

In this paper, a simulation model of the glucose-insulin system for a patient with type 1 diabetes is developed using a causal modeling approach under system dynamics. The OpenModelica simulation environment has been employed to build the so-called causal model, while the glucose-insulin model parameters were adjusted to fit recorded mean data from a diabetic patient database. Model results under different conditions of three-meal glucose and exogenous insulin ingestion patterns have been obtained. This simulation model can be useful for evaluating glucose-insulin performance in several circumstances, including insulin infusion algorithms in open loop and decision support systems in closed loop.
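
The paper's OpenModelica model is not reproduced here, but the kind of simulation involved can be sketched with the widely used Bergman minimal model integrated in Python; the parameter values, meal input, and insulin infusion below are illustrative assumptions only, not the paper's fitted values.

    import numpy as np
    from scipy.integrate import solve_ivp

    # Illustrative Bergman minimal-model parameters (not the paper's fitted values).
    p1, p2, p3, n = 0.028, 0.025, 1.3e-5, 0.09      # rate constants, 1/min
    Gb, Ib = 90.0, 10.0                              # basal glucose (mg/dL) and insulin (uU/mL)

    def meal(t):                                     # simple exponential glucose appearance after a meal at t = 0
        return 3.0 * np.exp(-0.05 * t)               # mg/dL per min

    def insulin_infusion(t):                         # constant exogenous insulin delivery (illustrative)
        return 1.0                                   # uU/mL per min

    def minimal_model(t, y):
        G, X, I = y
        dG = -p1 * (G - Gb) - X * G + meal(t)        # plasma glucose
        dX = -p2 * X + p3 * (I - Ib)                 # remote insulin action
        dI = -n * (I - Ib) + insulin_infusion(t)     # plasma insulin
        return [dG, dX, dI]

    sol = solve_ivp(minimal_model, (0, 480), [Gb, 0.0, Ib], dense_output=True, max_step=1.0)
    print("Glucose after 2 h: %.1f mg/dL" % sol.sol(120)[0])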

Keywords: causal modeling, diabetes, glucose-insulin system, OpenModelica software

Procedia PDF Downloads 306
9658 Comparative Analysis of Two Modeling Approaches for Optimizing Plate Heat Exchangers

Authors: Fábio A. S. Mota, Mauro A. S. S. Ravagnani, E. P. Carvalho

Abstract:

In the present paper, the design of plate heat exchangers is formulated as an optimization problem considering two mathematical modeling approaches. The number of plates is the objective function to be minimized, with some configuration parameters considered implicitly. Screening is the optimization method used to solve the problem. Thermal and hydraulic constraints are verified, non-viable solutions are discarded, and the method searches for convergence to the optimum, in case it exists. A case study is presented to test the applicability of the developed algorithm. The results are consistent with the literature.
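
The screening idea can be sketched as an exhaustive search over candidate configurations in which non-viable solutions are discarded and the feasible configuration with the fewest plates is kept. In the Python sketch below, the thermal and hydraulic checks are placeholders standing in for the paper's detailed models.

    from itertools import product

    def thermal_ok(n_plates, passes):
        # Placeholder: in the paper this would verify the required heat duty / effectiveness.
        return n_plates * passes >= 60

    def hydraulic_ok(n_plates, passes):
        # Placeholder: in the paper this would verify the pressure-drop limits on both streams.
        return passes <= 2 or n_plates >= 40

    best = None
    for n_plates, passes in product(range(10, 201), (1, 2, 3)):   # screening over the search space
        if not (thermal_ok(n_plates, passes) and hydraulic_ok(n_plates, passes)):
            continue                                              # discard non-viable configurations
        if best is None or n_plates < best[0]:
            best = (n_plates, passes)

    print("Minimum number of plates (plates, passes):", best)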

Keywords: plate heat exchanger, optimization, modeling, simulation

Procedia PDF Downloads 484
9657 The Use of Relaxation Training in Special Schools for Children With Learning Disabilities

Authors: Birgit Heike Spohn

Abstract:

Several authors (e.g., Krowatschek & Reid, 2011; Winkler, 1998) advocate the use of relaxation techniques in school because these techniques can help children cope with stress and improve concentration, learning, and social behavior as well as class climate. Children with learning disabilities might profit from these techniques in a special way because they contribute to improved learning behavior. There is no study addressing the frequency of the use of relaxation techniques in special schools for children with learning disabilities in German-speaking countries. The paper presents a study in which all teachers of special schools for children with learning disabilities in a district of South Germany (n = 625) were questioned about the use of relaxation techniques in school using a standardized questionnaire. The variables addressed were the use of these techniques in the classroom, aspects of their use (kind of relaxation technique, frequency, and regularity of their use), and potential influencing factors. The results are discussed, and implications for further research are drawn.

Keywords: special education, learning disabilities, relaxation training, concentration

Procedia PDF Downloads 73
9656 Analytical Study of Data Mining Techniques for Software Quality Assurance

Authors: Mariam Bibi, Rubab Mehboob, Mehreen Sirshar

Abstract:

Satisfying the customer requirements is the ultimate goal of producing or developing any product. The quality of the product is decided on the basis of the level of customer satisfaction. Different techniques reported in this survey enhance the quality of the product through software defect prediction and by locating missing software requirements. Some mining techniques have been proposed to assess individual performance indicators in a collaborative environment in order to reduce errors at the individual level. The basic intention is to produce a product with zero or few defects, thereby producing the best possible product quality. In the survey, techniques such as genetic algorithms, artificial neural networks, classification and clustering techniques, and decision trees are studied. The analysis shows that these techniques have contributed much to the improvement and enhancement of product quality.

Keywords: data mining, defect prediction, missing requirements, software quality

Procedia PDF Downloads 427
9655 Predicting Bridge Pier Scour Depth with SVM

Authors: Arun Goel

Abstract:

Prediction of the maximum local scour is necessary for the safe and economical design of bridges. A number of equations have been developed over the years to predict local scour depth using laboratory data, and a few pier equations have also been proposed using field data. Most of these equations are empirical in nature, as indicated by past publications. In this paper, attempts have been made to compute the local depth of scour around a bridge pier, in dimensional and non-dimensional form, using linear regression, simple regression, and SVM (Poly and Rbf) techniques along with a few conventional empirical equations. The outcome of this study suggests that SVM (Poly and Rbf) based modeling can be employed as an alternative to linear regression, simple regression, and the conventional empirical equations in predicting the scour depth of bridge piers. The results of the present study, based on the non-dimensional form of bridge pier scour, indicate an improvement in the performance of SVM (Poly and Rbf) in comparison to the dimensional form.
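
A minimal sketch of the comparison, assuming synthetic stand-ins for the non-dimensional pier and flow variables rather than the actual laboratory or field data, is given below: linear regression is compared with SVM regression using polynomial and RBF kernels, with R-squared as the performance measure.

    import numpy as np
    from sklearn.linear_model import LinearRegression
    from sklearn.metrics import r2_score
    from sklearn.model_selection import train_test_split
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVR

    rng = np.random.default_rng(0)
    X = rng.uniform(0.1, 3.0, size=(300, 3))          # stand-in for non-dimensional pier/flow variables
    y = 2.0 * np.tanh(X[:, 0]) * X[:, 1] ** 0.35 + 0.05 * rng.standard_normal(300)   # stand-in scour ratio

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

    models = {
        "linear regression": LinearRegression(),
        "SVM (poly kernel)": make_pipeline(StandardScaler(), SVR(kernel="poly", degree=2, C=10.0)),
        "SVM (RBF kernel)": make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, gamma="scale")),
    }
    for name, model in models.items():
        model.fit(X_tr, y_tr)
        print(f"{name:20s} R2 = {r2_score(y_te, model.predict(X_te)):.3f}")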

Keywords: modeling, pier scour, regression, prediction, SVM (Poly and Rbf kernels)

Procedia PDF Downloads 426
9654 Quality Assurance in Cardiac Disorder Detection Images

Authors: Anam Naveed, Asma Andleeb, Mehreen Sirshar

Abstract:

In this article, image processing techniques are applied to cardiac images to enhance image quality. Two types of methodologies are considered in the survey: invasive techniques and non-invasive techniques. Different image processes that improve cardiac image quality and reduce the amount of radiation exposure for invasive techniques are explored, and different image processing algorithms for enhancing non-invasive cardiac image quality are described. Besides these two methodologies, a third methodology is applied to live streaming of the heart rate in an ECG window for extracting necessary information, removing noise, and enhancing quality. Sensitivity analyses have been carried out to investigate the impact of cardiac images on the diagnosis of cardiac artery disease and how image enhancement will help the cardiologist diagnose disease. The paper evaluates the strengths and weaknesses of the different techniques applied to improve image quality and draws a conclusion. Some specific limitations apply to the whole survey: the patient's heart rate must be 70-75 beats/minute during angiography, and patient weight and the amount of radiation exposure are similarly limited.

Keywords: cardiac images, CT angiography, critical analysis, exposure radiation, invasive techniques, non-invasive techniques

Procedia PDF Downloads 317
9653 Empirical Decomposition of Time Series of Power Consumption

Authors: Noura Al Akkari, Aurélie Foucquier, Sylvain Lespinats

Abstract:

Load monitoring is a management process for energy consumption aimed at energy savings and energy efficiency. Non-Intrusive Load Monitoring (NILM) is one load monitoring method used for disaggregation purposes. NILM is a technique for identifying individual appliances based on the analysis of the whole-residence data retrieved from the main power meter of the house. Our NILM framework starts with data acquisition, followed by data preprocessing, then event detection and feature extraction, and finally general appliance modeling and identification. The event detection stage is a core component of the NILM process since event detection techniques lead to the extraction of appliance features, which are required for the accurate identification of household devices. In this research work, we aim at developing a new event detection methodology with accurate load disaggregation to extract appliance features. The extracted time-domain features are used for tuning general appliance models in the appliance identification and classification steps. We use unsupervised algorithms such as Dynamic Time Warping (DTW). The proposed method relies on detecting the areas of operation of each residential appliance based on the power demand, and then detecting the time at which each selected appliance changes state. In order to fit with the capabilities of practical existing smart meters, we work on low-sampling-rate data with a frequency of 1/60 Hz. The data are simulated with the Load Profile Generator (LPG) software, which has not previously been considered for NILM purposes in the literature. LPG is a numerical software package that uses behavior simulation of the people inside the house to generate residential energy consumption data. The proposed event detection method targets low-consumption loads that are difficult to detect and facilitates the extraction of the specific features used for general appliance modeling. In addition, the identification process includes unsupervised techniques such as DTW. To the best of our knowledge, few unsupervised techniques have been employed with low-sampling-rate data in comparison to the many supervised techniques used for such cases. We extract the power interval within which the operation of the selected appliance falls, along with a time vector for the values delimiting the state transitions of the appliance. Appliance signatures are then formed from the extracted power, geometrical, and statistical features, and these signatures are used to tune general model types for appliance identification using unsupervised algorithms. The method is evaluated using both data simulated with LPG and the real-time Reference Energy Disaggregation Dataset (REDD). For that, we compute performance metrics based on the confusion matrix, considering accuracy, precision, recall, and error rate. The performance of our methodology is then compared with other detection techniques previously used in the literature, such as detection techniques based on statistical variations and abrupt changes (variance sliding window and cumulative sum).
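
The full detection pipeline is not reproduced here, but the unsupervised matching step can be illustrated with a plain dynamic time warping distance between two low-frequency power signatures; the appliance template and observed window below are hypothetical 1/60 Hz samples, not data from LPG or REDD.

    import numpy as np

    def dtw_distance(a, b):
        """Classical O(len(a)*len(b)) dynamic time warping distance between two 1-D sequences."""
        n, m = len(a), len(b)
        cost = np.full((n + 1, m + 1), np.inf)
        cost[0, 0] = 0.0
        for i in range(1, n + 1):
            for j in range(1, m + 1):
                d = abs(a[i - 1] - b[j - 1])
                cost[i, j] = d + min(cost[i - 1, j], cost[i, j - 1], cost[i - 1, j - 1])
        return cost[n, m]

    # Two hypothetical appliance power signatures sampled at 1/60 Hz (one value per minute), in watts.
    kettle_template = np.array([0, 1800, 1800, 1800, 0], dtype=float)
    observed_window = np.array([0, 0, 1750, 1820, 1790, 0], dtype=float)
    print("DTW distance:", dtw_distance(kettle_template, observed_window))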

Keywords: general appliance model, non-intrusive load monitoring, event detection, unsupervised techniques

Procedia PDF Downloads 48
9652 Ontology-Based Approach for Temporal Semantic Modeling of Social Networks

Authors: Souâad Boudebza, Omar Nouali, Faiçal Azouaou

Abstract:

Social networks have recently gained growing interest on the web. Traditional formalisms for representing social networks are static and suffer from a lack of semantics. In this paper, we show how semantic web technologies can be used to model social data. The SemTemp ontology aligns and extends existing ontologies such as FOAF, SIOC, SKOS, and OWL-Time to provide a temporal and semantically rich description of social data. We also present a modeling scenario to illustrate how our ontology can be used to model social networks.

Keywords: ontology, semantic web, social network, temporal modeling

Procedia PDF Downloads 348
9651 Implementation and Performance Analysis of Data Encryption Standard and RSA Algorithm with Image Steganography and Audio Steganography

Authors: S. C. Sharma, Ankit Gambhir, Rajeev Arya

Abstract:

In today's era, data security is an important concern and one of the most demanding issues because it is essential for people using online banking, e-shopping, reservations, etc. The two major techniques used for secure communication are cryptography and steganography. Cryptographic algorithms scramble the data so that an intruder will not be able to retrieve it, whereas steganography hides the data in some cover file so that the presence of communication itself is concealed. This paper presents the implementation of the Ron Rivest, Adi Shamir, and Leonard Adleman (RSA) algorithm with image and audio steganography and of the Data Encryption Standard (DES) algorithm with image and audio steganography. The coding for both algorithms has been done using MATLAB, and it is observed that these combined techniques perform better than the individual techniques. The risk of unauthorized access is alleviated to a certain extent by using these techniques, which could be used in banks, RAW agencies, etc., where highly confidential data is transferred. Finally, a comparison of the two techniques is also given in tabular form.
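
As an illustration of the steganography half of the scheme, the Python sketch below embeds and recovers a payload in the least significant bits of a grayscale image array. It is a generic LSB example, not the paper's MATLAB implementation, and in the paper the payload would first be encrypted with DES or RSA.

    import numpy as np

    def embed_lsb(cover, payload_bits):
        """Hide a bit sequence in the least significant bits of a grayscale image array."""
        flat = cover.flatten().copy()
        if len(payload_bits) > flat.size:
            raise ValueError("Payload too large for this cover image.")
        flat[: len(payload_bits)] = (flat[: len(payload_bits)] & 0xFE) | payload_bits
        return flat.reshape(cover.shape)

    def extract_lsb(stego, n_bits):
        return stego.flatten()[:n_bits] & 1

    # In the paper the payload would first be encrypted with DES or RSA; here it is a plain byte string.
    message = np.frombuffer(b"secret", dtype=np.uint8)
    bits = np.unpackbits(message)

    cover = np.random.default_rng(0).integers(0, 256, size=(64, 64), dtype=np.uint8)
    stego = embed_lsb(cover, bits)
    recovered = np.packbits(extract_lsb(stego, bits.size)).tobytes()
    print(recovered)        # b'secret'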

Keywords: audio steganography, data security, DES, image steganography, intruder, RSA, steganography

Procedia PDF Downloads 261
9650 Multi-Agent TeleRobotic Security Control System: Requirements Definitions of Multi-Agent System Using The Behavioral Patterns Analysis (BPA) Approach

Authors: Assem El-Ansary

Abstract:

This paper illustrates the event-oriented Behavioral Pattern Analysis (BPA) modeling approach in developing a Multi-Agent TeleRobotic Security Control System (MTSCS). The event defined in BPA is a real-life conceptual entity that is unrelated to any implementation. The major contributions of this research are the Behavioral Pattern Analysis (BPA) modeling methodology and the development of an interactive software tool (DECISION), which is based on a combination of the Analytic Hierarchy Process (AHP) and the ELECTRE Multi-Criteria Decision Making (MCDM) methods.
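
The DECISION tool itself is not described in code here, but the AHP step it builds on can be sketched briefly: the priority weights are the normalized principal eigenvector of a pairwise comparison matrix, checked with a consistency ratio. The three-criterion matrix below is an illustrative assumption, not the paper's decision model.

    import numpy as np

    # Illustrative pairwise comparison matrix for three decision criteria (Saaty's 1-9 scale).
    A = np.array([[1.0, 3.0, 5.0],
                  [1/3, 1.0, 2.0],
                  [1/5, 1/2, 1.0]])

    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)
    weights = np.abs(eigvecs[:, k].real)
    weights /= weights.sum()                       # AHP priority vector

    n = A.shape[0]
    ci = (eigvals.real[k] - n) / (n - 1)           # consistency index
    cr = ci / 0.58                                 # Saaty's random index for n = 3 is about 0.58
    print("weights:", np.round(weights, 3), "consistency ratio:", round(cr, 3))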

Keywords: analysis, multi-agent, TeleRobotics control, security, modeling methodology, software modeling, event-oriented, behavioral pattern, use cases

Procedia PDF Downloads 409
9649 Seismic Performance Evaluation of Existing Building Using Structural Information Modeling

Authors: Byungmin Cho, Dongchul Lee, Taejin Kim, Minhee Lee

Abstract:

The procedure for the seismic retrofit of existing buildings includes a seismic evaluation step, in which it is assessed whether the buildings perform satisfactorily under seismic load; based on the results, the buildings are upgraded. To evaluate the seismic performance of a building, the model usually has to be transformed from elastic analysis to inelastic analysis. However, when the data are not transferred automatically between the models, engineers must input them manually. This process introduces inaccuracy and loss of information, so the analysis results become less reliable. Therefore, in this study, a process for the seismic evaluation of existing buildings using structural information modeling is suggested. This structural information modeling makes the work economical and accurate. To this end, the process for seismic evaluation based on ASCE 41 was investigated to determine which parts could be computerized. The structural information modeling process is developed for seismic evaluation using the Perform 3D program, which is commonly used for nonlinear response history analysis. To validate this process, the seismic performance of an existing building is investigated.

Keywords: existing building, nonlinear analysis, seismic performance, structural information modeling

Procedia PDF Downloads 351
9648 Image Segmentation Techniques: Review

Authors: Lindani Mbatha, Suvendi Rimer, Mpho Gololo

Abstract:

Image segmentation is the process of dividing an image into several sections, such as the object's background and the foreground. It is a critical technique in both image-processing tasks and computer vision. Most image segmentation algorithms have been developed for gray-scale images, and comparatively little research and few algorithms have been developed for color images. Most image segmentation algorithms or techniques vary based on the input data and the application, and nearly all of them are unsuitable for noisy environments. Most of the work that has been done uses the Markov Random Field (MRF), which involves heavy computation but is said to be robust to noise. In recent years, image segmentation has been applied to problems such as easy processing of an image, interpretation of the contents of an image, and easy analysis of an image. This article reviews and summarizes some of the image segmentation techniques and algorithms that have been developed in past years. The techniques include neural networks (CNN), edge-based techniques, region growing, clustering, thresholding techniques, and so on. The advantages and disadvantages of medical ultrasound image segmentation techniques are also discussed. The article also addresses the applications and potential future developments around image segmentation. This review article concludes that no single technique is perfectly suitable for segmenting all different types of images, but the use of hybrid techniques yields more accurate and efficient results.
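
As a concrete example of one of the thresholding techniques mentioned above, the sketch below implements Otsu's method directly with NumPy and applies it to a synthetic bimodal image; the image and its parameters are illustrative only.

    import numpy as np

    def otsu_threshold(image):
        """Return the gray level that maximizes between-class variance (Otsu's method)."""
        hist = np.bincount(image.ravel(), minlength=256).astype(float)
        prob = hist / hist.sum()
        omega = np.cumsum(prob)                            # cumulative class probability
        mu = np.cumsum(prob * np.arange(256))              # cumulative mean
        mu_t = mu[-1]
        with np.errstate(divide="ignore", invalid="ignore"):
            sigma_b = (mu_t * omega - mu) ** 2 / (omega * (1.0 - omega))
        sigma_b[np.isnan(sigma_b)] = 0.0
        return int(np.argmax(sigma_b))

    # Synthetic bimodal image: dark background with a brighter square object plus noise.
    rng = np.random.default_rng(0)
    img = rng.normal(60, 10, size=(128, 128))
    img[40:90, 40:90] = rng.normal(170, 10, size=(50, 50))
    img = np.clip(img, 0, 255).astype(np.uint8)

    t = otsu_threshold(img)
    segmented = img > t                                    # binary foreground/background mask
    print("Otsu threshold:", t, "foreground pixels:", int(segmented.sum()))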

Keywords: clustering-based, convolution-network, edge-based, region-growing

Procedia PDF Downloads 55
9647 The Effectiveness of Using Video Modeling Procedures on the iPad to Teach Play Skills to Children with ASD

Authors: Esra Orum Cattik

Abstract:

This study evaluated the effects of using video modeling procedures on the iPad to teach play skills to children with autism spectrum disorders. A male student with autism spectrum disorders participated in the study. A multiple-baseline-across-skills single-subject design was used to evaluate the effects of the video modeling procedures on the iPad. During baseline, no prompts were presented to the participant. In the intervention phase, the teacher presented a video model of the first skill on the iPad and asked him to play with the toys. When the first play skill was completed, intervention began on the second play skill. This procedure continued until intervention had been completed for all three play skills. In the end, the participant learned all three play skills using video modeling presented on the iPad. Based upon the findings of this study, suggestions have been made for future research.

Keywords: autism spectrum disorders, play, play skills, video modeling, single subject design

Procedia PDF Downloads 381
9646 Analytical Investigation of Modeling and Simulation of Different Combinations of Sinusoidal Supplied Autotransformer under Linear Loading Conditions

Authors: M. Salih Taci, N. Tayebi, I. Bozkır

Abstract:

This paper investigates the operation of a sinusoidally supplied autotransformer in the different states of magnetic polarity of the primary and secondary terminals for four different step-up and step-down analytical conditions. A new analytical model and equations for dot-marked and polarity-based step-up and step-down autotransformers are presented. These models are validated by the simulation of current and voltage waveforms for each state. The PSpice environment was used for simulation.

Keywords: autotransformer modeling, autotransformer simulation, step-up autotransformer, step-down autotransformer, polarity

Procedia PDF Downloads 282
9645 Numerical Modeling of a Retaining Wall in Soil Reinforced by Layers of Geogrids

Authors: M. Mellas, S. Baaziz, A. Mabrouki, D. Benmeddour

Abstract:

The reinforcement of backfill masses with horizontal layers of geosynthetics is an interesting economic solution that ensures the stability of retaining walls. The mechanical behavior of soil reinforced by geosynthetics is complex and requires studies and research to understand the mechanisms of rupture, the behavior of the reinforcements in the soil, and the behavior of the main elements of the system: reinforcement, wall, and soil. The present study is interested in the numerical modeling of a retaining wall in soil reinforced by horizontal layers of geogrids. This modeling makes use of the software FLAC3D. This work aims to analyze the effect of the geogrid length "L" when the soil mass supports a uniformly distributed surcharge "Q", taking into account the elements fixing the geogrid layers to the wall rather than the geogrid layers alone.

Keywords: retaining wall, geogrid, reinforced soil, numerical modeling, FLAC3D

Procedia PDF Downloads 456
9644 Assessing the Effectiveness of Machine Learning Algorithms for Cyber Threat Intelligence Discovery from the Darknet

Authors: Azene Zenebe

Abstract:

Deep learning is a subset of machine learning that incorporates techniques for constructing artificial neural networks and has been found useful for modeling complex problems with large datasets. Deep learning requires very high computational power and long training times. By aggregating computing power, high-performance computing (HPC) has emerged as an approach to resolving advanced problems and performing data-driven research activities. Cyber threat intelligence (CTI) is actionable information or insight an organization or individual uses to understand the threats that have targeted, will target, or are currently targeting the organization. Results of a literature review are presented along with the results of an experimental study that compares the performance of tree-based and function-based machine learning, including deep learning algorithms, using a secondary dataset collected from the darknet.

Keywords: deep-learning, cyber security, cyber threat modeling, tree-based machine learning, function-based machine learning, data science

Procedia PDF Downloads 120
9643 Modeling Exponential Growth Activity Using Technology: A Research with Bachelor of Business Administration Students

Authors: V. Vargas-Alejo, L. E. Montero-Moguel

Abstract:

Understanding the concept of function has been important in mathematics education for many years. In this study, the models built by a group of five business administration and accounting undergraduate students when carrying out a population growth activity are analyzed. The theoretical framework is the Models and Modeling Perspective. The results show how the students included tables, graphs, and algebraic representations in their models. Using technology was useful for interpreting, describing, and predicting the situation. The first model the students built to describe the situation was linear. After that, they modified and refined their ways of thinking and finally arrived at an exponential growth model. Modeling the activity was useful for deepening mathematical concepts such as covariation, rate of change, and the exponential function, as well as for differentiating between linear and exponential growth.

Keywords: covariation reasoning, exponential function, modeling, representations

Procedia PDF Downloads 94
9642 Data Mining in Healthcare for Predictive Analytics

Authors: Ruzanna Muradyan

Abstract:

Medical data mining is a crucial field in contemporary healthcare that offers cutting-edge tactics with enormous potential to transform patient care. This abstract examines how sophisticated data mining techniques could reshape the healthcare industry, with a special focus on how they might improve patient outcomes. Healthcare data repositories have dynamically evolved, producing a rich tapestry of diverse, multi-dimensional information that includes genetic profiles, lifestyle markers, electronic health records, and more. By applying data mining techniques to this vast library, a variety of prospects for precision medicine, predictive analytics, and insight generation become visible. Predictive modeling for illness prediction, risk stratification, and therapy efficacy evaluation is an important point of focus. Healthcare providers may use this abundance of data to tailor treatment plans, identify high-risk patient populations, and forecast disease trajectories by applying machine learning algorithms and predictive analytics. Better patient outcomes, more efficient use of resources, and early treatments are made possible by this proactive strategy. Furthermore, data mining techniques act as catalysts to reveal complex relationships between apparently unrelated data items, providing enhanced insights into the causes of disease, genetic susceptibilities, and environmental factors. Healthcare practitioners can gain practical insights that guide disease prevention, customized patient counseling, and focused therapies by analyzing these associations. The abstract also explores the problems and ethical issues that come with using data mining techniques in the healthcare industry. In order to properly use these approaches, it is essential to find a balance between data privacy, security issues, and the interpretability of complex models. Finally, this abstract demonstrates the potential of modern data mining methodologies to transform the healthcare sector. Healthcare practitioners and researchers can uncover unique insights, enhance clinical decision-making, and ultimately elevate patient care to unprecedented levels of precision and efficacy by employing cutting-edge methodologies.

Keywords: data mining, healthcare, patient care, predictive analytics, precision medicine, electronic health records, machine learning, predictive modeling, disease prognosis, risk stratification, treatment efficacy, genetic profiles, precision health

Procedia PDF Downloads 31
9641 The Methodology of System Modeling of Mechatronic Systems

Authors: Lakhoua Najeh

Abstract:

Aims of the work: After a presentation of the functionality of an example of a mechatronic system, a paint mixer system, we present the concepts of modeling and safe operation. This paper briefly discusses how to model and protect the functioning of a mechatronic system, relying mainly on functional analysis and safe operation techniques. Methods: For the study of the example mechatronic system, we use methods for external functional analysis that illustrate the relationships between a mechatronic system and its external environment. We then present the Safe-Structured Analysis Design Technique (Safe-SADT) method, which allows the representation of a mechatronic system, and propose a model of operating safety and automation. This model enables a functional analysis of the mechatronic system based on the GRAFCET (Graphe Fonctionnel de Commande des Etapes et Transitions: Step Transition Function Chart) method, a study of the safe operation of the mechatronic system based on the Safe-SADT method, and the automation of the mechatronic system based on a software tool. Results: The expected results are a model and the safe operation of a mechatronic system. The methodology enables us to analyze the relevance of the different models based on Safe-SADT and GRAFCET with respect to the control and monitoring functions and to study the means of exploiting their synergy. Conclusion: In order to propose a general model of a mechatronic system, a model of analysis, safety operation, and automation of a mechatronic system has been developed. We propose to validate this methodology through a case study of a paint mixer system.

Keywords: mechatronic systems, system modeling, safe operation, Safe-SADT

Procedia PDF Downloads 212
9640 An Integrated Approach to the Carbonate Reservoir Modeling: Case Study of the Eastern Siberia Field

Authors: Yana Snegireva

Abstract:

Carbonate reservoirs are known for their heterogeneity, resulting from various geological processes such as diagenesis and fracturing. These complexities may cause great challenges in understanding fluid flow behavior and predicting the production performance of naturally fractured reservoirs. The investigation of carbonate reservoirs is crucial, as many petroleum reservoirs are naturally fractured, which can be difficult due to the complexity of their fracture networks. This can lead to geological uncertainties, which are important for global petroleum reserves. The problem outlines the key challenges in carbonate reservoir modeling, including the accurate representation of fractures and their connectivity, as well as capturing the impact of fractures on fluid flow and production. Traditional reservoir modeling techniques often oversimplify fracture networks, leading to inaccurate predictions. Therefore, there is a need for a modern approach that can capture the complexities of carbonate reservoirs and provide reliable predictions for effective reservoir management and production optimization. The modern approach to carbonate reservoir modeling involves the utilization of the hybrid fracture modeling approach, including the discrete fracture network (DFN) method and implicit fracture network, which offer enhanced accuracy and reliability in characterizing complex fracture systems within these reservoirs. This study focuses on the application of the hybrid method in the Nepsko-Botuobinskaya anticline of the Eastern Siberia field, aiming to prove the appropriateness of this method in these geological conditions. The DFN method is adopted to model the fracture network within the carbonate reservoir. This method considers fractures as discrete entities, capturing their geometry, orientation, and connectivity. But the method has significant disadvantages since the number of fractures in the field can be very high. Due to limitations in the amount of main memory, it is very difficult to represent these fractures explicitly. By integrating data from image logs (formation micro imager), core data, and fracture density logs, a discrete fracture network (DFN) model can be constructed to represent fracture characteristics for hydraulically relevant fractures. The results obtained from the DFN modeling approaches provide valuable insights into the East Siberia field's carbonate reservoir behavior. The DFN model accurately captures the fracture system, allowing for a better understanding of fluid flow pathways, connectivity, and potential production zones. The analysis of simulation results enables the identification of zones of increased fracturing and optimization opportunities for reservoir development with the potential application of enhanced oil recovery techniques, which were considered in further simulations on the dual porosity and dual permeability models. This approach considers fractures as separate, interconnected flow paths within the reservoir matrix, allowing for the characterization of dual-porosity media. The case study of the East Siberia field demonstrates the effectiveness of the hybrid model method in accurately representing fracture systems and predicting reservoir behavior. The findings from this study contribute to improved reservoir management and production optimization in carbonate reservoirs with the use of enhanced and improved oil recovery methods.

Keywords: carbonate reservoir, discrete fracture network, fracture modeling, dual porosity, enhanced oil recovery, implicit fracture model, hybrid fracture model

Procedia PDF Downloads 46
9639 Comparative Study of Different Enhancement Techniques for Computed Tomography Images

Authors: C. G. Jinimole, A. Harsha

Abstract:

One of the key problems faced in the analysis of Computed Tomography (CT) images is their poor contrast. Image enhancement can be used to improve the visual clarity and quality of the images or to provide a better transformation representation for further processing. Contrast enhancement is one of the accepted methods of image enhancement in various applications in the medical field and is helpful for visualizing and extracting details of brain infarctions, tumors, and cancers from CT images. This paper presents a comparison study of five contrast enhancement techniques suitable for CT images: Power Law Transformation, Logarithmic Transformation, Histogram Equalization, Contrast Stretching, and Laplacian Transformation. All these techniques are compared with each other to find out which enhancement provides the best contrast for CT images. For the comparison, the parameters Peak Signal to Noise Ratio (PSNR) and Mean Square Error (MSE) are used. Logarithmic Transformation provided the clearest and best-quality image of all the techniques studied and achieved the highest PSNR value. The comparison concludes with the more suitable approach for future research, especially for mapping abnormalities in CT images resulting from brain injuries.
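
As a concrete illustration of one of the compared techniques and of the metrics used, the sketch below applies global histogram equalization to a synthetic low-contrast image (a stand-in for a CT slice) and reports MSE and PSNR; it is a generic implementation, not the code used in the paper.

    import numpy as np

    def histogram_equalize(img):
        """Global histogram equalization for an 8-bit grayscale image."""
        hist = np.bincount(img.ravel(), minlength=256)
        cdf = hist.cumsum()
        cdf = (cdf - cdf.min()) / (cdf.max() - cdf.min())       # normalized cumulative distribution
        lut = np.round(cdf * 255).astype(np.uint8)               # gray-level lookup table
        return lut[img]

    def mse(a, b):
        return np.mean((a.astype(float) - b.astype(float)) ** 2)

    def psnr(a, b):
        m = mse(a, b)
        return float("inf") if m == 0 else 10.0 * np.log10(255.0 ** 2 / m)

    # Synthetic low-contrast stand-in for a CT slice (gray levels squeezed into a narrow band).
    rng = np.random.default_rng(0)
    low_contrast = rng.normal(120, 12, size=(256, 256)).clip(90, 150).astype(np.uint8)

    enhanced = histogram_equalize(low_contrast)
    print(f"MSE = {mse(low_contrast, enhanced):.1f}, PSNR = {psnr(low_contrast, enhanced):.2f} dB")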

Keywords: computed tomography, enhancement techniques, increasing contrast, PSNR and MSE

Procedia PDF Downloads 284