Search results for: software process model
30450 Requirements Definitions of Real-Time System Using the Behavioral Patterns Analysis (BPA) Approach: The Healthcare Multi-Agent System
Authors: Assem El-Ansary
Abstract:
This paper illustrates the event-oriented Behavioral Pattern Analysis (BPA) modeling approach using the Healthcare Multi-Agent System. The Event defined in BPA is a real-life conceptual entity that is unrelated to any implementation. The major contributions of this research are the Behavioral Pattern Analysis (BPA) modeling methodology and the development of an interactive software tool (DECISION), which is based on a combination of the Analytic Hierarchy Process (AHP) and the ELECTRE Multi-Criteria Decision Making (MCDM) methods. Keywords: analysis, modeling methodology, software modeling, event-oriented, behavioral pattern, use cases, Healthcare Multi-Agent System
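As background on one half of that combination, AHP derives criterion weights as the principal eigenvector of a pairwise comparison matrix. The sketch below is a generic illustration of that step only; the judgment values are assumed and the connection to the DECISION tool's internals is not taken from the paper.

```python
# AHP weight derivation: the principal eigenvector of a pairwise
# comparison matrix, where a_ij states how strongly criterion i is
# preferred over criterion j (reciprocal matrix, a_ji = 1/a_ij).
import numpy as np

A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])   # example judgments (assumed)

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)       # index of the principal eigenvalue
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                      # normalised criterion weights
print("AHP weights:", np.round(w, 3))
```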
Procedia PDF Downloads 552
30449 Optimization of Springback Prediction in U-Channel Process Using Response Surface Methodology
Authors: Muhamad Sani Buang, Shahrul Azam Abdullah, Juri Saedon
Abstract:
There is little effective guidance on the selection of design parameters for springback of advanced high-strength steel sheet metal in the U-channel cold forming process. This paper presents the development of a predictive model for springback in the U-channel process on advanced high-strength steel sheet employing Response Surface Methodology (RSM). The experiments were performed on dual-phase steel sheet DP590 in the U-channel forming process, while a design of experiments (DoE) approach was used to investigate the effects of four factors, namely blank holder force (BHF), clearance (C), punch travel (Tp), and rolling direction (R), used as input parameters at two levels by applying a full factorial design (2⁴). From a statistical analysis of variance (ANOVA), results showed that blank holder force (BHF), clearance (C), and punch travel (Tp) displayed significant effects on the springback of the flange angle (β2) and wall opening angle (β1), while the rolling direction (R) factor was insignificant. The significant parameters were optimized in order to reduce the springback behavior using a Central Composite Design (CCD) in RSM, and the optimum parameters were determined. A regression model for springback was developed. The effect of individual parameters and their response was also evaluated. The results obtained from the optimum model are in agreement with the experimental values. Keywords: advanced high strength steel, U-channel process, springback, design of experiment, optimization, response surface methodology (RSM)
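For context, the second-order polynomial typically fitted in such an RSM study (generic notation; these are not the authors' estimated coefficients) is

\[ y = \beta_0 + \sum_{i=1}^{k} \beta_i x_i + \sum_{i=1}^{k} \beta_{ii} x_i^2 + \sum_{i<j} \beta_{ij} x_i x_j + \varepsilon, \]

where y is the springback response, the x_i are the coded factors, and ε is the error term.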
Procedia PDF Downloads 541
30448 Development of Ultrasound Probe Holder for Automatic Scanning Asymmetric Reflector
Authors: Nabilah Ibrahim, Hafiz Mohd Zaini, Wan Fatin Liyana Mutalib
Abstract:
Ultrasound equipment is capable of scanning two-dimensional (2D) areas. However, some limitations occur when scanning an object; the problem arises when the scanning process involves an asymmetric object. In this project, an ultrasound probe holder for asymmetric reflector scanning in 3D imaging is proposed to make it easier to scan a phantom or object that has an asymmetric shape. Initially, the constructed asymmetric phantom will be used in 2D scanning. Next, the asymmetric phantom will be interfaced with the movement of the ultrasound probe holder using the Arduino software. After that, the performance of the ultrasound probe holder will be evaluated by using various asymmetric reflectors or phantoms to construct a 3D image. Keywords: ultrasound 3D images, axial and lateral resolution, asymmetric reflector, Arduino software
Procedia PDF Downloads 560
30447 Effectiveness Evaluation of a Machine Design Process Based on the Computation of the Specific Output
Authors: Barenten Suciu
Abstract:
In this paper, the effectiveness of a machine design process is evaluated on the basis of specific output calculations. Concretely, a screw-worm gear mechanical transmission is designed using both the classical and the 3D-CAD methods. Strength analysis and drawing of the designed parts are substantially aided by employing the SolidWorks software. The quality of the design process is assessed by manufacturing (printing) the parts, and by computing the efficiency, specific load, as well as the specific output (work) of the mechanical transmission. The influence of the stroke, travelling velocity, and load on the mechanical output is emphasized. Optimal design of the mechanical transmission becomes possible through appropriate usage of the acquired results. Keywords: mechanical transmission, design, screw, worm-gear, efficiency, specific output, 3D-printing
Procedia PDF Downloads 143
30446 A Nonlinear Approach for System Identification of a Li-Ion Battery Based on a Non-Linear Autoregressive Exogenous Model
Authors: Meriem Mossaddek, El Mehdi Laadissi, El Mehdi Loualid, Chouaib Ennawaoui, Sohaib Bouzaid, Abdelowahed Hajjaji
Abstract:
An electrochemical system is a subset of mechatronic systems that includes a wide variety of batteries, such as nickel-cadmium, lead-acid, and lithium-ion. These structures exhibit several non-linear behaviors and uncertainties over their operating range. This paper studies an effective technique for modeling Lithium-Ion (Li-Ion) batteries using a Nonlinear Auto-Regressive model with exogenous input (NARX). An Artificial Neural Network (ANN) is trained on the data collected from the battery testing process. The proposed model is implemented on a Li-Ion battery cell. Simulation of this model in MATLAB shows good accuracy of the proposed model. Keywords: lithium-ion battery, neural network, energy storage, battery model, nonlinear models
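As a rough illustration of the NARX structure described (the lag orders, network size, and synthetic data below are assumptions, not taken from the paper), a minimal sketch might look like:

```python
# Minimal NARX-style sketch: predict battery voltage from lagged
# voltage (autoregressive part) and lagged current (exogenous input).
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
current = rng.uniform(-1.0, 1.0, 500)        # exogenous input u(t), toy data
voltage = 3.7 + np.cumsum(0.001 * current)   # toy measured output y(t)

LAGS = 3
X, y = [], []
for t in range(LAGS, len(voltage)):
    # regressors: y(t-1..t-LAGS) and u(t-1..t-LAGS)
    X.append(np.concatenate([voltage[t - LAGS:t], current[t - LAGS:t]]))
    y.append(voltage[t])

model = MLPRegressor(hidden_layer_sizes=(10,), max_iter=2000, random_state=0)
model.fit(np.array(X), np.array(y))
print("one-step-ahead prediction:", model.predict(np.array(X[:1]))[0])
```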
Procedia PDF Downloads 116
30445 The Asymptotic Hole Shape in Long Pulse Laser Drilling: The Influence of Multiple Reflections
Authors: Torsten Hermanns, You Wang, Stefan Janssen, Markus Niessen, Christoph Schoeler, Ulrich Thombansen, Wolfgang Schulz
Abstract:
In long pulse laser drilling of metals, it can be demonstrated that the ablation shape approaches a so-called asymptotic shape, such that it changes only slightly or not at all with further irradiation. These findings are already known from ultra short pulse (USP) ablation of dielectric and semiconducting materials. The explanation for the occurrence of an asymptotic shape in long pulse drilling of metals is identified, and a model for the description of the asymptotic hole shape is numerically implemented, tested, and clearly confirmed by comparison with experimental data. The model assumes a robust process, in the sense that the characteristics of the melt flow inside the arising melt film do not change qualitatively when the laser or processing parameters are changed. Only robust processes are technically controllable and thus of industrial interest. The condition for a robust process is identified by a threshold for the mass flow density of the assist gas at the hole entrance, which has to be exceeded. Within a robust process regime, the melt flow characteristics can be captured by only one model parameter, namely the intensity threshold. In analogy to USP ablation (where it has long been known that the resulting hole shape results from a threshold for the absorbed laser fluence), it is demonstrated that in the case of robust long pulse ablation the asymptotic shape forms in such a way that along the whole contour the absorbed heat flux density is equal to the intensity threshold. The intensity threshold depends on the specific material and radiation properties and has to be calibrated by one reference experiment. The model is implemented in a numerical simulation called AsymptoticDrill, which requires so few resources that it can run on common desktop PCs, laptops, or even smart devices. Resulting hole shapes can be calculated within seconds, which represents a clear advantage over other simulations presented in the literature in the context of everyday industrial usage. Against this background, the software is additionally equipped with a user-friendly GUI which allows intuitive usage. Individual parameters can be adjusted using sliders while the simulation result appears immediately in an adjacent window. The platform-independent development allows flexible usage: the operator can use the tool to adjust the process in a very convenient manner on a tablet, while the developer can execute the tool in his office in order to design new processes. Furthermore, to the best knowledge of the authors, AsymptoticDrill is the first simulation which allows the import of measured real beam distributions and thus calculates the asymptotic hole shape on the basis of the real state of the specific manufacturing system. In this paper, the emphasis is placed on the investigation of the effect of multiple reflections on the asymptotic hole shape, which gains in importance when drilling holes with large aspect ratios. Keywords: asymptotic hole shape, intensity threshold, long pulse laser drilling, robust process
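In symbols, the asymptotic condition described above can be stated as (our notation, not the authors' exact formulation)

\[ q_{abs}(\mathbf{x}) = I_{th} \qquad \text{for all } \mathbf{x} \text{ on the hole contour}, \]

where q_abs is the absorbed heat flux density and I_th is the intensity threshold calibrated from the reference experiment.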
Procedia PDF Downloads 214
30444 A Study on Unix Process Crash Based on Efficient Process Management Method
Authors: Guo Haonan, Chen Peiyu, Zhao Hanyu, Burra Venkata Durga Kumar
Abstract:
Unix and Unix-like operating systems are widely used due to their high stability, but they are limited by the parent-child process structure: a child process depends on its parent process, so the crash of a single process may cause the entire process group or even the entire system to fail. Another possibility of unexpected process termination is that the system administrator inadvertently closes the terminal or pseudo-terminal where the application was launched, causing the application process to terminate unexpectedly. This paper analyzes the reasons for these problems and proposes two solutions. Keywords: process management, daemon, login-bash and non-login bash, process group
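The keywords point to daemonization and login/non-login shell handling as the remedies. As one illustration of the former, the classic double-fork daemonization detaches a process from its controlling terminal; this is a generic Unix idiom sketched in Python, not the authors' code.

```python
# Minimal double-fork daemonization sketch: detaches a process from its
# controlling terminal so closing the terminal no longer kills it.
import os
import sys

def daemonize():
    if os.fork() > 0:          # first fork: parent exits
        sys.exit(0)
    os.setsid()                # become session leader, drop the terminal
    if os.fork() > 0:          # second fork: session leader exits,
        sys.exit(0)            # so the daemon can never reacquire a tty
    os.chdir("/")
    os.umask(0)
    devnull = os.open(os.devnull, os.O_RDWR)
    for fd in (0, 1, 2):       # redirect stdin/stdout/stderr to /dev/null
        os.dup2(devnull, fd)

if __name__ == "__main__":
    daemonize()
    # ... long-running work continues here, immune to terminal hangup ...
```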
Procedia PDF Downloads 138
30443 Intelligent Agent Travel Reservation System Requirements Definitions Using the Behavioral Patterns Analysis (BPA) Approach
Authors: Assem El-Ansary
Abstract:
This paper illustrates the event-oriented Behavioral Pattern Analysis (BPA) modeling approach in developing an Intelligent Agent Reservation System (IARS). The Event defined in BPA is a real-life conceptual entity that is unrelated to any implementation. The major contributions of this research are developing the Behavioral Pattern Analysis (BPA) modeling methodology, and developing an interactive software tool (DECISION) which is based on a combination of the Analytic Hierarchy Process (AHP) and the ELECTRE Multi-Criteria Decision Making (MCDM) methods. Keywords: analysis, intelligent agent, reservation system, modeling methodology, software modeling, event-oriented, behavioral pattern, use cases
Procedia PDF Downloads 485
30442 Software-Defined Networks in Utility Power Networks
Authors: Ava Salmanpour, Hanieh Saeedi, Payam Rouhi, Elahe Hamzeil, Shima Alimohammadi, Siamak Hossein Khalaj, Mohammad Asadian
Abstract:
Software-defined network (SDN) is a network architecture designed to control a network using software applications in a centralized manner. This ability enables remote control of the whole network regardless of the network technology. In fact, in this architecture network intelligence is separated from the physical infrastructure, meaning that the required network components can be implemented virtually using software applications. Today, power networks are characterized by a high degree of complexity, with a large number of intelligent devices processing both huge amounts of data and important information. Therefore, reliable and secure communication networks are required, and SDNs are the best choice to meet this need. In this paper, SDN capabilities and characteristics are reviewed and different basic controllers are compared. The importance of using SDNs to escalate efficiency and reliability in utility power networks is discussed, and the comparison between SDN-based power networks and traditional networks is explained. Keywords: software-defined network, SDNs, utility network, open flow, communication, gas and electricity, controller
Procedia PDF Downloads 114
30441 Ethical Decision-Making in AI and Robotics Research: A Proposed Model
Authors: Sylvie Michel, Emmanuelle Gagnou, Joanne Hamet
Abstract:
Researchers in the fields of AI and Robotics frequently encounter ethical dilemmas throughout their research endeavors. Various ethical challenges have been pinpointed in the existing literature, including biases and discriminatory outcomes, diffusion of responsibility, and a deficit in transparency within AI operations. This research aims to pinpoint these ethical quandaries faced by researchers and shed light on the mechanisms behind ethical decision-making in the research process. By synthesizing insights from existing literature and acknowledging prevalent shortcomings, such as overlooking the heterogeneous nature of decision-making, non-accumulative results, and a lack of consensus on numerous factors due to limited empirical research, the objective is to conceptualize and validate a model. This model will incorporate influences from individual perspectives and situational contexts, considering potential moderating factors in the ethical decision-making process. Qualitative analyses were conducted based on direct observation of an AI/Robotics research team focusing on collaborative robotics over several months. Subsequently, semi-structured interviews with 16 team members were conducted. The entire process took place during the first semester of 2023. Observations were analyzed using an analysis grid, and the interviews underwent thematic analysis using Nvivo software. An initial finding involves identifying the ethical challenges that AI/robotics researchers confront, underlining a disparity between practical applications and theoretical considerations regarding ethical dilemmas in the realm of AI. Notably, researchers in AI prioritize the publication and recognition of their work, sparking the genesis of these ethical inquiries. Furthermore, this article illustrates that researchers tend to embrace a consequentialist ethical framework concerning safety (for humans engaging with robots/AI), worker autonomy in relation to robots, and the societal implications of labor (can robots displace jobs?). A second significant contribution entails proposing a model for ethical decision-making within the AI/Robotics research sphere. The proposed model adopts a process-oriented approach, delineating various research stages (topic proposal, hypothesis formulation, experimentation, conclusion, and valorization). Across these stages and the ethical queries they entail, a comprehensive four-point understanding of ethical decision-making is presented: recognition of the moral quandary; moral judgment, signifying the decision-maker's aptitude to discern the morally righteous course of action; moral intention, reflecting the ability to prioritize moral values above others; and moral behavior, denoting the application of moral intention to the situation. Variables such as political inclinations ((anti)-capitalism, environmentalism, veganism) seem to wield significant influence. Moreover, age emerges as a noteworthy moderating factor. AI and robotics researchers are continually confronted with ethical dilemmas during their research endeavors, necessitating thoughtful decision-making. The contribution involves introducing a contextually tailored model, derived from meticulous observations and insightful interviews, enabling the identification of factors that shape ethical decision-making at different stages of the research process. Keywords: ethical decision making, artificial intelligence, robotics, research
Procedia PDF Downloads 79
30440 Planning a Haemodialysis Process by Minimum Time Control of Hybrid Systems with Sliding Motion
Authors: Radoslaw Pytlak, Damian Suski
Abstract:
The aim of the paper is to provide a computational tool for planning a haemodialysis process. It is shown that optimization methods can be used to obtain the most effective treatment focused on removing both urea and phosphorus during the process. In order to achieve that, the IV-compartment model of phosphorus kinetics is applied. This kinetics model takes into account a rebound phenomenon that can occur during haemodialysis, and results in a hybrid model of the process. Furthermore, the vector fields associated with the model equations are such that it is very likely that using the most intuitive objective functions in the planning problem could lead to solutions which include sliding motions. Therefore, building computational tools for solving the problem of planning a haemodialysis process has required constructing numerical algorithms for solving optimal control problems with hybrid systems. The paper concentrates on minimum time control of hybrid systems, since this control objective is the most suitable for the haemodialysis process considered in the paper. The presented approach to optimal control problems with hybrid systems differs from others in several aspects. First of all, it is assumed that a hybrid system can exhibit sliding modes. Secondly, the system's motion on the switching surface is described by index-2 differential-algebraic equations, which guarantees accurate tracking of the sliding motion surface. Thirdly, the gradients of the problem's functionals are evaluated with the help of adjoint equations. The adjoint equations presented in the paper take into account sliding motion and exhibit jump conditions at transition times. The optimality conditions, in the form of the weak maximum principle for optimal control problems with hybrid systems exhibiting sliding modes and with piecewise constant controls, are stated. The presented sensitivity analysis can be used to construct globally convergent algorithms for solving the considered problems. The paper presents numerical results of solving the haemodialysis planning problem. Keywords: haemodialysis planning process, hybrid systems, optimal control, sliding motion
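For readers unfamiliar with the formulation, sliding motion on a switching surface s(x) = 0 can be written as an index-2 differential-algebraic system of Filippov type (generic notation, not the paper's exact equations):

\[ \dot{x} = (1-\lambda)\, f^{-}(x,u) + \lambda\, f^{+}(x,u), \qquad 0 = s(x), \]

where f⁻ and f⁺ are the vector fields on either side of the surface and the algebraic variable λ ∈ [0, 1] is determined by the constraint, which is what gives the system its index-2 structure.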
Procedia PDF Downloads 195
30439 A Model-Reference Sliding Mode for Dual-Stage Actuator Servo Control in HDD
Authors: S. Sonkham, U. Pinsopon, W. Chatlatanagulchai
Abstract:
This paper presents a method of designing and developing sliding mode control (SMC) for the servo system in a dual-stage actuator (DSA) hard disk drive. Mathematical models of the hard disk drive actuators are obtained, extracted from measuring the frequency responses of the voice-coil motor (VCM) and the PZT micro-actuator separately. Matlab software tools are used for mathematical model estimation and also for controller design and simulation. A model-reference approach is selected as the proposed technique for the tracking requirement. The simulation results show that the performance of the model-reference SMC controller design in DSA servo control is satisfactory in terms of tracking error, as well as keeping the position of the head within the boundary of +/-5% of the track width in the presence of internal and external disturbances. The overall results of the model-reference SMC design in the DSA meet the requirement specifications, and a significant reduction in % off-track is found when compared to the single-stage actuator (SSA). Keywords: hard disk drive, dual-stage actuator, track following, HDD servo control, sliding mode control, model-reference, tracking control
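As background, a typical sliding mode law for model-following (a generic textbook form, not the authors' exact design) defines a sliding surface on the tracking error e between the reference model state and the plant state:

\[ s = \dot{e} + \lambda e, \qquad u = u_{eq} - K\, \mathrm{sgn}(s), \]

where u_eq cancels the known dynamics and the switching gain K is chosen to dominate the bounded disturbances.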
Procedia PDF Downloads 366
30438 Using AI Based Software as an Assessment Aid for University Engineering Assignments
Authors: Waleed Al-Nuaimy, Luke Anastassiou, Manjinder Kainth
Abstract:
As the process of teaching has evolved with the advent of new technologies over the ages, so has the process of learning. Educators have perpetually found themselves on the lookout for new technology-enhanced methods of teaching in order to increase learning efficiency and decrease ever-expanding workloads. Shortly after the invention of the internet, web-based learning started to pick up in the late 1990s, and educators quickly found that the process of providing learning material and marking assignments could change thanks to the connectivity offered by the internet. With the creation of early web-based virtual learning environments (VLEs) such as SPIDER and Blackboard, it soon became apparent that VLEs resulted in higher reported computer self-efficacy among students, but at the cost of students being less satisfied with the learning process. It may be argued that the impersonal nature of VLEs and their limited functionality may have been the leading factors contributing to this reported dissatisfaction. To this day, often faced with the prospect of assigning colossal engineering cohorts their homework and assessments, educators may frequently choose optimally curated assessment formats, such as multiple-choice quizzes and numerical answer input boxes, so that automated grading software embedded in the VLEs can save time and mark student submissions instantaneously. A crucial skill that is meant to be learnt during most science and engineering undergraduate degrees is gaining confidence in using, solving, and deriving mathematical equations. Equations underpin a significant portion of the topics taught in many STEM subjects, and it is in homework assignments and assessments that this understanding is tested. It is not hard to see that this can become challenging if the majority of assignment formats students are engaging with are multiple-choice questions, and educators end up with a reduced perspective of their students' ability to manipulate equations. Artificial intelligence (AI) has in recent times been shown to be an important consideration for many technologies. In our paper, we explore the use of new AI-based software designed to work in conjunction with current VLEs. Using our experience with the software, we discuss its potential to solve a selection of problems ranging from impersonality to the reduction of educator workloads by speeding up the marking process. We examine the software's potential to increase learning efficiency through its features, which claim to allow more customized and higher-quality feedback. We investigate the usability of features allowing students to input equation derivations in a range of different forms, and discuss relevant observations associated with these input methods. Furthermore, we make ethical considerations and discuss potential drawbacks of the software, including the extent to which optical character recognition (OCR) could play a part in the perpetuation of errors and create disagreements between student intent and their submitted assignment answers. It is the intention of the authors that this study will be useful as an example of the implementation of AI in a practical assessment scenario, insofar as serving as a springboard for further considerations and studies that utilise AI in the setting and marking of science and engineering assignments. Keywords: engineering education, assessment, artificial intelligence, optical character recognition (OCR)
Procedia PDF Downloads 123
30437 Phenomena-Based Approach for Automated Generation of Process Options and Process Models
Authors: Parminder Kaur Heer, Alexei Lapkin
Abstract:
Due to the global challenges of increased competition and demand for more sustainable products/processes, there is rising pressure on industry to develop innovative processes. Through Process Intensification (PI), existing and new processes may be able to attain higher efficiency. However, very few PI options are generally considered. This is because processes are typically analysed at the unit operation level, thus limiting the search space for potential process options. PI performed at more detailed levels of a process can increase the size of the search space. The different levels at which PI can be achieved are the unit operation, functional, and phenomena levels. Physical/chemical phenomena form the lowest level of aggregation and are thus expected to give the highest impact, because all intensification options can be described by their enhancement. The objective of the current work is thus the generation of numerous process alternatives based on phenomena, and the development of their corresponding computer-aided models. The methodology comprises: a) automated generation of process options, and b) automated generation of process models. The process under investigation is decomposed into functions, viz. reaction, separation, etc., and these functions are further broken down into the phenomena required to perform them. E.g., separation may be performed via vapour-liquid or liquid-liquid equilibrium. A list of phenomena for the process is formed, and new phenomena, which can overcome the difficulties/drawbacks of the current process or can enhance the effectiveness of the process, are added to the list. For instance, a catalyst separation issue can be handled by using solid catalysts; the corresponding phenomena are identified and added. The phenomena are then combined to generate all possible combinations. However, not all combinations make sense and, hence, screening is carried out to discard the combinations that are meaningless. For example, phase change phenomena need the co-presence of energy transfer phenomena. Feasible combinations of phenomena are then assigned to the functions they execute. A combination may accomplish a single function or multiple functions, i.e. it might perform reaction or reaction with separation. The combinations are then allotted to the functions needed for the process. This creates a series of options for carrying out each function. Combination of these options for the different functions in the process leads to the generation of a superstructure of process options. These process options, which are formed by a list of phenomena for each function, are passed to the model generation algorithm in the form of binaries (1, 0). The algorithm gathers the active phenomena and couples them to generate the model. A series of models is generated for the functions, which are combined to get the process model. The most promising process options are then chosen subject to a performance criterion, for example purity of product, or via a multi-objective Pareto optimisation. The methodology was applied to a two-step process and the best route was determined based on the higher product yield. The current methodology can identify, produce, and evaluate process intensification options from which the optimal process can be determined. It can be applied to any chemical/biochemical process because of its generic nature. Keywords: phenomena, process intensification, process models, process options
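A minimal sketch of the combine-then-screen step described above (the phenomena names and the screening rule are illustrative assumptions, not the authors' actual lists):

```python
# Generate all phenomena combinations and screen out meaningless ones,
# e.g. a phase-change phenomenon requires co-present energy transfer.
from itertools import combinations

phenomena = ["mixing", "reaction", "vapour-liquid",
             "phase-change", "energy-transfer"]

def feasible(combo):
    # screening rule: phase change needs energy transfer in the same set
    if "phase-change" in combo and "energy-transfer" not in combo:
        return False
    return True

options = [
    set(combo)
    for r in range(1, len(phenomena) + 1)
    for combo in combinations(phenomena, r)
    if feasible(combo)
]
print(f"{len(options)} feasible phenomena combinations")
```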
Procedia PDF Downloads 234
30436 Design and Development of the Force Plate for the Study of Driving-Point Biodynamic Responses
Authors: Vikas Kumar, V. H. Saran, Arpit Mathur, Avik Kathuria
Abstract:
The evaluation of the biodynamic responses of the human body to whole body vibration exposure is necessary to quantify the exposure effects. A force plate model was designed with the help of CAD software and investigated by performing modal, stress, and strain analyses using a finite element approach in the software. The results of the modal, stress, and strain analyses were within the limits for measurement of biodynamic responses to whole body vibration. The physical model of the force plate was manufactured, fixed to the vibration simulator, and further used in the experiments for the evaluation of the apparent mass responses of ten recruited subjects standing in an erect posture exposed to vertical whole body vibration. The platform was excited with sinusoidal vibration at vibration magnitudes of 1.0 and 1.5 m/s² rms at frequencies of 2, 3, 4, 5, 6, 8, 10, 12.5, 16, and 20 Hz. The magnitude of the normalised apparent mass shows the trend observed in many past studies. The peak in the normalised apparent mass was observed at the 4 and 5 Hz frequencies of vertical whole body vibration. Nonlinearity with respect to vibration magnitude was also observed in the normalised apparent mass responses. Keywords: whole body vibration, apparent mass, modeling, force plate
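For reference, the apparent mass measured at the driving point is the frequency-domain ratio of force to acceleration, usually normalised by the subject's static mass (standard definitions, not specific to this study):

\[ M(j\omega) = \frac{F(j\omega)}{a(j\omega)}, \qquad M_{norm}(j\omega) = \frac{M(j\omega)}{m_{static}}. \]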
Procedia PDF Downloads 418
30435 Presentation of a Mix Algorithm for Estimating the Battery State of Charge Using Kalman Filter and Neural Networks
Authors: Amin Sedighfar, M. R. Moniri
Abstract:
Determination of the state of charge (SOC) is becoming an increasingly important issue in all applications that include a battery. In fact, estimation of the SOC is a fundamental need for the battery, which is the most important energy storage element in Hybrid Electric Vehicles (HEVs), smart grid systems, drones, UPS, and so on. Regarding those applications, the SOC estimation algorithm is expected to be precise and easy to implement. This paper presents an online method for the estimation of the SOC of Valve-Regulated Lead Acid (VRLA) batteries. The proposed method uses the well-known Kalman Filter (KF) and Neural Networks (NNs), and all of the simulations have been done with MATLAB software. The NN is trained offline using the data collected from the battery discharging process. A generic cell model is used, and the underlying dynamic behavior of the model uses two capacitors (bulk and surface) and three resistors (terminal, surface, and end), where the SOC determined from the voltage represents the bulk capacitor. The aim of this work is to compare the performance of conventional integration-based SOC estimation methods with the mixed algorithm. Moreover, by including the effect of temperature, the final result becomes more accurate. Keywords: Kalman filter, neural networks, state-of-charge, VRLA battery
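As a toy illustration of the KF half of such a mixed scheme (the capacity, noise values, and the stand-in for the NN output are all assumptions, not the paper's parameters):

```python
# One-dimensional Kalman filter sketch for SOC tracking: the state is the
# bulk SOC, propagated by coulomb counting and corrected with a
# voltage-based SOC measurement (e.g. supplied by a trained NN).
import numpy as np

dt, capacity_As = 1.0, 7.2e3        # time step [s], capacity [A*s] (assumed)
q_proc, r_meas = 1e-7, 1e-3         # process / measurement noise (assumed)

soc, p = 0.9, 1e-2                  # initial state estimate and covariance
rng = np.random.default_rng(1)

for _ in range(100):
    current = 1.0                              # discharge current [A]
    # predict: coulomb counting
    soc -= current * dt / capacity_As
    p += q_proc
    # update: noisy SOC observation, standing in for the NN output
    z = soc + rng.normal(0.0, np.sqrt(r_meas))
    k = p / (p + r_meas)                       # Kalman gain
    soc += k * (z - soc)
    p *= (1.0 - k)

print(f"estimated SOC after 100 s: {soc:.4f}")
```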
Procedia PDF Downloads 193
30434 Fast Bayesian Inference of Multivariate Block-Nearest Neighbor Gaussian Process (NNGP) Models for Large Data
Authors: Carlos Gonzales, Zaida Quiroz, Marcos Prates
Abstract:
Several spatial variables collected at the same location that share a common spatial distribution can be modeled simultaneously through a multivariate geostatistical model that takes into account the correlation between these variables and the spatial autocorrelation. The main goal of this model is to perform spatial prediction of these variables in the region of study. Here we focus on a geostatistical multivariate formulation that relies on sharing common spatial random effect terms. In particular, the first response variable can be modeled by a mean that incorporates a shared random spatial effect, while the other response variables depend on this shared spatial term, in addition to specific random spatial effects. Each spatial random effect is defined through a Gaussian process with a valid covariance function, but in order to improve the computational efficiency when the data are large, each Gaussian process is approximated by a Gaussian Markov random field (GMRF), specifically the block nearest neighbor Gaussian process (Block-NNGP). This approach involves dividing the spatial domain into several dependent blocks under certain constraints, where the cross blocks allow capturing the spatial dependence on a large scale, while each individual block captures the spatial dependence on a smaller scale. The multivariate geostatistical model belongs to the class of latent Gaussian models; thus, to achieve fast Bayesian inference, the integrated nested Laplace approximation (INLA) method is used. The good performance of the proposed model is shown through simulations and applications to massive data. Keywords: Block-NNGP, geostatistics, Gaussian process, GMRF, INLA, multivariate models
Procedia PDF Downloads 98
30433 Exchanging Radiology Reporting System with Electronic Health Record: Designing a Conceptual Model
Authors: Azadeh Bashiri
Abstract:
Introduction: In order to better design the electronic health record system in Iran, integration of health information systems based on a common language is required to interpret and exchange information with this system. Background: This study provides a conceptual model of a radiology reporting system using the Unified Modeling Language (UML). The proposed model can solve the problem of integrating this information system with the electronic health record system. By using this model and designing its services, one can easily connect to the electronic health record in Iran and facilitate the transfer of radiology report data. Methods: This is a cross-sectional study that was conducted in 2013. The study population was 22 experts working at the Imaging Center in Imam Khomeini Hospital in Tehran, and the sample accorded with the population. The research tool was a questionnaire prepared by the researcher to determine the information requirements. Content validity and the test-retest method were used to measure the validity and reliability of the questionnaire, respectively. Data were analyzed with the average index, using SPSS. Also, Visual Paradigm software was used to design the conceptual model. Result: Based on the requirements assessment of experts and related texts, administrative, demographic, and clinical data, radiological examination results, and, if an anesthesia procedure was performed, anesthesia data were suggested as the minimum data set for the radiology report, and based on it a class diagram was designed. Also, by identifying the radiology reporting system process, a use case diagram was drawn. Conclusion: Considering the application of radiology reports in the electronic health record system for diagnosing and managing patients' clinical problems, a conceptual model for the radiology reporting system is provided; by designing it systematically, the problem of data sharing between these systems and the electronic health record system would be eliminated. Keywords: structured radiology report, information needs, minimum data set, electronic health record system in Iran
Procedia PDF Downloads 254
30432 Students’ Experiential Knowledge Production in the Teaching-Learning Process of Universities
Authors: Didiosky Benítez-Erice, Frederik Questier, Dalgys Pérez-Luján
Abstract:
This paper aims to present two models of the production of students’ experiential knowledge in the teaching-learning process of higher education: the teacher-centered production model and the student-centered production model. Drawing on a range of knowledge management and experiential learning theories, the paper elaborates on the nature of students’ experiential knowledge and proposes further adjustments of existing second-generation knowledge management theories, taking into account the particularities of higher education. Despite its theoretical nature, the paper can be relevant for future studies that stress student-driven improvement and innovation at higher education institutions. Keywords: experiential knowledge, higher education, knowledge management, teaching-learning process
Procedia PDF Downloads 446
30431 Hardware Implementation for the Contact Force Reconstruction in Tactile Sensor Arrays
Authors: María-Luisa Pinto-Salamanca, Wilson-Javier Pérez-Holguín
Abstract:
Reconstruction of contact forces is a fundamental technique for analyzing the properties of a touched object and is essential for regulating the grip force in slip control loops. It is based on processing the distribution, intensity, and direction of the forces captured by the sensors. Currently, efficient hardware alternatives are being used more frequently in different fields of application, allowing the implementation of computationally complex algorithms, as is the case with tactile signal processing. The use of hardware for smart tactile sensing systems is a research area that promises to improve the processing time and portability requirements of applications such as artificial skin and robotics, among others. The literature review shows that hardware implementations are present today in almost all stages of smart tactile detection systems, except in the force reconstruction process, a stage in which they have been less applied. This work presents a hardware implementation of a model-driven method reported in the literature for the contact force reconstruction of flat and rigid tactile sensor arrays from normal stress data. From the analysis of a software implementation of this model, the proposed implementation parallelizes the tasks that facilitate the execution of matrix operations and a two-dimensional optimization function to obtain a force vector for each taxel in the array. This work seeks to take advantage of the parallel hardware characteristics of Field Programmable Gate Arrays (FPGAs) and the possibility of applying appropriate algorithm parallelization techniques, guided by the rules of generalization, efficiency, and scalability in the tactile decoding process, and considering low latency, low power consumption, and real-time execution as the main design parameters. The results show a maximum estimation error of 32% in the tangential forces and 22% in the normal forces with respect to simulation by the Finite Element Modeling (FEM) technique of Hertzian and non-Hertzian contact events, over sensor arrays of 10×10 taxels of different sizes. The hardware implementation was carried out on a Xilinx® MPSoC XCZU9EG-2FFVB1156 platform that allows the reconstruction of force vectors following a scalable approach, from the information captured by tactile sensor arrays composed of up to 48×48 taxels that use various transduction technologies. The proposed implementation demonstrates a reduction in estimation time to 1/180 of that of software implementations. Despite the relatively high values of the estimation errors, the information provided by this implementation on the tangential and normal tractions and the triaxial reconstruction of forces allows the tactile properties of the touched object to be adequately reconstructed, similar to those obtained in the software implementation and in the two FEM simulations taken as reference. Although errors could be reduced, the proposed implementation is useful for decoding contact forces for portable tactile sensing systems, thus helping to expand electronic skin applications in robotic and biomedical contexts. Keywords: contact forces reconstruction, forces estimation, tactile sensor array, hardware implementation
Procedia PDF Downloads 196
30430 Ontology-Driven Knowledge Discovery and Validation from Admission Databases: A Structural Causal Model Approach for Polytechnic Education in Nigeria
Authors: Bernard Igoche Igoche, Olumuyiwa Matthew, Peter Bednar, Alexander Gegov
Abstract:
This study presents an ontology-driven approach for knowledge discovery and validation from admission databases in Nigerian polytechnic institutions. The research aims to address the challenges of extracting meaningful insights from vast amounts of admission data and utilizing them for decision-making and process improvement. The proposed methodology combines the knowledge discovery in databases (KDD) process with a structural causal model (SCM) ontological framework. The admission database of Benue State Polytechnic Ugbokolo (Benpoly) is used as a case study. The KDD process is employed to mine and distill knowledge from the database, while the SCM ontology is designed to identify and validate the important features of the admission process. The SCM validation is performed using the conditional independence test (CIT) criteria, and an algorithm is developed to implement the validation process. The identified features are then used for machine learning (ML) modeling and prediction of admission status. The results demonstrate the adequacy of the SCM ontological framework in representing the admission process and the high predictive accuracies achieved by the ML models, with k-nearest neighbors (KNN) and support vector machine (SVM) achieving 92% accuracy. The study concludes that the proposed ontology-driven approach contributes to the advancement of educational data mining and provides a foundation for future research in this domain. Keywords: admission databases, educational data mining, machine learning, ontology-driven knowledge discovery, polytechnic education, structural causal model
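A rough sketch of the final modeling step (the feature values and labels below are placeholders; the paper's SCM-validated admission features would take their place):

```python
# Predict admission status from SCM-validated features with KNN and SVM.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 4))                    # placeholder features
y = (X[:, 0] + 0.5 * X[:, 2] > 0).astype(int)    # placeholder admission status

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
for name, clf in [("KNN", KNeighborsClassifier(n_neighbors=5)),
                  ("SVM", SVC(kernel="rbf"))]:
    clf.fit(X_tr, y_tr)
    print(name, "accuracy:", round(clf.score(X_te, y_te), 3))
```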
Procedia PDF Downloads 66
30429 Study the Effect of Tolerances for Press Tool Assembly: Computer Aided Tolerance Analysis
Authors: Subodh Kumar, Ramkisan Pawar, Gopal D. Belurkar
Abstract:
This paper describes a study of a simple blanking tool. In blanking or piercing operations, the punch and die should be concentric for proper cutting. In this study, the tolerance analysis method is used to analyze the variation in the press tool assembly. Variation results in eccentricity between die and punch due to the cumulative tolerance of the parts used in the assembly. 1D variation analyses were performed with the CREO Parametric computer-aided design (CAD) software powered by the CETOL 6σ computer-aided tolerance analysis software. Use of the CAD analysis software gave the opportunity to find the cause of variation in the tool assembly. Accordingly, new tolerance specifications and process settings for die set manufacturing were determined. Tolerance allocation and tolerance analysis were performed iteratively to conclude that the position tolerance as well as the size tolerance of the hole in the top plate for the bush, and the size tolerance of the guide pillar, were most responsible for the eccentricity between punch and die. This work proposes optimum tolerances for press tool assembly parts to achieve 100% yield for the specified 0.015 mm minimum tolerance zone. Keywords: blanking, GD&T (geometric dimensioning and tolerancing), DPMU (defects per million units), press tool, stackup analysis, tolerance allocation, yield percentage
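To illustrate the kind of 1D stackup arithmetic such tools automate (the contributor tolerances below are invented, not from the study), both worst-case and root-sum-square (RSS) estimates can be computed:

```python
# 1D tolerance stackup: worst-case vs. statistical (RSS) accumulation.
import math

# values are the ± tolerances [mm] of each contributor in the chain
tolerances_mm = [0.010, 0.008, 0.005, 0.012]   # assumed contributors

worst_case = sum(tolerances_mm)
rss = math.sqrt(sum(t**2 for t in tolerances_mm))

print(f"worst-case stackup: +/-{worst_case:.3f} mm")
print(f"RSS stackup:        +/-{rss:.3f} mm")
```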
Procedia PDF Downloads 364
30428 Software Development for AASHTO and Ethiopian Roads Authority Flexible Pavement Design Methods
Authors: Amare Setegn Enyew, Bikila Teklu Wodajo
Abstract:
The primary aim of flexible pavement design is to ensure the development of economical and safe road infrastructure. However, failures can still occur due to improper or erroneous structural design. In Ethiopia, the design of flexible pavements relies on manual calculations and the selection of a pavement structure from a catalogue. The catalogue offers, in eight different charts, alternative structures for combinations of traffic and subgrade classes, as outlined in the Ethiopian Roads Authority (ERA) Pavement Design Manual 2001. Furthermore, design modification is allowed in accordance with the structural number principles outlined in the AASHTO 1993 Guide for Design of Pavement Structures. Nevertheless, the manual calculation and design process involves the use of nomographs, charts, tables, and formulas, which increases the likelihood of human errors and inaccuracies, and this may lead to unsafe or uneconomical road construction. To address this challenge, software called AASHERA has been developed for the AASHTO 1993 and ERA design methods, using the MATLAB language. The software accurately determines the required thicknesses of the flexible pavement surface, base, and subbase layers for the two methods. It also digitizes design inputs and references such as nomographs, charts, default values, and tables. Moreover, the software allows easier comparison of the two design methods in terms of results and cost of construction. AASHERA's accuracy has been confirmed through comparisons with designs from handbooks and manuals. The software can aid in reducing human errors, inaccuracies, and time consumption compared to the conventional manual design methods employed in Ethiopia. AASHERA, with its validated accuracy, proves to be an indispensable tool for flexible pavement structure designers. Keywords: flexible pavement design, AASHTO 1993, ERA, MATLAB, AASHERA
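For context, the AASHTO 1993 structural number referred to above combines layer thicknesses D_i, layer coefficients a_i, and drainage coefficients m_i as

\[ SN = a_1 D_1 + a_2 D_2 m_2 + a_3 D_3 m_3, \]

with the required SN obtained from the AASHTO design equation as a function of traffic, reliability, serviceability loss, and subgrade resilient modulus.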
Procedia PDF Downloads 63
30427 Simulation-Based Optimization Approach for an Electro-Plating Production Process Based on Theory of Constraints and Data Envelopment Analysis
Authors: Mayada Attia Ibrahim
Abstract:
Evaluating and developing the electroplating production process is a key challenge for this type of process. The process is influenced by several factors, such as process parameters, process costs, and production environments. Analyzing and optimizing all these factors together requires extensive analytical techniques that are not available in real industrial settings. This paper presents a practice-based framework for the evaluation and optimization of some of the crucial factors that affect the costs and production times associated with this type of process: energy costs, material costs, and product flow times. The proposed approach uses Design of Experiments, Discrete-Event Simulation, and Theory of Constraints, which were respectively used to identify the most significant factors affecting the production process, simulate a real production line to recognize the effect of these factors, and assign possible bottlenecks. Several scenarios are generated as corrective strategies for improving the production line. Following that, the input-oriented CCR data envelopment analysis (DEA) model is used to evaluate and optimize the suggested scenarios. Keywords: electroplating process, simulation, design of experiment, performance optimization, theory of constraints, data envelopment analysis
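For reference, the input-oriented CCR model in its multiplier (LP) form evaluates a scenario j₀ with inputs x_{ij₀} and outputs y_{rj₀} as (standard formulation, generic indices):

\[ \max_{u,v}\; \sum_r u_r y_{r j_0} \quad \text{s.t.} \quad \sum_i v_i x_{i j_0} = 1, \qquad \sum_r u_r y_{rj} - \sum_i v_i x_{ij} \le 0 \;\; \forall j, \qquad u_r, v_i \ge 0. \]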
Procedia PDF Downloads 100
30426 The DAQ Debugger for iFDAQ of the COMPASS Experiment
Authors: Y. Bai, M. Bodlak, V. Frolov, S. Huber, V. Jary, I. Konorov, D. Levit, J. Novy, D. Steffen, O. Subrt, M. Virius
Abstract:
In general, state-of-the-art Data Acquisition Systems (DAQ) in high energy physics experiments must satisfy high requirements in terms of reliability, efficiency, and data rate capability. This paper presents the development and deployment of a debugging tool named DAQ Debugger for the intelligent, FPGA-based Data Acquisition System (iFDAQ) of the COMPASS experiment at CERN. Utilizing a hardware event builder, the iFDAQ is designed to be able to read out data at the experiment's average maximum rate of 1.5 GB/s. In complex software such as the iFDAQ, having thousands of lines of code, the debugging process is absolutely essential to reveal all software issues. Unfortunately, conventional debugging of the iFDAQ is not possible during real data taking. The DAQ Debugger is a tool for identifying a problem, isolating the source of the problem, and then either correcting the problem or determining a way to work around it. It provides a layer for easy integration into any process and has no impact on process performance. Based on the handling of system signals, the DAQ Debugger represents an alternative to the conventional debuggers provided by most integrated development environments. Whenever a problem occurs, it generates reports containing all the necessary information for a deeper investigation and analysis. The DAQ Debugger was fully incorporated into all processes in the iFDAQ during the 2016 run. It helped to reveal remaining software issues and significantly improved the stability of the system in comparison with the previous run. In this paper, we present the DAQ Debugger from several perspectives and discuss it in detail. Keywords: DAQ Debugger, data acquisition system, FPGA, system signals, Qt framework
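As a toy illustration of signal-based crash reporting of this kind (the handler details and report contents are our assumptions, not the iFDAQ code, which is written against the Qt framework):

```python
# Install a handler for fatal signals that writes a diagnostic report
# before the process dies, instead of attaching a conventional debugger.
import signal
import sys
import traceback
import faulthandler

def crash_report(signum, frame):
    with open("crash_report.txt", "w") as f:
        f.write(f"received signal {signum} ({signal.Signals(signum).name})\n")
        traceback.print_stack(frame, file=f)   # where the process was
    sys.exit(1)

for sig in (signal.SIGTERM, signal.SIGINT, signal.SIGUSR1):
    signal.signal(sig, crash_report)

faulthandler.enable()   # also dump tracebacks on hard faults (e.g. SIGSEGV)

signal.raise_signal(signal.SIGUSR1)   # demo: trigger a report
```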
Procedia PDF Downloads 284
30425 Hawkes Process-Based Reflexivity Analysis in the Cryptocurrency Market
Authors: Alev Atak
Abstract:
We study endogeneity in the cryptocurrency market via the branching ratio of the Hawkes process and evaluate the degree of self-excitation in financial markets. We consider a semi-parametric self-exciting point process regression model in which the excitation function is assumed to be smooth and decreasing but otherwise unspecified, and the baseline intensity is assumed to be a linear function of the regressors. We apply the empirical analysis to the three largest crypto assets, i.e., Bitcoin, Ethereum, and Ripple, and provide a comparison with other financial assets such as the S&P 500, gold, and the volatility index VIX, observed from January 2015 to December 2020. The results show variable and high levels of endogeneity in the basket of cryptocurrencies under investigation, underlining evidence of a significant role of endogenous feedback mechanisms in the price formation process. Keywords: Hawkes process, cryptocurrency, endogeneity, reflexivity
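For context, a linear Hawkes process has conditional intensity and branching ratio (standard definitions)

\[ \lambda(t) = \mu + \sum_{t_i < t} \phi(t - t_i), \qquad n = \int_0^{\infty} \phi(s)\, ds, \]

where μ is the baseline intensity (here a linear function of the regressors), φ is the excitation function, and the branching ratio n measures the fraction of events generated endogenously, with n < 1 required for stationarity.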
Procedia PDF Downloads 81
30424 A Method for Identifying Unusual Transactions in E-commerce Through Extended Data Flow Conformance Checking
Authors: Handie Pramana Putra, Ani Dijah Rahajoe
Abstract:
The proliferation of smart devices and advancements in mobile communication technologies, together with the widespread influence of e-commerce, have permeated various facets of life. Detecting abnormal transactions holds paramount significance in this realm due to the potential for substantial financial losses. Moreover, the fusion of data flow and control flow plays a critical role in process modeling and data analysis, contributing significantly to the accuracy and security of business processes. This paper introduces an alternative approach to identifying abnormal transactions through a model that integrates both data and control flows. Referred to as the Extended Data Petri net (DPNE), our model encapsulates the entire process, from user login to the e-commerce platform through to the payment stage, including the mobile transaction process. We scrutinize the model's structure, formulate an algorithm for detecting anomalies in the relevant data, and elucidate the rationale and efficacy of the comprehensive system model. A case study validates the responsive performance of each system component, demonstrating the system's adeptness at evaluating every activity within mobile transactions. Ultimately, the results of anomaly detection are derived through a thorough and comprehensive analysis. Keywords: database, data analysis, DPNE, extended data flow, e-commerce
Procedia PDF Downloads 57
30423 Seismic Assessment of Passive Control Steel Structure with Modified Parameter of Oil Damper
Authors: Ahmad Naqi
Abstract:
Today, passively controlled buildings are becoming extensively popular due to their excellent lateral load resistance. Typically, these buildings are enhanced with a damping device that is in high market demand. Some manufacturers falsify the damping device parameters during production to meet this demand. Therefore, this paper evaluates the seismic performance of buildings equipped with damping devices whose parameters were intentionally modified to simulate falsified devices. For this purpose, three benchmark buildings of 4, 10, and 20 stories were selected from the JSSI (Japan Society of Seismic Isolation) manual. The buildings are special moment-resisting steel frames with oil dampers in the longitudinal direction only. For each benchmark building, two types of structural elements are designed to resist the lateral load, with and without damping devices (hereafter known as the Trimmed and Conventional Buildings). The target building was modeled using STERA-3D, finite-element-based software coded for study purposes. Using the software, one can develop either a three-dimensional model (3DM) or a lumped mass model (LMM). Firstly, the seismic performance of the 3DM and LMM models was evaluated and found to coincide excellently for the target buildings. The simplified LMM was used in this study to produce 66 cases for both buildings. Then, the device parameters were modified by ±40% and ±20% to simulate many possible conditions of falsification. It is verified that buildings designed to sustain the lateral load with the support of a damping device (Trimmed Buildings) are much more threatened as a result of device falsification than buildings strengthened by a damping device (Conventional Buildings). Keywords: passive control system, oil damper, seismic assessment, lumped mass model
Procedia PDF Downloads 115
30422 Reactivity Study on South African Calcium Based Material Using a pH-Stat and Citric Acid: A Statistical Approach
Authors: Hilary Rutto, Mbali Chiliza, Tumisang Seodigeng
Abstract:
The study of the reactivity of calcined calcium-based material is very important in the dry flue gas desulphurisation (FGD) process, so as to produce an absorbent with high sulphur dioxide capture capacity during the hydration process. The effects of calcining temperature and time on the reactivity of calcined limestone were investigated. In this study, the reactivity was measured using a pH-stat apparatus, and the result was confirmed by performing a citric acid reactivity test. The reactivity was calculated using the shrinking core model. Based on the experiments, a mathematical model is developed to correlate the effect of time and temperature on the reactivity of the absorbent. The calcination process variables were temperature (700-1000°C) and time (1-6 hrs). It was found that reactivity increases with an increase in time and temperature. Keywords: reactivity, citric acid, calcination, time
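For reference, the shrinking core model used for the reactivity calculation relates conversion X to time t; under surface-reaction control for a shrinking sphere the standard form is

\[ 1 - (1 - X)^{1/3} = \frac{t}{\tau}, \]

where τ is the time for complete conversion (the ash-layer diffusion-controlled variant replaces the left-hand side with 1 - 3(1-X)^{2/3} + 2(1-X)). Which regime the authors fitted is not stated in the abstract.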
Procedia PDF Downloads 220
30421 Preparation of Corn Flour Based Extruded Product and Evaluate Its Physical Characteristics
Authors: C. S. Saini
Abstract:
A composite flour blend consisting of corn, pearl millet, black gram, and wheat bran in the ratio of 80:5:10:5 was taken to prepare the extruded product, and its effect on the physical properties of the extrudate was studied. The extrusion process was conducted in the laboratory using a twin screw extruder. The physical characteristics evaluated include lateral expansion, bulk density, water absorption index, water solubility index, rehydration ratio, and moisture retention. A Central Composite Rotatable Design (CCRD) was used to decide the levels of the processing variables, i.e., feed moisture content (%), screw speed (rpm), and barrel temperature (°C), for the experiment. The data obtained after the extrusion process were analyzed using response surface methodology. A second-order polynomial model for the dependent variables was established to fit the experimental data. The numerical optimization studies resulted in a barrel temperature of 127°C, a screw speed of 246 rpm, and a feed moisture of 14.5% as the optimum variables to produce an acceptable extruded product. The responses predicted by the software for the optimum process conditions were lateral expansion 126%, bulk density 0.28 g/cm³, water absorption index 4.10 g/g, water solubility index 39.90%, rehydration ratio 544%, and moisture retention 11.90%, with 75% desirability. Keywords: black gram, corn flour, extrusion, physical characteristics
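For context, the overall desirability reported by such numerical optimization software is typically the geometric mean of the individual response desirabilities d_i ∈ [0, 1] (the standard Derringer-Suich form):

\[ D = \left( \prod_{i=1}^{n} d_i \right)^{1/n}. \]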
Procedia PDF Downloads 479