Search results for: analytical systems engineering process
25587 Systems Versioning: A Features-Based Meta-Modeling Approach
Authors: Ola A. Younis, Said Ghoul
Abstract:
Systems in operation today are huge, complex, and exist in many versions. Controlling these versions and tracking their changes has become very difficult, as some versions are created with meaningless names or specifications, and many versions of a system are created with no clear difference between them. This leads to mismatches between a user's request and the version they receive. In this paper, we present a system-versions meta-modeling approach that produces versions based on a system's features. The model reduces the number of steps needed to configure a release and gives each version its own unique specification. The approach is applicable to any system whose specification is expressed in terms of features.
Keywords: features, meta-modeling, semantic modeling, SPL, VCS, versioning
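As a rough illustration of the idea of deriving versions from feature sets (a hypothetical sketch; the class and feature names are not taken from the paper), a version can be identified by the features it realizes, and the difference between two versions can be computed as a feature diff:

```python
from dataclasses import dataclass, field

@dataclass(frozen=True)
class Version:
    """A system version identified by the set of features it realizes."""
    name: str
    features: frozenset = field(default_factory=frozenset)

    def diff(self, other: "Version") -> dict:
        """Return the features added and removed relative to another version."""
        return {
            "added": sorted(self.features - other.features),
            "removed": sorted(other.features - self.features),
        }

# Hypothetical example: two releases of the same system described by their features.
v1 = Version("1.0", frozenset({"login", "reporting"}))
v2 = Version("1.1", frozenset({"login", "reporting", "export-pdf"}))
print(v2.diff(v1))   # {'added': ['export-pdf'], 'removed': []}
```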
Procedia PDF Downloads 446
25586 Exploring Antifragility Principles in Humanitarian Supply Chain: The Key Role of Information Systems
Authors: Sylvie Michel, Sylvie Gerbaix, Marc Bidan
Abstract:
The COVID-19 pandemic was a major global disruption that affected supply chains worldwide. The question addressed by this communication is therefore how supply chains, including their actors, management tools, and processes, react to such disruptions, survive, adapt, and even improve. To do so, the concepts of resilience and antifragility applied to a supply chain are leveraged. This article proposes to view resilience as a stage to be surpassed on the way to antifragility. The research objective is to propose an analytical framework for measuring and comparing resilience and antifragility, with antifragility understood as the property of a system that improves when subjected to disruptions rather than merely resisting them, as is the case with resilience. A single case, MSF Logistics (France), was studied using a qualitative methodology. Semi-structured interviews were conducted in person and remotely in several phases: during and immediately after the COVID-19 crisis (8 interviews from March 2020 to April 2021), followed by a new round from September to November 2023. A Delphi method was also employed, and the interviews were analyzed using coding and a thematic framework. One theoretical contribution is to consolidate the field of supply chain resilience research by precisely characterizing the dimensions of resilience for a humanitarian supply chain (reorganization, collaboration mediated by information systems, humanitarian culture). A related managerial contribution is a guide that helps managers identify the dimensions and sub-dimensions of supply chain resilience, enabling them to focus their decisions and actions on the dimensions that will enhance resilience. Most importantly, the study compares the concepts of resilience and antifragility and proposes an analytical framework for antifragility, namely the mechanisms on which MSF Logistics relied to capitalize on uncertainties, contingencies, and shocks rather than simply enduring them. For MSF Logistics, antifragility manifested itself in the ability to identify opportunities hidden behind the uncertainties and shocks of COVID-19, to reduce vulnerability, and to foster a culture that encourages innovation and the testing of new ideas. Logistics, particularly in the humanitarian domain, must be able to adapt to environmental disruptions. In this sense, the study identifies and characterizes the dimensions of resilience implemented by humanitarian logistics, and it goes beyond the concept of resilience to propose an analytical framework for antifragility. The organization studied emerged stronger from the COVID-19 crisis thanks to the mechanisms identified, which allowed antifragility to be characterized. Finally, the results show that the information system plays a key role in antifragility.
Keywords: antifragility, humanitarian supply chain, information systems, qualitative research, resilience
Procedia PDF Downloads 64
25585 Contractor Selection by Using Analytical Network Process
Authors: Badr A. Al-Jehani
Abstract:
Nowadays, contractor selection is a critical activity for the project owner. Selecting the right contractor is essential to the project manager for the success of the project, and this can happen only by using a proper selection method. Traditionally, the contractor is selected on the basis of the offered bid price. This approach focuses only on the price factor and neglects other factors essential for the success of the project. In this research paper, the Analytic Network Process (ANP) method is used as a decision-support model for selecting the most appropriate contractor. This decision-making method can help clients in the construction industry to identify contractors who are capable of delivering satisfactory outcomes. Moreover, the paper provides a case study of selecting the proper contractor among three contractors using the ANP method. The case study identifies and computes the relative weights of eight criteria and eleven sub-criteria using a questionnaire.
Keywords: contractor selection, project management, decision-making, bidding
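The abstract computes relative weights of criteria from questionnaire judgements. Below is a minimal sketch of the eigenvector weighting step that ANP shares with AHP, applied to a pairwise comparison matrix; the 3x3 matrix is illustrative data only, not the case-study judgements:

```python
import numpy as np

def priority_weights(pairwise: np.ndarray) -> np.ndarray:
    """Relative weights = normalised principal eigenvector of a pairwise comparison matrix."""
    eigvals, eigvecs = np.linalg.eig(pairwise)
    principal = eigvecs[:, np.argmax(eigvals.real)].real
    return principal / principal.sum()

# Hypothetical judgements for three criteria (e.g. price, experience, schedule).
A = np.array([[1.0, 3.0, 5.0],
              [1/3., 1.0, 2.0],
              [1/5., 1/2., 1.0]])
w = priority_weights(A)
print(np.round(w, 3))   # weights summing to 1, approximately [0.65, 0.23, 0.12]
```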
Procedia PDF Downloads 88
25584 An Inverse Approach for Determining Creep Properties from a Miniature Thin Plate Specimen under Bending
Authors: Yang Zheng, Wei Sun
Abstract:
This paper describes a new approach for interpreting experimental creep deformation data obtained from miniaturized thin-plate bending specimen tests in terms of the corresponding uniaxial data, based on an inverse application of the reference stress method. The geometry of the thin plate is fully defined by the span of the support, l, the width, b, and the thickness, d. Firstly, analytical solutions for the steady-state, load-line creep deformation rate of the thin plates for a Norton power law under plane stress (b → 0) and plane strain (b → ∞) conditions were obtained, from which it can be seen that the load-line deformation rate of the thin plate under plane-stress conditions is much higher than that under plane-strain conditions. Since an analytical solution is not available for plates with arbitrary b-values, finite element (FE) analyses are used to obtain the solutions. Based on the FE results obtained for various b/l ratios and creep exponents, n, as well as the analytical solutions under plane-stress and plane-strain conditions, approximate numerical solutions for the deformation rate are obtained by curve fitting. Using these solutions, a reference stress method is utilised to establish the conversion relationships between the applied load and the equivalent uniaxial stress and between the creep deformations of the thin plate and the equivalent uniaxial creep strains. Finally, the accuracy of the empirical solution was assessed by using a set of “theoretical” experimental data.
Keywords: bending, creep, thin plate, materials engineering
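For readers unfamiliar with the ingredients named above, a hedged sketch of the generic relations involved (textbook forms with unspecified conversion parameters, not the paper's fitted expressions): Norton's power law for the uniaxial creep strain rate, and the reference-stress conversions between load and equivalent uniaxial stress and between load-line deformation rate and uniaxial strain rate,

```latex
\[
\dot{\varepsilon}^{c}=A\,\sigma^{n},\qquad
\sigma_{\mathrm{ref}}=\frac{P}{\eta\,b\,d},\qquad
\dot{\Delta}=\beta\,l\;\dot{\varepsilon}^{c}\!\left(\sigma_{\mathrm{ref}}\right),
\]
```

where P is the applied load and the dimensionless parameters η and β depend on the geometry ratio b/l and the creep exponent n; in the paper, it is these conversion factors that the analytical and curve-fitted FE solutions supply.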
Procedia PDF Downloads 474
25583 Predicting the Turbulence Intensity, Excess Energy Available and Potential Power Generated by Building Mounted Wind Turbines over Four Major UK Cities
Authors: Emejeamara Francis
Abstract:
The future of potential wind energy applications within suburban and urban areas currently faces various problems. These include insufficient assessment of the urban wind resource, uncertainty about the effectiveness of commercial gust control solutions, and the unavailability of effective and affordable tools for scoping the potential of urban wind applications within built-up environments. Effective assessment of potential urban wind installations requires an estimate of the total energy that would be available to them were effective control systems to be used, and an evaluation of the potential power to be generated by the wind system. This paper presents a methodology for predicting the power generated by a wind system operating within an urban wind resource. The method was developed by using high-temporal-resolution wind measurements from eight potential sites within urban and suburban environments as inputs to a vertical-axis wind turbine multiple stream tube model. A relationship between the unsteady performance coefficient obtained from the stream tube model results and the turbulence intensity is demonstrated. Hence, an analytical methodology for estimating the unsteady power coefficient at a potential turbine site is proposed. This is combined with analytical models developed to predict the wind speed and the excess energy content (EEC) available, in order to estimate the potential power generated by wind systems at different heights within a built environment. Estimates of turbulence intensity, wind speed, EEC and turbine performance based on the current methodology allow a more complete assessment of the available wind resource and of potential urban wind projects. The methodology is applied to four major UK cities, namely Leeds, Manchester, London and Edinburgh, and the potential to map turbine performance at different heights within a typical urban city is demonstrated.
Keywords: small-scale wind, turbine power, urban wind energy, turbulence intensity, excess energy content
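As a rough illustration of two of the wind-resource quantities named above (a sketch using one common pair of definitions and synthetic data; the paper's own EEC model may differ), turbulence intensity is the ratio of the standard deviation to the mean wind speed over an averaging window, and the excess energy content compares the mean cubed wind speed with the cube of the mean:

```python
import numpy as np

def turbulence_intensity(u: np.ndarray) -> float:
    """TI = standard deviation of wind speed / mean wind speed."""
    return u.std() / u.mean()

def excess_energy_content(u: np.ndarray) -> float:
    """Fraction of kinetic energy flux missed when only the mean speed is used:
    EEC = (mean(u^3) - mean(u)^3) / mean(u)^3."""
    return (np.mean(u**3) - u.mean()**3) / u.mean()**3

# Synthetic 1 Hz record: a 10-minute window, 5 m/s mean with gusty fluctuations.
rng = np.random.default_rng(0)
u = 5.0 + 1.2 * rng.standard_normal(600)
print(f"TI  = {turbulence_intensity(u):.2f}")
print(f"EEC = {excess_energy_content(u):.2f}")
```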
Procedia PDF Downloads 277
25582 Quality Based Approach for Efficient Biologics Manufacturing
Authors: Takashi Kaminagayoshi, Shigeyuki Haruyama
Abstract:
To improve the manufacturing efficiency of biologics such as antibody drugs, a quality engineering framework was designed. Within this framework, critical steps and parameters in the manufacturing process were studied. Identification of these critical steps and critical parameters allows a deeper understanding of manufacturing capabilities and makes it possible to propose, to the process development department, process control standards based on actual manufacturing capabilities as part of a PDCA (plan-do-check-act) cycle. This cycle can be applied to each manufacturing process so that it can be standardized, reducing the time needed to establish each new process.
Keywords: antibody drugs, biologics, manufacturing efficiency, PDCA cycle, quality engineering
Procedia PDF Downloads 345
25581 The Estimation of Human Vital Signs Complexity
Authors: L. Bikulciene, E. Venskaityte, G. Jarusevicius
Abstract:
Non-stationary and nonlinear signals generated by living complex systems defy traditional mechanistic approaches based on homeostasis. Our previous studies have shown that evaluating the interactions of physiological signals with special analysis methods is suitable for the observation of physiological processes. We demonstrate the possibility of using a deep physiological model, based on the interpretation of changes in the human body's functional states, combined with an analytical method based on matrix theory for physiological signal analysis, applied here to high-risk cardiac patients. It is shown that the evaluation of cardiac signal interactions reveals functional changes peculiar to each individual at the onset of the hemodynamic restoration procedure. We therefore suggest that the assessment of alterations in the functional state of the body after patients undergo surgery can be complemented by data obtained from the suggested approach of evaluating the interactions of functional variables.
Keywords: cardiac diseases, complex systems theory, ECG analysis, matrix analysis
Procedia PDF Downloads 344
25580 Systems Intelligence in Management (High Performing Organizations and People Score High in Systems Intelligence)
Authors: Raimo P. Hämäläinen, Juha Törmänen, Esa Saarinen
Abstract:
Systems thinking has been acknowledged as an important approach in the strategy and management literature ever since the seminal works of Ackoff in the 1970s and Senge in the 1990s. The early literature was very much focused on structures and organizational dynamics. Understanding systems is important, but making improvements also requires ways of understanding human behavior in systems. Peter Senge's book The Fifth Discipline inspired the development of the concept of Systems Intelligence (SI), which integrates the concepts of personal mastery and systems thinking. SI refers to intelligent behavior in the context of complex systems involving interaction and feedback. It is a competence related to the skills needed in strategy and in the environment of modern industrial engineering and management, where people skills and systems play an increasingly important role. The eight factors of Systems Intelligence have been identified from extensive surveys, and they relate to perceiving, attitude, thinking and acting. The personal self-evaluation test developed consists of 32 items, which can also be applied in a peer-evaluation mode. The concept and the test extend to organizations too: one can speak of organizational systems intelligence. This paper reports the results of an extensive survey based on peer evaluation. The results show that systems intelligence correlates positively with professional performance. People in a managerial role score higher in SI than others. Age improves the SI score, but there is no gender difference. Top organizations score higher in all SI factors than lower-ranked ones. The SI tests can also be used as leadership and management development tools that support self-reflection and learning. Finding ways of enhancing learning and organizational development is important, and today gamification is a promising new approach. The items in the SI test have been used to develop an interactive card game following the Topaasia game approach. It is an easy way of engaging people in a process that helps participants see and approach problems in their organization, and it also helps individuals identify challenges in their own behavior and improve their SI.
Keywords: gamification, management competence, organizational learning, systems thinking
Procedia PDF Downloads 96
25579 Analytical Design of Fractional-Order PI Controller for Decoupling Control System
Authors: Truong Nguyen Luan Vu, Le Hieu Giang, Le Linh
Abstract:
The FOPI controller is proposed based on the main properties of the decoupling control scheme, as well as on fractional calculus. Using the simplified decoupling technique, the transfer function of the decoupled apparent process is first separated into a set of n equivalent independent processes, expressed as the ratio of the diagonal elements of the original open-loop transfer function to those of the dynamic relative gain array; a fractional-order PI controller is then developed for each control loop from Bode's ideal transfer function, which gives the desired fractional closed-loop response in the frequency domain. Simulation studies were carried out to evaluate the proposed design approach in a fair comparison with other existing methods, in accordance with the structured singular value (SSV) theory used to measure the robust stability of control systems under multiplicative output uncertainty. The simulation results indicate that the proposed method consistently performs well, with fast and well-balanced closed-loop time responses.
Keywords: Bode's ideal transfer function, fractional calculus, fractional order proportional integral (FOPI) controller, decoupling control system
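For reference, a hedged sketch of the two ingredients named in the abstract (generic textbook forms, not the paper's tuned expressions): Bode's ideal loop transfer function and the fractional-order PI controller are usually written as

```latex
\[
L(s)=\left(\frac{\omega_{c}}{s}\right)^{\gamma},\qquad 1<\gamma<2,
\qquad\Longrightarrow\qquad
T(s)=\frac{L(s)}{1+L(s)}=\frac{1}{1+\left(s/\omega_{c}\right)^{\gamma}},
\]
\[
C_{\mathrm{FOPI}}(s)=K_{p}+\frac{K_{i}}{s^{\lambda}},\qquad 0<\lambda<2,
\]
```

where ω_c is the gain-crossover frequency and the fractional slope γ fixes a constant phase margin φ_m = π(1 − γ/2) that is insensitive to gain variations, which is what makes Bode's ideal transfer function attractive as a closed-loop target.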
Procedia PDF Downloads 331
25578 Inadequate Requirements Engineering Process: A Key Factor for Poor Software Development in Developing Nations: A Case Study
Authors: K. Adu Michael, K. Alese Boniface
Abstract:
Developing reliable and sustainable software products is today a big challenge among up-coming software developers in Nigeria. What is missing is the ability to develop the comprehensive problem statement needed to execute a proper requirements engineering process. The need to describe the 'what' of a system in one document, written in a natural language, is a major step in the overall process of software engineering. Requirements engineering is the process used to discover, analyze and validate system requirements, and it is needed to reduce software errors at the early stages of software development. The importance of each step in requirements engineering is explained in the context of using a detailed problem statement from the client/customer to obtain an overview of the existing system along with the expectations of the new system. This paper identifies an inadequate requirements engineering process as the major cause of poor software development in developing nations, using a case study of final-year computer science students at a tertiary-education institution in Nigeria.
Keywords: client/customer, problem statement, requirements engineering, software developers
Procedia PDF Downloads 405
25577 An Analytical View of Albanian and French Legislation on Access to Health Care Benefits
Authors: Oljana Hoxhaj
Abstract:
The process of Albania's integration into the European family carries many difficulties. In this context, the Albanian legislator is inclined to implement in the domestic legal framework models that have been successful in other countries. Our paper aims to present an analytical and comparative approach to the health systems of Albania and France, focusing mainly on citizens' access to these services. The different standards and cultures of the two states, in the context of an approximation model, are the first challenge of our paper. Over the last few years, the Albanian government has undertaken concrete reforms in this sector, aiming to transform the vision on which the previous health system was structured. In this perspective, the state not only fulfills an obligation to its citizens but also consolidates progressive steps toward alignment with European Union standards. The necessity of undertaking a genuine reform in this area arose as an exigency of society, which has persistently identified problems within this sector, considering it ineffective, below standards, and corrupt. The inclusion of health services on the Albanian government's agenda reflects its will regarding good governance, transparency, and broader access to the provision of quality health services in the public and private sectors. The success of any initiative in the health system consists of giving priority to patients' needs. Another objective that should be in the state's consideration is to create the premises for a comprehensive process founded on partnership and broader cooperation with beneficiary entities in any decision-making that is directly related to their interests. Other important and widespread factors in the effective realization of citizens' access to the healthcare system include the construction of appropriate infrastructure, increasing the professionalism and qualifications of medical staff, and the allocation of a higher budget. France has one of the most effective healthcare models in Europe, which is why we have chosen to analyze this country, aiming to highlight the advantages of its system as well as the commitment of the French state to drafting effective health policies. Within the framework of the harmonization of Albanian legislation with that of the European Union, our work aims to identify the space for implementing these legislative innovations in Albanian legislation.
Keywords: effective service, harmonization level, innovation, reform
Procedia PDF Downloads 112
25576 Logical-Probabilistic Modeling of the Reliability of Complex Systems
Authors: Sergo Tsiramua, Sulkhan Sulkhanishvili, Elisabed Asabashvili, Lazare Kvirtia
Abstract:
The paper presents logical-probabilistic methods, models and algorithms for the reliability assessment of complex systems, on the basis of which a web application for structural analysis and reliability assessment of systems was created. The reliability assessment process includes the following stages, which are reflected in the application: 1) construction of a graphical scheme of the structural reliability of the system; 2) transformation of the graphical scheme into a logical representation and modeling of the shortest paths of successful functioning of the system; 3) description of the system operability condition with a logical function in disjunctive normal form (DNF); 4) transformation of the DNF into orthogonal disjunctive normal form (ODNF) using the orthogonalization algorithm; 5) replacement of logical elements with probabilistic elements in the ODNF, obtaining a reliability estimation polynomial and quantifying reliability; 6) calculation of the weights of the elements. Using the logical-probabilistic methods, models and algorithms discussed in the paper, special software was created, by means of which a quantitative assessment of the reliability of systems with a complex structure is produced. As a result, structural analysis of systems and the research and design of systems with optimal structure are carried out.
Keywords: complex systems, logical-probabilistic methods, orthogonalization algorithm, reliability, weight of element
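As a minimal illustration of stages 2-5 (a brute-force sketch rather than the paper's orthogonalization algorithm, applied to a hypothetical five-component bridge network rather than any system analysed in the paper), the operability condition written as a disjunction of shortest success paths can be evaluated exactly by summing over component states:

```python
from itertools import product

# Hypothetical bridge network: minimal success paths (stages 2-3, the DNF terms).
paths = [{1, 2}, {4, 5}, {1, 3, 5}, {2, 3, 4}]
p = {1: 0.9, 2: 0.9, 3: 0.8, 4: 0.85, 5: 0.85}   # component reliabilities

def system_reliability(paths, p):
    """Exact reliability by enumerating all component states (2^n terms).
    Gives the same value as the reliability polynomial that orthogonalization
    of the DNF would produce, without constructing the ODNF explicitly."""
    comps = sorted(p)
    rel = 0.0
    for states in product([0, 1], repeat=len(comps)):
        up = {c for c, s in zip(comps, states) if s}
        if any(path <= up for path in paths):            # system works in this state
            prob = 1.0
            for c, s in zip(comps, states):
                prob *= p[c] if s else 1.0 - p[c]
            rel += prob
    return rel

print(f"System reliability: {system_reliability(paths, p):.4f}")
```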
Procedia PDF Downloads 72
25575 A Review of the Run to Run (R to R) Control in the Manufacturing Processes
Authors: Khalil Aghapouramin, Mostafa Ranjbar
Abstract:
Run-to-run (R2R) control was developed to monitor and control different semiconductor manufacturing processes based upon fundamental engineering frameworks. This technology allows corrections to be made in the optimum direction, and it has shown significant potential in a variety of processes. The term run-to-run refers to the case where control actions are taken between runs, for example between the batches of silicon wafers produced in a manufacturing process. The present work gives a brief review of run-to-run control, which is mainly effective in manufacturing processes.
Keywords: run-to-run (R2R) control, manufacturing, process engineering, manufacturing controls
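As a concrete, hedged illustration of what a run-to-run controller does, the sketch below implements the widely used EWMA R2R scheme for a linear process model y = a + b·u; this is a common textbook formulation offered for context and is not claimed to be any specific controller covered by the review:

```python
def ewma_r2r(target, b, a0, runs, true_a, lam=0.3):
    """EWMA run-to-run control of a drifting linear process y = a_true + b*u.

    After each run the intercept estimate is updated from the measured output,
    and the next recipe u is chosen so the predicted output hits the target.
    """
    a_hat = a0
    history = []
    for k in range(runs):
        u = (target - a_hat) / b                         # recipe for this run
        y = true_a(k) + b * u                            # measured output (process drifts)
        a_hat = lam * (y - b * u) + (1 - lam) * a_hat    # EWMA model update
        history.append((k, round(u, 3), round(y, 3)))
    return history

# Hypothetical drifting process: intercept grows by 0.05 per run.
for k, u, y in ewma_r2r(target=10.0, b=2.0, a0=0.0, runs=5,
                        true_a=lambda k: 1.0 + 0.05 * k):
    print(f"run {k}: recipe u = {u}, output y = {y}")
```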
Procedia PDF Downloads 493
25574 Factors Affecting Employee Decision Making in an AI Environment
Authors: Yogesh C. Sharma, A. Seetharaman
Abstract:
The decision-making process in humans is a complicated system influenced by a variety of intrinsic and extrinsic factors, and human decisions have a ripple effect on subsequent decisions. In this study, the scope of human decision making is limited to employees. In an organisation, a person makes a variety of decisions from the time they are hired to the time they retire. The goal of this research is to identify the various elements that influence decision making. In addition, the environment in which a decision is made is a significant aspect of the decision-making process. Employees in today's workplace use artificial intelligence (AI) systems for automation and decision augmentation, and the impact of AI systems on the decision-making process is examined in this study. The research is based on a systematic literature review. Based on gaps in the literature, limitations and the scope of future research have been identified, and a research framework has been designed to identify the various factors affecting employee decision making. Employee decision making is influenced by technological advancement, data-driven culture, human trust, decision automation-augmentation, and workplace motivation. Hybrid human-AI systems require the development of new skill sets and organisational design. Employee psychological safety and supportive leadership influence overall job satisfaction.
Keywords: employee decision making, artificial intelligence (AI) environment, human trust, technology innovation, psychological safety
Procedia PDF Downloads 108
25573 Stereo Camera Based Speed-Hump Detection Process for Real Time Driving Assistance System in the Daytime
Authors: Hyun-Koo Kim, Yong-Hun Kim, Soo-Young Suk, Ju H. Park, Ho-Youl Jung
Abstract:
This paper presents an effective speed hump detection process for the daytime. We focus only on round speed humps in the dynamic daytime road environment. The proposed speed hump detection scheme consists mainly of two processes, stereo matching and speed hump detection, and this work focuses on the latter, which consists of a noise reduction step, a data fusion step, and a speed hump detection step. The proposed system was tested on an Intel Core CPU at 2.80 GHz with 4 GB RAM in urban road environments. The frame rate of the test videos is 30 frames per second, and each frame of the grabbed image sequences is 1280 by 670 pixels. Using object-marked sequences acquired with an on-vehicle camera, we recorded speed hump and non-speed-hump samples. According to the test results, the proposed method can be applied in real-time systems, with a computation time of 13 ms and a detection rate of 96.1%.
Keywords: data fusion, round speed humps, speed hump detection, surface filter
Procedia PDF Downloads 510
25572 Analytical Study: An M-Learning App Reflecting the Factors Affecting Student’s Adoption of M-Learning
Authors: Ahmad Khachan, Ahmet Ozmen
Abstract:
This study introduces a mobile bite-sized learning concept: a mobile application with social-network motivation factors that encourages students to practice critical thinking, improve analytical skills and learn knowledge sharing. We do not aim to propose another e-learning or distance-learning tool like Moodle or Edmodo; instead, we introduce a mobile learning tool called the Interactive M-learning Application. The tool reconstructs and strengthens the bonds between educators and learners and provides a foundation for integrating mobile devices in education. The application allows learners to stay connected all the time, share ideas, ask questions and learn from each other. It is built on Android, since Android has the largest platform share in the world and dominated the market with a 74.45% share in 2018. We chose the Google Firebase server for hosting because of its flexibility, ease of hosting and real-time update capabilities. The proposed m-learning tool was offered to four groups of university students in different majors. An improvement in the relations between the students, the teachers and the academic institution was evident. Students' performance improved considerably, together with better analytical and critical skills and, moreover, a willingness to adopt mobile learning in class. We also compared our app with another tool in the same class for clarity and reliability of the results. The students' own mobile devices were used in this experimental study to ensure a diversity of devices and platform versions.
Keywords: education, engineering, interactive software, undergraduate education
Procedia PDF Downloads 155
25571 Conflict Resolution in Fuzzy Rule Base Systems Using Temporal Modalities Inference
Authors: Nasser S. Shebka
Abstract:
Fuzzy logic is used in complex adaptive systems where classical tools for representing knowledge are unproductive. Nevertheless, the incorporation of fuzzy logic, as with all artificial intelligence tools, raised some inconsistencies and limitations in dealing with systems of increased complexity and with rules that apply to real-life situations; this hinders the inference process of such systems and creates inconsistencies between the inferences generated by fuzzy rules in complex or imprecise knowledge-based systems. The use of fuzzy logic enhanced the capability of knowledge representation in applications that require a fuzzy representation of truth values or similar multi-valued constant parameters derived from multi-valued logic. This set the basis for the three basic t-norms and the connectives based on them, which are continuous functions; any other continuous t-norm can be described as an ordinal sum of these three basic ones. Some attempts to resolve this dilemma altered fuzzy logic by means of non-monotonic logic, which is used to deal with the defeasible inference of expert-system reasoning, for example to allow inference retraction upon additional data. However, even the introduction of non-monotonic fuzzy reasoning faces a major issue of conflict resolution, for which many principles have been introduced, such as the specificity principle and the weakest-link principle. The aim of our work is to improve the logical representation and functional modelling of AI systems by presenting a method for resolving existing and potential rule conflicts that represents temporal modalities within defeasible-inference rule-based systems. Our paper investigates the possibility of resolving fuzzy rule conflicts in a non-monotonic fuzzy-reasoning-based system by introducing temporal modalities and Kripke's general weak modal logic operators in order to expand its knowledge representation capabilities by means of flexibility in classifying newly generated rules, and hence resolving potential conflicts between these fuzzy rules. We were able to address this problem by restructuring the inference process of the fuzzy rule-based system. This is achieved by using time-branching temporal logic in combination with restricted first-order logic quantifiers, as well as propositional logic to represent classical temporal modality operators. The resulting findings not only enhance the flexibility of the inference process of complex rule-based systems but also contribute to fundamental methods of building rule bases in a manner that allows for a wider range of applicable real-life situations, from both a quantitative and a qualitative knowledge-representation perspective.
Keywords: fuzzy rule-based systems, fuzzy tense inference, intelligent systems, temporal modalities
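For reference, the three basic continuous t-norms referred to above are usually taken to be the minimum, product, and Łukasiewicz t-norms (a standard statement from fuzzy logic, quoted here for context rather than taken from the paper itself):

```latex
\[
T_{\min}(x,y)=\min(x,y),\qquad
T_{\mathrm{prod}}(x,y)=x\,y,\qquad
T_{\mathrm{Luk}}(x,y)=\max\!\left(0,\;x+y-1\right),
\]
```

and the classical representation theorem states that every continuous t-norm can be written as an ordinal sum built from these basic ones, which is the property the abstract invokes.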
Procedia PDF Downloads 91
25570 Development of Paper Based Analytical Devices for Analysis of Iron (III) in Natural Water Samples
Authors: Sakchai Satienperakul, Manoch Thanomwat, Jutiporn Seedasama
Abstract:
A paper-based analytical device (PAD) for the analysis of Fe(III) ions in natural water samples is developed, using a reagent from guava leaf extract. The extraction is simply performed in deionized water at pH 7, where a tannin extract is obtained and used as an alternative natural reagent. The PADs are fabricated by ink-jet printing using alkenyl ketene dimer (AKD) wax. The quantitation of Fe(III) is carried out using the guava leaf extract reagent prepared in acetate buffer at a ratio of 1:1. A color change to gray-purple is observed by the naked eye when a sample containing Fe(III) ions is dropped onto the PAD channel. Reflective absorption measurements are performed to create a standard curve. A linear calibration range is observed over the concentration range of 2-10 mg L-1, and the detection limit for Fe(III) is 2 mg L-1. In its optimum form, the PAD is stable for up to 30 days under oxygen-free conditions. The small dimensions, low volume requirement and alternative natural reagent make the proposed PADs attractive for on-site environmental monitoring and analysis.
Keywords: green chemical analysis, guava leaf extract, lab on a chip, paper based analytical device
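A minimal sketch of the calibration step described above, fitting reflective absorption readings against standard concentrations over the reported linear range (the numbers are illustrative, not the paper's measurements):

```python
import numpy as np

# Hypothetical standards over the reported linear range (mg/L) and their responses.
conc = np.array([2.0, 4.0, 6.0, 8.0, 10.0])
signal = np.array([0.052, 0.101, 0.148, 0.203, 0.251])   # reflective absorption

slope, intercept = np.polyfit(conc, signal, 1)            # linear calibration curve

def fe3_concentration(measured_signal: float) -> float:
    """Invert the calibration curve to estimate Fe(III) in an unknown sample."""
    return (measured_signal - intercept) / slope

unknown = 0.17
print(f"Estimated Fe(III): {fe3_concentration(unknown):.1f} mg/L")
```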
Procedia PDF Downloads 240
25569 Evaluation of Model-Based Code Generation for Embedded Systems – Mature Approach for Development in Evolution
Authors: Nikolay P. Brayanov, Anna V. Stoynova
Abstract:
The model-based development approach is gaining support and acceptance. Its higher abstraction level simplifies system description, allowing domain experts to do their best without particular knowledge of programming. The different levels of simulation support rapid prototyping and the verification and validation of the product even before it exists physically. Nowadays the model-based approach is beneficial for modelling complex embedded systems as well as for generating code for many different hardware platforms. Moreover, it can be applied in safety-relevant industries like automotive, which brings extra automation to the expensive device certification process and especially to software qualification. Using it, some companies report cost savings and quality improvements, but others claim no major changes or even cost increases. This publication examines the level of maturity and autonomy of the model-based approach for code generation. It is based on a real-life automotive seat heater (ASH) module, developed using tools from The MathWorks, Inc. The model, created with Simulink, Stateflow and Matlab, is used for the automatic generation of C code with Embedded Coder. To prove the maturity of the process, the Code Generation Advisor is used for automatic configuration. All additional configuration parameters are set to auto, where applicable, leaving the generation process to function autonomously. As a result of the investigation, the publication compares the quality of the automatically generated embedded code with that of manually developed code. The measurements show that, in general, the code generated by the automatic approach is not worse than the manual one. A deeper analysis of the technical parameters enumerates the disadvantages, some of which are identified as topics for our future work.
Keywords: embedded code generation, embedded C code quality, embedded systems, model-based development
Procedia PDF Downloads 244
25568 An Enhanced Approach in Validating Analytical Methods Using Tolerance-Based Design of Experiments (DoE)
Authors: Gule Teri
Abstract:
The effective validation of analytical methods forms a crucial component of pharmaceutical manufacturing. However, traditional validation techniques can occasionally fail to fully account for inherent variations within datasets, which may result in inconsistent outcomes. This deficiency in validation accuracy is particularly noticeable when quantifying low concentrations of active pharmaceutical ingredients (APIs), excipients, or impurities, introducing a risk to the reliability of the results and, subsequently, to the safety and effectiveness of the pharmaceutical products. In response to this challenge, we introduce an enhanced, tolerance-based Design of Experiments (DoE) approach for the validation of analytical methods. This approach explicitly measures variability with reference to tolerance or design margins, enhancing the precision and trustworthiness of the results. The method provides a systematic, statistically grounded validation technique that improves the truthfulness of results and offers an essential tool for industry professionals aiming to guarantee the accuracy of their measurements, particularly for low-concentration components. By incorporating this method, pharmaceutical manufacturers can substantially advance their validation processes and thereby improve the overall quality and safety of their products. This paper delves deeper into the development, application, and advantages of the tolerance-based DoE approach and demonstrates its effectiveness using High-Performance Liquid Chromatography (HPLC) data for verification. The paper also discusses the potential implications and future applications of this method in enhancing pharmaceutical manufacturing practices and outcomes.
Keywords: tolerance-based design, design of experiments, analytical method validation, quality control, biopharmaceutical manufacturing
Procedia PDF Downloads 80
25567 Advanced Numerical and Analytical Methods for Assessing Concrete Sewers and Their Remaining Service Life
Authors: Amir Alani, Mojtaba Mahmoodian, Anna Romanova, Asaad Faramarzi
Abstract:
Pipelines are extensively used engineering structures that convey fluid from one place to another. Most of the time, pipelines are placed underground and are loaded by soil weight and traffic loads. Corrosion of the pipe material is the most common form of pipeline deterioration and should be considered in both the strength and the serviceability analysis of pipes. This research focuses on concrete pipes in sewage systems (concrete sewers). It first investigates how to incorporate the effect of corrosion, as a time-dependent deterioration process, in the structural and failure analysis of this type of pipe. Then three probabilistic time-dependent reliability analysis methods, including the first passage probability theory, the gamma-distributed degradation model and the Monte Carlo simulation technique, are discussed and developed. Sensitivity indexes that can be used to identify the parameters that most affect pipe failure are also discussed. The reliability analysis methods developed in this paper serve as rational tools for decision makers with regard to the strengthening and rehabilitation of existing pipelines, and the results can be used to obtain a cost-effective strategy for the management of the sewer system.
Keywords: reliability analysis, service life prediction, Monte Carlo simulation method, first passage probability theory, gamma distributed degradation model
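A minimal sketch of the third method listed above, Monte Carlo simulation combined with a gamma-process degradation model (the wall thickness, corrosion rate and shape parameter are illustrative assumptions, not the calibrated values used in the research):

```python
import numpy as np

rng = np.random.default_rng(42)

def prob_failure_by(t_years, wall_mm=50.0, rate_mm_per_yr=0.6,
                    shape_per_yr=1.2, n_sims=100_000):
    """P(corrosion depth exceeds the available wall thickness by time t).

    Corrosion loss is modelled as a stationary gamma process: the cumulative
    loss up to time t is Gamma(shape_per_yr * t, scale), with mean rate * t.
    """
    shape = shape_per_yr * t_years
    scale = rate_mm_per_yr / shape_per_yr          # so that mean loss = rate * t
    loss = rng.gamma(shape, scale, size=n_sims)
    return np.mean(loss > wall_mm)

for t in (30, 50, 70, 90):
    print(f"t = {t:3d} yr : Pf ≈ {prob_failure_by(t):.4f}")
```

The same sampling loop also supports the sensitivity analysis mentioned in the abstract, by re-running it with perturbed input parameters and comparing the resulting failure probabilities.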
Procedia PDF Downloads 456
25566 Numerical Studies on the Performance of the Finned-Tube Heat Exchanger
Authors: S. P. Praveen Kumar, Bong-Su Sin, Kwon-Hee Lee
Abstract:
Finned-tube heat exchangers are predominantly used in space-conditioning systems, as well as in other applications requiring heat exchange between two fluids. The design of finned-tube heat exchangers requires the selection of over a dozen design parameters, such as tube pitch, tube diameter and tube thickness. Finned-tube heat exchangers are common devices; however, their performance characteristics are complicated. In this paper, numerical studies have been carried out to analyze the performance of a finned-tube heat exchanger (without fins considered, for experimental purposes) by predicting the temperature difference and the pressure drop. In this study, a design considering five design variables, maximizing the temperature difference and minimizing the pressure drop, was suggested by applying design of experiments (DOE); an L18 orthogonal array was adopted. Parametric analytical studies have been carried out using analysis of variance (ANOVA) to determine the relative importance of each variable with respect to the temperature difference and the pressure drop. Following these results, the final design was suggested by predicting the optimum design, thereby confirming the optimized condition.
Keywords: heat exchanger, fluid analysis, heat transfer, design of experiment, analysis of variance
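As a hedged sketch of the ANOVA step described above (the factor levels and responses below are illustrative, not the paper's L18 data), the relative importance of each design variable can be expressed as the share of the total sum of squares explained by its main effect:

```python
import numpy as np
import pandas as pd

# Hypothetical fragment of an orthogonal-array experiment: coded factor levels
# and the measured response (e.g. temperature difference in K).
df = pd.DataFrame({
    "tube_pitch":    [1, 1, 2, 2, 3, 3, 1, 2, 3],
    "tube_diameter": [1, 2, 3, 1, 2, 3, 3, 2, 1],
    "response":      [4.1, 4.6, 5.2, 4.0, 4.8, 5.5, 5.0, 4.7, 4.3],
})

def contribution_ratios(df: pd.DataFrame, factors, response="response"):
    """Main-effect sum of squares of each factor as a fraction of the total SS."""
    grand_mean = df[response].mean()
    total_ss = ((df[response] - grand_mean) ** 2).sum()
    out = {}
    for f in factors:
        level_stats = df.groupby(f)[response].agg(["mean", "count"])
        ss = (level_stats["count"] * (level_stats["mean"] - grand_mean) ** 2).sum()
        out[f] = ss / total_ss
    return out

print(contribution_ratios(df, ["tube_pitch", "tube_diameter"]))
```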
Procedia PDF Downloads 446
25565 The Neuropsychology of Autism and ADHD
Authors: Anvikshaa Bisen, Krish Makkar
Abstract:
Professionals misdiagnose autism by ticking off symptoms on a checklist without questioning the causes of those symptoms and without understanding the innate neurophysiology of the autistic brain. A dysfunctional cingulate gyrus (CG) hyperfocuses attention in the left frontal lobe (logical/analytical) with no ability to access the right frontal lobe (emotional/creative), which plays a central role in spontaneity, social behavior, and nonverbal abilities. Autistic people live in a specialized inner space that is entirely intellectual, free from emotional and social distractions, and have no innate biological way of emotionally connecting with other people. They process their emotions intellectually, a process that can take 24 hours, by which time it is too late to have felt anything. An inactive amygdala makes it impossible for autistic people to experience fear. Because they do not feel emotion, they have no emotional memories: all memories are of events about which they felt no emotion at the time and feel no emotion when talking about them afterward.
Keywords: autism, Asperger's, ASD, neuropsychology, neuroscience
Procedia PDF Downloads 48
25564 An Excel-Based Educational Platform for Design Analyses of Pump-Pipe Systems
Authors: Mohamed M. El-Awad
Abstract:
This paper describes an educational platform for design analyses of pump-pipe systems using Microsoft Excel, its Solver add-in, and the associated VBA programming language. The paper demonstrates the capabilities of the Excel-based platform, which suits the iterative nature of the design process better than the use of design charts and data tables. While VBA is used to develop a user-defined function for determining the standard pipe diameter, Solver is used to optimise the pipe diameter of the pipeline and to determine the operating point of the selected pump.
Keywords: design analyses, pump-pipe systems, Excel, solver, VBA
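The platform described above uses Excel's Solver for the operating-point step; as a language-neutral illustration of the same calculation (a sketch in Python rather than VBA, with assumed pump and system curves), the operating point is the flow rate at which the pump head equals the system head:

```python
import numpy as np
from scipy.optimize import brentq

def pump_head(q):
    """Assumed pump curve: head (m) as a function of flow rate (m^3/s)."""
    return 40.0 - 250.0 * q**2

def system_head(q, static_head=10.0, k=450.0):
    """Assumed system curve: static lift plus friction losses (~ k*Q^2)."""
    return static_head + k * q**2

# Operating point: the flow at which pump head and system head balance.
q_op = brentq(lambda q: pump_head(q) - system_head(q), 1e-6, 1.0)
print(f"Q = {q_op:.3f} m^3/s, H = {pump_head(q_op):.1f} m")
```

Excel's Solver performs the equivalent root-finding by driving the head difference to zero while varying the flow-rate cell.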
Procedia PDF Downloads 166
25563 Agroforestry Systems and Practices and Their Adoption in Kilombero Cluster of Sagcot, Tanzania
Authors: Lazaro E. Nnko, Japhet J. Kashaigili, Gerald C. Monela, Pantaleo K. T. Munishi
Abstract:
Agroforestry systems and practices are perceived to improve livelihoods and the sustainable management of natural resources. However, their adoption in various regions differs with biophysical conditions and societal characteristics. This study was conducted in Kilombero District to investigate the factors influencing the adoption of different agroforestry systems and practices in agro-ecosystems and farming systems. A household survey, key informant interviews, and focus group discussions were used for data collection in three villages. Descriptive statistics and multinomial logistic regression in SPSS were applied for the analysis. The results show that home garden practices dominated in Igima and Ngajengwa villages (63.3% and 66.7%, respectively), while Mbingu village was dominated by mixed intercropping (56.67%). Agrosilvopastoral systems were dominant in Igima and Ngajengwa villages, with 56.7% and 66.7% respectively, while in Mbingu village the dominant system was agrosilviculture, with 66.7%. The multinomial logistic regression results show that different explanatory variables were statistically significant predictors of the adoption of agroforestry systems and practices. Residence type and sex were the dominant factors influencing the adoption of agroforestry systems, while duration of stay in the village, availability of extension education, residence type, and sex were the dominant factors influencing the adoption of agroforestry practices; the most important and statistically significant among these were residence type and sex. The study concludes that agroforestry will be more successful if local priorities, which include the socio-economic needs and characteristics of the society, are considered in designing systems and practices. The socio-economic needs of the community should be addressed in the process of expanding the adoption of agroforestry systems and practices.
Keywords: agroforestry adoption, agroforestry systems, agroforestry practices, agroforestry, Kilombero
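As a hedged sketch of the regression step reported above (the study used SPSS; the snippet below fits the same kind of model in Python on synthetic data, and the variable names and outcome categories are illustrative only):

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 300
df = pd.DataFrame({
    "sex":              rng.integers(0, 2, n),    # 0 = female, 1 = male
    "years_in_village": rng.integers(1, 40, n),
    "extension":        rng.integers(0, 2, n),    # received extension education
})
# Synthetic outcome with three hypothetical adoption categories,
# e.g. 0 = agrosilviculture, 1 = agrosilvopastoral, 2 = other/none.
df["system"] = rng.integers(0, 3, n)

X = sm.add_constant(df[["sex", "years_in_village", "extension"]])
model = sm.MNLogit(df["system"], X).fit(disp=False)
print(model.summary())   # coefficients and p-values per outcome category
```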
Procedia PDF Downloads 118
25562 D6tions: A Serious Game to Learn Software Engineering Process and Design
Authors: Hector G. Perez-Gonzalez, Miriam Vazquez-Escalante, Sandra E. Nava-Muñoz, Francisco E. Martinez-Perez, Alberto S. Nunez-Varela
Abstract:
The software engineering teaching process has been the subject of many studies. To improve this process, researchers have proposed either merely illustrative techniques in the classroom, such as topic presentations and dynamics between students, or attempts to involve students in real projects with companies and institutions in order to expose them to real software development problems. Simulators and serious games have been used as auxiliary tools to introduce students to topics that are too abstract when presented in the traditional way, but most of these tools cover only a limited area of the huge scope of software engineering. To address this problem, we have developed D6tions, an educational serious game that simulates the software engineering process and is designed to let students experience the different stages a software engineer (playing the role of project leader, developer, or designer) goes through while participating in a software project. We describe previous approaches to this problem, how D6tions was designed, its rules and directions, and the results we obtained from the use of the game with undergraduate students.
Keywords: serious games, software engineering, software engineering education, software engineering teaching process
Procedia PDF Downloads 493
25561 An Analytical Wall Function for 2-D Shock Wave/Turbulent Boundary Layer Interactions
Authors: X. Wang, T. J. Craft, H. Iacovides
Abstract:
When handling the near-wall regions of turbulent flows, it is necessary to account for the viscous effects which are important over the thin near-wall layers. Low-Reynolds-number turbulence models do this by including explicit viscous and damping terms which become active in the near-wall regions, and by using very fine near-wall grids to properly resolve the steep gradients present. In order to overcome the cost associated with low-Re turbulence models, a more advanced wall function approach has been implemented within OpenFOAM and tested, together with a standard log-law-based wall function, in the prediction of flows which involve 2-D shock wave/turbulent boundary layer interactions (SWTBLIs). On the whole, in the calculation of the impinging shock interaction, the three turbulence modelling strategies, the Launder-Sharma k-ε model with Yap correction (LS), the high-Re k-ε model with the standard wall function (SWF) and with the analytical wall function (AWF), display good predictions of wall pressure. However, the SWF approach tends to underestimate the tendency of the flow to separate as a result of the SWTBLI. The analytical wall function, on the other hand, is able to reproduce the shock-induced flow separation and returns predictions similar to those of the low-Re model, while using a much coarser mesh.
Keywords: SWTBLIs, skin-friction, turbulence modeling, wall function
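For context, the standard log-law wall function that the SWF strategy relies on takes the familiar textbook form (quoted here for reference, not from the paper itself):

```latex
\[
U^{+}=\frac{U_{p}}{u_{\tau}}=\frac{1}{\kappa}\ln\!\left(E\,y^{+}\right),
\qquad
y^{+}=\frac{\rho\,u_{\tau}\,y_{p}}{\mu},
\]
```

where U_p is the velocity at the near-wall node located a distance y_p from the wall, u_τ is the friction velocity, and κ ≈ 0.41 and E ≈ 9.8 are model constants. The analytical wall function instead integrates simplified near-wall transport equations (including pressure-gradient effects) across the near-wall cell to obtain the wall shear stress, which is why it can respond better to the strong pressure gradients present in SWTBLIs.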
Procedia PDF Downloads 346
25560 The Exercise of Deliberative Democracy on Public Administration Agencies' Decisions
Authors: Mauricio Filho, Carina Castro
Abstract:
The object of this project is to analyze long-serving public agents who have passed through several governments and now find themselves in the position of having to deliberate with new agents recently installed in the public administration. For theoretical purposes, internal deliberation is understood as deliberation practiced within public administration agencies, without any direct participation of the general public in the process. The assumption is that agents with longer periods of public service tend to step away from the momentary political discussions that guide the current administration and to concentrate on institutionalized routines and procedures, so that the individuals most politically aligned with the current government deliberate with less "passion" and more exchange of knowledge and information. The theoretical framework of this research is institutionalism, which is guided by a more pragmatic view of the fluidity of reality and shows the multiple relations between agents and their respective institutions. The critical aspirations of this project rest on the works of professors Cass Sunstein, Adrian Vermeule and Philip Pettit and on literature from both institutional theory and the economic analysis of law, greatly influenced by the Chicago Law School. Methodologically, the paper is a theoretical review and is intended to be extended, at a future moment, into empirical tests for verification. The work's main analytical tool is the appeal to theoretical and doctrinal areas of the juridical sciences, adopting the deductive and analytical method.
Keywords: institutions, state, law, agencies
Procedia PDF Downloads 264
25559 Optimizing the Public Policy Information System under the Environment of E-Government
Authors: Qian Zaijian
Abstract:
E-government is one of the hot issues in current academic research on public policy and management. As the organic integration of information and communication technology (ICT) and public administration, e-government is one of the most important areas of the contemporary information society. The policy information system is a basic subsystem of the public policy system; its operation affects the overall effect of the policy process and can even exert a direct impact on the operation of a public policy and on its success or failure. The basic principle of its operation is the collection, processing, analysis and release of information for a specific purpose. The function of e-government for the public policy information system lies in promoting public access to policy information resources, in information transmission through e-participation and e-consultation in the process of policy analysis, in the processing and storage of policy information, and in electronic services, so as to promote the optimization of policy information systems. However, due to many factors, the capacity of e-government to promote the optimization of policy information systems has practical limits. In building e-government in our country, we should follow paths such as adhering to the principle of freedom of information, eliminating the information divide (gap), expanding e-consultation and breaking down information silos, so as to promote the optimization of public policy information systems.
Keywords: China, e-consultation, e-democracy, e-government, e-participation, ICTs, public policy information systems
Procedia PDF Downloads 863
25558 An Analytical Method for Solving General Riccati Equation
Authors: Y. Pala, M. O. Ertas
Abstract:
In this paper, the general Riccati equation is solved analytically by means of a new transformation. With the method developed, one can readily determine from the transformed equation whether or not an explicit solution can be obtained. Since the present method does not require a proper solution in order to construct the general solution, it is especially suitable for equations whose proper solutions cannot be seen at first glance. Since the transformed second-order linear equation obtained by the present transformation has the simplest form it can have, it is immediately seen whether or not the original equation can be solved analytically. The present method is illustrated by several examples.
Keywords: Riccati equation, analytical solution, proper solution, nonlinear
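For context, the classical linearizing substitution (given here as background; the paper's new transformation is not reproduced) converts the general Riccati equation into a second-order linear equation:

```latex
\[
y'=q_{0}(x)+q_{1}(x)\,y+q_{2}(x)\,y^{2},
\qquad
y=-\frac{u'}{q_{2}\,u}
\;\;\Longrightarrow\;\;
u''-\left(q_{1}+\frac{q_{2}'}{q_{2}}\right)u'+q_{0}\,q_{2}\,u=0,
\]
```

so an explicit solution of the Riccati equation exists exactly when the resulting linear equation can be solved in closed form, which is the kind of criterion the transformed equation in the paper is designed to make visible at a glance.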
Procedia PDF Downloads 354