Search results for: validation process
15583 Combined Effect of Moving and Open Boundary Conditions in the Simulation of Inland Inundation Due to Far Field Tsunami
Authors: M. Ashaque Meah, Md. Fazlul Karim, M. Shah Noor, Nazmun Nahar Papri, M. Khalid Hossen, M. Ismoen
Abstract:
Tsunami and inundation modelling for far-field tsunami propagation in a limited area is a very challenging numerical task because it involves many aspects, such as the formation of various types of waves and the irregularity of coastal boundaries. To compute the effect of a far-field tsunami and the extent of inland inundation along the coastal belts of the west coast of Malaysia and southern Thailand, a formulated boundary condition and a moving boundary condition are used simultaneously. In this study, a boundary-fitted curvilinear grid system is used in order to incorporate the coastal and island boundaries accurately, as the boundaries of the model domain are curvilinear in nature and highly bent. The tsunami response of the 26 December 2004 event along the west open boundary of the model domain is computed to simulate the effect of the far-field tsunami. Based on the data of the tsunami source at the west open boundary of the model domain, a boundary condition is formulated and applied to simulate the tsunami response along the coastal and island boundaries. During the simulation, a moving boundary condition is applied instead of a fixed vertical seaside wall. The extent of inland inundation and the tsunami propagation pattern are computed. Comparisons are carried out to test the validity of the simultaneous use of the two boundary conditions. All simulations show excellent agreement with the observed data.
Keywords: open boundary condition, moving boundary condition, boundary-fitted curvilinear grids, far-field tsunami, shallow water equations, tsunami source, Indonesian tsunami of 2004
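The governing model cited in the keywords is the shallow water equations; the abstract does not reproduce them, so a standard depth-averaged form in Cartesian coordinates is given below for reference (a simplification, since the study itself works on boundary-fitted curvilinear grids):

```latex
\begin{aligned}
\frac{\partial \zeta}{\partial t} + \frac{\partial}{\partial x}\big[(\zeta+h)u\big] + \frac{\partial}{\partial y}\big[(\zeta+h)v\big] &= 0,\\
\frac{\partial u}{\partial t} + u\frac{\partial u}{\partial x} + v\frac{\partial u}{\partial y} - fv &= -g\frac{\partial \zeta}{\partial x} - \frac{C_f\,u\sqrt{u^2+v^2}}{\zeta+h},\\
\frac{\partial v}{\partial t} + u\frac{\partial v}{\partial x} + v\frac{\partial v}{\partial y} + fu &= -g\frac{\partial \zeta}{\partial y} - \frac{C_f\,v\sqrt{u^2+v^2}}{\zeta+h},
\end{aligned}
```

where ζ is the free-surface elevation, h the still-water depth, (u, v) the depth-averaged velocities, f the Coriolis parameter, g the gravitational acceleration, and C_f a bottom-friction coefficient. A moving boundary condition of the kind described above advances or retreats the wet/dry interface wherever the total depth ζ + h falls below a small threshold, instead of reflecting the wave off a fixed vertical wall.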
Procedia PDF Downloads 446
15582 Edmonton Urban Growth Model as a Support Tool for the City Plan Growth Scenarios Development
Authors: Sinisa J. Vukicevic
Abstract:
Edmonton is currently one of the youngest North American cities and has achieved significant growth over the past 40 years. This strong urban shift requires a new approach to how the city is envisioned, planned, and built. That approach is evidence-based scenario development, and an urban growth model was a key support tool in framing Edmonton's development strategies, developing urban policies, and assessing policy implications. The urban growth model was developed on the Metronamica software platform. The Metronamica land use model evaluated the dynamics of land use change under the influence of key development drivers (population and employment), zoning, land suitability, and land and activity accessibility. The model was designed following the Big City Moves ideas: become greener as we grow, develop a rebuildable city, ignite a community of communities, foster a healing city, and create a city of convergence. The Big City Moves were converted into three development scenarios: 'Strong Central City', 'Node City', and 'Corridor City'. Each scenario has a narrative story that expresses its high-level goal, its approach to residential and commercial activities, its transportation vision, and its employment and environmental principles. Land use demand was calculated for each scenario according to specific density targets. Spatial policies were analyzed according to their level of importance within the policy set defined for each scenario, as well as through the policy measures. The model was calibrated to reproduce the known historical land use pattern, using 2006 and 2011 land use data. Validation was done independently, on data not used for calibration: the model was validated against 2016 data. In general, the modeling process contains three main phases: 'from qualitative storyline to quantitative modelling', 'model development and model run', and 'from quantitative modelling to qualitative storyline'. The model also incorporates five spatial indicators: distance from residential to work, distance from residential to recreation, distance to the river valley, urban expansion, and habitat fragmentation. The major findings of this research can be viewed from two perspectives: the planning perspective and the technology perspective. The planning perspective evaluates the model as a tool for scenario development. Using the model, we explored the land use dynamics influenced by different sets of policies. The model enables a direct comparison between the three scenarios; we explored their similarities and differences through quantitative indicators: land use change, population change (and spatial allocation), job allocation, density (population, employment, and dwelling units), habitat connectivity, proximity to objects of interest, etc. From the technology perspective, the model showed one very important characteristic: flexibility. The direction of policy testing changed many times during the consultation process, and the model's flexibility in accommodating these changes was highly appreciated. The model satisfied our needs as a scenario development and evaluation tool, and also as a communication tool during the consultation process.
Keywords: urban growth model, scenario development, spatial indicators, Metronamica
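The abstract describes calibrating on 2006 and 2011 land-use maps and validating against a held-out 2016 map; a common way to quantify such cell-by-cell map agreement is Cohen's kappa. The sketch below is illustrative only (the file-free toy rasters and the three-category legend are assumptions, not the study's actual data):

```python
import numpy as np

def cohens_kappa(observed, simulated):
    """Cell-by-cell agreement between two categorical land-use rasters."""
    obs = np.asarray(observed).ravel()
    sim = np.asarray(simulated).ravel()
    classes = np.union1d(obs, sim)
    p_o = np.mean(obs == sim)                                   # observed agreement
    p_e = sum(np.mean(obs == c) * np.mean(sim == c) for c in classes)  # chance agreement
    return (p_o - p_e) / (1.0 - p_e)

# Toy 2016 validation rasters: 0 = non-urban, 1 = residential, 2 = commercial (hypothetical legend)
observed_2016  = np.array([[0, 1, 1], [0, 2, 1], [0, 0, 2]])
simulated_2016 = np.array([[0, 1, 1], [0, 1, 1], [0, 0, 2]])
print(f"kappa = {cohens_kappa(observed_2016, simulated_2016):.3f}")
```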
Procedia PDF Downloads 95
15581 Impact of Tablet Based Learning on Continuous Assessment (ESPRIT Smart School Framework)
Authors: Mehdi Attia, Sana Ben Fadhel, Lamjed Bettaieb
Abstract:
Mobile technology has become a part of our daily lives and assists learners (whatever their level and age) in their learning process using various mobile devices (laptops, tablets, etc.). This paper presents a new learning framework based on tablets. The solution has been developed and tested at ESPRIT (Ecole Supérieure Privée d'Ingénierie et de Technologies), a Tunisian school of engineering. The application is named ESSF: Esprit Smart School Framework. In this work, the main features of the proposed solution are listed, particularly its impact on the learners' evaluation process. Learner assessment has always been a critical component of the learning process, as it measures students' knowledge. However, traditional evaluation methods, in which the learner is evaluated once or twice each year, cannot reflect their real level. This is why a continuous assessment (CA) process becomes necessary. In this context, we have shown that ESSF offers many important features that enhance and facilitate the implementation of the CA process.
Keywords: continuous assessment, mobile learning, tablet based learning, smart school, ESSF
Procedia PDF Downloads 334
15580 Audit Is a Production Performance Tool
Authors: Lattari Samir
Abstract:
The performance of a production process is the result of proper operation, in which management tools are the key to success. Process management consists of managing and implementing a quality policy and organizing and planning manufacturing, thus defining an efficient operating logic; these are the main areas covered by production management. To carry out this delicate mission, which requires reconciling often contradictory objectives, an auditor is called upon who must be able to express an opinion on the effectiveness of the operation of the "production" function. To do this, the auditor must structure the mission in three phases: a preparation phase to assimilate the particularities of this function, an implementation phase, and a conclusion phase. The audit is a systematic and independent examination of all the stages of a manufacturing process, intended to determine whether the pre-established arrangements for combining production factors are respected, whether their implementation is effective, and whether they are relevant to the goals.
Keywords: audit, performance of process, independent examination, management tools, audit of accounts
Procedia PDF Downloads 75
15579 End To End Process to Automate Batch Application
Authors: Nagmani Lnu
Abstract:
Often, Quality Engineering refers to testing applications that have either a User Interface (UI) or an Application Programming Interface (API). We often find mature test practices, standards, and automation for UI or API testing. However, another kind of application is present in almost all industries that deal with data in bulk and is often handled through something called a Batch Application. This is primarily an offline application that companies develop to process large data sets, often involving multiple business rules. The challenge becomes more prominent when we try to automate batch testing. This paper describes the approaches taken to test a batch application from the financial industry, covering the payment settlement process (a critical use case in all kinds of FinTech companies), resulting in 100% test automation in test creation and test execution. One can follow this approach for any other batch use case to achieve higher efficiency in the testing process.
Keywords: batch testing, batch test automation, batch test strategy, payments testing, payments settlement testing
Procedia PDF Downloads 60
15578 A Deterministic Approach for Solving the Hull and White Interest Rate Model with Jump Process
Authors: Hong-Ming Chen
Abstract:
This work considers the resolution of the Hull and White interest rate model with a jump process. A deterministic process is adopted to model the random behavior of interest rate variation as deterministic perturbations depending on time t. The Brownian motion and jump uncertainties are represented by a piecewise constant function w(t) and a point function θ(t), respectively. It is shown that the interest rate function and the yield function of the Hull and White interest rate model with jump process can be obtained by solving a nonlinear semi-infinite programming problem. A relaxed cutting plane algorithm is then proposed for solving the resulting optimization problem. The method is calibrated on 3-month U.S. Treasury securities data and is used to analyze several effects on interest rate prices, including interest rate variability and the negative correlation between stock returns and interest rates. The numerical results illustrate that our approach essentially generates yield functions with minimal fitting errors and small oscillation.
Keywords: optimization, interest rate model, jump process, deterministic
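For reference, the short-rate dynamics that a Hull–White model extended with jumps typically takes are given below (a standard textbook form, not necessarily the exact parameterization used by the author; φ(t) denotes the term-structure drift, written with a different symbol here to avoid clashing with the abstract's point function θ(t)):

```latex
dr(t) = \big[\varphi(t) - a\,r(t)\big]\,dt + \sigma\,dW(t) + dJ(t),
\qquad J(t) = \sum_{i=1}^{N(t)} Y_i ,
```

where a is the mean-reversion speed, σ the diffusion volatility, W(t) a Brownian motion, N(t) a Poisson counting process, and Y_i the jump sizes. The deterministic approach described above replaces the realizations of the random drivers W(t) and J(t) by deterministic perturbations, which is what turns calibration into a semi-infinite programming problem.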
Procedia PDF Downloads 161
15577 Development of a CFD Model for PCM Based Energy Storage in a Vertical Triplex Tube Heat Exchanger
Authors: Pratibha Biswal, Suyash Morchhale, Anshuman Singh Yadav, Shubham Sanjay Chobe
Abstract:
Energy demands are increasing, whereas energy sources, especially non-renewable sources, are limited. Due to the intermittent nature of renewable energy sources, it has become the need of the hour to find new ways to store energy. Out of the various energy storage methods, latent heat thermal storage devices are becoming popular due to their high energy density per unit mass and volume at nearly constant temperature. This work presents a computational fluid dynamics (CFD) model, built in ANSYS FLUENT 19.0, of the energy storage characteristics of a phase change material (PCM) filled in a vertical triplex tube thermal energy storage system. A vertical triplex tube heat exchanger, as its name suggests, consists of three concentric tubes (pipe sections) that part the device into three fluid domains. The PCM is filled in the middle domain, with heat transfer fluids flowing in the outer and innermost domains. To enhance the heat transfer inside the PCM, eight fins have been incorporated between the internal and external tubes. These fins run radially outwards from the outer wall of the innermost tube to the inner wall of the middle tube, dividing the middle domain (between the innermost and middle tubes) into eight sections. These eight sections are then filled with the PCM. The validation is carried out against earlier work, and a grid independence test is also presented. Further studies on the freezing and melting processes were carried out. The results are presented in terms of pictorial representations of isotherms and liquid fraction.
Keywords: heat exchanger, thermal energy storage, phase change material, CFD, latent heat
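The melting and freezing results are reported through the liquid fraction; in the enthalpy–porosity formulation commonly used for PCM melting simulations in ANSYS FLUENT (a standard formulation assumed here, not quoted from the paper), the local liquid fraction β is defined from the solidus and liquidus temperatures as

```latex
\beta =
\begin{cases}
0, & T < T_{\text{solidus}},\\[4pt]
\dfrac{T - T_{\text{solidus}}}{T_{\text{liquidus}} - T_{\text{solidus}}}, & T_{\text{solidus}} \le T \le T_{\text{liquidus}},\\[8pt]
1, & T > T_{\text{liquidus}},
\end{cases}
```

and the latent-heat content of each cell is ΔH = βL, with L the latent heat of fusion, so the stored energy and the position of the melt front can be tracked directly from the temperature field.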
Procedia PDF Downloads 153
15576 How Envisioning Process Is Constructed: An Exploratory Research Comparing Three International Public Televisions
Authors: Alexandre Bedard, Johane Brunet, Wendellyn Reid
Abstract:
Public television is constantly trying to maintain and develop its audience, and to achieve those goals it needs a strong and clear vision. Vision, or envisioning, is a multidimensional process; from a business perspective, it is simultaneously a conduit that orients and fixes the future, an idea that comes before the strategy, and a means by which action is accomplished. Also, vision is often studied in a prescriptive and instrumental manner. Based on our understanding of the literature, we were able to explain how envisioning, as a process, is a creative one; it takes place in the mind and uses wisdom and intelligence through a process of evaluation, analysis and creation. Through an aggregation of the literature, we built a model of the envisioning process, based on past experiences, perceptions and knowledge and influenced by the context, namely the individual, the organization and the environment. With exploratory research in which vision was deciphered through discourse, using a qualitative and abductive approach and a grounded theory perspective, we explored three extreme cases, with eighteen interviews with experts, leaders, politicians, industry actors, etc., and more than twenty hours of interviews in three different countries. We compared the strategy, the business model, and the political and legal forces. We also looked at the history of each industry from an inertial point of view. Our analysis of the data revealed that a legitimacy effect due to the audience, innovation, and the creativity of the institutions was the cornerstone of what influences the envisioning process. This allowed us to identify how different the process was for the Canadian, French and UK public broadcasters, although we concluded that all three had a socially constructed vision for their future, based on stakeholder management and an emerging role for managers: ideas brokers.
Keywords: envisioning process, international comparison, television, vision
Procedia PDF Downloads 132
15575 Biomimetic Paradigms in Architectural Conceptualization: Science, Technology, Engineering, Arts and Mathematics in Higher Education
Authors: Maryam Kalkatechi
Abstract:
The application of algorithms in architecture has been realized as geometric forms which are increasingly being used by architecture firms. The abstraction of ideas into a formulated algorithm is not possible; there is still a gap between design innovation and the final build when prescribed formulas are used, even in the most aesthetic realizations. This paper presents the application of an erudite design process to conceptualize biomimetic paradigms in architecture. The process is customized to material and tectonics. The first part of the paper outlines the design process elements within four biomimetic pre-concepts. The pre-concepts are chosen from the plant family: the pine leaf, the dandelion flower, the cactus flower and the sunflower. The choice of these is related to the material qualities and the natural pattern of the tectonics of these plants. The paper then focuses on four versions of tectonic comprehension of one of the biomimetic pre-concepts. The next part of the paper discusses the implementation of STEAM in higher education in architecture. This is shown by the relations within the design process and the manifestation of the thinking processes. The A in STEAM, in this case, is only achieved by the design process, an engaging event akin to the performing arts, in which conceptualization and development are realized in the final build.
Keywords: biomimetic paradigm, erudite design process, tectonic, STEAM (Science, Technology, Engineering, Arts, Mathematics)
Procedia PDF Downloads 211
15574 Bleeding-Heart Altruists and Calculating Utilitarians: Applying Process Dissociation to Self-sacrificial Dilemmas
Authors: David Simpson, Kyle Nash
Abstract:
There is considerable evidence linking slow, deliberative reasoning (system 2) with utilitarian judgments in dilemmas involving the sacrifice of another person for the greater good (other-sacrificial dilemmas). Joshua Greene has argued, based on this kind of evidence, that system 2 drives utilitarian judgments. However, the evidence on whether system 2 is associated with utilitarian judgments in self-sacrificial dilemmas is more mixed. We employed process dissociation to measure a self-sacrificial utilitarian (SU) parameter and an other-sacrificial utilitarian (OU) parameter. It was initially predicted that, contra Greene, the cognitive reflection test (CRT) would be positively correlated only with the OU parameter and not the SU parameter. However, Greene's hypothesis was corroborated: the CRT positively correlated with both the OU parameter and the SU parameter. By contrast, the CRT did not correlate with the other two moral parameters we extracted (altruism and deontology).
Keywords: dual-process model, utilitarianism, altruism, reason, emotion, process dissociation
Procedia PDF Downloads 153
15573 Hybrid Data-Driven Drilling Rate of Penetration Optimization Scheme Guided by Geological Formation and Historical Data
Authors: Ammar Alali, Mahmoud Abughaban, William Contreras Otalvora
Abstract:
Optimizing the drilling process for cost and efficiency requires the optimization of the rate of penetration (ROP). ROP is the measurement of the speed at which the wellbore is created, in units of feet per hour, and it is the primary indicator of drilling efficiency. Maximization of the ROP can indicate fast and cost-efficient drilling operations; however, high ROPs may induce unintended events, which may lead to nonproductive time (NPT) and higher net costs. The proposed ROP optimization solution is a hybrid, data-driven system that aims to improve the drilling process, maximize the ROP, and minimize NPT. The system consists of two phases: (1) utilizing existing geological and drilling data to train the model prior to drilling, and (2) real-time adjustment of the controllable dynamic drilling parameters [weight on bit (WOB), rotary speed (RPM), and pump flow rate (GPM)] that directly influence the ROP. During the first phase of the system, geological and historical drilling data are aggregated. Afterwards, the top-rated wells, in terms of highest achieved ROP, are identified. Those wells are filtered based on NPT incidents, and a cross-plot is generated for the controllable dynamic drilling parameters per ROP value. Subsequently, the parameter values (WOB, GPM, RPM) are calculated as a conditioned mean based on physical distance, following the Inverse Distance Weighting (IDW) interpolation methodology. The first phase is concluded by producing a model of drilling best practices from the offset wells, prioritizing the optimum ROP value; this phase is performed before drilling commences. Starting with the model produced in phase one, the second phase runs an automated drill-off test, delivering live adjustments in real time. Those adjustments are made by directing the driller to deviate two of the controllable parameters (WOB and RPM) by a small percentage (0-5%), following the Constrained Random Search (CRS) methodology. These minor incremental variations reveal new drilling conditions not explored before through offset wells. The data are then consolidated into a heat map as a function of ROP, and better ROP performance identified through the heat map is incorporated into the model. The validation process involved the selection of a planned well in an onshore oil field with hundreds of offset wells. The first-phase model was built by utilizing the data points from the top-performing historical wells (20 wells). The model allows drillers to enhance decision-making by leveraging existing data and blending it with live data in real time. An empirical relationship between the controllable dynamic parameters and ROP was derived using Artificial Neural Networks (ANN). The adjustments resulted in ROP efficiency improved by over 20%, translating to at least a 10% saving in drilling costs. The novelty of the proposed system lies in its ability to integrate historical data, calibrate based on geological formations, and run real-time global optimization through CRS. Those factors position the system to work for any newly drilled well in a developing field.
Keywords: drilling optimization, geological formations, machine learning, rate of penetration
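The first phase conditions the drilling parameters on offset-well data using Inverse Distance Weighting; a minimal sketch of that interpolation step is shown below (the offset-well coordinates and parameter values are invented for illustration, not field data):

```python
import numpy as np

def idw(query_xy, well_xy, well_values, power=2.0, eps=1e-9):
    """Inverse Distance Weighting: conditioned mean of offset-well parameters
    (e.g., WOB, RPM, GPM) weighted by physical distance to the planned location."""
    d = np.linalg.norm(well_xy - query_xy, axis=1) + eps
    w = 1.0 / d**power
    return (w[:, None] * well_values).sum(axis=0) / w.sum()

# Hypothetical offset wells: (x, y) map coordinates and their best [WOB, RPM, GPM]
well_xy = np.array([[0.0, 0.0], [1.5, 0.2], [0.8, 2.1], [2.2, 1.7]])
well_values = np.array([[22.0, 120.0, 850.0],
                        [25.0, 110.0, 900.0],
                        [20.0, 130.0, 820.0],
                        [27.0, 115.0, 880.0]])

planned_well = np.array([1.0, 1.0])
wob, rpm, gpm = idw(planned_well, well_xy, well_values)
print(f"Recommended starting point: WOB={wob:.1f} klbf, RPM={rpm:.0f}, GPM={gpm:.0f}")
```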
Procedia PDF Downloads 131
15572 Simultaneous Determination of Cefazolin and Cefotaxime in Urine by HPLC
Authors: Rafika Bibi, Khaled Khaladi, Hind Mokran, Mohamed Salah Boukhechem
Abstract:
A high performance liquid chromatographic method with ultraviolet detection at 264 nm was developed and validated for the quantitative determination and separation of cefazolin and cefotaxime in urine. The mobile phase consisted of acetonitrile and phosphate buffer at pH 4.2 (15:85, v/v), pumped through an ODB 250 × 4.6 mm, 5 µm column at a flow rate of 1 mL/min, with a 20 µL injection loop. Under these conditions, validation showed that the method is linear over a range of 0.01 to 10 µg/mL with a good correlation coefficient (R > 0.9997). The retention times of cefotaxime and cefazolin were 9.0 and 10.1, respectively. The statistical evaluation of the method was examined by means of within-day (n = 6) and day-to-day (n = 5) assays and was found to be satisfactory, with high accuracy and precision.
Keywords: cefazolin, cefotaxime, HPLC, bioscience, biochemistry, pharmaceutical
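Linearity over 0.01–10 µg/mL is normally established by fitting a calibration line of peak area against concentration; the sketch below shows that computation with made-up peak areas (the numbers are illustrative, not the paper's data):

```python
import numpy as np

# Hypothetical calibration standards (µg/mL) and measured peak areas
conc = np.array([0.01, 0.05, 0.1, 0.5, 1.0, 5.0, 10.0])
area = np.array([0.9, 4.8, 10.2, 50.5, 101.0, 502.0, 1000.5])

slope, intercept = np.polyfit(conc, area, 1)   # least-squares calibration line
r = np.corrcoef(conc, area)[0, 1]              # correlation coefficient

print(f"area = {slope:.2f} * conc + {intercept:.2f},  R = {r:.5f}")

# Back-calculate the concentration of an unknown urine sample from its peak area
unknown_area = 250.0
print(f"estimated concentration: {(unknown_area - intercept) / slope:.3f} µg/mL")
```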
Procedia PDF Downloads 363
15571 Spatial Point Process Analysis of Dengue Fever in Tainan, Taiwan
Authors: Ya-Mei Chang
Abstract:
This research applies spatio-temporal point process methods to dengue fever data from Tainan. The spatio-temporal intensity function of the dataset is assumed to be separable. Kernel estimation is a widely used approach to estimate intensity functions. The intensity function is very helpful for studying the relation between the spatio-temporal point process and covariates. The covariate effects might be nonlinear, so a nonparametric smoothing estimator is used to detect the nonlinearity of the covariate effects. A fitted parametric model could then describe the influence of the covariates on dengue fever. The correlation between the data points is detected by the K-function. The results of this research could provide useful information to help the government or stakeholders make decisions.
Keywords: dengue fever, spatial point process, kernel estimation, covariate effect
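A minimal sketch of kernel intensity estimation for a spatial point pattern is shown below (the case locations are simulated, and the Gaussian kernel and bandwidth are illustrative choices, not necessarily those used in the study):

```python
import numpy as np

def kernel_intensity(points, grid_x, grid_y, bandwidth):
    """Gaussian kernel estimate of the first-order intensity lambda(x, y)
    for a spatial point pattern (events per unit area)."""
    gx, gy = np.meshgrid(grid_x, grid_y)
    lam = np.zeros_like(gx)
    for px, py in points:
        d2 = (gx - px) ** 2 + (gy - py) ** 2
        lam += np.exp(-d2 / (2 * bandwidth**2)) / (2 * np.pi * bandwidth**2)
    return lam

rng = np.random.default_rng(0)
cases = rng.uniform(0, 10, size=(200, 2))      # simulated dengue case locations (km)
grid = np.linspace(0, 10, 50)
lam = kernel_intensity(cases, grid, grid, bandwidth=1.0)
print("peak intensity (cases per km^2):", lam.max().round(2))
```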
Procedia PDF Downloads 351
15570 Kou Jump Diffusion Model: An Application to the S&P 500, Nasdaq 100 and Russell 2000 Index Options
Authors: Wajih Abbassi, Zouhaier Ben Khelifa
Abstract:
The present research addresses the empirical validation of three option valuation models: the ad hoc Black-Scholes model as proposed by Berkowitz (2001), the constant elasticity of variance model of Cox and Ross (1976), and the Kou jump-diffusion model (2002). Our empirical analysis has been conducted on a sample of 26,974 options written on three indexes (the S&P 500, the Nasdaq 100, and the Russell 2000) that were traded during the year 2007, just before the sub-prime crisis. We start by presenting the theoretical foundations of the models of interest. Then we use the trust-region-reflective algorithm to estimate the structural parameters of these models from cross-sections of option prices. The empirical analysis shows the superiority of the Kou jump-diffusion model. This superiority arises from the ability of this model to portray the behavior of market participants and to be closest to the true distribution that characterizes the evolution of these indices. Indeed, the double-exponential distribution covers three interesting properties: the leptokurtic feature, the memoryless property, and the psychological aspect of market participants. Numerous empirical studies have shown that markets tend to exhibit both overreaction and underreaction to good and bad news, respectively. Despite these advantages, there are not many empirical studies based on this model, partly because its probability distribution and option valuation formula are rather complicated. This paper is the first to use nonlinear curve-fitting through the trust-region-reflective algorithm on cross-sections of options to estimate the structural parameters of the Kou jump-diffusion model.
Keywords: jump-diffusion process, Kou model, leptokurtic feature, trust-region-reflective algorithm, US index options
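For reference, the asset dynamics under the Kou (2002) model take the following standard textbook form (reproduced here as background, not as the authors' specific calibration equations):

```latex
\frac{dS(t)}{S(t^-)} = \mu\,dt + \sigma\,dW(t) + d\!\left(\sum_{i=1}^{N(t)}\big(V_i - 1\big)\right),
\qquad
f_Y(y) = p\,\eta_1 e^{-\eta_1 y}\,\mathbf{1}_{\{y \ge 0\}} + q\,\eta_2 e^{\eta_2 y}\,\mathbf{1}_{\{y < 0\}},
```

where W(t) is a Brownian motion, N(t) a Poisson process with intensity λ, and V_i = e^{Y_i} the i.i.d. jump multipliers whose logarithm Y follows the asymmetric double-exponential density f_Y, with η₁ > 1, η₂ > 0 and upward/downward jump probabilities p + q = 1. Calibration then amounts to estimating (σ, λ, p, η₁, η₂) from the cross-section of observed option prices, which is the nonlinear curve-fitting problem the trust-region-reflective algorithm solves.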
Procedia PDF Downloads 429
15569 Enhanced CNN for Rice Leaf Disease Classification in Mobile Applications
Authors: Kayne Uriel K. Rodrigo, Jerriane Hillary Heart S. Marcial, Samuel C. Brillo
Abstract:
Rice leaf diseases significantly impact yield production in rice-dependent countries, affecting their agricultural sectors. As part of precision agriculture, early and accurate detection of these diseases is crucial for effective mitigation practices and minimizing crop losses. Hence, this study proposes an enhancement to the Convolutional Neural Network (CNN), a widely used method for rice leaf disease image classification, by incorporating MobileViTV2, a recently advanced architecture that combines CNN and Vision Transformer models while maintaining fewer parameters, making it suitable for broader deployment on edge devices. Our methodology utilizes a publicly available rice disease image dataset from Kaggle, which was validated by a university structural biologist following the guidelines provided by the Philippine Rice Institute (PhilRice). Modifications to the dataset include renaming certain disease categories and augmenting the rice leaf image data through rotation, scaling, and flipping. The enhanced dataset was then used to train the MobileViTV2 model using the Timm library. The results of our approach are as follows: the model achieved notable performance, with 98% accuracy in both training and validation, 6% training and validation loss, and a Receiver Operating Characteristic (ROC) curve ranging from 95% to 100% for each label. Additionally, the F1 score was 97%. These metrics demonstrate a significant improvement compared to a conventional CNN-based approach, which, in a previous 2022 study, achieved only 78% accuracy after using 5 convolutional layers and 2 dense layers. Thus, it can be concluded that MobileViTV2, with its fewer parameters, outperforms traditional CNN models, particularly when applied to rice leaf disease image identification. For future work, we recommend extending this model to include datasets validated by international rice experts and broadening the scope to accommodate biotic factors such as rice pest classification, as well as abiotic stressors such as climate, soil quality, and geographic information, which could improve the accuracy of disease prediction.
Keywords: convolutional neural network, MobileViTV2, rice leaf disease, precision agriculture, image classification, vision transformer
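The abstract states that MobileViTV2 was trained through the timm library; a minimal fine-tuning sketch of that setup is given below (the specific model variant, class count, image size, and dataset path are assumptions for illustration, not the study's configuration):

```python
import timm
import torch
from torch import nn, optim
from torch.utils.data import DataLoader
from torchvision import datasets, transforms

# Assumed variant; timm also ships wider MobileViTV2 models (e.g. mobilevitv2_100)
model = timm.create_model("mobilevitv2_050", pretrained=True, num_classes=4)

train_tf = transforms.Compose([
    transforms.RandomRotation(20),         # augmentation mirroring the paper: rotation,
    transforms.RandomResizedCrop(256),     # scaling,
    transforms.RandomHorizontalFlip(),     # and flipping
    transforms.ToTensor(),
])
train_ds = datasets.ImageFolder("rice_leaf_dataset/train", transform=train_tf)  # hypothetical path
train_dl = DataLoader(train_ds, batch_size=32, shuffle=True)

device = "cuda" if torch.cuda.is_available() else "cpu"
model = model.to(device)
criterion = nn.CrossEntropyLoss()
optimizer = optim.AdamW(model.parameters(), lr=3e-4)

for epoch in range(10):
    model.train()
    for images, labels in train_dl:
        images, labels = images.to(device), labels.to(device)
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
```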
Procedia PDF Downloads 25
15568 Lean Manufacturing Implementation in Fused Plastic Bags Industry
Authors: Tareq Issa
Abstract:
Lean manufacturing is concerned with the implementation of several tools and methodologies that aim for the continuous elimination of wastes throughout the manufacturing process flow in the production system. This research addresses the implementation of lean principles and tools in a small-to-medium industry, focusing on a 'fused' plastic bags production company in Amman, Jordan. In this production operation, the major types of waste to eliminate include material, waiting/transportation, and setup wastes. The primary goal is to identify and implement selected lean strategies to eliminate waste in the manufacturing process flow. A systematic approach was used for the implementation of lean principles and techniques, through the application of Value Stream Mapping analysis. The current-state value stream map was constructed to improve the plastic bags manufacturing process by identifying opportunities to eliminate waste and its sources. The future-state value stream map was then developed, describing improvements in the overall manufacturing process resulting from eliminating wastes. The implementation of VSM, 5S, Kanban, Kaizen, and reduced lot size methods has provided significant benefits and results. Productivity increased to 95.4%, delivery schedule attainment reached 99-100%, total inventory was reduced to 1.4 days, and the setup time for the melting process was reduced to about 30 minutes.
Keywords: lean implementation, plastic bags industry, value stream map, process flow
Procedia PDF Downloads 175
15567 The Using of Smart Power Concepts in Military Targeting Process
Authors: Serdal AKYUZ
Abstract:
Smart power is the use of soft and hard power together in consideration of existing circumstances. Soft power can be defined as the capability of changing the perception of any target mass by employing policies based on legality. Hard power, generally, uses military and economic instruments, which are the concrete indicators of the general comprehension of power. More than providing a balance between soft and hard power, smart power creates a proactive combination by assessing existing resources. The military targeting process (MTP), as stated in the smart power methodology, benefits from a wide scope of lethal and non-lethal weapons to reach the intended end state. Smart power components can be used in the military targeting process in a manner similar to the use of lethal or non-lethal weapons. This paper investigates the current use of the smart power concept and the MTP, and presents a new approach to the MTP from a smart power point of view.
Keywords: future security environment, hard power, military targeting process, soft power, smart power
Procedia PDF Downloads 476
15566 Enhancement of MIMO H₂S Gas Sweetening Separator Tower Using Fuzzy Logic Controller Array
Authors: Muhammad M. A. S. Mahmoud
Abstract:
The natural gas sweetening process is a controlled process that must be done at maximum efficiency and with the highest quality. In this work, due to the complexity and non-linearity of the process, the H₂S gas separation and the intelligent fuzzy controller used to enhance the process are simulated in MATLAB/Simulink. The new design of fuzzy control for the gas separator is discussed in this paper. The design is based on the utilization of linear state estimation to generate the internal knowledge base that stores input-output pairs. The obtained input/output pairs are then used to design a feedback fuzzy controller. The proposed closed-loop fuzzy control system maintains the asymptotic stability of the system while it enhances the system time response to achieve better control of the concentration of the output gas from the tower. Simulation studies are carried out to illustrate the gas separator system performance.
Keywords: gas separator, gas sweetening, intelligent controller, fuzzy control
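A minimal sketch of the kind of rule-based fuzzy inference such a controller performs is shown below; the membership functions, rule base, and variable names are invented for illustration and are not the controller described in the abstract:

```python
import numpy as np

def trimf(x, a, b, c):
    """Triangular membership function."""
    return np.maximum(np.minimum((x - a) / (b - a + 1e-12), (c - x) / (c - b + 1e-12)), 0.0)

def fuzzy_valve_adjustment(h2s_error):
    """Map the H2S concentration error (ppm) to a solvent valve adjustment (%)
    using three rules and centroid defuzzification (illustrative only)."""
    # Fuzzify the input
    neg  = trimf(h2s_error, -10, -5, 0)    # below setpoint
    zero = trimf(h2s_error,  -5,  0, 5)    # near setpoint
    pos  = trimf(h2s_error,   0,  5, 10)   # above setpoint

    # Output universe and consequent sets: close / hold / open the solvent valve
    u = np.linspace(-20, 20, 401)
    close_ = np.minimum(neg,  trimf(u, -20, -10, 0))
    hold   = np.minimum(zero, trimf(u, -10,   0, 10))
    open_  = np.minimum(pos,  trimf(u,   0,  10, 20))

    aggregated = np.maximum.reduce([close_, hold, open_])
    return np.sum(u * aggregated) / (np.sum(aggregated) + 1e-12)   # centroid

print(f"valve adjustment: {fuzzy_valve_adjustment(3.0):+.1f} %")
```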
Procedia PDF Downloads 471
15565 A Tool for Assessing Performance and Structural Quality of Business Process
Authors: Mariem Kchaou, Wiem Khlif, Faiez Gargouri
Abstract:
Modeling business processes is an essential task when evaluating, improving, or documenting existing business processes. To be useful in such tasks, a business process model (BPM) must have high structural quality and high performance. Evaluating the performance of a business process model is a necessary step to reduce time and cost, while assessing the structural quality aims to improve the understandability and modifiability of the BPMN model. To achieve these objectives, a set of structural and performance measures has been proposed. Given the diversity of measures, we propose a framework that integrates both structural and performance aspects for classifying them. Our measure classification is based on business process model perspectives (e.g., informational, functional, organizational, behavioral, and temporal) and on the elements (activity, event, actor, etc.) involved in computing the measures. We then implement this framework in a tool for assessing the structural quality and the performance of a business process. The tool helps designers select an appropriate subset of measures associated with the corresponding perspective and calculate and interpret their values in order to improve the structural quality and the performance of the model.
Keywords: performance, structural quality, perspectives, tool, classification framework, measures
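To illustrate the kind of perspective-based measure computation such a tool performs, a small sketch is given below; the perspective names follow the abstract, but the example measures, the flattened BPMN representation, and the control-flow complexity definition are assumptions, not the authors' tool:

```python
from collections import Counter

# Hypothetical mapping of perspectives to the measures computed for each one
MEASURES_BY_PERSPECTIVE = {
    "functional":     ["number_of_activities"],
    "behavioral":     ["number_of_gateways", "control_flow_complexity"],
    "organizational": ["number_of_actors"],
}

# A toy BPMN model flattened into element records
bpmn_elements = [
    {"type": "activity", "actor": "clerk"},
    {"type": "activity", "actor": "manager"},
    {"type": "gateway", "kind": "XOR", "outgoing": 2},
    {"type": "activity", "actor": "clerk"},
    {"type": "event", "kind": "end"},
]

def compute_measures(elements):
    counts = Counter(e["type"] for e in elements)
    return {
        "number_of_activities": counts["activity"],
        "number_of_gateways": counts["gateway"],
        "number_of_actors": len({e["actor"] for e in elements if "actor" in e}),
        # Cardoso-style control-flow complexity: each gateway contributes its outgoing branches
        "control_flow_complexity": sum(e.get("outgoing", 0) for e in elements if e["type"] == "gateway"),
    }

print(compute_measures(bpmn_elements))
```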
Procedia PDF Downloads 157
15564 The Use of Artificial Intelligence to Harmonization in the Lawmaking Process
Authors: Supriyadi, Andi Intan Purnamasari, Aminuddin Kasim, Sulbadana, Mohammad Reza
Abstract:
The development of the Industrial Revolution 4.0 era has had a significant influence on the administration of countries in all parts of the world, including Indonesia; not only the administrative and economic sectors but also the ways and methods of forming laws must be adjusted. Until now, the process of making laws carried out by the Parliament with the Government has used the classical method. The law-making process still uses manual methods, such as typing out the harmonization of regulations, so errors such as writing mistakes and mis-copied articles are not uncommon in work that requires a high level of accuracy and relies on inventory and harmonization carried out manually by humans. This method often creates problems due to errors and inaccuracies on the part of the officers who harmonize laws after discussion and approval; this has a very serious impact on the system of law formation in Indonesia. The use of artificial intelligence in the process of forming laws therefore appears justified and becomes the answer for minimizing the disharmony of various laws and regulations. This research is normative research using the legislative approach and the conceptual approach. It focuses on the question of how to use artificial intelligence for harmonization in the lawmaking process.
Keywords: artificial intelligence, harmonization, laws, intelligence
Procedia PDF Downloads 162
15563 Inadequate Requirements Engineering Process: A Key Factor for Poor Software Development in Developing Nations: A Case Study
Authors: K. Adu Michael, K. Alese Boniface
Abstract:
Developing reliable and sustainable software products is today a big challenge among up-and-coming software developers in Nigeria. The comprehensive problem statement needed to execute a proper requirements engineering process is missing. Describing the 'what' of a system in one document, written in natural language, is a major step in the overall process of software engineering. Requirements engineering is a process used to discover, analyze and validate system requirements. This process is needed to reduce software errors at the early stages of software development. The importance of each of the steps in requirements engineering is clearly explained in the context of using a detailed problem statement from the client/customer to get an overview of an existing system along with expectations from the new system. This paper identifies an inadequate requirements engineering process as the major cause of poor software development in developing nations, using a case study of final-year computer science students of a tertiary-education institution in Nigeria.
Keywords: client/customer, problem statement, requirements engineering, software developers
Procedia PDF Downloads 406
15562 Modelling and Optimization of Laser Cutting Operations
Authors: Hany Mohamed Abdu, Mohamed Hassan Gadallah, El-Giushi Mokhtar, Yehia Mahmoud Ismail
Abstract:
Laser beam cutting is a nontraditional machining process. This paper optimizes the laser beam cutting parameters for stainless steel (316L) by considering the effect of the input parameters, viz. power, oxygen pressure, frequency and cutting speed. A statistical design of experiments is carried out at three different levels, and process responses such as 'average kerf taper (Ta)' and 'surface roughness (Ra)' are measured accordingly. A quadratic mathematical model (RSM) for each of the responses is developed as a function of the process parameters. Responses predicted by the models (as per Taguchi's L27 OA) are employed to search for an optimal parametric combination to achieve the desired yield of the process. RSM models are developed for the mean responses, S/N ratio, and standard deviation of responses. Optimization models are formulated as single-objective problems subject to process constraints. Models are formulated based on Analysis of Variance (ANOVA) using the MATLAB environment. Optimum solutions are compared with Taguchi methodology results.
Keywords: optimization, laser cutting, robust design, kerf width, Taguchi method, RSM and DOE
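The response surface models referred to above are second-order polynomials in the process parameters; a minimal sketch of fitting such a quadratic model by least squares is given below (written in Python rather than the MATLAB environment used in the study, with invented coded factors and responses standing in for the actual L27 data):

```python
import numpy as np

def quadratic_design_matrix(X):
    """Columns: intercept, linear terms, squared terms, and all two-factor interactions."""
    n, k = X.shape
    cols = [np.ones(n)] + [X[:, i] for i in range(k)] + [X[:, i] ** 2 for i in range(k)]
    cols += [X[:, i] * X[:, j] for i in range(k) for j in range(i + 1, k)]
    return np.column_stack(cols)

# Hypothetical coded factors (power, O2 pressure, frequency, speed) and a kerf-taper response
rng = np.random.default_rng(1)
X = rng.choice([-1.0, 0.0, 1.0], size=(27, 4))     # stand-in for an L27 orthogonal array
y = 0.5 + 0.2 * X[:, 0] - 0.1 * X[:, 3] + 0.05 * X[:, 0] * X[:, 1] + rng.normal(0, 0.01, 27)

A = quadratic_design_matrix(X)
coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)      # RSM regression coefficients
y_hat = A @ coeffs
print("R^2 =", 1 - np.sum((y - y_hat) ** 2) / np.sum((y - y.mean()) ** 2))
```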
Procedia PDF Downloads 620
15561 Probing Multiple Relaxation Process in Zr-Cu Base Alloy Using Mechanical Spectroscopy
Authors: A. P. Srivastava, D. Srivastava, D. J. Browne
Abstract:
The relaxation dynamics of Zr44Cu40Al8Ag8 bulk metallic glass (BMG) have been probed using a dynamic mechanical analyzer. The BMG sample was cast in the form of a plate of dimensions 55 mm × 40 mm × 3 mm using the tilt casting technique. X-ray diffraction and transmission electron microscopy were used for the microstructural characterization of the as-cast BMG. For the mechanical spectroscopy study, samples in the form of bars of size 55 mm × 2 mm × 3 mm were machined from the BMG plate. The mechanical spectroscopy was performed on a dynamic mechanical analyzer (DMA) by the 50 mm 3-point bending method in a nitrogen atmosphere. It was observed that two glass transition processes were competing in the supercooled liquid region, around temperatures of 390 °C and 430 °C. The supercooled liquid state was completely characterized using the DMA and a differential scanning calorimeter (DSC). In addition to the main α-relaxation process, the presence of a β-relaxation process around 360 °C, below the glass transition temperature, was also observed. The β-relaxation process could be described by an Arrhenius law with an activation energy of 160 kJ/mol. The volume of the flow unit associated with this relaxation process has been estimated. The results from the DMA study have been used to characterize the shear transformation zone in terms of activation volume and size. The high fragility parameter value of 34 and the higher activation volume indicate that this alloy could show good plasticity in the supercooled liquid region. The possible mechanisms for the relaxation processes are discussed.
Keywords: DMA, glass transition, metallic glass, thermoplastic forming
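The Arrhenius description of the β relaxation mentioned above relates the relaxation (peak) frequency to temperature; in its usual form, with the 160 kJ/mol value taken from the abstract:

```latex
f_{\beta} = f_{0}\,\exp\!\left(-\frac{E_{a}}{R\,T}\right),
\qquad E_{a} \approx 160~\text{kJ·mol}^{-1},
```

so a plot of ln f_β against 1/T is a straight line whose slope −E_a/R yields the activation energy of the β relaxation.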
Procedia PDF Downloads 296
15560 Tool Condition Monitoring of Ceramic Inserted Tools in High Speed Machining through Image Processing
Authors: Javier A. Dominguez Caballero, Graeme A. Manson, Matthew B. Marshall
Abstract:
Cutting tools with ceramic inserts are often used in the machining of many types of superalloy, mainly due to their high strength and thermal resistance. Nevertheless, during the cutting process, the plastic flow wear generated in these inserts enhances and propagates cracks due to high temperature and high mechanical stress. This leads to highly variable failure of the cutting tool. This article explores the relationship between the continuous wear that ceramic SiAlON (solid solutions based on the Si3N4 structure) inserts experience during a high-speed machining process and the evolution of the sparks created during the same process. These sparks were analysed through pictures of the cutting process recorded using an SLR camera. Features relating to the intensity and area of the cutting sparks were extracted from the individual pictures using image processing techniques. These features were then related to the ceramic insert's crater wear area.
Keywords: ceramic cutting tools, high speed machining, image processing, tool condition monitoring, tool wear
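A minimal sketch of the kind of spark-feature extraction described (thresholding each frame and measuring the spark region's area and intensity) is given below; the file names, threshold value, and feature definitions are assumptions, not the authors' exact pipeline:

```python
import cv2

def spark_features(image_path, threshold=200):
    """Return the area (pixels) and mean intensity of the bright spark region
    in one photograph of the cutting process."""
    img = cv2.imread(image_path)
    if img is None:
        raise FileNotFoundError(image_path)
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
    _, mask = cv2.threshold(gray, threshold, 255, cv2.THRESH_BINARY)
    area = int(cv2.countNonZero(mask))
    mean_intensity = float(gray[mask > 0].mean()) if area > 0 else 0.0
    return area, mean_intensity

# Hypothetical sequence of frames recorded with the SLR camera
for frame in ["cut_0001.jpg", "cut_0002.jpg", "cut_0003.jpg"]:
    area, intensity = spark_features(frame)
    print(f"{frame}: spark area = {area} px, mean intensity = {intensity:.1f}")
```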
Procedia PDF Downloads 298
15559 Rounded-off Measurements and Their Implication on Control Charts
Authors: Ran Etgar
Abstract:
The process of rounding off measurements of continuous variables is commonly encountered. Although it usually has minor effects, it can sometimes lead to poor outcomes in statistical process control using the X̄-chart. The traditional control limits can lead to incorrect conclusions if applied carelessly. This study looks into the limitations of the classical control limits, particularly the impact of asymmetry. An approach to determining the distribution function of the measured parameter (Ȳ) is presented, resulting in a more precise method to establish the upper and lower control limits. The proposed method, while slightly more complex than Shewhart's original idea, is still user-friendly and accurate and only requires the use of two straightforward tables.
Keywords: inaccurate measurement, SPC, statistical process control, rounded-off, control chart
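For context, the classical Shewhart X̄-chart limits that the abstract says can mislead when data are rounded are computed as below; the subgroup data here are simulated, and the rounding step simply illustrates the effect being discussed (this is the traditional computation, not the authors' corrected method):

```python
import numpy as np

rng = np.random.default_rng(7)
true_samples = rng.normal(loc=10.0, scale=0.05, size=(50, 5))   # 50 subgroups of n = 5
rounded = np.round(true_samples, 1)                              # coarse rounding to 0.1

def xbar_limits(samples, d2=2.326):       # d2 constant for subgroup size n = 5
    xbar = samples.mean(axis=1)
    rbar = np.ptp(samples, axis=1).mean()            # average subgroup range
    sigma_hat = rbar / d2
    center = xbar.mean()
    half_width = 3 * sigma_hat / np.sqrt(samples.shape[1])
    return center - half_width, center, center + half_width

print("limits from true data:   ", [round(v, 4) for v in xbar_limits(true_samples)])
print("limits from rounded data:", [round(v, 4) for v in xbar_limits(rounded)])
```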
Procedia PDF Downloads 41
15558 A Distributed Cryptographically Generated Address Computing Algorithm for Secure Neighbor Discovery Protocol in IPv6
Authors: M. Moslehpour, S. Khorsandi
Abstract:
Due to the shortage of IPv4 addresses, the transition to IPv6 has gained significant momentum in recent years. Like the Address Resolution Protocol (ARP) in IPv4, the Neighbor Discovery Protocol (NDP) provides functions such as address resolution in IPv6. Despite its functionality, NDP is vulnerable to several attacks. To mitigate these attacks, Internet Protocol Security (IPsec) was introduced, but it was not efficient due to its limitations. Therefore, the SEND protocol was proposed for automatic protection of the auto-configuration process; it secures the neighbor discovery and address resolution process. To defend against threats to NDP's integrity and identity, SEND uses Cryptographically Generated Addresses (CGA) and asymmetric cryptography. Besides the advantages of SEND, its disadvantages, such as the computational cost of the CGA algorithm and the sequential nature of CGA generation, are considerable. In this paper, we parallelize this process across network resources in order to improve it. In addition, we compare the CGA generation time between self-computing and distributed-computing processes. We focus on the impact of malicious nodes on the CGA generation time in the network. According to the results, even when malicious nodes participate in the generation process, the CGA generation time is less than when it is computed by a single node on its own. With a trust management system, detecting and isolating malicious nodes is easier.
Keywords: NDP, IPsec, SEND, CGA, modifier, malicious node, self-computing, distributed-computing
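The costly step in CGA generation is the brute-force search over the modifier until a hash of the public key meets the security-parameter condition (RFC 3972). The sketch below is a deliberately simplified version of that loop: it uses SHA-256 instead of the RFC's exact SHA-1 constructions and omits the full CGA Parameters encoding, so it illustrates the parallelizable workload rather than the real protocol:

```python
import hashlib
import os

def find_modifier(public_key: bytes, sec: int) -> bytes:
    """Brute-force a 16-byte modifier until the hash of
    (modifier || 9 zero bytes || public key) starts with 16*sec zero bits.
    Simplified stand-in for RFC 3972's Hash2 condition."""
    zero_bits = 16 * sec
    zero_bytes, rem_bits = divmod(zero_bits, 8)
    modifier = bytearray(os.urandom(16))
    while True:
        digest = hashlib.sha256(bytes(modifier) + b"\x00" * 9 + public_key).digest()
        ok = digest[:zero_bytes] == b"\x00" * zero_bytes
        if ok and rem_bits:
            ok = (digest[zero_bytes] >> (8 - rem_bits)) == 0
        if ok:
            return bytes(modifier)
        # Increment the modifier and try again; this loop is what distributed nodes can share
        for i in range(15, -1, -1):
            modifier[i] = (modifier[i] + 1) & 0xFF
            if modifier[i]:
                break

modifier = find_modifier(public_key=b"example-public-key-bytes", sec=1)
print("modifier found:", modifier.hex())
```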
Procedia PDF Downloads 278
15557 Detection of Powdery Mildew Disease in Strawberry Using Image Texture and Supervised Classifiers
Authors: Sultan Mahmud, Qamar Zaman, Travis Esau, Young Chang
Abstract:
Strawberry powdery mildew (PM) is a serious disease that has a significant impact on strawberry production. Field scouting is still the major way to find PM disease, which is not only labor-intensive but also makes it almost impossible to monitor disease severity. To reduce the loss caused by PM disease and achieve faster automatic detection of the disease, this paper proposes an approach for detecting the disease based on image texture, classified with support vector machines (SVMs) and k-nearest neighbors (kNNs). The methodology of the proposed study is based on image processing, composed of five main steps: image acquisition, pre-processing, segmentation, feature extraction, and classification. Two strawberry fields were used in this study. Images of healthy leaves and leaves infected with PM (Sphaerotheca macularis) were acquired under artificial cloud lighting conditions. Colour thresholding was utilized to segment all images before textural analysis. The colour co-occurrence matrix (CCM) was used for the extraction of textural features. Forty textural features, related to physiological parameters of the leaves, were extracted from the CCM of National Television System Committee (NTSC) luminance and hue, saturation, and intensity (HSI) images. The normalized feature data were used for training and validation, respectively, with the developed classifiers. The classifiers were tested with internal, external, and cross-validations, and the best classifier was selected based on performance and accuracy. Experimental results suggested that the SVM classifier showed 98.33%, 85.33%, 87.33%, 93.33%, and 95.0% accuracy on internal, external-I, external-II, 4-fold cross-, and 5-fold cross-validation, respectively. The kNN results, in turn, were 90.0%, 72.00%, 74.66%, 89.33%, and 90.3% classification accuracy, respectively. The outcome of this study demonstrated that the SVM classified PM disease with the highest overall accuracy of 91.86% and a processing time of 1.1211 seconds. Overall, the results show that the proposed approach can significantly support accurate and automatic identification and recognition of strawberry PM disease with the SVM classifier.
Keywords: powdery mildew, image processing, textural analysis, color co-occurrence matrix, support vector machines, k-nearest neighbors
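A minimal sketch of the classification stage (training an SVM on normalized CCM texture features and evaluating it with hold-out and cross-validation) is shown below; the feature matrix is random stand-in data, not the forty-feature dataset from the study, and the kernel and hyperparameters are assumptions:

```python
import numpy as np
from sklearn.model_selection import cross_val_score, train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Stand-in data: 200 leaf images x 40 CCM texture features; label 0 = healthy, 1 = powdery mildew
rng = np.random.default_rng(42)
X = rng.normal(size=(200, 40))
y = rng.integers(0, 2, size=200)
X[y == 1, :5] += 1.5            # give infected leaves a separable texture signature

model = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
model.fit(X_train, y_train)
print("hold-out accuracy :", model.score(X_test, y_test))
print("5-fold CV accuracy:", cross_val_score(model, X, y, cv=5).mean())
```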
Procedia PDF Downloads 120
15556 Systemic Functional Grammar Analysis of Barack Obama's Second Term Inaugural Speech
Authors: Sadiq Aminu, Ahmed Lamido
Abstract:
This research studies Barack Obama's second inaugural speech using Halliday's Systemic Functional Grammar (SFG). SFG is a text grammar which describes how language is used, so that the meaning of a text can be better understood. The primary source of data in this research work is Barack Obama's second inaugural speech, which was obtained from the internet. The analysis of the speech was based on the ideational and textual metafunctions of Systemic Functional Grammar. Specifically, the researcher analyses the Process Types and Participants (ideational) and the Theme/Rheme (textual). It was found that the material process (process of doing) was the most frequently used process type, and 'We', which refers to the people of America, was the most frequently used Theme. The application of SFG theory, therefore, gives a better understanding of the meaning of Barack Obama's speech.
Keywords: ideational, metafunction, rheme, textual, theme
Procedia PDF Downloads 159
15555 Evaluation of Agricultural Drought Impact in the Crop Productivity of East Gojjam Zone
Authors: Walelgn Dilnesa Cherie, Fasikaw Atanaw Zimale, Bekalu W. Asres
Abstract:
The most catastrophic condition for agricultural production is a drought event, which is also one of the hazards most closely related to hydro-meteorological conditions. Reflecting the combined susceptibility of plants to meteorological and hydrological conditions, agricultural drought is defined by the magnitude, severity, and duration of a drought that affects crop production. Accurate and timely assessment of agricultural drought can lead to the development of risk management strategies, appropriate proactive mechanisms for the protection of farmers, and the improvement of food security. The evaluation of agricultural drought in the East Gojjam zone was the primary subject of this study. To identify agricultural drought, soil moisture anomalies, soil water deficit indices, and the Normalized Difference Vegetation Index (NDVI) are used. The measured wilting point, field capacity, and soil moisture were utilized to validate the soil water deficit indices computed from the satellite data. The soil moisture and soil water deficit indices in 2013 were at a minimum in all woredas, which caused vegetation stress across all woredas. The soil moisture content decreased in 2013, 2014, 2019, and 2021 in Dejen, and in 2014 and 2019 in Awobel woreda. The maximum and minimum NDVI values in 2013 were the lowest, predominantly indicating vegetation stress and an observed agricultural drought in all woredas. The validation of the satellite-derived soil moisture and soil water deficit indices against in-situ data shows good agreement, with R² = 0.87 and 0.56, respectively. The study area is a drought-affected region, so government officials, policymakers, and environmentalists should pay attention to protection against drought effects.
Keywords: NDVI, agricultural drought, SWDI, soil moisture
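NDVI is computed per pixel from the red and near-infrared reflectance bands; a minimal sketch of that computation and of a simple standardized anomaly for flagging vegetation stress is given below (the band values and the long-term statistics are invented, not the study's imagery):

```python
import numpy as np

def ndvi(nir, red, eps=1e-9):
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red)."""
    nir = nir.astype(float)
    red = red.astype(float)
    return (nir - red) / (nir + red + eps)

# Toy reflectance rasters for one growing season (e.g., 2013)
nir_2013 = np.array([[0.40, 0.35], [0.30, 0.42]])
red_2013 = np.array([[0.20, 0.22], [0.25, 0.18]])
ndvi_2013 = ndvi(nir_2013, red_2013)

# Standardized NDVI anomaly against a hypothetical long-term mean and standard deviation
ndvi_mean = np.full_like(ndvi_2013, 0.45)
ndvi_std  = np.full_like(ndvi_2013, 0.08)
anomaly = (ndvi_2013 - ndvi_mean) / ndvi_std
print("NDVI 2013:\n", ndvi_2013.round(2))
print("standardized anomaly:\n", anomaly.round(2))   # negative values flag vegetation stress
```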
Procedia PDF Downloads 86
15554 How to Enhance Performance of Universities by Implementing Balanced Scorecard with Using FDM and ANP
Authors: Neda Jalaliyoon, Nooh Abu Bakar, Hamed Taherdoost
Abstract:
The present research recommends a balanced scorecard (BSC) framework to appraise the performance of universities. As the original balanced scorecard model has four perspectives, the same model, with the 'financial', 'customer', 'internal process', and 'learning and growth' perspectives, is used to implement the BSC in the present research. By applying the fuzzy Delphi method (FDM) and a questionnaire, sixteen measures of performance were identified. Moreover, using the analytic network process (ANP), the weights of the selected indicators were determined. Results indicated that the most important BSC aspects were Internal Process (0.3149), Customer (0.2769), Learning and Growth (0.2049), and Financial (0.2033), respectively. The proposed BSC framework can help universities to enhance their efficiency in a competitive environment.
Keywords: balanced scorecard, higher education, fuzzy delphi method, analytic network process (ANP)
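The perspective weights reported above come from ANP-style pairwise comparisons; the core computation is the principal eigenvector of a pairwise comparison matrix, sketched below with an invented comparison matrix (not the study's survey data, and omitting the full ANP supermatrix step):

```python
import numpy as np

def priority_vector(pairwise, iterations=100):
    """Principal eigenvector of a reciprocal pairwise comparison matrix,
    obtained by power iteration and normalized to sum to 1."""
    n = pairwise.shape[0]
    w = np.ones(n) / n
    for _ in range(iterations):
        w = pairwise @ w
        w /= w.sum()
    return w

# Hypothetical comparisons among Internal Process, Customer, Learning and Growth, Financial
A = np.array([
    [1.0, 2.0, 2.0, 2.0],
    [1/2, 1.0, 2.0, 1.0],
    [1/2, 1/2, 1.0, 1.0],
    [1/2, 1.0, 1.0, 1.0],
])
weights = priority_vector(A)
for name, w in zip(["Internal Process", "Customer", "Learning and Growth", "Financial"], weights):
    print(f"{name:20s} {w:.4f}")
```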
Procedia PDF Downloads 426