Search results for: standard procedures process
19114 The Impact of Artificial Intelligence on Digital Construction
Authors: Omil Nady Mahrous Maximous
Abstract:
The construction industry is currently experiencing a shift towards digitisation. This transformation is driven by the adoption of technologies like Building Information Modelling (BIM), drones, and augmented reality (AR). These advancements are revolutionizing the process of designing, constructing, and operating projects. BIM, for instance, offers a new way of communicating and of exploiting technology such as software and machinery. It enables the creation of a replica or virtual model of buildings or infrastructure projects. It facilitates simulating construction procedures, identifying issues beforehand, and optimizing designs accordingly. Drones are another tool in this revolution, as they can be utilized for site surveys, inspections, and even deliveries. Moreover, AR technology provides real-time information to workers involved in the project. Implementing these technologies in the construction industry has brought about improvements in efficiency, safety measures, and sustainable practices. BIM helps minimize rework and waste materials, while drones contribute to safety by reducing workers' exposure to hazardous areas. Additionally, AR plays a role in worker safety by delivering instructions and guidance during operations. Although the digital transformation within the construction industry is still in its early stages, it holds the potential to reshape project delivery methods entirely. By embracing these technologies, construction companies can boost their profitability while simultaneously reducing their environmental impact and ensuring safer practices.
Keywords: architectural education, construction industry, digital learning environments, immersive learning, BIM, digital construction, construction technologies, digital transformation, artificial intelligence, collaboration, digital architecture, digital design theory, material selection, space construction
Procedia PDF Downloads 58
19113 Audit Is a Production Performance Tool
Authors: Lattari Samir
Abstract:
The performance of a production process is the result of proper operation, where management tools are the key to success. Process management consists of managing and implementing a quality policy, organizing and planning manufacturing, and thus defining an efficient logic, these being the main areas covered by production management. To carry out this delicate mission, which requires reconciling often contradictory objectives, the auditor is called upon and must be able to express an opinion on the effectiveness of the operation of the "production" function. To do this, the auditor must structure the mission in three phases: a preparation phase to assimilate the particularities of this function, an implementation phase, and a conclusion phase. The audit is a systematic and independent examination of all the stages of a manufacturing process, intended to determine whether the pre-established arrangements for the combination of production factors are respected, whether their implementation is effective, and whether they are relevant in relation to the goals.
Keywords: audit, performance of process, independent examination, management tools, audit of accounts
Procedia PDF Downloads 75
19112 End To End Process to Automate Batch Application
Authors: Nagmani Lnu
Abstract:
Often, Quality Engineering refers to testing applications that have either a User Interface (UI) or an Application Programming Interface (API), and we often find mature test practices, standards, and automation for UI or API testing. However, another kind of application is present in almost all industries that deal with data in bulk and is often handled through something called a Batch Application. This is primarily an offline application that companies develop to process large data sets governed by multiple business rules. The challenge becomes more prominent when we try to automate batch testing. This paper describes the approaches taken to test a Batch application from the financial industry for the payment settlement process (a critical use case in all kinds of FinTech companies), resulting in 100% test automation in test creation and test execution. One can follow this approach for any other batch use case to achieve higher efficiency in the testing process.
Keywords: batch testing, batch test automation, batch test strategy, payments testing, payments settlement testing
Procedia PDF Downloads 60
19111 A Deterministic Approach for Solving the Hull and White Interest Rate Model with Jump Process
Authors: Hong-Ming Chen
Abstract:
This work considers the resolution of the Hull and White interest rate model with a jump process. A deterministic process is adopted to model the random behavior of interest rate variation as deterministic perturbations depending on the time t. The uncertainties from Brownian motion and jumps are represented by a piecewise-constant function w(t) and a point function θ(t), respectively. It is shown that the interest rate function and the yield function of the Hull and White interest rate model with jump process can be obtained by solving a nonlinear semi-infinite programming problem. A relaxed cutting plane algorithm is then proposed for solving the resulting optimization problem. The method is calibrated to 3-month U.S. Treasury securities data and is used to analyze several effects on interest rate prices, including interest rate variability and the negative correlation between stock returns and interest rates. The numerical results illustrate that our approach essentially generates yield functions with minimal fitting errors and small oscillation.
Keywords: optimization, interest rate model, jump process, deterministic
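To make the deterministic treatment of uncertainty concrete, here is a hedged Python sketch in which a Hull-White-type short rate is discretized with the Brownian increment replaced by a piecewise-constant perturbation w(t) and the jump by a point impulse. All parameter values, the perturbation profile and the jump are illustrative assumptions; this is neither the authors' calibrated model nor their relaxed cutting plane algorithm.

```python
# Hypothetical sketch: Euler discretisation of a Hull-White-type short rate in which
# the Brownian increment is replaced by a piecewise-constant perturbation w(t) and
# the jump by a point impulse, in the spirit of the deterministic approach above.
# All parameter values are illustrative assumptions, not the authors' calibration.
import numpy as np

a, sigma = 0.1, 0.01           # assumed mean-reversion speed and volatility
r0, T, n = 0.02, 1.0, 252      # assumed initial rate, horizon (years), steps
dt = T / n
t = np.linspace(0.0, T, n + 1)

theta = 0.02 * np.ones(n)                              # assumed drift target level
w = np.where(t[:-1] < 0.5, 0.5, -0.5)                  # piecewise-constant perturbation w(t)
jumps = np.zeros(n); jumps[int(0.3 * n)] = -0.005      # single illustrative jump

r = np.empty(n + 1); r[0] = r0
for k in range(n):
    # deterministic perturbation replaces the random Brownian increment
    r[k + 1] = r[k] + (theta[k] - a * r[k]) * dt + sigma * w[k] * np.sqrt(dt) + jumps[k]

print(f"average short rate over [0, T]: {r.mean():.4%}")
```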
Procedia PDF Downloads 161
19110 How Envisioning Process Is Constructed: An Exploratory Research Comparing Three International Public Televisions
Authors: Alexandre Bedard, Johane Brunet, Wendellyn Reid
Abstract:
Public television is constantly trying to maintain and develop its audience, and to achieve those goals it needs a strong and clear vision. Vision, or envisioning, is a multidimensional process; from a business perspective it is simultaneously a conduit that orients and fixes the future, an idea that comes before the strategy, and a means by which action is accomplished. Also, vision is often studied in a prescriptive and instrumental manner. Based on our understanding of the literature, we were able to explain how envisioning, as a process, is a creative one; it takes place in the mind and uses wisdom and intelligence through a process of evaluation, analysis and creation. Through an aggregation of the literature, we built a model of the envisioning process, based on past experiences, perceptions and knowledge and influenced by the context, namely the individual, the organization and the environment. In an exploratory study in which vision was deciphered through discourse, using a qualitative and abductive approach and a grounded theory perspective, we explored three extreme cases, with eighteen interviews with experts, leaders, politicians, actors of the industry, etc., amounting to more than twenty hours of interviews in three different countries. We compared the strategy, the business model, and the political and legal forces. We also looked at the history of each industry from an inertial point of view. Our analysis of the data revealed that a legitimacy effect, due to the audience, the innovation and the creativity of the institutions, was the cornerstone of what influences the envisioning process. This allowed us to identify how different the process was for the Canadian, French and UK public broadcasters, although we concluded that all three had a socially constructed vision for their future, based on stakeholder management and an emerging role for the managers: idea brokers.
Keywords: envisioning process, international comparison, television, vision
Procedia PDF Downloads 132
19109 Biomimetic Paradigms in Architectural Conceptualization: Science, Technology, Engineering, Arts and Mathematics in Higher Education
Authors: Maryam Kalkatechi
Abstract:
The application of algorithms in architecture has been realized as geometric forms which are increasingly being used by architecture firms. The abstraction of ideas in a formulated algorithm is not possible; there is still a gap between design innovation and the final built form produced by prescribed formulas, even in the most aesthetic realizations. This paper presents the application of an erudite design process to conceptualize biomimetic paradigms in architecture. The process is customized to material and tectonics. The first part of the paper outlines the design process elements within four biomimetic pre-concepts. The pre-concepts are chosen from the plant family: the pine leaf, the dandelion flower, the cactus flower and the sunflower. The choice of these is related to material qualities and the natural pattern of the tectonics of these plants. The paper then focuses on four versions of tectonic comprehension of one of the biomimetic pre-concepts. The next part of the paper discusses the implementation of STEAM in higher education in architecture. This is shown by the relations within the design process and the manifestation of the thinking processes. The A in STEAM, in this case, is achieved only through the design process, an engaging event akin to the performing arts, in which the conceptualization and development are realized in the final built form.
Keywords: biomimetic paradigm, erudite design process, tectonic, STEAM (Science, Technology, Engineering, Arts, Mathematics)
Procedia PDF Downloads 211
19108 Simulation of Internal Flow Field of Pitot-Tube Jet Pump
Authors: Iqra Noor, Ihtzaz Qamar
Abstract:
The Pitot-tube jet pump, a single-stage pump with low flow rate and high head, consists of a radial impeller that feeds water to a rotating cavity. Water then enters a stationary pitot-tube collector (diffuser), which discharges to the outside. By means of ANSYS Fluent 15.0, the internal flow characteristics of the Pitot-tube jet pump with a standard pitot and with a curved pitot are studied. Under the design condition, the realizable k-ε turbulence model and the SIMPLEC algorithm are used to calculate the 3D flow field inside both pumps. The simulation results reveal that energy is imparted to the flow by the impeller and that, inside the rotor, a forced-vortex type of flow is observed. Total pressure decreases inside the pitot-tube whereas static pressure increases. Changing the pitot-tube from the standard to the curved shape results in minimum flow circulation inside the pitot-tube and leads to higher pump performance.
Keywords: CFD, flow circulation, high pressure pump, impeller, internal flow, pickup tube pump, rectangle channels, rotating casing, turbulence
Procedia PDF Downloads 160
19107 In Online and Laboratory We Trust: Comparing Trust Game Behavior in Three Environments
Authors: Kaisa M. Herne, Hanna E. Björkstedt
Abstract:
Comparisons of online and laboratory environments are important for assessing whether the environment influences behavioral results. Trust game behavior was examined in three environments: 1) the standard laboratory setting with physically present participants (laboratory); 2) an online environment with an online meeting before playing the trust game (online plus a meeting); and 3) an online environment without a meeting (online without a meeting). In the laboratory, participants were present in a classroom and played the trust game anonymously via computers. Online plus a meeting mimicked the laboratory in that participants could see each other in an online meeting before sessions started, whereas online without a meeting was a standard online experiment in which participants did not see each other at any stage of the experiment. Participants were recruited through student subject pools at two universities. The trust game was identical in all conditions; it was played with the same software, anonymously, and with stranger matching. There were no statistically significant differences between the treatment conditions regarding trust or trustworthiness. The results suggest that conducting trust game experiments online will yield results similar to experiments implemented in a laboratory.
Keywords: laboratory vs. online experiment, trust behavior, trust game, trustworthiness behavior
Procedia PDF Downloads 78
19106 Bleeding-Heart Altruists and Calculating Utilitarians: Applying Process Dissociation to Self-sacrificial Dilemmas
Authors: David Simpson, Kyle Nash
Abstract:
There is considerable evidence linking slow, deliberative reasoning (system 2) with utilitarian judgments in dilemmas involving sacrificing another person for the greater good (other-sacrificial dilemmas). Joshua Greene has argued, based on this kind of evidence, that system 2 drives utilitarian judgments. However, the evidence on whether system 2 is associated with utilitarian judgments in self-sacrificial dilemmas is more mixed. We employed process dissociation to measure a self-sacrificial utilitarian (SU) parameter and an other-sacrificial utilitarian (OU) parameter. It was initially predicted that, contra Greene, the cognitive reflection test (CRT) would correlate positively only with the OU parameter and not with the SU parameter. However, Greene's hypothesis was corroborated: the CRT correlated positively with both the OU parameter and the SU parameter. By contrast, the CRT did not correlate with the other two moral parameters we extracted (altruism and deontology).
Keywords: dual-process model, utilitarianism, altruism, reason, emotion, process dissociation
Procedia PDF Downloads 153
19105 Numerical Investigation on Design Method of Timber Structures Exposed to Parametric Fire
Authors: Robert Pečenko, Karin Tomažič, Igor Planinc, Sabina Huč, Tomaž Hozjan
Abstract:
Timber is a favourable structural material due to its high strength-to-weight ratio, recycling possibilities, and green credentials. Despite being a flammable material, it has relatively high fire resistance. Everyday engineering practice around the world is based on an outdated design of timber structures considering standard fire exposure, while modern principles of performance-based design enable the use of advanced non-standard fire curves. In Europe, the standard for the fire design of timber structures, EN 1995-1-2 (Eurocode 5), gives two methods: the reduced material properties method and the reduced cross-section method. In the latter, the fire resistance of structural elements depends on the effective cross-section, that is, a residual cross-section of uncharred timber reduced additionally by a so-called zero strength layer. In the case of standard fire exposure, Eurocode 5 gives a fixed value for the zero strength layer, i.e. 7 mm, while for non-standard parametric fires no additional comments or recommendations for the zero strength layer are given. Thus designers often apply the adopted 7 mm rule to parametric fire exposure as well. Since the latest scientific evidence suggests that the proposed value of the zero strength layer can be on the unsafe side even for standard fire exposure, its use in the case of a parametric fire is also highly questionable, and more numerical and experimental research in this field is needed. Therefore, the purpose of the presented study is to use advanced calculation methods to investigate the thickness of the zero strength layer and the parametric charring rates used in the effective cross-section method in the case of parametric fire. Parametric studies are carried out on a simple solid timber beam that is exposed to a large number of parametric fire curves. The zero strength layer and charring rates are determined on the basis of numerical simulations performed with a recently developed advanced two-step computational model. The first step comprises a hygro-thermal model which predicts the temperature, moisture and char depth development and takes into account different initial moisture states of the timber. In the second step, the response of the timber beam simultaneously exposed to mechanical and fire load is determined. The mechanical model is based on Reissner's kinematically exact beam model and accounts for the membrane, shear and flexural deformations of the beam. Furthermore, material non-linear and temperature-dependent behaviour is considered. In the two-step model, the char front is, according to Eurocode 5, assumed to occur at a fixed temperature of around 300°C. Based on the performed study and observations, improved values of charring rates and a new thickness of the zero strength layer in the case of parametric fires are determined. Thus, the reduced cross-section method is substantially improved to offer practical recommendations for designing the fire resistance of timber structures. Furthermore, correlations between the zero strength layer thickness and key input parameters of the parametric fire curve (for instance, opening factor, fire load, etc.) are given, representing a guideline for more detailed numerical and experimental research in the future.
Keywords: advanced numerical modelling, parametric fire exposure, timber structures, zero strength layer
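For readers unfamiliar with the reduced cross-section method under standard fire exposure that the study re-examines, a minimal Python sketch of the Eurocode 5 calculation is given below. The notional charring rate and beam dimensions are assumed values for solid softwood; d0 = 7 mm is the fixed zero-strength-layer value discussed above.

```python
# Minimal sketch of the Eurocode 5 (EN 1995-1-2) reduced cross-section method for
# standard fire exposure.  The notional charring rate beta_n = 0.8 mm/min is an
# assumed value for solid softwood; d0 = 7 mm is the fixed zero strength layer.
def effective_depth(t_min, beta_n=0.8, d0=7.0):
    """Effective depth reduction d_ef [mm] after t_min minutes of standard fire."""
    d_char_n = beta_n * t_min            # notional charring depth
    k0 = min(t_min / 20.0, 1.0)          # zero-strength-layer factor
    return d_char_n + k0 * d0

b0, h0 = 140.0, 240.0                    # assumed initial beam section [mm]
for t in (15, 30, 60):
    d_ef = effective_depth(t)
    b_ef, h_ef = b0 - 2 * d_ef, h0 - d_ef    # example: three-sided exposure
    print(f"t = {t:3d} min: d_ef = {d_ef:5.1f} mm, effective section {b_ef:.0f} x {h_ef:.0f} mm")
```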
Procedia PDF Downloads 168
19104 A Review of Pharmacological Prevention of Peri-and Post-Procedural Myocardial Injury After Percutaneous Coronary Intervention
Authors: Syed Dawood Md. Taimur, Md. Hasanur Rahman, Syeda Fahmida Afrin, Farzana Islam
Abstract:
The concept of myocardial injury, although first recognized from animal studies, is now recognized as a clinical phenomenon that may result in microvascular damage, the no-reflow phenomenon, myocardial stunning, myocardial hibernation and ischemic preconditioning. The final consequence of this event is left ventricular (LV) systolic dysfunction leading to increased morbidity and mortality. The typical clinical case of reperfusion injury occurs in acute myocardial infarction (MI) with ST segment elevation, in which occlusion of a major epicardial coronary artery is followed by recanalization of the artery. This may occur either spontaneously or by means of thrombolysis and/or primary percutaneous coronary intervention (PCI) with efficient platelet inhibition by aspirin (acetylsalicylic acid), clopidogrel and glycoprotein IIb/IIIa inhibitors. In recent years, percutaneous coronary intervention (PCI) has become a well-established technique for the treatment of coronary artery disease. PCI improves symptoms in patients with coronary artery disease, and the safety of the procedure has been increasing. However, peri- and post-procedural myocardial injury, including angiographic slow coronary flow, microvascular embolization, and elevated levels of cardiac enzymes such as creatine kinase and troponin-T and -I, has also been reported even in elective cases. Furthermore, myocardial reperfusion injury at the onset of myocardial reperfusion, which causes tissue damage and cardiac dysfunction, may occur in cases of acute coronary syndrome. Because myocardial injury is related to larger myocardial infarction and a worse long-term prognosis, it is important to prevent myocardial injury during and/or after PCI in patients with coronary artery disease. To date, many studies have demonstrated that adjunctive pharmacological treatment suppresses myocardial injury and increases coronary blood flow during PCI procedures. In this review, we highlight the usefulness of pharmacological treatment in combination with PCI in attenuating myocardial injury in patients with coronary artery disease.
Keywords: coronary artery disease, percutaneous coronary intervention, myocardial injury, pharmacology
Procedia PDF Downloads 451
19103 Effect of Dietary Sour Lemon Peel Essential Oil on Serum Parameters in Rainbow Trout (Oncorhynchus mykiss) Fingerlings against Deltamethrin Stress
Authors: Maryam Amiri Resketi, Sakineh Yeganeh, Khosro Jani Khalili
Abstract:
The aim of this study was to investigate the effect of dietary lemon peel essential oil (Citrus limon) on serum parameters and liver enzyme activity of rainbow trout (Oncorhynchus mykiss) exposed to deltamethrin. The 96-hour lethal concentration of the toxin for rainbow trout was determined according to standard OECD procedures under static conditions. The 96-hour LC50, estimated by probit analysis, was 0.0082 mg/l. The maximum allowable concentration of deltamethrin in the natural environment was calculated as 0.00082 mg/l and was used for this experiment. Eight treatments were designed based on three levels of lemon essential oil (200, 400 and 600 mg/kg) and two levels of deltamethrin (0 and 0.00082 mg/l). Rainbow trout with an average weight of 95.14 ± 3.8 g were distributed in 300-litre tanks and reared for eight weeks. Fish were fed at 2% of body weight. Water changes (90 percent of the tank volume) were carried out daily; for the tanks containing deltamethrin, the appropriate concentration of toxin was added back to the water after each change. At the end of the trial, serum biochemical parameters (total protein, albumin, glucose, cholesterol, and triglycerides) and liver enzymes (ALP, AST, ALT and LDH) were evaluated. In treatments both without and with the toxin, the 400 mg/kg oil level produced higher total protein and albumin levels and lower cholesterol and triglycerides (p < 0.05). Raising the lemon peel essential oil level to 400 mg/kg in the pesticide-containing treatments reduced the activities of the enzymes ALP, ALT and LDH compared with the toxin treatment without lemon peel essential oil (p < 0.05). The results showed that the use of lemon peel essential oil in the fish diet can increase immune-system parameters and, through its strong antioxidant activity, reduce the effect of deltamethrin on the immune system of fish; an effective dose can prevent the adverse effects of the toxin, which would otherwise weaken the fish immune system when toxic pollutants enter fish farms.
Keywords: deltamethrin, Oncorhynchus mykiss, LC50 96h, lemon peel (Citrus limon) essential oil, serum parameters, liver enzymes
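As an illustration of the probit estimation of an LC50 mentioned above, a hedged sketch follows; the concentrations and mortality counts are invented for demonstration and are not the study's data.

```python
# Hypothetical sketch of a probit estimate of a 96-h LC50 from dose-mortality counts,
# analogous to the probit analysis mentioned above.  The concentrations and mortality
# counts below are made-up illustration data, not the study's measurements.
import numpy as np
import statsmodels.api as sm

conc = np.array([0.002, 0.004, 0.008, 0.016, 0.032])   # mg/l (assumed doses)
dead = np.array([1, 3, 5, 8, 10])                       # dead fish per tank (assumed)
n = np.full_like(dead, 10)                              # fish per tank (assumed)

X = sm.add_constant(np.log10(conc))
model = sm.GLM(np.column_stack([dead, n - dead]), X,
               family=sm.families.Binomial(link=sm.families.links.Probit()))
res = model.fit()

b0, b1 = res.params
lc50 = 10 ** (-b0 / b1)          # probit equals zero at 50% mortality
print(f"estimated 96-h LC50 ≈ {lc50:.4f} mg/l")
```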
Procedia PDF Downloads 201
19102 Fluid Prescribing Post Laparotomies
Authors: Gusa Hall, Barrie Keeler, Achal Khanna
Abstract:
Introduction: NICE guidelines have highlighted the consequences of IV fluid mismanagement. The main aim of this study was to audit fluid prescribing post laparotomy to identify whether fluids were prescribed in accordance with NICE guidelines. Methodology: A retrospective database search of eight specific laparotomy procedures (right and left colectomy, Hartmann's procedure, small bowel resection, perforated ulcer, abdominoperineal resection, anterior resection, panproctocolectomy, subtotal colectomy) highlighted 29 laparotomies between April 2019 and May 2019. Two of the 29 patients had secondary procedures during the same admission, giving n=27 patients. Case notes were reviewed for date of procedure, length of admission, fluid prescribed and amount, nasogastric tube output, daily blood results for the electrolytes sodium and potassium, and operational losses. Results: Of the 27 patients identified between April 2019 and May 2019, 93% (25/27) received IV fluids, but only 19% (5/27) received the correct IV fluids in accordance with NICE guidelines; 93% (25/27) of those who received IV fluids had correct electrolyte levels (sodium and potassium); 100% (27/27) of patients received blood tests (U&Es) for electrolyte levels; and 0% (0/27) had documentation of operational losses. IV fluids matched nasogastric tube output in 100% (3/3) of the patients who had a nasogastric tube in situ. Conclusion: A PubMed literature review on barriers to safer IV prescribing highlighted educational interventions focused on prescriber knowledge rather than on how to execute the prescribing task. This audit suggests IV fluids post laparotomy are not being prescribed consistently in accordance with NICE guidelines. Surgical management plans should be clearer on IV fluid and electrolyte requirements for the 24 hours after the plan has been initiated. In addition, further teaching and training on IV prescribing are needed, together with frequent surgical audits of IV fluid prescribing post-surgery to evaluate improvements.
Keywords: audit, IV fluid prescribing, laparotomy, NICE guidelines
Procedia PDF Downloads 120
19101 Spatial Point Process Analysis of Dengue Fever in Tainan, Taiwan
Authors: Ya-Mei Chang
Abstract:
This research applies spatio-temporal point process methods to dengue fever data from Tainan. The spatio-temporal intensity function of the dataset is assumed to be separable. Kernel estimation is a widely used approach to estimating intensity functions. The intensity function is very helpful for studying the relationship between the spatio-temporal point process and covariates. The covariate effects might be nonlinear, so a nonparametric smoothing estimator is used to detect nonlinearity in the covariate effects. A fitted parametric model can then describe the influence of the covariates on dengue fever. The correlation between the data points is detected by the K-function. The results of this research could provide useful information to help the government or other stakeholders make decisions.
Keywords: dengue fever, spatial point process, kernel estimation, covariate effect
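A minimal sketch of a kernel intensity estimate of the kind referred to above is given below; the case locations and bandwidth are illustrative assumptions, and a separable spatio-temporal model would multiply this spatial estimate by a temporal one.

```python
# Minimal sketch of a Gaussian kernel estimate of a spatial intensity function,
# the estimator referred to above.  The case coordinates and the bandwidth are
# illustrative assumptions, not the Tainan dengue data.
import numpy as np

rng = np.random.default_rng(0)
cases = rng.uniform(0, 10, size=(200, 2))      # assumed case coordinates (km)
bandwidth = 0.8                                 # assumed kernel bandwidth (km)

def intensity(s, points=cases, h=bandwidth):
    """Kernel intensity estimate lambda(s): expected number of points per km^2."""
    d2 = np.sum((points - s) ** 2, axis=1)
    return np.sum(np.exp(-d2 / (2 * h * h))) / (2 * np.pi * h * h)

print(f"estimated intensity at (5, 5): {intensity(np.array([5.0, 5.0])):.2f} cases/km^2")
```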
Procedia PDF Downloads 351
19100 Conversion of Carcinogenic Liquid-Wastes of Poly Vinyl Chloride (PVC) Industry to an Environmentally Safe Product: Corrosion Inhibitor and Biocide
Authors: Mohamed A. Hegazy
Abstract:
Most poly(vinyl chloride) (PVC) petrochemical companies produce a huge amount of byproduct characterized as carcinogenic liquid waste, insoluble in water, highly corrosive and highly offensive. Only a small part of this byproduct is used, in the production of hydrochloric acid; the larger part is waste. Therefore, the aim of this work was the conversion of such PVC wastes into an environmentally safe product that acts as a corrosion inhibitor for metals in aqueous media and as a biocide for microorganisms. This conversion method was developed mainly to protect the environment and to produce products of high economic value. The conversion process was established and the final product was tested for toxicity and water solubility in comparison with the crude product. Furthermore, the end product was tested as a corrosion inhibitor in 1M HCl and as a broad-spectrum biocide against standard microbial strains and against an environmentally isolated sulfate-reducing bacteria (SRB) microbial community.
Keywords: PVC, surfactant, corrosion inhibitor, biocide, SRB
Procedia PDF Downloads 123
19099 Vortices Structure in Internal Laminar and Turbulent Flows
Authors: Farid Gaci, Zoubir Nemouchi
Abstract:
A numerical study of laminar and turbulent fluid flows in a 90° bend of square cross-section was carried out. Three-dimensional meshes, based on hexahedral cells, were generated. The QUICK scheme was employed to discretize the convective term in the transport equations, and the SIMPLE algorithm was adopted to treat the velocity-pressure coupling. The flow structure obtained showed interesting features such as recirculation zones and counter-rotating pairs of vortices. The performance of three different turbulence models was evaluated: the standard k-ω model, the SST k-ω model and the Reynolds Stress Model (RSM). Overall, it was found that the multi-equation model performed better than the two-equation models. In fact, the existence of four pairs of counter-rotating cells in the straight duct upstream of the bend was predicted by the RSM closure but not by the standard eddy viscosity model or the SST k-ω model. The analysis of the results led to a better understanding of the induced three-dimensional secondary flows and the behavior of the local pressure coefficient and the friction coefficient.
Keywords: curved duct, counter-rotating cells, secondary flow, laminar, turbulent
Procedia PDF Downloads 336
19098 Analysis of Financial Time Series by Using Ornstein-Uhlenbeck Type Models
Authors: Md Al Masum Bhuiyan, Maria C. Mariani, Osei K. Tweneboah
Abstract:
In the present work, we develop a technique for estimating the volatility of financial time series using stochastic differential equations. Taking the daily closing prices from developed and emerging stock markets as the basis, we argue that the incorporation of stochastic volatility into the time-varying parameter estimation significantly improves forecasting performance via Maximum Likelihood Estimation. Using the technique, we observe the long-memory behavior of the data sets and obtain one-step-ahead predicted log-volatility with ±2 standard errors, despite the observed noise varying according to a Normal mixture distribution, because the financial data studied are not fully Gaussian. Also, the Ornstein-Uhlenbeck process followed in this work simulates the financial time series well, which makes our estimation algorithm suitable for large data sets owing to its good convergence properties.
Keywords: financial time series, maximum likelihood estimation, Ornstein-Uhlenbeck type models, stochastic volatility model
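To illustrate the kind of likelihood-based estimation described above, here is a hedged sketch of maximum likelihood estimation for an Ornstein-Uhlenbeck process using its exact Gaussian transition density; the simulated path stands in for a log-volatility series and is not the market data analysed in the paper.

```python
# Hedged sketch of maximum likelihood estimation for an Ornstein-Uhlenbeck process
# dX = theta*(mu - X) dt + sigma dW, using its exact Gaussian transition density.
# The simulated series below is illustrative only.
import numpy as np
from scipy.optimize import minimize

def neg_loglik(params, x, dt):
    theta, mu, sigma = params
    if theta <= 0 or sigma <= 0:
        return np.inf
    m = mu + (x[:-1] - mu) * np.exp(-theta * dt)                 # conditional mean
    v = sigma**2 * (1 - np.exp(-2 * theta * dt)) / (2 * theta)   # conditional variance
    return 0.5 * np.sum(np.log(2 * np.pi * v) + (x[1:] - m) ** 2 / v)

# simulate an illustrative OU path standing in for daily log-volatility
rng = np.random.default_rng(1)
dt, n, theta0, mu0, sigma0 = 1 / 252, 2000, 5.0, -4.0, 0.5
x = np.empty(n); x[0] = mu0
for k in range(n - 1):
    x[k + 1] = x[k] + theta0 * (mu0 - x[k]) * dt + sigma0 * np.sqrt(dt) * rng.standard_normal()

fit = minimize(neg_loglik, x0=[1.0, x.mean(), 0.2], args=(x, dt), method="Nelder-Mead")
print("estimated (theta, mu, sigma):", np.round(fit.x, 3))
```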
Procedia PDF Downloads 242
19097 Lean Manufacturing Implementation in Fused Plastic Bags Industry
Authors: Tareq Issa
Abstract:
Lean manufacturing is concerned with the implementation of several tools and methodologies that aim for the continuous elimination of waste throughout the manufacturing process flow in the production system. This research addresses the implementation of lean principles and tools in a small-medium industry, focusing on a 'fused' plastic bag production company in Amman, Jordan. In this production operation, the major types of waste to eliminate include material, waiting-transportation, and setup wastes. The primary goal is to identify and implement selected lean strategies to eliminate waste in the manufacturing process flow. A systematic approach was used for the implementation of lean principles and techniques, through the application of Value Stream Mapping analysis. The current-state value stream map was constructed to improve the plastic bag manufacturing process by identifying opportunities to eliminate waste and its sources. Also, the future-state value stream map was developed, describing improvements in the overall manufacturing process resulting from eliminating wastes. The implementation of VSM, 5S, Kanban, Kaizen, and reduced lot size methods has provided significant benefits and results. Productivity increased to 95.4%, the delivery schedule was attained at 99-100%, total inventory was reduced to 1.4 days, and the setup time for the melting process was reduced to about 30 minutes.
Keywords: lean implementation, plastic bags industry, value stream map, process flow
Procedia PDF Downloads 175
19096 The Using of Smart Power Concepts in Military Targeting Process
Authors: Serdal AKYUZ
Abstract:
Smart power is the use of soft and hard power together in consideration of existing circumstances. Soft power can be defined as the capability of changing the perception of any target mass by employing policies based on legality. Hard power generally uses military and economic instruments, which are the concrete indicators of the conventional comprehension of power. More than providing a balance between soft and hard power, smart power creates a proactive combination by assessing existing resources. The military targeting process (MTP), as stated in the smart power methodology, benefits from a wide scope of lethal and non-lethal weapons to reach the intended end state. Smart power components can be used in the military targeting process in a manner similar to the use of lethal or non-lethal weapons. This paper investigates the current use of the smart power concept and the MTP, and presents a new approach to the MTP from a smart power point of view.
Keywords: future security environment, hard power, military targeting process, soft power, smart power
Procedia PDF Downloads 476
19095 Enhancement of MIMO H₂S Gas Sweetening Separator Tower Using Fuzzy Logic Controller Array
Authors: Muhammad M. A. S. Mahmoud
Abstract:
The natural gas sweetening process is a controlled process that must be carried out at maximum efficiency and with the highest quality. In this work, due to the complexity and non-linearity of the process, the H₂S gas separation and the intelligent fuzzy controller used to enhance the process are simulated in MATLAB/Simulink. A new design of fuzzy control for the gas separator is discussed in this paper. The design is based on the utilization of linear state estimation to generate the internal knowledge base that stores input-output pairs. The obtained input/output pairs are then used to design a feedback fuzzy controller. The proposed closed-loop fuzzy control system maintains the asymptotic stability of the system while it enhances the system time response, achieving better control of the concentration of the output gas from the tower. Simulation studies are carried out to illustrate the performance of the gas separator system.
Keywords: gas separator, gas sweetening, intelligent controller, fuzzy control
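As a rough illustration of fuzzy rule evaluation of the kind used in such a controller, a minimal Mamdani-style sketch in plain Python follows; the membership functions, rules and variable ranges are invented assumptions, whereas the paper's controller is designed from state-estimation-derived input-output pairs and simulated in MATLAB/Simulink.

```python
# Illustrative sketch of Mamdani-style fuzzy rule evaluation for adjusting a
# gas-sweetening process variable.  Membership functions, rules and ranges are
# invented for illustration and are not the paper's controller design.
def tri(x, a, b, c):
    """Triangular membership function defined by points a <= b <= c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def control_action(h2s_ppm):
    # fuzzify the measured outlet H2S concentration (assumed ranges, in ppm)
    low, ok, high = tri(h2s_ppm, 0, 0, 4), tri(h2s_ppm, 2, 5, 8), tri(h2s_ppm, 6, 12, 12)
    # rule consequents: change in amine circulation rate (assumed, in %)
    actions = {-5.0: low, 0.0: ok, +10.0: high}
    # weighted-average (centroid-like) defuzzification
    num = sum(a * w for a, w in actions.items())
    den = sum(actions.values())
    return num / den if den else 0.0

for ppm in (1.0, 5.0, 9.0):
    print(f"H2S = {ppm:4.1f} ppm -> circulation change {control_action(ppm):+.1f}%")
```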
Procedia PDF Downloads 471
19094 The Affect of Ethnic Minority People: A Prediction by Gender and Marital Status
Authors: A. K. M. Rezaul Karim, Abu Yusuf Mahmud, S. H. Mahmud
Abstract:
The study aimed to investigate whether the affect (experience of feeling or emotion) of ethnic minority people can be predicted by gender and marital status. Toward this end, the positive affect and negative affect of 103 adult indigenous persons were measured. Multiple regression analysis demonstrated that both gender and marital status are significantly associated with positive affect (gender: β=.318, p < .001; marital status: β=.201, p < .05), but not with negative affect. The results indicated that indigenous males have positive affect 0.32 standard deviations higher than indigenous females, and that married individuals have positive affect 0.20 standard deviations higher than their unmarried counterparts. These findings advance our understanding that gender and marital status differences in the experience of emotion are not specific to mainstream society; rather, they form a generalized picture across societies. In general, men possess more positive affect than women, and married persons possess more positive affect than unmarried persons.
Keywords: positive affect, negative affect, ethnic minority, gender, marital status
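For clarity on how standardized coefficients (betas) such as those reported are obtained, here is a minimal sketch: z-score the outcome and predictors and fit ordinary least squares. The generated data are purely illustrative, not the study's sample of 103 adults.

```python
# Minimal sketch of obtaining standardized regression coefficients (betas):
# z-score the outcome and predictors, then fit OLS.  Data are illustrative only.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(42)
n = 103
gender = rng.integers(0, 2, n)          # 1 = male, 0 = female (assumed coding)
married = rng.integers(0, 2, n)         # 1 = married, 0 = unmarried (assumed coding)
positive_affect = 3.0 + 0.4 * gender + 0.25 * married + rng.normal(0, 1, n)

def z(v):
    return (v - v.mean()) / v.std(ddof=1)

X = sm.add_constant(np.column_stack([z(gender), z(married)]))
fit = sm.OLS(z(positive_affect), X).fit()
print("standardized betas (gender, marital status):", np.round(fit.params[1:], 3))
```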
Procedia PDF Downloads 448
19093 A Four Free Element Radiofrequency Coil with High B₁ Homogeneity for Magnetic Resonance Imaging
Authors: Khalid Al-Snaie
Abstract:
In this paper, the design and testing of a symmetrical radiofrequency prototype coil with high B₁ magnetic field homogeneity are presented. The developed coil comprises four tuned coaxial circular loops that can produce a relatively homogeneous radiofrequency field. In comparison with a standard Helmholtz pair, which provides second-order homogeneity, it aims to provide fourth-order homogeneity of the B₁ field while preserving simplicity of implementation. Electrical modeling of the probe, including all couplings, is used to ensure these requirements. Results of comparison tests, in free space and in a spectro-imager, between a standard Helmholtz pair and the presented prototype coil are reported. In terms of field homogeneity, an improvement of 30% is observed. Moreover, the proposed prototype coil possesses a better quality factor (+25% on average) and a noticeable improvement in sensitivity (+20%). Overall, this work, which includes both theoretical and experimental aspects, aims to contribute to the study and understanding of four-element radiofrequency (RF) systems derived from Helmholtz coils for Magnetic Resonance Imaging.
Keywords: B₁ homogeneity, MRI, NMR, radiofrequency, RF coil, free element systems
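As a hedged illustration of how the axial homogeneity of coaxial loop systems can be compared, the sketch below evaluates the on-axis Biot-Savart field of a Helmholtz pair and of a generic four-loop arrangement; the four-loop spacings and currents are assumptions, not the optimized geometry of the prototype.

```python
# Hedged sketch: on-axis magnetic field of coaxial circular loops (Biot-Savart),
# used to compare the axial homogeneity of a Helmholtz pair with a generic
# four-loop arrangement.  The four-loop spacings and currents are illustrative
# assumptions, not the prototype coil's optimized geometry.
import numpy as np

MU0 = 4e-7 * np.pi

def loop_field(z, z0, R, I):
    """Axial field [T] at position z of a loop of radius R located at z0, current I."""
    return MU0 * I * R**2 / (2 * (R**2 + (z - z0) ** 2) ** 1.5)

R, I = 0.10, 1.0                       # assumed loop radius [m] and current [A]
z = np.linspace(-0.02, 0.02, 201)      # region of interest around the centre

helmholtz = loop_field(z, -R / 2, R, I) + loop_field(z, +R / 2, R, I)
four_loop = sum(loop_field(z, z0, R, I) for z0 in (-0.12, -0.04, 0.04, 0.12))

for name, B in (("Helmholtz pair", helmholtz), ("four-loop (assumed spacing)", four_loop)):
    ripple = (B.max() - B.min()) / B.mean()
    print(f"{name:28s} relative axial ripple over ±2 cm: {ripple:.2e}")
```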
Procedia PDF Downloads 90
19092 A Tool for Assessing Performance and Structural Quality of Business Process
Authors: Mariem Kchaou, Wiem Khlif, Faiez Gargouri
Abstract:
Modeling business processes is an essential task when evaluating, improving, or documenting existing business processes. To be efficient in such tasks, a business process model (BPM) must have high structural quality and high performance. Evidently, evaluating the performance of a business process model is a necessary step for reducing time and cost, while assessing the structural quality aims to improve the understandability and the modifiability of the BPMN model. To achieve these objectives, a set of structural and performance measures has been proposed. Given the diversity of measures, we propose a framework that integrates both structural and performance aspects for classifying them. Our measure classification is based on business process model perspectives (e.g., informational, functional, organizational, behavioral, and temporal) and the elements (activity, event, actor, etc.) involved in computing the measures. We then implement this framework in a tool for assessing the structural quality and performance of a business process. The tool helps designers select an appropriate subset of measures associated with the corresponding perspective and calculate and interpret their values in order to improve the structural quality and the performance of the model.
Keywords: performance, structural quality, perspectives, tool, classification framework, measures
Procedia PDF Downloads 157
19091 Mobile Augmented Reality for Collaboration in Operation
Authors: Chong-Yang Qiao
Abstract:
Mobile augmented reality (MAR) tracks targets in the surroundings and aids operators with interactive visualization of data and procedures, making potential equipment and systems understandable. Operators remotely communicate and coordinate with each other on continuous tasks, with information and data exchanged between the control room and the work-site. In routine work, distributed control system (DCS) monitoring and work-site manipulation require operators to interact in real time. The critical question is how to improve the user experience of cooperative work by applying augmented reality in the traditional industrial field. The purpose of this exploratory study is to find the cognitive model for multiple-task performance with MAR. In particular, the focus is on the comparison between different tasks and the environmental factors which influence information processing. Three experiments use interface and interaction design, with the content of start-up, maintenance and stop procedures embedded in the mobile application. With time demands and human errors as evaluation criteria, and with analysis of the mental processes and behavioral actions during the multiple tasks, heuristic evaluation was used to assess the operators' performance under different situational factors and to record the information processing involved in recognition, interpretation, judgment and reasoning. The research identifies the functional properties of MAR and constrains the development of the cognitive model. Conclusions can be drawn that suggest MAR is easy to use and useful for operators in remote collaborative work.
Keywords: mobile augmented reality, remote collaboration, user experience, cognition model
Procedia PDF Downloads 197
19090 Utilization of Standard Paediatric Observation Chart to Evaluate Infants under Six Months Presenting with Non-Specific Complaints
Authors: Michael Zhang, Nicholas Marriage, Valerie Astle, Marie-Louise Ratican, Jonathan Ash, Haddijatou Hughes
Abstract:
Objective: Young infants are often brought to the Emergency Department (ED) with a variety of complaints, some of which are non-specific and present a diagnostic challenge to the attending clinician. Whilst invasive investigations such as blood tests and lumbar puncture are necessary in some cases to exclude serious infections, basic clinical tools, in addition to a thorough clinical history, can be useful for assessing the risk of serious conditions in these young infants. This study aimed to examine the utilization of one such clinical tool. Methods: This retrospective observational study examined the medical records of infants under 6 months presenting to a mixed urban ED between January 2013 and December 2014. Infants deemed by the emergency clinicians to have non-specific complaints or diagnoses were selected for analysis; those with clear systemic diagnoses were excluded. Among all relevant clinical information and investigation results, the utilization of the Standard Paediatric Observation Chart (SPOC) was particularly scrutinized in these medical records. This chart was developed by expert clinicians in the local health department. It categorizes important clinical signs into colour-coded zones as a visual cue for the serious implications of certain abnormalities. An infant is regarded as SPOC positive when fulfilling one red-zone or two yellow-zone criteria, and the attending clinician is then prompted to investigate and treat for potentially serious conditions accordingly. Results: Eight hundred and thirty-five infants met the inclusion criteria for this project. Those admitted to hospital for further management were more likely to be SPOC positive than the discharged infants (odds ratio: 12.26, 95% CI: 8.04-18.69). Similarly, the sepsis alert criteria on the SPOC were positive in a higher percentage of patients with serious infections (56.52%) than in those with mild conditions (15.89%) (p < 0.001). The SPOC sepsis criteria had a sensitivity of 56.5% (95% CI: 47.0%-65.7%) and a moderate specificity of 84.1% (95% CI: 80.8%-87.0%) for identifying serious infections. Applied to this infant population, with a 17.4% prevalence of serious infection, the positive predictive value was only 42.8% (95% CI: 36.9%-49.0%). However, the negative predictive value was high at 90.2% (95% CI: 88.1%-91.9%). Conclusions: The Standard Paediatric Observation Chart has been applied as a useful clinical tool in clinical practice to help identify and manage young sick infants in the ED effectively.
Keywords: clinical tool, infants, non-specific complaints, Standard Paediatric Observation Chart
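The reported predictive values can be checked from the quoted sensitivity, specificity and prevalence using the standard relations, as in this short worked example.

```python
# Worked arithmetic check of the predictive values reported above, using the
# standard relations between sensitivity, specificity, prevalence, PPV and NPV.
sens, spec, prev = 0.565, 0.841, 0.174

ppv = sens * prev / (sens * prev + (1 - spec) * (1 - prev))
npv = spec * (1 - prev) / (spec * (1 - prev) + (1 - sens) * prev)

print(f"PPV = {ppv:.1%}  (reported: 42.8%)")
print(f"NPV = {npv:.1%}  (reported: 90.2%)")
```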
Procedia PDF Downloads 252
19089 The Use of Artificial Intelligence to Harmonization in the Lawmaking Process
Authors: Supriyadi, Andi Intan Purnamasari, Aminuddin Kasim, Sulbadana, Mohammad Reza
Abstract:
The development of the Industrial Revolution 4.0 era has had a significant influence on the administration of countries in all parts of the world, including Indonesia, not only in the administrative and economic sectors; the ways and methods of forming laws should also be adjusted. Until now, the process of making laws carried out by the Parliament with the Government still uses the classical method. The law-making process still relies on manual methods, such as typing out the harmonization of regulations, so it is not uncommon for errors to occur, such as writing errors, mistakes in copying articles and so on; these are tasks that require a high level of accuracy yet rely on inventory and harmonization carried out manually by humans. This method often creates problems due to errors and inaccuracies on the part of the officers who harmonize laws after discussion and approval, and this has a very serious impact on the system of law formation in Indonesia. The use of artificial intelligence in the process of forming laws appears justified and becomes the answer for minimizing the disharmony of various laws and regulations. This research is normative research using the legislative approach and the conceptual approach. It focuses on the question of how to use artificial intelligence for harmonization in the lawmaking process.
Keywords: artificial intelligence, harmonization, laws, intelligence
Procedia PDF Downloads 161
19088 Inadequate Requirements Engineering Process: A Key Factor for Poor Software Development in Developing Nations: A Case Study
Authors: K. Adu Michael, K. Alese Boniface
Abstract:
Developing reliable and sustainable software products is today a big challenge among up-and-coming software developers in Nigeria. The comprehensive problem statement needed to execute a proper requirements engineering process is often missing. Describing the 'what' of a system in one document, written in a natural language, is a major step in the overall process of software engineering. Requirements engineering is a process used to discover, analyze and validate system requirements. This process is needed to reduce software errors at the early stages of software development. The importance of each of the steps in requirements engineering is clearly explained in the context of using a detailed problem statement from the client/customer to get an overview of an existing system along with expectations of the new system. This paper identifies an inadequate requirements engineering process as the major cause of poor software development in developing nations, using a case study of final-year computer science students of a tertiary-education institution in Nigeria.
Keywords: client/customer, problem statement, requirements engineering, software developers
Procedia PDF Downloads 405
19087 Probing Multiple Relaxation Process in Zr-Cu Base Alloy Using Mechanical Spectroscopy
Authors: A. P. Srivastava, D. Srivastava, D. J. Browne
Abstract:
The relaxation dynamics of Zr₄₄Cu₄₀Al₈Ag₈ bulk metallic glass (BMG) have been probed using a dynamic mechanical analyzer (DMA). The BMG sample was cast in the form of a plate of dimensions 55 mm × 40 mm × 3 mm using the tilt casting technique. X-ray diffraction and transmission electron microscopy were used for the microstructural characterization of the as-cast BMG. For the mechanical spectroscopy study, samples in the form of bars of size 55 mm × 2 mm × 3 mm were machined from the BMG plate. The mechanical spectroscopy was performed on the DMA by the 50 mm three-point bending method in a nitrogen atmosphere. It was observed that two glass transition processes were competing in the supercooled liquid region, around temperatures of 390°C and 430°C. The supercooled liquid state was completely characterized using DMA and differential scanning calorimetry (DSC). In addition to the main α-relaxation process, a β relaxation process around 360°C, below the glass transition temperature, was also observed. The β relaxation process could be described by an Arrhenius law with an activation energy of 160 kJ/mol. The volume of the flow unit associated with this relaxation process has been estimated. The results of the DMA study have been used to characterize the shear transformation zone in terms of activation volume and size. The high fragility parameter value of 34 and the larger activation volume indicate that this alloy could show good plasticity in the supercooled liquid region. The possible mechanisms for the relaxation processes are discussed.
Keywords: DMA, glass transition, metallic glass, thermoplastic forming
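To show how an activation energy such as the 160 kJ/mol quoted above is extracted, here is a hedged sketch of an Arrhenius fit of peak frequency against inverse temperature; the frequencies are synthetic values generated with that activation energy purely to illustrate the fit, not measured DMA data.

```python
# Hedged sketch of an Arrhenius analysis for a beta relaxation: fit ln(f_peak)
# against 1/T and recover the activation energy from the slope.  The peak
# frequencies below are synthetic, generated with Ea = 160 kJ/mol for illustration.
import numpy as np

R = 8.314                      # J/(mol K)
Ea_true = 160e3                # J/mol (value reported above)
T = np.array([600.0, 615.0, 630.0, 645.0])          # K (assumed test temperatures)
f_peak = 1e12 * np.exp(-Ea_true / (R * T))          # synthetic peak frequencies [Hz]

slope, intercept = np.polyfit(1.0 / T, np.log(f_peak), 1)
Ea_fit = -slope * R
print(f"fitted activation energy: {Ea_fit / 1e3:.0f} kJ/mol")
```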
Procedia PDF Downloads 295
19086 Tool Condition Monitoring of Ceramic Inserted Tools in High Speed Machining through Image Processing
Authors: Javier A. Dominguez Caballero, Graeme A. Manson, Matthew B. Marshall
Abstract:
Cutting tools with ceramic inserts are often used in machining many types of superalloy, mainly due to their high strength and thermal resistance. Nevertheless, during the cutting process, the plastic flow wear generated in these inserts enhances and propagates cracks due to high temperature and high mechanical stress. This leads to highly variable failure of the cutting tool. This article explores the relationship between the continuous wear that ceramic SiAlON (solid solutions based on the Si₃N₄ structure) inserts experience during a high-speed machining process and the evolution of the sparks created during the same process. These sparks were analysed through pictures of the cutting process recorded using an SLR camera. Features relating to the intensity and area of the cutting sparks were extracted from the individual pictures using image processing techniques. These features were then related to the ceramic insert's crater wear area.
Keywords: ceramic cutting tools, high speed machining, image processing, tool condition monitoring, tool wear
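A minimal sketch of the kind of spark-feature extraction described above follows; the synthetic frame and threshold value are assumptions standing in for the recorded SLR images.

```python
# Minimal sketch of spark-feature extraction of the kind described above: threshold
# a grayscale frame and record the spark area (pixel count) and summed intensity.
# The synthetic image and the threshold value are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(3)
frame = rng.integers(0, 40, size=(480, 640)).astype(np.uint8)    # dark background
frame[200:220, 300:380] = rng.integers(180, 255, size=(20, 80))  # synthetic spark streak

threshold = 150                                   # assumed spark/background cut-off
spark_mask = frame > threshold
spark_area = int(spark_mask.sum())                # spark area in pixels
spark_intensity = int(frame[spark_mask].sum())    # summed brightness of spark pixels

print(f"spark area: {spark_area} px, total spark intensity: {spark_intensity}")
# Features like these, tracked frame by frame, can then be correlated with the
# measured crater wear area of the insert.
```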
Procedia PDF Downloads 298
19085 In-Process Integration of Resistance-Based, Fiber Sensors during the Braiding Process for Strain Monitoring of Carbon Fiber Reinforced Composite Materials
Authors: Oscar Bareiro, Johannes Sackmann, Thomas Gries
Abstract:
Carbon fiber reinforced polymer composites (CFRP) are used in a wide variety of applications due to their advantageous properties and design versatility. The braiding process enables the manufacture of components with good toughness and fatigue strength. However, the failure mechanisms of CFRPs are complex and still present challenges associated with their maintenance and repair. Within the broad scope of structural health monitoring (SHM), strain monitoring can be applied to composite materials to improve reliability, reduce maintenance costs and safely exhaust service life. Traditional SHM systems employ, for example, fiber optics or piezoelectrics as sensors, which are often expensive, time-consuming and complicated to implement. A cost-efficient alternative can be the exploitation of the conductive properties of fiber-based sensors such as carbon, copper, or constantan (a copper-nickel alloy), which can be utilized as sensors within composite structures to achieve strain monitoring. This allows the structure to provide feedback to a user via electrical signals, which are essential for evaluating its structural condition. This work presents a strategy for the in-process integration of resistance-based sensors (Elektrisola Feindraht AG, CuNi23Mn, Ø = 0.05 mm) into textile preforms during their manufacture via the braiding process (Herzog RF-64/120) to achieve strain monitoring of braided composites. For this, flat samples of instrumented composite laminates of carbon fibers (Toho Tenax HTS40 F13 24K, 1600 tex) and epoxy resin (Epikote RIMR 426) were manufactured via vacuum-assisted resin infusion. These flat samples were later cut into test specimens, and the integrated sensors were wired to the measurement equipment (National Instruments, VB-8012) for data acquisition during the execution of mechanical tests. Quasi-static tests (tensile and three-point bending tests) were performed following standard protocols (DIN EN ISO 527-1 & 4, DIN EN ISO 14132); additionally, dynamic tensile tests were executed. These tests were carried out to assess the sensor response under different loading conditions and to evaluate the influence of the sensor presence on the mechanical properties of the material. Several orientations of the sensor with respect to the applied loading, and several sensor placements inside the laminate, were tested. Strain measurements from the integrated sensors were made by programming a data acquisition code (LabVIEW) written for the measurement equipment. Strain measurements from the integrated sensors were then correlated to the strain/stress state of the tested samples. From the assessment of the sensor integration approach, it can be concluded that it allows seamless sensor integration into the textile preform. No damage to the sensor or negative effect on its electrical properties was detected during inspection after integration. From the assessment of the mechanical tests of instrumented samples, it can be concluded that the presence of the sensors does not significantly alter the mechanical properties of the material. It was found that there is a good correlation between the resistance measurements from the integrated sensors and the applied strain. It can be concluded that the correlation is of sufficient accuracy to determine the strain state of a composite laminate based solely on the resistance measurements from the integrated sensors.
Keywords: braiding process, in-process sensor integration, instrumented composite material, resistance-based sensor, strain monitoring
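As a hedged illustration of how resistance readings from such an integrated wire sensor map to strain, the sketch below applies the gauge-factor relation ΔR/R = GF·ε; the baseline resistance, gauge factor and readings are illustrative assumptions, not calibration values from the study.

```python
# Hedged sketch of converting the integrated wire sensor's resistance change into
# strain via the gauge-factor relation dR/R = GF * strain.  Baseline resistance,
# gauge factor and readings are illustrative assumptions, not study calibration data.
R0 = 120.0          # ohm, unloaded sensor resistance (assumed)
GF = 2.0            # gauge factor typical of constantan-type wire (assumed)

readings_ohm = [120.00, 120.05, 120.12, 120.24]   # resistance during loading (assumed)

for R in readings_ohm:
    strain = (R - R0) / (R0 * GF)
    print(f"R = {R:7.2f} ohm -> strain = {strain * 1e6:7.1f} microstrain")
```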
Procedia PDF Downloads 106