Search results for: computational methods
13239 Simulation-Based Evaluation of Indoor Air Quality and Comfort Control in Non-Residential Buildings
Authors: Torsten Schwan, Rene Unger
Abstract:
Simulation of the thermal and electrical performance of buildings is increasingly becoming part of an integrative planning process. Growing requirements on energy efficiency, the integration of volatile renewable energy, smart control, and storage management often pose tremendous challenges for building engineers and architects. This mainly affects commercial and non-residential buildings, whose energy consumption characteristics differ significantly from those of residential buildings. This work focuses on the many-objective optimization problem of indoor air quality and comfort, especially in non-residential buildings. Based on a brief description of the interdependencies between different requirements on indoor air treatment, it extends existing Modelica-based building physics models with additional system states to adequately represent indoor air conditions. Interfaces to corresponding HVAC (heating, ventilation, and air conditioning) system and control models enable closed-loop analyses of occupants' requirements, energy efficiency, and profitability. A complex application scenario of a nearly-zero-energy school building shows the advantages of the presented evaluation process for engineers and architects. Clear identification of the air quality requirements in individual rooms, together with a realistic model-based description of occupants' behavior, helps to optimize the HVAC system in early design stages. Building planning processes can be greatly improved and accelerated by the increasing integration of advanced simulation methods, which provide suitable answers to engineers' and architects' questions about the ever larger and more complex variety of suitable energy supply solutions.
Keywords: indoor air quality, dynamic simulation, energy efficient control, non-residential buildings
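The closed-loop evaluation idea can be illustrated outside Modelica with a deliberately simple sketch: a single well-mixed room, a CO2 mass balance, and a hysteresis ventilation controller. Every parameter value here (room volume, occupancy, CO2 thresholds, air-change rates) is an illustrative assumption, not taken from the paper's models.

```python
# Minimal sketch of demand-controlled ventilation for one classroom:
# a well-mixed CO2 balance driven by occupants, closed by a hysteresis
# fan controller. All numbers below are assumptions for illustration.

def simulate(hours=8, dt_s=60):
    volume_m3 = 180.0                       # classroom volume (assumed)
    occupants = 25                          # pupils (assumed)
    gen_m3_s = 0.005 * occupants / 1000.0   # CO2 generation, ~5 mL/s per person
    outdoor_ppm = 420.0
    co2_ppm = outdoor_ppm
    fan_on = False
    ach_on, ach_off = 4.0, 0.3              # air changes per hour, fan on/off (assumed)
    high, low = 1200.0, 900.0               # hysteresis thresholds in ppm (assumed)
    history = []
    for _ in range(int(hours * 3600 / dt_s)):
        # hysteresis control: switch fan on above `high`, off below `low`
        if co2_ppm > high:
            fan_on = True
        elif co2_ppm < low:
            fan_on = False
        ach = ach_on if fan_on else ach_off
        q_m3_s = ach * volume_m3 / 3600.0   # ventilation flow rate
        # well-mixed balance: d(ppm)/dt = (generation - removal) / volume
        d_ppm = (gen_m3_s * 1e6 - q_m3_s * (co2_ppm - outdoor_ppm)) / volume_m3
        co2_ppm += d_ppm * dt_s
        history.append(co2_ppm)
    return history

levels = simulate()
print(f"peak {max(levels):.0f} ppm, steady {levels[-1]:.0f} ppm")
```

Even this toy loop shows the trade-off the paper studies: a tighter upper threshold improves air quality at the cost of more fan energy.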
Procedia PDF Downloads 235
13238 Risks in the Islamic Banking Model and Methods Adopted to Manage Them
Authors: K. P. Fasalu Rahman
Abstract:
The Islamic financial services industry includes a large number of institutions, such as investment banks, commercial banks, investment companies, and mutual insurance companies. All of these financial institutions have to deal with many issues and risks in their field of work. Islamic banks should expect to face two types of risks: risks similar to those faced by conventional financial intermediaries, and risks unique to Islamic banks because of their compliance with the Shariah. The use of financial services and products that comply with Shariah principles raises special issues for supervision and risk management. Risks are uncertain future events that could influence the achievement of the bank's objectives, including strategic, operational, financial, and compliance objectives. In Islamic banks, effective risk management deserves special attention. As an operational matter, risk management is the identification and classification of risks in banks, and of the methods and processes used to supervise, monitor, and measure them. Compared with conventional banks, Islamic banks face greater difficulties in identifying and managing risks because of the complexities arising from the profit and loss sharing (PLS) concept and the nature of the particular risks of Islamic financing. Because developing risk management tools is essential, especially in Islamic banking, where most products depend on the PLS principle, identifying and measuring each type of risk is highly important and critical in any Islamic finance based system. This paper highlights the special and general risks surrounding Islamic banking and investigates in detail the need for risk management in Islamic banks. In addition to analyzing the effectiveness of the risk management strategies currently adopted by Islamic financial institutions, this research also suggests strategies for improving the risk management process of Islamic banks in the future.
Keywords: Islamic banking, management, risk, risk management
Procedia PDF Downloads 145
13237 A Machine Learning Pipeline for Real-Time Activity Detection on Low Computational Power Devices for Metaverse Applications
Authors: Amit Kumar, Amanpreet Chander, Ashish Sahani
Abstract:
This paper presents our recent work on real-time human activity detection based on the MediaPipe pipeline and machine learning algorithms. The proposed system can detect human activities including running, jumping, squatting, bending to the left or right, and standing still. This is a robust solution for developing yoga, dance, metaverse, and fitness applications that check the correctness of a pose without any additional supervision, such as a personal trainer. MediaPipe offers an open-source, cross-platform solution that uses a two-step detector-tracker ML pipeline for live detection of key landmarks on the body, which can be used for motion data collection. The prediction of real-time poses uses a variety of machine learning techniques and different types of analysis. Without relying primarily on powerful desktop environments for inference, our method achieves real-time performance on the majority of contemporary mobile phones, desktops/laptops, in Python, or even on the web. Experimental results show that our method outperforms existing methods in terms of accuracy and real-time capability, achieving an accuracy of 99.92% on the test datasets.
Keywords: human activity detection, MediaPipe, machine learning, metaverse applications
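To make the landmark-to-activity step concrete, here is a toy, rule-based sketch: MediaPipe's pose tracker emits body landmarks in normalized image coordinates, and a simple geometric rule can then label a frame. The knee-angle rule, its threshold, and the coordinates below are hypothetical illustrations, not the authors' trained classifier.

```python
# Illustrative sketch only: classify a frame from pose landmarks using a
# joint-angle rule. Landmarks are 2-D points in normalized image
# coordinates (x right, y down), as a pose tracker would provide.
import math

def joint_angle(a, b, c):
    """Angle at vertex b (degrees) formed by 2-D points a-b-c."""
    v1 = (a[0] - b[0], a[1] - b[1])
    v2 = (c[0] - b[0], c[1] - b[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    cosang = dot / (math.hypot(*v1) * math.hypot(*v2))
    return math.degrees(math.acos(max(-1.0, min(1.0, cosang))))

def classify_squat(hip, knee, ankle, bent_below_deg=120.0):
    """Label a frame 'squatting' when the knee angle closes past a threshold."""
    return "squatting" if joint_angle(hip, knee, ankle) < bent_below_deg else "standing"

# hypothetical landmark positions for two frames
standing = classify_squat(hip=(0.50, 0.50), knee=(0.50, 0.70), ankle=(0.50, 0.90))
squatting = classify_squat(hip=(0.50, 0.62), knee=(0.60, 0.70), ankle=(0.55, 0.90))
print(standing, squatting)
```

In a learned pipeline such rules are replaced by a classifier trained on landmark features, but the per-frame geometry is the same input.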
Procedia PDF Downloads 184
13236 Competency Model as a Key Tool for Managing People in Organizations: Presentation of a Model
Authors: Andrea Čopíková
Abstract:
Competency-based management is a newer approach to management that addresses an organization's challenges with complexity, with the aim of finding and solving the organization's problems and learning how to avoid them in the future. It teaches organizations to create, beyond the temporary state of stability, a vital organization that is permanently able to utilize and profit from internal and external opportunities. The aim of this paper is to propose a process of competency model design, on the basis of which a competency model for a financial department manager in a production company will be created. Competency models are a very useful tool in many personnel processes in any organization. They are used for the recruitment and selection of employees, for designing training and development activities, and for employee evaluation, and they can serve as a guide for career planning and as a tool for succession planning, especially for managerial positions. When creating the competency model, the AHP (Analytic Hierarchy Process) method and quantitative pairwise comparison (Saaty's method) will be used; these are among the most widely used methods for determining weights, and pairwise comparison is used within the AHP procedure. The introductory part of the paper presents research results on the use of competency models in practice and then explains the concepts of competency and competency models. The application part describes in detail the proposed methodology for the creation of competency models, on the basis of which the competency model for the position of financial department manager in a foreign manufacturing company will be created. The conclusion of the paper presents the final competency model for the above-mentioned position. The competency model divides the selected competencies into three groups: managerial, interpersonal, and functional. The model describes in detail the individual levels of each competency, its target value (required level), and its level of importance.
Keywords: analytic hierarchy process, competency, competency model, quantitative pairwise comparison
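The weighting step described above can be sketched as follows: a reciprocal pairwise-comparison matrix on Saaty's 1-9 scale is reduced to a priority vector, here via the common row-geometric-mean approximation of the AHP principal eigenvector. The 3x3 example matrix comparing the three competency groups is hypothetical.

```python
# Hedged sketch of Saaty-style pairwise comparison: priorities from the
# row-geometric-mean approximation of the principal eigenvector.
import math

def ahp_weights(matrix):
    """Approximate AHP priority vector: geometric mean of each row, normalized."""
    n = len(matrix)
    gmeans = [math.prod(row) ** (1.0 / n) for row in matrix]
    total = sum(gmeans)
    return [g / total for g in gmeans]

# reciprocal judgment matrix: entry [i][j] states how much more important
# criterion i is than criterion j (managerial, interpersonal, functional;
# the judgments are made-up illustrative values)
pairwise = [
    [1.0, 3.0, 5.0],
    [1 / 3, 1.0, 3.0],
    [1 / 5, 1 / 3, 1.0],
]
weights = ahp_weights(pairwise)
print([round(w, 3) for w in weights])
```

The full Saaty procedure would also compute a consistency ratio from the principal eigenvalue to check the judgments before accepting the weights.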
Procedia PDF Downloads 246
13235 Requests and Responses to Requests in Jordanian Arabic
Authors: Raghad Abu Salma, Beatrice Szczepek Reed
Abstract:
Politeness is one of the most researched areas in pragmatics, as it is key to interpersonal interactional phenomena. Many studies, particularly in linguistics, have focused on developing politeness theories and exploring the linguistic devices used in communication to construct and establish social norms. However, the question of what constitutes polite language remains a point of ongoing debate. Prior research has primarily examined politeness in English and its native-speaking communities, oversimplifying the notion of politeness and associating it with surface-level language use. There is also a dearth of literature on politeness in Arabic, particularly in the context of Jordanian Arabic. Prior research investigating politeness in Arabic makes generalized claims without taking linguistic variation into account or providing empirical evidence. This proposed research aims to explore how Jordanian Arabic influences its first-language users in making and responding to requests, examining participants' perceptions of politeness and the linguistic choices they make in their interactions. The study focuses on Jordanian expats living in London, UK, providing an intercultural perspective that prior research does not consider. It employs a mixed-methods approach combining discourse completion tasks (DCTs) with semi-structured interviews. While DCTs provide insight into participants' linguistic choices, semi-structured interviews glean insight into participants' perceptions of politeness and into how their linguistic choices are shaped by cultural norms and diverse experiences. This paper discusses previous research on politeness in Arabic, identifies research gaps, discusses different methods of data collection, and presents preliminary findings from the ongoing study.
Keywords: politeness, pragmatics, Jordanian Arabic, intercultural politeness
Procedia PDF Downloads 83
13234 Free Vibration of Axially Functionally Graded Simply Supported Beams Using Differential Transformation Method
Authors: A. Selmi
Abstract:
A free vibration analysis of homogeneous and axially functionally graded simply supported beams within the context of Euler-Bernoulli beam theory is presented in this paper. The material properties of the beams are assumed to obey a linear distribution law, and the effective elastic modulus of the composite is predicted using the rule of mixtures. The complexities that appear in solving the differential equation of transverse vibration of composite beams, which limit the analytical solution to some special cases, are overcome using a relatively new approach called the differential transformation method. This technique is applied to the differential equation of transverse vibration of axially functionally graded beams. Natural frequencies and the corresponding normalized mode shapes are calculated for different Young's modulus ratios, and a MATLAB code is designed to solve the transformed differential equation of the beam. Comparison of the present results with exact solutions demonstrates the effectiveness, accuracy, simplicity, and computational stability of the differential transformation method. The effect of the Young's modulus ratio on the normalized natural frequencies and mode shapes is found to be very important.
Keywords: differential transformation method, functionally graded material, mode shape, natural frequency
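The differential transformation can be sketched for the simplest special case, a homogeneous uniform beam; the nondimensionalization and the parameter lambda below are assumptions for illustration, not the paper's axially graded formulation.

```latex
% Differential transform of y(x) about x_0 = 0, and its inverse
% (truncated at a finite order N in practice):
Y(k) = \frac{1}{k!}\left[\frac{\mathrm{d}^{k} y(x)}{\mathrm{d}x^{k}}\right]_{x=0},
\qquad
y(x) \approx \sum_{k=0}^{N} Y(k)\, x^{k}.
% For the nondimensional homogeneous Euler--Bernoulli free-vibration
% equation y'''' = \lambda y, with \lambda = \rho A \omega^{2} L^{4}/(EI),
% the transform turns the ODE into the algebraic recurrence
Y(k+4) = \frac{\lambda\, Y(k)}{(k+1)(k+2)(k+3)(k+4)}.
```

Imposing the simply supported conditions y(0) = y''(0) = y(1) = y''(1) = 0 on the truncated series yields a characteristic determinant whose roots recover the classical values lambda_n = (n*pi)^4 for the homogeneous beam, the benchmark against which the graded cases are compared.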
Procedia PDF Downloads 314
13233 Characterization of the Microorganisms Associated with Pleurotus ostreatus and Pleurotus tuber-regium Spent Mushroom Substrate
Authors: Samuel E. Okere, Anthony E. Ataga
Abstract:
Introduction: The microbial ecology of Pleurotus ostreatus and Pleurotus tuber-regium spent mushroom substrate (SMS) was characterized to determine other ways of utilizing it. Materials and Methods: The microbiological properties of the spent mushroom substrate were determined using standard methods. This study was carried out at the Microbiology Laboratory, University of Port Harcourt, Rivers State, Nigeria. Results: Quantitative microbiological analysis revealed that Pleurotus ostreatus spent mushroom substrate (POSMS) contained 7.9x10⁵ and 1.2x10³ cfu/g of total heterotrophic bacteria and total fungi, respectively, while Pleurotus tuber-regium spent mushroom substrate (PTSMS) contained 1.38x10⁶ and 9.0x10² cfu/g of total heterotrophic bacteria and total fungi, respectively. The fungal species isolated from PTSMS included Aspergillus and Cladosporium species, while Aspergillus and Penicillium species were isolated from POSMS. The bacterial species isolated from PTSMS included Bacillus, Acinetobacter, Alcaligenes, Actinobacter, and Pseudomonas species, while Bacillus, Actinobacteria, Aeromonas, Lactobacillus, and Aerococcus species were isolated from POSMS. Conclusion: Based on the findings of this study, it can be concluded that spent mushroom substrate contains microorganisms that can be utilized both in the bioremediation of oil-polluted soils, as it contains important hydrocarbon-utilizing microorganisms such as Penicillium, Aspergillus, and Bacillus species, and as a source of plant growth-promoting rhizobacteria (PGPR) such as Pseudomonas and Bacillus species, which can induce resistance in plants. However, further studies are recommended, especially to characterize these microorganisms molecularly.
Keywords: characterization, microorganisms, mushroom, spent substrate
Procedia PDF Downloads 168
13232 Thermal-Fluid Characteristics of Heating Element in Rotary Heat Exchanger in Accordance with Fouling Phenomena
Authors: Young Mun Lee, Seon Ho Kim, Seok Min Choi, JeongJu Kim, Seungyeong Choi, Hyung Hee Cho
Abstract:
To decrease sulfur oxides in the flue gas from coal power plants, a flue gas desulfurization facility is operated. In the reactor, a chemical reaction occurs with a change in the gas temperature, so that sulfur oxides are removed and cleaned air is emitted. In this process, the temperature change induces a serious problem: cold erosion of the stack. To solve this problem, a rotary heat exchanger is installed before the stack. In the heat exchanger, a heating element is fitted to increase the heat transfer area. Heat transfer and pressure loss are the key issues in improving performance. In this research, the thermal-fluid characteristics of the heating element are analyzed by computational fluid dynamics. A fouling simulation is also conducted to calculate the performance of the heating element. The numerical analysis considers a situation in which plugging has already occurred in the inlet region of the heating element. As the pressure behind the plugging drops suddenly and the flow velocity becomes slower, it is found that the flow converges from both sides as it develops in the flow direction, and it is confirmed that the pressure difference due to plugging increases.
Keywords: heating element, plugging, rotary heat exchanger, thermal fluid characteristics
Procedia PDF Downloads 489
13231 Reed: An Approach Towards Quickly Bootstrapping Multilingual Acoustic Models
Authors: Bipasha Sen, Aditya Agarwal
Abstract:
A multilingual automatic speech recognition (ASR) system is a single entity capable of transcribing multiple languages that share a common phone space. The performance of such a system is highly dependent on the compatibility of the languages. State-of-the-art speech recognition systems are built using sequential architectures based on recurrent neural networks (RNNs), which limit computational parallelization in training. This poses a significant challenge in terms of the time taken to bootstrap and validate the compatibility of multiple languages for building a robust multilingual system. Complex architectural choices based on self-attention networks have been made to improve parallelization and thereby reduce training time. In this work, we propose Reed, a simple system based on 1D convolutions that uses a very short context to improve training time. To improve the performance of our system, we use raw time-domain speech signals directly as input. This enables the convolutional layers to learn feature representations rather than relying on handcrafted features such as MFCCs. We report improvements in training and inference times by at least factors of 4x and 7.4x, respectively, with comparable WERs against standard RNN-based baseline systems on SpeechOcean's multilingual low-resource dataset.
Keywords: convolutional neural networks, language compatibility, low resource languages, multilingual automatic speech recognition
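The core building block can be illustrated with a minimal 1D convolution over a raw waveform: unlike an RNN step, each output position depends only on a short, fixed context window and can be computed independently of the others, which is what enables the parallelization discussed above. This sketch is illustrative only and is not the Reed architecture.

```python
# Minimal "valid" 1-D convolution: every output depends on only len(kernel)
# input samples, so all output positions are independent and parallelizable.

def conv1d(signal, kernel, stride=1):
    k = len(kernel)
    return [
        sum(kernel[j] * signal[i + j] for j in range(k))
        for i in range(0, len(signal) - k + 1, stride)
    ]

# a 3-tap difference kernel applied to a toy "waveform" (made-up values)
wave = [0.0, 0.0, 1.0, 1.0, 1.0, 0.0, 0.0]
out = conv1d(wave, kernel=[-1.0, 0.0, 1.0])
print(out)
```

In a real acoustic model many such filters are stacked with nonlinearities and learned from data; the short receptive field per layer is the property the text emphasizes.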
Procedia PDF Downloads 127
13230 Application of Large Eddy Simulation-Immersed Boundary Volume Penalization Method for Heat and Mass Transfer in Granular Layers
Authors: Artur Tyliszczak, Ewa Szymanek, Maciej Marek
Abstract:
Flow through granular materials is important to a vast array of industries, for instance in the construction industry, where granular layers are used for bulkheads and isolators; in chemical engineering and catalytic reactors, where the large surfaces of packed granular beds intensify chemical reactions; and in energy production systems, where granulates are promising materials for heat storage and as heat transfer media. Despite the common usage of granulates and the extensive research performed in this field, the phenomena occurring between granular solid elements, or between solids and fluid, are still not fully understood. In the present work, we analyze the heat exchange process between a flowing medium (gas, liquid) and the solid material inside granular layers. We consider them as a composite of isolated solid elements and inter-granular spaces in which a gas or liquid can flow. The structure of the layer is controlled by the shapes of the particular granular elements (e.g., spheres, cylinders, cubes, Raschig rings), their spatial distribution, and the effective characteristic dimension (total volume or surface area). We analyze to what extent altering these parameters influences the flow characteristics (turbulence intensity, mixing efficiency, heat transfer) inside the layer and behind it. Analysis of flow inside granular layers is very complicated because the use of classical experimental techniques (LDA, PIV, fibre probes) inside the layers is practically impossible, whereas the use of probes (e.g., thermocouples, Pitot tubes) requires drilling holes in the solid material. Hence, measurements of the flow inside granular layers are usually performed using, for instance, advanced X-ray tomography. In this respect, theoretical and numerical analyses of flow inside granulates are crucial. Application of discrete element methods in combination with classical finite volume/finite difference approaches is problematic, as the mesh generation process for complex granular material can be very arduous. A good alternative for the simulation of flow in complex domains is the immersed boundary-volume penalization (IB-VP) approach, in which the computational meshes have a simple Cartesian structure and the impact of solid objects on the fluid is mimicked by source terms added to the Navier-Stokes and energy equations. The present paper focuses on the application of the IB-VP method combined with large eddy simulation (LES). The flow solver used in this work is a high-order code (SAILOR), which was used previously in various studies, including laminar/turbulent transition in free flows and also for flows in wavy channels, wavy pipes, and over obstacles of various shapes. In these cases, the formal order of approximation turned out to be between 1 and 2, depending on the test case. The current research concentrates on the analysis of flows in dense granular layers with elements distributed in a deterministic, regular manner, and on validation of the results obtained using the LES-IB method against a body-fitted approach. The comparisons are very promising and show very good agreement. It is found that the size, number, and distribution of the elements have a huge impact on the obtained results. Ordering of the granular elements (or lack of it) affects both the pressure drop and the efficiency of the heat transfer, as it significantly changes the mixing process.
Keywords: granular layers, heat transfer, immersed boundary method, numerical simulations
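A one-dimensional toy version of the volume-penalization idea can be sketched as follows: on a plain Cartesian grid, a solid obstacle is represented only by a mask, and a source term -(chi/eta)*u added to a diffusion equation drives the velocity toward zero inside it (treated implicitly for stability). All parameter values are assumptions for illustration; this is not the SAILOR solver or the full Navier-Stokes system.

```python
# 1-D volume-penalization illustration: diffusion of a "velocity" u on a
# Cartesian grid with a masked solid block; the penalization term
# -(chi/eta)*u (chi = 1 inside the solid) forces u toward zero there.

def penalized_diffusion(n=101, steps=3000):
    dx = 1.0 / (n - 1)
    nu = 0.001                 # viscosity (assumed)
    eta = 1e-4                 # penalization parameter; smaller = stiffer solid
    dt = 0.2 * dx * dx / nu    # stable explicit step for the diffusion term
    # solid mask: obstacle occupies 0.45 <= x <= 0.55
    chi = [1.0 if 0.45 <= i * dx <= 0.55 else 0.0 for i in range(n)]
    u = [1.0] * n              # start from a uniform profile
    for _ in range(steps):
        un = u[:]
        for i in range(1, n - 1):
            lap = (un[i - 1] - 2.0 * un[i] + un[i + 1]) / (dx * dx)
            # diffusion explicit, stiff penalization term implicit
            u[i] = (un[i] + dt * nu * lap) / (1.0 + dt * chi[i] / eta)
        u[0], u[-1] = 1.0, 1.0  # fixed boundary values
    return u, chi

u, chi = penalized_diffusion()
solid_max = max(abs(ui) for ui, ci in zip(u, chi) if ci == 1.0)
print(f"max |u| inside the solid: {solid_max:.2e}")
```

The same mechanism carries over to the momentum and energy equations in 3D: no body-fitted mesh is needed, only the mask and the source term.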
Procedia PDF Downloads 141
13229 Blueprinting of a Normalized Supply Chain Processes: Results in Implementing Normalized Software Systems
Authors: Bassam Istanbouli
Abstract:
With technology evolving every day and with the increase in global competition, industries are always under pressure to be the best. They need to provide good-quality products at competitive prices, when and how the customer wants them. To achieve this level of service, products and their respective supply chain processes need to be flexible and evolvable; otherwise, changes will be extremely expensive and slow and will have many combinatorial effects. Those combinatorial effects impact the whole organizational structure from management, financial, documentation, and logistics perspectives, and especially from the perspective of the Enterprise Resource Planning (ERP) information system. By applying the normalized systems concept/theory to segments of the supply chain, we expect these effects to be minimal, especially at the time of launching an organization-wide software project. The purpose of this paper is to point out that if an organization wants to develop software from scratch or implement an existing ERP package for its business needs, and if its business processes are normalized and modular, then this will most probably yield a normalized and modular software system that can be easily modified when the business evolves. Another important goal of this paper is to increase awareness regarding the design of business processes in a software implementation project: if the blueprints created are normalized, then the software developers and configurators will map those modular blueprints into modular software. This paper only prepares the ground for further studies; the above concept will be supported by going through the steps of developing, configuring, and/or implementing a software system for an organization using two methods: the Software Development Lifecycle method (SDLC) and the Accelerated SAP implementation method (ASAP). Both methods start with the customer requirements, then blueprint the business processes, and finally map those processes into a software system. Since those requirements and processes are the starting point of the implementation, normalizing them will result in normalized software.
Keywords: blueprint, ERP, modular, normalized
Procedia PDF Downloads 142
13228 Pregnant Women and Mothers in Prison, Mother and Baby Units and Mental Health
Authors: Rachel Dolan
Abstract:
Background: Over two thirds of women in prison in England are mothers, and estimates suggest that between 100 and 200 women per year give birth during imprisonment. There are currently six mother and baby units (MBUs) in prisons in England, which admit women and babies up to the age of 18 months. Although only 65 places are available, and despite their positive impacts, they are rarely full. Mental illness may influence the number of admissions, as may the interpretation of admission criteria. MBUs are the only current alternative to separation for imprisoned mothers and their babies. Aims: To identify the factors that affect the decision to apply for, or be offered, a place in a prison MBU; to measure the impact of a placement upon maternal mental health and wellbeing; and to measure the initial outcomes for mother and child. Methods: A mixed-methods approach: 100 pregnant women are currently being recruited from prisons in England. Quantitative measures will establish the prevalence of mental disorder, personality disorder, and substance misuse, and assess quality of life. Qualitative interviews will document the experiences of pregnancy and motherhood in prison. Results: Preliminary quantitative findings suggest that the most prevalent mental disorders are anxiety and depression, and that approximately half the participants meet the criteria for one or more personality disorders. The majority of participants to date have been offered a place in a prison MBU, and those held in a prison with an MBU prior to applying are more likely to be admitted. Those with a previous history of childcare issues who are known to social services are less likely to be offered a place. Qualitative findings suggest that many women are often hungry and uncomfortable during pregnancy, that many have feelings of guilt about having a child in prison, and that feelings of anxiety and worry are exacerbated by a lack of information.
Keywords: mothers, prison, mother and baby units, mental health
Procedia PDF Downloads 297
13227 Ointment of Rosella Flower Petals Extract (Hibiscus sabdariffa): Pharmaceutical Preparations Formulation Development of Herbs for Antibacterial S. aureus
Authors: Muslihatus Syarifah
Abstract:
Introduction: Rosella flower petals can be used as an antibacterial agent because they contain alkaloids, flavonoids, phenolics, and terpenoids. S. aureus can cause skin infections, and the most appropriate treatment is a topical preparation. An ointment is a topical preparation comprising an active substance and an ointment base, and not every base matches a given active substance or type of disease. In this study, the flavonoid active substances contained in rosella flower petals (Hibiscus sabdariffa) were formulated into ointments with a variety of different bases in order to identify a suitable base for the formulation of a rosella flower petal extract ointment. Methods: Experimental research with a post-test control group design, using ointment samples with hydrocarbon, absorption, water-removable, and water-soluble bases. These were then tested against S. aureus at concentrations of 1%, 2%, 4%, 8%, 16%, and 32%. Data were analyzed using one-way ANOVA followed by a post hoc test. Results: Ointments with hydrocarbon, absorption, water-removable, and water-soluble bases showed no change in physical properties during storage. The base affects the physical properties of the ointment, namely its adhesion, dispersive power, and pH. Different extract concentrations also produce different physical properties, including adhesion, dispersive power, and pH: the higher the concentration, the higher the dispersive power, but the smaller the adhesion and pH. Conclusion: The base, storage time, and extract concentration can all affect the physical properties of the ointment. The extract concentration used in the rosella flower petal extract ointment is 32%.
Keywords: rosella, physical properties, ointments, antibacterial
Procedia PDF Downloads 372
13226 Synthesis, Characterization and Biological Activities of Azomethine Derivatives
Authors: Lynda Golea, Rachid Chebaki
Abstract:
Schiff bases contain heterocyclic structural units with N and O donor atoms, which play an important role in coordination chemistry. Azomethine compounds are a broad class of widely used compounds with applications in many fields, including analytical chemistry, inorganic chemistry, and biology. Schiff bases are of promising research interest because of the widespread antibacterial resistance in medical science, and research into Schiff base metal complexes with various applications is therefore essential. Schiff complexes have been used as drugs and have antibacterial, antifungal, antiviral, and anti-inflammatory properties; the various donor atoms they contain offer a special ability for metal binding. In this research on the physicochemical properties of azomethine compounds, we synthesized and studied a Schiff base compound obtained by a condensation reaction of tryptamine and acetophenone in ethanol. The structure of the prepared compound was interpreted using 1H NMR, 13C NMR, UV-Vis, and FT-IR. A computational analysis at the DFT level with the B3LYP functional in conjunction with the 6-311+G(d,p) basis set was conducted to study its electronic and molecular structure. The biological study of antibacterial activity was performed on three bacterial strains that commonly cause infection, including Gram-positive and Gram-negative strains. Results showed moderate biological activity, with activity proportional to increasing concentration.
Keywords: azomethine, HOMO, LUMO, NMR, molecular docking
Procedia PDF Downloads 66
13225 Turbulent Channel Flow Synthesis using Generative Adversarial Networks
Authors: John M. Lyne, K. Andrea Scott
Abstract:
In fluid dynamics, direct numerical simulations (DNS) of turbulent flows require large numbers of nodes to appropriately resolve all scales of energy transfer. Due to the size of these databases, sharing these datasets amongst the academic community is a challenge. Recent work has investigated the use of super-resolution to enable database sharing, where a low-resolution flow field is super-resolved to high resolution using a neural network. Recently, Generative Adversarial Networks (GANs) have grown in popularity, with impressive results in the generation of faces, landscapes, and more. This work investigates the generation of unique high-resolution channel flow velocity fields from a low-dimensional latent space using a GAN. The training objective of the GAN is to generate samples whose distribution is ideally indistinguishable from the distribution of the training data. In this study, the network is trained using samples drawn from a statistically stationary channel flow at a Reynolds number of 560. Results show that the turbulent statistics and energy spectra of the generated flow fields are in reasonable agreement with those of the DNS data, demonstrating that GANs can produce the intricate multi-scale phenomena of turbulence.
Keywords: computational fluid dynamics, channel flow, turbulence, generative adversarial network
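The training objective paraphrased above is, in its standard form, the GAN minimax game of Goodfellow et al.; at its optimum the discriminator D cannot distinguish generated velocity fields G(z) from DNS samples, which is exactly the "indistinguishable distributions" condition stated in the text (the notation below is the standard one, not necessarily the exact loss variant used in this study):

```latex
\min_{G}\,\max_{D}\; V(D,G)
= \mathbb{E}_{\mathbf{x}\sim p_{\mathrm{data}}}\!\bigl[\log D(\mathbf{x})\bigr]
+ \mathbb{E}_{\mathbf{z}\sim p_{\mathbf{z}}}\!\bigl[\log\bigl(1 - D(G(\mathbf{z}))\bigr)\bigr]
```

Here x is a DNS velocity-field sample and z a draw from the low-dimensional latent space; in practice G is trained on a surrogate of its term while D alternately maximizes V.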
Procedia PDF Downloads 210
13224 De-Novo Structural Elucidation from Mass/NMR Spectra
Authors: Ismael Zamora, Elisabeth Ortega, Tatiana Radchenko, Guillem Plasencia
Abstract:
The structure elucidation of unknown substances based on mass spectrometry (MS) data is an unresolved problem that affects many different fields of application. A recent overview of the software available for the structure elucidation of small molecules has shown the demand for an efficient computational tool able to perform structure elucidation of unknown small molecules and peptides. We developed an algorithm for de novo fragment analysis based on MS data that proposes a set of scored and ranked structures compatible with the MS and MS/MS spectra. Several different algorithms were developed, depending on the initial set of fragments and the structure-building processes, and in all cases several scores for the final molecule ranking were computed. They were validated with small and middle-sized databases (DBs) using the eleven test-set compounds. Similar results were obtained from any of the databases that contained the fragments of the expected compound. In summary, we present an algorithm for de novo fragment analysis based on mass spectrometry data only, which proposes a set of scored and ranked structures and which was validated on different types of databases with good results as a proof of concept. Moreover, the structures proposed from the mass spectrometry data were submitted to NMR spectrum prediction in order to elucidate which of the proposed structures was compatible with the collected NMR spectra.
Keywords: de novo, structure elucidation, mass spectrometry, NMR
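The scoring-and-ranking idea can be sketched in a hypothetical toy form (not the authors' algorithm): each candidate structure proposes a set of fragment masses, and candidates are ranked by how many observed MS/MS peaks they explain within a mass tolerance. All masses below are made-up illustrative values.

```python
# Toy fragment-matching score: rank candidate structures by the fraction of
# observed MS/MS peaks they explain within a mass tolerance. The structures
# and m/z values are hypothetical.

def score(candidate_frags, observed_peaks, tol=0.01):
    """Fraction of observed peaks matched by some candidate fragment mass."""
    matched = sum(
        1 for peak in observed_peaks
        if any(abs(peak - m) <= tol for m in candidate_frags)
    )
    return matched / len(observed_peaks)

observed = [77.04, 105.03, 122.06]              # observed MS/MS fragment m/z
candidates = {
    "structure_A": [77.039, 105.034, 122.060],  # explains all peaks
    "structure_B": [77.039, 91.054],            # explains one peak
    "structure_C": [51.023, 65.039],            # explains none
}
ranking = sorted(candidates, key=lambda c: score(candidates[c], observed), reverse=True)
print(ranking)
```

A real implementation would additionally enumerate substructures, check chemical plausibility, and combine several such scores, with NMR prediction as a second filter, as described above.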
Procedia PDF Downloads 301
13223 Medical Workforce Knowledge of Adrenaline (Epinephrine) Administration in Anaphylaxis in Adults Considerably Improved with Training in an UK Hospital from 2010 to 2017
Authors: Jan C. Droste, Justine Burns, Nithin Narayan
Abstract:
Introduction: Life-threatening detrimental effects of inappropriate adrenaline (epinephrine) administration, e.g., by giving the wrong dose, in the context of anaphylaxis management are well documented in the medical literature. Half of the fatal anaphylactic reactions in the UK are iatrogenic, and the median time to a cardio-respiratory arrest can be as short as 5 minutes. It is therefore imperative that hospital doctors of all grades have active and accurate knowledge of the correct route, site, and dosage of administration of adrenaline. Given this time constraint and the potentially fatal outcome of inappropriate management of anaphylaxis, it is alarming that surveys over the last 15 years have repeatedly shown only a minority of doctors to have accurate knowledge of adrenaline administration as recommended by the UK Resuscitation Council guidelines (2008, updated 2012). This comparison of survey results of the medical workforce over several years in a small NHS District General Hospital was conducted to establish the effect of employing multiple educational methods regarding adrenaline administration in anaphylaxis in adults. Methods: Between 2010 and 2017, several education methods and tools were used to repeatedly inform the medical workforce (doctors and advanced clinical practitioners) in a single district general hospital regarding the treatment of anaphylaxis in adults. Whilst the senior staff remained largely the same cohort, the junior staff had changed fully by each survey.
Examples included: (i) formal teaching in Grand Rounds, during the junior doctors' induction process, and on advanced life support courses; (ii) in-situ simulation training performed by the clinical skills simulation team: several ad hoc sessions and one 3-day event in 2017 visiting 16 separate clinical areas performing an acute anaphylaxis scenario using actors, involving around 100 individuals from multi-disciplinary teams; (iii) hospital-wide distribution of the simulation event via the Trust's Simulation Newsletter; (iv) laminated algorithms attached to the 'crash trolleys'; (v) a short email alert sent to all medical staff 3 weeks prior to the survey detailing the emergency treatment of anaphylaxis; (vi) in addition, the performance of the surveys themselves represented a teaching opportunity during which gaps in knowledge could be addressed. Face-to-face surveys were carried out in 2010 (pre-intervention), 2015, and 2017, on the latter two occasions also including advanced clinical practitioners (ACPs). All surveys consisted of convenience samples. If verbal consent to conduct the survey was obtained, the medical practitioners' answers were recorded immediately on a data collection sheet. Results: There was a sustained improvement in the knowledge of the medical workforce from 2010 to 2017. Answers improved regarding the correct drug by 11% (84%, 95%, and 95%); the correct route by 20% (76%, 90%, and 96%); the correct site by 40% (43%, 83%, and 83%); and the correct dose by 45% (27%, 54%, and 72%). Overall, knowledge of all components (correct drug, route, site, and dose) improved from 13% in 2010 to 62% in 2017. Conclusion: This survey comparison shows that knowledge of the medical workforce regarding adrenaline administration for the treatment of anaphylaxis in adults can be considerably improved by employing a variety of educational methods. Keywords: adrenaline, anaphylaxis, epinephrine, medical education, patient safety
Procedia PDF Downloads 131
13222 Demand Forecasting to Reduce Dead Stock and Loss Sales: A Case Study of the Wholesale Electric Equipment and Part Company
Authors: Korpapa Srisamai, Pawee Siriruk
Abstract:
The purpose of this study is to forecast product demands and develop appropriate and adequate procurement plans to meet customer needs and reduce costs. When stock exceeds customer demand or does not move, the company must provide additional storage space for it. Moreover, some items, when stored for a long period of time, deteriorate into dead stock. A case study of a wholesale company for electronic equipment and components, which faces uncertain customer demand, is considered. Customers' actual purchase orders do not match the forecasts those customers provide. In some cases, customers demand more product than forecast, leaving the company with insufficient stock to meet their needs; in other cases, customers demand less than estimated, tying up storage space and creating dead stock. This study aims to reduce lost sales opportunities and the amount of goods remaining in the warehouse, using 30 samples of the company's most popular products. The data were collected over the duration of the study, from January to October 2022. The forecasting methods used are the simple moving average, weighted moving average, and exponential smoothing methods. The economic order quantity and reorder point are then calculated to meet customer needs, and the results are tracked. The research results are very beneficial to the company: it can reduce lost sales opportunities by 20%, so that it has enough products to meet customer needs, and can reduce dead stock by up to 10%. This enables the company to order products more accurately, increasing profits and available storage space. Keywords: demand forecast, reorder point, lost sale, dead stock
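The forecasting and inventory formulas named above can be sketched as follows. The demand series and cost figures are hypothetical illustrations, not the company's data:

```python
import math

def simple_moving_average(demand, window=3):
    """Forecast = mean of the last `window` observations."""
    return sum(demand[-window:]) / window

def weighted_moving_average(demand, weights=(0.2, 0.3, 0.5)):
    """Forecast = weighted mean; the most recent month gets the largest weight."""
    recent = demand[-len(weights):]
    return sum(w * d for w, d in zip(weights, recent))

def exponential_smoothing(demand, alpha=0.3):
    """Forecast = alpha * latest demand + (1 - alpha) * previous forecast."""
    forecast = demand[0]
    for d in demand[1:]:
        forecast = alpha * d + (1 - alpha) * forecast
    return forecast

def eoq(annual_demand, order_cost, holding_cost):
    """Economic order quantity: sqrt(2DS/H)."""
    return math.sqrt(2 * annual_demand * order_cost / holding_cost)

def reorder_point(daily_demand, lead_time_days, safety_stock=0):
    """Reorder when on-hand stock falls to lead-time demand plus safety stock."""
    return daily_demand * lead_time_days + safety_stock

monthly_demand = [120, 135, 128, 140, 150, 145]  # hypothetical units/month
print(round(simple_moving_average(monthly_demand), 1))    # 145.0
print(round(weighted_moving_average(monthly_demand), 1))  # 145.5
print(round(exponential_smoothing(monthly_demand), 1))    # 138.6
print(round(eoq(annual_demand=1600, order_cost=50.0, holding_cost=2.0)))  # 283
print(reorder_point(daily_demand=5, lead_time_days=7, safety_stock=10))   # 45
```

Each method is then scored against realized demand (e.g., by mean absolute error) to pick the best forecaster per product.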
Procedia PDF Downloads 130
13221 Student Veterans’ Transition to Nursing Education: Barriers and Facilitators
Authors: Bruce Hunter
Abstract:
Background: The transition for student veterans from military service to higher education can be a challenging endeavor, especially for those pursuing an education in nursing. While the experiences and perspectives of each student veteran are unique, their successful integration into an academic environment can be influenced by a complex array of barriers and facilitators. This mixed-methods study aims to explore the themes and concepts found in the transition experiences of student veterans in nursing education, with a focus on identifying the barriers they face and the facilitators that support their success. Methods: This study utilizes an explanatory mixed-methods approach. The research participants include student veterans enrolled in nursing programs across three academic institutions in the Southeastern United States. Quantitative Phase: A Likert-scale instrument is distributed to a sample of student veterans in nursing programs. The survey assesses demographic information, academic experiences, social experiences, and perceptions of institutional support. Quantitative data are analyzed using descriptive statistics to assess demographics and to identify barriers and facilitators to the transition. Qualitative Phase: Two open-ended questions were posed to student veterans to explore their lived experiences, barriers, and facilitators during the transition to nursing education and to further explain the quantitative findings. Thematic analysis with line-by-line coding is employed to identify recurring themes and narratives that may shed light on the barriers and facilitators encountered.
Results: This study found that the key to the successful academic integration of student veterans lies in recognizing the diversity of values and attitudes among them, understanding the potential challenges they face, and taking proactive steps to create an inclusive and supportive academic environment that accommodates the unique experiences of this demographic. Addressing these academic and social integration concerns can contribute to a more understanding environment for student veterans in the BSN program. Conclusion: Providing support during this transitional period is crucial not only for retaining veterans but also for bolstering their success in achieving the status of registered nurses. Acquiring an understanding of military culture emerges as an essential initial step for nursing faculty in retaining student veterans and supporting the successful completion of their programs. Participants found that their transition experience lacked meaningful social interactions, which could foster a positive learning environment, enhance their emotional well-being, and contribute significantly to their overall success and satisfaction in their nursing education journey. Recognizing and promoting academic and social integration is important in helping veterans experience a smooth transition into and through the unfamiliar academic environment of nursing education. Keywords: nursing, education, student veterans, barriers, facilitators
Procedia PDF Downloads 53
13220 Probability Modeling and Genetic Algorithms in Small Wind Turbine Design Optimization: Mentored Interdisciplinary Undergraduate Research at LaGuardia Community College
Authors: Marina Nechayeva, Malgorzata Marciniak, Vladimir Przhebelskiy, A. Dragutan, S. Lamichhane, S. Oikawa
Abstract:
This presentation is a progress report on a faculty-student research collaboration at CUNY LaGuardia Community College (LaGCC) aimed at designing a small horizontal axis wind turbine optimized for the wind patterns on the roof of our campus. Our project combines statistical and engineering research. Our wind modeling protocol is based upon a recent wind study by a faculty-student research group at MIT, and some of our blade design methods are adopted from a senior engineering project at CUNY City College. Our use of genetic algorithms has been inspired by the work on small wind turbines' design by David Wood. We combine these diverse approaches in our interdisciplinary project in a way that has not been done before and improve upon certain techniques used by our predecessors. We employ several estimation methods to determine the best-fitting parametric probability distribution model for the local wind speed data obtained through correlating short-term on-site measurements with a long-term time series at the nearby airport. The model serves as a foundation for engineering research that focuses on adapting and implementing genetic algorithms (GAs) to engineering optimization of the wind turbine design using Blade Element Momentum Theory. GAs are used to create new airfoils with desirable aerodynamic specifications. Small-scale models of the best-performing designs are 3D printed and tested in the wind tunnel to verify the accuracy of relevant calculations. Genetic algorithms are applied to selected airfoils to determine the blade design (radial chord and pitch distributions) that would optimize the coefficient-of-power profile of the turbine. Our approach improves upon traditional blade design methods in that it lets us dispense with assumptions necessary to simplify the system of Blade Element Momentum Theory equations, thus resulting in more accurate aerodynamic performance calculations.
Furthermore, it enables us to design blades optimized for a whole range of wind speeds rather than a single value. Lastly, we improve upon known GA-based methods in that our algorithms are constructed to work with XFoil generated airfoils data which enables us to optimize blades using our own high glide ratio airfoil designs, without having to rely upon available empirical data from existing airfoils, such as NACA series. Beyond its immediate goal, this ongoing project serves as a training and selection platform for CUNY Research Scholars Program (CRSP) through its annual Aerodynamics and Wind Energy Research Seminar (AWERS), an undergraduate summer research boot camp, designed to introduce prospective researchers to the relevant theoretical background and methodology, get them up to speed with the current state of our research, and test their abilities and commitment to the program. Furthermore, several aspects of the research (e.g., writing code for 3D printing of airfoils) are adapted in the form of classroom research activities to enhance Calculus sequence instruction at LaGCC.Keywords: engineering design optimization, genetic algorithms, horizontal axis wind turbine, wind modeling
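The GA machinery described above can be sketched generically. This is a minimal real-coded GA with truncation selection, uniform crossover, and Gaussian mutation; the separable quadratic fitness is a hypothetical smooth stand-in for the Blade Element Momentum power-coefficient calculation, not the project's actual model:

```python
import random

def genetic_optimize(fitness, n_params, pop_size=40, generations=60,
                     mutation_rate=0.1, bounds=(0.0, 1.0), seed=1):
    """Minimal real-coded GA: keep the top 20% (elites), breed children
    by uniform crossover of elite pairs, occasionally mutate one gene.
    Returns the best parameter vector found."""
    rng = random.Random(seed)
    lo, hi = bounds
    pop = [[rng.uniform(lo, hi) for _ in range(n_params)] for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(pop, key=fitness, reverse=True)
        elite = scored[: pop_size // 5]
        children = list(elite)  # elitism: carry the best forward unchanged
        while len(children) < pop_size:
            p1, p2 = rng.sample(elite, 2)
            child = [a if rng.random() < 0.5 else b for a, b in zip(p1, p2)]
            if rng.random() < mutation_rate:
                i = rng.randrange(n_params)
                child[i] = min(hi, max(lo, child[i] + rng.gauss(0, 0.05)))
            children.append(child)
        pop = children
    return max(pop, key=fitness)

# Stand-in fitness: a smooth surrogate peaking when every "blade
# section parameter" equals 0.7 (higher is better, maximum is 0).
def fitness(params):
    return -sum((p - 0.7) ** 2 for p in params)

best = genetic_optimize(fitness, n_params=5)
assert fitness(best) > -0.1  # converges near the optimum at 0.7
```

In the actual project, `fitness` would instead evaluate the coefficient of power for a candidate chord/pitch distribution via the BEM equations.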
Procedia PDF Downloads 237
13219 Content Analysis of Video Translations: Examining the Linguistic and Thematic Approach by Translator Abdullah Khrief on the X Platform
Authors: Easa Almustanyir
Abstract:
This study investigates the linguistic and thematic approach of translator Abdullah Khrief in the context of video translations on the X platform. The sample comprises 15 videos from Khrief's account, covering diverse content categories like science, religion, social issues, personal experiences, lifestyle, and culture. The analysis focuses on two aspects: language usage and thematic representation. Regarding language, the study examines the prevalence of English while considering the inclusion of French and German content, highlighting Khrief's multilingual versatility and ability to navigate cultural nuances. Thematically, the study explores the diverse range of topics covered, encompassing scientific, religious, social, and personal narratives, underscoring Khrief's broad subject matter expertise and commitment to knowledge dissemination. The study employs a mixed-methods approach, combining quantitative data analysis with qualitative content analysis. Statistical data on video languages, presenter genders, and content categories are analyzed, and a thorough content analysis assesses translation accuracy, cultural appropriateness, and overall quality. Preliminary findings indicate a high level of professionalism and expertise in Khrief's translations. The absence of errors across the diverse range of videos establishes his credibility and trustworthiness. Furthermore, the accurate representation of cultural nuances and sensitive topics highlights Khrief's cultural sensitivity and commitment to preserving intended meanings and emotional resonance.Keywords: audiovisual translation, linguistic versatility, thematic diversity, cultural sensitivity, content analysis, mixed-methods approach
Procedia PDF Downloads 34
13218 Analysis of Hard Turning Process of AISI D3-Thermal Aspects
Authors: B. Varaprasad, C. Srinivasa Rao
Abstract:
In the manufacturing sector, hard turning has emerged as a vital machining process for cutting hardened steels. Among its many advantages, hard turning can achieve close tolerances in terms of surface finish, high product quality, reduced machining time, low operating cost, and environmentally friendly characteristics. In the present study, a three-dimensional CAE (Computer Aided Engineering) simulation of hard turning using the commercial software DEFORM 3D is compared to experimental results for stresses, temperatures, and tool forces in the machining of AISI D3 steel with mixed ceramic inserts (CC6050). In the present analysis, orthogonal cutting models are proposed, considering several processing parameters such as cutting speed, feed, and depth of cut. Exhaustive friction modeling at the tool-work interfaces is carried out. Work material flow around the cutting edge is carefully modeled with an adaptive re-meshing simulation capability. In the process simulations, the feed rate and cutting speed are held constant (0.075 mm/rev and 155 m/min, respectively), and the analysis focuses on stresses, forces, and temperatures during machining. Close agreement is observed between the CAE simulation and experimental values. Keywords: hard turning, computer aided engineering, computational machining, finite element method
Procedia PDF Downloads 458
13217 From Parchment to Pixels: Digital Preservation for the Future
Authors: Abida Khatoon
Abstract:
This study provides an overview of ancient manuscripts, including their historical significance, current digital preservation methods, and the challenges we face in safeguarding these invaluable resources. India has a long-standing tradition of manuscript preservation, with texts that span a wide range of subjects, from religious scriptures to scientific treatises. These manuscripts were written on various materials, including palm leaves, parchment, metal, bark, wood, animal skin, and paper. These manuscripts offer a deep insight into India's cultural and intellectual history. Ancient manuscripts are crucial historical records, providing valuable insights into past civilizations and knowledge systems. As these physical documents become increasingly fragile, digital preservation methods have become essential to ensure their continued accessibility. Digital preservation involves several key techniques. Scanning and digitization create high-resolution digital images of manuscripts, while reprography produces copies to reduce wear on originals. Digital archiving ensures proper storage and management of these digital files, and preservation of electronic data addresses modern formats like web pages and emails. Despite its benefits, digital preservation faces several challenges. Technological obsolescence, data integrity issues, and the resource-intensive nature of the process are significant hurdles. Securing adequate funding is particularly challenging due to high initial costs and ongoing expenses. Looking ahead, the future of digital preservation is promising. Advancements in technology, increased collaboration among institutions, and the development of sustainable funding models will enhance the preservation and accessibility of these important historical documents.Keywords: preservation strategies, Indian manuscript, cultural heritage, archiving
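The data-integrity challenge mentioned above is commonly addressed in digital archives with checksum ("fixity") manifests: record a digest for each file at ingest, then periodically re-check. A minimal sketch, assuming a simple flat folder of files (the archive layout and file names below are hypothetical):

```python
import hashlib
import tempfile
from pathlib import Path

def sha256_of(path):
    """Compute the SHA-256 digest of a file, reading in chunks so large
    scanned-manuscript images do not have to fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def build_manifest(folder):
    """Map each file in an archive folder to its digest (the fixity record)."""
    return {p.name: sha256_of(p) for p in Path(folder).iterdir() if p.is_file()}

def verify(folder, manifest):
    """Return names of files whose current digest no longer matches the
    stored manifest -- candidates for restoration from a backup copy."""
    current = build_manifest(folder)
    return [name for name, digest in manifest.items()
            if current.get(name) != digest]

# Example: create a tiny archive, record fixity, then detect corruption
archive = Path(tempfile.mkdtemp())
(archive / "ms_001.txt").write_text("palm-leaf manuscript, folio 1")
manifest = build_manifest(archive)
(archive / "ms_001.txt").write_text("bit rot!")  # simulate silent corruption
print(verify(archive, manifest))  # ['ms_001.txt']
```

Scheduled fixity checks like this, combined with geographically separate copies, are what make "preservation of electronic data" an active process rather than a one-time backup.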
Procedia PDF Downloads 29
13216 Probability-Based Damage Detection of Structures Using Kriging Surrogates and Enhanced Ideal Gas Molecular Movement Algorithm
Authors: M. R. Ghasemi, R. Ghiasi, H. Varaee
Abstract:
Surrogate models have received increasing attention for use in detecting damage of structures based on vibration modal parameters. However, uncertainties existing in the measured vibration data may lead to false or unreliable output from such models. In this study, an efficient approach based on Monte Carlo simulation is proposed to take into account the effect of uncertainties in developing a surrogate model. The probability of damage existence (PDE) is calculated based on the probability density functions of the undamaged and damaged states. The kriging technique allows one to genuinely quantify the surrogate error; therefore, it is chosen as the metamodeling technique. An enhanced version of the ideal gas molecular movement (EIGMM) algorithm is used as the main algorithm for model updating. The developed approach is applied to detect simulated damage in numerical models of a 72-bar space truss and a 120-bar dome truss. The simulation results show the proposed method can perform well in probability-based damage detection of structures with less computational effort compared to a direct finite element model. Keywords: probability-based damage detection (PBDD), kriging, surrogate modeling, uncertainty quantification, artificial intelligence, enhanced ideal gas molecular movement (EIGMM)
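The PDE computation can be illustrated with a minimal Monte Carlo sketch. The normal densities and the frequency values below are hypothetical stand-ins for the surrogate-predicted distributions of a modal feature in the undamaged and damaged states, not the paper's truss models:

```python
import math
import random

def normal_pdf(x, mu, sigma):
    """Density of a normal distribution N(mu, sigma^2) at x."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def probability_of_damage(measured, sigma_noise, mu_undamaged, mu_damaged,
                          sigma_state, n_samples=20000, seed=7):
    """Monte Carlo estimate of the probability of damage existence (PDE):
    perturb the measured modal feature by its measurement uncertainty and
    count how often the perturbed value is more likely under the damaged
    state's density than under the undamaged one."""
    rng = random.Random(seed)
    damaged_votes = 0
    for _ in range(n_samples):
        x = rng.gauss(measured, sigma_noise)  # propagate measurement noise
        if normal_pdf(x, mu_damaged, sigma_state) > normal_pdf(x, mu_undamaged, sigma_state):
            damaged_votes += 1
    return damaged_votes / n_samples

# Hypothetical natural frequency (Hz): undamaged ~12.0, damaged ~11.2
pde = probability_of_damage(measured=11.3, sigma_noise=0.2,
                            mu_undamaged=12.0, mu_damaged=11.2, sigma_state=0.3)
print(round(pde, 2))  # high probability that the element is damaged
```

In the paper's setting, the two state densities would come from the kriging surrogate (mean plus quantified surrogate error) rather than being assumed.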
Procedia PDF Downloads 245
13215 Impact of Tillage and Crop Establishment on Fertility and Sustainability of the Rice-Wheat Cropping System in Inceptisols of Varanasi, Up, India
Authors: Pramod Kumar Sharma, Pratibha Kumari, Udai Pratap Singh
Abstract:
In the Indo-Gangetic Plains of South-East Asia, the rice-wheat cropping system (RWCS) is dominant under conventional tillage (CT) without residue management, which leads to depletion of soil fertility and non-sustainable crop productivity. Hence, this investigation was planned to identify suitable natural resource management practices involving different tillage and crop establishment (TCE) methods along with crop residue, and their effects on the sustainability of the dominant cropping system through enhanced soil fertility and productivity. This study was conducted for two consecutive years, 2018-19 and 2019-20, on a long-term field experiment started in the year 2015-16, taking six different combinations of TCE methods, viz. CT, partial conservation agriculture (PCA, i.e., anchored residue of rice) and full conservation agriculture (FCA, i.e., anchored residue of rice and wheat) under RWCS, in terms of crop productivity, sustainability of soil health, and crop nutrition. Results showed that zero tillage direct-seeded rice (ZTDSR) - zero tillage wheat (ZTW) [FCA + green gram residue retention (RR)] recorded the highest yield attributes and yield for both crops. Compared to conventional tillage rice (CTR) - conventional tillage wheat (CTW) [residue removal (R0)], the soil quality parameters were improved significantly with ZTDSR-ZTW (FCA+RR). Overall, ZTDSR-ZTW (FCA+RR) had higher nutrient uptake by the crops than the CT-based treatments CTR-CTW (R0) and CTR-CTW (RI). These results showed significant gains in yield and resource utilization from the adoption of FCA; it may be a better alternative to the dominant tillage system, CT, in RWCS. Keywords: tillage and crop establishment, soil fertility, rice-wheat cropping system, sustainability
Procedia PDF Downloads 111
13214 Characterization of Tailings From Traditional Panning of Alluvial Gold Ore (A Case Study of Ilesa - Southwestern Nigeria Goldfield Tailings Dumps)
Authors: Olaniyi Awe, Adelana R. Adetunji, Abraham Adeleke
Abstract:
Field observation revealed numerous artisanal gold mining activities in the Ilesa gold belt of southwestern Nigeria. The possibility of alluvial and lode gold deposits in commercial quantities around this location is very high, as many resident artisanal gold miners have been mining and trading alluvial gold ore in the area for decades. Their major process for recovering solid gold from its ore is gravity concentration using the conventional panning method. This method is simple to learn and fast at recovering gold from alluvial ore, but its effectiveness rests on rules of thumb and the artisanal miners' experience in handling the gold ore panning tool while processing the ore. Research samples from five alluvial gold ore tailings dumps were collected and studied. Samples were subjected to particle size analysis and to mineralogical and elemental characterization using X-Ray Diffraction (XRD) and Particle-Induced X-ray Emission (PIXE) methods, respectively. The results showed that the tailings were composed mainly of quartz in association with albite, plagioclase, mica, gold, calcite, and sulphide minerals. The elemental composition analysis revealed a 15 ppm gold concentration in the -90 micron particle size fraction of one of the tailings dumps investigated. These results are significant. It is recommended that heaps of panning tailings be further reprocessed using other gold recovery methods, such as shaking tables, flotation, and controlled cyanidation, that can efficiently recover the fine gold particles previously lost to the panning tailings. The tailings sites should also be well controlled and monitored so that these heavy minerals do not find their way into surrounding streams and rivers, thereby causing health hazards. Keywords: gold ore, panning, PIXE, tailings, XRD
Procedia PDF Downloads 93
13213 Multi-Criteria Test Case Selection Using Ant Colony Optimization
Authors: Niranjana Devi N.
Abstract:
Test case selection chooses the subset of only the fit test cases and removes the unfit, ambiguous, redundant, and unnecessary test cases, which in turn improves the quality and reduces the cost of software testing. Test case optimization is the problem of finding the best subset of test cases from a pool of test cases to be audited, meeting all the objectives of testing concurrently. However, most research has evaluated the fitness of test cases on only a single parameter, fault-detecting capability, and optimized the test cases using a single objective. In the proposed approach, nine parameters are considered for test case selection, and the best subset of parameters for test case selection is obtained using an Interval Type-2 Fuzzy Rough Set. Test case selection is done in two stages. The first stage is a fuzzy entropy-based filtration technique, used for estimating and reducing the ambiguity in test case fitness evaluation and selection. The second stage is an ant colony optimization-based wrapper technique with a forward search strategy, employed to select test cases from the reduced test suite of the first stage. The results are evaluated using the coverage parameters Precision, Recall, F-Measure, APSC, APDC, and SSR. The experimental evaluation demonstrates that considerable computational effort can be avoided by this approach. Keywords: ant colony optimization, fuzzy entropy, interval type-2 fuzzy rough set, test case selection
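The second-stage ACO wrapper can be sketched in simplified form. The coverage relation below is a hypothetical toy suite, and the single requirement-coverage objective stands in for the paper's nine parameters:

```python
import random

def aco_select_tests(coverage, n_requirements, n_ants=20, iterations=30, seed=3):
    """Simplified ant-colony wrapper for test case selection: each ant
    greedily builds a subset (forward search) until all requirements are
    covered, biased by pheromone; pheromone on tests appearing in the
    smallest covering subset found so far is reinforced."""
    rng = random.Random(seed)
    tests = list(coverage)
    pheromone = {t: 1.0 for t in tests}
    best = tests[:]  # trivially covering subset: all tests
    for _ in range(iterations):
        for _ in range(n_ants):
            uncovered, subset = set(range(n_requirements)), []
            candidates = tests[:]
            while uncovered and candidates:
                # selection probability ∝ pheromone × heuristic (new coverage)
                weights = [pheromone[t] * (1 + len(coverage[t] & uncovered))
                           for t in candidates]
                t = rng.choices(candidates, weights=weights)[0]
                candidates.remove(t)
                if coverage[t] & uncovered:  # keep only tests that add coverage
                    subset.append(t)
                    uncovered -= coverage[t]
            if not uncovered and len(subset) < len(best):
                best = subset
        for t in pheromone:                  # evaporate, then reinforce
            pheromone[t] *= 0.9
        for t in best:
            pheromone[t] += 1.0
    return best

# Hypothetical suite: which requirements each test case covers
coverage = {"t1": {0, 1}, "t2": {1, 2}, "t3": {0, 2, 3}, "t4": {3}, "t5": {1}}
best = aco_select_tests(coverage, n_requirements=4)
covered = set().union(*(coverage[t] for t in best))
assert covered == {0, 1, 2, 3}  # the selected subset still covers everything
```

In the full method, the ants' heuristic would combine all nine fitness parameters after the fuzzy entropy filtration stage.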
Procedia PDF Downloads 671
13212 The Effect of Different Strength Training Methods on Muscle Strength, Body Composition and Factors Affecting Endurance Performance
Authors: Shaher A. I. Shalfawi, Fredrik Hviding, Bjornar Kjellstadli
Abstract:
The main purpose of this study was to measure the effect of two different strength training methods on muscle strength, muscle mass, fat mass and endurance factors. Fourteen physical education students agreed to participate in this study. The participants were then randomly divided into three groups: traditional training group (TTG), cluster training group (CTG) and control group (CG). TTG consisted of 4 participants aged (mean ± SD) 22.3 ± 1.5 years, with body mass 79.2 ± 15.4 kg and height 178.3 ± 11.9 cm. CTG consisted of 5 participants aged 22.2 ± 3.5 years, with body mass 81.0 ± 24.0 kg and height 180.2 ± 12.3 cm. CG consisted of 5 participants aged 22 ± 2.8 years, with body mass 77 ± 19 kg and height 174 ± 6.7 cm. The participants underwent a hypertrophy strength training program twice a week consisting of 4 sets of 10 reps at 70% of one-repetition maximum (1RM), using the barbell squat and barbell bench press, for 8 weeks. The CTG performed 2 x 5 reps using 10 s recovery between repetitions and 50 s recovery between sets, while the TTG performed 4 sets of 10 reps with 90 s recovery between sets. Pre- and post-tests were administered to assess body composition (weight, muscle mass, and fat mass), 1RM (bench press and barbell squat) and a laboratory endurance test (Bruce protocol). Instruments used to collect the data were a Tanita BC-601 scale (Tanita, Illinois, USA), a Woodway treadmill (Woodway, Wisconsin, USA) and a Vyntus CPX breath-to-breath system (Jaeger, Hoechberg, Germany). Analysis was conducted on all measured variables, including time to peak VO2, peak VO2, heart rate (HR) at peak VO2, respiratory exchange ratio (RER) at peak VO2, and number of breaths per minute. The results indicate an increase in 1RM performance after 8 weeks of training. The change in 1RM squat was 30 ± 3.8 kg for the TTG, 28.6 ± 8.3 kg for the CTG and 10.3 ± 13.8 kg for the CG. Similarly, the change in 1RM bench press was 9.8 ± 2.8 kg for the TTG, 7.4 ± 3.4 kg for the CTG and 4.4 ± 3.4 kg for the CG.
The within-group analysis of the oxygen consumption measured during the incremental exercise indicated that the TTG had only a statistically significant increase in RER, from 1.16 ± 0.04 to 1.23 ± 0.05 (P < 0.05). The CTG had a statistically significant improvement in HR at peak VO2, from 186 ± 24 to 191 ± 12 beats per minute (P < 0.05), and in RER at peak VO2, from 1.11 ± 0.06 to 1.18 ± 0.05 (P < 0.05). Finally, the CG had only a statistically significant increase in RER at peak VO2, from 1.11 ± 0.07 to 1.21 ± 0.05 (P < 0.05). The between-group analysis showed no statistically significant differences between the groups in any of the variables measured during the incremental exercise, including changes in muscle mass, fat mass, and weight (kg). The results indicate a similar effect of hypertrophy strength training on untrained subjects irrespective of the training method used. Because there were no notable changes in body-composition measures, the results suggest that the improvements in performance observed in all groups are most probably due to neuromuscular adaptation to training. Keywords: hypertrophy strength training, cluster set, Bruce protocol, peak VO2
Procedia PDF Downloads 254
13211 Detection of Clipped Fragments in Speech Signals
Authors: Sergei Aleinik, Yuri Matveev
Abstract:
In this paper, a novel method for the detection of clipping in speech signals is described. It is shown that the new method has better performance than known clipping detection methods, is easy to implement, and is robust to changes in signal amplitude, size of data, etc. Statistical simulation results are presented. Keywords: clipping, clipped signal, speech signal processing, digital signal processing
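Although the paper's method is not detailed in the abstract, a baseline detector helps fix ideas: hard clipping leaves runs of consecutive samples stuck at a constant extreme value. A minimal sketch (real clipped recordings are often rescaled or noisy, so this simple amplitude rule is only a starting point, not the authors' algorithm):

```python
import numpy as np

def detect_clipping(signal, min_run=3, eps=1e-9):
    """Flag clipped fragments: runs of at least `min_run` consecutive
    samples stuck at the signal's extreme amplitude. Returns a list of
    (start, end) index pairs, with `end` exclusive."""
    peak = np.max(np.abs(signal))
    stuck = np.abs(np.abs(signal) - peak) < eps
    regions, run_start = [], None
    for i, flag in enumerate(stuck):
        if flag and run_start is None:
            run_start = i
        elif not flag and run_start is not None:
            if i - run_start >= min_run:
                regions.append((run_start, i))
            run_start = None
    if run_start is not None and len(signal) - run_start >= min_run:
        regions.append((run_start, len(signal)))
    return regions

# A 5 Hz sine sampled at 1 kHz, hard-clipped at 80% of its amplitude
t = np.linspace(0, 1, 1000, endpoint=False)
clean = np.sin(2 * np.pi * 5 * t)
clipped = np.clip(clean, -0.8, 0.8)
assert detect_clipping(clean) == []          # isolated peaks, no flat runs
assert len(detect_clipping(clipped)) == 10   # 5 cycles -> 10 clipped fragments
```

A robust detector must additionally handle post-clipping gain changes and quantization noise, which is where amplitude-histogram methods like the one the paper proposes come in.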
Procedia PDF Downloads 397
13210 Cognitive Methods for Detecting Deception During the Criminal Investigation Process
Authors: Laid Fekih
Abstract:
Background: It is difficult to detect lying, deception, and misrepresentation just by looking at verbal or non-verbal expression during the criminal investigation process, despite the common belief that it is possible to tell whether a person is lying or telling the truth simply by the way they act or behave. The process of detecting lies and deception during criminal investigation needs more study and research to overcome the difficulties facing investigators. Method: The present study aimed to identify the effectiveness of cognitive methods and techniques in detecting deception during criminal investigation. It adopted the quasi-experimental method and covered a sample of 20 defendants distributed randomly into two homogeneous groups: an experimental group of 10 defendants subjected to criminal investigation with cognitive deception detection techniques applied, and a second group of 10 defendants subjected to the direct investigation method. The tool used was a guided interview based on models of investigative questions following the cognitive deception detection approach, which consists of three of Vrij's techniques: imposing cognitive load, encouraging the provision of more information, and asking unexpected questions, alongside the direct investigation method. Results: Results revealed a significant difference between the two groups in terms of lie detection accuracy in favour of the defendants subjected to criminal investigation with cognitive techniques; the cognitive deception detection approach produced superior total accuracy rates both with human observers and through an analysis of objective criteria. The cognitive deception detection approach produced superior accuracy in truth detection (71%) and deception detection (70%) compared to the direct investigation method (truth detection: 52%; deception detection: 49%).
Conclusion: The study recommended that practitioners use a cognitive deception detection technique, as they will correctly classify more individuals than with a direct investigation method. Keywords: the cognitive lie detection approach, deception, criminal investigation, mental health
Procedia PDF Downloads 71