Search results for: Arrhenius methodology
4909 Clinical Validation of C-PDR Methodology for Accurate Non-Invasive Detection of Helicobacter pylori Infection
Authors: Suman Som, Abhijit Maity, Sunil B. Daschakraborty, Sujit Chaudhuri, Manik Pradhan
Abstract:
Background: Helicobacter pylori is a common and important human pathogen and the primary cause of peptic ulcer disease and gastric cancer. Currently, H. pylori infection is detected by both invasive and non-invasive methods, but the diagnostic accuracy is not satisfactory. Aim: To establish an optimal diagnostic cut-off value for the 13C-Urea Breath Test (13C-UBT) to detect H. pylori infection and to evaluate a novel c-PDR methodology to overcome the inconclusive grey zone. Materials and Methods: All 83 subjects first underwent upper-gastrointestinal endoscopy followed by rapid urease test and histopathology; based on these results, 49 subjects were classified as H. pylori positive and 34 as negative. After an overnight fast, patients were given 4 g of citric acid in 200 ml of water, and 10 minutes after ingestion of this test meal, a baseline exhaled breath sample was collected. Thereafter, an oral dose of 75 mg of 13C-urea dissolved in 50 ml of water was given, and breath samples were collected at 15-minute intervals up to 90 minutes and analysed by laser-based high-precision cavity-enhanced spectroscopy. Results: We studied the excretion kinetics of 13C isotope enrichment (expressed as δDOB13C ‰) in exhaled breath samples and found maximum enrichment around 30 minutes in H. pylori positive patients, attributable to acid-mediated stimulation of urease enzyme activity with maximal acidification occurring within 30 minutes; no such significant isotopic enrichment was observed for H. pylori negative individuals. Using a Receiver Operating Characteristic (ROC) curve, an optimal diagnostic cut-off value of δDOB13C ‰ = 3.14 at 30 minutes was determined, exhibiting 89.16% accuracy. To overcome the grey-zone problem, we then explored the percentage dose of 13C recovered per hour, i.e. 13C-PDR (%/hr), and the cumulative percentage dose of 13C recovered, i.e. c-PDR (%), in exhaled breath samples for the present 13C-UBT.
We further explored the diagnostic accuracy of the 13C-UBT by constructing a ROC curve using c-PDR (%) values; an optimal cut-off value was estimated to be c-PDR = 1.47% at 60 minutes, exhibiting 100% diagnostic sensitivity, 100% specificity, and 100% accuracy of the 13C-UBT for detection of H. pylori infection. We also elucidated the gastric emptying process of the present 13C-UBT for H. pylori positive patients: the maximal emptying rate was found at 36 minutes, and the half-emptying time was found to be 45 minutes. Conclusions: The present study demonstrates the importance of the c-PDR methodology in overcoming the grey-zone problem of the 13C-UBT for accurate determination of infection without risk of diagnostic errors, making it a sufficiently robust and novel method for accurate and fast non-invasive diagnosis of H. pylori infection for large-scale screening purposes.
Keywords: 13C-urea breath test, c-PDR methodology, grey zone, Helicobacter pylori
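The ROC cut-off selection described above can be sketched in a few lines. A common criterion (an assumption here; the abstract does not state which criterion the authors used) is the Youden index, J = sensitivity + specificity - 1. The δDOB-like values below are synthetic illustrations, not the study's data:

```python
# Minimal sketch: optimal diagnostic cut-off from a ROC sweep via Youden's J.
# Cohort values are synthetic, NOT the measurements reported in the abstract.
import numpy as np

def youden_cutoff(values, labels):
    """Return (cutoff, sensitivity, specificity) maximising J = sens + spec - 1.

    values: measurement (e.g. delta-DOB in permil); labels: 1 = infected, 0 = not.
    A subject is called positive when value >= cutoff.
    """
    values = np.asarray(values, dtype=float)
    labels = np.asarray(labels, dtype=int)
    best = (None, 0.0, 0.0, -np.inf)
    for c in np.unique(values):
        pred = values >= c
        sens = np.mean(pred[labels == 1])   # true-positive rate
        spec = np.mean(~pred[labels == 0])  # true-negative rate
        j = sens + spec - 1.0
        if j > best[3]:
            best = (c, sens, spec, j)
    return best[0], best[1], best[2]

# Synthetic cohort sized like the study's (49 positive, 34 negative):
rng = np.random.default_rng(0)
pos = rng.normal(6.0, 1.5, 49)   # hypothetical infected subjects
neg = rng.normal(1.0, 1.0, 34)  # hypothetical uninfected subjects
vals = np.concatenate([pos, neg])
labs = np.concatenate([np.ones(49, int), np.zeros(34, int)])
cutoff, sens, spec = youden_cutoff(vals, labs)
```

In practice the same sweep would be run once on δDOB values at 30 minutes and once on c-PDR values at 60 minutes.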
Procedia PDF Downloads 302
4908 On the Possibility of Real Time Characterisation of Ambient Toxicity Using Multi-Wavelength Photoacoustic Instrument
Authors: Tibor Ajtai, Máté Pintér, Noémi Utry, Gergely Kiss-Albert, Andrea Palágyi, László Manczinger, Csaba Vágvölgyi, Gábor Szabó, Zoltán Bozóki
Abstract:
To the best of the authors' knowledge, we experimentally demonstrate here for the first time a quantified correlation between real-time measured optical features of the ambient aerosol and off-line measured toxicity data. Based on these correlations, we present a novel methodology for real-time characterisation of ambient toxicity using multi-wavelength aerosol-phase photoacoustic measurement. Ambient carbonaceous particulate matter is one of the most intensively studied atmospheric constituents in climate science today. Beyond its climatic impact, atmospheric soot also plays an important role as an air pollutant that harms human health. According to the latest scientific assessments, ambient soot is the second most important anthropogenic emission source in climate terms, while in health terms it is one of the most harmful atmospheric constituents. Despite its importance, no generally accepted standard methodology for the quantitative determination of ambient toxicity is available yet. Ambient toxicology measurement is dominated by posterior analysis of filter-accumulated aerosol with limited time resolution. Most toxicological studies are based on operational definitions using different measurement protocols, so comprehensive analysis of the existing data set is often severely limited. The situation is further complicated by the fact that, even during its relatively short residence time, the physicochemical features of the aerosol can be masked significantly by the prevailing ambient factors. Therefore, improving the time resolution of the existing methodology and developing real-time methods for air quality monitoring are pressing issues in air pollution research.
Over the last decades, many experimental studies have verified that there is a relation between the chemical composition and the absorption features, quantified by the Absorption Angström Exponent (AAE), of carbonaceous particulate matter. Although the scientific community agrees that PhotoAcoustic Spectroscopy (PAS) is so far the only methodology that can measure light absorption by aerosol in an accurate and reliable way, multi-wavelength PAS instruments able to selectively characterise the wavelength dependence of absorption have become available only in the last decade. In this study, the first results of an intensive measurement campaign focusing on the physicochemical and toxicological characterisation of ambient particulate matter are presented. We demonstrate the complete microphysical characterisation of wintertime urban ambient aerosol, including optical absorption and scattering as well as size distribution, using our recently developed state-of-the-art multi-wavelength photoacoustic instrument (4λ-PAS), an integrating nephelometer (Aurora 3000), and a scanning mobility particle sizer with optical particle counter (SMPS+C). Beyond this on-line characterisation of the ambient aerosol, we also present the results of eco-, cyto- and genotoxicity measurements based on posterior analysis of filter-accumulated aerosol with 6 h time resolution. We demonstrate the diurnal variation of the toxicities and of the AAE data deduced directly from the multi-wavelength absorption measurement results.
Keywords: photoacoustic spectroscopy, absorption Angström exponent, toxicity, Ames test
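Deducing the AAE from multi-wavelength absorption data, as described above, amounts to fitting the common power law b_abs(λ) = K·λ^(-AAE) in log-log space. The wavelengths and coefficients below are illustrative assumptions, not values from the campaign:

```python
# Minimal sketch: Absorption Angstrom Exponent from multi-wavelength
# absorption coefficients, assuming b_abs(lambda) = K * lambda**(-AAE).
# Wavelengths and coefficients are made up for illustration.
import numpy as np

def absorption_angstrom_exponent(wavelengths_nm, b_abs):
    """AAE is minus the slope of ln(b_abs) versus ln(lambda), by least squares."""
    slope, _ = np.polyfit(np.log(wavelengths_nm), np.log(b_abs), 1)
    return -slope

wl = np.array([266.0, 355.0, 532.0, 1064.0])  # nm, a hypothetical 4-band set
true_aae = 1.6                                # e.g. soot mixed with organics
b = 5.0e4 * wl ** (-true_aae)                 # synthetic, noise-free coefficients
aae = absorption_angstrom_exponent(wl, b)
```

On real data the fit residuals also indicate how well a single power law describes the spectrum.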
Procedia PDF Downloads 304
4907 UF as Pretreatment of RO for Tertiary Treatment of Biologically Treated Distillery Spentwash
Authors: Pinki Sharma, Himanshu Joshi
Abstract:
Distillery spentwash contains high chemical oxygen demand (COD), biological oxygen demand (BOD), color, total dissolved solids (TDS) and other contaminants even after biological treatment. The effluent cannot be discharged as such into surface water bodies or onto land without further treatment. Reverse osmosis (RO) treatment plants have been installed in many distilleries at the tertiary level, but at most sites these plants do not work properly due to the high concentration of organic matter and other contaminants in biologically treated spentwash. To make membrane treatment a proven and reliable technology, proper pre-treatment is mandatory. In the present study, ultrafiltration (UF) was evaluated as a pre-treatment to RO at the tertiary stage. The operating parameters, namely initial pH (pHo: 2–10), trans-membrane pressure (TMP: 4–20 bar) and temperature (T: 15–43°C), were varied in the experiments with the UF system. The experiments were optimized at different operating conditions in terms of COD, color, TDS and TOC removal using response surface methodology (RSM) with a central composite design. The results showed removal of COD, color and TDS of 62%, 93.5% and 75.5%, respectively, with UF at the optimized conditions, and the permeate flux increased from 17.5 l/m²/h (RO alone) to 38 l/m²/h (UF-RO). The performance of the RO system was greatly improved both in terms of pollutant removal and water recovery.
Keywords: bio-digested distillery spentwash, reverse osmosis, response surface methodology, ultra-filtration
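The central composite design behind an RSM study like this one can be generated mechanically. A sketch for the three UF factors follows; it assumes, for illustration only, that the stated ranges are the ±1 factorial levels and that a rotatable design (alpha = (2^k)^(1/4)) with six centre runs was used — the abstract does not specify these choices:

```python
# Minimal sketch: coded central composite design (CCD) points and their
# mapping back to physical factor ranges. Run counts and alpha are
# assumptions, not the study's actual run sheet.
import itertools
import numpy as np

def central_composite(k, n_center=6):
    """Coded CCD: 2^k factorial points, 2k axial points at +/-alpha, centre runs."""
    alpha = (2 ** k) ** 0.25                       # rotatable-design alpha
    factorial = np.array(list(itertools.product([-1.0, 1.0], repeat=k)))
    axial = np.zeros((2 * k, k))
    for i in range(k):
        axial[2 * i, i], axial[2 * i + 1, i] = -alpha, alpha
    center = np.zeros((n_center, k))
    return np.vstack([factorial, axial, center])

def decode(coded, lows, highs):
    """Map coded levels (-1..+1) to physical units."""
    mid = (np.asarray(highs) + np.asarray(lows)) / 2.0
    half = (np.asarray(highs) - np.asarray(lows)) / 2.0
    return mid + coded * half

design = central_composite(3)                       # pH, TMP, temperature
runs = decode(design, lows=[2.0, 4.0, 15.0], highs=[10.0, 20.0, 43.0])
```

Note that axial points at ±alpha fall slightly outside the ±1 ranges; that is a property of the rotatable CCD, not an error.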
Procedia PDF Downloads 347
4906 Design of a Customized Freshly-Made Fruit Salad and Juices Vending Machine
Authors: María Laura Guevara Campos
Abstract:
The increasing number of vending machines makes it easy for people to find them in stores, universities, workplaces, and even hospitals. These machines usually offer products with high contents of sugar and fat, which, if consumed regularly, can lead to serious health problems such as overweight and obesity. Additionally, the energy consumption of these machines tends to be high, which has an environmental impact as well. In order to promote the consumption of healthy food, a vending machine was designed to give the customer the opportunity to choose between a customized fruit salad and a customized fruit juice, both prepared instantly with the ingredients selected by the customer. The main parameters considered in designing the machine were: the storage of the fruits preferred in a salad and/or a juice according to a survey, the size of the machine, the use of ecological containers, and the overall energy consumption. The design followed the methodology proposed by the German Association of Engineers for mechatronic systems, which breaks the design process into several stages, from the elaboration of a list of requirements through the establishment of the working principles and design concepts to the final design of the machine, which was produced in 3D modelling software. With the design of this machine, the aim is to contribute to the development and implementation of healthier vending machines that offer freshly made products, an area not widely addressed at present.
Keywords: design, design methodology, mechatronics systems, vending machines
Procedia PDF Downloads 133
4905 Design and Optimization for a Compliant Gripper with Force Regulation Mechanism
Authors: Nhat Linh Ho, Thanh-Phong Dao, Shyh-Chour Huang, Hieu Giang Le
Abstract:
This paper presents the design and optimization of a compliant gripper. The gripper is constructed on the concept of a compliant mechanism with flexure hinges. A passive force regulation mechanism is presented to control the grasping force on a micro-sized object instead of using a force sensor. The force regulation mechanism is designed using planar springs. The gripper is expected to achieve a large range of displacement in order to handle objects of various sizes. First, the statics and dynamics of the gripper are investigated using finite element analysis in ANSYS software. The design parameters of the gripper are then optimized via the Taguchi method, using an L9 orthogonal array to establish the experimental matrix. Subsequently, the signal-to-noise ratio is analyzed to find the optimal solution. Finally, response surface methodology is employed to model the relationship between the design parameters and the output displacement of the gripper, and the design-of-experiment method is used for sensitivity analysis to determine the effect of each parameter on the displacement. The results showed that the compliant gripper can move with a large displacement of 213.51 mm, and the force regulation mechanism is expected to be useful for high-precision positioning systems.
Keywords: flexure hinge, compliant mechanism, compliant gripper, force regulation mechanism, Taguchi method, response surface methodology, design of experiment
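Since the objective here is a large output displacement, the signal-to-noise analysis presumably uses Taguchi's "larger-the-better" criterion (an assumption; the abstract does not name the criterion). The replicate displacements below are invented for illustration:

```python
# Minimal sketch: Taguchi "larger-the-better" S/N ratio per experimental run,
# S/N = -10*log10(mean(1/y^2)). The displacement replicates are hypothetical,
# not the study's L9 data.
import numpy as np

def sn_larger_the_better(replicates):
    """Row-wise S/N ratio in dB; higher means larger, more robust response."""
    y = np.asarray(replicates, dtype=float)
    return -10.0 * np.log10(np.mean(1.0 / y**2, axis=1))

# Three replicate displacement measurements (mm) for three of the nine runs:
runs = np.array([
    [180.2, 178.9, 181.5],
    [210.7, 212.3, 209.8],
    [195.4, 193.8, 196.1],
])
sn = sn_larger_the_better(runs)
best_run = int(np.argmax(sn))  # run with the most robust large displacement
```

Averaging these S/N values per factor level then ranks the factors, which is how the optimal level combination is read off an L9 array.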
Procedia PDF Downloads 332
4904 Scrum Challenges and Mitigation Practices in Global Software Development of an Integrated Learning Environment: Case Study of Science, Technology, Innovation, Mathematics, Engineering for the Young
Authors: Evgeniia Surkova, Manal Assaad, Hleb Makeyeu, Juho Makio
Abstract:
The main objective of the STIMEY (Science, Technology, Innovation, Mathematics, Engineering for the Young) project is the delivery of a hybrid learning environment that combines multi-level components such as social media concepts, robotic artefacts, and radio, among others. It is based on a well-researched pedagogical framework designed to attract European youths to STEM (science, technology, engineering, and mathematics) education and careers. To develop and integrate these various components, STIMEY is executed in iterative research cycles leading to progressive improvements. Scrum was the development methodology of choice in the project, as studies indicate its benefits as an agile methodology in global software development, especially for e-learning and integrated learning projects. This paper describes the project partners' experience with the Scrum framework, discussing the challenges faced in its implementation and the mitigation practices employed. The authors conclude by exploring user experience tools and principles for future research, as a novel direction in supporting the Scrum development team.
Keywords: e-learning, global software development, scrum, STEM education
Procedia PDF Downloads 179
4903 Supply Chain Risk Management: A Meta-Study of Empirical Research
Authors: Shoufeng Cao, Kim Bryceson, Damian Hine
Abstract:
The existing supply chain risk management (SCRM) research is currently chaotic and somewhat disorganized, and the topic has been addressed conceptually more often than empirically. This paper, using both qualitative and quantitative data, employs a modified meta-study method to investigate the SCRM empirical research published in quality journals over a period of 12 years (2004-2015). The purpose is to outline the extant research trends and the employed research methodologies (i.e., research method, data collection and data analysis) across the sub-field in order to guide future research. The synthesized findings indicate that empirical study of risk ripple effects along an entire supply chain, of industry-specific supply chain risk management, and of global/export supply chain risk management has not yet received the attention it deserves in the SCRM field. It is further suggested that future empirical research should employ multiple and/or mixed methods and multi-source data collection techniques to reduce common-method bias and single-source bias, thus improving research validity and reliability. In conclusion, this paper helps to stimulate more quality empirical research in the SCRM field by identifying promising research directions and providing methodology guidelines.
Keywords: empirical research, meta-study, methodology guideline, research direction, supply chain risk management
Procedia PDF Downloads 318
4902 Catalytic Ammonia Decomposition: Cobalt-Molybdenum Molar Ratio Effect on Hydrogen Production
Authors: Elvis Medina, Alejandro Karelovic, Romel Jiménez
Abstract:
Catalytic ammonia decomposition represents an attractive alternative due to its high H₂ content (17.8% w/w) and a product stream free of COₓ, among other advantages; however, challenges remain to be addressed for its consolidation as an H₂ chemical storage technology, especially those focused on the synthesis of efficient bimetallic catalytic systems as an alternative to the price and scarcity of ruthenium, the most active catalyst reported. In this sense, from the perspective of rational catalyst design and the adjustment of the main catalytic activity descriptor, a screening of supported catalysts with different compositional settings of cobalt and molybdenum is presented to evaluate their effect on the catalytic ammonia decomposition rate. Subsequently, a kinetic study on the supported monometallic Co and Mo catalysts, as well as on the most active bimetallic CoMo catalyst, is shown. The catalysts, supported on γ-alumina, were synthesized using the Charge Enhanced Dry Impregnation (CEDI) method, all with 5% w/w metal loading. Seeking to maintain uniform dispersion, the catalysts were oxidized and activated (in-situ activation) using flows of anhydrous air and hydrogen, respectively, under the same conditions: 40 ml min⁻¹ and 5 °C min⁻¹ from room temperature to 600 °C. Catalytic tests were carried out in a fixed-bed reactor, confirming the absence of transport limitations as well as an approach to equilibrium below 1 x 10⁻⁴. The reaction rate on all catalysts was measured between 400 and 500 °C at 53.09 kPa NH₃. The synergy reported theoretically (DFT) for bimetallic catalysts was confirmed experimentally. Specifically, the catalyst composed of 75 mol% cobalt proved to be the most active in the experiments, followed by the monometallic cobalt and molybdenum catalysts, in the order of activity referred to in the literature.
A kinetic study was performed at 10.13–101.32 kPa NH₃ and at four equidistant temperatures between 437 and 475 °C. The data were fitted to an LHHW-type model, which considers the desorption of nitrogen atoms from the active-phase surface as the rate-determining step (RDS). The regression analyses were carried out under an integral regime, using a minimization algorithm based on SLSQP. The physical meaning of the parameters adjusted in the kinetic model, such as the RDS rate constant (k₅) and the lumped adsorption constant of the quasi-equilibrated steps (α), was confirmed through their Arrhenius- and Van't Hoff-type behavior (R² > 0.98), respectively. From an energetic perspective, the activation energies for cobalt, cobalt-molybdenum, and molybdenum were 115.2, 106.8, and 177.5 kJ mol⁻¹, respectively. With this evidence, and considering the volcano shape described by the ammonia decomposition rate as a function of the metal composition ratio, the synergistic behavior of the system is clearly observed. However, since characterizations by XRD and TEM were inconclusive, the formation of intermetallic compounds must still be verified using HRTEM-EDS. From this point onwards, our objective is to incorporate into the kinetic expressions parameters that consider both compositional and structural elements and to explore how these can maximize or influence H₂ production.
Keywords: CEDI, hydrogen carrier, LHHW, RDS
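The Arrhenius check mentioned above is a linear regression of ln k against 1/T, whose slope gives the apparent activation energy. The sketch below regenerates synthetic rate constants from the reported Co-Mo value (106.8 kJ/mol) with an assumed pre-exponential factor, then recovers Ea — it does not use the study's measured k₅ values:

```python
# Minimal sketch: apparent activation energy from an Arrhenius fit,
# ln k = ln A - Ea/(R*T). The k values are synthetic (generated from the
# reported Ea with an assumed pre-exponential factor), not measured data.
import numpy as np

R = 8.314  # J mol^-1 K^-1

def arrhenius_fit(T_kelvin, k):
    """Return (Ea in J/mol, pre-exponential A) by least squares on ln k vs 1/T."""
    slope, intercept = np.polyfit(1.0 / np.asarray(T_kelvin), np.log(k), 1)
    return -slope * R, np.exp(intercept)

# Four equidistant temperatures between 437 and 475 C, as in the study:
T = np.array([437.0, 449.67, 462.33, 475.0]) + 273.15
Ea_true = 106.8e3                        # J/mol, the reported Co-Mo value
k = 2.0e5 * np.exp(-Ea_true / (R * T))   # synthetic rate constants
Ea, A = arrhenius_fit(T, k)
```

The same fit on ln α versus 1/T (a Van't Hoff-type plot) would yield the lumped adsorption enthalpy.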
Procedia PDF Downloads 61
4901 Error Analysis in Academic Writing of EFL Learners: A Case Study for Undergraduate Students at Pathein University
Authors: Aye Pa Pa Myo
Abstract:
Writing in English is regarded as a complex process for learners of English as a foreign language, and committing errors in writing is an inevitable part of language learning. Academic writing in particular is quite difficult for most students to manage well enough to obtain good scores, and students commonly commit errors when they attempt it. Error analysis deals with identifying and detecting these errors and explaining the reasons for their occurrence. In this paper, the researcher examines the common errors of undergraduate students in their academic writing at Pathein University. The purpose of this research is to investigate the errors which students usually commit in academic writing and to find better ways of correcting these errors in EFL classrooms. Fifty third-year non-English-specialization students attending Pathein University were selected as participants. The research took one month and was conducted with a mixed methodology. Two mini-tests were used as research tools, and data were collected quantitatively. The findings indicate that most of the students noticed their common errors after receiving the necessary input and committed fewer of these errors in the second mini-test; hence, the findings will be supportive of further research on error analysis in academic writing.
Keywords: academic writing, error analysis, EFL learners, mini-tests, mixed methodology
Procedia PDF Downloads 133
4900 Developing a Knowledge-Based Lean Six Sigma Model to Improve Healthcare Leadership Performance
Authors: Yousuf N. Al Khamisi, Eduardo M. Hernandez, Khurshid M. Khan
Abstract:
Purpose: This paper presents a Knowledge-Based (KB) model using Lean Six Sigma (L6σ) principles to enhance the performance of healthcare leadership. Design/methodology/approach: Using L6σ principles to enhance healthcare leaders' performance requires a pre-assessment of the healthcare organisation's capabilities. The model is developed using a rule-based KB system approach. The KB system embeds Gauging Absence of Pre-requisites (GAP) for benchmarking and the Analytical Hierarchy Process (AHP) for prioritization. A comprehensive literature review covers the main contents of the model, with a typical output of the GAP analysis and AHP. Findings: The proposed KB system benchmarks the current position of healthcare leadership against an ideal benchmark (resulting from extensive evaluation by the KB/GAP/AHP system of international leadership concepts in healthcare environments). Research limitations/implications: Future work includes validating the implementation model in healthcare environments around the world. Originality/value: This paper presents a novel application of a hybrid KB system combining the GAP and AHP methodologies. It implements L6σ principles to enhance healthcare performance. This approach assists healthcare leaders' decision making in reaching performance improvement against a best-practice benchmark.
Keywords: Lean Six Sigma (L6σ), Knowledge-Based System (KBS), healthcare leadership, Gauge Absence Prerequisites (GAP), Analytical Hierarchy Process (AHP)
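The AHP prioritization step embedded in such a KB system reduces to computing the principal eigenvector of a pairwise-comparison matrix and checking Saaty's consistency ratio. The 3x3 matrix below, comparing three hypothetical leadership criteria, is illustrative only and is not taken from the paper:

```python
# Minimal sketch of AHP prioritisation: priorities are the principal
# eigenvector of a pairwise-comparison matrix; the consistency ratio (CR)
# should stay below ~0.1. Criteria and judgements here are hypothetical.
import numpy as np

def ahp_priorities(M):
    """Return (normalised priority vector, consistency ratio)."""
    M = np.asarray(M, dtype=float)
    vals, vecs = np.linalg.eig(M)
    i = np.argmax(vals.real)               # principal eigenvalue lambda_max
    w = np.abs(vecs[:, i].real)
    w = w / w.sum()
    n = M.shape[0]
    ci = (vals[i].real - n) / (n - 1)      # consistency index
    ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]    # Saaty's random indices
    return w, ci / ri

# Pairwise judgements for three hypothetical criteria, e.g.
# "communication" vs "process improvement" vs "patient safety":
M = [[1.0,     3.0, 0.5],
     [1 / 3.0, 1.0, 0.25],
     [2.0,     4.0, 1.0]]
w, cr = ahp_priorities(M)
```

A CR above 0.1 would flag the expert judgements as too inconsistent to rank the GAP items reliably.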
Procedia PDF Downloads 167
4899 Teaching Creative Thinking and Writing to Simultaneous Bilinguals: A Longitudinal Study of 6-7 Years Old English and Punjabi Language Learners
Authors: Hafiz Muhammad Fazalehaq
Abstract:
This paper documents the results of a longitudinal study of two bilingual children who speak English and Punjabi simultaneously. Their father is a native English speaker, whereas their mother speaks Punjabi; the mother speaks both languages, while the father speaks only English. At the age of six, these children had difficulty with creative thinking and, consequently, with creative writing. The first task for the researcher was therefore to entice the children to think creatively. Various methodologies and techniques were used to encourage creative thinking, which in turn leads to creative writing. The children were first exposed to numerous sources, including videos, photographs, texts and audio recordings, in order to give them a taste of creative genres (stories in this case). They were encouraged to create their own stories, sometimes with photographs and sometimes using their favorite toys. At a second stage, they were asked to write about an event or incident. After that, they were motivated to create new stories and write them down. The length of their creative writing varied from a few sentences to two standard pages. After this six-month study, the researcher was able to develop a ten-step methodology for creating and improving the creative thinking and creative writing skills of the subjects under study. This ten-step methodology entices and motivates the learner to think creatively in order to produce a creative piece.
Keywords: bilinguals, creative thinking, creative writing, simultaneous bilingual
Procedia PDF Downloads 352
4898 Non-Destructive Evaluation for Physical State Monitoring of an Angle Section Thin-Walled Curved Beam
Authors: Palash Dey, Sudip Talukdar
Abstract:
In this work, a cross-breed approach is presented for obtaining both the damage intensity and the location of damage existing in thin-walled members. The approach is developed based on response surface methodology (RSM) and a genetic algorithm (GA). A theoretical finite element (FE) model of a cracked angle-section thin-walled curved beam is linked to the developed approach to carry out trial experiments and generate response surface functions (RSFs) of free, forced and heterogeneous dynamic response data. Subsequently, the error between the computed response surface functions and the measured dynamic response data is minimized using the GA to find the optimum damage parameters (damage intensity and location). A single crack of varying location and depth is considered in this study. The presented approach has been found to predict crack parameters with good accuracy and to possess great potential for crack detection, as it requires only the current response of a cracked beam.
Keywords: damage parameters, finite element, genetic algorithm, response surface methodology, thin walled curved beam
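The GA step above is an inverse search: find the (location, depth) pair that minimises the discrepancy between the RSF prediction and the measured response. The sketch below uses a stand-in quadratic error surface with a hypothetical true damage state, and a deliberately minimal real-coded GA — neither the error function nor the GA settings are the paper's:

```python
# Minimal sketch: a real-coded GA minimising a surrogate error surface over
# (relative crack location, depth ratio). The error function is a stand-in
# quadratic with a hypothetical optimum at (0.6, 0.3), not the paper's RSFs.
import numpy as np

def error(params):
    loc, depth = params
    return (loc - 0.6) ** 2 + (depth - 0.3) ** 2

def genetic_minimise(f, bounds, pop=40, gens=60, seed=1):
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds).T
    x = rng.uniform(lo, hi, size=(pop, len(bounds)))
    for _ in range(gens):
        fit = np.apply_along_axis(f, 1, x)
        elite = x[np.argsort(fit)[: pop // 2]]            # truncation selection
        parents = elite[rng.integers(0, len(elite), (pop, 2))]
        w = rng.uniform(size=(pop, 1))
        x = w * parents[:, 0] + (1 - w) * parents[:, 1]   # blend crossover
        x += rng.normal(0.0, 0.02, x.shape)               # Gaussian mutation
        x = np.clip(x, lo, hi)
    fit = np.apply_along_axis(f, 1, x)
    return x[np.argmin(fit)]

best = genetic_minimise(error, bounds=[(0.0, 1.0), (0.0, 0.5)])
```

In the actual method, `error` would evaluate the fitted RSFs of free, forced and heterogeneous response data against the measured dynamic response.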
Procedia PDF Downloads 248
4897 Applications of Building Information Modeling (BIM) in Knowledge Sharing and Management in Construction
Authors: Shu-Hui Jan, Shih-Ping Ho, Hui-Ping Tserng
Abstract:
Construction knowledge can be referred to and reused by the project managers and job-site engineers involved, to alleviate problems on a construction job-site and to reduce the time and cost of solving problems related to constructability. This paper proposes a new methodology for sharing construction knowledge using the Building Information Modeling (BIM) approach. The main characteristics of BIM include 3D CAD-based presentation, storage of information in a digital format, and easy updating and transfer of information in the 3D BIM environment. Using the BIM approach, project managers and engineers can gain knowledge related to 3D BIM and obtain feedback provided by job-site engineers for future reference. This study addresses the application of knowledge sharing management in the construction phase of construction projects and proposes a BIM-based Knowledge Sharing Management (BIMKSM) system for project managers and engineers. The BIMKSM system is then applied in a selected case study of a construction project in Taiwan to verify the proposed methodology and demonstrate the effectiveness of sharing knowledge in the BIM environment. The combined results demonstrate that the BIMKSM system can serve as a visual BIM-based knowledge sharing management platform utilizing the BIM approach and web technology.
Keywords: construction knowledge management, building information modeling, project management, web-based information system
Procedia PDF Downloads 355
4896 Optimization of Poly-β-Hydroxybutyrate Recovery from Bacillus Subtilis Using Solvent Extraction Process by Response Surface Methodology
Authors: Jayprakash Yadav, Nivedita Patra
Abstract:
Polyhydroxybutyrate (PHB) is a material of interest in medical science, the pharmaceutical industry, and tissue engineering because of properties such as biodegradability, biocompatibility, hydrophobicity, and elasticity. PHB is naturally accumulated by several microbes in their cytoplasm as an energy reserve material during metabolism. PHB can be extracted from cell biomass using halogenated hydrocarbons, other chemicals, or enzymes. In this study, a cheaper, non-toxic solvent, acetone, was used for the extraction process. The process parameters, namely acetone percentage, solvent pH, process temperature, and incubation period, were optimized using Response Surface Methodology (RSM) with a central composite design in Design Expert version 7.0 software; the determination coefficient (R²) of the quadratic regression model was found to be 0.8833, with no significant lack of fit. The RSM results indicated that the fit of the response variables was significant (P-value < 0.0006) and adequate to describe the relationship between the responses, PHB recovery and purity, and the independent variables. The optimum conditions for maximum PHB recovery and purity were found to be solvent pH 7, extraction temperature 43 °C, incubation time 70 minutes, and acetone percentage 30%. The maximum predicted PHB recovery was 0.845 g/g biomass dry cell weight, and the purity was 97.23% under the optimized conditions.
Keywords: acetone, PHB, RSM, halogenated hydrocarbons, extraction, Bacillus subtilis
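The optimum-finding step of RSM amounts to fitting a second-order polynomial to the runs and solving for the stationary point of the fitted surface. The sketch below uses only two of the four factors (temperature and incubation time) and synthetic data centred near the kind of optimum reported (43 °C, 70 min); it is not the study's data set:

```python
# Minimal sketch: fit a second-order response surface in two factors and
# locate its stationary point. Data are synthetic, generated around a
# hypothetical optimum at (43 C, 70 min) with small noise.
import numpy as np

def fit_quadratic(X, y):
    """y ~ b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2."""
    x1, x2 = X[:, 0], X[:, 1]
    A = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef

def stationary_point(coef):
    """Solve grad = 0: [[2*b11, b12], [b12, 2*b22]] x = -[b1, b2]."""
    _, b1, b2, b11, b22, b12 = coef
    H = np.array([[2 * b11, b12], [b12, 2 * b22]])
    return np.linalg.solve(H, -np.array([b1, b2]))

rng = np.random.default_rng(2)
X = rng.uniform([30.0, 40.0], [55.0, 100.0], size=(30, 2))  # T (C), time (min)
y = (0.85 - 0.001 * (X[:, 0] - 43.0) ** 2
          - 0.0002 * (X[:, 1] - 70.0) ** 2
          + rng.normal(0.0, 0.005, 30))                     # recovery, g/g
coef = fit_quadratic(X, y)
opt_T, opt_time = stationary_point(coef)
```

A negative-definite Hessian confirms the stationary point is a maximum rather than a saddle, a check worth making before reporting "optimum conditions".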
Procedia PDF Downloads 440
4895 Seismic Assessment of Non-Structural Component Using Floor Design Spectrum
Authors: Amin Asgarian, Ghyslaine McClure
Abstract:
Experience in past earthquakes has clearly demonstrated the necessity of seismic design and assessment of Non-Structural Components (NSCs), particularly in post-disaster structures such as hospitals and power plants that must remain functional and operational. Meeting this objective is contingent upon proper seismic performance of both structural and non-structural components. Proper seismic design, analysis, and assessment of NSCs can be attained through the generation of a Floor Design Spectrum (FDS), in a fashion similar to the target spectrum for structural components. This paper presents a methodology developed to generate the FDS directly from the corresponding Uniform Hazard Spectrum (UHS) (i.e., the design spectrum for structural components). The methodology is based on experimental and numerical analysis of a database of 27 real reinforced concrete (RC) buildings located in Montreal, Canada. The buildings were tested by Ambient Vibration Measurements (AVM), and their dynamic properties were extracted and used as part of the approach. The database comprises 12 low-rise, 10 medium-rise, and 5 high-rise buildings, mostly designated as post-disaster/emergency shelters by the city of Montreal. The buildings were subjected to 20 seismic records compatible with the UHS of Montreal, and Floor Response Spectra (FRS) were developed for every floor in two horizontal directions, considering four different damping ratios of NSCs (2, 5, 10, and 20% viscous damping). The generated FRS (approximately 132,000 curves) were studied statistically, and the methodology is proposed to generate the FDS directly from the corresponding UHS. The approach is capable of generating the FDS for any selected floor level and NSC damping ratio. It captures the effects of dynamic interaction between the primary (structural) and secondary (NSC) systems, as well as the higher and torsional modes of the primary structure.
These are important improvements of this approach over conventional methods and code recommendations. The application of the proposed approach is illustrated through two real case-study buildings: one low-rise and one medium-rise. The proposed approach can be used as a practical and robust tool for the seismic assessment and design of NSCs, especially in existing post-disaster structures.
Keywords: earthquake engineering, operational and functional components, operational modal analysis, seismic assessment and design
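At the core of any FRS computation is sweeping a family of damped SDOF oscillators (representing the NSC) over a floor acceleration history and recording each peak absolute acceleration. The sketch below uses a Newmark average-acceleration integrator and a synthetic decaying 2 Hz floor motion — it is a generic illustration, not a record from the 27-building database or the paper's statistical FDS procedure:

```python
# Minimal sketch: floor response spectrum of a synthetic floor motion via
# Newmark average-acceleration integration of unit-mass SDOF oscillators.
import numpy as np

def newmark_sdof_peak(ag, dt, period, zeta):
    """Peak absolute acceleration of a base-excited SDOF (gamma=1/2, beta=1/4)."""
    w = 2.0 * np.pi / period
    c, k = 2.0 * zeta * w, w ** 2                  # unit mass
    keff = k + 2.0 * c / dt + 4.0 / dt ** 2
    u = v = a = 0.0
    peak = 0.0
    for agi in ag[1:]:
        phat = (-agi + (4.0 / dt ** 2 + 2.0 * c / dt) * u
                + (4.0 / dt + c) * v + a)
        un = phat / keff
        vn = 2.0 * (un - u) / dt - v
        an = 4.0 * (un - u) / dt ** 2 - 4.0 * v / dt - a
        u, v, a = un, vn, an
        peak = max(peak, abs(a + agi))             # absolute = relative + base
    return peak

dt = 0.01
t = np.arange(0.0, 20.0, dt)
# Hypothetical floor motion: decaying 2 Hz sinusoid, 0.3 g initial amplitude.
floor_acc = 0.3 * 9.81 * np.exp(-0.2 * t) * np.sin(2 * np.pi * 2.0 * t)
periods = np.linspace(0.05, 2.0, 40)
frs = np.array([newmark_sdof_peak(floor_acc, dt, T, zeta=0.05) for T in periods])
resonant_period = periods[np.argmax(frs)]
```

Repeating this over all floors, records, directions, and the four damping ratios is what produces a curve population of the size the paper reports.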
Procedia PDF Downloads 213
4894 Removal of Chromium (VI) from Aqueous Solution by Teff (Eragrostis Teff) Husk Activated Carbon: Optimization, Kinetics, Isotherm, and Practical Adaptation Study Using Response Surface Methodology
Authors: Tsegaye Adane Birhan
Abstract:
Recently, rapid industrialization has led to the excessive release of heavy metals such as Cr(VI) into the environment. Exposure to chromium(VI) can cause kidney and liver damage, depressed immune systems, and a variety of cancers; treatment of Cr(VI)-containing wastewater is therefore mandatory. This study aims to optimize the removal of Cr(VI) from aqueous solution using a locally available Teff husk activated carbon adsorbent. A laboratory-based study was conducted on the optimization of the Cr(VI) removal efficiency of Teff husk activated carbon from aqueous solution. A central composite design was used to examine the effects of the interacting process parameters and to optimize the process, using Design Expert version 7.0 software. The optimized removal efficiency of the Teff husk activated carbon (95.597%) was achieved at pH 1.92, an initial concentration of 87.83 mg/L, an adsorbent dose of 20.22 g/L, and a contact time of 2.07 h. The adsorption of Cr(VI) on Teff husk activated carbon was found to be best fitted by pseudo-second-order kinetics and the Langmuir isotherm model. Teff husk activated carbon can thus be used as an efficient adsorbent for the removal of chromium(VI) from contaminated water. Column adsorption needs to be studied in the future.
Keywords: batch adsorption, chromium (VI), teff husk activated carbon, response surface methodology, tannery wastewater
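The pseudo-second-order fit mentioned above is usually done in its linearised form, t/q_t = 1/(k2·qe²) + t/qe, so a straight-line fit of t/q_t against t recovers qe (1/slope) and k2 (slope²/intercept). The uptake data below are synthetic with assumed parameter values, not the study's measurements:

```python
# Minimal sketch: linearised pseudo-second-order kinetic fit. Uptake data
# are generated from the model with illustrative qe and k2 values.
import numpy as np

def pseudo_second_order_fit(t, qt):
    """Return (qe, k2) from the linear fit of t/qt versus t."""
    slope, intercept = np.polyfit(t, t / qt, 1)
    qe = 1.0 / slope
    k2 = slope ** 2 / intercept          # = 1 / (intercept * qe**2)
    return qe, k2

qe_true, k2_true = 4.2, 0.015            # mg/g and g/(mg*min), illustrative
t = np.linspace(5.0, 180.0, 12)          # contact time, min
qt = qe_true ** 2 * k2_true * t / (1.0 + qe_true * k2_true * t)  # model curve
qe, k2 = pseudo_second_order_fit(t, qt)
```

The Langmuir isotherm is handled the same way: its linearised form Ce/qe = 1/(K_L·q_max) + Ce/q_max yields q_max and K_L from a single straight-line fit.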
Procedia PDF Downloads 18
4893 The Effect of Physical Therapy on Triceps Surae Myofascial Trigger Point
Authors: M. Simon, O. Peillon, R. Seijas, P. Alvarez, A. Pérez-Bellmunt
Abstract:
Introduction: Myofascial trigger points (MTrPs) are defined as hyperirritable areas within taut bands of skeletal muscle and are classified as either active or latent. Although they can be present in any muscle, the triceps surae is one of the most affected muscles of the lower limb. The aim of this study was to describe which treatments are most used and their principal results. Study design: We performed a systematic literature search in Medline using strategies for the concepts of “Trigger Points and Gastrocnemius and Soleus not Trapezius”. Articles were screened by the authors and included if they contained a rehabilitation intervention for MTrPs in healthy subjects or patients. Results: The treatments used were mostly invasive interventions, and only a small proportion of the studies used non-invasive treatments. The methodology of these treatments (time or type of intervention, characteristics of treatment, etc.) was frequently undefined. Overall, the examination variables varied significantly among the included studies, but they improved when the MTrPs were treated. Conclusions: There is a wide variety of physical therapy treatments to improve the symptomatology of MTrPs affecting the triceps surae muscle. Even so, not a single study was found analyzing how skeletal muscle contractile parameters (such as maximal displacement or delay time) change with MTrP therapy. Future investigations should better specify the treatment methodology used.
Keywords: fascia, myofascial trigger points, physical therapy, triceps surae
Procedia PDF Downloads 150
4892 The Effect of Sorafenib on SOAT1 Protein by Using Molecular Docking Method
Authors: Mahdiyeh Gholaminezhad
Abstract:
Context: The study focuses on the potential impact of Sorafenib on SOAT1 protein in liver cancer treatment, addressing the need for more effective therapeutic options. Research aim: To explore the effects of Sorafenib on the activity of SOAT1 protein in liver cancer cells. Methodology: Molecular docking was employed to analyze the interaction between Sorafenib and SOAT1 protein. Findings: The study revealed a significant effect of Sorafenib on the stability and activity of SOAT1 protein, suggesting its potential as a treatment for liver cancer. Theoretical importance: This research highlights the molecular mechanism underlying Sorafenib's anti-cancer properties, contributing to the understanding of its therapeutic effects. Data collection: Data on the molecular structure of Sorafenib and SOAT1 protein were obtained from computational simulations and databases. Analysis procedures: Molecular docking simulations were performed to predict the binding interactions between Sorafenib and SOAT1 protein. Question addressed: How does Sorafenib influence the activity of SOAT1 protein, and what are the implications for liver cancer treatment? Conclusion: The study demonstrates the potential of Sorafenib as a targeted therapy for liver cancer by affecting the activity of SOAT1 protein. Reviewers' Comments: The study provides valuable insights into the molecular basis of Sorafenib's action on SOAT1 protein, suggesting its therapeutic potential. To strengthen the methodology, the authors could consider validating the docking results with experimental data.
Keywords: liver cancer, sorafenib, SOAT1, molecular docking
Procedia PDF Downloads 28
4891 Case Study Analysis of 2017 European Railway Traffic Management Incident: The Application of System for Investigation of Railway Interfaces Methodology
Authors: Sanjeev Kumar Appicharla
Abstract:
This paper presents the results of the modelling and analysis of a European Rail Traffic Management System (ERTMS) safety-critical incident on the Cambrian Railway in the UK, using report RAIB 17/2019 as the primary input, to raise awareness of biases in the systems engineering process. The RAIB, the UK's independent accident investigator, published Report RAIB 17/2019 giving the details of its investigation of the focal event in the form of the immediate cause, causal factors, and underlying factors, together with recommendations to prevent a repeat of the safety-critical incident on the Cambrian Line. The System for Investigation of Railway Interfaces (SIRI) is the methodology used to model and analyze the safety-critical incident. The SIRI methodology uses the Swiss Cheese Model to model the incident and identifies latent failure conditions (potentially less than adequate conditions) by means of the management oversight and risk tree technique. The benefits of the SIRI methodology are threefold. First, it incorporates the “heuristics and biases” approach, advanced by Prof. Daniel Kahneman, the 2002 Nobel laureate in Economic Sciences, in the management oversight and risk tree technique to identify systematic errors. Civil engineering and programme management railway professionals are aware of the role “optimism bias” plays in programme cost overruns and of bow-tie (fault and event tree) model-based safety risk modelling techniques; however, the role of systematic errors due to heuristics and biases is not yet widely appreciated. This approach overcomes the omission of human and organizational factors from accident analysis.
Second, the scope of the investigation includes all levels of the socio-technical system, including government, regulators, railway safety bodies, duty holders, signalling firms, transport planners, and front-line staff, so that lessons are learned at the decision-making and implementation levels as well. Third, the author's past accident case studies are supplemented with evidence drawn from practitioners' and academic researchers' publications. This serves to discuss the role of systems thinking in improving decision-making and risk management processes and practices in the ISO/IEC 15288 systems engineering standard and in industrial contexts such as the GB railways and artificial intelligence (AI).
Keywords: accident analysis, AI algorithm internal audit, bounded rationality, Byzantine failures, heuristics and biases approach
Procedia PDF Downloads 190
4890 Methodology of Personalizing Interior Spaces in Public Libraries
Authors: Baharak Mousapour
Abstract:
Creating public spaces tailored to the specific demands of individuals is one of the challenges for contemporary interior designers. Improving general knowledge and providing a forum for all walks of life to exploit are among the objectives of a public library. In this regard, interior design consistent with the demands of the individuals is of paramount importance. According to the literature, study spaces, in particular those in close relation to the personalized sector, have proven to be challenging. To address this challenge, attributes of individuals, namely people's perception of public spaces and their interactions with those spaces, should be analyzed to provide interior designers with something to work on. This paper follows the analytic-descriptive research methodology by outlining case-study libraries that have personalized public spaces, with the investigation of the type of personalization as its primary objective and (I) recognition of the physical schedule and the know-how of spatial connection in the indoor design of a library and (II) analysis of each personalized space in relation to other spaces of the library as its secondary objectives. The significance of the current research lies in the concept of personalization as one of the most recent methods of attracting people to libraries. Previous research exists in this regard, but the lack of data concerning personalization makes this topic worth investigating. Hence, this study aims to put forward approaches through real case studies for designers to deal with this concept.
Keywords: interior design, library, library design, personalization
Procedia PDF Downloads 150
4889 Assessment Methodology of E-government Projects for the Regions of Georgia
Authors: Tina Melkoshvili
Abstract:
The drastic development of information and communication technologies in Georgia has led to the necessity of launching a conceptually new, effective, flexible, transparent, and society-oriented form of government, that is, e-government. Through applying information technologies, an electronic system makes it possible to raise the efficacy of state governance and increase citizens' participation in the process. Focusing on the topic of e-government allows us to analyze success stories and attributed benefits and, at the same time, observe challenges hampering the government development process. A number of methodologies have been elaborated to study conditions in the field of electronic governance. They enable us to find out whether the government is ready to apply the broad opportunities of information and communication technologies and whether it is apt to improve the accessibility and quality of delivering mainly social services. This article seeks to provide a comparative analysis of widely used methodologies for the assessment of e-government projects. It has been concluded that applying the current assessment methods in Georgia is difficult due to inaccessible data and the necessity of involving a number of experts. The article presents new indicators for e-government development assessment that reflect the efficacy of realizing the e-government conception in the regions of Georgia and enable a quantitative evaluation of regional e-government projects, covering all significant aspects of development.
Keywords: development methodology, e-government in Georgia, information and communication technologies, regional government
Procedia PDF Downloads 277
4888 Critical Factors in the Formation, Development and Survival of an Eco-Industrial Park: A Systemic Understanding of Industrial Symbiosis
Authors: Iván González, Pablo Andrés Maya, Sebastián Jaén
Abstract:
Eco-industrial parks (EIPs) work as networks for the exchange of by-products such as materials, water, or energy. This research identifies the relevant factors in the formation of EIPs in different industrial environments around the world. An aggregation of these factors is then carried out to reduce them from 50 to 17 and classify them along 5 fundamental axes. Subsequently, the Vester Sensitivity Model (VSM) systemic methodology is used to determine the influence of the 17 factors on an EIP system and the interrelationships between them. The results show that the sequence of effects between factors Trust and Cooperation → Business Association → Flows → Additional Income represents the “backbone” of the system, being the most significant chain of influences. In addition, Organizational Culture represents the turning point of the industrial symbiosis, on which one must act correctly to avoid falling into unsustainable economic development. Finally, the flow of information should not be lost, since it is what feeds trust between the parties, and trust strengthens the system in the face of individual or global imbalances. This systemic understanding will enable the formulation of pertinent policies by the actors that interact in the formation and permanence of the EIP. In this way, it seeks to promote large-scale sustainable industrial development, integrating various community actors, which in turn will raise awareness and appropriation of the current importance of sustainability in industrial production.
Keywords: critical factors, eco-industrial park, industrial symbiosis, system methodology
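The core VSM computation in the abstract above is an influence matrix from which each factor's active sum (how strongly it drives the system) and passive sum (how strongly it is driven) are derived. A minimal sketch follows; the four-factor subset, the influence strengths, and the median-based role classification are illustrative assumptions, not the study's data:

```python
import numpy as np

# Hypothetical influence matrix for four of the 17 aggregated factors:
# entry [i, j] = strength (0-3) with which factor i influences factor j.
factors = ["Trust and Cooperation", "Business Association",
           "Flows", "Additional Income"]
M = np.array([
    [0, 3, 2, 1],
    [1, 0, 3, 1],
    [0, 1, 0, 3],
    [2, 1, 1, 0],
])

active_sum = M.sum(axis=1)   # how strongly a factor drives the system
passive_sum = M.sum(axis=0)  # how strongly a factor is driven by it

a_med, p_med = np.median(active_sum), np.median(passive_sum)
for name, a, p in zip(factors, active_sum, passive_sum):
    if a > a_med and p > p_med:
        role = "critical (influential and sensitive)"
    elif a > a_med:
        role = "active (lever for intervention)"
    elif p > p_med:
        role = "passive (indicator of system state)"
    else:
        role = "buffering"
    print(f"{name}: AS={a}, PS={p} -> {role}")
```

Chains of high off-diagonal entries in such a matrix are what surface as the “backbone” sequence the study reports.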
Procedia PDF Downloads 126
4887 Evaluation of Different Waste Management Planning Strategies in an Industrial City
Authors: Leila H. Khiabani, Mohammadreza Vafaee, Farshad Hashemzadeh
Abstract:
Industrial waste management regulates the different stages of production, storage, transfer, recycling, and disposal of waste. There are several common practices for industrial waste management; however, due to various local health, economic, social, environmental, and aesthetic considerations, the optimal principles and measures often vary for each specific industrial zone. In addition, waste management strategies are heavily impacted by local administrative, legal, and financial regulations. In this study, a hybrid qualitative and quantitative research methodology was designed for waste management planning in an industrial city. Firstly, following a qualitative research methodology, the most relevant waste management strategies for the specific industrial city were identified through interviews with environmental planning and waste management experts. Forty experts participated in this study. Alborz industrial city in Iran, which hosts more than one thousand industrial units on nine hundred acres, was chosen as the sample industrial city. The findings from the expert interviews in the first phase were then used to design a quantitative questionnaire for the second phase of the study. The aim of the questionnaire was to quantify the relative impact of different waste management strategies in the sample industrial city. Eight waste management strategies and three implementation policies were included in the questionnaire. The experts were asked to rank the relative effectiveness of each strategy for the environmental planning of the sample industrial city. They were also asked to rank the relative effectiveness of each planning policy on each of the waste management strategies. In the end, the weighted average of all the responses was calculated to identify the most effective waste management strategy and planning policies for the sample industrial city.
The results suggested that, among the eight waste management strategies, industrial composting is the most effective (31%) based on the collective evaluation of the local experts. Additionally, the results suggested that the most effective policy (58%) in the city's environmental planning is to reduce waste generation by prolonging the effective life of industrial products through higher-quality and recyclable materials. These findings can provide useful expert guidelines for prioritization among different waste management strategies in the city's overall environmental planning roadmap. The findings may also be applicable to similar industrial cities, and a similar methodology can be utilized in the environmental planning of other industrial cities.
Keywords: environmental planning, industrial city, quantitative research, waste management
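The weighted-average aggregation of expert questionnaire responses described above can be sketched in a few lines; the strategy subset, scores, and expert weights are illustrative assumptions, not the study's survey data:

```python
import numpy as np

# Hypothetical 1-5 effectiveness scores from five of the forty experts for
# three of the eight strategies; expert weights are an assumed weighting.
strategies = ["industrial composting", "source separation", "incineration"]
scores = np.array([
    [5, 4, 5, 4, 5],   # industrial composting
    [4, 3, 4, 4, 3],   # source separation
    [2, 3, 2, 2, 3],   # incineration
])
weights = np.array([1.0, 0.8, 1.2, 1.0, 1.0])

weighted_mean = scores @ weights / weights.sum()
share = weighted_mean / weighted_mean.sum()  # relative effectiveness, cf. the 31% figure
for s, v in zip(strategies, share):
    print(f"{s}: {v:.0%}")
```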
Procedia PDF Downloads 132
4886 The European Research and Development Project Improved Nuclear Site Characterization for Waste Minimization in Decommissioning under Constrained Environment: Focus on Performance Analysis and Overall Uncertainty
Authors: M. Crozet, D. Roudil, T. Branger, S. Boden, P. Peerani, B. Russell, M. Herranz, L. Aldave de la Heras
Abstract:
The EURATOM work programme project INSIDER (Improved Nuclear Site Characterization for Waste Minimization in Decommissioning under Constrained Environment) was launched in June 2017. This 4-year project has 18 partners and aims at improving the management of contaminated materials arising from decommissioning and dismantling (D&D) operations by proposing an integrated characterization methodology. This methodology is based on advanced statistical processing and modelling, coupled with adapted and innovative analytical and measurement methods, with respect to sustainability and economic objectives. In order to achieve these objectives, the approaches will then be applied to common case studies in the form of inter-laboratory comparisons on matrix-representative reference samples and benchmarking. Work Package 6 (WP6), ‘Performance analysis and overall uncertainty’, is in charge of the analysis of the benchmarking on real samples, the organisation of an inter-laboratory comparison on synthetic certified reference materials, and the establishment of the overall uncertainty budget. Assessment of the outcome will be used to provide recommendations and guidance resulting in pre-standardization tests.
Keywords: decommissioning, sampling strategy, research and development, characterization, European project
Procedia PDF Downloads 365
4885 Delineation of Green Infrastructure Buffer Areas with a Simulated Annealing: Consideration of Ecosystem Services Trade-Offs in the Objective Function
Authors: Andres Manuel Garcia Lamparte, Rocio Losada Iglesias, Marcos BoullóN Magan, David Miranda Barros
Abstract:
The biodiversity strategy of the European Union for 2030 mentions climate change as one of the key factors in biodiversity loss and considers green infrastructure one of the solutions to this problem. In this line, the European Commission has developed a green infrastructure strategy which commits member states to consider green infrastructure in their territorial planning. This green infrastructure is aimed at guaranteeing the provision of a wide number of ecosystem services to support biodiversity and human well-being by countering the effects of climate change. Yet few tools are available to delimit green infrastructure. The available ones consider the potential of the territory to provide ecosystem services. However, these methods usually aggregate several maps of ecosystem service potential without considering possible trade-offs, which can lead to excluding areas with a high potential for providing ecosystem services that have many trade-offs with other ecosystem services. In order to tackle this problem, a methodology is proposed to consider ecosystem service trade-offs in the objective function of a simulated annealing algorithm aimed at delimiting multifunctional green infrastructure buffer areas. To this end, the provision potential maps of the regulating ecosystem services considered to delimit the multifunctional buffer areas are clustered into groups, so that ecosystem services that create trade-offs are separated across groups. The normalized provision potential maps of the ecosystem services in each group are added to obtain a potential map per group, which is normalized again. Then the potential maps for each group are combined into a raster map that shows the highest provision potential value in each cell. The combined map is then used in the objective function of the simulated annealing algorithm. The algorithm is run both using the proposed methodology and considering the ecosystem services individually.
The results are analyzed with spatial statistics and landscape metrics to check the number of ecosystem services that the delimited areas produce, as well as their regularity and compactness. It has been observed that the proposed methodology increases the number of ecosystem services produced by the delimited areas, improving their multifunctionality and increasing their effectiveness in preventing climate change impacts.
Keywords: ecosystem services trade-offs, green infrastructure delineation, multifunctional buffer areas, climate change
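The delineation step described above (a cell-wise maximum of grouped potential maps driving a simulated annealing objective with a compactness term) can be sketched as follows. The grid size, random potential maps, fragmentation penalty weight, and cooling schedule are all illustrative assumptions, not the study's implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical normalized potential maps for two groups of regulating
# ecosystem services with no trade-offs within a group.
n = 12
group_a = rng.random((n, n))
group_b = rng.random((n, n))
combined = np.maximum(group_a, group_b)  # highest provision potential per cell

def energy(mask):
    # Negative provision potential plus a penalty for fragmentation
    # (count of selected cells lacking any selected 4-neighbour).
    pot = combined[mask].sum()
    pad = np.pad(mask, 1)
    neigh = pad[:-2, 1:-1] | pad[2:, 1:-1] | pad[1:-1, :-2] | pad[1:-1, 2:]
    isolated = (mask & ~neigh).sum()
    return -pot + 0.5 * isolated

# Start from a random selection of k cells; each step swaps one cell out/in.
k = 25
mask = np.zeros((n, n), dtype=bool)
mask[np.unravel_index(rng.choice(n * n, k, replace=False), (n, n))] = True
E, T = energy(mask), 1.0
for step in range(5000):
    cell_on = tuple(rng.choice(np.argwhere(mask)))
    cell_off = tuple(rng.choice(np.argwhere(~mask)))
    cand = mask.copy()
    cand[cell_on], cand[cell_off] = False, True
    dE = energy(cand) - E
    if dE < 0 or rng.random() < np.exp(-dE / T):
        mask, E = cand, E + dE
    T *= 0.999  # geometric cooling

print(f"selected cells: {mask.sum()}, mean potential: {combined[mask].mean():.2f}")
```

The accepted mask trades raw potential against contiguity, which is what the study's landscape metrics (regularity, compactness) then evaluate.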
Procedia PDF Downloads 175
4884 Empirical Evidence to Beliefs and Perceptions About Mental Health Disorder and Substance Abuse: The Role of a Social Worker
Authors: Helena Baffoe
Abstract:
Context: In the United States, there have been significant advancements in programs aimed at improving the lives of individuals with mental health disorders and substance abuse problems. However, public attitudes and beliefs regarding these issues have not improved correspondingly. This study explores the perceptions and beliefs surrounding mental health disorders and substance abuse in the context of data analytics in the field of social work. Research Aim: The aim of this research is to provide empirical evidence on the beliefs and perceptions regarding mental health disorders and substance abuse. Specifically, the study seeks to answer the question of whether being diagnosed with a mental disorder implies a diagnosis of substance abuse. Additionally, the research aims to analyze the specific roles that social workers can play in addressing individuals with mental disorders. Methodology: This research adopts a data-driven methodology, drawing on comprehensive data from the Substance Abuse and Mental Health Services Administration (SAMHSA). A noteworthy causal connection between mental disorders and substance abuse exists, a relationship that the current literature tends to overlook. To address this gap, we applied logistic regression with an instrumental variable (IV) approach, effectively mitigating potential endogeneity in the analysis and ensuring robust and unbiased results. This methodology allows for a rigorous examination of the relationship between mental disorders and substance abuse. Empirical Findings: The analysis of the data reveals that depressive, anxiety, and trauma/stressor mental disorders are the most common in the United States. However, the study does not find statistically significant evidence to support the notion that being diagnosed with these mental disorders necessarily implies a diagnosis of substance abuse.
This suggests that there is a misconception among the public regarding the relationship between mental health disorders and substance abuse. Theoretical Importance: The research contributes to the existing body of literature by providing empirical evidence to challenge prevailing beliefs and perceptions regarding mental health disorders and substance abuse. By using a novel methodological approach and analyzing new US data, the study sheds light on the cultural and social factors that influence these attitudes.
Keywords: mental health disorder, substance abuse, empirical evidence, logistic regression with IV
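One common way to combine a logit with an instrumental variable, as the abstract above describes, is a two-stage residual-inclusion (control-function) estimator; this is a sketch of that general technique on fully synthetic data, not the study's SAMHSA data, variable definitions, or estimates:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 5000
z = rng.normal(size=n)                                    # instrument
u = rng.normal(size=n)                                    # unobserved confounder
d = (0.8 * z + u + rng.normal(size=n) > 0).astype(float)  # "mental disorder" dx
y = (0.5 * d + u + rng.normal(size=n) > 0).astype(float)  # "substance abuse" dx

# Stage 1: linear projection of the endogenous diagnosis on the instrument.
X1 = np.column_stack([np.ones(n), z])
b1, *_ = np.linalg.lstsq(X1, d, rcond=None)
resid = d - X1 @ b1

def fit_logit(X, y, iters=25):
    """Maximum-likelihood logit via Newton-Raphson."""
    b = np.zeros(X.shape[1])
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-X @ b))
        W = p * (1.0 - p)
        b = b + np.linalg.solve(X.T @ (W[:, None] * X), X.T @ (y - p))
    return b

# Stage 2: logit of the outcome on [1, d, stage-1 residual].
b_naive = fit_logit(np.column_stack([np.ones(n), d]), y)
b_cf = fit_logit(np.column_stack([np.ones(n), d, resid]), y)
print(f"naive logit coefficient on d:      {b_naive[1]:.2f}")
print(f"control-function coefficient on d: {b_cf[1]:.2f}")
```

The naive logit absorbs the confounder and overstates the effect; including the stage-1 residual shrinks the coefficient toward the true causal effect, which is the endogeneity correction the abstract invokes.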
Procedia PDF Downloads 65
4883 Radial Basis Surrogate Model Integrated to Evolutionary Algorithm for Solving Computation Intensive Black-Box Problems
Authors: Abdulbaset Saad, Adel Younis, Zuomin Dong
Abstract:
For design optimization with high-dimensional, expensive problems, an effective and efficient optimization methodology is desired. This work proposes a series of modifications to the Differential Evolution (DE) algorithm for solving computation-intensive black-box problems. The proposed methodology, called Radial Basis Function Meta-Model Assisted Differential Evolution (RBF-DE), is a global optimization algorithm based on meta-modeling techniques. A meta-model-assisted DE is proposed to solve computationally expensive optimization problems. The Radial Basis Function (RBF) model is used as a surrogate to approximate the expensive objective function, while DE employs a mechanism to dynamically select the best-performing combination of parameters such as differential rate, crossover probability, and population size. The proposed algorithm is tested on benchmark functions and real-life practical applications. The test results demonstrate that the proposed algorithm is promising and performs well compared to other optimization algorithms. It is capable of converging to acceptable and good solutions in terms of accuracy, number of evaluations, and time needed to converge.
Keywords: differential evolution, engineering design, expensive computations, meta-modeling, radial basis function, optimization
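The core idea above (run DE on a cheap RBF surrogate of the expensive objective and spend true evaluations only at promising infill points) can be sketched with SciPy. The benchmark function, sample sizes, and infill count are illustrative assumptions, not the RBF-DE algorithm's actual settings:

```python
import numpy as np
from scipy.interpolate import RBFInterpolator
from scipy.optimize import differential_evolution

# Expensive black-box stand-in: the 2-D Rosenbrock function.
def expensive(x):
    return (1 - x[0]) ** 2 + 100 * (x[1] - x[0] ** 2) ** 2

rng = np.random.default_rng(1)

# Small design of experiments on which the surrogate is trained.
X = rng.uniform(-2, 2, size=(60, 2))
y = np.array([expensive(x) for x in X])

# Surrogate-assisted loop: optimize the cheap RBF model with DE, then spend
# one true evaluation per iteration at the surrogate optimum (infill point).
for _ in range(3):
    surrogate = RBFInterpolator(X, y, kernel="thin_plate_spline")
    res = differential_evolution(lambda p: surrogate(p[None])[0],
                                 bounds=[(-2, 2)] * 2, seed=1, maxiter=100)
    X = np.vstack([X, res.x])
    y = np.append(y, expensive(res.x))

print(f"best true objective found: {y.min():.3f} "
      f"({len(y)} expensive evaluations in total)")
```

The paper's contribution additionally adapts DE's own control parameters during the search; that layer is omitted here for brevity.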
Procedia PDF Downloads 397
4882 Surface Flattening Assisted with 3D Mannequin Based on Minimum Energy
Authors: Shih-Wen Hsiao, Rong-Qi Chen, Chien-Yu Lin
Abstract:
The topic of surface flattening plays a vital role in the field of computer-aided design and manufacture. Surface flattening enables the production of 2D patterns and can be used in design and manufacturing for developing a 3D surface onto a 2D platform, especially in fashion design. This study describes surface flattening based on minimum-energy methods according to the properties of different fabrics. Firstly, through the geometric features of a 3D surface, the less deformed area can be flattened onto a 2D platform by geodesics. Then, the strain energy accumulated in the mesh can be stably released by an approximate implicit method and a revised error function. In some cases, cutting the mesh to further release the energy is a common way to fix the situation and enhance the accuracy of the surface flattening; this makes the obtained 2D pattern naturally generate significant cracks. When this methodology is applied to a 3D mannequin constructed with feature lines, it enhances the level of computer-aided fashion design. Besides, when different fabrics are applied in fashion design, it is necessary to revise the shape of the 2D pattern according to the properties of the fabric. With this model, the outline of 2D patterns can be revised by distributing the strain energy, with different results according to different fabric properties. Finally, this research uses some common design cases to illustrate and verify the feasibility of the methodology.
Keywords: surface flattening, strain energy, minimum energy, approximate implicit method, fashion design
Procedia PDF Downloads 338
4881 Preparation of Amla (Phyllanthus emblica) Powder Using Spray Drying Technique
Authors: Shubham Mandliya, Pooja Pandey, H. N. Mishra
Abstract:
Amla (Phyllanthus emblica), a plant of the family Euphorbiaceae, is widely distributed in subtropical and tropical areas of China, India, Indonesia, and Malaysia. Amla is very high in vitamin C content. Spray drying of fruit juices represents an alternative way to improve their physicochemical stability and increase their shelf life. Samples of amla powder were produced using the spray drying method to investigate the effect of inlet temperatures and maltodextrin levels. The spray dryer used was a laboratory-scale model, and samples were run at different temperatures and concentrations. Response surface methodology (RSM) was used to optimize the spray-drying process for the development of amla powder. The resultant powders were then analyzed for vitamin C, moisture, solubility, and dispersibility. The spray-dried amla powder contains higher amounts of vitamin C compared to commercial fruit juice powders. SEM analysis revealed that lower maltodextrin levels and higher inlet air temperatures resulted in smaller but smoother particles. At lower temperatures, the vitamin C content is higher than at higher temperatures. Spray drying is an effective and economical method that can be used commercially for making powder, as it retains a greater fraction of vitamin C at lower cost than tray or solar drying.
Keywords: amla powder, physicochemical properties, response surface methodology, spray drying
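The RSM optimization described above amounts to fitting a second-order polynomial to the responses from a designed experiment and locating its optimum. The run values and retention figures below are hypothetical stand-ins (chosen only to reproduce the reported trend of higher vitamin C at lower inlet temperature), not the study's measurements:

```python
import numpy as np

# Hypothetical central-composite-style runs: inlet temperature T (deg C),
# maltodextrin level M (%), and measured vitamin C retention VC (%).
T = np.array([140, 140, 180, 180, 160, 160, 160, 132, 188], dtype=float)
M = np.array([5, 15, 5, 15, 10, 3, 17, 10, 10], dtype=float)
VC = np.array([78, 74, 70, 66, 75, 72, 71, 79, 64], dtype=float)

# Second-order response surface: VC ~ b0 + b1*T + b2*M + b3*T^2 + b4*M^2 + b5*T*M
A = np.column_stack([np.ones_like(T), T, M, T**2, M**2, T * M])
coef, *_ = np.linalg.lstsq(A, VC, rcond=None)

def predict(t, m):
    return coef @ np.array([1.0, t, m, t**2, m**2, t * m])

# Search the fitted surface for the retention-maximizing settings.
grid_t = np.linspace(132, 188, 57)
grid_m = np.linspace(3, 17, 29)
Z = np.array([[predict(t, m) for m in grid_m] for t in grid_t])
i, j = np.unravel_index(Z.argmax(), Z.shape)
print(f"predicted optimum: T = {grid_t[i]:.0f} deg C, "
      f"maltodextrin = {grid_m[j]:.1f} %, VC = {Z[i, j]:.1f} %")
```

In a real RSM study the same quadratic fit would come with ANOVA and lack-of-fit diagnostics before the optimum is trusted.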
Procedia PDF Downloads 247
4880 Load Forecast of the Peak Demand Based on Both the Peak Demand and Its Location
Authors: Qais H. Alsafasfeh
Abstract:
The aim of this paper is to provide a forecast of the peak demand for the next 15 years for electrical distribution companies. The proposed methodology provides both the peak demand and its location for the next 15 years. This paper describes the spatial load forecasting model used, the information provided by the electrical distribution company in Jordan, the workflow followed, the parameters used, and the assumptions made to run the model.
Keywords: load forecast, peak demand, spatial load, electrical distribution
Procedia PDF Downloads 495