Search results for: heading time
16029 Evaluation of Mixing and Oxygen Transfer Performances for a Stirred Bioreactor Containing P. chrysogenum Broths
Authors: A. C. Blaga, A. Cârlescu, M. Turnea, A. I. Galaction, D. Caşcaval
Abstract:
The performance of an aerobic stirred bioreactor for fungal fermentation was analyzed on the basis of mixing time and oxygen mass transfer coefficient, by quantifying the influence of specific geometrical and operational parameters of the bioreactor, as well as the rheological behavior of Penicillium chrysogenum broths (free mycelia and mycelial aggregates). The rheological properties of the fungal broth, controlled by the biomass concentration, its growth rate, and its morphology, strongly affect the performance of the bioreactor. Experimental data showed that for both morphological structures the accumulation of fungal biomass induces a significant increase in broth viscosity and modifies the rheological behavior. For lower P. chrysogenum concentrations (both morphological conformations), the mixing time initially increases with aeration rate, reaches a maximum value, and then decreases. This variation can be explained by the formation of small bubbles, because the solid phase hinders bubble coalescence and the high apparent viscosity of the broth reduces the rising velocity of the bubbles. With biomass accumulation, the variation of mixing time with aeration rate gradually changes, a continuous reduction of mixing time with increasing air input flow being obtained at 33.5 g/l d.w. P. chrysogenum. Owing to the higher apparent viscosity, which considerably reduces the relative contribution of mechanical agitation to broth mixing, these phenomena are more pronounced for P. chrysogenum free mycelia. Due to the increase in broth apparent viscosity, biomass accumulation induces two significant effects on oxygen transfer rate: a diminution of turbulence and a perturbation of the bubble dispersion-coalescence equilibrium. The increase in P. chrysogenum free mycelia concentration leads to a decrease in kLa values.
Thus, for the considered variation domain of the main parameters, namely air superficial velocity from 8.36 × 10⁻⁴ to 5.02 × 10⁻³ m/s and specific power input from 100 to 500 W/m³, kLa was reduced by a factor of 3.7 as biomass concentration increased from 4 to 36.5 g/l d.w. The broth containing P. chrysogenum mycelial aggregates exhibits a particular behavior from the point of view of oxygen transfer. Regardless of bioreactor operating conditions, the increase in biomass concentration initially leads to an increase in oxygen mass transfer rate, a phenomenon that can be explained by the interaction of pellets with bubbles. The results are related to the increase in apparent viscosity of the broths over the mentioned range of biomass concentration. Thus, for a CX increase from 4 to 36.5 g/l d.w., the apparent viscosity of the suspension of fungal mycelial aggregates increased by a factor of 44.2, and that of free mycelia by a factor of 63.9. By means of the experimental data, mathematical correlations describing the influences of the considered factors on mixing time and kLa have been proposed. The proposed correlations can be used in bioreactor performance evaluation, optimization, and scale-up.
Keywords: biomass concentration, mixing time, oxygen mass transfer, P. chrysogenum broth, stirred bioreactor
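The proposed correlations themselves are not given in the abstract. A minimal sketch of how a correlation of the typical power-law form kLa = α(P/V)^β · vS^γ · exp(−δ·CX) could be evaluated; all coefficients here are hypothetical placeholders, with δ tuned only so that the factor-of-3.7 reduction reported above is reproduced:

```python
import math

def kla_correlation(p_v, v_s, c_x,
                    alpha=2.0e-3, beta=0.5, gamma=0.4,
                    delta=math.log(3.7) / 32.5):
    """Typical power-law correlation for the volumetric oxygen transfer
    coefficient kLa in a stirred bioreactor.  p_v: specific power input
    (W/m3), v_s: air superficial velocity (m/s), c_x: biomass
    concentration (g/l d.w.).  Coefficients are hypothetical; delta is
    chosen so that kLa drops by a factor of 3.7 when c_x rises from
    4 to 36.5 g/l d.w."""
    return alpha * p_v**beta * v_s**gamma * math.exp(-delta * c_x)

low = kla_correlation(300, 2.5e-3, 4.0)    # dilute broth
high = kla_correlation(300, 2.5e-3, 36.5)  # concentrated broth
print(round(low / high, 2))  # → 3.7
```

In practice the coefficients would be fitted to the measured kLa data by (log-linear) regression rather than assumed.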
Procedia PDF Downloads 340
16028 A Physiological Approach for Early Detection of Hemorrhage
Authors: Rabie Fadil, Parshuram Aarotale, Shubha Majumder, Bijay Guargain
Abstract:
Hemorrhage is the loss of blood from the circulatory system and a leading cause of battlefield and postpartum deaths. Early detection of hemorrhage remains the most effective strategy to reduce the mortality rate caused by traumatic injuries. In this study, we investigated physiological changes via non-invasive cardiac signals at rest and under different hemorrhage conditions simulated through graded lower-body negative pressure (LBNP). Simultaneous electrocardiogram (ECG), photoplethysmogram (PPG), blood pressure (BP), impedance cardiogram (ICG), and phonocardiogram (PCG) signals were acquired from 10 participants (age: 28 ± 6 years, weight: 73 ± 11 kg, height: 172 ± 8 cm). The LBNP protocol consisted of applying -20, -30, -40, -50, and -60 mmHg pressure to the lower half of the body. Beat-to-beat heart rate (HR), systolic blood pressure (SBP), diastolic blood pressure (DBP), and mean arterial pressure (MAP) were extracted from the ECG and blood pressure. Systolic amplitude (SA), systolic time (ST), diastolic time (DT), and left ventricular ejection time (LVET) were extracted from the PPG during each stage. Preliminary results showed that the application of -40 mmHg, i.e., the moderate simulated hemorrhage stage, resulted in significant changes in HR (85 ± 4 bpm vs 68 ± 5 bpm, p < 0.01), ST (191 ± 10 ms vs 253 ± 31 ms, p < 0.05), LVET (350 ± 14 ms vs 479 ± 47 ms, p < 0.05), and DT (551 ± 22 ms vs 683 ± 59 ms, p < 0.05) compared to rest, while no change was observed in SA (p > 0.05) as a consequence of LBNP application. These findings demonstrate the potential of cardiac signals in detecting moderate hemorrhage. In the future, we will analyze all the LBNP stages and investigate the feasibility of other physiological signals to develop a predictive machine learning model for early detection of hemorrhage.
Keywords: blood pressure, hemorrhage, lower-body negative pressure, LBNP, machine learning
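Among the listed features, beat-to-beat HR is the most direct to compute: it is the reciprocal of the interval between successive ECG R-peaks. A minimal sketch, assuming R-peak detection (e.g. by a Pan-Tompkins-style detector) has already been performed; the peak times below are invented:

```python
def heart_rate_bpm(r_peak_times_s):
    """Beat-to-beat heart rate (bpm) from ECG R-peak times in seconds.
    Each RR interval between consecutive peaks maps to 60/RR bpm."""
    rr = [t2 - t1 for t1, t2 in zip(r_peak_times_s, r_peak_times_s[1:])]
    return [60.0 / interval for interval in rr]

# RR intervals of ~0.88 s correspond to the ~68 bpm resting rate reported
print([round(hr) for hr in heart_rate_bpm([0.0, 0.88, 1.76])])  # → [68, 68]
```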
Procedia PDF Downloads 167
16027 A Parallel Algorithm for Solving the PFSP on the Grid
Authors: Samia Kouki
Abstract:
Solving NP-hard combinatorial optimization problems by exact search methods, such as Branch-and-Bound, may degenerate into complete enumeration. For that reason, exact approaches are limited to small or moderate-sized problem instances, due to the exponential increase in CPU time as problem size grows. One of the most promising ways to significantly reduce the computational burden of sequential Branch-and-Bound is to design parallel versions of these algorithms that employ several processors. This paper describes a parallel Branch-and-Bound algorithm called GALB for solving the classical permutation flowshop scheduling problem, as well as its implementation on a Grid computing infrastructure. The experimental study of our distributed parallel algorithm gives promising results and clearly shows the benefit of the parallel paradigm for solving large-scale instances in moderate CPU time.
Keywords: grid computing, permutation flow shop problem, branch and bound, load balancing
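The objective such a Branch-and-Bound minimizes is the makespan of a job permutation. A minimal sketch of the standard flowshop completion-time recurrence (the Branch-and-Bound machinery itself, with its bounds and load balancing, is beyond a few lines and is not reproduced here):

```python
def makespan(perm, p):
    """Makespan of a job permutation in a permutation flowshop.
    p[j][m] is the processing time of job j on machine m.  A job can
    start on machine m only after it finishes on machine m-1 and after
    the previous job frees machine m (standard flowshop recurrence)."""
    n_machines = len(p[0])
    c = [0] * n_machines  # current completion time on each machine
    for j in perm:
        c[0] += p[j][0]
        for m in range(1, n_machines):
            c[m] = max(c[m], c[m - 1]) + p[j][m]
    return c[-1]

# two jobs on two machines: schedule (0, 1) finishes at time 7
print(makespan((0, 1), [[3, 2], [1, 2]]))  # → 7
```

A Branch-and-Bound explores partial permutations and prunes any branch whose lower bound on this value exceeds the best makespan found so far.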
Procedia PDF Downloads 283
16026 Adsorption of Xylene Cyanol FF onto Activated Carbon from Brachystegia Eurycoma Seed Hulls: Determination of the Optimal Conditions by Statistical Design of Experiments
Authors: F. G Okibe, C. E Gimba, V. O Ajibola, I. G Ndukwe, E. D. Paul
Abstract:
A full factorial experimental design technique at two levels and four factors (2⁴) was used to optimize the adsorption, measured at 615 nm, of Xylene Cyanol FF in aqueous solutions onto activated carbon prepared from Brachystegia eurycoma seed hulls by a chemical carbonization method. The effects of pH (3 and 5), initial dye concentration (20 and 60 mg/l), adsorbent dosage (0.01 and 0.05 g), and contact time (30 and 60 min) on the removal efficiency of the adsorbent for the dye were investigated at 298 K. From the analysis of variance, response surface, and cube plot, adsorbent dosage was observed to be the most significant factor affecting the adsorption process. From the interaction between the variables studied, the optimum removal efficiency of 96.80% was achieved with an adsorbent dosage of 0.05 g, a contact time of 45 minutes, pH 3, and an initial dye concentration of 60 mg/l.
Keywords: factorial experimental design, adsorption, optimization, brachystegia eurycoma, xylene cyanol ff
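A 2⁴ full factorial design and the main-effect contrast used to rank factors can be sketched as follows; the coded -1/+1 run matrix is standard, while the responses used in the check are invented for illustration only:

```python
from itertools import product

def full_factorial(n_factors):
    """All runs of a two-level full factorial design in coded units
    (-1 = low level, +1 = high level).  For 4 factors this yields the
    2^4 = 16 runs used in the study."""
    return list(product(*[[-1, 1]] * n_factors))

def main_effect(runs, responses, factor):
    """Main effect of one factor: mean response at its high level
    minus mean response at its low level."""
    hi = [y for r, y in zip(runs, responses) if r[factor] == 1]
    lo = [y for r, y in zip(runs, responses) if r[factor] == -1]
    return sum(hi) / len(hi) - sum(lo) / len(lo)

runs = full_factorial(4)
print(len(runs))  # → 16
```

In the study, the factor with the largest absolute main effect on removal efficiency (here, adsorbent dosage) is the most significant.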
Procedia PDF Downloads 400
16025 Evaluation of Sequential Polymer Flooding in Multi-Layered Heterogeneous Reservoir
Authors: Panupong Lohrattanarungrot, Falan Srisuriyachai
Abstract:
Polymer flooding is a well-known technique for controlling the mobility ratio in heterogeneous reservoirs, leading to improvement of sweep efficiency as well as of the wellbore profile. However, the low injectivity of viscous polymer solutions attenuates the oil recovery rate and adds extra operating cost. This study attempts to improve the injectivity of the polymer solution while maintaining the recovery factor, enhancing the effectiveness of the polymer flooding method. The study is performed using a reservoir simulation program to modify a conventional single polymer slug into sequential polymer flooding, with emphasis on increasing injectivity and reducing the amount of polymer. Operating conditions for the single polymer slug, including pre-injected water, polymer concentration, and polymer slug size, are first selected for a layered heterogeneous reservoir with a Lorenz coefficient (Lk) of 0.32. The selected single-slug polymer flooding scheme is then modified into sequential polymer flooding with reduction of polymer concentration in two different modes: constant polymer mass and reduced polymer mass. The effect of the Residual Resistance Factor (RRF) is also evaluated. From the simulation results, it is observed that the first polymer slug, with the highest concentration, mainly acts as a buffer between the displacing phase and the reservoir oil. Moreover, part of the polymer from this slug is sacrificed to adsorption. Reduction of polymer concentration in the following slugs prevents bypassing due to an unfavorable mobility ratio. At the same time, the following slugs, with lower viscosity, can be injected more easily through the formation, improving the injectivity of the whole process. Sequential polymer flooding with reduced polymer mass shows a great benefit, reducing total production time and the amount of polymer consumed by up to 10% without any downside effect.
The only advantage of using constant polymer mass is a slight increment in recovery factor (up to 1.4%), while total production time is almost the same. Increasing the residual resistance factor of the polymer solution benefits mobility control by reducing the effective permeability to water. Nevertheless, higher adsorption results in low injectivity, extending total production time. Modifying a single polymer slug into a sequence of reduced polymer concentrations yields major benefits in reducing production time as well as polymer mass. With a suitable design of the polymer flooding scheme, the recovery factor can even be further increased. This study shows that sequential polymer flooding can certainly be applied to reservoirs with high heterogeneity, since real implementation requires nothing complex, just a proper design of polymer slug sizes and concentrations.
Keywords: polymer flooding, sequential, heterogeneous reservoir, residual resistance factor
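The quantity being controlled throughout is the water/oil mobility ratio. A minimal sketch of the standard definition, with illustrative relative permeabilities and viscosities that are not taken from the study:

```python
def mobility_ratio(k_rw, mu_w, k_ro, mu_o):
    """Water/oil mobility ratio M = (k_rw / mu_w) / (k_ro / mu_o).
    Polymer flooding lowers M by raising the displacing-phase
    viscosity mu_w; M <= 1 is generally considered favorable.
    All arguments below are illustrative values (viscosities in cP)."""
    return (k_rw / mu_w) / (k_ro / mu_o)

# raising water viscosity from 1 cP to 10 cP with polymer:
print(mobility_ratio(0.3, 1.0, 0.8, 5.0))   # unfavorable, ~1.9
print(mobility_ratio(0.3, 10.0, 0.8, 5.0))  # favorable, ~0.19
```

In the sequential scheme described above, each successive slug has a lower concentration, hence a lower mu_w, trading a slightly higher M for easier injection.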
Procedia PDF Downloads 478
16024 Challenge Response-Based Authentication for a Mobile Voting System
Authors: Tohari Ahmad, Hudan Studiawan, Iwang Aryadinata, Royyana M. Ijtihadie, Waskitho Wibisono
Abstract:
Manual voting systems have been implemented worldwide. They have weaknesses which may decrease the legitimacy of the voting result. Electronic voting systems have been introduced to minimize these weaknesses, and have been able to provide better results in terms of the total time taken in the voting process and accuracy. Nevertheless, people may be reluctant to go to the polling location for reasons such as distance and time. To solve this problem, mobile voting is implemented on mobile devices. Many mobile voting architectures are available; overall, user authenticity is the common problem of all voting systems. There must be a mechanism which can verify the users' authenticity such that only verified users can cast their vote, and only once. In this paper, a challenge response-based authentication is proposed that utilizes properties of the users, for example, something they have and something they know. In terms of speed, the proposed system provides good results, in addition to the other capabilities offered by the system.
Keywords: authentication, data protection, mobile voting, security
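A generic challenge-response exchange of the kind described can be sketched with an HMAC over a server-issued nonce; HMAC-SHA-256 stands in here for whatever primitive the proposed system actually uses, and the key is a placeholder:

```python
import hashlib
import hmac
import secrets

def make_challenge():
    """Server side: issue a fresh random nonce to the voter's device,
    so a captured response cannot be replayed."""
    return secrets.token_bytes(16)

def respond(shared_key, challenge):
    """Client side: prove knowledge of the shared secret ("something
    you know/have") without transmitting it, by returning an HMAC of
    the challenge."""
    return hmac.new(shared_key, challenge, hashlib.sha256).digest()

def verify(shared_key, challenge, response):
    """Server side: recompute the expected response and compare in
    constant time."""
    return hmac.compare_digest(respond(shared_key, challenge), response)

key = b"voter-registration-secret"  # hypothetical enrolled secret
nonce = make_challenge()
print(verify(key, nonce, respond(key, nonce)))           # → True
print(verify(b"wrong-key", nonce, respond(key, nonce)))  # → False
```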
Procedia PDF Downloads 419
16023 Experimental Study on Two-Step Pyrolysis of Automotive Shredder Residue
Authors: Letizia Marchetti, Federica Annunzi, Federico Fiorini, Cristiano Nicolella
Abstract:
Automotive shredder residue (ASR) is a mixture of waste that makes up 20-25% of end-of-life vehicles. For many years, ASR was commonly disposed of in landfills or incinerated, causing serious environmental problems. Nowadays, thermochemical treatments are a promising alternative, although the heterogeneity of ASR still poses some challenges. One of the emerging thermochemical treatments for ASR is pyrolysis, which promotes the decomposition of long polymeric chains by providing heat in the absence of an oxidizing agent. In this way, pyrolysis converts ASR into solid, liquid, and gaseous phases. This work aims to improve the performance of a two-step pyrolysis process. After characterization of the analysed ASR, the focus is on determining the effects of residence time on product yields and gas composition. A batch experimental setup that reproduces the entire process was used. The setup consists of three sections: the pyrolysis section (made of two reactors), the separation section, and the analysis section. Two different residence times were investigated to find suitable conditions for the first sample of ASR. These first tests showed that the products obtained were more sensitive to the residence time in the second reactor. Indeed, slightly increasing the residence time in the second reactor raised the yields of gas and carbon residue and decreased the yield of the liquid fraction. Then, to test the versatility of the setup, the same conditions were applied to a different sample of ASR coming from a different chemical plant. The comparison between the two ASR samples shows that similar product yields and compositions are obtained using the same setup.
Keywords: automotive shredder residue, experimental tests, heterogeneity, product yields, two-step pyrolysis
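Product yields from such runs follow from a simple mass balance over the gas, liquid, and solid fractions; a minimal sketch with masses invented for illustration:

```python
def pyrolysis_yields(feed_mass, gas_mass, liquid_mass, solid_mass):
    """Product yields (wt% of feed) from a pyrolysis mass balance.
    Any gap between the recovered fractions and the feed is reported
    as losses.  All masses (in grams) are illustrative, not data from
    the study."""
    yields = {
        "gas": 100 * gas_mass / feed_mass,
        "liquid": 100 * liquid_mass / feed_mass,
        "solid": 100 * solid_mass / feed_mass,
    }
    yields["losses"] = 100 - sum(yields.values())
    return yields

# e.g. 50 g of ASR feed yielding 12 g gas, 20 g liquid, 17 g char
print(pyrolysis_yields(50.0, 12.0, 20.0, 17.0))
```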
Procedia PDF Downloads 127
16022 Creating Risk Maps on the Spatiotemporal Occurrence of Agricultural Insecticides in Sub-Saharan Africa
Authors: Chantal Hendriks, Harry Gibson, Anna Trett, Penny Hancock, Catherine Moyes
Abstract:
The use of modern inputs for crop protection, such as insecticides, is strongly underestimated in Sub-Saharan Africa. Several studies have measured toxic concentrations of insecticides in fruits, vegetables, and fish cultivated in Sub-Saharan Africa. The use of agricultural insecticides has an impact on human and environmental health, but it also has the potential to influence insecticide resistance in malaria-transmitting mosquitoes. To analyse associations between the historic use of agricultural insecticides and the distribution of insecticide resistance through space and time, the use and environmental fate of agricultural insecticides need to be mapped over the same time period. However, data on the use and environmental fate of agricultural insecticides in Africa are limited, and therefore risk maps of the spatiotemporal occurrence of agricultural insecticides are created using environmental data. Environmental data on crop density and crop type were used to select the areas that most likely receive insecticides. These areas were verified by a literature review and expert knowledge. Pesticide fate models were compared to select the most dominant processes involved in the environmental fate of insecticides that can be mapped at a continental scale. The selected processes include surface runoff, erosion, infiltration, volatilization, and the storing and filtering capacity of soils. These processes indicate the risk of insecticide accumulation in soil, water, sediment, and air. A compilation of all available data for traces of insecticides in the environment was used to validate the maps. The risk maps can lead to space- and time-specific measures that reduce the risk of insecticide exposure to non-target organisms.
Keywords: crop protection, pesticide fate, tropics, insecticide resistance
Procedia PDF Downloads 141
16021 An Audit of Climate Change and Sustainability Teaching in Medical School
Authors: Karolina Wieczorek, Zofia Przypaśniak
Abstract:
Climate change is a rapidly growing threat to global health, and part of the responsibility to combat it lies within the healthcare sector itself, including adequate education of future medical professionals. To mitigate the consequences, the General Medical Council (GMC) has equipped medical schools with a list of outcomes regarding sustainability teaching. Students are expected to analyze the impact of the healthcare sector's emissions on climate change. The delivery of the related teaching content is, however, often inadequate, and insufficient time is devoted to exploration of the topics; teaching curricula lack in-depth exploration of the learning objectives. This study assesses the extent and characteristics of climate change and sustainability teaching in the curriculum of a chosen UK medical school (Barts and The London School of Medicine and Dentistry). It compares the data to the national average scores from the Climate Change and Sustainability Teaching (C.A.S.T.) in Medical Education Audit to draw conclusions about teaching on a regional level. This is a single-centre audit of the timetabled teaching sessions in the medical course. The study looked at the academic year 2020/2021 and included a review of all non-elective, core curriculum teaching materials, including tutorials, lectures, written resources, and assignments, across all five years of the undergraduate and graduate degrees, focusing only on mandatory teaching attended by all students (excluding elective modules). The topics covered were cross-checked against the GMC outcomes for graduates, "Educating for Sustainable Healthcare – Priority Learning Outcomes", as the gold standard, looking for coverage of the outcomes and gaps in teaching. Quantitative data was collected in the form of time allocated to teaching, as a proxy for time spent per individual outcome.
The data was collected independently by two students (KW and ZP), who received prior training and assessed two separate data sets to increase inter-rater reliability. In terms of coverage of learning outcomes, 12 out of 13 were taught (against a national average of 9.7). The school ranked sixth in the UK for time spent per topic and second in terms of overall coverage, meaning the school teaches a broad range of topics, with some explored in more detail than others. For the first outcome, 4 out of 4 objectives were covered (average 3.5), with 47 minutes spent per outcome (average 84 min); for the second, 5 out of 5 were covered (average 3.5), with 46 minutes spent (average 20 min); for the third, 3 out of 4 were covered (average 2.5), with 10 minutes spent (average 19 min). A disproportionately large amount of time is spent delivering teaching on air pollution (respiratory illnesses), with the result that sustainability in other specialties (musculoskeletal, ophthalmology, paediatrics, renal) was excluded from teaching. Conclusions: currently, there is no coherent national strategy for teaching climate change topics, and as a result the amount of time spent on teaching and the coverage of objectives are unstandardized.
Keywords: audit, climate change, sustainability, education
Procedia PDF Downloads 86
16020 Integrated Model for Enhancing Data Security Processing Time in Cloud Computing
Authors: Amani A. Saad, Ahmed A. El-Farag, El-Sayed A. Helali
Abstract:
Cloud computing is an important and promising field of the recent decade. Cloud computing allows the sharing of resources, services, and information among people across the whole world. Although the advantages of using clouds are great, there are many risks in a cloud. Data security is the most important and critical problem of cloud computing. In this research, a new security model for cloud computing is proposed to ensure a secure communication system, hide information from other users, and save the user's time. In the proposed model, the Blowfish encryption algorithm is used for exchanging information or data, and the SHA-2 cryptographic hash algorithm is used for data integrity. For the user authentication process, a simple username and password scheme is used; the password is protected by SHA-2 as a one-way hash. The proposed system shows an improvement in the processing time of uploading and downloading files on the cloud in secure form.
Keywords: cloud computing, data security, SaaS, PaaS, IaaS, Blowfish
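The integrity half of such a scheme can be sketched with Python's standard hashlib (SHA-256 is one of the SHA-2 family); the Blowfish encryption of the payload would require a third-party cipher library and is deliberately omitted from this sketch:

```python
import hashlib

def file_digest(data: bytes) -> str:
    """SHA-256 (a SHA-2 variant) digest used as an integrity tag,
    computed before upload and stored alongside the file."""
    return hashlib.sha256(data).hexdigest()

def verify_integrity(data: bytes, expected_digest: str) -> bool:
    """Re-hash on download and compare with the stored tag; any
    modification of the data changes the digest."""
    return file_digest(data) == expected_digest

tag = file_digest(b"report.pdf contents")
print(verify_integrity(b"report.pdf contents", tag))  # → True
print(verify_integrity(b"tampered contents", tag))    # → False
```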
Procedia PDF Downloads 359
16019 Optimization of Extraction Conditions for Phenolic Compounds from Deverra Scoparia Coss and Dur
Authors: Roukia Hammoudi, Chabrouk Farid, Dehak Karima, Mahfoud Hadj Mahammed, Mohamed Didi Ouldelhadj
Abstract:
The objective of this study was to optimise the extraction conditions for phenolic compounds from Deverra scoparia Coss. and Dur. (Apiaceae) by ultrasound-assisted extraction (UAE). The effects of solvent type (acetone, ethanol, and methanol), solvent concentration (%), extraction time (min), and extraction temperature (°C) on total phenolic content (TPC) were determined. The optimum extraction conditions were found to be an acetone concentration of 80%, an extraction time of 25 min, and an extraction temperature of 25 °C. Under the optimized conditions, the value for TPC was 9.68 ± 1.05 mg GAE/g of extract. The antioxidant power of the extracts was studied by the DPPH method. The results showed that the antioxidant activity of the Deverra scoparia essential oil was more effective compared to ascorbic acid and trolox.
Keywords: Deverra scoparia, phenolic compounds, ultrasound assisted extraction, total phenolic content, antioxidant activity
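The DPPH assay reduces to a percent-inhibition calculation from control and sample absorbances (typically read at 517 nm); a minimal sketch with invented absorbance values:

```python
def dpph_inhibition(a_control, a_sample):
    """Radical-scavenging activity from DPPH absorbances:
    %inhibition = (A_control - A_sample) / A_control * 100.
    The absorbance values used below are invented for illustration."""
    return 100.0 * (a_control - a_sample) / a_control

print(round(dpph_inhibition(0.80, 0.20), 1))  # → 75.0
```

Comparing the %inhibition (or the derived IC50) of the extract with that of ascorbic acid and trolox gives the ranking reported in the abstract.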
Procedia PDF Downloads 603
16017 Model Predictive Control Applied to Thermal Regulation of Thermoforming Process Based on the ARMAX Linear Model and a Quadratic Criterion Formulation
Authors: Moaine Jebara, Lionel Boillereaux, Sofiane Belhabib, Michel Havet, Alain Sarda, Pierre Mousseau, Rémi Deterre
Abstract:
Energy consumption efficiency is a major concern for the material processing industry, including thermoforming and molding processes. These systems should deliver the right amount of energy at the right time to the processed material. Recent technical developments, as well as the particular dynamics of heating systems, have made Model Predictive Control (MPC) one of the best candidates for thermal control of several production processes, such as molding and composite thermoforming. The main principle of this technique is to use a dynamic model of the process inside the controller in real time in order to anticipate the future behavior of the process, which allows the current timeslot to be optimized while taking future timeslots into account. This study presents a procedure based on predictive control that balances optimality, simplicity, and flexibility of implementation. The development of this approach is progressive, starting from the case of a single zone before extension to the multi-zone and/or multi-source case, thus taking into account the thermal couplings between adjacent zones. After a quadratic formulation of the MPC criterion to ensure the thermal control, the linear expression is retained in order to reduce calculation time thanks to the use of the ARMAX linear decomposition methods. The effectiveness of this approach is illustrated by experiment and simulation.
Keywords: energy efficiency, linear decomposition methods, model predictive control, mold heating systems
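The idea of minimizing a quadratic criterion over a process model admits a closed form in the simplest case. A toy one-step, single-zone sketch with a scalar ARX model (the paper's multi-zone ARMAX formulation is far richer; every number below is hypothetical):

```python
def mpc_one_step(a, b, y_now, setpoint, weight):
    """One-step predictive control for a scalar ARX model
    y[k+1] = a*y[k] + b*u[k]: minimize the quadratic criterion
    J = (y[k+1] - r)^2 + weight * u^2.  Setting dJ/du = 0 gives the
    closed-form minimizer u* = b*(r - a*y[k]) / (b^2 + weight)."""
    return b * (setpoint - a * y_now) / (b * b + weight)

# drive a toy mold zone (a = 0.9 natural cooling) toward 200 °C
a, b, weight, setpoint = 0.9, 0.5, 0.01, 200.0
y = 150.0
for _ in range(30):
    u = mpc_one_step(a, b, y, setpoint, weight)
    y = a * y + b * u  # plant assumed to match the model exactly
print(round(y, 1))  # settles just below 200 because of the u^2 penalty
```

Extending the horizon beyond one step, and coupling several zones, turns this closed form into the banded least-squares problem that the ARMAX decomposition keeps linear.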
Procedia PDF Downloads 272
16016 Pilomatrixoma of the Left Infra-Orbital Region in a 9 Year Old
Authors: Zainab Shaikh, Yusuf Miyanji
Abstract:
Pilomatrixoma is a benign neoplasm of the hair follicle matrix that is not commonly diagnosed in general practice. This is a case report of a 9-year-old boy who presented with a one-year history of a 19 mm × 11 mm swelling in the left infra-orbital region. This had previously gone undiagnosed in Spain, where the patient resided at the time of initial presentation, due to the language barrier the patient's family encountered. Ultrasound and magnetic resonance imaging gave useful information regarding surrounding structures for complete tumor excision and indicated that the risk of facial nerve palsy was low. The lesion was surgically excised, and a definitive diagnosis was made after histopathology. Pilomatrixoma, although not rare in its occurrence, is rarely this large at the time of excision, owing to typically early presentation. This case highlights the importance of including pilomatrixoma in the differential diagnosis of dermal and subcutaneous lesions in the head and neck region, as it is often misdiagnosed due to a lack of awareness of its clinical presentation.
Keywords: pilomatrixoma, swelling, infra-orbital, facial swelling
Procedia PDF Downloads 146
16015 Effect of Alkaline Activator, Water, Superplasticiser and Slag Contents on the Compressive Strength and Workability of Slag-Fly Ash Based Geopolymer Mortar Cured under Ambient Temperature
Authors: M. Al-Majidi, A. Lampropoulos, A. Cundy
Abstract:
Geopolymer (cement-free) concrete is the most promising green alternative to ordinary Portland cement concrete and other cementitious materials. While a range of different geopolymer concretes have been produced, a common feature of these concretes is a heat curing treatment, which is essential in order to provide sufficient mechanical properties at an early age. However, there are several practical issues with the application of heat curing in large-scale structures. The purpose of this study is to develop cement-free concrete without heat curing treatment. Experimental investigations were carried out in two phases. In the first phase (Phase A), the optimum contents of water, polycarboxylate-based superplasticizer, and potassium silicate activator in the mix were determined. In the second phase (Phase B), the effect of ground granulated blast furnace slag (GGBFS) incorporation on the compressive strength of fly ash (FA) and slag-based geopolymer mixtures was evaluated. Setting time and workability tests were also conducted alongside the compressive tests. The results showed that as the slag content was increased, the setting time was reduced while the compressive strength was improved. The obtained compressive strength was in the range of 40-50 MPa for the 50% slag replacement mixtures. Furthermore, the results indicated that increasing the water and superplasticizer content retarded the setting time and slightly reduced the compressive strength. The compressive strength of the examined mixes increased considerably as the potassium silicate content was increased.
Keywords: fly ash, geopolymer, potassium silicate, slag
Procedia PDF Downloads 223
16014 Forecasting Container Throughput: Using Aggregate or Terminal-Specific Data?
Authors: Gu Pang, Bartosz Gebka
Abstract:
We forecast the demand for total container throughput at Indonesia's largest seaport, Tanjung Priok Port. We propose four univariate forecasting models: SARIMA, the additive Seasonal Holt-Winters, the multiplicative Seasonal Holt-Winters, and the Vector Error Correction Model. Our aim is to provide insights into whether forecasting the total container throughput from the historical aggregated port throughput time series is superior to forecasts of the total throughput obtained by summing up the best individual terminal forecasts. We test the monthly port and individual terminal container throughput time series between 2003 and 2013. The performance of the forecasting models is evaluated based on Mean Absolute Error and Root Mean Squared Error. Our results show that the multiplicative Seasonal Holt-Winters model produces the most accurate forecasts of total container throughput, whereas SARIMA generates the worst in-sample model fit. The Vector Error Correction Model provides the best model fits and forecasts for individual terminals. Our results show that the total container throughput forecasts based on modelling the total throughput time series are consistently better than those obtained by combining the forecasts generated by terminal-specific models. The forecasts of total throughput until the end of 2018 provide an essential insight for strategic decision-making on the expansion of the port's capacity and the construction of new container terminals at Tanjung Priok Port.
Keywords: SARIMA, Seasonal Holt-Winters, Vector Error Correction Model, container throughput
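The two error measures used for model comparison are straightforward to compute; a minimal sketch with invented monthly throughput figures:

```python
import math

def mae(actual, forecast):
    """Mean Absolute Error between observed and forecast values."""
    return sum(abs(a - f) for a, f in zip(actual, forecast)) / len(actual)

def rmse(actual, forecast):
    """Root Mean Squared Error; penalizes large misses more than MAE."""
    return math.sqrt(
        sum((a - f) ** 2 for a, f in zip(actual, forecast)) / len(actual)
    )

# hypothetical monthly throughput (in thousand TEU) vs. model forecasts
actual = [410, 425, 440, 455]
forecast = [400, 430, 445, 450]
print(mae(actual, forecast), round(rmse(actual, forecast), 2))  # → 6.25 6.61
```

Whichever model minimizes these errors on the hold-out months is preferred, which is how the multiplicative Holt-Winters model is selected in the study.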
Procedia PDF Downloads 504
16013 The Impact of Inconclusive Results of Thin Layer Chromatography for Marijuana Analysis and Its Implication on Forensic Laboratory Backlog
Authors: Ana Flavia Belchior De Andrade
Abstract:
Forensic laboratories all over the world face a great challenge to overcome waiting times and backlogs in many different areas. Many aspects contribute to this situation, such as increases in drug complexity, growth in the number of exams requested, and funding cuts that limit laboratories' hiring capacity. Altogether, those facts pose a serious challenge for forensic chemistry laboratories to keep both quality and time of response within an acceptable range. In this paper, we analyze how the backlog affects test results and, in the end, the whole judicial system. In this study, data from marijuana samples seized by the Federal District Civil Police in Brazil between the years 2013 and 2017 were tabulated and the results analyzed and discussed. In the last five years, the number of petitioned exams increased from 822 in February 2013 to 1358 in March 2018, representing an increase of 32% in 5 years, a rise of more than 6% per year. Meanwhile, our data show that the number of performed exams did not grow at the same rate. Output has plateaued: with the current technology and analysis routine, the laboratory is running at full capacity. Marijuana detection is the most frequently required exam, representing almost 70% of all exams. In this study, data from 7,110 (seven thousand one hundred and ten) marijuana samples were analyzed. Regarding waiting time, most of the exams were performed no later than 60 days after receipt (77%), although some samples waited up to 30 months before being examined (0.65%). When a marijuana exam is delayed, we notice an enlargement of the share of inconclusive results using thin-layer chromatography (TLC). Our data show that if a marijuana sample is stored for more than 18 months, inconclusive results rise from 2% to 7%, and when storage exceeds 30 months, inconclusive rates increase to 13%.
This is probably because Cannabis plants and preparations undergo oxidation during storage, resulting in a decrease in the content of Δ9-tetrahydrocannabinol (Δ9-THC). An inconclusive result triggers other procedures that require at least two more working hours from our analysts (e.g., GC/MS analysis), and the report is delayed by at least one day. These new procedures considerably increase the running cost of a forensic drug laboratory, especially when the backlog is significant, as inconclusive results tend to increase with waiting time. Financial aspects are not the only ones to be observed regarding backlogged cases; there are also social issues, as legal procedures can be delayed and prosecution of serious crimes can be unsuccessful. Delays may slow investigations and endanger public safety by giving criminals more time on the street to re-offend. This situation also implies a considerable cost to society: at some point, if the exam takes too long to be performed, an inconclusive result can turn into a negative one, and a criminal can be absolved by flawed expert evidence.
Keywords: backlog, forensic laboratory, quality management, accreditation
Procedia PDF Downloads 122
16012 Development of Hit Marks on Clothes Using Amino Acid Reagents
Authors: Hyo-Su Lim, Ye-Eun Song, Eun-Bi Lee, Sang-Yoon Lee, Young-Il Seo, Jin-Pyo Kim, Nam-Kyu Park
Abstract:
If the physical external force applied to victims in crimes such as assault can be inferred, it becomes possible not only to presume the interaction between victims and suspects, but also to deduce various other facts about the case. Therefore, the aim of this study is to identify criminal tools through secretions on clothes by using amino acid reagents such as ninhydrin, DFO (1,8-diazafluoren-9-one) and 1,2-IND (1,2-indanedione), which react with skin secretions. For more effective collecting conditions, porcine skin, which is physiologically similar to human skin, was used. Although there were small differences in shape identification according to sensitivity, the amino acid reagents were able to identify the fist, foot, and baseball bat. Furthermore, we examined developmental variation over time, setting up a 5-week period after the initial damage as the variation factor and developing the marks of each action with each reagent. The level of development of the specimens over time was identified. As a result, the initial level of development showed no changes over time.
Keywords: hit marks, amino acid reagents, porcine skin, criminal tool
Procedia PDF Downloads 263
16011 A Measurement and Motor Control System for Free Throw Shots in Basketball Using Gyroscope Sensor
Authors: Niloofar Zebarjad
Abstract:
This research aims at providing basketball players with a tool for real-time audio feedback on their shooting form in free throw shots. Free throws play a pivotal role in taking the lead in fierce competitions. The major problem in performing an accurate free throw seems to be improper training. Since the arm movement during the free throw shot is complex, the coach or the athlete might miss the movement details during practice. Hence, there is a need for a system that measures the critical characteristics of arm movement and controls for improper kinematics. The proposed setup quantifies arm kinematics and provides real-time feedback as an audio signal, built around a gyroscope sensor. Spatial shoulder-angle data are transmitted to a mobile application in real time and can be saved and processed for statistical analysis. The proposed system is easy to use, inexpensive, portable, and applicable in real time. Objectives: This research aims to modify and control the free throw using audio feedback, to determine whether, and to what extent, the new setup reduces errors in arm formation during throws, and finally to assess the successful throw rate. Methods: One group of elite basketball athletes and two groups of novice athletes (control and study) participated in this study. Each group contains 5 participants, studied in three separate sessions over a week. Results: Empirical results showed enhancements in free throw shooting style, shot pocket (SP), and locked position (LP). The mean values of the shoulder angle were controlled at 25° and 45° for SP and LP, respectively, as recommended by valid FIBA references. Conclusion: Throughout the experiments, the system helped correct and control the shoulder angles toward the targeted pattern of shot pocket (SP) and locked position (LP).
Given the desired results for arm motion, adding another sensor to measure and control the elbow angle is recommended.
Keywords: audio feedback, basketball, free throw, locked position, motor control, shot pocket
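The feedback rule described above can be sketched in a few lines. This is a hypothetical illustration, not the authors' implementation: the function names and the 5° tolerance are assumptions, while the 25° and 45° targets are the FIBA-recommended values quoted in the abstract.

```python
# Hypothetical sketch: compare a measured shoulder angle (from a
# gyroscope) against the targets of 25 deg (shot pocket) and 45 deg
# (locked position) and emit an audio cue when the error exceeds a
# tolerance. The 5 deg tolerance is an illustrative assumption.
TARGETS = {"shot_pocket": 25.0, "locked_position": 45.0}
TOLERANCE_DEG = 5.0  # assumed acceptable deviation

def feedback(phase: str, measured_angle_deg: float) -> str:
    """Return the audio cue for one sampled shoulder angle."""
    error = measured_angle_deg - TARGETS[phase]
    if abs(error) <= TOLERANCE_DEG:
        return "ok"
    return "raise arm" if error < 0 else "lower arm"
```

A real-time system would run this check on every gyroscope sample and map the returned string to a distinct audio tone.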
Procedia PDF Downloads 295
16010 The Relationships between Carbon Dioxide (CO2) Emissions, Energy Consumption and GDP per Capita for Oman: Time Series Analysis, 1980–2010
Authors: Jinhoa Lee
Abstract:
The relationships between environmental quality, energy use and economic output have attracted growing attention over the past decades among researchers and policy makers. Focusing on the empirical aspects of the role of CO2 emissions and energy use in affecting economic output, this paper is an effort to fill the gap of a comprehensive country-level case study using modern econometric techniques. To achieve this goal, this country-specific study examines the short-run and long-run relationships among energy consumption, carbon dioxide (CO2) emissions and gross domestic product (GDP) for Oman using time series analysis for the years 1980–2010. To investigate the relationships between the variables, this paper employs the Augmented Dickey-Fuller (ADF) test for stationarity, the Johansen maximum likelihood method for co-integration, and a Vector Error Correction Model (VECM) for both short- and long-run causality among the research variables for the sample. All the variables in this study show very strong significant effects on GDP in the country in the long term. The long-run equilibrium in the VECM suggests positive long-run causality from CO2 emissions to GDP. Conversely, negative impacts of energy consumption on GDP are found to be significant in Oman during the period. In the short run, there exist negative unidirectional causalities among GDP, CO2 emissions and energy consumption, running from GDP to CO2 emissions and from energy consumption to CO2 emissions. Overall, the results support arguments that there are relationships among environmental quality, energy use and economic output in Oman over the period 1980–2010.
Keywords: CO2 emissions, energy consumption, GDP, Oman, time series analysis
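The error-correction idea behind the VECM can be illustrated in miniature. The paper uses the full ADF/Johansen/VECM machinery; the sketch below shows only the simpler two-step (Engle-Granger-style) version with plain NumPy: regress the levels, then regress the differences on the lagged residual, whose coefficient measures how fast deviations from the long-run relation are corrected. All data here are simulated, not Oman's series.

```python
import numpy as np

# Two-step error-correction sketch on synthetic cointegrated data.
rng = np.random.default_rng(0)
n = 500
x = np.cumsum(rng.normal(size=n))          # I(1) driver, e.g. energy use
y = 2.0 * x + rng.normal(size=n)           # cointegrated with x

# Step 1: long-run (levels) regression  y_t = a + b*x_t + u_t
A = np.column_stack([np.ones(n), x])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
resid = y - A @ coef                       # estimated disequilibrium u_t

# Step 2: short-run regression  dy_t = c + gamma*u_{t-1} + d*dx_t + e_t
dy, dx, u_lag = np.diff(y), np.diff(x), resid[:-1]
B = np.column_stack([np.ones(n - 1), u_lag, dx])
c, gamma, d = np.linalg.lstsq(B, dy, rcond=None)[0]
# gamma < 0: deviations from the long-run relation are corrected over time
```

In the full VECM the same adjustment coefficients appear for every variable, which is what supports the short- and long-run causality statements in the abstract.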
Procedia PDF Downloads 462
16009 Thermal Annealing Effects on Nonradiative Recombination Parameters of GaInAsSb/GaSb by Means of the Photothermal Deflection Technique
Authors: Souha Bouagila, Soufiene Ilahi, Noureddine Yacoubi
Abstract:
We have used photothermal deflection spectroscopy (PTD) to investigate the impact of thermal annealing on the electronic properties of GaInAsSb/GaSb. GaInAsSb is used as an active layer for vertical cavity surface emitting lasers (VCSELs). We have observed that the surface recombination velocity (SRV) decreases from 7963 m/s (±6.3%) to 1450 m/s (±3.6%) from the as-grown sample to the sample annealed for 60 min. Accordingly, force microscopy image analyses agree well with the measured surface recombination velocity: we have found that the root-mean-square roughness (RMS) decreases with annealing time. In addition, we have found that the diffusion length and minority carrier mobility are enhanced with annealing time. However, due to annealing effects, the interface recombination velocity (IRV) increases from 1196 m/s (±5%) to 6000 m/s (±5%) for GaInAsSb with annealing time.
Keywords: nonradiative lifetime, mobility of minority carriers, diffusion length, surface and interface recombination velocity
Procedia PDF Downloads 74
16008 Effects of Acacia Honey Drink Ingestion during Rehydration after Exercise Compared to Sports Drink on Physiological Parameters and Subsequent Running Performance in the Heat
Authors: Foong Kiew Ooi, Aidi Naim Mohamad Samsani, Chee Keong Chen, Mohamed Saat Ismail
Abstract:
Introduction: Prolonged exercise in a hot and humid environment can result in glycogen depletion and is associated with loss of body fluid. Carbohydrate contained in sports beverages is beneficial for improving sports performance and preventing dehydration. Carbohydrate contained in honey is believed to serve as an alternative form of carbohydrate for enhancing sports performance. Objective: To investigate the effectiveness of a honey drink compared to a sports drink as a recovery aid for running performance and physiological parameters in the heat. Method: Ten male recreational athletes (age: 22.2 ± 2.0 years, VO2max: 51.5 ± 3.7 ml.kg-1.min-1) participated in this randomized cross-over study. In each trial, participants were required to run for 1 hour in the glycogen depletion phase (Run-1), followed by a rehydration phase of 2 hours and subsequently a 20-minute time trial performance (Run-2). During Run-1, subjects were required to run on a treadmill in the heat (31°C) at 70% relative humidity, at 70% of their VO2max. During the rehydration phase, participants drank either the honey drink, the sports drink, or plain water, in an amount equivalent to 150% of body weight loss, dispensed at intervals (60%, 50% and 40% of the loss) at 0 min, 30 min and 60 min, respectively. Subsequently, participants performed a 20-minute time trial, and the distance covered was recorded. Physiological parameters were analysed using two-way ANOVA with repeated measures, and time trial performance was analysed using one-way ANOVA. Results: Results showed that Acacia honey elicited a better time trial performance, with a significantly longer distance covered compared to the water trial (P<0.05). However, there was no significant difference between the Acacia honey and sports drink trials (P>0.05). The Acacia honey and sports drink trials elicited 249 m (8.24%) and 211 m (6.79%) longer distances, respectively, compared to the water trial.
For physiological parameters, plasma glucose, plasma insulin and plasma free fatty acids in the Acacia honey and sports drink trials were significantly higher compared to the water trial during both the rehydration phase and the time trial running performance phase. There were no significant differences in body weight changes, oxygen uptake, hematocrit, plasma volume changes or plasma cortisol among the trials. Conclusion: Acacia honey elicited the greatest beneficial effects on sports performance among the drinks; thus it has the potential to be used for rehydration in athletes who train and compete in hot environments.
Keywords: honey drink, rehydration, sports performance, plasma glucose, plasma insulin, plasma cortisol
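The rehydration schedule above (150% of body-mass loss, split 60%/50%/40% at 0, 30 and 60 min) is simple arithmetic and can be made concrete. The function name and the 1 kg ≈ 1000 mL fluid equivalence are illustrative assumptions consistent with standard practice, not details from the paper.

```python
# Sketch of the drink-volume schedule: total volume equals 150% of the
# exercise body-mass loss, dispensed as 60%, 50% and 40% of the loss at
# 0, 30 and 60 min. Assumes 1 kg of mass loss ~ 1000 mL of fluid.
def rehydration_schedule(mass_loss_kg: float) -> dict:
    loss_ml = mass_loss_kg * 1000.0
    return {minute: loss_ml * frac
            for minute, frac in [(0, 0.60), (30, 0.50), (60, 0.40)]}

plan = rehydration_schedule(1.2)   # e.g. 1.2 kg lost during Run-1
total = sum(plan.values())         # 1800 mL = 150% of the 1200 mL loss
```
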
Procedia PDF Downloads 309
16007 Time-Domain Analysis Approaches of Soil-Structure Interaction: A Comparative Study
Authors: Abdelrahman Taha, Niloofar Malekghaini, Hamed Ebrahimian, Ramin Motamed
Abstract:
This paper compares the substructure and direct methods for soil-structure interaction (SSI) analysis in the time domain. In the substructure SSI method, the soil domain is replaced by a set of springs and dashpots, also referred to as the impedance function, derived through the study of the behavior of a massless rigid foundation. The impedance function is inherently frequency dependent, i.e., it varies as a function of the frequency content of the structural response. To use the frequency-dependent impedance function for time-domain SSI analysis, the impedance function is approximated at the fundamental frequency of the structure-soil system. To explore the potential limitations of the substructure modeling process, a two-dimensional reinforced concrete frame structure is modeled using substructure and direct methods in this study. The results show discrepancies between the simulated responses of the substructure and the direct approaches. To isolate the effects of higher modal responses, the same study is repeated using a harmonic input motion, in which a similar discrepancy is still observed between the substructure and direct approaches. It is concluded that the main source of discrepancy between the substructure and direct SSI approaches is likely attributable to the way the impedance functions are calculated, i.e., assuming a massless rigid foundation without considering the presence of the superstructure. Hence, a refined impedance function, considering the presence of the superstructure, shall be developed. This refined impedance function is expected to significantly improve the simulation accuracy of the substructure approach for structural systems whose behavior is dominated by the fundamental mode response.
Keywords: direct approach, impedance function, soil-structure interaction, substructure approach
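The substructure idea of replacing the soil with a frequency-independent spring and dashpot can be illustrated with a crude lumped model. Everything below (masses, stiffnesses, the series-spring simplification) is an illustrative assumption, not the paper's 2-D frame case study; it only shows the mechanics of a spring-dashpot "impedance" supporting a single-degree-of-freedom structure.

```python
# Crude lumped sketch of the substructure approach: the soil is a
# frequency-independent spring k_f and dashpot c_f (an impedance function
# approximated at one frequency) supporting a SDOF structure.
m, k = 1.0e3, 4.0e5                  # structure mass (kg), stiffness (N/m)
k_f, c_f = 8.0e5, 2.0e3              # soil spring (N/m), dashpot (N.s/m)
k_eff = 1.0 / (1.0 / k + 1.0 / k_f)  # structure and soil springs in series

dt, steps = 1.0e-4, 20000
u, v = 0.01, 0.0                     # initial displacement (m), velocity
for _ in range(steps):               # semi-implicit Euler time march
    a = -(k_eff * u + c_f * v) / m   # dashpot damps the coupled motion
    v += a * dt
    u += v * dt
# Free vibration decays (flexible-base response); note k_eff < k, i.e.
# the soil springs soften the system and lengthen its fundamental period.
```
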
Procedia PDF Downloads 118
16006 Performance Comparison of Microcontroller-Based Optimum Controller for Fruit Drying System
Authors: Umar Salisu
Abstract:
This research presents the development of a hot-air tomato drying system. To provide more efficient and continuous temperature control, a microcontroller-based optimal controller was developed. The system is based on a power control principle to achieve smooth power variations depending on a feedback temperature signal from the process. An LM35 temperature sensor and an LM399 differential comparator were used to measure the temperature. The mathematical model of the system was developed, and the optimal controller was designed, simulated, and compared with the PID controller's transient response. A controlled environment suitable for fruit drying is created within a closed chamber, in a three-step process. First, infrared light is used internally to preheat the fruit and speed the removal of its internal water content for fast drying. Second, hot air of a specified temperature is blown into the chamber to maintain the humidity below a specified level and to exhaust the humid air from the chamber. Third, the microcontroller disconnects power to the chamber once the moisture content of the fruit has been reduced to a minimum. Experiments were conducted with 1 kg of fresh tomatoes at three different temperatures (40, 50 and 60 °C) at a constant relative humidity of 30% RH. The results obtained indicate that the system significantly reduces the drying time without affecting the quality of the fruit. In the context of temperature control, the results showed that the response of the optimal controller has zero overshoot, whereas the PID controller response overshoots by about 30% of the set-point. Another performance metric used is the rise time: the optimal controller rose without any delay, while the PID controller was delayed by more than 50 s.
It can be argued that the optimal controller's performance is preferable to that of the PID controller, since it does not overshoot and it responds promptly.
Keywords: drying, microcontroller, optimum controller, PID controller
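The two comparison metrics above (overshoot and rise time) can be reproduced in miniature by simulating a feedback temperature loop. The plant model (a generic first-order thermal lag) and the deliberately underdamped PI gains below are illustrative assumptions, not the paper's chamber or tuning; the point is only how the metrics are extracted from a simulated response.

```python
# Simulate a first-order thermal plant under a PI loop (PID with the
# derivative term omitted for simplicity) and compute overshoot and
# rise time. All plant parameters and gains are illustrative.
def simulate_pi(kp, ki, setpoint=50.0, t_amb=25.0, tau=30.0,
                dt=0.05, t_end=300.0):
    temp, integ, traj, t = t_amb, 0.0, [], 0.0
    while t < t_end:
        err = setpoint - temp
        integ += err * dt
        u = kp * err + ki * integ                 # controller output
        temp += dt * (-(temp - t_amb) + u) / tau  # first-order plant
        traj.append((t, temp))
        t += dt
    return traj

traj = simulate_pi(kp=2.0, ki=0.5)                # underdamped tuning
temps = [T for _, T in traj]
overshoot_pct = (max(temps) - 50.0) / (50.0 - 25.0) * 100.0
rise_time = next(t for t, T in traj if T >= 50.0) # first set-point crossing
```

A "zero overshoot" controller would show `max(temps)` at or below the set-point under the same metrics.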
Procedia PDF Downloads 301
16005 The Effect on Lead Times When Normalizing a Supply Chain Process
Authors: Bassam Istanbouli
Abstract:
Organizations operate in a very competitive, dynamic, and constantly changing environment. In order to achieve a high level of service, the products and processes of these organizations need to be flexible and evolvable. If supply chains are not modular and well designed, changes can bring combinatorial effects to most areas of a company, from its management, finances, and documentation to its logistics and information structure. Applying the normalized systems concept to segments of the supply chain may help reduce those ripple effects, but it may also increase lead times. Lead times are important and can become a decisive element in gaining customers. Industries are always under pressure to provide good-quality products, at competitive prices, when and how the customer wants them. Most of the time, customers want their orders now, if not yesterday. The above concept will be demonstrated by examining lead times in a manufacturing example before and after applying the normalized systems concept to that segment of the chain. We then show that although we can minimize the combinatorial effects when changes occur, the lead times will be increased.
Keywords: supply chain, lead time, normalization, modular
Procedia PDF Downloads 125
16004 Distributed Generation Connection to the Network: Obtaining Stability Using Transient Behavior
Authors: A. Hadadi, M. Abdollahi, A. Dustmohammadi
Abstract:
The growing use of DGs in distribution networks provides many advantages but also causes new problems, which should be anticipated and solved with appropriate solutions. One of these problems is transient voltage drop and short circuit in the electrical network in the presence of distributed generation, which can lead to instability. A short circuit can cause loss of generator synchronism; however, if the generator is able to recover synchronism after the faulty element is removed, the system remains stable. In order to increase system reliability and generator lifetime, strategies should be planned that apply even in situations where a fault would otherwise force generators to separate. In this paper, a fault current limiter is installed to prevent DG separation from the grid when a fault occurs. Furthermore, an innovative objective function is applied to determine the optimal impedance of the fault current limiter in order to improve the transient stability of the distributed generation. The fault current limiter can prevent sudden acceleration of the generator rotor after fault occurrence and thereby improve network transient stability by reducing the current flow in a fast and effective manner. In fact, by inserting the impedance created by the fault current limiter into the current path from the DG to the fault location when a short circuit happens, the critical fault clearing time improves remarkably. Therefore, the protective relay has more time to clear the fault and isolate the fault zone without any instability. Finally, different transient scenarios for the stable connection of small-scale synchronous generators to the distribution network are presented.
Keywords: critical clearing time, fault current limiter, synchronous generator, transient stability, transient states
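The rotor-acceleration argument above follows the classical swing equation: during a fault the electrical power transfer drops, so the mechanical input accelerates the rotor, and a fault current limiter that preserves some power transfer during the fault slows that acceleration. The sketch below integrates the swing equation with illustrative per-unit values; it is a textbook-style illustration of the mechanism, not the paper's optimization study.

```python
import math

# Swing-equation sketch: rotor angle excursion during a fault, with and
# without a fault current limiter (FCL). p_e_fault is the electrical
# power transfer that survives during the fault. Values are illustrative.
def rotor_angle_after_fault(p_e_fault, t_fault=0.2, dt=1e-4):
    H, f0 = 5.0, 50.0                   # inertia constant (s), frequency
    M = 2.0 * H / (2.0 * math.pi * f0)  # per-unit inertia coefficient
    p_m, p_max = 0.8, 1.8               # mechanical input, pre-fault Pmax
    delta = math.asin(p_m / p_max)      # pre-fault operating angle (rad)
    omega = 0.0
    for _ in range(int(t_fault / dt)):  # integrate d2(delta)/dt2=(Pm-Pe)/M
        acc = (p_m - p_e_fault * math.sin(delta)) / M
        omega += acc * dt
        delta += omega * dt
    return delta

no_fcl = rotor_angle_after_fault(p_e_fault=0.0)    # bolted fault, no FCL
with_fcl = rotor_angle_after_fault(p_e_fault=0.9)  # FCL keeps some transfer
# with_fcl < no_fcl: smaller angle excursion, hence a longer critical
# clearing time for the same fault duration.
```
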
Procedia PDF Downloads 197
16003 Health Care Using Queuing Theory
Authors: S. Vadivukkarasi, K. Karthi, M. Karthick, C. Dinesh, S. Santhosh, A. Yogaraj
Abstract:
Appointment systems were designed to minimize idle time, while overlooking patients' waiting time in hospitals. This is no longer valid in today's consumer-oriented society. Long waiting times for treatment in the outpatient department, followed by short consultations, have long been a complaint. Nowadays, customers use waiting time as a decisive factor in choosing a service provider. Queuing theory constitutes a very powerful tool because queuing models require relatively little data and are simple and fast to use. Because of this simplicity and speed, modelers can quickly evaluate and compare various alternatives for providing service. The application of queuing models in the analysis of health care systems is increasingly accepted by health care decision makers. Timely access to care is a key component of high-quality health care. However, patient delays are prevalent throughout health care systems, resulting in dissatisfaction and adverse clinical consequences for patients, as well as potentially higher costs and wasted capacity for providers. Arguably, the most critical delays in health care are the ones associated with emergencies. The allocation of resources can be divided into three general areas: bed management, staff management, and room facility management. Effective and efficient patient flow is indicated by high patient throughput, low patient waiting times, a short length of stay at the hospital, and low overtime, while simultaneously maintaining adequate staff utilization rates and low patient idle times.
Keywords: appointment system, patient scheduling, bed management, queueing calculation, system analysis
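The "little data, fast to use" claim above is easy to make concrete: the simplest queuing model, M/M/1, needs only an arrival rate and a service rate to predict waiting times, which is exactly what makes what-if comparisons quick. The clinic numbers below are made up for illustration.

```python
# Steady-state M/M/1 metrics from standard queuing formulas:
# utilization rho = lambda/mu, mean queue length Lq = rho^2/(1-rho),
# and mean wait Wq = Lq/lambda (Little's law). Rates are per hour.
def mm1_metrics(arrival_rate, service_rate):
    rho = arrival_rate / service_rate          # server utilization
    assert rho < 1.0, "queue is unstable"
    l_q = rho ** 2 / (1.0 - rho)               # mean number waiting
    w_q = l_q / arrival_rate                   # mean wait (hours)
    return {"utilization": rho, "Lq": l_q, "Wq_hours": w_q}

before = mm1_metrics(arrival_rate=4.0, service_rate=5.0)  # current clinic
after = mm1_metrics(arrival_rate=4.0, service_rate=6.0)   # faster service
# after["Wq_hours"] < before["Wq_hours"]: a quick what-if comparison of
# the kind decision makers can run before committing resources.
```
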
Procedia PDF Downloads 300
16002 Impact of Xanthan Gum on the Rheological Properties of Ceramic Slip
Authors: Souad Hassene Daouadji, Larbi Hammadi, Abdelkrim Hazzab
Abstract:
Slips intended for the manufacture of ceramics must have well-defined rheological properties in order to provide the qualities required for the casting step: good fluidity, so that the slip feeds the molds easily while settling regularly, and, for the dehydration phase of the paste in the mold, a relatively short setting time that gives sufficient consolidation for easy and fast demolding. Many additives have been added to ceramic slips in order to improve their rheological properties. In this study, we investigated the impact of xanthan gum on the rheological properties of a ceramic slip. The modified Cross model is used to fit the steady-state flow curves of the ceramic slip at different concentrations of added xanthan gum. The thixotropic behavior of the ceramic slip-xanthan gum mixture at constant temperature is analyzed using a structural kinetic model (SKM) in order to account for time-dependent effects.
Keywords: ceramic slip, xanthan gum, modified Cross model, thixotropy, viscosity
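The Cross-type fit mentioned above has a simple closed form: apparent viscosity interpolates between a zero-shear plateau and an infinite-shear plateau as shear rate grows. The sketch below evaluates it with illustrative parameter values, not the fitted constants of the study.

```python
# Cross-model viscosity: eta(g) = eta_inf + (eta_0 - eta_inf) /
# (1 + (lam * g)**m), where g is the shear rate, eta_0 and eta_inf are
# the zero- and infinite-shear viscosities, lam a time constant, and m
# the rate index. Parameter values below are illustrative only.
def cross_viscosity(shear_rate, eta_0, eta_inf, lam, m):
    return eta_inf + (eta_0 - eta_inf) / (1.0 + (lam * shear_rate) ** m)

# Shear-thinning example: viscosity falls from near eta_0 toward eta_inf
low = cross_viscosity(0.01, eta_0=5.0, eta_inf=0.05, lam=2.0, m=0.8)
high = cross_viscosity(500.0, eta_0=5.0, eta_inf=0.05, lam=2.0, m=0.8)
```

Fitting these four parameters to flow curves at each gum concentration is what lets the effect of the additive be expressed in a handful of numbers.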
Procedia PDF Downloads 191
16001 Attention-Based Adaptive Convolution with Progressive Learning in Speech Enhancement
Authors: Tian Lan, Yixiang Wang, Wenxin Tai, Yilan Lyu, Zufeng Wu
Abstract:
The monaural speech enhancement task in the time-frequency domain has a myriad of approaches, with stacked convolutional neural networks (CNNs) demonstrating superior ability in feature extraction and selection. However, using stacked single convolutions limits feature representation capability and generalization ability. In order to solve the aforementioned problem, we propose an attention-based adaptive convolutional network that integrates multi-scale convolutional operations into an operation-specific block via input-dependent attention, to adapt to complex auditory scenes. In addition, we introduce a two-stage progressive learning method to enlarge the receptive field without a dramatic increase in computation burden. We conduct a series of experiments based on the TIMIT corpus, and the experimental results show that our proposed model is better than the state-of-the-art models on all metrics.
Keywords: speech enhancement, adaptive convolution, progressive learning, time-frequency domain
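The core mechanism, convolutions at several kernel scales mixed by input-dependent attention weights, can be shown in a toy 1-D form. This NumPy sketch is an illustration of the idea only: the kernel sizes, the per-branch statistic used as an attention score, and the random "frame" are all assumptions, not the paper's network.

```python
import numpy as np

# Toy 1-D adaptive convolution: run several kernel scales in parallel,
# score each branch from the input-dependent output statistics, and mix
# the branches with softmax attention weights.
rng = np.random.default_rng(1)
x = rng.normal(size=128)                        # stand-in for a T-F frame
kernels = [rng.normal(size=k) / k for k in (3, 5, 9)]   # multi-scale

branches = [np.convolve(x, k, mode="same") for k in kernels]
scores = np.array([b.std() for b in branches])  # input-dependent scores
weights = np.exp(scores) / np.exp(scores).sum() # softmax attention
y = sum(w * b for w, b in zip(weights, branches))
# y: one output that adapts its effective receptive field to the input
```

In the real network the scores come from learned layers and the operation sits inside a trained block, but the weighting arithmetic is the same.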
Procedia PDF Downloads 123
16000 New Method for the Determination of Montelukast in Human Plasma by Solid Phase Extraction Using Liquid Chromatography Tandem Mass Spectrometry
Authors: Vijayalakshmi Marella, Nageswara Rao Pilli
Abstract:
This paper describes a simple, rapid and sensitive liquid chromatography/tandem mass spectrometry assay for the determination of montelukast in human plasma, using montelukast-d6 as an internal standard. The analyte and the internal standard were extracted from 50 µL of human plasma via a solid phase extraction technique without evaporation, drying or reconstitution steps. Chromatographic separation was achieved on a C18 column using a mixture of methanol and 5 mM ammonium acetate (80:20, v/v) as the mobile phase at a flow rate of 0.8 mL/min. Good linearity results were obtained during the entire course of validation. Method validation was performed as per FDA guidelines, and the results met the acceptance criteria. A run time of 2.5 min for each sample made it possible to analyze a larger number of samples in a short time, thus increasing productivity. The proposed method was found to be applicable to clinical studies.
Keywords: montelukast, tandem mass spectrometry, montelukast-d6, FDA guidelines
Procedia PDF Downloads 315
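The linearity claim in an internal-standard assay like the one above rests on a simple calculation: the analyte/IS peak-area ratio is regressed against nominal concentration, and unknowns are read back off the line. The sketch below shows that arithmetic with simulated, idealized areas; the concentration range and response factor are assumptions, not the study's validation data.

```python
import numpy as np

# Internal-standard calibration sketch: fit peak-area ratio vs nominal
# concentration, then back-calculate an unknown from its ratio.
conc = np.array([1.0, 5.0, 10.0, 50.0, 100.0])   # ng/mL, nominal levels
ratio = 0.02 * conc + 0.001                      # idealized analyte/IS ratio
slope, intercept = np.polyfit(conc, ratio, 1)    # first-order calibration

def back_calculate(peak_ratio):
    """Concentration of an unknown from its analyte/IS area ratio."""
    return (peak_ratio - intercept) / slope

unknown = back_calculate(0.02 * 25.0 + 0.001)    # a mid-range QC sample
```

Using the ratio to the co-eluting deuterated standard, rather than the raw analyte area, is what cancels extraction and ionization variability between samples.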