Search results for: test generate
8937 Numerical Solving Method for Specific Dynamic Performance of Unstable Flight Dynamics with PD Attitude Control
Authors: M. W. Sun, Y. Zhang, L. M. Zhang, Z. H. Wang, Z. Q. Chen
Abstract:
In the realm of flight control, Proportional-Derivative (PD) control is still widely used in practice for attitude control, particularly pitch control, so the closed-loop attitude dynamics under a PD controller deserve detailed investigation. Drawing on empirical knowledge of unstable flight dynamics, the combinations of control parameters that generate a single or a finite number of closed-loop oscillations, a smooth response preferred by practitioners, are presented in analytical or numerical form. To analyze the effects of these parameter combinations, the roots of several polynomials are sought to obtain feasible solutions. Using multiple interval operations, the conditions can also be plotted in a 2-D plane, which makes them more explicit. Finally, numerical examples are used to validate the proposed methods and some comparisons are also performed.
Keywords: attitude control, dynamic performance, numerical solving method, interval, unstable flight dynamics
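As a point of reference only (the model below is an assumed simplified pitch dynamic, not the one analysed in the paper), a PD loop around an unstable second-order plant shows how combinations of the proportional and derivative gains decide whether the closed-loop response oscillates, which is the kind of condition the paper maps in the 2-D parameter plane:

```latex
% Illustrative only: an assumed unstable pitch model G(s) = 1/(s^2 - a), a > 0,
% under PD control C(s) = K_p + K_d s (not the paper's exact plant).
\[
  1 + C(s)\,G(s) = 0
  \quad\Longrightarrow\quad
  s^{2} + K_{d}\,s + (K_{p} - a) = 0 .
\]
% The closed-loop poles are oscillatory iff the discriminant is negative:
\[
  K_{d}^{2} - 4\,(K_{p} - a) < 0 ,
\]
% so the (K_p, K_d) combinations separating oscillatory from monotone responses
% can be drawn as regions in the 2-D parameter plane.
```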
Procedia PDF Downloads 584
8936 Design of a Sliding Mode Control Using Nonlinear Sliding Surface and Nonlinear Observer Applied to the Trirotor Mini-Aircraft
Authors: Samir Zeghlache, Abderrahmen Bouguerra, Kamel Kara, Djamel Saigaa
Abstract:
The control of the trirotor helicopter involves nonlinearities, uncertainties and external perturbations that should be considered in the design of control laws. This paper presents a control strategy for an underactuated six degrees of freedom (6 DOF) trirotor helicopter, based on the coupling of fuzzy logic control and sliding mode control (SMC). The main purpose of this work is to eliminate the chattering phenomenon. To achieve this, a fuzzy logic controller is used to generate the hitting control signal, and a nonlinear observer is synthesized to estimate the unmeasured states. Finally, simulation results are included to show that the trirotor UAV with the proposed controller can greatly alleviate the chattering effect and remain robust to external disturbances.
Keywords: fuzzy sliding mode control, trirotor helicopter, dynamic modelling, underactuated systems
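For orientation, the textbook sliding mode structure below shows where the chattering originates and which term the fuzzy logic block replaces; it is a generic sketch, not the exact control law derived in the paper.

```latex
% Standard sliding mode control structure (illustrative, not the paper's law).
% Sliding surface for a tracking error e:
\[
  s = \dot{e} + \lambda e , \qquad \lambda > 0 .
\]
% Control input = equivalent control + discontinuous hitting term:
\[
  u = u_{\mathrm{eq}} - K\,\operatorname{sgn}(s), \qquad K > 0 .
\]
% The sgn(s) switching is what produces chattering; replacing it with the output
% of a fuzzy inference system that varies smoothly with s attenuates the effect.
```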
Procedia PDF Downloads 538
8935 Predicting Loss of Containment in Surface Pipeline using Computational Fluid Dynamics and Supervised Machine Learning Model to Improve Process Safety in Oil and Gas Operations
Authors: Muhammmad Riandhy Anindika Yudhy, Harry Patria, Ramadhani Santoso
Abstract:
Loss of containment is the primary hazard that process safety management is concerned with in the oil and gas industry. Escalation to more serious consequences all begins with loss of containment, starting with oil and gas release from leakage or spillage from primary containment, resulting in pool fire, jet fire and even explosion when reacting with various ignition sources in the operations. Therefore, the heart of process safety management is avoiding loss of containment and mitigating its impact through the implementation of safeguards. The most effective safeguard for this case is an early detection system that alerts Operations to take action prior to a potential loss of containment. The value of a detection system increases when applied to a long surface pipeline that is naturally difficult to monitor at all times and is exposed to multiple causes of loss of containment, from natural corrosion to illegal tapping. Based on prior research and studies, detecting loss of containment accurately in a surface pipeline is difficult. The trade-off between cost-effectiveness and high accuracy has been the main issue when selecting a traditional detection method. The current best-performing method, the Real-Time Transient Model (RTTM), requires analysis of closely positioned pressure, flow and temperature (PVT) points in the pipeline to be accurate. Having multiple adjacent PVT sensors along the pipeline is expensive, hence generally not a viable alternative from an economic standpoint. A conceptual approach that combines mathematical modeling using computational fluid dynamics with a supervised machine learning model has shown promising results for predicting leakage in the pipeline. Mathematical modeling is used to generate simulation data, which is then used to train the leak detection and localization models. Mathematical models and simulation software have also been shown to provide results comparable to experimental data with very high levels of accuracy. While a supervised machine learning model requires a large training dataset for the development of accurate models, mathematical modeling has been shown to be able to generate the required datasets to justify the application of data analytics for the development of model-based leak detection systems for petroleum pipelines. This paper presents a review of key leak detection strategies for oil and gas pipelines, with a specific focus on crude oil applications, and presents the opportunities for the use of data analytics tools and mathematical modeling for the development of a robust real-time leak detection and localization system for surface pipelines. A case study is also presented.
Keywords: pipeline, leakage, detection, AI
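A minimal sketch of the simulation-to-classifier workflow described above is given below; the feature names, the synthetic data, and the choice of a random forest are illustrative assumptions, since the abstract does not name a specific learner.

```python
# Minimal sketch: train a leak / no-leak classifier on simulated features.
# Feature names, synthetic data and the random forest are illustrative assumptions.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

rng = np.random.default_rng(0)

# Stand-in for CFD-generated records: pressure drop, flow-rate deviation and
# temperature deviation at a monitoring point, labelled leak / no-leak.
n = 2000
X_no_leak = rng.normal([0.0, 0.0, 0.0], [1.0, 1.0, 1.0], size=(n, 3))
X_leak = rng.normal([2.5, -1.5, 0.5], [1.2, 1.0, 1.0], size=(n, 3))
X = np.vstack([X_no_leak, X_leak])
y = np.array([0] * n + [1] * n)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y
)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_train, y_train)
print(classification_report(y_test, clf.predict(X_test)))
```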
Procedia PDF Downloads 197
8934 Faulty Sensors Detection in Planar Array Antenna Using Pelican Optimization Algorithm
Authors: Shafqat Ullah Khan, Ammar Nasir
Abstract:
Using a planar antenna array (PAA) in radar, broadcasting, satellite antennas, and sonar for the detection of targets helps provide instant beam pattern control. High flexibility and adaptability are achieved through multiple beam steering with a planar array and are particularly needed in real-life scenarios where several high-directivity beams are required. Faulty sensors in planar arrays generate asymmetry, which leads to service degradation, radiation pattern distortion, and increased sidelobe levels. The Pelican Optimization Algorithm (POA), a nature-inspired optimization algorithm, accurately determines faulty sensors within an array, enhancing the reliability and performance of planar array antennas, as demonstrated through extensive simulations and experiments. The analysis was carried out for different types of faults in 7 x 7 and 8 x 8 planar arrays in MATLAB.
Keywords: planar antenna array, Pelican optimization algorithm, faulty sensor, antenna arrays
Procedia PDF Downloads 90
8933 The Utilization of Big Data in Knowledge Management Creation
Authors: Daniel Brian Thompson, Subarmaniam Kannan
Abstract:
The huge weight of knowledge in this world and within the repositories of organizations has already reached immense capacity and is constantly increasing as time goes by. To accommodate these constraints, Big Data implementations and algorithms are utilized to obtain new or enhanced knowledge for decision-making. The transition from data to knowledge provides the transformational changes that deliver tangible benefits to the individuals implementing these practices. Today, various organizations derive knowledge from observations and intuitions, and this information or data is translated into best practices for knowledge acquisition, generation and sharing. Through the widespread usage of Big Data, the main intention is to provide information that has been cleaned and analyzed to nurture tangible insights for an organization to apply to its knowledge-creation practices based on facts and figures. The translation of data into knowledge generates value for an organization when making decisive decisions to proceed with the transition to best practices. Without a strong foundation of knowledge and Big Data, businesses are not able to grow and be enhanced within the competitive environment.
Keywords: big data, knowledge management, data driven, knowledge creation
Procedia PDF Downloads 120
8932 Relativistic Energy Analysis for Some q Deformed Shape Invariant Potentials in D Dimensions Using SUSYQM Approach
Authors: A. Suparmi, C. Cari, M. Yunianto, B. N. Pratiwi
Abstract:
D-dimensional Dirac equations of q-deformed shape invariant potentials were solved using supersymmetric quantum mechanics (SUSY QM) in the case of exact spin symmetry. The D-dimensional radial Dirac equation for a shape invariant potential reduces to a one-dimensional Schrodinger-type equation through an appropriate change of variable and parameters. The relativistic energy spectra were analyzed using SUSY QM and the shape invariance properties of the radial D-dimensional Dirac equation reduced to the one-dimensional Schrodinger-type equation. The SUSY operator was used to generate the D-dimensional relativistic radial wave functions, and the relativistic energy equation reduces to the non-relativistic energy in the non-relativistic limit.
Keywords: D-dimensional Dirac equation, non-central potential, SUSY QM, radial wave function
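For readers unfamiliar with the machinery, the standard SUSY QM and shape invariance relations (in units ħ = 2m = 1) are sketched below in generic notation; the paper applies this structure to the reduced radial equation.

```latex
% Generic SUSY QM relations (illustrative notation, units \hbar = 2m = 1).
% Partner potentials built from a superpotential W(x):
\[
  V_{\mp}(x) = W^{2}(x) \mp W'(x), \qquad
  H_{\mp} = -\frac{d^{2}}{dx^{2}} + V_{\mp}(x).
\]
% Shape invariance: the partner has the same form with shifted parameters,
\[
  V_{+}(x; a_{0}) = V_{-}(x; a_{1}) + R(a_{1}),
\]
% which yields the bound-state spectrum of H_- algebraically:
\[
  E_{0}^{(-)} = 0, \qquad E_{n}^{(-)} = \sum_{k=1}^{n} R(a_{k}).
\]
```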
Procedia PDF Downloads 346
8931 Effect of Highly Pressurized Dispersion Arc Nozzle on Breakup of Oil Leakage in Offshore
Authors: N. M. M. Ammar, S. M. Mustaqim, N. M. Nadzir
Abstract:
The most important problem that occurs with oil spills in sea water is reducing the size of the oil spill. This study deals with the development of a highly pressurized nozzle using a dispersion method for oil leakage offshore. 3D numerical simulation results were obtained using the ANSYS Fluent 13.0 code and correlated with experimental data for validation. This paper studies the contribution of the process to the flow speed and pressure of the flow from two different geometrical designs of nozzles, in order to generate a spray pattern suitable for dispersant application. The size distribution of droplets generated by the nozzle is calculated using pressures ranging from 2 to 6 bar. Results obtained from both analyses show a significant spray pattern and flow distribution as well as distance. Results also show a significant contribution of the effect of oil leakage in terms of the diameter of the oil spill break-up.
Keywords: arc nozzle, CFD simulation, droplets, oil spills
Procedia PDF Downloads 422
8930 External Sector and Its Impact on Economic Growth of Pakistan (1990-2010)
Authors: Rizwan Fazal
Abstract:
This study investigates the behavior of the external sector of the Pakistan economy and its impact on economic growth, using quarterly data for the period 1990:01-2010:04. The external sector indices used in this study are financial integration, net foreign assets and trade integration. The Augmented Dickey-Fuller test confirms that all external sector variables are non-stationary in levels but become stationary at first difference. The co-integration test suggests one co-integrating relationship among the variables in the study. The analysis is based on a Vector Auto Regression model followed by a Vector Error Correction Model. The empirical findings show that financial integration plays an important role in increasing economic growth in the Pakistan economy, while trade integration has a negative effect on the economic growth of Pakistan in the long run. However, the short-run results confirm that the output lag accounts for error correction. The estimated CUSUM and CUSUMSQ stability tests indicate that the estimated equation remains stable over the study period.
Keywords: financial integration, trade integration, net foreign assets, gross domestic product
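A minimal sketch of the unit-root step described above is shown below, using a synthetic random-walk series in place of the actual quarterly data (which are not reproduced here).

```python
# Sketch of the Augmented Dickey-Fuller check on synthetic data standing in
# for a quarterly series; the real series are not reproduced here.
import numpy as np
from statsmodels.tsa.stattools import adfuller

rng = np.random.default_rng(1)
level_series = np.cumsum(rng.normal(size=84))   # random walk ~ non-stationary
diff_series = np.diff(level_series)             # first difference ~ stationary

for name, series in [("level", level_series), ("first difference", diff_series)]:
    stat, pvalue, *_ = adfuller(series)
    print(f"ADF on {name}: statistic={stat:.3f}, p-value={pvalue:.3f}")
```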
Procedia PDF Downloads 275
8929 Dry Reforming of Methane Using Metal Supported and Core Shell Based Catalyst
Authors: Vinu Viswanath, Lawrence Dsouza, Ugo Ravon
Abstract:
Syngas, typically an intermediary gas product, has a wide range of applications in producing various chemical products, such as mixed alcohols, hydrogen, ammonia, Fischer-Tropsch products, methanol, ethanol, aldehydes, etc. There are several technologies available for syngas production. As an alternative to the conventional processes, an attractive route utilizing carbon dioxide and methane in an equimolar ratio to generate syngas with a ratio close to one has been developed, also termed Dry Reforming of Methane (DRM) technology. It also offers the opportunity to utilize the greenhouse gases CO2 and CH4. The dry reforming process is highly endothermic; indeed, ΔG becomes negative only if the temperature is higher than 900 K and, practically, the reaction occurs at 1000-1100 K. At these temperatures, sintering of the metal particles takes place and deactivates the catalyst. Moreover, the methane is only partially oxidized, and some coke deposition occurs, causing catalyst deactivation. The current research work focused on mitigating the main challenges of the dry reforming process, such as coke deposition and metal sintering at high temperature. To achieve these objectives, we employed three different strategies of catalyst development: 1) use of bulk catalysts such as olivine and pyrochlore type materials; 2) use of metal-doped support materials, like spinel and clay type materials; 3) use of a core-shell model catalyst. In this approach, a thin layer (shell) of redox metal oxide is deposited over the MgAl2O4/Al2O3 based support material (core). For the core-shell approach, an active metal is deposited on the surface of the shell. The shell structure formed is a doped metal oxide that can undergo reduction and oxidation reactions (redox), and the core is an alkaline earth aluminate having a high affinity towards carbon dioxide. In the case of the metal-doped support catalyst, the enhanced redox properties of the doped CeO2 oxide and the CO2 affinity of the alkaline earth aluminates collectively help to overcome coke formation. For all three strategies, a systematic screening of the metals was carried out to optimize the efficiency of the catalyst. To evaluate their performance, activity and stability tests were carried out under reaction conditions at temperatures ranging from 650 to 850 °C and operating pressures ranging from 1 to 20 bar. The results indicate that the core-shell model catalyst showed high activity and better stability under atmospheric as well as high-pressure conditions. In this presentation, we will show the results related to these strategies.
Keywords: carbon dioxide, dry reforming, supports, core shell catalyst
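For context only (standard thermochemistry, not figures taken from the abstract), the dry reforming reaction and its endothermicity are:

```latex
% Dry reforming of methane (standard thermochemistry, quoted for context only):
\[
  \mathrm{CH_{4} + CO_{2} \;\rightleftharpoons\; 2\,CO + 2\,H_{2}},
  \qquad \Delta H^{\circ}_{298} \approx +247~\mathrm{kJ\,mol^{-1}} .
\]
% The reaction yields syngas with an H2/CO ratio close to one, and its strong
% endothermicity is why \Delta G only becomes negative above roughly 900 K.
```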
Procedia PDF Downloads 184
8928 Effects of Modified Low-Dye Taping on First Ray Mobility Test and Sprint Time
Authors: Yu-Ju Tsai, Ching-Chun Wang, Wen-Tzu Tang, Huei-Ming Chai
Abstract:
A pronated foot is frequently associated with a hypermobile first ray, which can then develop into further severe foot problems. Low-Dye taping with athletic tape has been widely used to restrict excessive first ray motion and rebuild the height of the medial longitudinal arch in the general population with pronated foot. This is not the case, however, for sprinters, since they feel too much restriction of foot motion. Currently, kinesio tape, which is more elastic than athletic tape, has been widely used to re-adjust joint positions. It is therefore of interest whether modified low-Dye taping using kinesio tape is beneficial for altering first ray mobility while still giving enough arch support. The purpose of this study was to investigate the effect of modified low-Dye taping on the first ray mobility test and 60-m sprint time for sprinters with pronated foot. The significance of this study is that it provides new insight into modified low-Dye taping as a treatment alternative for sprinters with pronated foot. Ten young male sprinters, aged 20.8±1.6 years, with pronated foot were recruited for this study. Pronated foot was defined as a foot with a navicular drop test greater than 1.0 cm. Three optic shutters were placed at the start, 30-m, and 60-m sites to record sprint time. All participants were asked to complete 3 trials of the 60-m dash under both taping and non-taping conditions in a random order. The low-Dye taping was applied using the method postulated by Ralph Dye in 1939, except that kinesio tape was used instead. All outcome variables were recorded for the taping and non-taping conditions. Paired t-tests were used to analyze all outcome variables between the 2 conditions. Although there were no statistically significant differences in dorsal and plantar mobility between the taping and non-taping conditions, a statistically significant difference was found in the total range of motion (dorsiflexion plus plantarflexion angle) of the first ray when modified low-Dye taping was applied (p < 0.05). Time to complete the 60-m sprint was significantly increased with low-Dye taping (p < 0.05), while no significant difference was found for the time to 30 m, indicating that modified low-Dye taping changed the maximum sprint speed of the 60-m dash. Conclusively, modified low-Dye taping was capable of increasing first ray mobility and further altered maximum sprint speed.
Keywords: first ray mobility, kinesio taping, pronated foot, sprint time
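The paired comparison described above can be sketched as follows; the sprint times are made-up numbers for ten sprinters, not the study's measurements.

```python
# Sketch of a paired t-test (taping vs. no taping) on hypothetical 60-m times.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
sprint_no_tape = rng.normal(7.60, 0.20, size=10)           # hypothetical 60-m times (s)
sprint_tape = sprint_no_tape + rng.normal(0.08, 0.05, 10)   # slightly slower with taping

t_stat, p_value = stats.ttest_rel(sprint_tape, sprint_no_tape)
print(f"paired t = {t_stat:.2f}, p = {p_value:.3f}")
```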
Procedia PDF Downloads 278
8927 Embedding Knowledge Management in Business Process
Authors: Paul Ihuoma Oluikpe
Abstract:
The purpose of this paper is to explore and highlight the process of creating value for strategy management by embedding knowledge management in the business process. Knowledge management can be seen from a three-dimensional perspective of content, connections and competencies. These dimensions can be embedded in the knowledge processes (create, capture, share, and apply) and operationalized within a business process to effectively create a scenario where knowledge is focused on enabling a process and the process in turn generates outcomes. The application of knowledge management to the business processes of organizations is rare and underreported. Few studies have explored this paradigm, although research has tended to reinforce the notion that competitive advantage sits within the internal aspects of the firm. Given this notion, it is surprising that knowledge management research and practice have not focused sufficiently on the business process, which is the basic unit of organizational decision implementation. This research serves to generate understanding of applying KM in business processes using the case of a large multinational in Sub-Saharan Africa.
Keywords: knowledge management, business process, strategy, multinational
Procedia PDF Downloads 696
8926 Contextual Toxicity Detection with Data Augmentation
Authors: Julia Ive, Lucia Specia
Abstract:
Understanding and detecting toxicity is an important problem to support safer human interactions online. Our work focuses on the important problem of contextual toxicity detection, where automated classifiers are tasked with determining whether a short textual segment (usually a sentence) is toxic within its conversational context. We use "toxicity" as an umbrella term to denote a number of variants commonly named in the literature, including hate, abuse, and offence, among others. Detecting toxicity in context is a non-trivial problem and has been addressed by very few previous studies. These previous studies have analysed the influence of conversational context on human perception of toxicity in controlled experiments and concluded that humans rarely change their judgements in the presence of context. They have also evaluated contextual detection models based on state-of-the-art Deep Learning and Natural Language Processing (NLP) techniques. Counterintuitively, they reached the general conclusion that computational models tend to suffer performance degradation in the presence of context. We challenge these empirical observations by devising better contextual predictive models that also rely on NLP data augmentation techniques to create larger and better data. In our study, we start by further analysing the human perception of toxicity in conversational data (i.e., tweets), in the absence versus presence of context, in this case, previous tweets in the same conversational thread. We observed that the conclusions of previous work on human perception are mainly due to data issues: the contextual data available does not provide sufficient evidence that context is indeed important (even for humans). The data problem is common in current toxicity datasets: cases labelled as toxic are either obviously toxic (i.e., overt toxicity with swear words, racist terms, etc.), and thus context is not needed for a decision, or are ambiguous, vague or unclear even in the presence of context; in addition, the data contains labelling inconsistencies. To address this problem, we propose to automatically generate contextual samples where toxicity is not obvious (i.e., covert cases) without context or where different contexts can lead to different toxicity judgements for the same tweet. We generate toxic and non-toxic utterances conditioned on the context or on target tweets using a range of techniques for controlled text generation (e.g., Generative Adversarial Networks and steering techniques). Regarding the contextual detection models, we posit that their poor performance is due to limitations in both the data they are trained on (the same problems stated above) and the architectures they use, which are not able to leverage context in effective ways. To improve on that, we propose text classification architectures that take the hierarchy of conversational utterances into account. In experiments benchmarking our approach against previous models on existing and automatically generated data, we show that both data and architectural choices are very important. Our model achieves substantial performance improvements compared to baselines that are non-contextual or contextual but agnostic of the conversation structure.
Keywords: contextual toxicity detection, data augmentation, hierarchical text classification models, natural language processing
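A minimal sketch of what "taking the hierarchy of conversational utterances into account" can look like is given below; the GRU encoders, layer sizes, and toy inputs are illustrative assumptions rather than the authors' architecture.

```python
# Minimal hierarchical text classifier: encode each utterance with a word-level
# GRU, then encode the sequence of utterance vectors (context + target) with a
# second GRU before classifying the target as toxic / non-toxic.
# All sizes and inputs are illustrative assumptions.
import torch
import torch.nn as nn

class HierarchicalToxicityClassifier(nn.Module):
    def __init__(self, vocab_size=10000, emb_dim=128, utt_dim=128, conv_dim=128):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, emb_dim, padding_idx=0)
        self.utterance_rnn = nn.GRU(emb_dim, utt_dim, batch_first=True)
        self.conversation_rnn = nn.GRU(utt_dim, conv_dim, batch_first=True)
        self.classifier = nn.Linear(conv_dim, 2)  # toxic / non-toxic logits

    def forward(self, token_ids):
        # token_ids: (batch, n_utterances, n_tokens); last utterance is the target.
        b, n_utt, n_tok = token_ids.shape
        emb = self.embedding(token_ids.view(b * n_utt, n_tok))
        _, utt_h = self.utterance_rnn(emb)              # (1, b*n_utt, utt_dim)
        utt_vecs = utt_h.squeeze(0).view(b, n_utt, -1)  # (b, n_utt, utt_dim)
        _, conv_h = self.conversation_rnn(utt_vecs)     # (1, b, conv_dim)
        return self.classifier(conv_h.squeeze(0))       # (b, 2)

# Toy forward pass: 4 conversations, 3 utterances each (2 context + 1 target).
model = HierarchicalToxicityClassifier()
fake_batch = torch.randint(1, 10000, (4, 3, 20))
print(model(fake_batch).shape)  # torch.Size([4, 2])
```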
Procedia PDF Downloads 176
8925 Influence of Bio-Based Admixture on Compressive Strength of Concrete for Columns
Authors: K. Raza, S. Gul, M. Ali
Abstract:
Concrete is a fundamental building material, extensively utilized by the construction industry. Problems related to the strength of concrete are an immense issue for the sustainability of concrete structures. Concrete mostly loses its strength due to the cracks produced in it by shrinkage or the hydration process. This study aims to enhance the strength and service life of concrete structures by incorporating a bio-based admixture in the concrete. With the injection of a bio-based admixture (BBA), the concrete will self-heal its cracks by producing calcium carbonate. Minimization of cracks will compact the microstructure of the concrete, due to which strength will increase. For this study, Bacillus subtilis will be used as the bio-based admixture (BBA) in concrete. Calcium lactate up to 1.5% will be used as the food source for the Bacillus subtilis in the concrete. Two formulations containing 0 and 5% of Bacillus subtilis by weight of cement will be used for the casting of concrete specimens. A direct mixing method will be adopted for the usage of the bio-based admixture in concrete. Compressive strength tests will be carried out after 28 days of curing. Scanning electron microscopy (SEM) and X-ray diffraction analysis (XRD) will be performed for the examination of the microstructure of the concrete. Results will be drawn by comparing the test results of the 0 and 5% formulations. It will be recommended to use the bio-based admixture (BBA) in concrete for columns given a satisfactory increase in the compressive strength of the concrete.
Keywords: bio-based admixture, Bacillus subtilis, calcium lactate, compressive strength
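A reaction often cited in the bacterial self-healing literature, given here for context only (the abstract itself does not state it), is the aerobic metabolic conversion of calcium lactate to calcite:

```latex
% Commonly cited overall reaction for bacterial conversion of calcium lactate
% to calcium carbonate (context only; not stated in the abstract):
\[
  \mathrm{Ca(C_{3}H_{5}O_{3})_{2} + 6\,O_{2}
  \;\longrightarrow\;
  CaCO_{3} + 5\,CO_{2} + 5\,H_{2}O}
\]
% The precipitated CaCO3 fills microcracks, which is the proposed mechanism for
% the strength improvement discussed above.
```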
Procedia PDF Downloads 231
8924 Plasmodium knowlesi Zoonotic Malaria: An Emerging Challenge of Health Problems in Thailand
Authors: Surachart Koyadun
Abstract:
Currently, Plasmodium knowlesi malaria has spread to almost all countries in Southeast Asia. This research aimed to 1) describe the epidemiology of Plasmodium knowlesi malaria, 2) examine the clinical symptoms of P. knowlesi malaria patients, 3) analyze the ecology, animal reservoirs and entomology of P. knowlesi malaria, and 4) summarize the diagnosis, blood parasites, and treatment of P. knowlesi malaria. The study design was a case report combined with retrospective descriptive survey research. A total of 34 study subjects were patients with a confirmed diagnosis of P. knowlesi malaria who received treatment at hospitals and vector-borne disease control units in Songkhla Province during 2021-2022. The results of the epidemiological study revealed that the majority of the subjects were male, had a history of staying overnight in the forest before becoming sick, acquired the infection in the forest, and became sick mostly during the summer. The average length of time from the onset of illness until receiving a blood test was 3.8 days. The average length of hospital stay was 4 days. Patients were treated with chloroquine phosphate, primaquine, artesunate, quinine, and dihydroartemisinin-piperaquine (40 mg DHA-320 mg PPQ). One death occurred among the 34 P. knowlesi malaria patients. All remaining patients recovered and responded to treatment, and all symptoms improved after drug administration. No treatment failures were found. Analyses of ecological, zoonotic and entomological data revealed an association between infected patients and forested areas hosting monkey reservoirs and mosquito vectors. The recommendation from this study is that the Polymerase Chain Reaction (PCR) method should be used in conjunction with the thick/thin blood film test and the blood parasite (parasitaemia) test to improve the specificity and accuracy of diagnosis, leading to timely treatment and effective disease control.
Keywords: human malaria, Plasmodium knowlesi, zoonotic disease, diagnosis and treatment, epidemiology, ecology
Procedia PDF Downloads 31
8923 Development of Student Invention Competences and Skills in Polytechnic University
Authors: D. S. Denchuk, O. M. Zamyatina, M. G. Minin, M. A. Soloviev, K. V. Bogrova
Abstract:
The article considers invention activity in Russia and worldwide, its modern state, and the impact of innovative engineering activity on the national economies of the considered countries. It also analyses the historical premises of modern engineering invention. The authors explore the development of engineering invention at an engineering university and the creation of a particular environment for the scientific and technical creativity of students, using the example of the Elite Engineering Education program at Tomsk Polytechnic University, Russia. It is revealed that for the successful development of engineering invention in a higher education institution it is necessary to apply a learning model that develops the creative potential of a student, which is, in its turn, inseparably connected with the ability to generate new ideas in engineering. Such an academic environment can become a basis for revealing students' creativity.
Keywords: engineering invention, scientific and technical creativity, students, project-based approach
Procedia PDF Downloads 392
8922 A Pilot Study on the Short Term Effects of Paslop Dance Exercise on Core Strength, Balance and Flexibility
Authors: Wilawan Kanhachon, Yodchai Boonprakob, Uraiwon Chatchawan, Junichiro Yamauchi
Abstract:
Introduction: Paslop is a traditional dance from Laos, which is popular in Laos and northeastern Thailand. A unique feature of Paslop dancing is controlling body movement in time with the song. While dancing to the beat, dancers should contract their abdominal and back muscles throughout. Paslop may therefore be a good alternative exercise for improving core strength, balance and flexibility. Objective: To investigate the effects of Paslop dance exercise on core strength, balance, and flexibility. Methods: Seven healthy participants (age, 20.57±1.13 yrs; height, 162.29±6.16 cm; body mass, 58.14±7.03 kg; mean±S.D.) volunteered to perform a 45-minute Paslop dance exercise three times a week for 8 weeks. Before, during and after the exercise period, core strength, balance and flexibility were measured with the pressure biofeedback unit (PBU), the one-leg stance test (OLST), and the sit and reach test (SAR), respectively. Result: The PBU score for core strength increased from 2.12 mmHg at baseline to 6.34 mmHg at the 4th week and 10.10 mmHg at the 8th week of Paslop dance training, while OLST and SAR did not change. Conclusion: The study demonstrates that an 8-week Paslop dance exercise programme can improve core strength.
Keywords: balance, core strength, flexibility, Paslop
Procedia PDF Downloads 383
8921 Comparison between Continuous Genetic Algorithms and Particle Swarm Optimization for Distribution Network Reconfiguration
Authors: Linh Nguyen Tung, Anh Truong Viet, Nghien Nguyen Ba, Chuong Trinh Trong
Abstract:
This paper proposes a reconfiguration methodology based on a continuous genetic algorithm (CGA) and particle swarm optimization (PSO) for minimizing active power loss and voltage deviation. Both algorithms are adapted using graph theory to generate feasible individuals, and a modified crossover is used for the continuous variables of the CGA. To demonstrate the performance and effectiveness of the proposed methods, a comparative analysis of the CGA and PSO for network reconfiguration on 33-node and 119-bus radial distribution systems is presented. The simulation results show that both the CGA and PSO can be used for distribution network reconfiguration, and that the CGA outperformed PSO with a significantly higher success rate in finding the optimal distribution network configuration.
Keywords: distribution network reconfiguration, particle swarm optimization, continuous genetic algorithm, power loss reduction, voltage deviation
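A bare-bones continuous PSO loop is sketched below to show the mechanics being compared; the quadratic objective is a stand-in for the real fitness (which would evaluate power loss and voltage deviation of a candidate configuration via a load-flow calculation), and all coefficients are illustrative.

```python
# Bare-bones particle swarm optimization loop (illustrative only).
import numpy as np

def fitness(x):
    return np.sum((x - 0.3) ** 2)   # placeholder for power-loss + deviation cost

rng = np.random.default_rng(3)
n_particles, n_dims, n_iters = 30, 5, 100
w, c1, c2 = 0.7, 1.5, 1.5            # inertia and acceleration coefficients

x = rng.uniform(-1, 1, (n_particles, n_dims))
v = np.zeros_like(x)
pbest, pbest_val = x.copy(), np.array([fitness(p) for p in x])
gbest = pbest[np.argmin(pbest_val)].copy()

for _ in range(n_iters):
    r1, r2 = rng.random(x.shape), rng.random(x.shape)
    v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
    x = x + v
    vals = np.array([fitness(p) for p in x])
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = x[improved], vals[improved]
    gbest = pbest[np.argmin(pbest_val)].copy()

print("best fitness:", fitness(gbest))
```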
Procedia PDF Downloads 194
8920 A Knowledge-As-A-Service Support Framework for Ambient Learning in Kenya
Authors: Lucy W. Mburu, Richard Karanja, Simon N. Mwendia
Abstract:
Over recent years, learners have experienced a constant need to access knowledge on demand, a need that is fully aligned with the paradigm of cloud computing. Motivated by the global sustainable development goal to ensure inclusive and equitable learning opportunities, this research has developed a framework hinged on the knowledge-as-a-service architecture that utilizes knowledge from ambient learning systems. Through statistical analysis and decision tree modeling, the study discovers the influential variables for ambient learning among university students. The main aim is to generate a platform for disseminating and exploiting the available knowledge to aid the learning process and, thus, to improve educational support on the ambient learning system. The research further explores how collaborative effort can be used to form a knowledge network that allows access to heterogeneous sources of knowledge, which benefits knowledge consumers, such as the developers of ambient learning systems.
Keywords: actionable knowledge, ambient learning, cloud computing, decision trees, knowledge as a service
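A minimal sketch of the decision-tree step is shown below; the survey variables and the synthetic data are purely illustrative assumptions, not the study's dataset.

```python
# Sketch: fit a decision tree and inspect feature importances to identify
# influential variables. Variable names and data are illustrative assumptions.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(4)
# Hypothetical survey features: smartphone access, daily study hours, connectivity score.
X = np.column_stack([
    rng.integers(0, 2, 300),   # owns a smartphone (0/1)
    rng.uniform(0, 6, 300),    # daily study hours
    rng.uniform(0, 1, 300),    # connectivity score
])
# Hypothetical label: uses the ambient learning system regularly.
y = ((X[:, 0] == 1) & (X[:, 2] > 0.4)).astype(int)

tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
for name, importance in zip(["smartphone", "study_hours", "connectivity"],
                            tree.feature_importances_):
    print(f"{name}: {importance:.2f}")
```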
Procedia PDF Downloads 164
8919 Investigation of Enterotoxigenic Staphylococcus aureus in Kitchen of Catering
Authors: Çiğdem Sezer, Aksem Aksoy, Leyla Vatansever
Abstract:
This study was carried out to evaluate public health risks by identifying enterotoxigenic Staphylococcus aureus in a catering kitchen. Samples were taken with swabs from equipment surfaces in the salad, meat and bakery sections of the catering kitchen and were investigated for Staphylococcus aureus using classical culture methods. A 10x10 cm area was defined on each surface (salad cutting and chopping surfaces, knives, meat grinder, meat chopping surface), and samples were taken from this area with sterile swabs with the help of FTS. In total, 50 samples were obtained. Under aseptic conditions, the surface of Baird-Parker agar (with egg yolk tellurite) was inoculated with the swabs. After 24-48 hours of incubation at 37°C, black colonies of 1-1.5 mm diameter surrounded by a zone indicating lecithinase activity were identified as S. aureus after applying Gram staining, catalase, coagulase, glucose and mannitol fermentation and thermonuclease tests. Genotypic characterization (Staphylococcus genus and S. aureus species-specific) of the isolates was performed by PCR. The ELISA test was applied to the isolates for the identification of staphylococcal enterotoxins (SET) A, B, C, D and E in the bacterial cultures. Measurements were taken at 450 nm in an ELISA reader using a Ridascreen-Total set ELISA test kit (r-biopharm R4105-Enterotoxin A, B, C, D, E). The results were calculated according to the manufacturer's instructions. A total of 97 S. aureus isolates were obtained from the 50 samples, of which 60 were confirmed by PCR analysis. According to the ELISA test, only 1 of the 60 isolates was found to be enterotoxigenic. The enterotoxigenic strain was identified from the salad chopping and cutting surface. The identification of S. aureus in the catering kitchen indicates a significant source of contamination, particularly during the preparation of salads that are consumed raw. These foods were identified as a potential source of food-borne poisoning and pose a significant risk to consumers.
Keywords: Staphylococcus aureus, enterotoxin, catering, kitchen, health
Procedia PDF Downloads 406
8918 Consistent Testing for an Implication of Supermodular Dominance with an Application to Verifying the Effect of Geographic Knowledge Spillover
Authors: Chung Danbi, Linton Oliver, Whang Yoon-Jae
Abstract:
Supermodularity, or complementarity, is a popular concept in economics which can characterize many objective functions such as utility, social welfare, and production functions. Further, supermodular dominance captures a preference for greater interdependence among the inputs of those functions, and it can be applied to examine which input set would produce higher expected utility, social welfare, or production. Therefore, we propose and justify a consistent test for a useful implication of supermodular dominance. We also conduct Monte Carlo simulations to explore the finite sample performance of our test, with critical values obtained from the recentered bootstrap method, with and without selective recentering, and from the subsampling method. Under various parameter settings, we confirmed that our test has reasonably good size and power performance. Finally, we apply our test to compare geographic and distant knowledge spillovers in terms of their effects on social welfare using the National Bureau of Economic Research (NBER) patent data. We expect localized citing to supermodularly dominate distant citing if the geographic knowledge spillover engenders greater social welfare than the distant knowledge spillover. Taking subgroups based on firm and patent characteristics, we found that there are industry-wise and patent-subclass-wise differences in the pattern of supermodular dominance between localized and distant citing. We also compare the results from analyzing different time periods to see if the development of Internet and communication technology has changed the pattern of the dominance. In addition, to appropriately deal with the sparse nature of the data, we apply high-dimensional methods to efficiently select relevant data.
Keywords: supermodularity, supermodular dominance, stochastic dominance, Monte Carlo simulation, bootstrap, subsampling
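For reference, the defining inequality of supermodularity and one common way of formalising supermodular dominance between two distributions F and G are given below; the abstract does not spell out its exact definition, so the notation is generic.

```latex
% Supermodularity of f on a lattice (with componentwise max / min):
\[
  f(x \vee y) + f(x \wedge y) \;\ge\; f(x) + f(y)
  \qquad \text{for all } x, y .
\]
% One common formalisation of supermodular dominance of F over G:
\[
  F \succeq_{\mathrm{SPM}} G
  \iff
  \int f \, dF \;\ge\; \int f \, dG
  \quad \text{for every supermodular } f .
\]
```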
Procedia PDF Downloads 135
8917 Experimental Investigation of the Effect of Material Composition on Landslides
Authors: Mengqi Wu, Haiping Zhu, Chin J. Leo
Abstract:
In this study, six experimental cases with different components (dry and wet soils and rocks) were considered to elucidate the influence of material composition on landslide profiles. The results show that the accumulation zone for all cases considered has a quadrilateral shape with two different bottom angles. The asymmetry of the accumulation zone can be attributed to the fact that soils in different parts of the landslide slide at different speeds and experience different resistances. Higher soil moisture generates stronger cohesion between soil particles, reducing the volume of the sliding body during the landslide. A higher rock content increases the accumulation angles and improves slope stability. The interaction between irregularly shaped rocks and soils provides more resistance than that between spherical rocks and soils, which gives the slope with irregular rocks and soils higher stability.
Keywords: landslide, soil moisture, rock content, experimental simulation
Procedia PDF Downloads 108
8916 Using Machine Learning to Classify Different Body Parts and Determine Healthiness
Authors: Zachary Pan
Abstract:
Our general mission is to solve the problem of classifying images into different body part types and deciding if each of them is healthy or not. However, for now, we will determine healthiness for only one-sixth of the body parts, specifically the chest. We will detect pneumonia in X-ray scans of those chest images. With this type of AI, doctors can use it as a second opinion when they are taking CT or X-ray scans of their patients. Another advantage of using this machine learning classifier is that it has no human weaknesses like fatigue. The overall approach to this problem is to split the problem into two parts: first, classify the image, then determine if it is healthy. In order to classify the image into a specific body part class, the body parts dataset must be split into test and training sets. We can then use many models, like neural networks or logistic regression models, and fit them using the training set. Now, using the test set, we can obtain a realistic estimate of the accuracy the models will have on images in the real world, since these testing images have never been seen by the models before. In order to increase this testing accuracy, we can also apply many complex algorithms to the models, like multiplicative weight update. For the second part of the problem, to determine if the body part is healthy, we can have another dataset consisting of healthy and non-healthy images of the specific body part and once again split that into test and training sets. We then use another neural network to train on those training set images and use the testing set to figure out its accuracy. We will do this process only for the chest images. A major conclusion reached is that convolutional neural networks are the most reliable and accurate at image classification. In classifying the images, the logistic regression model, the neural network, neural networks with multiplicative weight update, neural networks with the black box algorithm, and the convolutional neural network achieved 96.83 percent accuracy, 97.33 percent accuracy, 97.83 percent accuracy, 96.67 percent accuracy, and 98.83 percent accuracy, respectively. On the other hand, the overall accuracy of the model that determines if the images are healthy or not is around 78.37 percent.
Keywords: body part, healthcare, machine learning, neural networks
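A minimal convolutional classifier of the kind discussed above is sketched below; the layer sizes, input resolution and six-class output are illustrative assumptions, not the author's exact network.

```python
# Minimal convolutional image classifier (illustrative architecture only).
import torch
import torch.nn as nn

class SmallCNN(nn.Module):
    def __init__(self, n_classes=6):  # e.g., six body-part classes
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 16 * 16, 128), nn.ReLU(),
            nn.Linear(128, n_classes),
        )

    def forward(self, x):          # x: (batch, 1, 64, 64) grayscale scans
        return self.classifier(self.features(x))

model = SmallCNN()
dummy = torch.randn(8, 1, 64, 64)
print(model(dummy).shape)          # torch.Size([8, 6])
```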
Procedia PDF Downloads 113
8915 The Design of a Computer Simulator to Emulate Pathology Laboratories: A Model for Optimising Clinical Workflows
Authors: M. Patterson, R. Bond, K. Cowan, M. Mulvenna, C. Reid, F. McMahon, P. McGowan, H. Cormican
Abstract:
This paper outlines the design of a simulator to allow for the optimisation of clinical workflows through a pathology laboratory and to improve the laboratory's efficiency in the processing, testing, and analysis of specimens. Often pathologists have difficulty in pinpointing and anticipating issues in the clinical workflow until tests are running late or in error. It can be difficult to pinpoint the cause and even more difficult to predict any issues which may arise. For example, they often have no indication of how many samples are going to be delivered to the laboratory that day or at a given hour. If we could model scenarios using past information and known variables, it would be possible for pathology laboratories to initiate resource preparations, e.g. the printing of specimen labels or the activation of a sufficient number of technicians. This would expedite the clinical workload and clinical processes and improve the overall efficiency of the laboratory. The simulator design visualises the workflow of the laboratory, i.e. the clinical tests being ordered, the specimens arriving, current tests being performed, results being validated and reports being issued. The simulator depicts the movement of specimens through this process, as well as the number of specimens at each stage. This movement is visualised using an animated flow diagram that is updated in real time. A traffic light colour-coding system will be used to indicate the level of flow through each stage (green for normal flow, orange for slow flow, and red for critical flow). This would allow pathologists to clearly see where there are issues and bottlenecks in the process. Graphs would also be used to indicate the status of specimens at each stage of the process. For example, a graph could show the percentage of specimen tests that are on time, potentially late, running late and in error. Clicking on potentially late samples will display more detailed information about those samples, the tests that still need to be performed on them and their urgency level. This would allow any issues to be resolved quickly. In the case of potentially late samples, this could help to ensure that critically needed results are delivered on time. The simulator will be created as a single-page web application. Various web technologies will be used to create the flow diagram showing the workflow of the laboratory. JavaScript will be used to program the logic, animate the movement of samples through each of the stages and generate the status graphs in real time. This live information will be extracted from an Oracle database. As well as being used in a real laboratory situation, the simulator could also be used for training purposes. 'Bots' would be used to control the flow of specimens through each step of the process. Like existing software agent technology, these bots would be configurable in order to simulate different situations which may arise in a laboratory, such as an emerging epidemic. The bots could then be turned on and off to allow trainees to complete the tasks required at that step of the process, for example validating test results.
Keywords: laboratory process, optimization, pathology, computer simulation, workflow
Procedia PDF Downloads 288
8914 Efficient Pre-Processing of Single-Cell Assay for Transposase Accessible Chromatin with High-Throughput Sequencing Data
Authors: Fan Gao, Lior Pachter
Abstract:
The primary tool currently used to pre-process 10X Chromium single-cell ATAC-seq data is Cell Ranger, which can take a very long time to run on standard datasets. To facilitate rapid pre-processing that enables reproducible workflows, we present a suite of tools called scATAK for pre-processing single-cell ATAC-seq data that is 15 to 18 times faster than Cell Ranger on mouse and human samples. Our tool can also calculate chromatin interaction potential matrices and generate open chromatin signal and interaction traces for cell groups. We use the scATAK tool to explore the chromatin regulatory landscape of a healthy adult human brain and unveil cell-type specific features, and show that it provides a convenient and computationally efficient approach for pre-processing single-cell ATAC-seq data.
Keywords: single-cell, ATAC-seq, bioinformatics, open chromatin landscape, chromatin interactome
Procedia PDF Downloads 158
8913 The Effect of the Construction Contract System by Simulating the Comparative Costs of Capital to the Financial Feasibility of the Construction of Toll Bali Mandara
Authors: Mas Pertiwi I. G. AG Istri, Sri Kristinayanti Wayan, Oka Aryawan I. Gede Made
Abstract:
The ability of government to meet infrastructure investment needs is constrained by the size of budget commitments to other sectors. Another barrier is the complexity of the land acquisition process. Public Private Partnership can help bridge the investment gap by including funding from the private sector and shifting to it the responsibility for financing, construction of the asset, and operation and post-project design and care. In principle, the implementation of a construction project always requires an investor as the party that provides resources in the form of funding, which must be set out in an agreement in the form of a contract. In general, construction contracts consist of contracts used in Indonesia and international contracts. One source of funding used in the implementation of construction projects comes from collaboration between the government and the private sector, for example with systems such as BLT (Build Lease Transfer), BOT (Build Operate Transfer), BTO (Build Transfer Operate) and BOO (Build Operate Own). The form of payment under a construction contract can be distinguished in several ways: monthly payments, payments based on progress, and payment after project completion (Turn Key). One of the tools used to analyze the feasibility of the investment is a financial model. The financial model describes the relationship between the different variables and assumptions used. From the financial model, the cash flow structure of the project can be determined, including revenues, expenses, liabilities to creditors and tax payments to the government. The net cash flow generated by the project is used as the basis for analyzing investment feasibility. The sources of project financing in a Public Private Partnership can be equity or debt. The proportion of funding by source is the ratio of the investment funds from each financing source to the total investment cost during the construction period; the selected contract system and the alternative financing ratios will therefore generate different cash flow structures. The resulting cash flow structures are analyzed using a paired t-test to compare the contract systems and the alternative financing ratios, in order to determine the effect of the contract system and the financing ratio on the investment feasibility of the toll road construction project over an economic life of 20 (twenty) years. The case study used is the Bali Mandara toll road construction project, and the analysis covers only two contract systems, namely Build Operate Transfer and Turn Key. Results are based on the analysis of the investment feasibility variables NPV, BCR and IRR for the Build Operate Transfer and Turn Key contract systems at interest rates of 9%, 12% and 15%.
Keywords: contract system, financing, internal rate of return, net present value
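The feasibility indicators mentioned above can be illustrated with a toy cash flow; the figures, the helper functions and the bisection IRR solver below are illustrative assumptions, not project data.

```python
# Sketch of NPV and IRR on a made-up annual cash flow (not project data).
def npv(rate, cash_flows):
    # cash_flows[0] is the initial (usually negative) outlay at year 0.
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

def irr(cash_flows, lo=-0.99, hi=1.0, tol=1e-6):
    # Simple bisection: assumes the NPV changes sign exactly once on [lo, hi].
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if npv(lo, cash_flows) * npv(mid, cash_flows) <= 0:
            hi = mid
        else:
            lo = mid
    return (lo + hi) / 2

# Hypothetical 20-year project: initial investment, then constant net inflows.
cash_flows = [-1000.0] + [130.0] * 20

for rate in (0.09, 0.12, 0.15):
    print(f"NPV at {rate:.0%}: {npv(rate, cash_flows):8.1f}")
print(f"IRR: {irr(cash_flows):.2%}")
```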
Procedia PDF Downloads 232
8912 Design and Characterization of CMOS Readout Circuit for ISFET and ISE Based Sensors
Authors: Yuzman Yusoff, Siti Noor Harun, Noor Shelida Salleh, Tan Kong Yew
Abstract:
This paper presents the design and characterization of analog readout interface circuits for ion sensitive field effect transistor (ISFET) and ion selective electrode (ISE) based sensors. These interface circuits are implemented using MIMOS's 0.35um CMOS technology and experimentally characterized in a 24-lead QFN package. The characterization evaluates the circuits' functionality, output sensitivity and output linearity. Commercial sensors for both ISFET and ISE are employed together with a glass reference electrode during testing. The test results show that the designed interface circuits manage to read out the signals produced by both sensors, with measured sensitivities of 54 mV/pH for the ISFET and 62 mV/decade for the ISE. The characterized output linearity for both circuits achieves an R-squared above 0.999. The readout has also demonstrated reliable operation by passing all qualifications in the reliability test plan.
Keywords: readout interface circuit (ROIC), analog interface circuit, ion sensitive field effect transistor (ISFET), ion selective electrode (ISE), ion sensor electronics
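As a rough illustration of how the measured sensitivity translates readout voltage into pH, the sketch below uses the 54 mV/pH figure reported above together with a hypothetical calibration point; the sign of the slope and the offset depend on the actual readout configuration.

```python
# Sketch: convert an interface-circuit output voltage into a pH estimate using
# the reported 54 mV/pH sensitivity. The calibration point (voltage at pH 7)
# and the slope sign are hypothetical; the ideal Nernstian slope is ~59 mV/pH at 25 C.
SENSITIVITY_V_PER_PH = 0.054
V_AT_PH7 = 1.200  # hypothetical calibration voltage at pH 7

def estimate_ph(v_out):
    """First-order linear estimate of pH from the readout output voltage."""
    return 7.0 + (v_out - V_AT_PH7) / SENSITIVITY_V_PER_PH

for v in (1.200, 1.254, 1.092):
    print(f"V_out = {v:.3f} V  ->  pH ~ {estimate_ph(v):.2f}")
```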
Procedia PDF Downloads 317
8911 Mathematical Modelling of the Effect of Glucose on Pancreatic Alpha-Cell Activity
Authors: Karen K. Perez-Ramirez, Genevieve Dupont, Virginia Gonzalez-Velez
Abstract:
Pancreatic alpha-cells participate in glucose regulation together with beta cells. They release the hormone glucagon when the glucose level is low to stimulate gluconeogenesis in the liver. Like other excitable cells, alpha cells generate Ca2+ and metabolic oscillations when they are stimulated. It is known that the glucose level can trigger or silence this activity, although it is not clear how this occurs in normal and diabetic people. In this work, we propose an electric-metabolic mathematical model, implemented in Matlab, to study the effect of different glucose levels on the electrical response and Ca2+ oscillations of an alpha cell. Our results show that Ca2+ oscillations appear in antiphase with the metabolic oscillations over a window of glucose values. The model also predicts a direct relationship between the glucose level and the intracellular adenine nucleotides, indicating a self-regulating pathway for the alpha cell.
Keywords: Ca2+ oscillations, mathematical model, metabolic oscillations, pancreatic alpha cell
Procedia PDF Downloads 184
8910 A Framework for Vacant City-Owned Land to Be Utilised for Urban Agriculture: The Case of Cape Town, South Africa
Authors: P. S. Van Staden, M. M. Campbell
Abstract:
Vacant City of Cape Town-owned land lying unutilized and unproductive could be developed for land uses such as urban agriculture, which may improve the livelihoods of low-income families. The new City of Cape Town zoning scheme includes an Urban Agriculture zoning for the first time. Unstructured qualitative interviews with town planners revealed their optimism about this inclusion, as it will provide low-income residents with opportunities to generate an income. An existing farming community at Philippi, located within the municipal boundary of the city, was approached, and empirical data obtained through questionnaires provided proof that urban agriculture could be viable in a coastal metropolitan city such as Cape Town, even if farmers only produce for their own households. The lease method proposed for urban agriculture is a usufruct agreement conferring the right to another party, other than the legal owner, to enjoy the use and advantages of the property.
Keywords: land uses, urban agriculture, agriculture, food engineering
Procedia PDF Downloads 302
8909 Synthesis of Double Dye-Doped Silica Nanoparticles and Its Application in Paper-Based Chromatography
Authors: Ka Ho Yau, Jan Frederick Engels, Kwok Kei Lai, Reinhard Renneberg
Abstract:
The lateral flow test is a prevalent technology in various sectors such as food, pharmacology and the biomedical sciences. Colloidal gold (CG) is widely used as the signalling molecule because of its ease of synthesis and biomolecular conjugation and its red colour due to the intrinsic surface plasmon resonance effect (SPRE). However, the production of colloidal gold is costly and requires vigorous conditions, and the stability of colloidal gold is easily affected by environmental factors such as pH, high salt content, etc. Silica nanoparticles are well known for their ease of production and stability over a wide range of solvents. Using reverse micro-emulsion (w/o), silica nanoparticles of different sizes can be produced precisely by controlling the amount of water. By incorporating different water-soluble dyes, a rainbow of colours of silica nanoparticles can be produced. Conjugation with biomolecules such as antibodies can be achieved after surface modification of the silica nanoparticles with organosilane. The optimum amount of antibodies to be labelled was determined by the Bradford assay. In this work, we have demonstrated the ability of the dye-doped silica nanoparticles to act as a signalling molecule in a lateral flow test, which provided a semi-quantitative measurement of the analyte. Image analysis gave an LOD of 10 ng of the analyte. The working range and the linear range of the test were 0 to 2.15 μg/mL and 0 to 1.07 μg/mL (R² = 0.988), respectively. The performance of the tests was comparable to those using colloidal gold, with the advantages of lower cost, enhanced stability and a wide spectrum of colours. The positive lines can be imaged by the naked eye or with a mobile phone camera for better quantification. Further research has been carried out on the simultaneous multicolour detection of different biomarkers. The preliminary results were promising, as little cross-reactivity was observed for an optimized system. This approach provides a platform for multicolour detection of a set of biomarkers that enhances the accuracy of disease diagnostics.
Keywords: colorimetric detection, immunosensor, paper-based biosensor, silica
Procedia PDF Downloads 386
8908 The Impact of a Model's Skin Tone and Ethnic Identification on Consumer Decision Making
Authors: Shanika Y. Koreshi
Abstract:
Sri Lanka housed the lingerie product development and manufacturing subsidiaries of renowned brands such as La Senza, Marks & Spencer, H&M, Etam, Lane Bryant, and George. Over the last few years, these subsidiaries have produced local brands such as Amante to cater to local and regional customers. Past research has identified factors such as quality, price, and design as vital when marketing lingerie to consumers. However, there has been minimal research looking into ethnically targeted markets and skin colour within the Asian population. Therefore, the main aim of the research was to identify whether consumer preference for lingerie is influenced by the skin tone of the model wearing it. The secondary aim was to investigate whether consumer preference for lingerie is influenced by the consumer's ethnic identification with the skin tone of the model. An experimental design was used to explore these aims. The participants were 66 females residing in the Western Province of Sri Lanka, recruited via convenience sampling. Six computerized images of a real model were used in the study, and her skin tone was digitally manipulated to express three different skin tones (light, tan and dark). Consumer preferences were measured through a rank-order scale constructed via a focus group discussion, and ethnic identity was measured by the Multigroup Ethnic Identity Measure-Revised. The Wilcoxon signed-rank test, Friedman test, and chi-square test of independence were carried out using SPSS version 20. The results indicated that the majority of consumers ethnically identified with and preferred the tan skin tone over the light and dark skin tones. The findings support the existing literature stating that consumers prefer models with a medium skin tone over a lighter skin tone. The preference for a tan skin tone in a model is consistent with the ethnic identification of the Sri Lankan sample. The study implies that lingerie brands should consider models' skin tones when marketing to different ethnic backgrounds.
Keywords: consumer preference, ethnic identification, lingerie, skin tone
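The nonparametric tests named above can be sketched as follows on made-up preference ranks for the three skin tones; the numbers are not the study's data.

```python
# Sketch of the Friedman, Wilcoxon signed-rank and chi-square tests on
# hypothetical preference ranks (1 = most preferred); not the study's data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
n = 66
# Hypothetical preference ranks per participant for light / tan / dark images.
ranks = np.array([rng.permutation([1, 2, 3]) for _ in range(n)])
light, tan, dark = ranks[:, 0], ranks[:, 1], ranks[:, 2]

# Friedman test across the three related samples.
print(stats.friedmanchisquare(light, tan, dark))

# Wilcoxon signed-rank test for one pairwise comparison (tan vs. light).
print(stats.wilcoxon(tan, light))

# Chi-square test of independence on a hypothetical ethnicity x preference table.
table = np.array([[20, 35, 11],
                  [ 8, 40, 18]])
chi2, p, dof, _ = stats.chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, p = {p:.3f}, dof = {dof}")
```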
Procedia PDF Downloads 263