Search results for: Monte Carlo steps
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 2011

1591 A Risk Management Framework for Selling a Mega Power Plant Project in a New Market

Authors: Negar Ganjouhaghighi, Amirali Dolatshahi

Abstract:

The origin of most risks of a mega project usually lies in the phases before the contract is closed. From a practical point of view, using project risk management techniques only when preparing a proposal is not a complete solution for managing the risks of a contract. The objective of this paper is to cover all activities associated with risk management of a mega project's sales process, from entering a new market to the award of the contract and the review of contract performance. In this study, risk management proceeds in six consecutive steps, divided into three distinct but interdependent phases upstream of the award of the contract: pre-tendering, tendering, and closing. In the first step, the risks of the new market are identified by preparing a standard market risk report. The next step is the bid/no-bid decision, based on the previously gathered data. During the next three steps, in the tendering phase, project risk management techniques are applied to determine how much contingency reserve must be added to or removed from the estimated cost in order to bring the residual risk to an acceptable level. Finally, the last step, in the closing phase, is an overview of the project risks and a final clarification of residual risks. The sales experience of more than 20,000 MW of turn-key power plant projects, together with this framework, is used to develop software that assists the sales team in managing project risk.
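
As a hedged illustration of the tendering-phase contingency sizing described above, the following sketch aggregates a hypothetical risk register by Monte Carlo and reads the reserve off at a chosen confidence level; the register, probabilities, and P80 target are all assumptions, not the paper's data or software.

```python
# Hypothetical risk register: (probability of occurrence, cost impact in $M).
# Monte Carlo aggregation sets the contingency reserve at a target confidence
# level; this is an illustrative approach, not the paper's tool.
import numpy as np

risks = [(0.30, 12.0), (0.10, 40.0), (0.50, 5.0), (0.20, 18.0)]
rng = np.random.default_rng(0)
n = 100_000

total = np.zeros(n)
for p, impact in risks:
    occurs = rng.random(n) < p          # does the risk materialize this trial?
    total += occurs * impact            # add its cost impact when it does

reserve = np.percentile(total, 80)      # reserve at the 80% confidence level
print(f"expected risk cost: {total.mean():.1f} $M, reserve (P80): {reserve:.1f} $M")
```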

Keywords: project marketing, risk management, tendering, project management, turn-key projects

Procedia PDF Downloads 323
1590 Use of PACER Application as Physical Activity Assessment Tool: Results of a Reliability and Validity Study

Authors: Carine Platat, Fatima Qshadi, Ghofran Kayed, Nour Hussein, Amjad Jarrar, Habiba Ali

Abstract:

Nowadays, smartphones are very popular and offer a variety of easy-to-use, free applications, among which are step counters and fitness tests. The number of users is huge, making such applications a potentially efficient new strategy to encourage people to become more active. Nonetheless, data on their reliability and validity are very scarce, and when available, they are often negative and contradictory. Besides, weight status, which is likely to introduce a bias into physical activity assessment, was not often considered. Hence, the use of these applications as motivational tools, as assessment tools, and in research is questionable. PACER is one of the free step counter applications. Even though it is one of the best-rated free applications by users, it has never been tested for reliability and validity, which remains to be investigated prior to any use of PACER. The objective of this work is to investigate the reliability and validity of the smartphone application PACER in measuring the number of steps and in assessing cardiorespiratory fitness by the 6-minute walking test. 20 overweight or obese students (10 male and 10 female) were recruited at the United Arab Emirates University, aged between 18 and 25 years. Reliability and validity were tested in real-life conditions and in controlled conditions using a treadmill. Test-retest experiments were done with PACER on 2 days separated by a week in real-life conditions (24 hours each time) and in controlled conditions (30 minutes on a treadmill, 3 km/h). Validity was tested against the OMRON pedometer in the same conditions. During the treadmill test, video was recorded, and step numbers were compared between PACER, the pedometer, and the video. The validity of PACER in estimating cardiorespiratory fitness (VO2max) as part of the 6-minute walking test (6MWT) was studied against the 20 m shuttle running test. Reliability was studied by calculating intraclass correlation coefficients (ICC) and 95% confidence intervals (95%CI) and by Bland-Altman plots. Validity was studied by calculating the Spearman correlation coefficient (rho) and Bland-Altman plots. PACER reliability was good in both males and females in real-life conditions (p≤10⁻³) but only in females in controlled conditions (p=0.01). PACER was valid against the OMRON pedometer in males and females in real-life conditions (rho=0.94, p≤10⁻³; rho=0.64, p=0.01, in males and females respectively). In controlled conditions, PACER was not valid against the pedometer, but PACER was valid against video in females (rho=0.72, p≤10⁻³). PACER was valid against the shuttle run test in males and females (rho=0.66, p=0.01; rho=0.51, p=0.04) for estimating VO2max. This study provides data on the reliability and validity of PACER in overweight or obese male and female young adults. Globally, PACER was shown to be reliable and valid in real-life conditions in overweight or obese males and females for counting steps and assessing fitness. This supports the use of PACER to assess and promote physical activity in clinical follow-up and community interventions.
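
For readers unfamiliar with the validity statistics used above, here is a minimal sketch of the Spearman correlation and Bland-Altman agreement analysis on hypothetical step-count data; the numbers are invented, and the ICC computation is omitted.

```python
# Sketch of the validity analysis described above, on hypothetical step counts:
# Spearman correlation of PACER vs. the OMRON pedometer, plus Bland-Altman
# bias and limits of agreement.
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(0)
pedometer = rng.normal(8000, 2000, 20)            # 20 participants, steps/day
pacer = pedometer + rng.normal(0, 400, 20)        # app readings with noise

rho, p = spearmanr(pacer, pedometer)
print(f"Spearman rho = {rho:.2f}, p = {p:.3f}")

diff = pacer - pedometer                          # Bland-Altman statistics
bias, sd = diff.mean(), diff.std(ddof=1)
print(f"bias = {bias:.0f} steps, limits of agreement = "
      f"[{bias - 1.96*sd:.0f}, {bias + 1.96*sd:.0f}]")
```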

Keywords: smartphone application, pacer, reliability, validity, steps, fitness, physical activity

Procedia PDF Downloads 446
1589 Towards the Design of Gripper Independent of Substrate Surface Structures

Authors: Annika Schmidt, Ausama Hadi Ahmed, Carlo Menon

Abstract:

End effectors for robotic systems are becoming more and more advanced, resulting in a growing variety of gripping tasks. However, most grippers are application-specific. This paper presents a gripper that interacts with an object's surface rather than depending on a defined shape or size. For this purpose, ingressive and astrictive features are combined to achieve the desired gripping capabilities. The developed prototype is tested on a variety of surfaces with different hardness and roughness properties. The results show that the gripping mechanism works on all of the tested surfaces. The influence of the material properties on the supported load is also studied, and the efficiency is discussed.

Keywords: claw, dry adhesion, insects, material properties

Procedia PDF Downloads 351
1588 Comparative Study of Outcome of Patients with Wilms Tumor Treated with Upfront Chemotherapy and Upfront Surgery in Alexandria University Hospitals

Authors: Golson Mohamed, Yasmine Gamasy, Khaled EL-Khatib, Anas Al-Natour, Shady Fadel, Haytham Rashwan, Haytham Badawy, Nadia Farghaly

Abstract:

Introduction: Wilms tumor is the most common malignant renal tumor in children. Much progress has been made in the management of patients with this malignancy over the last three decades. Today, treatments are based on several trials and studies conducted by the International Society of Pediatric Oncology (SIOP) in Europe and the National Wilms Tumor Study Group (NWTS) in the USA. It is necessary to understand why we follow either of the protocols: NWTS, which follows the upfront-surgery principle, or SIOP, which follows the upfront-chemotherapy principle, in all stages of the disease. Objective: The aim is to assess the outcome of patients treated with preoperative chemotherapy and patients treated with upfront surgery and to compare their effects on overall survival. Study design: To decide which protocol to follow, a retrospective survey was carried out on the records, from 2010 to 2015, of patients aged 1 day to 18 years suffering from Wilms tumor who were admitted to the Alexandria University Hospital pediatric oncology, pediatric urology, and pediatric surgery departments. The transfer sheet was designed and edited following the PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) flow diagram. Data were fed to the computer and analyzed using the IBM SPSS software package, version 20.0. Qualitative data were described using number and percent. Quantitative data were described using range (minimum and maximum), mean, standard deviation, and median. Comparison between different groups regarding categorical variables was tested using the chi-square test. When more than 20% of the cells had an expected count less than 5, correction for chi-square was conducted using Fisher's exact test or the Monte Carlo correction. The distributions of quantitative variables were tested for normality using the Kolmogorov-Smirnov, Shapiro-Wilk, and D'Agostino tests; if these revealed a normal data distribution, parametric tests were applied. If the data were abnormally distributed, non-parametric tests were used. For normally distributed data, comparison between two independent populations was done using the independent t-test; for abnormally distributed data, the Mann-Whitney test was used. Significance of the obtained results was judged at the 5% level. Results: A statistically significant difference in survival was observed between the two studied groups, favoring upfront chemotherapy (86.4%) as compared to the upfront surgery group (59.3%), where P=0.009. As regards complications, 20 cases (74.1%) out of 27 were complicated in the group of patients treated with upfront surgery, while 30 cases (68.2%) out of 44 had complications among patients treated with upfront chemotherapy. Also, the incidence of intraoperative complication (rupture) was lower in the upfront chemotherapy group than in the upfront surgery group. Conclusion: Upfront chemotherapy has superiority over upfront surgery, as patients who started with upfront chemotherapy showed a higher survival rate, a lower complication rate, less need for radiotherapy, and a lower recurrence rate.
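
The test-selection logic described in the study design can be sketched as follows; the group data are hypothetical and only stand in for the SPSS workflow (chi-square with a Fisher's exact fallback, a normality check, then t-test or Mann-Whitney).

```python
# Sketch of the test-selection logic described above (hypothetical data,
# not the study's records).
import numpy as np
from scipy import stats

# Hypothetical 2x2 table: complications by treatment group
table = np.array([[20, 7],    # upfront surgery: complicated / not
                  [30, 14]])  # upfront chemo:   complicated / not

chi2, p, dof, expected = stats.chi2_contingency(table)
if (expected < 5).mean() > 0.20:          # >20% of cells with expected < 5
    _, p = stats.fisher_exact(table)      # fall back to Fisher's exact test
print(f"categorical comparison: p = {p:.3f}")

# Hypothetical quantitative variable (e.g., age in years) per group
rng = np.random.default_rng(0)
g1, g2 = rng.normal(6, 2, 27), rng.normal(5, 2, 44)

normal = all(stats.shapiro(g).pvalue > 0.05 for g in (g1, g2))
if normal:
    stat, p = stats.ttest_ind(g1, g2)     # parametric
else:
    stat, p = stats.mannwhitneyu(g1, g2)  # non-parametric
print(f"quantitative comparison: p = {p:.3f} (significance judged at 5%)")
```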

Keywords: Wilms tumor, renal tumor, chemotherapy, surgery

Procedia PDF Downloads 315
1587 Feasibility Study of a Solar Farm Project with an Executive Approach

Authors: Amir Reza Talaghat

Abstract:

Since 2015, a new approach and policy regarding the protection of energy resources and the use of renewable energies has been adopted in Iran, leading to the development of new projects. Investigating the feasibility of these new projects helped to identify five steps for preparing an executive feasibility study: site selection, authorizations, design and simulation, economic study, and programming. The results are interesting and essential for decision makers and investors wishing to start implementing such projects under reliable conditions. The research was conducted through the collection and study of the projects' documents as well as recalculation to verify the conformity of the results with GIS data and the technical information of the bidders. This paper describes the results of the performed research by presenting the five steps as an executive methodology for preparing a feasibility study of a 10 MW solar farm project. The corresponding results, which also help decision makers to start similar projects, are explained as follows: selecting the best location for the concerned PV plant; reliable and safe conditions for investment and the required authorizations to start implementing the solar farm project in the concerned region; selecting suitable components to achieve the best possible performance of the plant; the economic profit of the investment; and proper programming to implement the project on time.
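
As a hedged illustration of the economic-study step, the sketch below computes the net present value of a 10 MW solar farm from assumed capital cost, capacity factor, tariff, and discount rate; none of these figures come from the paper.

```python
# Hypothetical back-of-envelope economics for a 10 MW solar farm:
# all figures below are illustrative assumptions, not the paper's data.
capex = 10_000 * 800           # $/kW installed x 10,000 kW
opex_per_year = 0.01 * capex   # 1% of capex per year
energy_mwh = 10 * 8760 * 0.18  # 10 MW x hours/year x capacity factor
price_per_mwh = 60.0           # $/MWh tariff
rate, years = 0.08, 20         # discount rate, project lifetime

npv = -capex + sum(
    (energy_mwh * price_per_mwh - opex_per_year) / (1 + rate) ** t
    for t in range(1, years + 1)
)
print(f"NPV over {years} years: ${npv:,.0f}")
```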

Keywords: solar farm, solar energy, execution of PV power plant

Procedia PDF Downloads 173
1586 Application of Harris Hawks Optimization Metaheuristic Algorithm and Random Forest Machine Learning Method for Long-Term Production Scheduling Problem under Uncertainty in Open-Pit Mines

Authors: Kamyar Tolouei, Ehsan Moosavi

Abstract:

In open-pit mines, the long-term production scheduling optimization problem (LTPSOP) is a complicated problem that involves constraints, large datasets, and uncertainties. Uncertainty in the output is caused by several geological, economic, or technical factors. Due to its dimensions and NP-hard nature, it is usually difficult to find an ideal solution to the LTPSOP. The optimal schedule generally constrains the ore, metal, and waste tonnages, average grades, and cash flows of each period. Past decades have witnessed important advances in long-term production scheduling and optimization algorithms, as researchers have become highly cognizant of the issue. Even so, the LTPSOP cannot be considered a well-solved problem. Traditional production scheduling methods in open-pit mines apply a single estimated orebody model to produce optimal schedules. The smoothing effect of some geostatistical estimation procedures causes most mine schedules and production predictions to be unrealistic and imperfect. With the expansion of simulation procedures, the risks from grade uncertainty in ore reserves can be evaluated and organized through a set of equally probable orebody realizations. In this paper, to synthesize grade uncertainty into the strategic mine schedule, a stochastic integer programming framework is presented for the LTPSOP. The objective function of the model is to simultaneously maximize the net present value and minimize the risk of deviation from the production targets, considering grade uncertainty, while satisfying all technical constraints and operational requirements. Instead of applying one estimated orebody model as input to optimize the production schedule, a set of equally probable orebody realizations is applied to synthesize grade uncertainty into the strategic mine schedule and to produce a more profitable and risk-based production schedule. A mixture of metaheuristic procedures and mathematical methods paves the way to an appropriate solution. This paper introduces a hybrid model between the augmented Lagrangian relaxation (ALR) method and a metaheuristic algorithm, the Harris Hawks optimization (HHO), to solve the LTPSOP under grade uncertainty. In this study, the HHO is used to update the Lagrange multipliers. In addition, a machine learning method called Random Forest is applied to estimate the gold grade in a mineral deposit. The Monte Carlo method is used as the simulation method, with 20 realizations. The results show that the proposed versions improve considerably on the traditional methods. The outcomes were also compared with the ALR-genetic algorithm and ALR-subgradient methods. To demonstrate the applicability of the model, a case study on an open-pit gold mining operation is implemented. The framework displays the capability to minimize risk and to improve the expected net present value and financial profitability for the LTPSOP. The framework controls geological risk more effectively than the traditional procedure by considering grade uncertainty in the hybrid model framework.
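
A minimal sketch of the grade-estimation step follows, assuming synthetic drill-hole data: a Random Forest regressor maps sample coordinates to gold grade and predicts grades for unsampled blocks. The realization generation and the ALR-HHO scheduling loop are beyond this sketch.

```python
# Minimal sketch (hypothetical data) of the grade-estimation step: a Random
# Forest regressor mapping drill-hole coordinates to gold grade, then used to
# predict grades on unsampled blocks.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(42)
xyz = rng.uniform(0, 1000, size=(500, 3))        # drill-hole sample coords (m)
grade = (np.exp(-((xyz[:, 0] - 500) ** 2) / 1e5)
         + rng.normal(0, 0.05, 500))             # synthetic gold grade (g/t)

rf = RandomForestRegressor(n_estimators=200, random_state=0)
rf.fit(xyz, grade)

blocks = rng.uniform(0, 1000, size=(20, 3))      # unsampled block centroids
print(rf.predict(blocks))                        # estimated block grades
```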

Keywords: grade uncertainty, metaheuristic algorithms, open-pit mine, production scheduling optimization

Procedia PDF Downloads 98
1585 Flood Disaster Prevention and Mitigation in Nigeria Using Geographic Information System

Authors: Dinebari Akpee, Friday Aabe Gaage, Florence Fred Nwaigwu

Abstract:

Natural disasters like floods affect many parts of the world, including developing countries like Nigeria. As a result, many human lives are lost, properties are damaged, and much money is lost through infrastructure damage. These hazards and losses can be mitigated and reduced by providing reliable spatial information about flood risks to the general public through flood inundation maps. Flood inundation maps are crucial for emergency action plans, urban planning, ecological studies, and insurance rates. Nigeria experienced its worst flood in its entire history this year: many cities were submerged and completely under water due to torrential rainfall. Poor city planning and a lack of effective development control, among other factors, contribute to the problem. A geographic information system (GIS) can be used to visualize the extent of flooding and to analyze flood maps to produce flood damage estimation maps and flood risk maps. In this research, the following steps were taken in preparing flood risk maps for the study area: (1) digitization of topographic data and preparation of a digital elevation model using ArcGIS, (2) flood simulation using a hydraulic model, and (3) integration of the first two steps to produce flood risk maps. The results show that GIS can play a crucial role in flood disaster control and mitigation.
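
As a toy illustration of steps (2)-(3), the sketch below derives a flood-extent mask and depth grid by comparing a synthetic digital elevation model with a simulated water level; the paper's actual ArcGIS/hydraulic-model workflow is not reproduced.

```python
# Illustrative sketch (not the paper's ArcGIS workflow): flag inundated cells
# by comparing a digital elevation model against a simulated water level.
import numpy as np

dem = np.random.default_rng(1).uniform(0, 30, size=(100, 100))  # synthetic DEM (m)
water_level = 12.0                 # water surface from a hydraulic model (m)

inundated = dem < water_level      # boolean flood-extent mask
depth = np.where(inundated, water_level - dem, 0.0)   # flood depth per cell
print(f"flooded area: {inundated.mean():.0%}, max depth: {depth.max():.1f} m")
```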

Keywords: flood disaster, risk maps, geographic information system, hazards

Procedia PDF Downloads 220
1584 Optimization of Hepatitis B Surface Antigen Purifications to Improving the Production of Hepatitis B Vaccines on Pichia pastoris

Authors: Rizky Kusuma Cahyani

Abstract:

Hepatitis B is an inflammatory liver disease caused by the hepatitis B virus (HBV). This infection can be prevented by vaccination with a vaccine containing the HBV surface protein (sHBsAg). However, vaccine supply is limited. Several attempts have been made to produce local sHBsAg; however, the degree of purity and the protein yield are still inadequate. Therefore, optimization of the HBsAg purification steps is required to obtain a high yield with a better purification fold. In this study, purification was optimized in 2 steps: precipitation, using variations of NaCl concentration (0.3 M, 0.5 M, 0.7 M) and PEG (3%, 5%, 7%); and ion exchange chromatography (IEC), using NaCl elution buffer concentrations of 300-500 mM. To quantify the HBsAg protein, the bicinchoninic acid assay (BCA) and an enzyme-linked immunosorbent assay (ELISA) were used in this study. The HBsAg protein was visualized by SDS-PAGE analysis. Based on the quantitative analysis, the optimal condition at the precipitation step was 0.3 M NaCl and 3% PEG, while in the ion exchange chromatography step, the optimum condition was elution with 500 mM NaCl. Sodium dodecyl sulfate-polyacrylamide gel electrophoresis (SDS-PAGE) analysis indicated the presence of the HBsAg protein, with molecular weights of 25 kDa (monomer) and 50 kDa (dimer). The optimum purification conditions for sHBsAg produced in Pichia pastoris gave a yield of 47% and a purification fold of 17x, which would make hepatitis B vaccine production more optimal.
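
The reported figures can be reproduced by the standard yield and purification-fold relations, as in the sketch below; the assay values are hypothetical, back-calculated to match the 47% yield and ~17x fold.

```python
# Sketch of how yield and purification fold are computed (illustrative
# numbers chosen to reproduce the reported 47% yield and ~17x fold;
# the actual assay values are not given in the abstract).
crude_total_protein_mg    = 100.0   # hypothetical, from BCA assay
crude_hbsag_mg            = 1.0     # hypothetical, from ELISA
purified_total_protein_mg = 2.76    # hypothetical
purified_hbsag_mg         = 0.47    # hypothetical

yield_pct = 100 * purified_hbsag_mg / crude_hbsag_mg
fold = ((purified_hbsag_mg / purified_total_protein_mg)
        / (crude_hbsag_mg / crude_total_protein_mg))   # specific-activity ratio
print(f"yield: {yield_pct:.0f}%, purification fold: {fold:.0f}x")
```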

Keywords: hepatitis B virus, HBsAg, hepatitis B surface antigen, Pichia pastoris, purification

Procedia PDF Downloads 148
1583 The Democratization of 3D Capturing: An Application Investigating Google Tango Potentials

Authors: Carlo Bianchini, Lorenzo Catena

Abstract:

The appearance of 3D scanners and, more recently, of image-based systems that generate point clouds directly from common digital images has deeply affected the survey process in terms of both capturing and 2D/3D modelling. In this context, low-cost and mobile systems are increasingly playing a key role and are actually paving the way to the democratization of what was in the past the realm of a few specialized technicians and expensive equipment. The application of Google Tango to the ancient church of Santa Maria delle Vigne in Pratica di Mare, Rome, presented in this paper is one of these examples.

Keywords: architectural survey, augmented/mixed/virtual reality, Google Tango project, image-based 3D capturing

Procedia PDF Downloads 146
1582 Reliability-Centered Maintenance Application for the Development of Maintenance Strategy for a Cement Plant

Authors: Nabil Hameed Al-Farsi

Abstract:

This study's main goal is to develop a model and a maintenance strategy for a cement factory, the Arabian Cement Company, Rabigh Plant. The proposed work depends on the reliability-centered maintenance (RCM) approach to develop a strategy and maintenance schedule that ensure increasing the reliability of the production system components, thus ensuring continuous productivity. Cost-effective maintenance of the plant's dependability performance is the key goal of reliability-centered maintenance. The cement production process consists of 7 important steps; accordingly, developing a maintenance plan based on the RCM method is made up of 10 steps, starting from selecting units and data through performing and updating the model. The processing unit chosen for the analysis of this case is the calciner unit; for the model's validation, the data history acquired from the maintenance department of Travancore Titanium Products Ltd (TTP) was used. After applying the proposed model, the results of the maintenance simulation justified reconsidering the plant's existing scheduled maintenance policy. The results indicate the need for preventive maintenance for all Class A criticality equipment instead of the planned maintenance, and for breakdown maintenance for all other equipment, depending on its criticality and an FMEA report. Consequently, the additional cost of preventive maintenance would be offset by the cost savings from breakdown maintenance for the remaining equipment.
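
As a hedged illustration of the FMEA step, the sketch below ranks hypothetical failure modes by risk priority number (severity x occurrence x detection) and assigns a criticality class; the failure modes, scores, and class cutoff are assumptions, not the plant's data.

```python
# Illustrative FMEA criticality ranking (hypothetical failure modes, not the
# plant's actual data): risk priority number = severity x occurrence x detection.
failure_modes = [
    # (equipment, severity 1-10, occurrence 1-10, detection 1-10)
    ("kiln drive gearbox", 9, 4, 6),
    ("raw mill fan bearing", 7, 5, 4),
    ("calciner refractory", 8, 3, 7),
]
for name, s, o, d in sorted(failure_modes, key=lambda m: -(m[1] * m[2] * m[3])):
    rpn = s * o * d
    cls = "A" if rpn >= 150 else "B"   # assumed criticality cutoff
    print(f"{name:24s} RPN={rpn:3d} class {cls}")
```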

Keywords: engineering, reliability, strategy, maintenance, failure modes, effects and criticality analysis (FMEA)

Procedia PDF Downloads 165
1581 Biographical Learning and Its Impact on the Democratization Processes of Post War Societies

Authors: Rudolf Egger

Abstract:

This article presents some results of an ongoing project in Kosova. The project deals with the meaning of social transformation processes in the life courses of the people of Kosova. One goal is to create an oral history archive in this country. Over the last seven years, we have done interpretative work (using narrative interviews) concerning the experiences and meanings of social change from the perspective of the life course. We want to reconstruct the individual possibilities for creating one's life within new social structures. After the terrible massacres of ethnically-territorially defined nationalism in former Yugoslavia, the main focus is to find out something about the many small daily steps which must be taken to build up a kind of "normality" in this country. These steps can be reconstructed very well through narrations, through life stories, because personal experiences are naturally linked with social orders. Each individual story is connected with further stories, in which the collective history is negotiated and reflected. The view on biographical narration opens the possibility of analyzing the concreteness of the "individual case" within the complexity of collective history. Life stories thereby have a kind of transitional character, which is why they can be used for the reconstruction of periods of political transformation. For example, in the individual stories we can find very clearly the national or mythological character of the Albanian people in Kosova. The narrations shown can also be read as narrative lines relating to the (re-)interpretation of the past, in which lived life is fixed into history in the so-called collective memory of Kosova.

Keywords: biographical learning, adult education, social change, post war societies

Procedia PDF Downloads 415
1580 Readout Development of a LGAD-based Hybrid Detector for Microdosimetry (HDM)

Authors: Pierobon Enrico, Missiaggia Marta, Castelluzzo Michele, Tommasino Francesco, Ricci Leonardo, Scifoni Emanuele, Vincezo Monaco, Boscardin Maurizio, La Tessa Chiara

Abstract:

Clinical outcomes collected over the past three decades have suggested that ion therapy has the potential to be a treatment modality superior to conventional radiation for several types of cancer, including recurrences, as well as for other diseases. Although the results have been encouraging, numerous treatment uncertainties remain a major obstacle to the full exploitation of particle radiotherapy. To overcome therapy uncertainties and optimize treatment outcome, the best possible description of radiation quality is of paramount importance for linking the physical dose to biological effects. Microdosimetry was developed as a tool to improve the description of radiation quality. By recording the energy deposition at the micrometric scale (the typical size of a cell nucleus), this approach takes into account the non-deterministic nature of atomic and nuclear processes and creates a direct link between the dose deposited by radiation and the biological effect induced. Microdosimeters measure the spectrum of lineal energy y, defined as the energy deposition in the detector divided by the most probable track length travelled by the radiation. The latter is provided by the so-called "mean chord length" (MCL) approximation and is related to the detector geometry. To improve the characterization of radiation field quality, we define a new quantity replacing the MCL with the actual particle track length inside the microdosimeter. In order to measure this new quantity, we propose a two-stage detector consisting of a commercial tissue equivalent proportional counter (TEPC) and 4 layers of low gain avalanche detector (LGAD) strips. The TEPC detector records the energy deposition in a region equivalent to 2 um of tissue, while the LGADs are very suitable for particle tracking because their thickness can be thinned down to tens of micrometers and they have a fast response to ionizing radiation. The concept of HDM has been investigated and validated with Monte Carlo simulations. Currently, a dedicated readout is under development. This two-stage detector requires two different systems whose complementary information is joined for each event: the energy deposition in the TEPC and the respective track length recorded by the LGAD tracker. This challenge is being addressed by implementing System-on-Chip (SoC) technology, relying on Field Programmable Gate Arrays (FPGAs) based on the Zynq architecture. The TEPC readout consists of three different signal amplification legs and is carried out by 3 ADCs mounted on an FPGA board. The LGAD activated-strip signals are processed by dedicated chips, and finally, the activated strips are stored, relying again on FPGA-based solutions. In this work, we provide a detailed description of the HDM geometry and the SoC solutions that we are implementing for the readout.
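
A toy numeric illustration of the quantity HDM targets is given below: lineal energy computed with the mean chord length of the 2 um sensitive volume (4V/S = 2d/3 ≈ 1.33 um for a sphere, by Cauchy's formula) versus the same energy divided by a tracked path length; the deposited energy and track length are invented values.

```python
# Toy illustration of the quantity HDM targets: lineal energy computed with
# the mean-chord-length approximation vs. the measured track length.
# Numbers are hypothetical.
energy_kev = 5.0                 # energy deposited in the TEPC (keV)
mcl_um = 2 * 2.0 / 3             # mean chord length of a 2 um sphere (um)
track_um = 1.8                   # actual track length from the LGAD tracker (um)

y_mcl = energy_kev / mcl_um      # conventional lineal energy (keV/um)
y_track = energy_kev / track_um  # track-length-based quantity (keV/um)
print(f"y (MCL) = {y_mcl:.2f} keV/um, y (tracked) = {y_track:.2f} keV/um")
```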

Keywords: particle tracking, ion therapy, low gain avalanche diode, tissue equivalent proportional counter, microdosimetry

Procedia PDF Downloads 166
1579 A Comparative Study of Cognitive Factors Affecting Social Distancing among Vaccinated and Unvaccinated Filipinos

Authors: Emmanuel Carlo Belara, Albert John Dela Merced, Mark Anthony Dominguez, Diomari Erasga, Jerome Ferrer, Bernard Ombrog

Abstract:

Social distancing errors are commonly prevalent among both vaccinated and unvaccinated people in the Filipino community. This study aims to identify the cognitive factors involved and to relate how they affect daily life. The observed factors include memory, attention, anxiety, decision-making, and stress. Upon applying ergonomic tools and statistical treatments such as the t-test and multiple linear regression, stress and attention turned out to have the greatest impact on social distancing errors.
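
A minimal sketch of the statistical treatment named above follows, on synthetic survey data: a t-test between vaccinated and unvaccinated groups and a multiple linear regression of social distancing errors on the five cognitive factors.

```python
# Sketch on hypothetical survey data, not the study's dataset.
import numpy as np
from scipy import stats
import statsmodels.api as sm

rng = np.random.default_rng(7)
n = 120
names = ["memory", "attention", "anxiety", "decision-making", "stress"]
X = rng.normal(size=(n, len(names)))                   # cognitive factor scores
errors = 2*X[:, 4] + 1.5*X[:, 1] + rng.normal(size=n)  # synthetic error counts

# Group comparison, as in the study's t-test
vaccinated = rng.integers(0, 2, n).astype(bool)
t, p = stats.ttest_ind(errors[vaccinated], errors[~vaccinated])
print(f"vaccinated vs unvaccinated: t = {t:.2f}, p = {p:.3f}")

# Multiple linear regression of errors on the five factors
fit = sm.OLS(errors, sm.add_constant(X)).fit()
for name, coef in zip(names, fit.params[1:]):
    print(f"{name:16s} beta = {coef:+.2f}")   # largest betas dominate
```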

Keywords: vaccinated, unvaccinated, social distancing, Filipinos

Procedia PDF Downloads 197
1578 Towards End-To-End Disease Prediction from Raw Metagenomic Data

Authors: Maxence Queyrel, Edi Prifti, Alexandre Templier, Jean-Daniel Zucker

Abstract:

Analysis of the human microbiome using metagenomic sequencing data has demonstrated a high ability to discriminate various human diseases. Raw metagenomic sequencing data require multiple complex and computationally heavy bioinformatics steps prior to data analysis. Such data contain millions of short sequences read from the fragmented DNA sequences and stored as fastq files. Conventional processing pipelines consist of multiple steps, including quality control, filtering, and alignment of sequences against genomic catalogs (genes, species, taxonomic levels, functional pathways, etc.). These pipelines are complex to use and time-consuming, and they rely on a large number of parameters that often introduce variability and impact the estimation of the microbiome elements. Training deep neural networks directly on raw sequencing data is a promising approach to bypass some of the challenges associated with mainstream bioinformatics pipelines. Most of these methods use the concept of word and sentence embeddings to create a meaningful and numerical representation of DNA sequences, while extracting features and reducing the dimensionality of the data. In this paper, we present an end-to-end approach, metagenome2vec, that classifies patients into disease groups directly from raw metagenomic reads. This approach is composed of four steps: (i) generating a vocabulary of k-mers and learning their numerical embeddings; (ii) learning DNA sequence (read) embeddings; (iii) identifying the genome from which the sequence is most likely to come; and (iv) training a multiple instance learning classifier which predicts the phenotype based on the vector representation of the raw data. An attention mechanism is applied in the network so that the model can be interpreted, assigning a weight to the influence of each genome on the prediction. Using two public real-life datasets as well as a simulated one, we demonstrate that this original approach reaches high performance, comparable with state-of-the-art methods applied on data processed through mainstream bioinformatics workflows. These results are encouraging for this proof-of-concept work. We believe that, with further dedication, DNN models have the potential to surpass mainstream bioinformatics workflows in disease classification tasks.
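
Steps (i)-(ii) can be sketched with off-the-shelf tools as below: reads are tokenized into k-mers, k-mer embeddings are learned with gensim's Word2Vec, and a read embedding is taken as the mean of its k-mer vectors. The reads and hyperparameters are toy values, and the genome-identification and MIL classifier stages are omitted.

```python
# Minimal sketch of steps (i)-(ii) of metagenome2vec as described above.
from gensim.models import Word2Vec
import numpy as np

reads = ["ATGCGTACGTTAGC", "GGCATCGTACGATT"]      # toy DNA reads
k = 4
tokenized = [[r[i:i+k] for i in range(len(r)-k+1)] for r in reads]

model = Word2Vec(tokenized, vector_size=16, window=5, min_count=1, epochs=50)

def read_embedding(read):
    """Average the k-mer embeddings of one read into a read-level vector."""
    kmers = [read[i:i+k] for i in range(len(read)-k+1)]
    return np.mean([model.wv[km] for km in kmers], axis=0)

print(read_embedding(reads[0]).shape)             # (16,) read-level vector
```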

Keywords: deep learning, disease prediction, end-to-end machine learning, metagenomics, multiple instance learning, precision medicine

Procedia PDF Downloads 121
1577 Introducing a Practical Model for Instructional System Design Based on Determining of the knowledge Level of the Organization: Case Study of Isfahan Public Transportation Co.

Authors: Mojtaba Aghajari, Alireza Aghasi

Abstract:

The first challenge faced by the current research was the identification or determination of the level of knowledge in the Isfahan public transportation corporation, and the second challenge was the recognition and choice of a proper approach for instructional system design. Responding to these two challenges yields an appropriate model of instructional system design. In order to respond to the first challenge, the Nonaka and Takeuchi KM model was utilized, due to its universality among the 26 models proposed so far. The statistical population of this research included 2,200 people, among which 200 persons were chosen as the research sample by the use of Morgan's method. Data gathering was carried out by means of a questionnaire based on the Nonaka and Takeuchi KM model, and the analysis was done with the SPSS program. The output of this questionnaire, yielding a score of 1.96 (out of 5 points), revealed that the general condition of the Isfahan public transportation corporation is weak with regard to being knowledge-centered. After placing this output on Jonassen's continuum, it was revealed that the appropriate approach for instructional system design is the systems (or behavioral) approach. Accordingly, the different steps of the general ADDIE model, which covers all of the ISO 10015 standards, were adopted in the design. This process in the Isfahan public transportation corporation was designed and divided into three main steps: instructional design and planning, instructional course planning, and determination of the evaluation and effectiveness of the instructional courses.

Keywords: instructional system design, system approach, knowledge management, employees

Procedia PDF Downloads 318
1576 Propagation of Ultra-High Energy Cosmic Rays through Extragalactic Magnetic Fields: An Exploratory Study of the Distance Amplification from Rectilinear Propagation

Authors: Rubens P. Costa, Marcelo A. Leigui de Oliveira

Abstract:

The comprehension of the features of the energy spectra, the chemical compositions, and the origins of ultra-high energy cosmic rays (UHECRs) - mainly atomic nuclei with energies above ~1.0 EeV (exa-electron volts) - is intrinsically linked to the problem of determining the magnitude of their deflections in cosmic magnetic fields on cosmological scales. In addition, as they propagate from the source to the observer, modifications are expected in their original energy spectra, anisotropy, and chemical compositions due to interactions with low-energy photons and matter. This means that any consistent interpretation of the nature and origin of UHECRs has to include detailed knowledge of their propagation in a three-dimensional environment, taking into account the magnetic deflections and energy losses. The parameter space for the magnetic fields in the universe is very large, because the field strengths and especially their orientations carry big uncertainties. In particular, the strength and morphology of the extragalactic magnetic fields (EGMFs) remain largely unknown, because of the intrinsic difficulty of observing them. Monte Carlo simulation of charged particles traveling through a simulated magnetized universe is the straightforward way to study the influence of extragalactic magnetic fields on UHECR propagation. However, this brings two major difficulties: an accurate numerical modeling of charged-particle diffusion in magnetic fields, and an accurate numerical modeling of the magnetized universe. Since magnetic fields do not cause energy losses, it is important to require that the particle-tracking method conserve the particle's total energy and that energy changes result only from interactions with background photons. Hence, special attention should be paid to computational effects. Additionally, because of the number of particles necessary to obtain a relevant statistical sample, the particle-tracking method must be computationally efficient. In this work, we present an analysis of the propagation of ultra-high energy charged particles in the intergalactic medium. The EGMFs are considered to be coherent within cells of 1 Mpc (megaparsec) diameter, wherein they have uniform intensities of 1 nG (nanogauss). Moreover, each cell has its field orientation chosen randomly, and a border region is defined such that, at distances beyond 95% of the cell radius from the cell center, smooth transitions are applied in order to avoid discontinuities. The smooth transitions are simulated by weighting the magnetic field orientation by the particle's distance to the two nearby cells. Energy losses are treated in the continuous approximation, parameterizing the mean energy loss per unit path length by the energy loss length. We show, for a particle with a typical energy of interest, the performance of the integration method in terms of the relative error of the Larmor radius (without energy losses) and the relative error of the energy. Additionally, we plot the distance amplification from rectilinear propagation as a function of the traveled distance, of the particle's magnetic rigidity (without energy losses), and of the particle's energy (with energy losses), to study the influence of the particle species on these calculations. The results clearly show when it is necessary to use a full three-dimensional simulation.
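
The cell-based propagation scheme can be caricatured in a few lines, as below: a 10 EeV proton is stepped through 1 Mpc cells of randomly oriented 1 nG fields with a pure direction rotation per step (hence exact energy conservation), and the distance amplification is the ratio of path length to displacement. The step size, the Larmor-radius rule of thumb, and the per-cell seeding are simplifying assumptions; the border smoothing and energy losses are omitted.

```python
# Toy version of the propagation scheme described above. Each step rotates the
# direction vector around the local field axis (Rodrigues formula), so the
# particle energy is conserved exactly.
import numpy as np

E_EeV, B_nG, Z = 10.0, 1.0, 1
r_larmor = 1.08 * E_EeV / (Z * B_nG)       # Larmor radius in Mpc (rule of thumb)
step, n_steps, cell = 0.05, 2000, 1.0      # Mpc

def field_dir(pos):
    """Random but reproducible field orientation for the cell containing pos."""
    key = tuple(int(c) for c in np.floor(pos / cell))
    v = np.random.default_rng(abs(hash(key))).normal(size=3)
    return v / np.linalg.norm(v)

pos, n = np.zeros(3), np.array([1.0, 0.0, 0.0])
for _ in range(n_steps):
    b = field_dir(pos)
    th = step / r_larmor                   # gyration angle over one step
    n = (n*np.cos(th) + np.cross(b, n)*np.sin(th)
         + b*np.dot(b, n)*(1 - np.cos(th)))
    pos += n * step

print(f"distance amplification: {n_steps * step / np.linalg.norm(pos):.3f}")
```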

Keywords: cosmic rays propagation, extragalactic magnetic fields, magnetic deflections, ultra-high energy

Procedia PDF Downloads 125
1575 Machine Learning Strategies for Data Extraction from Unstructured Documents in Financial Services

Authors: Delphine Vendryes, Dushyanth Sekhar, Baojia Tong, Matthew Theisen, Chester Curme

Abstract:

Much of the data that inform the decisions of governments, corporations and individuals are harvested from unstructured documents. Data extraction is defined here as a process that turns non-machine-readable information into a machine-readable format that can be stored, for instance, in a database. In financial services, introducing more automation in data extraction pipelines is a major challenge. Information sought by financial data consumers is often buried within vast bodies of unstructured documents, which have historically required thorough manual extraction. Automated solutions provide faster access to non-machine-readable datasets, in a context where untimely information quickly becomes irrelevant. Data quality standards cannot be compromised, so automation requires high data integrity. This multifaceted task is broken down into smaller steps: ingestion, table parsing (detection and structure recognition), text analysis (entity detection and disambiguation), schema-based record extraction, user feedback incorporation. Selected intermediary steps are phrased as machine learning problems. Solutions leveraging cutting-edge approaches from the fields of computer vision (e.g. table detection) and natural language processing (e.g. entity detection and disambiguation) are proposed.
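
As an illustration of the entity-detection step, the sketch below runs an off-the-shelf spaCy pipeline over a toy sentence; this stands in for, and is not, the authors' production system.

```python
# Illustrative entity detection with a stock spaCy model.
# Requires: python -m spacy download en_core_web_sm
import spacy

nlp = spacy.load("en_core_web_sm")
doc = nlp("Acme Corp reported revenue of $4.2 billion for Q3 2021.")

for ent in doc.ents:
    print(ent.text, ent.label_)   # e.g. 'Acme Corp' ORG, '$4.2 billion' MONEY
```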

Keywords: computer vision, entity recognition, finance, information retrieval, machine learning, natural language processing

Procedia PDF Downloads 104
1574 An Infrared Inorganic Scintillating Detector Applied in Radiation Therapy

Authors: Sree Bash Chandra Debnath, Didier Tonneau, Carole Fauquet, Agnes Tallet, Julien Darreon

Abstract:

Purpose: Inorganic scintillating dosimetry is the most recent promising technique to solve several dosimetric issues and provide quality assurance in radiation therapy. Despite several advantages, the major issue in using scintillating detectors is the Cerenkov effect, typically induced in the visible emission range. In this context, the purpose of this research work is to evaluate the performance of a novel infrared inorganic scintillator detector (IR-ISD) in radiation therapy treatment, to ensure a Cerenkov-free signal and the best match between the delivered and prescribed doses during treatment. Methods: A simple and small-scale infrared inorganic scintillating detector of 100 µm diameter with a sensitive scintillating volume of 2x10⁻⁶ mm³ was developed. A prototype of the dose verification system was introduced, based on the PTIR1470/F material (provided by Phosphor Technology®) used in the proposed novel IR-ISD. The detector was tested on an Elekta LINAC system tuned at 6 MV/15 MV and on a brachytherapy source (Ir-192) used in the patient treatment protocol. The associated dose rate was measured as a count rate (photons/s) using a highly sensitive photon counter (sensitivity ~20 ph/s). Overall measurements were performed in IBA™ water tank phantoms, following the international Technical Reports Series recommendations (TRS 381) for radiotherapy and the TG-43U1 recommendations for brachytherapy. The performance of the detector was tested through several dosimetric parameters: PDD, beam profiling, Cerenkov measurement, dose linearity, dose rate linearity, repeatability, and scintillator stability. Finally, a comparative study is also shown using a reference microdiamond dosimeter, Monte Carlo (MC) simulation, and data from recent literature. Results: This study highlights the complete removal of the Cerenkov effect, especially for small-field radiation beam characterization. The detector provides a fully linear response with dose in the 4 cGy to 800 cGy range, independently of the field size selected, from 5 x 5 cm² down to 0.5 x 0.5 cm². Perfect repeatability (0.2% variation from average) with day-to-day reproducibility (0.3% variation) was observed. Measurements demonstrated that the ISD has superlinear behavior with dose rate (R²=1) varying from 50 cGy/s to 1000 cGy/s. PDD profiles obtained in water present identical behavior, with a build-up maximum depth dose at 15 mm for different small-field irradiations. Field profiles as small as 0.5 x 0.5 cm² were characterized, and the field cross-profile presents a Gaussian-like shape. The standard deviation (1σ) of the scintillating signal remains within 0.02%, while having a very low convolution effect thanks to the small sensitive volume. Finally, during brachytherapy, a comparison with MC simulations shows that, considering energy dependency, measurements agree within 0.8% down to a 0.2 cm source-to-detector distance. Conclusion: The proposed scintillating detector shows no Cerenkov radiation and efficient performance for several radiation therapy measurement parameters. Therefore, it is anticipated that the IR-ISD system can be promoted for validation in direct clinical investigations, such as appropriate dose verification and quality control in the treatment planning system (TPS).
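
The dose-linearity check reported above amounts to a straight-line fit of count rate against delivered dose, as in the sketch below on synthetic counts (the measured data are not reproduced here).

```python
# Sketch of a dose-linearity check: fit photon counts vs. delivered dose
# over the 4-800 cGy range and inspect R^2. Counts are synthetic.
import numpy as np

dose_cGy = np.linspace(4, 800, 12)
counts = 150 * dose_cGy + np.random.default_rng(5).normal(0, 200, 12)

slope, intercept = np.polyfit(dose_cGy, counts, 1)
pred = slope * dose_cGy + intercept
r2 = 1 - np.sum((counts - pred)**2) / np.sum((counts - counts.mean())**2)
print(f"slope = {slope:.1f} counts/cGy, R^2 = {r2:.4f}")
```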

Keywords: IR-Scintillating detector, dose measurement, micro-scintillators, Cerenkov effect

Procedia PDF Downloads 180
1573 Steps of the Pancreatic Differentiation in the Grass Snake (Natrix natrix) Embryos

Authors: Magdalena Kowalska, Weronika Rupik

Abstract:

The pancreas is an important organ present in all vertebrate species. It contains two different tissues, exocrine and endocrine, which act as two glands in one. The development and differentiation of the pancreas in reptiles are poorly known in comparison to other vertebrates. Therefore, the aim of this study was to investigate the particular steps of pancreas differentiation in grass snake (Natrix natrix) embryos. For this, histological methods (including hematoxylin and eosin and Heidenhain's AZAN staining), transmission electron microscopy, and three-dimensional (3D) reconstructions from serial paraffin sections were used. The results of this study indicated that the first step of pancreas development in Natrix was the connection of the two pancreatic buds, dorsal and ventral. Then, the duct walls in both buds started to be remodeled from multilayered to single-layered epithelium. This remodeling started in the dorsal bud and was simultaneous with the differentiation of the duct lumens, which occurred by cavitation. During this process, the cells that had no contact with the mesenchyme underwent a form of cell death named anoikis. These findings indicated that the duct walls in the embryonic pancreas of the grass snake were initially formed by abundant principal cells and single endocrine cells. Later, the basal and goblet cells differentiated. Among the endocrine cells, the B and A cells differentiated first, then the D and PP cells. The next step of pancreatic development was the withdrawal of the endocrine cells from the duct walls to form the pancreatic islets. The endocrine cells and islets were found only in the dorsal part of the pancreas in Natrix embryos, which is different from other vertebrate species. The islets were formed mainly by the A cells. Simultaneously with the differentiation of the endocrine pancreas, the acinar tissue started to differentiate. The source of the acinar cells was the pancreatic ducts, as in other vertebrates. Acinus formation began at the proximal part of the pancreas and proceeded in the caudal direction. The differentiating pancreatic ducts developed into a branched system that can be divided into extralobular, intralobular, and intercalated ducts, similarly to other vertebrate species. However, the pattern of branching was different. In conclusion, the particular steps of pancreas differentiation in the grass snake were different from those in other vertebrates. It can be supposed that these differences are related to the specific topography of the snake's internal organs and their taxonomic position. All specimens used in the study were captured according to the Polish regulations concerning the protection of wild species. Permission was granted by the Local Ethics Commission in Katowice (41/2010; 87/2015) and the Regional Directorate for Environmental Protection in Katowice (WPN.6401.257.2015.DC).

Keywords: embryogenesis, organogenesis, pancreas, Squamata

Procedia PDF Downloads 167
1572 Attitude-Behavior Consistency: A Descriptive Study in the Context of Climate Change and Acceptance of Psychological Findings by the Public

Authors: Nita Mitra, Pranab Chanda

Abstract:

In this paper, the issue of attitude-behavior consistency is addressed in the context of climate change. Scientists (about 98 percent) opine that human behavior plays a significant role in climate change. Such climate changes are harmful to human life. Thus, it is natural to conclude that only a change in human behavior can avoid harmful consequences. Government and non-government organizations are taking steps to bring in the desired changes in behavior. However, it seems that although these efforts are achieving changes in attitudes to some degree, they are failing to produce the corresponding behavioral changes. This has been a great concern for environmentalists. Psychologists have noticed the problem as a particular case of the general psychological problem of making attitude and behavior consistent with each other. The present study is a continuation of previous work by the same author, based upon descriptive research on the status of attitudes and behavior regarding climate change among the people of a foothill region of the Himalayas in India. The observations confirm the mismatch between the attitudes and behavior of the people of the region with respect to climate change. In the process, an attitude-behavior mismatch has also been noticed with respect to the acceptance of psychological findings by the public: people have been found to be interested in psychology as an important subject, but reluctant to take the observations of psychologists seriously. A comparative study in this regard has been made with similar studies done elsewhere. Finally, an attempt has been made to interpret the observations within the frameworks of observational learning due to Bandura and behavior change due to Lewin.

Keywords: acceptance of psychological variables, attitude-behavior consistency, behavior change, climate change, observational learning

Procedia PDF Downloads 151
1571 A Study on the Treatment of Municipal Waste Water Using Sequencing Batch Reactor

Authors: Bhaven N. Tandel, Athira Rajeev

Abstract:

The sequencing batch reactor process is a suspended growth process operating under non-steady-state conditions, which utilizes a fill-and-draw reactor with complete mixing during the batch reaction step (after filling) and in which the subsequent steps of aeration and clarification occur in the same tank. All sequencing batch reactor systems have five steps in common, carried out in the following sequence: (1) fill, (2) react, (3) settle (sedimentation/clarification), (4) draw (decant), and (5) idle. The study was carried out in a sequencing batch reactor of dimensions 44 cm x 30 cm x 70 cm with a working volume of 40 L. A mechanical stirrer at 100 rpm was used to provide continuous mixing during the react period, and oxygen was supplied by fish tank aerators. The duration of a complete cycle of the sequencing batch reactor was 8 hours. The cycle period was divided into the following phases in sequence: 0.25 hours fill phase, 6 hours react period, 1 hour settling phase, 0.5 hours decant period, and 0.25 hours idle phase. The study consisted of two runs, run 1 and run 2. Run 1 consisted of a 6-hour aerobic react period, and run 2 consisted of a 3-hour aerobic react period followed by a 3-hour anoxic react period. The influent wastewater used for the study had COD, BOD, NH3-N, and TKN concentrations of 308.03±48.94 mg/L, 100.36±22.05 mg/L, 14.12±1.18 mg/L, and 24.72±2.21 mg/L, respectively. Run 1 had an average COD removal efficiency of 41.28%, BOD removal efficiency of 56.25%, NH3-N removal efficiency of 86.19%, and TKN removal efficiency of 54.4%. Run 2 had an average COD removal efficiency of 63.19%, BOD removal efficiency of 73.85%, NH3-N removal efficiency of 90.74%, and TKN removal efficiency of 65.25%. It was observed that run 2 gave better performance than run 1 in the removal of COD, BOD, and TKN.
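
The removal efficiencies quoted above follow from the usual influent/effluent relation, sketched below; the effluent concentrations are back-calculated from the run 2 percentages, not measured values.

```python
# Sketch of the removal-efficiency calculation behind the reported figures:
# removal (%) = 100 * (C_in - C_out) / C_in. Effluent values are assumed,
# back-calculated to reproduce the run 2 percentages.
influent = {"COD": 308.03, "BOD": 100.36, "NH3-N": 14.12, "TKN": 24.72}  # mg/L
effluent_run2 = {"COD": 113.4, "BOD": 26.2, "NH3-N": 1.31, "TKN": 8.59}  # mg/L

for param, c_in in influent.items():
    removal = 100 * (c_in - effluent_run2[param]) / c_in
    print(f"{param}: {removal:.1f}% removal")
```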

Keywords: municipal waste water, aerobic, anoxic, sequencing batch reactor

Procedia PDF Downloads 542
1570 Physical Property Characterization of Adult Dairy Nutritional Products for Powder Reconstitution

Authors: Wei Wang, Martin Chen

Abstract:

The reconstitution behaviour of nutritional products can impact user experience. Reconstitution issues such as lump formation and white flecks sticking to bottle surfaces can be very unappealing to consumers during milk preparation. As described in the literature, the controlling steps in dissolving instant milk powders include wetting, swelling, sinking, dispersing, and dissolution. Each stage happens simultaneously with the others during milk preparation, and it is challenging to isolate and measure each step individually. This study characterized three adult nutritional products for different properties, including particle size, density, dispersibility, stickiness, and capillary wetting, to understand the relationship between the powders' physical properties and their reconstitution behaviour. From the results, the formation of clumps can be caused by different factors limiting the critical steps of powder reconstitution: a small particle size distribution or a light particle density limiting powder wetting, or the rapid swelling and dissolving of particle surface materials, which impedes water penetration in the capillary channels formed by powder agglomerates. The formation of grains or white flecks in milk preparation was believed to be controlled by the dissolution speed of the particles after dispersion into water. By understanding the relationship between fundamental powder structure and user experience in reconstitution, this information provides new and multiple perspectives on how to improve powder characteristics in commercial manufacturing.

Keywords: characterization, dairy nutritional powder, physical property, reconstitution

Procedia PDF Downloads 100
1569 Economic Analysis of a Carbon Abatement Technology

Authors: Hameed Rukayat Opeyemi, Pericles Pilidis Pagone Emmanuele, Agbadede Roupa, Allison Isaiah

Abstract:

Climate change represents one of the single most challenging problems facing the world today. According to the National Oceanic and Atmospheric Administration, atmospheric temperature has risen almost 25% since 1958, Arctic sea ice has shrunk 40% since 1959, and global sea levels have risen more than 5.5 cm since 1990. Power plants are the major culprits of GHG emission to the atmosphere. Several technologies have been proposed to reduce the amount of GHG emitted to the atmosphere from power plants, one of which is the less researched advanced zero-emission power plant. Advanced zero-emission power plants make use of a mixed conductive membrane (MCM) reactor, also known as an oxygen transfer membrane (OTM), for oxygen transfer. The MCM employs a membrane separation process. The membrane separation process was first introduced in 1899, when Walter Hermann Nernst investigated electric current between metals and solutions; he found that when a dense ceramic is heated, a current of oxygen molecules moves through it. In the bid to curb the amount of GHG emitted to the atmosphere, the membrane separation process was applied to the field of power engineering in the low-carbon cycle known as the advanced zero-emission power plant (AZEP) cycle. The AZEP cycle was originally invented by Norsk Hydro, Norway, and ABB Alstom Power (now known as Demag Delaval Industrial Turbomachinery AB), Sweden. The AZEP drew a lot of attention because of its ability to capture ~100% CO2; it also boasts about 30-50% cost reduction compared to other carbon abatement technologies, its efficiency penalty is not as large as those of its counterparts, and it offers almost zero NOx emissions due to very low nitrogen concentrations in the working fluid. Advanced zero-emission power plants differ from a conventional gas turbine in that the combustor is substituted with the mixed conductive membrane reactor (MCM-reactor). The MCM-reactor is made up of the combustor, the low-temperature heat exchanger (LTHX, referred to by some authors as the air preheater), the mixed conductive membrane responsible for oxygen transfer, the high-temperature heat exchanger and, in some layouts, the bleed gas heat exchanger. Air is taken in by the compressor and compressed to a temperature of about 723 K and a pressure of 2 MPa. The membrane area needed for oxygen transfer is reduced by increasing the temperature of 90% of the air using the LTHX; the temperature is also increased to facilitate oxygen transfer through the membrane. The air stream enters the LTHX through the transition duct leading to the inlet of the LTHX. The temperature of the air stream is then increased to about 1150 K, depending on the design point specification of the plant and the efficiency of the heat exchanging system. The amount of oxygen transported through the membrane is directly proportional to the temperature of the air going through the membrane. The AZEP cycle was developed using Fortran software, and the economic analysis was conducted using Excel and Matlab, followed by an optimization case study. Four possible layouts of the AZEP cycle are considered: the simple bleed gas heat exchange layout (100% CO2 capture), the bleed gas heat exchanger layout with flue gas turbine (100% CO2 capture), the pre-expansion reheating layout (sequential burning layout) - AZEP 85% (85% CO2 capture), and the pre-expansion reheating layout (sequential burning layout) with flue gas turbine - AZEP 85% (85% CO2 capture). This paper discusses a Monte Carlo risk analysis of these four layouts.

Keywords: gas turbine, global warming, greenhouse gas, fossil fuel power plants

Procedia PDF Downloads 392
1568 Removal of Bulk Parameters and Chromophoric Fractions of Natural Organic Matter by Porous Kaolin/Fly Ash Ceramic Membrane at South African Drinking Water Treatment Plants

Authors: Samkeliso S. Ndzimandze, Welldone Moyo, Oranso T. Mahlangu, Adolph A. Muleja, Alex T. Kuvarega, Thabo T. I. Nkambule

Abstract:

The high cost of precursor materials has hindered the commercialization of ceramic membrane technology in water treatment. In this work, a ceramic membrane disc (approximately 50 mm in diameter and 4 mm thick) was prepared from low-cost starting materials, kaolin and fly ash, by pressing at 200 bar and calcining at 900 °C. The fabricated membrane was characterized for various physicochemical properties, natural organic matter (NOM) removal, and fouling propensity, using several techniques. Further, the ceramic membrane was tested on samples collected from four drinking water treatment plants in KwaZulu-Natal, South Africa (named plants 1-4). The membrane achieved 48.6%, 54.6%, 57.4%, and 76.4% bulk UV254 reduction for raw water at plants 1, 2, 3, and 4, respectively. These removal rates were comparable to the UV254 reduction achieved by the coagulation/flocculation steps at the respective plants. Further, the membrane outperformed the sand filtration steps in plants 1-4 in removing disinfection by-product precursors (8%-32%) through size exclusion. Fluorescence excitation-emission matrix (FEEM) studies showed the removal by the membrane of the fluorescent NOM fractions present in the water samples. The membrane was fabricated using an up-scalable facile method, and it has the potential for application as a polishing step to complement conventional processes in drinking water treatment.

Keywords: crossflow filtration, drinking water treatment plants, fluorescence excitation-emission matrices, ultraviolet 254 (UV₂₅₄)

Procedia PDF Downloads 40
1567 Ways of Innovative Sustainable Agriculture in India

Authors: Shailja Thakur

Abstract:

In this paper, it is shown how farmers are suffering from all sides, including the vagaries of weather, price fluctuations, demand-supply constraints, and poor soil health. It is also shown that ICT can prove to be of great help if incorporated rightly into Indian agriculture. Some innovative ways of rewarding farmers and of distributing subsidies to them can improve the current scenario.

Keywords: cost of farming, information and communication technology, innovative steps, roof gardening, vermicomposting

Procedia PDF Downloads 301
1566 Monte Carlo Risk Analysis of a Carbon Abatement Technology

Authors: Hameed Rukayat Opeyemi, Pericles Pilidis, Pagone Emanuele

Abstract:

Climate change represents one of the single most challenging problems facing the world today. According to the National Oceanic and Administrative Association, Atmospheric temperature rose almost 25% since 1958, Artic sea ice has shrunk 40% since 1959 and global sea levels have risen more than 5.5 cm since 1990. Power plants are the major culprits of GHG emission to the atmosphere. Several technologies have been proposed to reduce the amount of GHG emitted to the atmosphere from power plant, one of which is the less researched Advanced zero emission power plant. The advanced zero emission power plants make use of mixed conductive membrane (MCM) reactor also known as oxygen transfer membrane (OTM) for oxygen transfer. The MCM employs membrane separation process. The membrane separation process was first introduced in 1899 when Walter Hermann Nernst investigated electric current between metals and solutions. He found that when a dense ceramic is heated, current of oxygen molecules move through it. In the bid to curb the amount of GHG emitted to the atmosphere, the membrane separation process was applied to the field of power engineering in the low carbon cycle known as the Advanced zero emission power plant (AZEP cycle). The AZEP cycle was originally invented by Norsk Hydro, Norway and ABB Alstom power (now known as Demag Delaval Industrial turbo machinery AB), Sweden. The AZEP drew a lot of attention because its ability to capture ~100% CO2 and also boasts of about 30-50 % cost reduction compared to other carbon abatement technologies, the penalty in efficiency is also not as much as its counterparts and crowns it with almost zero NOx emissions due to very low nitrogen concentrations in the working fluid. The advanced zero emission power plants differ from a conventional gas turbine in the sense that its combustor is substituted with the mixed conductive membrane (MCM-reactor). The MCM-reactor is made up of the combustor, low temperature heat exchanger LTHX (referred to by some authors as air pre-heater the mixed conductive membrane responsible for oxygen transfer and the high temperature heat exchanger and in some layouts, the bleed gas heat exchanger. Air is taken in by the compressor and compressed to a temperature of about 723 Kelvin and pressure of 2 Mega-Pascals. The membrane area needed for oxygen transfer is reduced by increasing the temperature of 90% of the air using the LTHX; the temperature is also increased to facilitate oxygen transfer through the membrane. The air stream enters the LTHX through the transition duct leading to inlet of the LTHX. The temperature of the air stream is then increased to about 1150 K depending on the design point specification of the plant and the efficiency of the heat exchanging system. The amount of oxygen transported through the membrane is directly proportional to the temperature of air going through the membrane. The AZEP cycle was developed using the Fortran software and economic analysis was conducted using excel and Matlab followed by optimization case study. This paper discusses techno-economic analysis of four possible layouts of the AZEP cycle. The Simple bleed gas heat exchange layout (100 % CO2 capture), Bleed gas heat exchanger layout with flue gas turbine (100 % CO2 capture), Pre-expansion reheating layout (Sequential burning layout) – AZEP 85 % (85 % CO2 capture) and Pre-expansion reheating layout (Sequential burning layout) with flue gas turbine– AZEP 85 % (85 % CO2 capture). 
The paper further presents a Monte Carlo risk analysis of these four layouts.
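
The abstract does not report the input distributions or cost figures used in the risk analysis; the following minimal Python sketch only illustrates the general shape of such a Monte Carlo techno-economic study, with hypothetical distributions for capital cost, fuel price, and net efficiency (none of these numbers come from the paper).

```python
import numpy as np

rng = np.random.default_rng(0)
N = 10_000  # Monte Carlo iterations

# Hypothetical, illustrative input distributions -- not the paper's values.
capex_musd = rng.triangular(900, 1000, 1300, N)   # capital cost, million USD
fuel_usd_gj = rng.triangular(4.0, 6.0, 9.0, N)    # fuel price, USD/GJ
efficiency = rng.normal(0.47, 0.02, N)            # net efficiency after capture

mwh_per_year = 400 * 8000                  # assumed 400 MW plant, 8000 h/yr
fuel_gj = mwh_per_year * 3.6 / efficiency  # fuel energy input, GJ/yr
crf = 0.08 * 1.08**25 / (1.08**25 - 1)     # capital recovery factor: 8%, 25 yr

# Levelized cost of electricity for each sampled scenario, USD/MWh.
lcoe = (capex_musd * 1e6 * crf + fuel_usd_gj * fuel_gj) / mwh_per_year
p5, p50, p95 = np.percentile(lcoe, [5, 50, 95])
print(f"LCOE USD/MWh -- P5: {p5:.1f}, median: {p50:.1f}, P95: {p95:.1f}")
```

Running the same sampling loop once per layout, with layout-specific capital costs and efficiencies, yields the comparative risk profiles the paper describes.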

Keywords: gas turbine, global warming, greenhouse gases, power plants

Procedia PDF Downloads 467
1565 The Use of Non-Parametric Bootstrap in Computing Microbial Risk from the Consumption of Lettuce Irrigated with Water Contaminated by Sanitary Sewage in Infulene Valley

Authors: Mario Tauzene Afonso Matangue, Ivan Andres Sanchez Ortiz

Abstract:

The metropolitan area of Maputo (Mozambique's capital city) is located in a semi-arid zone (800 mm annual rainfall) and has about 1,101,170 inhabitants. On its west side lie the flatlands of Infulene, where the Mulauze River flows towards the Indian Ocean, receiving at this site the storm water, contaminated with sanitary sewage from Maputo, transported through an open concrete channel. In Infulene, local communities grow salad crops such as tomato, onion, garlic, lettuce, and cabbage, which are then commercialized and consumed in several markets in Maputo City. Lettuce is the most frequently consumed salad crop, eaten daily in different meals, generally in fast foods, breakfasts, lunches, and dinners. However, the risk of infection by several pathogens due to the consumption of this lettuce, assessed using Quantitative Microbial Risk Assessment (QMRA) tools, is still unknown, since there are few studies or publications concerning this matter in Mozambique. This work is aimed at determining the annual risk arising from the consumption of lettuce grown in the Infulene valley, in Maputo, using QMRA tools. The exposure model was constructed upon the volume of contaminated water remaining on the lettuce leaves, the empirical relations between the numbers of pathogens and of the indicator microorganism (E. coli), lettuce consumption (g), and pathogen reduction between harvest and consumption (days). The reference pathogens were Vibrio cholerae, Cryptosporidium, norovirus, and Ascaris. The water quality samples (E. coli) were collected in the storm water channel from January 2016 to December 2018, comprising 65 samples, and the urban lettuce consumption data were collected through a survey of 350 persons in the Maputo metropolis. A non-parametric bootstrap was performed, involving 10,000 iterations over the collected datasets, namely water quality (E. coli) and lettuce consumption. The dose-response models were: exponential for Cryptosporidium, the Kummer confluent hypergeometric function (1F1) for Vibrio cholerae and Ascaris, and the Gaussian hypergeometric function (2F1(a,b;c;z)) for norovirus. The annual infection risk estimates were computed in R 3.6.0 (R Core Team) by Monte Carlo simulation with Latin hypercube sampling, involving 10,000 iterations. The annual infection risks, expressed as the median and the 95th percentile per person per year (pppy), arising from the consumption of lettuce are as follows: Vibrio cholerae (1.00, 1.00), Cryptosporidium (3.91x10⁻³, 9.72x10⁻³), norovirus (5.22x10⁻¹, 9.99x10⁻¹) and Ascaris (2.59x10⁻¹, 9.65x10⁻¹). Thus, the consumption of the lettuce would result in risks greater than the tolerable levels (< 10⁻³ pppy or 10⁻⁶ DALY) for all pathogens, with Vibrio cholerae the most virulent pathogen according to the single-hit models, followed by Ascaris lumbricoides and norovirus. The sensitivity analysis carried out in this work pointed out that, in the whole QMRA, the most important input variable was the reduction of pathogens between harvest and consumption (Spearman rank value 0.69), followed by water quality (Spearman rank value 0.69). The decision-makers (the Mozambique Government) must strengthen prevention measures related to pathogen reduction on lettuce (e.g., washing) and invest in wastewater treatment engineering.
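
As a rough illustration of the method, the Python sketch below combines a non-parametric bootstrap with an exponential dose-response model. The E. coli counts, consumption data, dose-response parameter r, pathogen-to-indicator ratio, and water-retention factor are all hypothetical stand-ins, since the abstract reports only summary results.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical stand-ins for the study's raw data: 65 E. coli water samples
# (CFU/100 mL) and 350 lettuce-consumption responses (g/day).
ecoli_cfu_100ml = rng.lognormal(mean=12.0, sigma=1.5, size=65)
lettuce_g_day = rng.gamma(shape=2.0, scale=15.0, size=350)

B = 10_000        # bootstrap iterations, as in the study
r = 5.72e-2       # illustrative exponential dose-response parameter
ratio = 1e-5      # assumed pathogen:E. coli ratio (not from the paper)
ml_per_g = 0.108  # assumed water retained on lettuce leaves, mL per g

annual_risk = np.empty(B)
for b in range(B):
    # Resample each dataset with replacement and take the bootstrap means.
    e = rng.choice(ecoli_cfu_100ml, size=65, replace=True).mean()
    c = rng.choice(lettuce_g_day, size=350, replace=True).mean()
    dose = (e / 100.0) * ratio * ml_per_g * c        # organisms ingested/day
    p_daily = 1.0 - np.exp(-r * dose)                # exponential dose-response
    annual_risk[b] = 1.0 - (1.0 - p_daily) ** 365    # annual infection risk

print(f"median: {np.median(annual_risk):.2e}, "
      f"95th percentile: {np.percentile(annual_risk, 95):.2e}")
```

The bootstrap distribution of annual risk can then be compared directly against the < 10⁻³ pppy tolerable level cited above.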

Keywords: annual infections risk, lettuce, non-parametric bootstrapping, quantitative microbial risk assessment tools

Procedia PDF Downloads 116
1564 The Association Between Objectively Measured Physical Activity and Health-related Quality of Life, Life-space Mobility and Successful Aging in Older Indian Adults

Authors: Jeanne Grace, Jacqueline Naiker

Abstract:

Background: Longevity is increasing, accompanied by a rise in disability and chronic diseases, with physical activity (PA) delaying disability and ensuring successful aging (SA) and independent living in older adults. Aim: This study aimed to determine objectively measured PA levels, health-related quality of life (HRQoL), life-space mobility, and successful aging (SA) of older adults in the KwaZulu-Natal province, South Africa, as well as their mutual associations. Methods: A total of 210 older adults aged 65–92 years were purposively sampled and completed the Medical Outcomes Study 36-Item Short-Form Health Survey, the Life-Space Mobility, and Successful Aging questionnaires. PA levels were measured using an Omron pedometer, which the participants wore for seven consecutive days. Results: The average number of steps taken per day over the seven days was 2,025, with 98.6% of the study population classified as sedentary. The vitality domain (one of the eight domains assessed) reflected the best health status (M = 59.9, SD = 18.8), and a significant 93% of the participants indicated that they had not visited places outside their immediate neighborhood (P < 0.0005). A significant negative association between the average number of steps taken over the 7 days and all three SA variables, namely the physical (r = –0.152, P = 0.027), sociological (r = –0.148, P = 0.032) and psychological (r = –0.176, P = 0.010) domains, and a significant positive association with life-space mobility (r = 0.224, P = 0.001) were noted. Conclusion: The majority of the elderly were sedentary, which negatively affected their HRQoL, life-space mobility, and SA.
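
A minimal sketch of the correlation step, using synthetic stand-in data for the 210 participants (the real pedometer counts and questionnaire scores are not available from the abstract):

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(2)

# Hypothetical data: daily step averages (~2,025 steps/day, matching the
# reported mean) and a psychological SA score with a weak negative trend.
steps = rng.gamma(shape=2.0, scale=1000.0, size=210)
psych_sa = 50 - 0.002 * steps + rng.normal(0, 8, size=210)

rho, p = spearmanr(steps, psych_sa)
print(f"Spearman rho = {rho:.3f}, P = {p:.4f}")  # cf. r = -0.176, P = 0.010
```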

Keywords: active life expectancy, geriatrics, nursing homes, well-being

Procedia PDF Downloads 164
1563 Environmental Decision Making Model for Assessing On-Site Performances of Building Subcontractors

Authors: Buket Metin

Abstract:

Buildings impose a variety of loads on the environment through the activities performed at each stage of the building life cycle. Construction is the first stage that affects both the natural and built environments, at different steps of the process: transportation of materials within the construction site, formation and preparation of materials on-site, and application of materials to realize the building subsystems. All of these steps require the use of technology, which varies with the facilities that contractors and subcontractors have. Hence, the environmental consequences of the construction process should be tackled by focusing on the construction technology options used at every step of the process. This paper presents an environmental decision-making model for assessing the on-site performances of subcontractors based on the construction technology options which they can supply. First, construction technologies, which comprise information, tools and methods, are classified. Then, environmental performance criteria are set forth relating to resource consumption, ecosystem quality, and human health issues. Finally, the model is developed based on the relationships between the construction technology components and the environmental performance criteria. The Fuzzy Analytical Hierarchy Process (FAHP) method is used for weighting the environmental performance criteria according to the environmental priorities of the decision-maker(s), while the Technique for Order Preference by Similarity to Ideal Solution (TOPSIS) method is used for ranking the on-site environmental performances of subcontractors using quantitative data related to the construction technology components. Thus, the model aims to give decision-maker(s) insight into the environmental consequences of the construction process and to provide an opportunity to improve the overall environmental performance of construction sites.
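
The abstract does not include the decision matrix or the FAHP-derived weights, so the following Python sketch only illustrates the TOPSIS ranking step with a hypothetical three-subcontractor, three-criterion example in which all criteria are cost-type (lower is better):

```python
import numpy as np

# Hypothetical decision matrix: rows are subcontractors A-C, columns are
# resource consumption, ecosystem impact, and health risk (illustrative units).
X = np.array([
    [120.0, 0.7, 3.0],
    [95.0,  0.9, 2.0],
    [110.0, 0.5, 4.0],
])
w = np.array([0.5, 0.3, 0.2])        # criterion weights, e.g. from FAHP
cost = np.array([True, True, True])  # all criteria are "lower is better"

R = X / np.linalg.norm(X, axis=0)    # vector-normalize each criterion column
V = R * w                            # weighted normalized matrix

ideal = np.where(cost, V.min(axis=0), V.max(axis=0))  # best value per criterion
anti = np.where(cost, V.max(axis=0), V.min(axis=0))   # worst value per criterion

d_best = np.linalg.norm(V - ideal, axis=1)   # distance to the ideal solution
d_worst = np.linalg.norm(V - anti, axis=1)   # distance to the anti-ideal
closeness = d_worst / (d_best + d_worst)     # higher = closer to the ideal

for name, c in zip("ABC", closeness):
    print(f"subcontractor {name}: closeness = {c:.3f}")
```

Subcontractors are then ranked by descending closeness coefficient, exactly the ordering step the model uses to compare on-site environmental performances.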

Keywords: construction process, construction technology, decision making, environmental performance, subcontractor

Procedia PDF Downloads 243
1562 Photographic Documentation of Archaeological Collections in the Grand Egyptian Museum

Authors: Sameh El Mahdy

Abstract:

Recording and documenting archaeological collections, and photographic documentation in particular, is one of the matters to which museums attach great importance and priority, because photographs serve several functions. For example, photographs of objects constitute evidence and an archival record proving the condition of the objects at various stages; a photograph of each object is placed on its paper registration record; the photographs are used in inventorying archaeological collections; they are viewed by researchers and scholars interested in studying the collections; and they are used in advertising campaigns for museum displays of archaeological collections. The Grand Egyptian Museum provides a unique model in terms of establishing a specific system for photographing archaeological collections, and it sets standards for the photographs taken inside the museum. Among them, for example: pictures must be of high quality; a color and measurement scale must be included in the picture, both to clarify the dimensions of the object and to show its natural colors without any additions; and the object numbers, especially the Grand Egyptian Museum number, must appear in the pictures. Taking a good photograph of an artifact in the Grand Egyptian Museum involves several steps: (1) prepare a suitable location; (2) handle the artifact correctly; (3) choose the best position for the artifact; (4) arrange the lighting so that the photograph is free of shadows and captures all of the artifact's details; and (5) verify the camera settings and image quality. These steps, among others, are the criteria for taking the best photograph, which helps the database interface represent the details of each artifact.

Keywords: grand egyptian museum, photographing, museum collections, registration and documentation

Procedia PDF Downloads 35