Search results for: pose estimates
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1166

1076 Investigating the Acquisition of English Emotion Terms by Moroccan EFL Learners

Authors: Khalid El Asri

Abstract:

Culture influences the lexicalization of salient concepts in a society. Hence, languages often differ in their degree of equivalence for lexical items across different fields. The present study focuses on the field of emotions in English and Moroccan Arabic. Findings of a comparative study involving fifty English emotions revealed that Moroccan Arabic has equivalents for some English emotion terms, partial equivalents for others, and no equivalents for some other terms. It is hypothesized, then, that emotion terms that have near equivalents in Moroccan Arabic will be easier for EFL learners to acquire, while partially equivalent terms will be difficult to acquire, and those that have no equivalents will be even more difficult to acquire. In order to test these hypotheses, the participants (104 advanced Moroccan EFL learners and 104 native speakers of English) were given two tests: the first was a receptive test in which the participants were asked to choose, among four emotion terms, the term appropriate to fill in the blank for a given situation indicating a certain kind of feeling. The second was a productive test in which the participants were asked to give the emotion term that best described the feelings of the people in the situations given. The results showed that conceptually equivalent terms do not pose any problems for Moroccan EFL learners, since they can link the concept to an already existing linguistic category, whereas the results concerning the acquisition of partially equivalent terms indicated that this type of emotion term was difficult for Moroccan EFL learners to acquire, because they need to restructure the boundaries of the target linguistic categories by expanding them when the term includes a range of meanings not subsumed in the L1 term.
Surprisingly, however, the results concerning the case of non-equivalence revealed that Moroccan EFL learners could internalize the target L2 concepts that have no equivalents in their L1. Thus, it is the category of emotion terms with partial equivalents in the learners' L1 that poses problems for them.

Keywords: acquisition, culture, emotion terms, lexical equivalence

Procedia PDF Downloads 200
1075 Estimation of Normalized Glandular Doses Using a Three-Layer Mammographic Phantom

Authors: Kuan-Jen Lai, Fang-Yi Lin, Shang-Rong Huang, Yun-Zheng Zeng, Po-Chieh Hsu, Jay Wu

Abstract:

The normalized glandular dose (DgN) estimates the energy deposition of mammography in clinical practice. Monte Carlo simulations frequently use a uniformly mixed phantom for calculating the conversion factor. However, breast tissues are not uniformly distributed, leading to errors in conversion factor estimation. This study constructed a three-layer phantom to estimate the normalized glandular dose more accurately. The MCNP code (Monte Carlo N-Particle code) was used to create the geometric structure. We simulated three target/filter combinations (Mo/Mo, Mo/Rh, Rh/Rh), six voltages (25-35 kVp), six HVL parameters, and nine breast phantom thicknesses (2-10 cm) for the three-layer mammographic phantom. The conversion factor for 25%, 50% and 75% glandularity was calculated. The error of the conversion factors compared with the results of the American College of Radiology (ACR) was within 6%; for Rh/Rh, the difference was within 9%. The difference between the 50% average glandularity and the uniform phantom was 7.1% to -6.7% for the Mo/Mo combination at a voltage of 27 kVp, a half value layer of 0.34 mmAl, and a breast thickness of 4 cm. According to the simulation results, regression analysis found that the three-layer mammographic phantom at 0%-100% glandularity can be used to accurately calculate the conversion factors. Differences in glandular tissue distribution lead to errors in conversion factor calculation. The three-layer mammographic phantom can provide accurate estimates of glandular dose in clinical practice.
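The layered-slab idea behind the phantom can be illustrated with a toy Monte Carlo that is far simpler than MCNP: each photon draws an exponential optical depth and is walked through the layers until it interacts or escapes. The attenuation coefficients and thicknesses below are illustrative placeholders, not measured breast-tissue values.

```python
import math
import random

def mc_layer_interactions(mu_layers, thicknesses, n_photons=100_000, seed=1):
    """Toy Monte Carlo for a layered slab: each photon draws an Exponential(1)
    optical depth and is walked through the layers until it interacts or escapes."""
    rng = random.Random(seed)
    counts = [0] * len(mu_layers)   # interactions per layer
    escaped = 0
    for _ in range(n_photons):
        tau_target = -math.log(1.0 - rng.random())  # exponential free path in optical depth
        tau = 0.0
        hit = None
        for i, (mu, t) in enumerate(zip(mu_layers, thicknesses)):
            if tau + mu * t >= tau_target:  # photon interacts within layer i
                hit = i
                break
            tau += mu * t
        if hit is None:
            escaped += 1
        else:
            counts[hit] += 1
    return counts, escaped
```

With attenuation coefficients 0.5, 0.8 and 0.5 cm^-1 over 1, 2 and 1 cm, the escape fraction should approach exp(-2.6), which the tally reproduces within Monte Carlo noise.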

Keywords: Monte Carlo simulation, mammography, normalized glandular dose, glandularity

Procedia PDF Downloads 164
1074 Infant and Child Mortality among the Low Socio-Economic Households in India

Authors: Narendra Kumar

Abstract:

This study uses data from the 'National Family Health Survey (NFHS-3) 2005-06' to investigate the predictors of infant and child mortality among low socio-economic households in the East and Northeast regions of India. Cross tabulation, life table survival estimates, and Cox proportional hazard models were used to identify the predictors of infant and child mortality. The life table survival estimates show that infant mortality is lower for female children than for male children, whereas child mortality rates are higher for females than for males, and the Cox proportional hazard model is likewise highly significant for female children relative to males. Among poor households, infant and child mortality rates are highest in the Central region, followed by the North and Northeast regions, and lowest in the South region. The respondent's education was found to be a significant characteristic in both analyses; in addition, birth interval, respondent's occupation, caste/tribe, and place of delivery have a substantial impact on infant and child mortality among low socio-economic households in the East and Northeast regions. Finally, these findings indicate that increasing parents' education, improving health care services, and improving the socio-economic conditions of low-income households should in turn raise infant and child survival and decrease child mortality among low socio-economic households in India.
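The life table survival estimates mentioned above follow the product-limit logic, which can be sketched in a few lines (a simplified Kaplan-Meier estimator, not the NFHS analysis itself):

```python
def kaplan_meier(times, events):
    """Product-limit (Kaplan-Meier) survival estimate.
    times: observed times; events: 1 = death observed, 0 = censored.
    Returns a list of (time, survival probability) at each death time."""
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    surv = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = removed = 0
        # group all observations tied at time t
        while i < len(data) and data[i][0] == t:
            deaths += data[i][1]
            removed += 1
            i += 1
        if deaths:
            surv *= 1.0 - deaths / n_at_risk
            curve.append((t, surv))
        n_at_risk -= removed
    return curve
```

For example, with one censored observation at time 2, the estimate steps down only at observed deaths while the risk set shrinks for both deaths and censorings.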

Keywords: infant, child, mortality, socio-economic, India

Procedia PDF Downloads 288
1073 Allometric Models for Biomass Estimation in Savanna Woodland Area, Niger State, Nigeria

Authors: Abdullahi Jibrin, Aishetu Abdulkadir

Abstract:

The development of allometric models is crucial to accurate forest biomass/carbon stock assessment. The aim of this study was to develop a set of biomass prediction models that enable the determination of total tree aboveground biomass for a savannah woodland area in Niger State, Nigeria. Based on data collected through biometric measurements of 1,816 trees and destructive sampling of 36 trees, five species-specific models and one site-specific model were developed. The sample size was distributed equally among the five most dominant species in the study site (Vitellaria paradoxa, Irvingia gabonensis, Parkia biglobosa, Anogeissus leiocarpus, Pterocarpus erinaceus). First, equations were developed for the five individual species; second, the five species were pooled to develop a mixed-species allometric equation. Overall, there was a strong positive relationship between total tree biomass and stem diameter. Coefficients of determination (R² values) ranging from 0.93 to 0.99 (p < 0.001) were realised for the models, with considerably low standard errors of the estimate (SEE), which confirms that total tree aboveground biomass has a significant relationship with dbh. The F-test values for the biomass prediction models were also significant at p < 0.001, which indicates that the models are valid. This study recommends that, for improved biomass estimates in the study site, the site-specific biomass models should preferably be used instead of generic models.
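Allometric biomass models of this kind are typically fitted as log-log regressions of biomass on stem diameter. A minimal sketch with made-up coefficients, not the study's fitted values:

```python
import math

def fit_allometric(dbh, biomass):
    """Fit ln(B) = a + b*ln(D) by ordinary least squares; returns (a, b, R^2)."""
    xs = [math.log(d) for d in dbh]
    ys = [math.log(b) for b in biomass]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    b = sxy / sxx
    a = my - b * mx
    ss_res = sum((y - (a + b * x)) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - my) ** 2 for y in ys)
    r2 = 1.0 - ss_res / ss_tot if ss_tot else 1.0
    return a, b, r2

def predict_biomass(a, b, d):
    """Back-transform the log-log fit: B = exp(a) * D^b."""
    return math.exp(a + b * math.log(d))
```

On data generated from an exact power law B = 0.2 D^2.4 the fit recovers the exponent and R² of 1, which mirrors the high R² values reported for the study's models.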

Keywords: allometry, biomass, carbon stock, model, regression equation, woodland, inventory

Procedia PDF Downloads 423
1072 The Non-Stationary BINARMA(1,1) Process with Poisson Innovations: An Application on Accident Data

Authors: Y. Sunecher, N. Mamode Khan, V. Jowaheer

Abstract:

This paper considers the modelling of a non-stationary bivariate integer-valued autoregressive moving average of order one (BINARMA(1,1)) with correlated Poisson innovations. The BINARMA(1,1) model is specified using the binomial thinning operator and by assuming that the cross-correlation between the two series is induced by the innovation terms only. Based on these assumptions, the non-stationary marginal and joint moments of the BINARMA(1,1) are derived iteratively by using some initial stationary moments. As regards the estimation of the parameters of the proposed model, the conditional maximum likelihood (CML) estimation method is derived based on thinning and convolution properties. The forecasting equations of the BINARMA(1,1) model are also derived. A simulation study is then conducted, in which BINARMA(1,1) count data are generated using a multivariate Poisson R code for the innovation terms; the mean estimates of the model parameters obtained are all efficient, based on their standard errors. The proposed model is then used to analyse real-life accident data from the motorway in Mauritius, based on some covariates: policemen, daily patrol, speed cameras, traffic lights and roundabouts. The BINARMA(1,1) model is applied to the accident data, and the CML estimates clearly indicate a significant impact of the covariates on the number of accidents on the motorway in Mauritius. The forecasting equations also provide reliable one-step-ahead forecasts.
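The binomial thinning operator and the shared-innovation construction can be sketched as follows. This is a simplified bivariate INAR(1)-style recursion illustrating the thinning and the cross-correlated Poisson innovations (built by trivariate reduction), not the full BINARMA(1,1) with its moving-average component, and the parameter values are illustrative:

```python
import math
import random

def binomial_thin(alpha, count, rng):
    """alpha ∘ X: each of the X units survives independently with probability alpha."""
    return sum(1 for _ in range(count) if rng.random() < alpha)

def simulate_binar1(alpha1, alpha2, lam0, lam1, lam2, n, seed=7):
    """Bivariate thinning recursion with cross-correlated Poisson innovations:
    eps_i = Z0 + Z_i, where the shared Z0 ~ Poisson(lam0) induces the
    cross-correlation between the two count series."""
    rng = random.Random(seed)

    def pois(lam):
        # Knuth's multiplication method; adequate for small means
        target = math.exp(-lam)
        k, p = 0, 1.0
        while True:
            p *= rng.random()
            if p <= target:
                return k
            k += 1

    x = y = 0
    xs, ys = [], []
    for _ in range(n):
        z0 = pois(lam0)
        x = binomial_thin(alpha1, x, rng) + z0 + pois(lam1)
        y = binomial_thin(alpha2, y, rng) + z0 + pois(lam2)
        xs.append(x)
        ys.append(y)
    return xs, ys
```

The stationary mean of each series is (lam0 + lam_i) / (1 - alpha_i), and the common shock makes the contemporaneous covariance positive, which the simulation confirms.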

Keywords: non-stationary, BINARMA(1,1) model, Poisson innovations, conditional maximum likelihood (CML)

Procedia PDF Downloads 103
1071 Implications of Measuring the Progress towards Financial Risk Protection Using Varied Survey Instruments: A Case Study of Ghana

Authors: Jemima C. A. Sumboh

Abstract:

Given the urgency of and consensus for countries to move towards Universal Health Coverage (UHC), health financing systems need to be accurately and consistently monitored to provide valuable data to inform policy and practice. Most of the indicators for monitoring UHC, particularly catastrophe and impoverishment, are established based on the impact of out-of-pocket health payments (OOPHP) on households' living standards, collected through varied household surveys. These surveys, however, vary substantially in survey methods, such as the length of the recall period, the number of items included in the survey questionnaire, or the framing of questions, potentially influencing the level of OOPHP reported. Using different survey instruments can thus yield inconsistent and misleading estimates of UHC, subsequently leading to wrong policy decisions. Using data from a household budget survey conducted by the Navrongo Health Research Center in Ghana from May 2017 to December 2018, this study intends to explore the potential implications of using surveys with varied levels of disaggregation of OOPHP data on estimates of financial risk protection. The household budget survey, structured around food and non-food expenditure, compared three OOPHP measuring instruments: Version I (existing questions used to measure OOPHP in household budget surveys), Version II (new questions developed by benchmarking the existing Classification of Individual Consumption According to Purpose (COICOP) OOPHP questions in household surveys) and Version III (existing questions used to measure OOPHP in health surveys integrated into household budget surveys; for this, the Demographic and Health Survey (DHS) health survey was used). Versions I, II and III contained 11, 44, and 56 health items, respectively. However, the choice of recall periods was held constant across versions. The sample sizes for Versions I, II and III were 930, 1032 and 1068 households, respectively.
Financial risk protection will be measured based on the catastrophic and impoverishment methodologies using STATA 15 and Adept Software for each version. It is expected that findings from this study will present valuable contributions to the repository of knowledge on standardizing survey instruments to obtain estimates of financial risk protection that are valid and consistent.
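The catastrophe indicator referred to above is commonly defined as the share of households whose OOPHP exceed a fixed fraction of total consumption; a minimal sketch (the 10% threshold is one common choice, not necessarily the one this study will adopt):

```python
def catastrophic_headcount(oop, consumption, threshold=0.10):
    """Share of households whose out-of-pocket health payments exceed
    `threshold` of total consumption (a common catastrophe definition)."""
    flagged = sum(1 for o, c in zip(oop, consumption) if c > 0 and o / c > threshold)
    return flagged / len(oop)
```

Because the numerator depends directly on how many health items the instrument captures, the three survey versions can mechanically produce different headcounts from the same households.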

Keywords: Ghana, household budget surveys, measuring financial risk protection, out-of-pocket health payments, survey instruments, universal health coverage

Procedia PDF Downloads 111
1070 Estimation of Soil Moisture at High Resolution through Integration of Optical and Microwave Remote Sensing and Applications in Drought Analyses

Authors: Donglian Sun, Yu Li, Paul Houser, Xiwu Zhan

Abstract:

California experienced severe drought conditions in recent years. In this study, the drought conditions in California are analyzed using soil moisture anomalies derived from integrated optical and microwave satellite observations along with auxiliary land surface data. Based on the U.S. Drought Monitor (USDM) classifications, three sets of conditions were selected for the analysis: extreme drought in 2007 and 2013, severe drought in 2004 and 2009, and normal conditions in 2005 and 2006. Drought is defined as a negative soil moisture anomaly. To estimate soil moisture at high spatial resolutions, three approaches are explored in this study: the universal triangle model, which estimates soil moisture from the Normalized Difference Vegetation Index (NDVI) and Land Surface Temperature (LST); the basic model, which estimates soil moisture under different conditions with auxiliary data like precipitation, soil texture, topography, and surface types; and the refined model, which uses accumulated precipitation and its lagging effects. It is found that the basic model shows better agreement with the USDM classifications than the universal triangle model, while the refined model, using precipitation accumulated from the previous summer to the current time, demonstrated the closest agreement with the USDM patterns.
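The universal triangle model described above regresses soil moisture on normalized NDVI and LST; a minimal sketch with hypothetical polynomial coefficients (the real coefficients come from fitting to observations):

```python
def triangle_soil_moisture(ndvi, lst, ndvi_min, ndvi_max, lst_min, lst_max, coeffs):
    """Universal triangle model: soil moisture as a polynomial in the
    normalized NDVI* and LST*. coeffs maps exponent pairs (i, j) to the
    coefficient of NDVI*^i * LST*^j."""
    ndvi_s = (ndvi - ndvi_min) / (ndvi_max - ndvi_min)  # scale NDVI to [0, 1]
    lst_s = (lst - lst_min) / (lst_max - lst_min)       # scale LST to [0, 1]
    return sum(a * ndvi_s ** i * lst_s ** j for (i, j), a in coeffs.items())
```

The min/max bounds define the dry and wet edges of the NDVI-LST triangle for the scene; the anomaly used for drought analysis would then be this estimate minus its climatological mean.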

Keywords: soil moisture, high resolution, regional drought, analysis and monitoring

Procedia PDF Downloads 112
1069 Network Meta-Analysis to Identify the Most Effective Dressings to Treat Pressure Injury

Authors: Lukman Thalib, Luis Furuya-Kanamori, Rachel Walker, Brigid Gillespie, Suhail Doi

Abstract:

Background and objectives: There are many topical treatments available for Pressure Injury (PI) treatment, yet there is a lack of evidence regarding the most effective treatment. The objective of this study was to compare the effects of various topical treatments and identify the best treatment choice(s) for PI healing. Methods: Network meta-analysis of published randomized controlled trials that compared two or more of the following dressing groups: basic, foam, active, hydroactive, and other wound dressings. The outcome was complete healing following treatment, and the generalised pair-wise modelling framework was used to generate mixed treatment effects against hydroactive wound dressing, currently the standard of treatment for PIs. All treatments were then ranked by their point estimates. Main Results: 40 studies (1,757 participants) comparing five dressing groups were included in the analysis. All dressing groups ranked better than basic dressings (i.e., saline gauze or similar inert dressings). Foam (RR 1.18; 95% CI 0.95-1.48) and active wound dressings (RR 1.16; 95% CI 0.92-1.47) ranked better than hydroactive wound dressing in terms of healing of PIs when the latter was used as the reference group. Conclusion & Recommendations: There was considerable uncertainty around the estimates; yet the use of hydroactive wound dressings appears to perform better than basic dressings. Foam and active wound dressing groups show promise and need further investigation. High-quality research on the clinical effectiveness of topical treatments is warranted to identify whether foam and active wound dressings provide advantages over hydroactive dressings.
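The mixed treatment effects above are reported as risk ratios against a reference dressing and ranked by point estimate; a minimal sketch of that arithmetic with made-up counts (log-normal confidence-interval approximation assumed):

```python
import math

def risk_ratio(events_t, n_t, events_c, n_c):
    """Risk ratio of treatment vs control with a 95% CI (log-normal approximation)."""
    rr = (events_t / n_t) / (events_c / n_c)
    se = math.sqrt(1 / events_t - 1 / n_t + 1 / events_c - 1 / n_c)
    lo = math.exp(math.log(rr) - 1.96 * se)
    hi = math.exp(math.log(rr) + 1.96 * se)
    return rr, lo, hi

def rank_by_point_estimate(results):
    """results: {name: (events, n)} including a 'reference' entry.
    Returns treatments sorted by RR against the reference, best first."""
    e0, n0 = results["reference"]
    return sorted(
        ((name, (e / n) / (e0 / n0)) for name, (e, n) in results.items() if name != "reference"),
        key=lambda kv: kv[1],
        reverse=True,
    )
```

As in the abstract, a confidence interval that straddles 1 (e.g., 0.95-1.48) means the ranking rests on the point estimate while the effect itself remains uncertain.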

Keywords: network meta-analysis, pressure injury, dressing, pressure ulcer

Procedia PDF Downloads 97
1068 CLOUD Japan: Prospective Multi-Hospital Study to Determine the Population-Based Incidence of Hospitalized Clostridium difficile Infections

Authors: Kazuhiro Tateda, Elisa Gonzalez, Shuhei Ito, Kirstin Heinrich, Kevin Sweetland, Pingping Zhang, Catia Ferreira, Michael Pride, Jennifer Moisi, Sharon Gray, Bennett Lee, Fred Angulo

Abstract:

Clostridium difficile (C. difficile) is the most common cause of antibiotic-associated diarrhea and infectious diarrhea in healthcare settings. Japan has an aging population; the elderly are at increased risk of hospitalization, antibiotic use, and C. difficile infection (CDI). Little is known about the population-based incidence and disease burden of CDI in Japan, although limited hospital-based studies have reported a lower incidence than the United States. To understand the CDI disease burden in Japan, CLOUD (Clostridium difficile Infection Burden of Disease in Adults in Japan) was developed. CLOUD will derive population-based incidence estimates of the number of CDI cases per 100,000 population per year in Ota-ku (population 723,341), one of the districts in Tokyo, Japan. CLOUD will include approximately 14 of the 28 Ota-ku hospitals, including Toho University Hospital, which is a 1,000-bed tertiary care teaching hospital. During the 12-month patient enrollment period, which is scheduled to begin in November 2018, Ota-ku residents > 50 years of age who are hospitalized at a participating hospital with diarrhea ( > 3 unformed stools (Bristol Stool Chart 5-7) in 24 hours) will be actively ascertained, consented, and enrolled by study surveillance staff. A stool specimen will be collected from enrolled patients and tested at a local reference laboratory (LSI Medience, Tokyo) using QUIK CHEK COMPLETE® (Abbott Laboratories), which simultaneously tests specimens for the presence of glutamate dehydrogenase (GDH) and C. difficile toxins A and B. A frozen stool specimen will also be sent to the Pfizer Laboratory (Pearl River, United States) for analysis using a two-step diagnostic testing algorithm that is based on detection of C. difficile strains/spores harboring the toxin B gene by PCR, followed by detection of free toxins (A and B) using a proprietary cell cytotoxicity neutralization assay (CCNA) developed by Pfizer. Positive specimens will be anaerobically cultured, and C.
difficile isolates will be characterized by ribotyping and whole genomic sequencing. CDI patients enrolled in CLOUD will be contacted weekly for 90 days following diarrhea onset to describe clinical outcomes including recurrence, reinfection, and mortality, and patient reported economic, clinical and humanistic outcomes (e.g., health-related quality of life, worsening of comorbidities, and patient and caregiver work absenteeism). Studies will also be undertaken to fully characterize the catchment area to enable population-based estimates. The 12-month active ascertainment of CDI cases among hospitalized Ota-ku residents with diarrhea in CLOUD, and the characterization of the Ota-ku catchment area, including estimation of the proportion of all hospitalizations of Ota-ku residents that occur in the CLOUD-participating hospitals, will yield CDI population-based incidence estimates, which can be stratified by age groups, risk groups, and source (hospital-acquired or community-acquired). These incidence estimates will be extrapolated, following age standardization using national census data, to yield CDI disease burden estimates for Japan. CLOUD also serves as a model for studies in other countries that can use the CLOUD protocol to estimate CDI disease burden.
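The planned extrapolation, incidence per 100,000 with age standardization against census data, reduces to weighting age-specific rates by a standard population's age shares; a minimal sketch with invented numbers:

```python
def direct_standardized_rate(cases, person_years, std_pop):
    """Directly age-standardized incidence per 100,000: weight the
    age-specific rates by the standard population's age-group shares."""
    total_std = sum(std_pop)
    rate = 0.0
    for c, py, w in zip(cases, person_years, std_pop):
        rate += (c / py) * (w / total_std)
    return rate * 100_000
```

The same machinery supports stratification: running it separately per risk group or per source (hospital- vs community-acquired) yields the stratified estimates the protocol describes.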

Keywords: Clostridium difficile, disease burden, epidemiology, study protocol

Procedia PDF Downloads 229
1067 Identification of Outliers in Flood Frequency Analysis: Comparison of Original and Multiple Grubbs-Beck Test

Authors: Ayesha S. Rahman, Khaled Haddad, Ataur Rahman

Abstract:

At-site flood frequency analysis is used to estimate flood quantiles when the at-site record length is reasonably long. In Australia, FLIKE software has been introduced for at-site flood frequency analysis. The advantage of FLIKE is that, for a given application, the user can compare a number of the most commonly adopted probability distributions and parameter estimation methods relatively quickly using a Windows interface. The new version of FLIKE incorporates the multiple Grubbs and Beck test, which can identify multiple potentially influential low flows. This paper presents a case study of six catchments in eastern Australia which compares two outlier identification tests (the original Grubbs and Beck test and the multiple Grubbs and Beck test) and two commonly applied probability distributions (Generalized Extreme Value (GEV) and Log Pearson type 3 (LP3)) using FLIKE software. It has been found that the multiple Grubbs and Beck test, when used with the LP3 distribution, provides more accurate flood quantile estimates than the LP3 distribution with the original Grubbs and Beck test. Between these two methods, the differences in flood quantile estimates have been found to be up to 61% for the six study catchments. It has also been found that the GEV distribution (with L moments) and the LP3 distribution with the multiple Grubbs and Beck test provide quite similar results in most cases; however, a difference of up to 38% has been noted in flood quantiles at the annual exceedance probability (AEP) of 1 in 100 for one catchment. These findings need to be confirmed with a greater number of stations across other Australian states.
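The original (single) Grubbs-Beck screen works in log space: flows below mean - K_N * s (in log10 units) are flagged as potentially influential low flows. A sketch using the commonly quoted Bulletin 17B approximation for K_N at the 10% significance level (the multiple version iterates a generalization of this idea):

```python
import math

def grubbs_beck_low_outliers(flows):
    """Single Grubbs-Beck screen for potentially influential low flows,
    applied in log10 space; K_N uses the Bulletin 17B approximation
    K_N = -0.9043 + 3.345*sqrt(log10 N) - 0.4046*log10 N."""
    logs = [math.log10(q) for q in flows]
    n = len(logs)
    mean = sum(logs) / n
    s = math.sqrt(sum((x - mean) ** 2 for x in logs) / (n - 1))
    k_n = -0.9043 + 3.345 * math.sqrt(math.log10(n)) - 0.4046 * math.log10(n)
    threshold = 10 ** (mean - k_n * s)  # flows below this are flagged
    return [q for q in flows if q < threshold], threshold
```

A record of ordinary floods with one anomalously small value flags only that value, which is exactly the kind of low flow that can distort an LP3 fit if left in.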

Keywords: floods, FLIKE, probability distributions, flood frequency, outlier

Procedia PDF Downloads 416
1066 MB-SLAM: A SLAM Framework for Construction Monitoring

Authors: Mojtaba Noghabaei, Khashayar Asadi, Kevin Han

Abstract:

Simultaneous Localization and Mapping (SLAM) technology has recently attracted the attention of construction companies for real-time performance monitoring. To use SLAM effectively for construction performance monitoring, SLAM results should be registered to a Building Information Model (BIM). Registering SLAM and BIM can provide essential insights for construction managers to identify construction deficiencies in real-time and ultimately reduce rework. Also, registering SLAM to BIM in real-time can boost the accuracy of SLAM, since SLAM can use features from both images and 3D models. However, registering SLAM with BIM in real-time is a challenge. In this study, a novel SLAM platform named Model-Based SLAM (MB-SLAM) is proposed, which not only provides automated registration of SLAM and BIM but also improves the localization accuracy of the SLAM system in real-time. This framework improves the accuracy of SLAM by aligning perspective features such as depth, vanishing points, and vanishing lines from the BIM to the SLAM system. It extracts depth features from a monocular camera's images and improves the localization accuracy of the SLAM system through a real-time iterative process. Initially, SLAM is used to calculate a rough camera pose for each keyframe. In the next step, each keyframe of the SLAM video sequence is registered to the BIM in real-time by aligning the keyframe's perspective with the equivalent BIM view. The alignment method is based on perspective detection, which estimates vanishing lines and points by detecting straight edges in images. This process generates the associated BIM views for the keyframes' views. The calculated poses are later improved by a real-time gradient-descent-based iteration method. Two case studies were conducted to validate MB-SLAM. The validation demonstrated promising results: SLAM was accurately registered to BIM, and the SLAM's localization accuracy improved significantly.
Moreover, MB-SLAM achieved real-time performance in both indoor and outdoor environments. The proposed method can fully automate the workflows of past studies and generate as-built models that are aligned with BIM. The main contribution of this study is a SLAM framework for both research and commercial use, which aims to monitor construction progress and performance in a unified framework. Through this platform, users can improve the accuracy of SLAM by providing a rough 3D model of the environment, which further extends SLAM toward practical use.
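The gradient-descent pose-refinement step can be illustrated with a toy planar example: numeric gradient descent on a squared alignment error between model points and their observed positions. This is a stand-in for the paper's perspective-feature alignment, not its actual formulation:

```python
import math

def refine_pose(points_model, points_obs, pose0, steps=800, lr=0.02, eps=1e-6):
    """Toy planar pose refinement: one rotation angle plus a 2-D translation,
    fitted by numeric-gradient descent on the squared alignment error."""
    def transform(p, pose):
        th, tx, ty = pose
        c, s = math.cos(th), math.sin(th)
        return (c * p[0] - s * p[1] + tx, s * p[0] + c * p[1] + ty)

    def loss(pose):
        total = 0.0
        for p, q in zip(points_model, points_obs):
            x, y = transform(p, pose)
            total += (x - q[0]) ** 2 + (y - q[1]) ** 2
        return total

    pose = list(pose0)
    for _ in range(steps):
        base = loss(pose)
        grad = []
        for i in range(len(pose)):
            bumped = pose[:]
            bumped[i] += eps
            grad.append((loss(bumped) - base) / eps)  # forward-difference gradient
        pose = [x - lr * g for x, g in zip(pose, grad)]
    return pose, loss(pose)
```

Starting from the identity pose, the iteration recovers a rotation of 0.3 rad and a translation of (0.5, -0.2) applied to a handful of model points, mirroring how a rough SLAM pose is pulled toward the BIM-aligned one.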

Keywords: perspective alignment, progress monitoring, SLAM, stereo matching

Procedia PDF Downloads 188
1065 The Effect of Accounting Conservatism on Cost of Capital: A Quantile Regression Approach for MENA Countries

Authors: Maha Zouaoui Khalifa, Hakim Ben Othman, Hussaney Khaled

Abstract:

Prior empirical studies have investigated the economic consequences of accounting conservatism by examining its impact on the cost of equity capital (COEC). However, findings are not conclusive. We assume that the inconsistent results of such an association may be attributed to the regression models used in data analysis. To address this issue, we re-examine the effect of different dimensions of accounting conservatism, unconditional conservatism (U_CONS) and conditional conservatism (C_CONS), on the COEC for a sample of listed firms from Middle Eastern and North African (MENA) countries, applying the quantile regression (QR) approach developed by Koenker and Bassett (1978). While the classical ordinary least squares (OLS) method is widely used in empirical accounting research, it may produce inefficient and biased estimates in the case of departures from normality or long-tailed error distributions. The QR method is more powerful than OLS at handling this kind of problem: it allows the coefficients on the independent variables to shift across the distribution of the dependent variable, whereas the OLS method only estimates the conditional mean effect on the response variable. We find, as predicted, that U_CONS has a significant positive effect on the COEC, whereas C_CONS has a negative impact. Findings also suggest that the effects of the two dimensions of accounting conservatism differ considerably across COEC quantiles. By comparing results from the QR method with those of OLS, this study sheds more light on the association between accounting conservatism and COEC.
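The difference between OLS and QR comes down to the loss function: OLS minimizes squared error (conditional mean), while QR minimizes the asymmetric pinball (check) loss (conditional quantile). A minimal intercept-only sketch, using a grid search rather than the linear-programming solvers real QR software employs:

```python
def pinball_loss(tau, residual):
    """Check (pinball) loss of quantile regression: underpredictions are
    weighted by tau, overpredictions by (1 - tau)."""
    return tau * residual if residual >= 0 else (tau - 1) * residual

def fit_quantile(ys, tau, grid):
    """Minimize total pinball loss over candidate intercepts: the minimizer
    approximates the tau-th quantile of ys (intercept-only quantile regression)."""
    return min(grid, key=lambda b: sum(pinball_loss(tau, y - b) for y in ys))
```

With tau = 0.5 the minimizer is the median; with tau = 0.9 it sits at the 90th percentile, which is why QR can trace how an effect varies across the COEC distribution while OLS collapses it to a single mean effect.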

Keywords: unconditional conservatism, conditional conservatism, cost of equity capital, OLS, quantile regression, emerging markets, MENA countries

Procedia PDF Downloads 330
1064 The Improvement of Turbulent Heat Flux Parameterizations in Tropical GCM Simulations Using a Low Wind Speed Excess Resistance Parameter

Authors: M. O. Adeniyi, R. T. Akinnubi

Abstract:

The parameterization of turbulent heat fluxes is needed for modeling land-atmosphere interactions in Global Climate Models (GCMs). However, current GCMs still have difficulties producing reliable turbulent heat fluxes for humid tropical regions, which may be due to inadequate parameterization of the roughness lengths for momentum (z0m) and heat (z0h) transfer. These roughness lengths are usually expressed in terms of an excess resistance factor (κB^(-1)), which accounts for the different resistances to momentum and heat transfer. In this paper, a more appropriate excess resistance factor suitable for low wind speed conditions was developed and incorporated into the aerodynamic resistance approach (ARA) in GCMs. The performance of various standard GCM κB^(-1) schemes developed for high wind speed conditions was also assessed. Based on in-situ surface heat fluxes and profile measurements of wind speed and temperature from the Nigeria Micrometeorological Experimental site (NIMEX), a new κB^(-1) was derived through application of Monin-Obukhov similarity theory and Brutsaert's theoretical model for heat transfer. Turbulent flux parameterizations with this new formula provide better estimates of heat fluxes than those obtained using existing GCM κB^(-1) schemes. For the derived κB^(-1), the MBE and RMSE in the parameterized QH ranged from -1.15 to -5.10 W m^-2 and 10.01 to 23.47 W m^-2, while those of QE ranged from -8.02 to 6.11 W m^-2 and 14.01 to 18.11 W m^-2, respectively. The derived κB^(-1) gave better estimates of QH than QE during daytime. The derived relation is κB^(-1) = 6.66 Re*^0.02 - 5.47, where Re* is the roughness Reynolds number. The derived κB^(-1) scheme, which corrects a well-documented large overestimation of turbulent heat fluxes, is therefore recommended for most regional models within the tropics where low wind speed is prevalent.
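The derived relation κB^(-1) = 6.66 Re*^0.02 - 5.47 can be applied directly, and it maps to z0h through the standard definition κB^(-1) = ln(z0m/z0h); a short sketch (the numeric inputs are purely illustrative):

```python
import math

def excess_resistance(re_star):
    """Derived low-wind-speed excess resistance factor:
    kB^-1 = 6.66 * Re*^0.02 - 5.47, Re* the roughness Reynolds number."""
    return 6.66 * re_star ** 0.02 - 5.47

def roughness_length_heat(z0m, re_star):
    """Standard definition kB^-1 = ln(z0m / z0h)  =>  z0h = z0m * exp(-kB^-1)."""
    return z0m * math.exp(-excess_resistance(re_star))
```

A positive κB^(-1) makes z0h smaller than z0m, i.e., a larger resistance to heat transfer than to momentum transfer, which is what damps the overestimated heat fluxes.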

Keywords: humid, tropic, excess resistance factor, overestimation, turbulent heat fluxes

Procedia PDF Downloads 172
1063 Characterization of Petrophysical Properties of Reservoirs in Bima Formation, Northeastern Nigeria: Implication for Hydrocarbon Exploration

Authors: Gabriel Efomeh Omolaiye, Jimoh Ajadi, Olatunji Seminu, Yusuf Ayoola Jimoh, Ubulom Daniel

Abstract:

Identification and characterization of the petrophysical properties of reservoirs in the Bima Formation were undertaken to understand their spatial distribution and impact on hydrocarbon saturation in the highly heterolithic siliciclastic sequence. The study was carried out using nine well logs from the Maiduguri and Baga/Lake sub-basins within the Borno Basin. The different log curves were combined to decipher the lithological heterogeneity of the serrated sand facies and to aid the geologic correlation of sand bodies within the sub-basins. Evaluation of the formation reveals largely undifferentiated to highly serrated and lenticular sand bodies, from which twelve reservoirs named Bima Sand-1 to Bima Sand-12 were identified. The reservoir sand bodies are bifurcated by shale beds, which reduce their thicknesses variably from 0.61 to 6.1 m. The shale content in the sand bodies ranged from 11.00% (relatively clean) to as high as 88.00%. The formation also has variable porosity values, with calculated total porosity ranging from as low as 10.00% to as high as 35.00%. Similarly, effective porosity values spanned between 2.00% and 24.00%. The irregular porosity values also account for the wide range of field average permeability estimates computed for the formation, which measured between 0.03 and 319.49 mD. Hydrocarbon saturation (Sh) in the thin lenticular sand bodies also varied from 40.00% to 78.00%. Hydrocarbon was encountered in three intervals in Ga-1, four intervals in Da-1, two intervals in Ar-1, and one interval in Ye-1. The Ga-1 well encountered a 30.78 m hydrocarbon column in 14 thin sand lobes in Bima Sand-1, with thicknesses from 0.60 m to 5.80 m and an average saturation of 51.00%, while Bima Sand-2 intercepted a 45.11 m hydrocarbon column in 12 thin sand lobes with an average saturation of 61.00%, and Bima Sand-9 has a 6.30 m column in 4 thin sand lobes.
Da-1 has hydrocarbon in Bima Sand-8 (5.30 m, Sh of 58.00% in 5 sand lobes), Bima Sand-10 (13.50 m, Sh of 52.00% in 6 sand lobes), Bima Sand-11 (6.20 m, Sh of 58.00% in 2 sand lobes) and Bima Sand-12 (16.50 m, Sh of 66.00% in 6 sand lobes). In the Ar-1 well, hydrocarbon occurs in Bima Sand-3 (2.40 m column, Sh of 48.00% in one sand lobe) and Bima Sand-9 (6.00 m, Sh of 58.00% in one sand lobe). The Ye-1 well intersected only a 0.5 m hydrocarbon column in Bima Sand-1, with 78.00% saturation. Although the Bima Formation has variable saturation of hydrocarbon, mainly gas, in the Maiduguri and Baga/Lake sub-basins of the research area, its thin, highly serrated sand beds, coupled with the locally very low effective porosity and permeability, would pose a significant exploitation challenge. The sediments were deposited in a fluvio-lacustrine environment, resulting in very thinly laminated or serrated alternations of sand and shale bed lithofacies.
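The petrophysical quantities above, shale volume, effective porosity, and water/hydrocarbon saturation, are conventionally estimated from logs with relations of the following form. This sketch uses the linear gamma-ray index and Archie's equation with textbook default exponents (a = 1, m = n = 2), not the authors' calibrated parameters:

```python
def shale_volume(gr, gr_clean, gr_shale):
    """Linear gamma-ray index as a first-pass shale volume (Vsh) estimate."""
    v = (gr - gr_clean) / (gr_shale - gr_clean)
    return min(max(v, 0.0), 1.0)  # clamp to [0, 1]

def effective_porosity(phi_total, vsh, phi_shale):
    """Shale-corrected porosity: subtract the shale-bound contribution."""
    return max(phi_total - vsh * phi_shale, 0.0)

def archie_sw(rw, rt, phi, a=1.0, m=2.0, n=2.0):
    """Archie water saturation: Sw = ((a * Rw) / (Rt * phi^m))^(1/n);
    hydrocarbon saturation then follows as Sh = 1 - Sw."""
    return ((a * rw) / (rt * phi ** m)) ** (1.0 / n)
```

In a serrated sequence like the Bima Formation, the linear index tends to overestimate Vsh in thin beds, which is one reason the effective porosities reported above fall so far below the totals.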

Keywords: Bima, Chad Basin, fluvio-lacustrine, lithofacies, serrated sand

Procedia PDF Downloads 145
1062 Ensemble Sampler for Infinite-Dimensional Inverse Problems

Authors: Jeremie Coullon, Robert J. Webber

Abstract:

We introduce a Markov chain Monte Carlo (MCMC) sampler for infinite-dimensional inverse problems. Our sampler is based on the affine invariant ensemble sampler, which uses interacting walkers to adapt to the covariance structure of the target distribution. We extend this ensemble sampler for the first time to infinite-dimensional function spaces, yielding a highly efficient gradient-free MCMC algorithm. Because our ensemble sampler does not require gradients or posterior covariance estimates, it is simple to implement and broadly applicable. In many Bayesian inverse problems, MCMC methods are needed to approximate distributions on infinite-dimensional function spaces, for example, in groundwater flow, medical imaging, and traffic flow. Yet designing efficient MCMC methods for function spaces has proved challenging. Recent gradient-based MCMC methods, preconditioned MCMC methods, and SMC methods have improved on the computational efficiency of the functional random walk. However, these samplers require gradients or posterior covariance estimates that may be challenging to obtain. Calculating gradients is difficult or impossible in many high-dimensional inverse problems involving a numerical integrator with a black-box code base. Additionally, accurately estimating posterior covariances can require a lengthy pilot run or adaptation period. These concerns raise the question: is there a functional sampler that outperforms the functional random walk without requiring gradients or posterior covariance estimates? To address this question, we consider a gradient-free sampler that avoids explicit covariance estimation yet adapts naturally to the covariance structure of the sampled distribution. This sampler works by considering an ensemble of walkers and interpolating and extrapolating between walkers to make a proposal.
This is called the affine invariant ensemble sampler (AIES), which is easy to tune, easy to parallelize, and efficient at sampling spaces of moderate dimensionality (less than 20). The main contribution of this work is to propose a functional ensemble sampler (FES) that combines functional random walk and AIES. To apply this sampler, we first calculate the Karhunen–Loeve (KL) expansion for the Bayesian prior distribution, assumed to be Gaussian and trace-class. Then, we use AIES to sample the posterior distribution on the low-wavenumber KL components and use the functional random walk to sample the posterior distribution on the high-wavenumber KL components. Alternating between AIES and functional random walk updates, we obtain our functional ensemble sampler that is efficient and easy to use without requiring detailed knowledge of the target distribution. In past work, several authors have proposed splitting the Bayesian posterior into low-wavenumber and high-wavenumber components and then applying enhanced sampling to the low-wavenumber components. Yet compared to these other samplers, FES is unique in its simplicity and broad applicability. FES does not require any derivatives, and the need for derivative-free samplers has previously been emphasized. FES also eliminates the requirement for posterior covariance estimates. Lastly, FES is more efficient than other gradient-free samplers in our tests. In two numerical examples, we apply FES to challenging inverse problems that involve estimating a functional parameter and one or more scalar parameters. We compare the performance of functional random walk, FES, and an alternative derivative-free sampler that explicitly estimates the posterior covariance matrix. We conclude that FES is the fastest available gradient-free sampler for these challenging and multimodal test problems.
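The stretch move underlying AIES (due to Goodman and Weare) is short enough to sketch directly. The following is an illustrative implementation on a toy two-dimensional Gaussian target, not the authors' FES code; the function names and the target are our own.

```python
import numpy as np

def stretch_move(walkers, log_prob, a=2.0, rng=None):
    """One sweep of the Goodman-Weare affine-invariant 'stretch move'.

    walkers : (n, d) array of current walker positions
    log_prob: callable returning the log target density
    a       : stretch-scale parameter (a = 2 is the usual default)
    """
    rng = np.random.default_rng() if rng is None else rng
    n, d = walkers.shape
    out = walkers.copy()
    for k in range(n):
        j = (k + rng.integers(1, n)) % n          # a complementary walker, j != k
        # z has density g(z) proportional to 1/sqrt(z) on [1/a, a]
        z = ((a - 1.0) * rng.random() + 1.0) ** 2 / a
        prop = out[j] + z * (out[k] - out[j])     # interpolate/extrapolate between walkers
        log_accept = (d - 1) * np.log(z) + log_prob(prop) - log_prob(out[k])
        if np.log(rng.random()) < log_accept:
            out[k] = prop
    return out

# toy target: a standard 2-D Gaussian
log_p = lambda x: -0.5 * float(np.sum(x ** 2))
rng = np.random.default_rng(0)
ens = rng.normal(size=(10, 2))
for _ in range(200):
    ens = stretch_move(ens, log_p, rng=rng)
```

In the paper's FES scheme, a move of this kind would be applied only to the low-wavenumber KL coefficients, with functional random walk handling the remainder.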

Keywords: Bayesian inverse problems, Markov chain Monte Carlo, infinite-dimensional inverse problems, dimensionality reduction

Procedia PDF Downloads 130
1061 The Impact of Unconditional and Conditional Conservatism on Cost of Equity Capital: A Quantile Regression Approach for MENA Countries

Authors: Khalifa Maha, Ben Othman Hakim, Khaled Hussainey

Abstract:

Prior empirical studies have investigated the economic consequences of accounting conservatism by examining its impact on the cost of equity capital (COEC). However, findings are not conclusive. We assume that the inconsistent results of such association may be attributed to the regression models used in data analysis. To address this issue, we re-examine the effect of two dimensions of accounting conservatism, unconditional conservatism (U_CONS) and conditional conservatism (C_CONS), on the COEC for a sample of listed firms from Middle Eastern and North African (MENA) countries, applying the quantile regression (QR) approach developed by Koenker and Bassett (1978). While the classical ordinary least squares (OLS) method is widely used in empirical accounting research, it may produce inefficient and biased estimates in the case of departures from normality or long-tailed error distributions. The QR method is more robust than OLS in handling this kind of problem: it allows the coefficients on the independent variables to shift across the distribution of the dependent variable, whereas the OLS method only estimates the conditional mean effects of a response variable. We find, as predicted, that U_CONS has a significant positive effect on the COEC, whereas C_CONS has a negative impact. Findings also suggest that the effect of the two dimensions of accounting conservatism differs considerably across COEC quantiles. Comparing results from the QR method with those of OLS, this study sheds more light on the association between accounting conservatism and the COEC.
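The key idea behind quantile regression is replacing the squared-error loss with the asymmetric "check" (pinball) loss, whose minimizer is the conditional quantile rather than the conditional mean. A minimal intercept-only illustration on simulated long-tailed data (not the study's MENA sample): the constant minimizing the pinball loss at level tau recovers the tau-th sample quantile.

```python
import numpy as np

def pinball_loss(y, q, tau):
    """Check (pinball) loss: tau*(y-q) for y >= q, (tau-1)*(y-q) otherwise."""
    r = y - q
    return np.mean(np.maximum(tau * r, (tau - 1.0) * r))

rng = np.random.default_rng(1)
y = rng.exponential(scale=2.0, size=5000)   # a long-tailed outcome

# minimize the pinball loss over constants on a fine grid; the minimizer
# coincides with the empirical tau-quantile, which quantile regression
# generalizes to conditional (covariate-dependent) models
grid = np.linspace(y.min(), y.max(), 2000)
for tau in (0.25, 0.5, 0.75):
    losses = [pinball_loss(y, q, tau) for q in grid]
    q_hat = grid[int(np.argmin(losses))]
    print(tau, round(q_hat, 2), round(float(np.quantile(y, tau)), 2))
```

In practice, the full Koenker-Bassett estimator solves the same loss over regression coefficients via linear programming; library implementations (e.g., statsmodels' QuantReg) handle that step.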

Keywords: unconditional conservatism, conditional conservatism, cost of equity capital, OLS, quantile regression, emerging markets, MENA countries

Procedia PDF Downloads 336
1060 Groupthink: The Dark Side of Team Cohesion

Authors: Farhad Eizakshiri

Abstract:

The potential for groupthink to explain the deterioration of decision-making ability within unitary teams, and thus to cause poor outcomes, has attracted a great deal of attention from a variety of disciplines, including psychology, social and organizational studies, political science, and others. Yet what remains unclear is how and why team members’ strivings for unanimity and cohesion override their motivation to realistically appraise alternative courses of action. In this paper, we report the findings of a sequential explanatory mixed-methods study comprising an experiment with thirty groups of three persons each and interviews with all experimental groups. The experiment sought to examine how individuals aggregate their views in order to reach a consensual group decision concerning the completion time of a task. The results indicated that groups made better estimates when there was no interaction between members than when groups collectively agreed on time estimates. To understand the reasons, the qualitative data and informal observations collected during the task were analyzed through conversation analysis, leading to four reasons that caused teams to neglect divergent viewpoints and reduce the number of ideas being considered: the concurrence-seeking tendency, pressure on dissenters, self-censorship, and the illusion of invulnerability. It is suggested that understanding the dynamics behind these reasons for groupthink will help project teams avoid making premature group decisions by enhancing careful evaluation of available information and analysis of available decision alternatives and choices.

Keywords: groupthink, group decision, cohesiveness, project teams, mixed-methods research

Procedia PDF Downloads 374
1059 Using Computer Vision and Machine Learning to Improve Facility Design for Healthcare Facility Worker Safety

Authors: Hengameh Hosseini

Abstract:

Design of large healthcare facilities – such as hospitals, multi-service line clinics, and nursing facilities – that can accommodate patients with wide-ranging disabilities is a challenging endeavor, and one that is poorly understood among healthcare facility managers, administrators, and executives. An even less-understood extension of this problem is the implications of weakly or insufficiently accommodative facility design for healthcare workers in physically intensive jobs who may also have a range of disabilities and who are therefore at increased risk of workplace accident and injury. Combine this reality with the vast range of facility types, ages, and designs, and the problem of universal accommodation becomes even more daunting and complex. In this study, we focus on the implications of facility design for healthcare workers with low vision who also have physically active jobs. The points of difficulty are myriad: health service infrastructure, the equipment used in health facilities, and transport to and from appointments and other services can all pose a barrier to health care if they are inaccessible, less accessible, or even simply less comfortable for people with various disabilities. We conduct a series of surveys and interviews with employees and administrators of seven facilities of a range of sizes and ownership models in the Northeastern United States, and combine that corpus with in-facility observations and data collection to identify five major points of failure, common to all the facilities, that we concluded could pose safety threats to employees with vision impairments, ranging from very minor to severe. We determine that lack of design empathy is a major commonality among facility management and ownership.
We subsequently propose three methods for introducing empathy-informed design and remedying the dangers posed to employees: the use of an existing open-source Augmented Reality application to simulate the low-vision experience for designers and managers; the use of a machine learning model we develop to automatically infer facility shortcomings from large datasets of recorded patient and employee reviews and feedback; and the use of a computer vision model fine-tuned on images of each facility to infer and predict facility features, locations, and workflows that could again pose meaningful dangers to visually impaired employees. After conducting a series of real-world comparative experiments with each of these approaches, we conclude that each is a viable solution under particular sets of conditions, and finally characterize the range of facility types, workforce composition profiles, and work conditions under which each of these methods would be most apt and successful.

Keywords: artificial intelligence, healthcare workers, facility design, disability, visually impaired, workplace safety

Procedia PDF Downloads 75
1058 Accidental Electrocution, Reconstruction of Events

Authors: Y. P. Raghavendra Babu

Abstract:

Electrocution is a common cause of morbidity and mortality, as electricity is an indispensable part of today’s world. Witnessed deaths due to electrocution do not pose a problem as to the manner and cause of death. However, un-witnessed deaths can raise suspicion about the manner of death. A case of fatal electrocution is reported here in which the manner of death was determined to be accidental with the help of reconstruction of events by proper investigation.

Keywords: electrocution, manner of death, reconstruction of events, health information

Procedia PDF Downloads 241
1057 Designing Agricultural Irrigation Systems Using Drone Technology and Geospatial Analysis

Authors: Yongqin Zhang, John Lett

Abstract:

Geospatial technologies have been increasingly used in agriculture for various applications and purposes in recent years. Unmanned aerial vehicles (drones) fit the needs of farmers in farming operations, from field spraying to growth cycles and crop health. In this research, we conducted a practical project that used drone technology to design and map optimal locations and layouts of irrigation systems for agricultural farms. We flew a DJI Mavic 2 Pro drone to acquire aerial remote sensing images over two agricultural fields in Forest, Mississippi, in 2022. Flight plans were first designed to capture multiple high-resolution images via a 20-megapixel RGB camera mounted on the drone over the agricultural fields. The Drone Deploy web application was then utilized to develop the flight plans and carry out subsequent image processing and measurements. The images were orthorectified and processed to estimate the area of the fields and measure the locations of the water lines and sprinkler heads. Field measurements were conducted to measure the ground targets and validate the aerial measurements. Geospatial analysis and photogrammetric measurements were performed for the study area to determine the optimal layout and quantitative estimates for irrigation systems. We created maps and tabular estimates to demonstrate the locations, spacing, number, and layout of sprinkler heads and water lines to cover the agricultural fields. This research project provides scientific guidance to Mississippi farmers for precision agricultural irrigation practice.

Keywords: drone images, agriculture, irrigation, geospatial analysis, photogrammetric measurements

Procedia PDF Downloads 54
1056 The Effects of the Introduction of a One-day Waiting Period on Absences for Ordinary Illness of Public Employees

Authors: Mohamed Ali Ben Halima, Malik Koubi, Joseph Lanfranchi, Yohan Wloczysiak

Abstract:

This article assesses the consequences on the frequency and duration of ordinary sick leave of the January 2012 and 2018 reforms modifying the scope of sick leave reimbursement in the French civil service. These reforms introduce a one-day waiting period, which removes compensation for the first day of ordinary sick leave. In order to evaluate these reforms, we use an administrative database from the National Pension Fund for local public employees (FPT). The first important result of our data analysis is that the one-day waiting period was not introduced at the same time across French local public service establishments, and in some it was never introduced at all. This peculiarity allows for an identification strategy using a difference-in-differences method based on the definition, at each date, of groups of employees treated and not treated by the reform, since establishments that apply the one-day waiting period coexist with establishments that do not. Two types of estimators are used for this evaluation: individual and time fixed-effects estimators, and DIDM estimators, which correct for the biases of the two-way fixed-effects estimator. The results confirm that the change in the sick pay system decreases the probability of having at least one ordinary sick leave as well as the number and duration of these episodes. On the other hand, the estimates show that longer leave episodes are not less affected than shorter ones. Finally, the validity tests of the estimators support the results obtained for the 2018-2019 period, but suggest estimation biases for the 2012-2013 period. The extent to which the endogeneity of local choices about implementing the reform impacts these estimates needs to be further tested.
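The two-way fixed-effects baseline the authors start from can be illustrated on simulated panel data. The sketch below uses entirely synthetic data and our own variable names; it recovers a known treatment effect by double-demeaning the outcome and the treatment indicator across units and periods, which for a balanced panel reproduces the TWFE OLS coefficient.

```python
import numpy as np

rng = np.random.default_rng(2)
n_units, n_periods = 200, 6
treated = rng.random(n_units) < 0.5       # establishments applying the waiting period
post = np.arange(n_periods) >= 3          # reform in force from period 3 onwards

# simulated sick-leave outcome with unit and time effects and a true effect of -0.4
unit_fe = rng.normal(size=n_units)
time_fe = np.linspace(0.0, 0.5, n_periods)
att = -0.4
D = (treated[:, None] & post[None, :]).astype(float)
Y = unit_fe[:, None] + time_fe[None, :] + att * D + 0.3 * rng.normal(size=D.shape)

# two-way fixed effects via double demeaning (exact for a balanced panel)
def demean(M):
    return M - M.mean(axis=1, keepdims=True) - M.mean(axis=0, keepdims=True) + M.mean()

beta = (demean(Y) * demean(D)).sum() / (demean(D) ** 2).sum()
print(round(beta, 2))                     # close to the true effect of -0.4
```

The DIDM correction discussed in the abstract addresses the bias this simple estimator incurs under staggered adoption and heterogeneous effects; the sketch above shows only the uncorrected baseline.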

Keywords: sick leave, one-day waiting period, territorial civil service, public policy evaluation

Procedia PDF Downloads 56
1055 Predicting Ecological Impacts of Sea-Level Change on Coastal Conservation Areas in India

Authors: Mohammad Zafar-ul Islam, Shaily Menon, Xingong Li, A. Townsend Peterson

Abstract:

In addition to the mounting empirical data on direct implications of climate change for natural and human systems, evidence is increasing for other, indirect climate change phenomena such as sea-level rise. Rising sea levels and associated marine intrusion into terrestrial environments are predicted to be among the most serious eventual consequences of climate change. The many complex and interacting factors affecting sea levels create considerable uncertainty in sea-level rise projections: conservative estimates are on the order of 0.5-1.0 m globally, while other estimates are much higher, approaching 6 m. Marine intrusion associated with 1-6 m sea-level rise will impact species and habitats in coastal ecosystems severely. Examining areas most vulnerable to such impacts may allow the design of appropriate adaptation and mitigation strategies. We present an overview of potential effects of 1 and 6 m sea level rise for coastal conservation areas in the Indian Subcontinent. In particular, we examine the projected magnitude of areal losses in relevant biogeographic zones, ecoregions, protected areas (PAs), and Important Bird Areas (IBAs). In addition, we provide a more detailed and quantitative analysis of likely effects of marine intrusion on 22 coastal PAs and IBAs that provide critical habitat for birds in the form of breeding areas, migratory stopover sites, and overwintering habitats. Several coastal PAs and IBAs are predicted to experience higher than 50% losses to marine intrusion. We explore consequences of such inundation levels on species and habitat in these areas.
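A first-order marine-intrusion estimate of the kind described can be obtained by thresholding a digital elevation model at each sea-level-rise scenario. The sketch below uses a synthetic elevation grid rather than actual Subcontinent data, and ignores hydrological connectivity, which a real analysis would need to model.

```python
import numpy as np

rng = np.random.default_rng(5)
# toy 100x100 elevation grid in metres above present sea level (1 km^2 cells);
# a real analysis would use a coastal DEM clipped to each protected area
dem = rng.gamma(shape=2.0, scale=3.0, size=(100, 100))

cell_area_km2 = 1.0
losses = {}
for rise in (1.0, 6.0):
    inundated = dem <= rise            # cells at or below the new sea level
    losses[rise] = inundated.sum() * cell_area_km2
    print(f"{rise:.0f} m rise: {losses[rise]:.0f} km^2 "
          f"({100 * inundated.mean():.1f}% of the grid) inundated")
```

Intersecting the resulting inundation mask with PA and IBA boundary polygons would then yield the per-area percentage losses reported in the abstract.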

Keywords: sea-level change, coastal inundation, marine intrusion, biogeographic zones, ecoregions, protected areas, important bird areas, adaptation, mitigation

Procedia PDF Downloads 232
1054 Asymptotic Spectral Theory for Nonlinear Random Fields

Authors: Karima Kimouche

Abstract:

In this paper, we consider the asymptotic problems in spectral analysis of stationary causal random fields. We impose conditions only involving (conditional) moments, which are easily verifiable for a variety of nonlinear random fields. Limiting distributions of periodograms and smoothed periodogram spectral density estimates are obtained and applications to the spectral domain bootstrap are given.
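The smoothed periodogram estimator studied above can be sketched in the one-dimensional special case (a stationary causal AR(1) process standing in for a random field; all parameters are illustrative, not from the paper).

```python
import numpy as np

rng = np.random.default_rng(3)
n, phi = 4096, 0.6
# AR(1) process x_t = phi * x_{t-1} + eps_t: a simple stationary causal example
eps = rng.normal(size=n)
x = np.empty(n)
x[0] = eps[0]
for t in range(1, n):
    x[t] = phi * x[t - 1] + eps[t]

# raw periodogram I(w_j) = |sum_t x_t e^{-i w_j t}|^2 / (2 pi n)
fft = np.fft.rfft(x - x.mean())
freqs = 2 * np.pi * np.arange(len(fft)) / n
pgram = np.abs(fft) ** 2 / (2 * np.pi * n)

# smoothed periodogram: average over 2m+1 neighbouring Fourier frequencies
m = 30
kernel = np.ones(2 * m + 1) / (2 * m + 1)
smoothed = np.convolve(pgram, kernel, mode="same")

# compare with the true AR(1) spectral density near frequency pi/2
true_sd = lambda w: 1.0 / (2 * np.pi * (1.0 - 2.0 * phi * np.cos(w) + phi ** 2))
j = len(freqs) // 2
print(round(smoothed[j], 3), round(true_sd(freqs[j]), 3))
```

The raw periodogram is not consistent (its variance does not shrink with n); the local averaging step is what delivers the consistency and limiting distributions the abstract refers to.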

Keywords: spatial nonlinear processes, spectral estimators, GMC condition, bootstrap method

Procedia PDF Downloads 425
1053 ROSgeoregistration: Aerial Multi-Spectral Image Simulator for the Robot Operating System

Authors: Andrew R. Willis, Kevin Brink, Kathleen Dipple

Abstract:

This article describes a software package called ROSgeoregistration intended for use with the robot operating system (ROS) and the Gazebo 3D simulation environment. ROSgeoregistration provides tools for the simulation, test, and deployment of aerial georegistration algorithms and is available at github.com/uncc-visionlab/rosgeoregistration. A model creation package is provided which downloads multi-spectral images from the Google Earth Engine database and, if necessary, incorporates these images into a single, possibly very large, reference image. Additionally, a Gazebo plugin which uses the real-time sensor pose and image formation model to generate simulated imagery from the specified reference image is provided, along with related plugins for UAV-relevant data. The novelty of this work is threefold: (1) this is the first system to link the massive multi-spectral imaging database of Google’s Earth Engine to the Gazebo simulator, (2) this is the first example of a system that can simulate geospatially and radiometrically accurate imagery from multiple sensor views of the same terrain region, and (3) integration with other UAS tools creates a new holistic UAS simulation environment to support UAS system and subsystem development where real-world testing would generally be prohibitive. Sensed imagery and ground truth registration information are published to client applications, which can receive imagery synchronously with telemetry from other payload sensors, e.g., IMU, GPS/GNSS, barometer, and windspeed sensor data. To highlight this functionality, we demonstrate ROSgeoregistration for simulating Electro-Optical (EO) and Synthetic Aperture Radar (SAR) image sensors and present an example use case for developing and evaluating image-based UAS position feedback, i.e., pose estimation for image-based Guidance, Navigation, and Control (GNC) applications.

Keywords: EO-to-EO, EO-to-SAR, flight simulation, georegistration, image generation, robot operating system, vision-based navigation

Procedia PDF Downloads 85
1052 Assessing Autism Spectrum Disorders (ASD) Challenges in Young Children in Dubai: A Qualitative Study, 2016

Authors: Kadhim Alabady

Abstract:

Background: Autism poses a particularly large public health challenge, and a lifelong challenge of a different kind for many families. Purpose: It is therefore important to understand what the key challenges are and how to improve the lives of children affected by autism in Dubai. Method: In order to carry out this research, we used a qualitative methodology. We performed structured in-depth interviews and focus groups with mental health professionals working at Al Jalila hospital (AJH), Dubai Autism Centre (DAC), Dubai Rehabilitation Centre for Disabilities, Latifa hospital, and Private Sector Healthcare (PSH). In addition, we adopted a quantitative approach to estimate ASD prevalence and incidence, as no registry data were available; ASD estimates are based on research from national and international documents. This combined approach was applied to increase the validity of the findings by using a variety of data collection techniques in order to explore issues that might not be highlighted through one method alone. Key findings: Autism is the most common of the Pervasive Developmental Disorders. The Dubai Autism Center estimates it affects 1 in 146 births (0.68%). If we apply these estimates to the total number of births in Dubai for 2014, it is predicted there would be approximately 199 children (58 Nationals and 141 Non-Nationals) suffering from autism at some stage. 16.4% of children (through their families) seek help for ASD assessment in the 6–18+ age group. It is critical to understand and address the factors behind late-stage diagnosis, as ASD can be diagnosed much earlier, and to establish how many of these later presenters are actually diagnosed with ASD. Autism spectrum disorder (ASD) is a public health concern in Dubai. Families do not consult GPs for early diagnosis for a variety of reasons, including cultural ones.
Recommendations: Effective school health strategies are needed, implemented by nurses who are qualified and experienced in identifying children with ASD. There is a need for the DAC to identify and develop a closer link with neurologists specializing in autism, both to work alongside and for referrals. Autism can be attributed to many factors, some of which are neurological. Currently, when families need their child to see a neurologist, they have to go independently and search through the many available in Dubai, who are not necessarily specialists in autism. GPs should be trained to aid early diagnosis of autism and to raise awareness; since not all GPs are trained to make such assessments, awareness should also be increased about where to send families for a complete assessment and the necessary support. There is an urgent need for an adult autism center for when the children leave the safe environment of the school at 18 years; these individuals require a day center or suitable job training/placements where appropriate. There is a need for further studies to cover the needs of people with an Autism Spectrum Disorder (ASD).

Keywords: autism spectrum disorder, autism, pervasive developmental disorders, incidence

Procedia PDF Downloads 194
1051 Maximum Likelihood Estimation Methods on a Two-Parameter Rayleigh Distribution under Progressive Type-II Censoring

Authors: Daniel Fundi Murithi

Abstract:

Data from economic, social, clinical, and industrial studies are often in some way incomplete or incorrect due to censoring. Such data may have adverse effects if used in the estimation problem. We propose the use of Maximum Likelihood Estimation (MLE) under a progressive type-II censoring scheme to remedy this problem. In particular, maximum likelihood estimates (MLEs) for the location (µ) and scale (λ) parameters of the two-parameter Rayleigh distribution are obtained under a progressive type-II censoring scheme using the Expectation-Maximization (EM) and Newton-Raphson (NR) algorithms. These algorithms are compared because they iteratively produce satisfactory results in the estimation problem. The progressive type-II censoring scheme is used because it allows the removal of test units before the termination of the experiment. Approximate asymptotic variances and confidence intervals for the location and scale parameters are derived and constructed. The efficiency of the EM and NR algorithms is compared in terms of root mean squared error (RMSE), bias, and coverage rate. The simulation study showed that in most simulation cases, the estimates obtained using the Expectation-Maximization algorithm had smaller biases, smaller variances, narrower confidence intervals, and smaller RMSE than those generated via the Newton-Raphson (NR) algorithm. Further, the analysis of a real-life data set (data from simple experimental trials) showed that the Expectation-Maximization (EM) algorithm performs better than the Newton-Raphson (NR) algorithm in all simulation cases under the progressive type-II censoring scheme.
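As a simplified illustration of the Newton-Raphson step (complete samples with known location, rather than the paper's progressively type-II censored setting), the scale parameter of the two-parameter Rayleigh distribution can be estimated as follows; the data, parameter values, and starting point are our own.

```python
import numpy as np

rng = np.random.default_rng(4)
mu_true, lam_true, n = 1.0, 0.5, 5000

# two-parameter Rayleigh: f(x) = 2*lam*(x - mu)*exp(-lam*(x - mu)^2) for x > mu,
# sampled by inverting the CDF F(x) = 1 - exp(-lam*(x - mu)^2)
u = rng.random(n)
x = mu_true + np.sqrt(-np.log(1.0 - u) / lam_true)

# Newton-Raphson for the scale parameter lam with mu treated as known -- a
# complete-sample simplification of the censored-data problem
S = np.sum((x - mu_true) ** 2)
lam = 0.3                        # starting value inside the convergence basin
for _ in range(25):
    score = n / lam - S          # d logL / d lam
    hess = -n / lam ** 2         # d^2 logL / d lam^2
    lam -= score / hess

print(round(lam, 3))             # matches the closed form n / S
```

Under progressive type-II censoring, the score and Hessian acquire extra terms from the removed units, and the EM algorithm the abstract favours treats the censored observations as missing data in exactly that likelihood.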

Keywords: expectation-maximization algorithm, maximum likelihood estimation, Newton-Raphson method, two-parameter Rayleigh distribution, progressive type-II censoring

Procedia PDF Downloads 133
1050 Genetic Variability and Heritability Among Indigenous Pearl Millet (Pennisetum Glaucum L. R. BR.) in Striga Infested Fields of Sudan Savanna, Nigeria

Authors: Adamu Usman, Grace Stanley Balami

Abstract:

Pearl millet (Pennisetum glaucum L. R. Br.) is a cereal cultivated in arid and semi-arid areas of the world. It supports more than 100 million people around the world. The parasitic weed Striga hermonthica (Del.) Benth. is a major constraint to its production, with estimated yield losses of 10-95% depending on variety, ecology and cultural practices. The potential of trait selection in pearl millet for grain yield has been reported, and it depends on genotypic variability and heritability among landraces. Variability and heritability among cultivars could therefore offer opportunities for improvement. The study was conducted to determine the genetic variability among cultivars and to estimate broad-sense heritability of grain yield and related traits. F1 breeding populations were generated from nine parental cultivars, viz. Ex-Gubio, Ex-Monguno and Ex-Baga as males and PEO 5984, Super-SOSAT, SOSAT-C88, Ex-Borno and LCIC9702 as females, through Line × Tester mating during the 2017 dry season at Lushi Irrigation Station, Bauchi Metropolitan, in Bauchi State, Nigeria. The F1 populations and the parents were evaluated during the 2018 cropping season at Bauchi and Maiduguri. Data collected were subjected to analysis of variance. Results showed significant differences among cultivars and among traits, indicating variability. Number of plants at emergence, days to 50% flowering, days to 100% flowering, plant height, panicle length, number of plants at harvest, Striga count at 90 days after sowing, panicle weight and grain yield were all significantly different. This significant variability offers an opportunity for improvement, as superior individuals can be isolated. Genotypic variance estimates of traits were largely greater than environmental variances, except for plant height and 1000-seed weight. Environmental variances were low and in some cases negligible. The phenotypic variances of all traits were higher than the genotypic variances.
Similarly, the phenotypic coefficient of variation (PCV) was higher than the genotypic coefficient of variation (GCV). High heritability was found for days to 50% flowering (90.27%), Striga count at 90 days after sowing (90.07%), number of plants at harvest (87.97%), days to 100% flowering (83.89%), number of plants at emergence (82.19%) and plant height (73.18%). Greater heritability estimates could be due to the presence of additive gene action. The results revealed wide variability among genotypes and traits. Traits with high heritability could readily respond to selection. High values of GCV, PCV and heritability estimates indicate that selection for these traits is possible and could be effective.
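The variance-component quantities reported above (genotypic and phenotypic variances, GCV, PCV, and broad-sense heritability) are conventionally derived from the mean squares of a replicated one-way genotype ANOVA. A sketch with made-up mean squares, not the study's data:

```python
import math

# illustrative variance-component calculation; the mean squares, replicate
# count, and grand mean below are invented for the example
ms_genotype, ms_error, n_reps = 48.0, 6.0, 3
grand_mean = 22.0

var_g = (ms_genotype - ms_error) / n_reps    # genotypic variance component
var_p = var_g + ms_error                     # phenotypic variance
h2 = var_g / var_p                           # broad-sense heritability
gcv = 100 * math.sqrt(var_g) / grand_mean    # genotypic coefficient of variation (%)
pcv = 100 * math.sqrt(var_p) / grand_mean    # phenotypic coefficient of variation (%)

print(h2, round(gcv, 1), round(pcv, 1))
```

Because var_p adds the error variance to var_g, PCV always exceeds GCV, which is the pattern the abstract reports across traits.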

Keywords: variability, heritability, phenotypic, genotypic, striga

Procedia PDF Downloads 28
1049 Food for Health: Understanding the Importance of Food Safety in the Context of Food Security

Authors: Carmen J. Savelli, Romy Conzade

Abstract:

Background: Access to sufficient amounts of safe and nutritious food is a basic human necessity, required to sustain life and promote good health. Food safety and food security are therefore inextricably linked, yet the importance of food safety in this relationship is often overlooked. Methodologies: A literature review and desk study were conducted to examine existing frameworks for discussing food security, especially from an international perspective, and to determine entry points for enhancing considerations of food safety in national and international policies. Major Findings: Food security is commonly understood as the state in which all people at all times have physical, social and economic access to sufficient, safe and nutritious food to meet their dietary needs and food preferences for an active and healthy life. Conceptually, food security is built upon four pillars: food availability, access, utilization and stability. Within this framework, the safety of food is often wrongly taken as a given. However, in places where food supplies are insufficient, coping mechanisms for food insecurity focus primarily on access to food without consideration for ensuring safety. Under such conditions, hygiene and nutrition are often ignored as people shift to less nutritious diets and consume more potentially unsafe foods, in which chemical, microbiological, zoonotic and other hazards can pose serious, acute and chronic health risks. Even where food supplies are safe and nutritious, if food is consumed in quantities insufficient to support normal growth, health and activity, the result is hunger and famine. Recent estimates indicate that at least 842 million people, or roughly one in eight, still suffer from chronic hunger. Even if people eat enough food that is safe, they will become malnourished if the food does not provide the proper amounts of micronutrients and/or macronutrients to meet daily nutritional requirements, resulting in under- or over-nutrition.
Two billion people suffer from one or more micronutrient deficiencies, and over half a billion adults are obese. Access to sufficient amounts of nutritious food is not enough. If food is unsafe, whether due to poor-quality supplies or inadequate treatment and preparation, it increases the risk of foodborne infections such as diarrhoea; 70% of diarrhoea episodes occurring annually in children under five are due to biologically contaminated food. Conclusions: An integrated approach is needed in which food safety and nutrition are systematically introduced into mainstream food system policies and interventions worldwide in order to achieve health and development goals. A new framework, “Food for Health”, is proposed to guide policy development; it requires all three aspects of food security to be addressed in balance: sufficiency, nutrition and safety.

Keywords: food safety, food security, nutrition, policy

Procedia PDF Downloads 389
1048 Epistemological and Ethical Dimensions of Current Concepts of Human Resilience in the Neurosciences

Authors: Norbert W. Paul

Abstract:

For a number of years, scientific interest in human resilience has been rapidly increasing, especially in psychology and, more recently and highly visibly, in neurobiological research. Concepts of resilience are regularly discussed in the light of liminal experiences and existential challenges in human life. Resilience research provides both explanatory models and strategies to promote or foster human resilience. Surprisingly, these approaches have attracted little attention so far in philosophy in general and in ethics in particular. This is even more astonishing given the fact that the neurosciences as such have been and still are of major interest to philosophy and ethics and even brought about the specialized field of neuroethics, which, however, has so far not been concerned with concepts of resilience. As a result of the little attention given to the topic, the concept of resilience has to date been philosophically under-theorized. This absence of ethics and philosophy from resilience research is lamentable, because resilience as a concept, as well as resilience interventions based on neurobiological findings, undoubtedly poses philosophical, social and ethical questions. In this paper, we will argue that particular notions of resilience cross the sometimes fine line between maintaining a person’s mental health despite the impact of severe psychological or physical adverse events and ethically more debatable discourses of enhancement. While we neither argue for nor against enhancement, nor re-interpret resilience research and interventions by subsuming them under strategies of psychological and/or neuro-enhancement, we encourage those who see social or ethical problems with enhancement technologies to also take a closer look at resilience and the related neurobiological concepts. We will proceed in three steps. In our first step, we will describe the concept of resilience in general and its neurobiological study in particular.
Here, we will point out some important differences in the way ‘resilience’ is conceptualized and in how neurobiological research understands it. In what follows, we will try to show that a one-sided concept of resilience – as it is often presented in neurobiological research – poses social and ethical problems. Secondly, we will identify and explore the social and ethical challenges of (neurobiological) enhancement. In the final step of this paper, we will argue that a one-sided reading of resilience can be understood as a latent form of enhancement in transition and poses ethical questions similar to those discussed in relation to other approaches to the biomedical enhancement of humans.

Keywords: resilience, neurosciences, epistemology, bioethics

Procedia PDF Downloads 130
1047 Artificial Intelligence and Liability within Healthcare: A South African Analysis

Authors: M. Naidoo

Abstract:

AI in healthcare can have a massive positive effect in low-resource states like South Africa, where patients greatly outnumber personnel. However, the complexity and ‘black box’ aspects of these technologies pose challenges for the liability regimes of states, an issue currently being discussed at the international level. This research finds that, within the South African medical negligence context, the current common-law fault-based inquiry proves wholly inadequate for patient redress. As a solution, this paper culminates in legal reform recommendations designed to address these issues.

Keywords: artificial intelligence, law, liability, policy

Procedia PDF Downloads 82