Search results for: Nicholas Alexander
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 384

174 The Low-Cost Design and 3D Printing of Structural Knee Orthotics for Athletic Knee Injury Patients

Authors: Alexander Hendricks, Sean Nevin, Clayton Wikoff, Melissa Dougherty, Jacob Orlita, Rafiqul Noorani

Abstract:

Knee orthotics play an important role in aiding the recovery of those with knee injuries, especially athletes. However, structural knee orthotics are often very expensive, ranging between $300 and $800. The primary reason for this project was to answer the question: can 3D-printed orthotics represent a viable and cost-effective alternative to present structural knee orthotics? The primary objective of this research project was to design a low-cost (under $100) knee orthotic for athletes with knee injuries and evaluate its effectiveness. The initial design for the orthotic was done in SolidWorks, a computer-aided design (CAD) software available at Loyola Marymount University. After this design was completed, finite element analysis (FEA) was utilized to understand how normal stresses placed upon the knee affected the orthotic. The knee orthotic was then adjusted and redesigned to meet a specified factor of safety of 3.25, based on the data gathered during FEA and literature sources. Once the FEA was completed and the orthotic redesigned based on the data gathered, the next step was to 3D-print the first design of the knee brace. Subsequently, physical therapy movement trials were used to evaluate physical performance. Using the data from these movement trials, the CAD design of the brace was refined to accommodate the design requirements. The final goal of this research is to explore the possibility of replacing high-cost, outsourced knee orthotics with a readily available low-cost alternative.
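The factor-of-safety criterion that drove the redesign can be sketched as a simple check of FEA output against the target. The material strength and peak stress below are illustrative placeholders (e.g. a typical PLA datasheet value), not values reported in the study.

```python
# Factor-of-safety check of the kind used to guide the orthotic redesign.
# Material/stress values are hypothetical placeholders, not study data.

def factor_of_safety(yield_strength_mpa, max_stress_mpa):
    """Ratio of material yield strength to peak stress from FEA."""
    return yield_strength_mpa / max_stress_mpa

TARGET_FOS = 3.25  # the factor of safety specified in the study

# e.g. PLA yield strength ~50 MPa (typical datasheet value) and a
# hypothetical peak von Mises stress of 12 MPa from an FEA run:
fos = factor_of_safety(50.0, 12.0)
print(f"FoS = {fos:.2f}, meets target: {fos >= TARGET_FOS}")
```

If the check fails, the geometry is thickened or reinforced and the FEA repeated, which is the adjust-and-redesign loop the abstract describes.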

Keywords: 3D printing, knee orthotics, finite element analysis, design for additive manufacturing

Procedia PDF Downloads 148
173 Diagnostic Value of Different Noninvasive Criteria of Latent Myocarditis in Comparison with Myocardial Biopsy

Authors: Olga Blagova, Yuliya Osipova, Evgeniya Kogan, Alexander Nedostup

Abstract:

Purpose: to quantify the value of various clinical, laboratory, and instrumental signs in the diagnosis of myocarditis in comparison with morphological studies of the myocardium. Methods: in 100 patients (65 men, 44.7±12.5 years) with 'idiopathic' arrhythmias (n = 20) and dilated cardiomyopathy (DCM, n = 80), 71 endomyocardial biopsies (EMB), 13 intraoperative biopsies, 5 studies of explanted hearts, and 11 autopsies were performed, with viral investigation (real-time PCR) of the blood and myocardium. Anti-heart antibodies (AHA) were also measured, as were cardiac CT (n = 45), MRI (n = 25), and coronary angiography (n = 47). The comparison group included 50 patients (25 men, 53.7±11.7 years) with non-inflammatory heart diseases who underwent open heart surgery. Results: active/borderline myocarditis was diagnosed in 76.0% of the study group and in 21.6% of patients of the comparison group (p < 0.001). The myocardial viral genome was observed more frequently in patients of the comparison group than in the study group (65.0% and 40.2%; p < 0.01). The diagnostic value of noninvasive markers of myocarditis was evaluated. The panel of anti-heart antibodies was the most important for identifying myocarditis: sensitivity was 81.5%, and the positive and negative predictive values were 75.0% and 60.5%. The diagnostic value of non-invasive markers of myocarditis was defined, and a diagnostic algorithm providing an individual assessment of the likelihood of myocarditis was developed. Conclusion: AHA have the greatest significance in the diagnosis of latent myocarditis in patients with 'idiopathic' arrhythmias and DCM. The use of a complex of noninvasive criteria allows the probability of myocarditis to be estimated and the indications for EMB to be determined.
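The sensitivity and predictive values quoted for the AHA panel are derived from a 2x2 table of the marker against the biopsy gold standard. The sketch below shows the standard computation; the counts are hypothetical, not the study's data.

```python
# How sensitivity and predictive values of a diagnostic marker are computed
# from a 2x2 table against a gold standard. Counts here are hypothetical.

def diagnostic_metrics(tp, fp, fn, tn):
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),  # positive predictive value
        "npv": tn / (tn + fn),  # negative predictive value
    }

m = diagnostic_metrics(tp=60, fp=20, fn=15, tn=25)
for name, value in m.items():
    print(f"{name}: {100 * value:.1f}%")
```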

Keywords: myocarditis, "idiopathic" arrhythmias, dilated cardiomyopathy, endomyocardial biopsy, viral genome, anti-heart antibodies

Procedia PDF Downloads 147
172 Fashion Performing/Fashioning Performances: Catwalks as Communication Tools between Market, Branding and Performing Art

Authors: V. Linfante

Abstract:

Catwalks are one of the key moments in fashion: the first and most relevant display where brands stage their collections, products, ideas, and style. The garment is 'the star' of the catwalk and must show itself not just as a product but as the result of a design process endured over several months. All contents developed within this process become ingredients for connecting scenography, music, lights, and direction into a unique fashion narrative. According to the spirit of different ages, fashion shows have been transformed and shaped into peculiar formats: from Pandoras to the presentations organized by Parisian couturiers, across the 'marathons' typical of the beginning of the modern fashion system, coming up to the present structure of fashion weeks, with their complex organization and related creative and technical businesses. The paper introduces the evolution of the fashion system through its unique process of seasonally staging and showing its production. It analyses the evolution of fashion shows from the intimacy of ballrooms at the beginning of the 20th century, passing through the enthusiastic attitude typical of the '70s and '80s, to finally depict our present. In this last scenario, catwalks are no longer a standard presentation of collections but have become one of the most exciting expressions of contemporary culture (and subcultures), going from sophisticated performances (such as Karl Lagerfeld's Chanel shows) to real artistic happenings (such as the events of Viktor&Rolf, Alexander McQueen, Off-White, Vetements, and Martin Margiela), often involving contemporary architecture, the digital world, technology, social media, performing art, and artists.

Keywords: branding, communication, fashion, new media, performing art

Procedia PDF Downloads 125
171 Study of Storms on the Javits Center Green Roof

Authors: Alexander Cho, Harsho Sanyal, Joseph Cataldo

Abstract:

A quantitative analysis of the different variables on both the South and North green roofs of the Jacob K. Javits Convention Center was undertaken to find mathematical relationships between net radiation and evapotranspiration (ET), average outside temperature, and lysimeter weight. Groups of datasets were analyzed, and the relationships were plotted on linear and semi-log graphs to find consistent relationships. Antecedent conditions for each rainstorm were also recorded and plotted against the volumetric water difference within the lysimeter. The first relation was the inverse parabolic relationship between the lysimeter weight and the net radiation and ET. The peaks and valleys of the lysimeter weight corresponded to valleys and peaks in the net radiation and ET, respectively, with the 8/22/15 and 1/22/16 datasets showing this trend. The U-shaped and inverse U-shaped plots of the two variables coincided, indicating an inverse relationship between them. Cross-variable relationships were examined through graphs with lysimeter weight as the dependent variable on the y-axis. Ten of the 16 plots of lysimeter weight vs. outside temperature had R² values > 0.9. Antecedent conditions were also recorded for rainstorms, categorized by the amount of precipitation accumulating during the storm. Plotted against the change in the volumetric water weight difference within the lysimeter, a logarithmic regression was found with large R² values. The datasets were compared using the Mann-Whitney U-test at a 5% significance level to determine whether they were statistically different; for all compared pairs, the U statistic did not support rejecting the null hypothesis, indicating that the datasets were not statistically different.
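The Mann-Whitney U comparison used on the lysimeter datasets can be sketched with SciPy. The two series below are synthetic stand-ins for two storms' measurements; the study's actual data are not reproduced here.

```python
# Sketch of the Mann-Whitney U comparison applied to two lysimeter datasets.
# The data are synthetic; they only stand in for two storms' measurements.
import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(0)
weights_a = rng.normal(loc=100.0, scale=5.0, size=60)  # one storm's series
weights_b = rng.normal(loc=101.0, scale=5.0, size=60)  # another storm's series

u_stat, p_value = mannwhitneyu(weights_a, weights_b, alternative="two-sided")
alpha = 0.05  # the 5% significance level used in the study
print(f"U = {u_stat:.0f}, p = {p_value:.3f}, "
      f"{'reject' if p_value < alpha else 'fail to reject'} H0 at 5%")
```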

Keywords: green roof, green infrastructure, Javits Center, evapotranspiration, net radiation, lysimeter

Procedia PDF Downloads 80
170 Effect of Repellent Coatings, Aerosol Protective Liners, and Lamination on the Properties of Chemical/Biological Protective Textiles

Authors: Natalie Pomerantz, Nicholas Dugan, Molly Richards, Walter Zukas

Abstract:

The primary research question to be answered for Chemical/Biological (CB) protective clothing is how to protect wearers from a range of chemical and biological threats in liquid, vapor, and aerosol form while reducing the thermal burden. Currently, CB protective garments are hot and heavy, and wearers are limited to short work times in order to prevent heat injury. This study demonstrates how to incorporate different levels of protection on a material level and modify fabric composites such that the thermal burden is reduced to such an extent that it approaches that of a standard duty uniform with no CB protection. CB protective materials are usually comprised of several fabric layers: a cover fabric with a liquid-repellent coating, a protective layer comprised of a carbon-based sorptive material or semi-permeable membrane, and a comfort next-to-skin liner. In order to reduce thermal burden, all of these layers were laminated together to form one fabric composite with no insulative air gap between layers. However, the elimination of the air gap also reduced the CB protection of the fabric composite. In order to increase protection in the laminated composite, different nonwoven aerosol-protective liners were added, and a super-repellent coating was applied to the cover fabric prior to lamination. Different adhesive patterns were investigated to determine the durability of the laminate with the super-repellent coating and the effect on air permeation. After evaluating the thermal, textile, and protective properties of the iterations of these fabric composites, it was found that the thermal burden of these materials was greatly reduced by decreasing the thermal resistance through the elimination of the air gap between layers. While the level of protection was reduced in laminate composites, the addition of a super-repellent coating increased protection against low-volatility agents without impacting thermal burden. Similarly, the addition of an aerosol-protective liner increased protection without reducing water vapor transport, depending on the nonwoven used; however, the air permeability was significantly decreased. The balance of all these properties and the exploration of the trade space between thermal burden and protection will be discussed.
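The thermal effect of lamination can be illustrated with a series-resistance sum: layers stacked in series add their thermal resistances, so removing the still-air gaps removes their insulating contribution. All resistance values below are hypothetical placeholders, not measurements from the study.

```python
# Series thermal resistance of a layered fabric composite (hypothetical
# values, in m^2*K/W). Laminating removes the insulating air gaps between
# layers, lowering total resistance and hence thermal burden.

layers = {"cover fabric": 0.010, "protective layer": 0.025, "comfort liner": 0.008}
air_gap = 0.040  # still air between unlaminated layers

r_unlaminated = sum(layers.values()) + 2 * air_gap  # a gap on each side of the middle layer
r_laminated = sum(layers.values())                  # gaps eliminated by lamination

print(f"unlaminated: {r_unlaminated:.3f}, laminated: {r_laminated:.3f} m^2*K/W")
```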

Keywords: aerosol protection, CBRNe protection, lamination, nonwovens, repellent coatings, thermal burden

Procedia PDF Downloads 332
169 Large Eddy Simulation with Energy-Conserving Schemes: Understanding Wind Farm Aerodynamics

Authors: Dhruv Mehta, Alexander van Zuijlen, Hester Bijl

Abstract:

Large Eddy Simulation (LES) numerically resolves the large energy-containing eddies of a turbulent flow while modelling the small dissipative eddies. On a wind farm, these large scales carry the energy that wind turbines extract and are also responsible for transporting the turbines' wakes, which may interact with downstream turbines and certainly with the atmospheric boundary layer (ABL). In this situation, it is important to conserve the energy that these wakes carry, which could be altered artificially through numerical dissipation brought about by the schemes used for spatial discretisation and temporal integration. Numerical dissipation has been reported to cause the premature recovery of turbine wakes, leading to an over-prediction of the power produced by wind farms. An energy-conserving scheme is free from numerical dissipation and ensures that the energy of the wakes is increased or decreased only by the action of molecular viscosity or the action of wind turbines (body forces). The aim is to create an LES package with energy-conserving schemes to simulate wind turbine wakes correctly, to gain insight into power production, wake meandering, etc. Such knowledge will be useful in designing more efficient wind farms with minimal wake interaction, which, if unchecked, could lead to major losses in energy production per unit area of the wind farm. For their research, the authors intend to use the Energy-Conserving Navier-Stokes code developed by the Energy Research Centre of the Netherlands.
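The defining property of an energy-conserving spatial discretization can be checked directly: the discrete advection operator should neither create nor destroy kinetic energy. The 1D periodic sketch below verifies this for the skew-symmetric form of the advective term; it is an illustration of the principle, not the ECN Navier-Stokes code itself.

```python
# 1D periodic sketch of why a skew-symmetric (energy-conserving) advection
# discretization adds no numerical dissipation: the advective term f(u)
# satisfies u . f(u) = 0, so it cannot change the discrete kinetic energy.
import numpy as np

n = 128
dx = 2 * np.pi / n
x = np.arange(n) * dx
u = np.sin(x) + 0.3 * np.cos(3 * x)  # an arbitrary periodic velocity field

def ddx(v):
    """Second-order central difference on a periodic grid (skew-symmetric)."""
    return (np.roll(v, -1) - np.roll(v, 1)) / (2 * dx)

# Skew-symmetric form of u*u_x: the average of advective and divergence forms.
adv = -0.5 * (u * ddx(u) + ddx(u * u))

# Contribution of advection to d/dt of the kinetic energy 0.5*sum(u^2)*dx:
energy_rate = np.dot(u, adv) * dx
print(f"energy production by advection: {energy_rate:.2e}")  # zero to round-off
```

Because the central-difference operator D satisfies Dᵀ = −D, the two halves of the skew-symmetric form cancel exactly in the energy balance, so any change in wake energy must come from viscosity or body forces, never from the scheme.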

Keywords: energy-conserving schemes, modelling turbulence, Large Eddy Simulation, atmospheric boundary layer

Procedia PDF Downloads 443
168 Methane versus Carbon Dioxide Mitigation Prospects

Authors: Alexander J. Severinsky, Allen L. Sessoms

Abstract:

Atmospheric carbon dioxide (CO₂) has dominated the discussion about the causes of climate change. This is a reflection of the 100-year time horizon that has become the norm adopted by the IPCC as the planning horizon. Recently, it has become clear that a 100-year time horizon is much too long, and yet almost all mitigation efforts, including those in the near-term horizon of 30 years, are geared toward it. In this paper, we show that, for a 30-year time horizon, methane (CH₄) is the greenhouse gas whose radiative forcing (RF) exceeds that of CO₂. In our analysis, we used the radiative forcing of greenhouse gases in the atmosphere, since it directly affects the temperature rise on Earth. In 2019, the radiative forcing of methane was ~2.5 W/m² and that of carbon dioxide ~2.1 W/m². Under a business-as-usual (BAU) scenario until 2050, such forcing would be ~2.8 W/m² and ~3.1 W/m², respectively. There is a substantial spread in the data for anthropogenic and natural methane emissions, as well as for CH₄ leakages from production to consumption. We estimated the minimum and maximum effects of reducing these leakages. Such action may reduce the annual radiative forcing of all CH₄ emissions by between ~15% and ~30%, which translates into a reduction of the RF by 2050 from ~2.8 W/m² to ~2.5 W/m² in the case of the minimum effect, and to ~2.15 W/m² in the case of the maximum. Under BAU, we found that the RF of CO₂ would increase from ~2.1 W/m² today to ~3.1 W/m² by 2050. We assumed a 50% reduction of anthropogenic emissions, applied linearly over the next 30 years, which would reduce the radiative forcing from ~3.1 W/m² to ~2.9 W/m². In the case of 'net zero', the other 50% of the reduction of anthropogenic emissions would have to come either from emission sources or directly from the atmosphere; the total reduction would then be from ~3.1 to ~2.7 W/m², or ~0.4 W/m². To achieve the same radiative forcing as in the scenario of maximum reduction of methane leakages (~2.15 W/m²), an additional reduction of the radiative forcing of CO₂ of approximately 2.7 − 2.15 = 0.55 W/m² would be required. This is a much larger value than expected from 'net zero'. In total, ~660 Gt of CO₂ would need to be removed from the atmosphere to match the maximum reduction of current methane leakages, and ~270 Gt to achieve 'net zero', amounting to over 900 Gt in total.
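The scenario arithmetic above can be laid out explicitly. The numbers below are exactly those quoted in the abstract; the code only reproduces the bookkeeping.

```python
# Radiative-forcing scenario arithmetic as quoted in the abstract (W/m^2).
co2_bau_2050 = 3.1
co2_net_zero_2050 = 2.7       # CO2 RF after 'net zero' anthropogenic cuts
ch4_max_leak_fix_2050 = 2.15  # CH4 RF under maximum leakage reduction

# Extra CO2 forcing reduction needed to match the best methane scenario:
extra_needed = co2_net_zero_2050 - ch4_max_leak_fix_2050
print(f"additional CO2 RF reduction required: {extra_needed:.2f} W/m^2")  # 0.55

# Corresponding atmospheric CO2 removals quoted in the abstract (Gt):
removal_match_methane, removal_net_zero = 660, 270
print(f"total removal: {removal_match_methane + removal_net_zero} Gt")  # 930
```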

Keywords: methane leakages, methane radiative forcing, methane mitigation, methane net zero

Procedia PDF Downloads 117
167 Knowledge Transfer among Cross-Functional Teams as a Continual Improvement Process

Authors: Sergio Mauricio Pérez López, Luis Rodrigo Valencia Pérez, Juan Manuel Peña Aguilar, Adelina Morita Alexander

Abstract:

The culture of continuous improvement in organizations is very important, as it represents a source of competitive advantage. This article discusses the transfer of knowledge between companies which formed cross-functional teams, using a dynamic model for knowledge creation as a framework. In addition, the article discusses the structure of cognitive assets in companies and the concept of "stickiness" (defined as an obstacle to the transfer of knowledge). The purpose of this analysis is to show that an improvement in the attitude of individual members of an organization creates opportunities, and that an exchange of information and knowledge leads to continuous improvements in the company as a whole. This article also discusses the importance of creating the proper conditions for sharing tacit knowledge. By narrowing gaps between people, mutual trust can be created, contributing to an increase in sharing. The concept of adapting knowledge to new environments is highlighted, as it is essential for companies to translate and modify information so that it can fit the context of receiving organizations. Adaptation ensures that the transfer process is carried out smoothly by preventing "stickiness". When developing the transfer process in cross-functional teams (as opposed to working groups), the team acquires the flexibility and responsiveness necessary to meet objectives. These cross-functional teams also generate synergy due to the array of different work backgrounds of their members. When synergy is established, a culture of continuous improvement is created.

Keywords: knowledge transfer, continuous improvement, teamwork, cognitive assets

Procedia PDF Downloads 298
166 Content Analysis of ‘Junk Food’ Content in Children’s TV Programmes: A Comparison of UK Broadcast TV and Video-On-Demand Services

Authors: Shreesh Sinha, Alexander B. Barker, Megan Parkin, Emma Wilson, Rachael L. Murray

Abstract:

Background and Objectives: Exposure to HFSS imagery is associated with the consumption of foods high in fat, sugar or salt (HFSS), and subsequently obesity, among young people. We report and compare the results of two content analyses, one of two popular terrestrial children's television channels in the UK and the other of a selection of children's programmes available on video-on-demand (VOD) streaming sites. Methods: Content analysis of three days' worth of programmes (including advertisements) on two popular children's television channels broadcast on UK television (CBeebies and Milkshake) as well as a sample of 40 highest-rated children's programmes available on the VOD platforms, Netflix and Amazon Prime, using 1-minute interval coding. Results: HFSS content was seen in 181 broadcasts (36%) and in 417 intervals (13%) on terrestrial television, 'Milkshake' had a significantly higher proportion of programmes/adverts which contained HFSS content than 'CBeebies'. In VOD platforms, HFSS content was seen in 82 episodes (72% of the total number of episodes), across 459 intervals (19% of the total number of intervals), with no significant difference in the proportion of programmes containing HFSS content between Netflix and Amazon Prime. Conclusions: This study demonstrates that HFSS content is common in both popular UK children's television channels and children's programmes on VOD services. Since previous research has shown that HFSS content in the media has an effect on HFSS consumption, children's television programmes broadcast either on TV or VOD services are likely to have an effect on HFSS consumption in children, and legislative opportunities to prevent this exposure are being missed.
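The channel comparison is a test of two proportions, commonly done with a chi-squared test on a 2x2 table. The per-channel counts below are hypothetical (the abstract reports only the pooled totals, 181 HFSS broadcasts out of roughly 503); they only illustrate the computation.

```python
# Sketch of comparing the proportion of HFSS-containing broadcasts between
# two channels with a chi-squared test. Per-channel counts are hypothetical;
# the abstract reports only pooled totals.
from scipy.stats import chi2_contingency

#                HFSS  no HFSS
table = [[115, 135],   # 'Milkshake' (hypothetical split)
         [ 66, 187]]   # 'CBeebies'  (hypothetical split)

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.1f}, dof = {dof}, p = {p:.4f}")
```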

Keywords: public health, junk food, children's TV, HFSS

Procedia PDF Downloads 66
165 Compliance of Systematic Reviews in Plastic Surgery with the PRISMA Statement: A Systematic Review

Authors: Seon-Young Lee, Harkiran Sagoo, Katherine Whitehurst, Georgina Wellstead, Alexander Fowler, Riaz Agha, Dennis Orgill

Abstract:

Introduction: Systematic reviews attempt to answer research questions by synthesising the data within primary papers. They are an increasingly important tool within evidence-based medicine, guiding clinical practice, future research and healthcare policy. We sought to determine the reporting quality of recent systematic reviews in plastic surgery. Methods: This systematic review was conducted in line with the Cochrane handbook, reported in line with the PRISMA statement and registered at the ResearchRegistry (UIN: reviewregistry18). The MEDLINE and EMBASE databases were searched for systematic reviews published in five major plastic surgery journals in 2013 and 2014. Screening, identification and data extraction were performed independently by two teams. Results: From an initial set of 163 articles, 79 met the inclusion criteria. The median PRISMA score was 16 out of 27 items (59.3%; range 6-26, 95% CI 14-17). Compliance varied widely across individual PRISMA items. It was poorest for items related to the use of a review protocol (item 5; 5%) and the presentation of data on risk of bias for each study (item 19; 18%), and highest for description of rationale (item 3; 99%), sources of funding and other support (item 27; 95%), and structured summary in the abstract (item 2; 95%). Conclusion: The reporting quality of systematic reviews in plastic surgery requires improvement. 'Hard-wiring' of compliance through journal submission systems, as well as improved education, awareness and a cohesive strategy among all stakeholders, is called for.
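The reported statistics (median total score, per-item compliance rates) come from a reviews-by-items binary extraction matrix. The sketch below shows that derivation on randomly generated data, not the study's extraction sheet.

```python
# How a median PRISMA score and per-item compliance rates are derived from a
# reviews-by-items binary matrix. The matrix is randomly generated here, not
# the study's extraction data.
import numpy as np

rng = np.random.default_rng(1)
n_reviews, n_items = 79, 27
compliance = rng.random((n_reviews, n_items)) < 0.6  # True = item reported

scores = compliance.sum(axis=1)             # total PRISMA score per review
item_rates = 100 * compliance.mean(axis=0)  # % of reviews reporting each item

print(f"median score: {np.median(scores):.0f}/{n_items}")
print(f"least-reported item: #{item_rates.argmin() + 1} ({item_rates.min():.0f}%)")
```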

Keywords: PRISMA, reporting quality, plastic surgery, systematic review, meta-analysis

Procedia PDF Downloads 268
164 Estimation of Endogenous Brain Noise from Brain Response to Flickering Visual Stimulation

Authors: Alexander N. Pisarchik, Parth Chholak

Abstract:

Intrinsic brain noise was estimated via magnetoencephalograms (MEG) recorded during the perception of flickering visual stimuli with frequencies of 6.67 and 8.57 Hz. First, we measured the mean phase difference between the flicker signal and the steady-state event-related field (SSERF) in the occipital area, where the brain response at the flicker frequencies and their harmonics appeared in the power spectrum. Then, we calculated the probability distribution of the phase fluctuations in the regions of frequency locking and computed its kurtosis. Since kurtosis is a measure of the distribution's sharpness, we suppose that inverse kurtosis is related to intrinsic brain noise. In our experiments, the kurtosis value varied among subjects from K = 3 to K = 5 for 6.67 Hz and from 2.6 to 4 for 8.57 Hz. The majority of subjects demonstrated leptokurtic distributions (K > 3), i.e., the distribution tails approached zero more slowly than Gaussian. In addition, we found a strong correlation between kurtosis and brain complexity measured as the correlation dimension, such that the MEGs of subjects with higher kurtosis exhibited lower complexity. The obtained results are discussed in the framework of nonlinear dynamics and complex network theories. Specifically, in a network of coupled oscillators, phase synchronization is mainly determined by two antagonistic factors: noise and the coupling strength. While noise worsens phase synchronization, coupling improves it. If we assume that each neuron and each synapse contribute to brain noise, a larger neuronal network should have stronger noise, and therefore phase synchronization should be worse, resulting in smaller kurtosis. The described method for brain noise estimation can be useful for the diagnostics of brain pathologies associated with abnormal brain noise.
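The kurtosis measure used here is the Pearson kurtosis, for which a Gaussian distribution gives K = 3 and heavier-tailed (leptokurtic) distributions give K > 3. The sketch below illustrates this on synthetic data standing in for the MEG-derived phase fluctuations.

```python
# Pearson kurtosis (Gaussian reference K = 3) as a sharpness measure of a
# phase-fluctuation distribution. Synthetic samples stand in for the
# MEG-derived phase differences.
import numpy as np
from scipy.stats import kurtosis

rng = np.random.default_rng(42)
gaussian_phase = rng.normal(size=100_000)
heavy_tailed_phase = rng.laplace(size=100_000)  # leptokurtic: K = 6 in theory

k_gauss = kurtosis(gaussian_phase, fisher=False)   # fisher=False -> Pearson K
k_heavy = kurtosis(heavy_tailed_phase, fisher=False)
print(f"Gaussian: K = {k_gauss:.2f}, Laplace: K = {k_heavy:.2f}")
```

A sharper, heavier-tailed phase distribution (larger K) corresponds, in the authors' interpretation, to lower intrinsic noise.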

Keywords: brain, flickering, magnetoencephalography, MEG, visual perception, perception time

Procedia PDF Downloads 111
163 Sports Business Services Model: A Research Model Study in Regional Sport Authority of Thailand

Authors: Siriraks Khawchaimaha, Sangwian Boonto

Abstract:

The Sport Authority of Thailand (SAT) is a state enterprise that promotes and supports all kinds of sports, both professional and amateur, for competition, and is administered under government policy by government officers. Consequently, all financial flows, whether cash inflows or cash outflows, are strictly committed to the government budget and limited to projects planned at least 12 to 16 months ahead of reality, resulting in inefficiencies in sport events, administration, and competitions. In order to remain competitive in sports challenges around the world, SAT needs its own sports business services model for each stadium, region, and set of athletes' competencies. Based on the HMK model of Khawchaimaha, S. (2007), this research study examines each of the 10 regional stadiums, detailing the root characteristics of fans, athletes, coaches, equipment and facilities, and stadiums. The research design is, firstly, an evaluation of external 'hardware' factors: competition and practice stadiums, playgrounds, facilities, and equipment. Secondly, an examination of the 'software': organization structure, staff and management, administrative model, rules, and practices, together with budget allocation and budget administration via operating and expenditure plans. The third step identifies issues and limitations that require an action plan for further development and support, or for discontinuing sports in which the region lacks competency. In the final step, the HMK model and the business model canvas of Alexander O. and Yves P. (2010) are used as templates for generating a Sports Business Services Model for each of the 10 SAT regional stadiums.

Keywords: HMK model, not for profit organization, sport business model, sport services model

Procedia PDF Downloads 282
162 Trading off Accuracy for Speed in Powerdrill

Authors: Filip Buruiana, Alexander Hall, Reimar Hofmann, Thomas Hofmann, Silviu Ganceanu, Alexandru Tudorica

Abstract:

In-memory column-stores make interactive analysis feasible for many big data scenarios. PowerDrill is a system used internally at Google for exploration of logs data. Even though it is a highly parallelized column-store and uses in-memory caching, interactive response times cannot be achieved for all datasets (note that it is common to analyze data with 50 billion records in PowerDrill). In this paper, we investigate two orthogonal approaches to optimize performance at the expense of an acceptable loss of accuracy. Both approaches can be implemented as outer wrappers around existing database engines, so they should be easily applicable to other systems. For the first optimization, we show that memory is the limiting factor in executing queries at speed and therefore explore possibilities to improve memory efficiency. We adapt some of the theory behind data sketches to reduce the size of particularly expensive fields in our largest tables by a factor of 4.5 compared to a standard compression algorithm. This saves 37% of the overall memory in PowerDrill and introduces a 0.4% relative error in the 90th percentile for results of queries with the expensive fields. We additionally evaluate the effects of sampling on accuracy and propose a simple heuristic for annotating individual result values as accurate (or not). Based on measurements of user behavior in our real production system, we show that these estimates are essential for interpreting intermediate results before final results are available. For a large set of queries, this effectively brings down the 95th latency percentile from 30 to 4 seconds.
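The idea of annotating sampled results as accurate or not can be sketched as follows: estimate a count from a uniform row sample and flag the estimate by its relative standard error. The 5% threshold and the Poisson-style error bound are illustrative heuristics, not PowerDrill's actual annotation rule.

```python
# Sketch of estimating a count from a uniform row sample and flagging the
# result as 'accurate' when its relative standard error is small. The 5%
# threshold is an illustrative heuristic, not PowerDrill's actual rule.
import math
import random

random.seed(7)
n_rows = 1_000_000
rows = [random.random() < 0.03 for _ in range(n_rows)]  # ~3% match a predicate

rate = 0.01                                  # 1% uniform sample
sample = [r for r in rows if random.random() < rate]
hits = sum(sample)
estimate = hits / rate                       # scale the sampled count back up

rel_se = 1 / math.sqrt(hits) if hits else float("inf")  # Poisson-style error
print(f"estimate = {estimate:.0f}, rel. SE = {rel_se:.1%}, "
      f"accurate: {rel_se < 0.05}")
```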

Keywords: big data, in-memory column-store, high-performance SQL queries, approximate SQL queries

Procedia PDF Downloads 229
161 A Content Analysis of ‘Junk Food’ Content in Children’s TV Programs: A Comparison of UK Broadcast TV and Video-On-Demand Services

Authors: Alexander B. Barker, Megan Parkin, Shreesh Sinha, Emma Wilson, Rachael L. Murray

Abstract:

Objectives: Exposure to HFSS imagery is associated with the consumption of foods high in fat, sugar, or salt (HFSS), and subsequently obesity, among young people. We report and compare the results of two content analyses, one of two popular terrestrial children's television channels in the UK and the other of a selection of children's programs available on video-on-demand (VOD) streaming sites. Design: Content analysis of three days' worth of programs (including advertisements) on two popular children's television channels broadcast on UK television (CBeebies and Milkshake) as well as a sample of the 40 highest-rated children's programs available on the VOD platforms Netflix and Amazon Prime, using 1-minute interval coding. Setting: United Kingdom. Participants: None. Results: HFSS content was seen in 181 broadcasts (36%) and in 417 intervals (13%) on terrestrial television; 'Milkshake' had a significantly higher proportion of programs/adverts containing HFSS content than 'CBeebies'. On VOD platforms, HFSS content was seen in 82 episodes (72% of the total number of episodes), across 459 intervals (19% of the total number of intervals), with no significant difference in the proportion of programs containing HFSS content between Netflix and Amazon Prime. Conclusions: This study demonstrates that HFSS content is common in both popular UK children's television channels and children's programs on VOD services. Since previous research has shown that HFSS content in the media has an effect on HFSS consumption, children's television programs broadcast either on TV or on VOD services are likely to have an effect on HFSS consumption in children, and legislative opportunities to prevent this exposure are being missed.

Keywords: public health, epidemiology, obesity, content analysis

Procedia PDF Downloads 149
160 Ontology-Driven Knowledge Discovery and Validation from Admission Databases: A Structural Causal Model Approach for Polytechnic Education in Nigeria

Authors: Bernard Igoche Igoche, Olumuyiwa Matthew, Peter Bednar, Alexander Gegov

Abstract:

This study presents an ontology-driven approach for knowledge discovery and validation from admission databases in Nigerian polytechnic institutions. The research aims to address the challenges of extracting meaningful insights from vast amounts of admission data and utilizing them for decision-making and process improvement. The proposed methodology combines the knowledge discovery in databases (KDD) process with a structural causal model (SCM) ontological framework. The admission database of Benue State Polytechnic Ugbokolo (Benpoly) is used as a case study. The KDD process is employed to mine and distill knowledge from the database, while the SCM ontology is designed to identify and validate the important features of the admission process. The SCM validation is performed using the conditional independence test (CIT) criteria, and an algorithm is developed to implement the validation process. The identified features are then used for machine learning (ML) modeling and prediction of admission status. The results demonstrate the adequacy of the SCM ontological framework in representing the admission process and the high predictive accuracies achieved by the ML models, with k-nearest neighbors (KNN) and support vector machine (SVM) achieving 92% accuracy. The study concludes that the proposed ontology-driven approach contributes to the advancement of educational data mining and provides a foundation for future research in this domain.
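The conditional independence test (CIT) criterion used to validate the SCM can be implemented in several ways; one common approach for continuous variables is a partial-correlation test, sketched below. This is an illustration of the criterion on synthetic data, not the authors' algorithm or the Benpoly dataset.

```python
# A minimal conditional-independence check via partial correlation: regress X
# and Y on Z, then correlate the residuals. One common way to implement a CIT
# criterion; a sketch, not the authors' algorithm.
import numpy as np

def partial_corr(x, y, z):
    """Correlation of x and y after removing the linear effect of z."""
    Z = np.column_stack([np.ones_like(z), z])
    rx = x - Z @ np.linalg.lstsq(Z, x, rcond=None)[0]
    ry = y - Z @ np.linalg.lstsq(Z, y, rcond=None)[0]
    return np.corrcoef(rx, ry)[0, 1]

rng = np.random.default_rng(3)
z = rng.normal(size=5000)          # e.g. a common admission-score variable
x = 2 * z + rng.normal(size=5000)  # both x and y are driven only by z
y = -z + rng.normal(size=5000)

print(f"corr(x, y)       = {np.corrcoef(x, y)[0, 1]:.2f}")  # strongly nonzero
print(f"partial corr | z = {partial_corr(x, y, z):.2f}")    # near 0: X indep Y given Z
```

Edges of the SCM whose implied conditional independencies fail such a test would be flagged for revision before the features are passed to the ML models.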

Keywords: admission databases, educational data mining, machine learning, ontology-driven knowledge discovery, polytechnic education, structural causal model

Procedia PDF Downloads 22
159 Modified Clusterwise Regression for Pavement Management

Authors: Mukesh Khadka, Alexander Paz, Hanns de la Fuente-Mella

Abstract:

Typically, pavement performance models are developed in two steps: (i) pavement segments with similar characteristics are grouped together to form a cluster, and (ii) the corresponding performance models are developed using statistical techniques. A challenge is to select the characteristics that define clusters and the segments associated with them. If inappropriate characteristics are used, clusters may include homogeneous segments with different performance behavior or heterogeneous segments with similar performance behavior. The prediction accuracy of performance models can be improved by grouping the pavement segments into more uniform clusters using both characteristics and a performance measure. This grouping is not always possible due to limited information, and it is impractical to include all the potentially significant factors because some of them are unobserved or difficult to measure. The historical performance of pavement segments can be used as a proxy to incorporate the effect of the missing significant factors in the clustering process. The current state of the art proposes Clusterwise Linear Regression (CLR) to determine the pavement clusters and the associated performance models simultaneously. CLR incorporates the effect of significant factors as well as a performance measure. In this study, a mathematical program was formulated for CLR models including multiple explanatory variables. Pavement data collected recently over the entire state of Nevada were used. The International Roughness Index (IRI) was used as the pavement performance measure because it serves as a unified standard that is widely accepted for evaluating pavement performance, especially in terms of riding quality. Results illustrate the advantage of using CLR. Previous studies have used CLR with experimental data; this study uses actual field data collected across a variety of environmental, traffic, design, and construction and maintenance conditions.
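The core idea of clusterwise linear regression can be sketched as an alternating procedure: assign each segment to the regression line that fits it best, then refit each cluster's line on its members. The two-cluster example below uses synthetic data with a hypothetical IRI-vs-age reading; it illustrates the idea, not the study's mathematical program.

```python
# Minimal two-cluster clusterwise linear regression: alternate (i) assigning
# each segment to the line with the smaller residual and (ii) refitting each
# line on its members. Synthetic data; a sketch of the idea, not the study's
# optimization formulation.
import numpy as np

rng = np.random.default_rng(5)
n = 200
x = rng.uniform(0, 10, n)                 # e.g. pavement age (hypothetical)
true_cluster = rng.integers(0, 2, n)
y = np.where(true_cluster == 0, 1.0 + 0.5 * x, 4.0 - 0.2 * x)  # IRI-like response
y += rng.normal(scale=0.2, size=n)

# Initialise by splitting on the sign of the residual from one global fit.
global_fit = np.polyfit(x, y, 1)
assign = (y > np.polyval(global_fit, x)).astype(int)

for _ in range(20):
    coefs = [np.polyfit(x[assign == k], y[assign == k], 1)
             if np.any(assign == k) else global_fit for k in (0, 1)]
    resid = np.stack([np.abs(y - np.polyval(c, x)) for c in coefs])
    assign = resid.argmin(axis=0)

for k, c in enumerate(coefs):
    print(f"cluster {k}: response ~ {c[1]:.2f} + {c[0]:.2f} * x")
```

The two fitted lines jointly fit the mixed population better than any single regression, which is the advantage CLR offers over clustering and regression done separately.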

Keywords: clusterwise regression, pavement management system, performance model, optimization

Procedia PDF Downloads 226
158 More Precise: Patient-Reported Outcomes after Stroke

Authors: Amber Elyse Corrigan, Alexander Smith, Anna Pennington, Ben Carter, Jonathan Hewitt

Abstract:

Background and Purpose: Morbidity secondary to stroke is highly heterogeneous, but it is important to both patients and clinicians in post-stroke management and adjustment to life after stroke. Post-stroke morbidity has been poorly measured, both clinically and from the patient perspective; patient-reported outcome measures (PROs) help close this knowledge gap in morbidity assessment. The primary aim of this study was to consider the association between PRO outcomes and stroke predictors. Methods: A multicenter prospective cohort study assessed 549 stroke patients at 19 hospital sites across England and Wales during 2019. Following a stroke event, demographic, clinical, and PRO measures were collected. The prevalence of morbidity within PRO measures was calculated with associated 95% confidence intervals. Predictors of domain outcome were calculated using a multilevel generalized linear model. Associated P-values and 95% confidence intervals are reported. Results: Data were collected from 549 participants, 317 men (57.7%) and 232 women (42.3%), with ages ranging from 25 to 97 (mean 72.7). PRO morbidity was high post-stroke: 93.2% of the cohort reported post-stroke PRO morbidity. Previous stroke, diabetes, and gender were associated with worse patient-reported outcomes across both the physical and cognitive domains. Conclusions: This large-scale multicenter cohort study illustrates the high proportion of morbidity captured in PRO measures. Further, we demonstrate key predictors of adverse outcomes (diabetes, previous stroke, and gender), congruent with known clinical predictors. The PRO has been demonstrated to be an informative and useful tool when considering patient-reported outcomes after stroke, with wider implications for the use of PROs in clinical management. Future longitudinal follow-up with PROs is needed to assess associations with long-term morbidity.
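The "prevalence with associated 95% confidence intervals" step can be illustrated with a Wilson score interval for the headline figure (93.2% of 549 participants). The Wilson formula is one standard choice for a binomial proportion; the abstract does not state which interval the authors used:

```python
import math

def wilson_ci(successes, n, z=1.96):
    """95% Wilson score interval for a binomial proportion."""
    p = successes / n
    denom = 1.0 + z * z / n
    centre = (p + z * z / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z * z / (4 * n * n)) / denom
    return centre - half, centre + half

# 93.2% of the 549 participants reported post-stroke PRO morbidity
n = 549
k = round(0.932 * n)   # about 512 participants
lo, hi = wilson_ci(k, n)
```

With these inputs the interval is roughly 91% to 95%, i.e. the high PRO morbidity estimate is tightly bounded at this sample size.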

Keywords: morbidity, patient-reported outcome, PRO, stroke

Procedia PDF Downloads 103
157 Utilization of Standard Paediatric Observation Chart to Evaluate Infants under Six Months Presenting with Non-Specific Complaints

Authors: Michael Zhang, Nicholas Marriage, Valerie Astle, Marie-Louise Ratican, Jonathan Ash, Haddijatou Hughes

Abstract:

Objective: Young infants are often brought to the Emergency Department (ED) with a variety of complaints, some of which are non-specific and present a diagnostic challenge to the attending clinician. Whilst invasive investigations such as blood tests and lumbar puncture are necessary in some cases to exclude serious infections, some basic clinical tools, in addition to a thorough clinical history, can be useful to assess the risk of serious conditions in these young infants. This study aimed to examine the utilization of one such clinical tool. Methods: This retrospective observational study examined the medical records of infants under 6 months presenting to a mixed urban ED between January 2013 and December 2014. Infants deemed to have non-specific complaints or diagnoses by the emergency clinicians were selected for analysis; those with clear systemic diagnoses were excluded. Among all relevant clinical information and investigation results, the utilization of the Standard Paediatric Observation Chart (SPOC) was particularly scrutinized in these medical records. This chart was developed by expert clinicians in the local health department. It categorizes important clinical signs into color-coded zones as a visual cue to the serious implications of certain abnormalities. An infant is regarded as SPOC positive when it fulfills one red-zone or two yellow-zone criteria, prompting the attending clinician to investigate and treat for potentially serious conditions accordingly. Results: Eight hundred and thirty-five infants met the inclusion criteria for this project. Those admitted to the hospital for further management were more likely to be SPOC positive than the discharged infants (odds ratio: 12.26, 95% CI: 8.04 - 18.69). Similarly, the sepsis alert criteria on the SPOC were positive in a higher percentage of patients with serious infections (56.52%) than in those with mild conditions (15.89%) (p < 0.001). The SPOC sepsis criteria had a sensitivity of 56.5% (95% CI: 47.0% - 65.7%) and a moderate specificity of 84.1% (95% CI: 80.8% - 87.0%) for identifying serious infections. Applied to this infant population, with a 17.4% prevalence of serious infection, the positive predictive value was only 42.8% (95% CI: 36.9% - 49.0%), but the negative predictive value was high at 90.2% (95% CI: 88.1% - 91.9%). Conclusions: The Standard Paediatric Observation Chart has proven a useful tool in clinical practice to help identify and manage sick young infants in the ED effectively.
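The predictive values quoted above follow from sensitivity, specificity and prevalence via Bayes' rule, and the abstract's figures can be reproduced directly (numbers taken from the abstract; this is plain arithmetic, not the authors' code):

```python
def predictive_values(sens, spec, prev):
    """Bayes' rule: convert sensitivity and specificity, plus disease
    prevalence, into positive and negative predictive values."""
    ppv = sens * prev / (sens * prev + (1 - spec) * (1 - prev))
    npv = spec * (1 - prev) / (spec * (1 - prev) + (1 - sens) * prev)
    return ppv, npv

# figures reported in the abstract for the SPOC sepsis criteria
ppv, npv = predictive_values(sens=0.565, spec=0.841, prev=0.174)
```

This reproduces the reported 42.8% PPV and 90.2% NPV, and makes explicit why a moderately specific test yields a low PPV at 17.4% prevalence.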

Keywords: clinical tool, infants, non-specific complaints, Standard Paediatric Observation Chart

Procedia PDF Downloads 225
156 Intelligent Fault Diagnosis for the Connection Elements of Modular Offshore Platforms

Authors: Jixiang Lei, Alexander Fuchs, Franz Pernkopf, Katrin Ellermann

Abstract:

Within the Space@Sea project, funded by the Horizon 2020 program, an island consisting of multiple platforms was designed. The platforms are connected by ropes and fenders. The connection is critical with respect to the safety of the whole system; therefore, fault detection systems that could detect early warning signs of a possible failure in the connection elements are investigated. Previously, a model-based method using an Extended Kalman Filter was developed to detect reductions in rope stiffness. This method detected several types of faults reliably, but other types were much more difficult to detect. Furthermore, the model-based method is sensitive to environmental noise: when the wave height is low, a long time is needed to detect a fault and the accuracy is not always satisfactory. It is therefore necessary to develop a more accurate and robust technique that can detect all rope faults under a wide range of operational conditions. Inspired by this work on the Space@Sea design, we introduce a fault diagnosis method based on deep neural networks. Our method can not only detect rope degradation using the acceleration data from each platform but also estimate the contributions of the individual acceleration sensors using methods from explainable AI. To adapt to different operational conditions, the domain adaptation technique DANN (Domain-Adversarial Neural Network) is applied. The proposed model can accurately estimate rope degradation under a wide range of environmental conditions and helps users understand the relationship between the output and the contributions of each acceleration sensor.
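The central mechanism of DANN is a gradient reversal layer between the feature extractor and the domain classifier: the forward pass is the identity, while the backward pass flips the sign of the gradient so the learned features become domain-invariant. A minimal sketch of just this layer (illustrative only; the study's actual networks are not described in the abstract):

```python
import numpy as np

class GradientReversal:
    """Identity in the forward pass; gradient scaled by -lam in the
    backward pass, so the upstream feature extractor is pushed to
    *confuse* the domain classifier rather than help it."""
    def __init__(self, lam=1.0):
        self.lam = lam

    def forward(self, x):
        return x

    def backward(self, grad_output):
        return -self.lam * grad_output

grl = GradientReversal(lam=0.5)
features = np.array([1.0, -2.0, 3.0])
out = grl.forward(features)        # features pass through unchanged
grad = grl.backward(np.ones(3))    # gradient comes back reversed and scaled
```

In a full DANN, this reversed gradient flows from the domain classifier into the feature extractor, which here would encourage features that transfer across sea states and wave heights.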

Keywords: fault diagnosis, deep learning, domain adaptation, explainable AI

Procedia PDF Downloads 151
155 Social Media Idea Ontology: A Concept for Semantic Search of Product Ideas in Customer Knowledge through User-Centered Metrics and Natural Language Processing

Authors: Martin Häusl, Maximilian Auch, Johannes Forster, Peter Mandl, Alexander Schill

Abstract:

In order to survive on the market, companies must constantly develop improved and new products. These products are designed to serve the needs of their customers in the best possible way. The creation of new products is also called innovation and is primarily driven by a company's internal research and development department. However, a new approach has been emerging for some years now that involves external knowledge in the innovation process. This approach, called open innovation, identifies customer knowledge as the most important source in the innovation process. This paper presents a concept for using social media posts as an external source to support the open innovation approach in its initial phase, the ideation phase. For this purpose, the social media posts are semantically structured with the help of an ontology, and their authors are evaluated using graph-theoretical metrics such as density. For the structuring and evaluation of relevant social media posts, we also use techniques from Natural Language Processing, e.g., Named Entity Recognition, specific dictionaries, triple taggers, and part-of-speech taggers. The selection and evaluation of the tools used are discussed in this paper. Using our ontology and metrics to structure social media posts enables users to semantically search these posts for new product ideas and thus gain improved insight into external knowledge sources such as customer needs.
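The graph-theoretical author evaluation can be illustrated with the density metric: for a simple undirected interaction graph, density is the share of possible author-to-author links that actually occur. The author names and edges below are hypothetical:

```python
def graph_density(nodes, edges):
    """Density of a simple undirected graph: 2|E| / (|V| * (|V| - 1))."""
    n = len(nodes)
    if n < 2:
        return 0.0
    return 2.0 * len(edges) / (n * (n - 1))

# hypothetical reply/mention graph among four social media authors
authors = {"a1", "a2", "a3", "a4"}
interactions = {("a1", "a2"), ("a1", "a3"), ("a2", "a3")}
d = graph_density(authors, interactions)   # 3 of the 6 possible links occur
```

A density near 1 would indicate a tightly connected author community whose posts reinforce each other, while a density near 0 indicates isolated contributors.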

Keywords: idea ontology, innovation management, semantic search, open information extraction

Procedia PDF Downloads 164
154 Knowledge Management Barriers: A Statistical Study of Hardware Development Engineering Teams within Restricted Environments

Authors: Nicholas S. Norbert Jr., John E. Bischoff, Christopher J. Willy

Abstract:

Knowledge Management (KM) is globally recognized as a crucial element in securing competitive advantage: it builds and maintains organizational memory, codifies and protects intellectual capital and business intelligence, and provides mechanisms for collaboration and innovation. KM frameworks and approaches have been developed that identify critical success factors for conducting KM in numerous industries, from scientific to business, and for organizations ranging from small groups to large enterprises. However, engineering and technical teams operating within restricted environments are subject to unique barriers and KM challenges that cannot be directly treated using the approaches and tools prescribed for other industries. This research identifies barriers to conducting KM within Hardware Development Engineering (HDE) teams and statistically compares their significance with respect to the four KM pillars of organization, technology, leadership, and learning. HDE teams suffer from restrictions on knowledge sharing (KS) due to classification of information (national security risks), customer proprietary restrictions (non-disclosure agreements covering designs), the types of knowledge involved, the complexity of the knowledge to be shared, and knowledge-seeker expertise. As KM has evolved, leveraging information technology (IT) and web-based tools and approaches from Web 1.0 to Enterprise 2.0, it may also leverage emergent tools and analytics, including expert locators and hybrid recommender systems, to enable KS across the barriers of these technical teams. The research statistically tests the hypothesis that KM barriers for HDE teams affect the general set of expected benefits of a KM system identified in previous research. If correlations are identified, generalizations of success factors and approaches may also be derived for HDE teams. Expert elicitation will be conducted using a questionnaire hosted on the internet and delivered to a panel of experts including engineering managers, principal and lead engineers, senior systems engineers, and knowledge management experts. The questionnaire feedback will be processed using analysis of variance (ANOVA) to identify and rank statistically significant barriers for HDE teams within the four KM pillars. Subsequently, KM approaches will be recommended for upholding the KM pillars within the restricted environments of HDE teams.
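The planned ANOVA step reduces to comparing between-group and within-group variance. A minimal sketch with hypothetical Likert-scale ratings of one KM barrier by three respondent roles (the ratings and groups here are invented, not questionnaire data from the study):

```python
def one_way_anova_f(groups):
    """F statistic for a one-way ANOVA: mean square between groups
    divided by mean square within groups."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = sum(sum(g) for g in groups) / n
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# hypothetical 1-5 Likert ratings of one barrier by three respondent roles
ratings = [[5, 4, 5, 4],   # engineering managers
           [3, 3, 2, 3],   # lead engineers
           [4, 4, 3, 4]]   # systems engineers
f = one_way_anova_f(ratings)
```

A large F relative to the critical value for (k-1, n-k) degrees of freedom would indicate that the roles rate this barrier significantly differently.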

Keywords: engineering management, knowledge barriers, knowledge management, knowledge sharing

Procedia PDF Downloads 244
153 Evidence Theory Enabled Quickest Change Detection Using Big Time-Series Data from Internet of Things

Authors: Hossein Jafari, Xiangfang Li, Lijun Qian, Alexander Aved, Timothy Kroecker

Abstract:

Traditionally in sensor networks, and recently in the Internet of Things, numerous heterogeneous sensors are deployed in a distributed manner to monitor a phenomenon that can often be modeled by an underlying stochastic process. The big time-series data collected by the sensors must be analyzed to detect changes in the stochastic process as quickly as possible with a tolerable false alarm rate. However, sensors may have different accuracies and sensitivity ranges, and they decay over time. As a result, the big time-series data collected by the sensors contain uncertainties and are sometimes conflicting. In this study, we present a framework that takes advantage of the capabilities of Evidence Theory (a.k.a. Dempster-Shafer and Dezert-Smarandache Theories) for representing and managing uncertainty and conflict, enabling fast change detection and effective handling of complementary hypotheses. Specifically, Kullback-Leibler divergence is used as the similarity metric to calculate the distances between the estimated current distribution and the pre- and post-change distributions. Mass functions are then calculated, and the related combination rules are applied to combine the mass values across all sensors. Furthermore, we apply the method to estimate the minimum number of sensors that need to be combined, improving computational efficiency. A cumulative sum (CUSUM) test is then applied to the ratio of pignistic probabilities to detect and declare the change for decision-making purposes. Simulation results using both synthetic data and real data from an experimental setup demonstrate the effectiveness of the presented schemes.
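The mass-combination step can be sketched with Dempster's rule over a two-hypothesis frame ("change" vs. "no change"), with each sensor contributing one mass function; the sensor masses below are invented for illustration and are not values from the study:

```python
from itertools import product

def dempster_combine(m1, m2):
    """Dempster's rule of combination for two mass functions
    (dict: frozenset of hypotheses -> mass). Mass assigned to
    conflicting (empty-intersection) pairs is normalised away."""
    combined, conflict = {}, 0.0
    for (a, ma), (b, mb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + ma * mb
        else:
            conflict += ma * mb
    return {h: v / (1.0 - conflict) for h, v in combined.items()}

CH, NC = frozenset({"change"}), frozenset({"no-change"})
TH = CH | NC   # full frame: the sensor is undecided
# two sensors with different confidence that a change has occurred
s1 = {CH: 0.6, NC: 0.1, TH: 0.3}
s2 = {CH: 0.7, NC: 0.2, TH: 0.1}
fused = dempster_combine(s1, s2)
```

Combining more sensors is just a fold of this rule; the fused masses would then be converted to pignistic probabilities and fed to the CUSUM test.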

Keywords: CUSUM, evidence theory, KL divergence, quickest change detection, time series data

Procedia PDF Downloads 303
152 Noncovalent Antibody-Nanomaterial Conjugates: A Simple Approach to Produce Targeted Nanomedicines

Authors: Nicholas Fletcher, Zachary Houston, Yongmei Zhao, Christopher Howard, Kristofer Thurecht

Abstract:

One promising approach to enhancing nanomedicine therapeutic efficacy is to include a targeting agent, such as an antibody, to increase accumulation at the tumor site. However, the application of such targeted nanomedicines remains limited, in part due to the difficulties involved in conjugating biomolecules to synthetic nanomaterials. One approach recently developed to overcome this has been to engineer bispecific antibodies (BsAbs) with dual specificity, whereby one portion binds to methoxy polyethyleneglycol (mPEG) epitopes present on synthetic nanomedicines, while the other binds to molecular disease markers of interest. In this way, noncovalent complexes of a nanomedicine core, comprising a hyperbranched polymer (HBP) of primarily mPEG, decorated with targeting ligands can be produced by simple mixing. Further work in this area has demonstrated that such complexes targeting the breast cancer marker epidermal growth factor receptor (EGFR) show enhanced binding to tumor cells both in vitro and in vivo. Indeed, the enhanced accumulation at the tumor site resulted in improved therapeutic outcomes compared to untargeted nanomedicines and free chemotherapeutics. The current work on these BsAb-HBP conjugates focuses on further probing antibody-nanomaterial interactions and demonstrating broad applicability to a range of cancer types. Herein we report BsAb-HBP materials targeted towards prostate-specific membrane antigen (PSMA) and a study of their behavior in vivo using ⁸⁹Zr positron emission tomography (PET) in a dual-tumor prostate cancer xenograft model. In this model, mice bearing both PSMA+ and PSMA- tumors allow PET imaging to discriminate between nonspecific and targeted uptake in tumors, and to better quantify the increased accumulation following BsAb conjugation. Also examined is the potential for formation of these targeted complexes in situ following injection of the individual components, the aim being to avoid the undesirable clearance of proteinaceous complexes upon injection, which limits the available therapeutic. Ultimately, these results demonstrate BsAb-functionalized nanomaterials as a powerful and versatile approach for producing targeted nanomedicines for a variety of cancers.

Keywords: bioengineering, cancer, nanomedicine, polymer chemistry

Procedia PDF Downloads 113
151 Impacts of Commercial Honeybees on Native Butterflies in High-Elevation Meadows in Utah, USA

Authors: Jacqueline Kunzelman, Val Anderson, Robert Johnson, Nicholas Anderson, Rebecca Bates

Abstract:

In an effort to protect honeybees from colony collapse disorder, beekeepers are filing for government permits to use natural lands as summer pasture for honeybees under the multiple-use management regime in the United States. Utilizing natural landscapes in high mountain ranges may help strengthen honeybee colonies, as this natural setting is generally devoid of the chemical pollutants and pesticides found in agricultural and urban settings. However, the introduction of a competitive species could greatly impact the native species occupying these natural landscapes. While honeybees and butterflies have different life histories, behavior, and foraging strategies, they compete for the same nectar resources. Few, if any, studies have focused on the potential population effects of commercial honeybees on native butterfly abundance and diversity. This study attempts to observe this impact using a paired before-after control-impact (BACI) design. Over the course of two years, malaise trap samples were collected every week during the months of the flowering season in two similar areas separated by 11 kilometers. Each area contained nine malaise trap sites for replication. In the first year, samples were taken to analyze and establish trends within the pollinating communities. In the second year, honeybees were introduced to only one of the two areas, and a change in trends between the two areas was assessed. Contrary to the original hypothesis, the resulting observation was an overall significant increase in mean butterfly abundance in the impact area after honeybees were introduced, while the control area remained relatively stable. This overall increase in abundance over the season can be attributed to an increase in butterflies during the first and second periods of data collection, when populations were near their peak. Several potential explanations are: 1) honeybees are deterring a natural predator or competitor of butterflies that previously limited population growth; 2) honeybees are consuming resources regularly used by butterflies, which may extend butterfly foraging time and consequently their capture rates; 3) environmental factors such as the number of rainy days were inconsistent between the control and impact areas, biasing capture rates. This ongoing research will help determine the suitability of high mountain ranges for the summer pasturing of honeybees and the population impacts on many different pollinators.
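The BACI effect estimate underlying this design is a difference-in-differences: the before-to-after change at the impact site minus the change at the control site. A minimal sketch with hypothetical trap counts (the numbers are invented, not the study's data):

```python
def baci_effect(control_before, control_after, impact_before, impact_after):
    """BACI effect: change at the impact site minus change at the
    control site (a difference-in-differences)."""
    return (impact_after - impact_before) - (control_after - control_before)

# hypothetical mean butterfly counts per malaise trap per week
effect = baci_effect(control_before=12.0, control_after=12.5,
                     impact_before=11.8, impact_after=15.3)
```

A positive effect, as observed in this study, means abundance rose more at the impact site than any background trend at the control site would explain.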

Keywords: butterfly, competition, honeybee, pollinator

Procedia PDF Downloads 119
150 Photocaged Carbohydrates: Versatile Tools for Biotechnological Applications

Authors: Claus Bier, Dennis Binder, Alexander Gruenberger, Dagmar Drobietz, Dietrich Kohlheyer, Anita Loeschcke, Karl Erich Jaeger, Thomas Drepper, Joerg Pietruszka

Abstract:

Light-absorbing chromophoric systems are important optogenetic tools for biotechnological and biophysical investigations. Processes such as fluorescence or photolysis, which play a central role in the life sciences, can be triggered by the light absorption of chromophores. Photocaged compounds belong to such chromophoric systems: their photo-labile protecting groups enable them to release biologically active substances with high temporal and spatial resolution. The properties of photocaged compounds are determined by the characteristics of the caging group as well as those of the linked effector molecule. In our research, we work with different types of photo-labile protecting groups and various effector molecules, giving us access to a large library of caged compounds. Depending on the caged effector molecule, a nearly limitless number of biological systems can be addressed. Our main interest focuses on photocaging carbohydrates (e.g., arabinose) and their derivatives as effector molecules. Based on the resulting photocaged compounds, precisely controlled photoinduced gene expression will give us access to studies of numerous biotechnological and synthetic biology applications. It could be shown that the regulation of gene expression via light is possible with photocaged carbohydrates, achieving higher-order control over these processes. With the one-step cleavable photocaged carbohydrate, homogeneous expression was achieved in comparison to free carbohydrates.

Keywords: bacterial gene expression, biotechnology, caged compounds, carbohydrates, optogenetics, photo-removable protecting group

Procedia PDF Downloads 193
149 Biological Studies of N-O Donor 4-Acypyrazolone Heterocycle and Its Pd/Pt Complexes of Therapeutic Importance

Authors: Omoruyi Gold Idemudia, Alexander P. Sadimenko

Abstract:

The synthesis of N-heterocycles with novel properties and broad-spectrum biological activities, which may become alternative medicinal drugs, has been attracting considerable research attention due to the limitations of existing drugs, such as disease resistance and toxicity. Acylpyrazolones have been employed as pharmaceuticals as well as analytical reagents, and their coordination complexes with transition metal ions are well established. Via condensation reactions with amines, acylpyrazolone ketones form azomethines, a superior group of compounds with stronger chelating ability. 4-Propyl-3-methyl-1-phenyl-2-pyrazolin-5-one was reacted with phenylhydrazine to obtain a new phenylhydrazone, which was further reacted with aqueous solutions of palladium and platinum salts in an effort towards the discovery of transition-metal-based synthetic drugs. The compounds were characterized by means of analytical and spectroscopic methods, thermogravimetric analysis (TGA), and X-ray crystallography. 4-Propyl-3-methyl-1-phenyl-2-pyrazolin-5-one phenylhydrazone crystallizes in a triclinic crystal system with a P-1 (No. 2) space group based on X-ray crystallography. The bidentate ON ligand formed a square planar geometry on coordinating with the metal ions, based on FTIR, electronic and NMR spectra as well as magnetic moments. The reported compounds showed antibacterial activities against the nominated bacterial isolates using the disc diffusion technique at 20 mg/ml in triplicate. The metal complexes exhibited better antibacterial activity, with the platinum complex having an MIC value of 0.63 mg/ml. Similarly, the ligand and complexes also showed antioxidant scavenging properties against the 2,2-diphenyl-1-picrylhydrazyl (DPPH) radical at 0.5 mg/ml relative to ascorbic acid (the standard drug).

Keywords: acylpyrazolone, antibacterial studies, metal complexes, phenylhydrazone, spectroscopy

Procedia PDF Downloads 225
148 A Review on Benzo(a)pyrene Emission Factors from Biomass Combustion

Authors: Franziska Klauser, Manuel Schwabl, Alexander Weissinger, Christoph Schmidl, Walter Haslinger, Anne Kasper-Giebl

Abstract:

Benzo(a)pyrene (BaP) is the most widely investigated representative of the Polycyclic Aromatic Hydrocarbons (PAH) as well as one of the most toxic compounds in this group. Since 2013, the European Union has applied a limit value for the BaP concentration in ambient air, set to a yearly average of 1 ng m⁻³. Several reports show that this threshold is regularly exceeded even in some regions where industry and traffic have only a minor impact. This is taken as evidence that biomass combustion for heating purposes contributes significantly to BaP pollution. Several investigations of the BaP emission behavior of biomass combustion furnaces have already been carried out, mostly focusing on a single aspect such as the influence of wood type, operation mode, or technology type. However, an overarching view of BaP emission patterns from biomass combustion, aggregating determined values including those from recent studies, has not been presented so far. Combining determined values allows a better understanding of the BaP emission behavior of biomass combustion. In this work, the review conclusions are drawn from the combination of outcomes from different publications. Two examples show that technical progress leads to 10- to 100-fold lower BaP emissions from modern furnaces compared to old technologies of an equivalent type. It was also indicated that operation with pellets or wood chips exhibits clearly lower BaP emission factors than operation with log wood, although the BaP emission level of automatic furnaces is strongly affected by the mode of operation. This work delivers an overview of BaP emission factors from different biomass combustion appliances, different operation modes, and the combustion of different fuel and wood types. The main impact factors are identified, and suggestions for low-BaP-emission biomass combustion are derived. Finally, the fields of investigation concerning BaP emissions from biomass combustion that appear most important to clarify are suggested.

Keywords: benzo(a)pyrene, biomass, combustion, emission, pollution

Procedia PDF Downloads 334
147 Long Term Survival after a First Transient Ischemic Attack in England: A Case-Control Study

Authors: Padma Chutoo, Elena Kulinskaya, Ilyas Bakbergenuly, Nicholas Steel, Dmitri Pchejetski

Abstract:

Transient ischaemic attacks (TIAs) are warning signs for future strokes: TIA patients are at increased risk of stroke and cardiovascular events after a first episode. The majority of studies on TIA have focused on the occurrence of these ancillary events after a TIA, while long-term mortality after TIA has received only limited attention. We undertook this study to determine the long-term hazards of all-cause mortality following a first episode of TIA using anonymised electronic health records (EHRs). This was a retrospective case-control study using electronic primary health care records from The Health Improvement Network (THIN) database. Patients born in or before 1960, resident in England, with a first diagnosis of TIA between January 1986 and January 2017 were matched to three controls each on age, sex and general medical practice. The primary outcome was all-cause mortality. The hazards of all-cause mortality were estimated using a time-varying Weibull-Cox survival model that included both scale and shape effects and a random frailty effect of GP practice. 20,633 cases and 58,634 controls were included. Cases aged 39 to 60 years at the first TIA event had the highest hazard ratio (HR) of mortality compared to matched controls (HR = 3.04, 95% CI (2.91 - 3.18)). The HRs for cases aged 61-70 years, 71-76 years and 77+ years were 1.98 (1.55 - 2.30), 1.79 (1.20 - 2.07) and 1.52 (1.15 - 1.97), respectively, compared to matched controls. Aspirin provided long-term survival benefits to cases: those aged 39-60 years on aspirin had HRs of 0.93 (0.84 - 1.00), 0.90 (0.82 - 0.98) and 0.88 (0.80 - 0.96) at 5, 10 and 15 years, respectively, compared to cases in the same age group who were not on antiplatelets. Similar beneficial effects of aspirin were observed in the other age groups. There were no significant survival benefits with other antiplatelet options, and no survival benefits of antiplatelet drugs were observed in controls. Our study highlights the excess long-term risk of death in TIA patients and cautions that TIA should not be treated as a benign condition. The study further suggests aspirin as a better option for secondary prevention in TIA patients than the clopidogrel recommended by NICE guidelines. Management of risk factors and treatment strategies remain important challenges in reducing the burden of disease.
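The Weibull backbone of the time-varying Weibull-Cox model can be sketched through its hazard and survival functions; the shape and scale values below are illustrative only, not estimates from the study:

```python
import math

def weibull_hazard(t, shape, scale):
    """Weibull hazard h(t) = (k / lam) * (t / lam)**(k - 1)."""
    return (shape / scale) * (t / scale) ** (shape - 1)

def weibull_survival(t, shape, scale):
    """Weibull survival S(t) = exp(-(t / lam)**k)."""
    return math.exp(-((t / scale) ** shape))

# illustrative parameters: a smaller scale shifts deaths earlier,
# mimicking the excess mortality of cases relative to controls
s_control = weibull_survival(10.0, shape=1.2, scale=25.0)
s_case = weibull_survival(10.0, shape=1.2, scale=15.0)
```

Covariate effects on the scale (and, in the study's model, also the shape) parameter are what turn this baseline into a Weibull-Cox regression with time-varying hazard ratios.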

Keywords: dual antiplatelet therapy (DAPT), general practice, multiple imputation, The Health Improvement Network (THIN), hazard ratio (HR), Weibull-Cox model

Procedia PDF Downloads 117
146 Farmers Willingness to Pay for Irrigated Maize Production in Rural Kenya

Authors: Dennis Otieno, Lilian Kirimi, Nicholas Odhiambo, Hillary Bii

Abstract:

Kenya is considered a middle-income country but usually does not meet household food security needs, especially in its north-eastern and south-eastern parts. Approximately half of the population lives under the poverty line (www, CIA 1, 2012). Agriculture is the largest sector in the country, employing 80% of the population, who are thereby directly dependent on the sufficiency of the outputs received. This makes efficient, easily accessible and cheap agricultural practices an important matter for improving food security. Maize is the prime staple food commodity in Kenya and represents a substantial share of people's nutritional intake. This study is the result of questionnaire-based interviews, key informant interviews and focus group discussions involving 220 small-scale Kenyan maize farmers. The study was located in separate areas: Lower Kuja, Bunyala, Nandi, Lower Nzoia, Perkerra, Mwea Bura, Hola and Galana Kulalu. The questionnaire captured the farmers' use and perceived importance of irrigation services and irrigated maize production. Viability was evaluated using four indices, all of which were positive, with the NPV giving positive cash flows in less than 21 years at most for one season's output. The mean willingness to pay was found to be KES 3,082, and willingness to pay increased with increasing irrigation premiums. The economic value of water was found to be greater than the willingness to pay, implying that irrigated maize production is sustainable. Farmers stated that viability was influenced by high output levels, good produce quality, crop of choice, availability of sufficient water, and enforcement; the last two factors had a positive influence, while the others had a negative effect on the viability of irrigated maize. A regression was estimated of the willingness to pay for irrigated maize production on scheme- and plot-level factors. Farmers who already use other inputs such as animal manure, hired labor and chemical fertilizer should also have a demand for improved seeds, according to Liebig's law of the minimum and expansion path theory. The regression showed that premiums and high yields have a positive effect on willingness to pay, while produce quality, efficient fertilizer use and crop season have a negative effect.
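The NPV index used in the viability assessment discounts each season's cash flow back to time zero; the outlay, cash flows and discount rate below are hypothetical, not figures from the survey:

```python
def npv(rate, cashflows):
    """Net present value of a cash flow series; cashflows[0] is the
    initial (usually negative) outlay at time zero."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cashflows))

# hypothetical plot: KES 50,000 irrigation outlay, KES 12,000 net per season
flows = [-50_000] + [12_000] * 6
value = npv(0.10, flows)   # positive at a 10% discount rate
```

A positive NPV at the chosen discount rate is the criterion for viability; the same flows turn negative at a sufficiently high rate, which is why the discount-rate assumption matters.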

Keywords: maize, food security, profits, sustainability, willingness to pay

Procedia PDF Downloads 192
145 Association of Vascular Endothelial Growth Factor Gene +405 C>G and -460 T>C Polymorphism with Type 2 Diabetic Foot Ulcer Patient in Cipto Mangunkusumo National Hospital Jakarta

Authors: Dedy Pratama, Akhmadu Muradi, Hilman Ibrahim, Raden Suhartono, Alexander Jayadi Utama, Patrianef Darwis, S. Dwi Anita, Luluk Yunaini, Kemas Dahlan

Abstract:

Introduction: The vascular endothelial growth factor (VEGF) gene shows associations with various angiogenesis-related conditions, including Diabetic Foot Ulcer (DFU) disease. We performed this study to examine VEGF gene polymorphisms associated with DFU. Methods: A case-control study of the VEGF gene polymorphisms +405 C>G and -460 T>C in type 2 diabetes mellitus (DM) patients with and without DFU at Cipto Mangunkusumo National Hospital (RSCM), Jakarta, from June to December 2016. Results: There were 203 patients: 102 with DFU and 101 without DFU. Of the total sample, 49.8% were male and 50.2% female, with a mean age of 56.06 years. The wild-type genotype of VEGF +405 C>G (CC) was found in 6.9% of respondents, the mutant heterozygote CG in 69.5%, and the mutant homozygote GG in 19.7%. Cumulatively, there were 6.9% wild-type and 85.2% mutant genotypes, while 3.9% of the blood samples could not be typed by PCR-RFLP. The VEGF +405 C>G allele distribution was 43% C and 57% G. The genotype distribution of the VEGF gene -460 T>C was 42.9% wild-type TT, 37.9% mutant heterozygote TC, and 13.3% mutant homozygote CC. Cumulatively, there were 42.9% wild-type and 51% mutant genotypes. The VEGF -460 T>C allele distribution was 62% T and 38% C. Conclusion: In this study, we found allele distributions of 43% C and 57% G for VEGF +405 C>G, and 62% T and 38% C for VEGF -460 T>C. We propose that the G allele in VEGF +405 C>G may act as a protective allele, while the T allele in VEGF -460 T>C may act as a risk factor for DFU in diabetic patients.
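The reported +405 allele distribution (43% C, 57% G) follows from the genotype percentages by allele counting: each homozygote carries two copies of its allele and each heterozygote one of each. A small sketch reproducing the abstract's figures:

```python
def allele_freqs(hom_ref, het, hom_alt):
    """Allele frequencies from genotype percentages (or counts):
    ref copies = 2 * hom_ref + het, alt copies = 2 * hom_alt + het."""
    total = 2.0 * (hom_ref + het + hom_alt)
    ref = (2.0 * hom_ref + het) / total
    alt = (2.0 * hom_alt + het) / total
    return ref, alt

# VEGF +405 genotype percentages from the abstract: CC 6.9, CG 69.5, GG 19.7
c_freq, g_freq = allele_freqs(6.9, 69.5, 19.7)
```

Normalising over the 96.1% of samples that were successfully typed, this recovers the 43%/57% split quoted in the abstract.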

Keywords: diabetic foot ulcer, diabetes mellitus, polymorphism, VEGF

Procedia PDF Downloads 264