Search results for: fuzzy-set qualitative comparative analysis (fsQCA)
18111 Determining Variables in Mathematics Performance According to Gender in Mexican Elementary School
Authors: Nora Gavira Duron, Cinthya Moreda Gonzalez-Ortega, Reyna Susana Garcia Ruiz
Abstract:
The objective of this paper is to analyze mathematics performance in the National Plan for the Evaluation of Learning (PLANEA, for its Spanish initials: Plan Nacional para la Evaluación de los Aprendizajes), applied to Mexican students enrolled in the final year of elementary school during the 2017-2018 academic year. The test was conducted nationwide in 3,573 schools, using a sample of 108,083 students, whose average mathematics score, on a scale of 0 to 100, was 45.6 points. 75% of the sample analyzed did not reach the sufficiency level (60 points), and only 2% scored 90 or higher. Performance is analyzed considering differences by gender, marginalization level, public or private school enrollment, parents' academic background, and whether students live with their parents. The impact of these variables on school performance by gender is then evaluated using multivariate logistic (logit) regression analysis. The results show no significant differences in mathematics performance by gender in elementary school; nevertheless, having a mother who completed at least high school is highly relevant for students, particularly for girls. Other determining variables are students' resilience, their parents' economic status, and attendance at a private school, an effect strengthened by the mother's education.
Keywords: multivariate regression analysis, academic performance, learning evaluation, mathematics result per gender
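The following is a minimal sketch, not the authors' code, of the kind of multivariate logit model described above; the file name, column names, and the use of the 60-point sufficiency level as the binary outcome are assumptions made for illustration.

```python
# Sketch of a logit model for math proficiency; data layout is hypothetical.
import numpy as np
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("planea_math.csv")                        # hypothetical PLANEA extract
df["proficient"] = (df["math_score"] >= 60).astype(int)    # sufficiency level of 60 points

predictors = ["female", "marginalization_level", "private_school",
              "mother_highschool", "lives_with_parents", "resilience"]
X = sm.add_constant(df[predictors])
model = sm.Logit(df["proficient"], X).fit()

print(model.summary())            # coefficients and p-values per predictor
print(np.exp(model.params))       # odds ratios for easier interpretation
```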
Procedia PDF Downloads 147
18110 Identity and Economics: The Economic Welfare and Behavior of Romani People in Turkey
Authors: Sinem Bagce, Ensar Yilmaz
Abstract:
As is well known, neoclassical economics excluded 'what is humanized' from the literature for a long time. Rationality is defined in a very narrow context in mainstream economics. Identity economics is one of the challenges raised against this tradition. The concept of 'identity' was introduced to economics by Akerlof and Kranton (2000). Identity-based analysis mainly examines the links between economic welfare and the decisions of the actors in question in relation to ethnic, racial, gender, and immigrant issues. This is largely about discrimination and its repercussions on the economic decisions of the relevant actors in a social sphere. In this article, in the context of identity economics, we examine the economic welfare and decisions of Romani people in Turkey. Identity is plainly the major determinant for Romani people in economic and social life. They have their own distinctive rationality in making economic decisions. For a more rigorous academic analysis, we aim to trace their economic identity in their real social environment. This study is an extension of surveys conducted on Romani people in Turkey. Using data similar to SILC (Statistics on Income and Living Conditions) collected on Romani people across the whole of Turkey, we address questions about the income/welfare distribution among them, consumer preferences/habits, living conditions, occupations, education, and the like. To this end, we employ econometric and statistical analytical tools to obtain answers to these questions. We expect these analytical results to allow us to evaluate the links between their economic state and their identity more thoroughly. JEL codes: D1, J15, R23.
Keywords: identity economics, Romani people, discrimination, social identity and preferences
Procedia PDF Downloads 201
18109 Role of Pulp Volume Method in Assessment of Age and Gender in Lucknow, India, an Observational Study
Authors: Anurag Tripathi, Sanad Khandelwal
Abstract:
Age and gender determination are required in forensics for victim identification. Secondary dentine is deposited throughout life, resulting in decreased pulp volume and size. Evaluation of pulp volume using Cone Beam Computed Tomography (CBCT) is a noninvasive method to evaluate the age and gender of an individual. The study was done to evaluate the efficacy of the pulp volume method in the determination of age and gender. Aims/Objectives: The study was conducted to estimate age and determine sex by measuring tooth pulp volume with the help of CBCT. An observational study of one year's duration on CBCT data of individuals was conducted in Lucknow. Maxillary central incisors (CI) and maxillary canines (C) of the randomly selected samples were assessed for measurement of pulp volume using software. Statistical analysis: chi-square test, arithmetic mean, standard deviation, Pearson's correlation, and linear and logistic regression analysis. Results: The CBCT data of ninety individuals with ages ranging from 18 to 70 years were evaluated for pulp volume of the central incisor and canine (CI & C). The Pearson correlation coefficient between tooth pulp volume (CI & C) and chronological age suggested that pulp volume decreases with age. The validation of the equations for sex determination showed higher prediction accuracy for CI (56.70%) and lower for C (53.30%). Conclusion: Pulp volume obtained from CBCT is a reliable indicator for age estimation and gender prediction.
Keywords: forensic, dental age, pulp volume, cone beam computed tomography
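A minimal sketch of the correlation and regression step described above, with synthetic placeholder values standing in for the CBCT measurements; the numbers and the fitted relationship are assumptions for illustration only.

```python
# Correlation and age-estimation regression from pulp volume (placeholder data).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
age = rng.uniform(18, 70, size=90)                      # 90 individuals, as in the study
pulp_volume_ci = 40 - 0.3 * age + rng.normal(0, 3, 90)  # synthetic inverse relationship

r, p = stats.pearsonr(pulp_volume_ci, age)              # expect a negative correlation
slope, intercept, *_ = stats.linregress(pulp_volume_ci, age)

def estimate_age(volume_mm3):
    """Estimate chronological age from central-incisor pulp volume (fitted line)."""
    return intercept + slope * volume_mm3

print(f"Pearson r = {r:.2f} (p = {p:.3g}); estimated age at 20 mm^3: {estimate_age(20):.1f}")
```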
Procedia PDF Downloads 99
18108 Analysis of Brownfield Soil Contamination Using Local Government Planning Data
Authors: Emma E. Hellawell, Susan J. Hughes
Abstract:
Brownfield sites are currently being redeveloped for residential use. Information on soil contamination on these former industrial sites is collected as part of the planning process by local government. This research project analyses this untapped resource of environmental data, using site investigation data submitted to a local borough council in Surrey, UK. Over 150 site investigation reports were collected and interrogated to extract relevant information. This study involved three phases. Phase 1 was the development of a database for soil contamination information from local government reports. This database contained information on the source, history, and quality of the data together with the chemical information on the soil that was sampled. Phase 2 involved obtaining site investigation reports for development within the study area and extracting the required information for the database. Phase 3 was the data analysis and interpretation of key contaminants to evaluate typical levels of contaminants and their distribution within the study area, and to relate these results to current guideline levels of risk for future site users. Preliminary results for a pilot study using a sample of the dataset have been obtained. This pilot study showed there is some inconsistency in the quality of the reports and measured data, and careful interpretation of the data is required. Analysis of the information has found high levels of lead in shallow soil samples, with mean and median levels exceeding the current guidance for residential use. The data also showed elevated (but below guidance) levels of potentially carcinogenic polycyclic aromatic hydrocarbons. Of particular concern from the data was the high detection rate for asbestos fibers. These were found at low concentrations in 25% of the soil samples tested (however, the sample set was small). Contamination levels of the remaining chemicals tested were all below the guidance level for residential site use. These preliminary pilot study results will be expanded, and results for the whole local government area will be presented at the conference. The pilot study has demonstrated the potential for this extensive dataset to provide greater information on local contamination levels. This can help inform regulators and developers and lead to more targeted site investigations, improved risk assessments, and better brownfield development.
Keywords: brownfield development, contaminated land, local government planning data, site investigation
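A minimal illustration, under assumed file and column names and a placeholder guideline value, of the kind of Phase 3 summary the abstract describes: mean and median contaminant levels compared against a residential guideline, plus the asbestos detection rate.

```python
# Summarise lead levels and asbestos detections from a hypothetical database export.
import pandas as pd

reports = pd.read_csv("site_investigation_samples.csv")   # assumed export of the Phase 1 database
LEAD_GUIDELINE_MG_KG = 200   # placeholder; a real assessment would use current UK residential guidance

lead = reports.loc[reports["determinand"] == "lead", "concentration_mg_kg"]
summary = {
    "n_samples": len(lead),
    "mean": lead.mean(),
    "median": lead.median(),
    "pct_above_guideline": (lead > LEAD_GUIDELINE_MG_KG).mean() * 100,
}
print(summary)

asbestos = reports.loc[reports["determinand"] == "asbestos", "detected"]
print(f"Asbestos detection rate: {asbestos.mean():.0%}")   # e.g. ~25% in the pilot study
```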
Procedia PDF Downloads 140
18107 Assessing Gender Mainstreaming Practices in the Philippine Basic Education System
Authors: Michelle Ablian Mejica
Abstract:
Female drop-outs due to teenage pregnancy and gender-based violence in schools are two of the most contentious current gender-related issues faced by the Department of Education (DepEd) in the Philippines. Since 1990, the country has adopted gender mainstreaming as the main strategy to eliminate gender inequalities in all aspects of society, including education. This research examines the extent to which gender mainstreaming is implemented in basic education, from the national level to the school level. It seeks to identify the challenges faced by the central and field offices, particularly by principals, who serve as decision-makers in the schools where teaching and learning take place and where opportunities exist that may aggravate, reproduce, or transform gender inequalities and hierarchies. The author conducted surveys and interviews among 120 elementary and secondary principals in the Division of Zambales, as well as selected gender division and regional focal persons within Region III (Central Luzon). The study argues that DepEd needs to review, strengthen, and revitalize its gender mainstreaming because current efforts do not penetrate the schools and are not enough to lessen or eliminate gender inequalities within them. The study identified some of the major challenges in the implementation of gender mainstreaming as follows: the absence of a national gender-responsive education policy framework, the lack of gender-responsive assessment and monitoring tools, the poor quality of gender and development (GAD) related training programs, and poor data collection and analysis mechanisms. Other constraints include poor coordination mechanisms among implementing agencies, the lack of a clear implementation strategy, ineffective or poor utilization of the GAD budget, and the lack of teacher- and learner-centered GAD activities. The paper recommends a review of the department's gender mainstreaming efforts so that they align with the mandate of the agency and provide a gender-responsive teaching and learning environment. It suggests that the focus must be on the formulation of gender-responsive policies and programs, the improvement of existing mechanisms, and the conduct of trainings focused on gender analysis, budgeting, and impact assessment, not only for principals and the GAD focal point system but also for parents and other school stakeholders.
Keywords: curriculum and instruction, gender analysis, gender budgeting, gender impact assessment
Procedia PDF Downloads 344
18106 Advanced Magnetic Field Mapping Utilizing Vertically Integrated Deployment Platforms
Authors: John E. Foley, Martin Miele, Raul Fonda, Jon Jacobson
Abstract:
This paper presents the development and implementation of new and innovative data collection and analysis methodologies based on the deployment of total field magnetometer arrays. Our research has focused on the development of a vertically integrated suite of platforms all utilizing common data acquisition, data processing, and analysis tools. These survey platforms include low-altitude helicopters and ground-based vehicles, including robots, for terrestrial mapping applications. For marine settings, the sensor arrays are deployed either from a hydrodynamic bottom-following wing towed from a surface vessel or, in shallow water, from a towed floating platform. Additionally, sensor arrays are deployed from tethered remotely operated vehicles (ROVs) for underwater settings where high maneuverability is required. While the primary application of these systems is the detection and mapping of unexploded ordnance (UXO), these systems are also used for various infrastructure mapping and geologic investigations. For each application, success is driven by the integration of magnetometer arrays, accurate geo-positioning, system noise mitigation, and stable deployment of the system in appropriate proximity to expected targets or features. Each of the systems collects geo-registered data compatible with a web-enabled data management system providing immediate access to data and metadata for remote processing, analysis, and delivery of results. This approach allows highly sophisticated magnetic processing methods, including classification based on dipole modeling and remanent magnetization, to be efficiently applied to many projects. This paper also briefly describes the initial development of magnetometer-based detection systems deployed from low-altitude helicopter platforms and the subsequent successful transition of this technology to the marine environment. Additionally, we present examples from a range of terrestrial and marine settings as well as ongoing research efforts related to sensor miniaturization for unmanned aerial vehicle (UAV) magnetic field mapping applications.
Keywords: dipole modeling, magnetometer mapping systems, sub-surface infrastructure mapping, unexploded ordnance detection
Procedia PDF Downloads 464
18105 The Impact of Tourism on the Intangible Cultural Heritage of Pilgrim Routes: The Case of El Camino de Santiago
Authors: Miguel Angel Calvo Salve
Abstract:
This qualitative and quantitative study will identify the impact of tourism pressure on the intangible cultural heritage of the pilgrim route of El Camino de Santiago (the Way of Saint James) and propose an approach toward a sustainable tourism model for such cultural routes. Since 1993, the Spanish section of the Pilgrim Route of El Camino de Santiago has been on the World Heritage List. In 1994, the International Committee on Cultural Routes (CIIC-ICOMOS) initiated its work with the goal of studying, preserving, and promoting cultural routes and their significance as a whole. The ICOMOS Charter on Cultural Routes, adopted in 2008, also pointed out the importance of both tangible and intangible heritage and the need for a holistic vision in preserving these important cultural assets. Tangible elements provide physical confirmation of the existence of these cultural routes, while the intangible elements give sense and meaning to them as a whole. The intangible assets of a cultural route are key to understanding the route's significance and its associated heritage values. Like many pilgrim routes, the route to Santiago, as the result of a long evolutionary process, exhibits and is supported by intangible assets, including hospitality, cultural and religious expressions, music, literature, and artisanal trade, among others. A large increase in pilgrims walking the route, with very different aims, together with tourism pressure, has shown how fragile and vulnerable the dynamic links between the intangible cultural heritage and the local inhabitants along El Camino are. Economic benefits for the communities and population along cultural routes are commonly fundamental for the micro-economies of the people living there, replacing traditional productive activities, which in turn modifies and affects the surrounding environment and the route itself. Consumption of heritage is one of the major issues of sustainable preservation promoted with the intention of revitalizing those sites and places. The adaptation of local communities to new conditions aimed at preserving and protecting existing heritage has had a significant impact on the intangible inheritance. Based on questionnaires given to pilgrims, tourists, and local communities along El Camino during the peak season of the year, and using official statistics from the Galician Pilgrim's Office, this study will identify the risks and threats to El Camino de Santiago as a cultural route. The threats visible nowadays due to the impact of mass tourism include transformations of tangible heritage, consumerism of the intangible, changes in local activities, loss of authenticity of symbols and spiritual significance, and the transformation of pilgrimage into a tourism 'product', among others. The study will also propose measures and solutions to mitigate those impacts and better preserve this type of cultural heritage. It will therefore help the route's service providers and policymakers better preserve the cultural route as a whole and ultimately improve the experience of pilgrims.
Keywords: cultural routes, El Camino de Santiago, impact of tourism, intangible heritage
Procedia PDF Downloads 84
18104 Optimization of Air Pollution Control Model for Mining
Authors: Zunaira Asif, Zhi Chen
Abstract:
Sustainable air quality management is recognized as one of the most serious environmental concerns in mining regions. Mining operations emit various types of pollutants which have significant impacts on the environment. This study presents a stochastic control strategy, developing an air pollution control model to achieve a cost-effective solution. The optimization method is formulated to predict the cost of treatment using linear programming with an objective function and multiple constraints. The constraints mainly address two factors: metal production should not exceed the available resources, and air quality should meet the standard criteria for each pollutant. The applicability of this model is explored through a case study of an open pit metal mine in Utah, USA. The method simultaneously uses meteorological data as a dispersion transfer function to reflect the practical local conditions. The probabilistic analysis and the uncertainties in the meteorological conditions are handled by Monte Carlo simulation. Reasonable results have been obtained for selecting the optimal treatment technology for PM2.5, PM10, NOx, and SO2. An additional comparative analysis shows that a baghouse is the least-cost option compared to electrostatic precipitators and wet scrubbers for particulate matter, whereas non-selective catalytic reduction and dry flue gas desulfurization are suitable for NOx and SO2 reduction, respectively. Thus, this model can aid planners in reducing these pollutants at marginal cost by suggesting pollution control devices, while accounting for dynamic meteorological conditions and mining activities.
Keywords: air pollution, linear programming, mining, optimization, treatment technologies
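A minimal sketch of a cost-minimising linear programme of the kind described above; all coefficients (treatment costs, baseline emissions, emission limits) are placeholder values, and the real model would add production/resource constraints and Monte Carlo sampling of the meteorological inputs.

```python
# Choose tonnes of pollutant removed so that residual emissions meet the limits at minimum cost.
import numpy as np
from scipy.optimize import linprog

cost_per_tonne = np.array([120.0, 300.0])        # treatment cost per tonne removed: [PM, SO2] (assumed)
baseline_emissions = np.array([500.0, 800.0])    # tonnes/yr before control (assumed)
emission_limits = np.array([150.0, 200.0])       # air quality standard (assumed)

# A_ub @ x <= b_ub:
#   removal <= baseline (cannot remove more than is emitted)
#   -removal <= limit - baseline  (residual = baseline - removal must not exceed the limit)
A_ub = np.vstack([np.eye(2), -np.eye(2)])
b_ub = np.concatenate([baseline_emissions, emission_limits - baseline_emissions])

res = linprog(cost_per_tonne, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * 2, method="highs")
print("Tonnes to remove:", res.x, "Minimum treatment cost:", res.fun)
```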
Procedia PDF Downloads 208
18103 Mining User-Generated Contents to Detect Service Failures with Topic Model
Authors: Kyung Bae Park, Sung Ho Ha
Abstract:
Online user-generated content (UGC) significantly changes the way customers behave (e.g., shop, travel), and the pressing need to handle the overwhelming amount of varied UGC is one of the paramount issues for management. However, current approaches (e.g., sentiment analysis) are often ineffective at leveraging textual information to detect the problems or issues a given organization suffers from. In this paper, we employ text mining with Latent Dirichlet Allocation (LDA) on a popular online review site dedicated to complaints from users. We find that the employed LDA efficiently detects customer complaints, and further inspection with visualization techniques is effective for categorizing the problems or issues. As such, management can identify the issues at stake and prioritize them accordingly in a timely manner, given the limited amount of resources. The findings provide managerial insights into how analytics on social media can help organizations maintain and improve their reputation management. Our interdisciplinary approach also highlights several insights from applying machine learning techniques in the marketing research domain. On a broader technical note, this paper illustrates the details of how to implement LDA in the R program from beginning (data collection in R) to end (LDA analysis in R), since such instruction is still largely undocumented. In this regard, it will help lower the barrier for interdisciplinary researchers conducting related research.
Keywords: latent dirichlet allocation, R program, text mining, topic model, user generated contents, visualization
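The paper itself documents an R workflow; purely as an illustration of the same pipeline (and not the authors' code), here is a minimal LDA topic model in Python with scikit-learn, using a handful of made-up complaint snippets and an arbitrary topic count.

```python
# Fit LDA on a tiny toy corpus and print the top terms per topic.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

reviews = [
    "room was dirty and staff ignored my complaint",
    "flight delayed twice and no refund offered",
    "checkout process kept failing on the website",
]  # in practice: thousands of scraped complaint posts

vectorizer = CountVectorizer(stop_words="english", min_df=1)
dtm = vectorizer.fit_transform(reviews)                 # document-term matrix

lda = LatentDirichletAllocation(n_components=2, random_state=0)  # 2 topics for illustration only
lda.fit(dtm)

terms = vectorizer.get_feature_names_out()
for k, topic in enumerate(lda.components_):
    top_terms = [terms[i] for i in topic.argsort()[-5:][::-1]]
    print(f"Topic {k}: {', '.join(top_terms)}")         # candidate service-failure themes
```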
Procedia PDF Downloads 187
18102 Application of GPRS in Water Quality Monitoring System
Authors: V. Ayishwarya Bharathi, S. M. Hasker, J. Indhu, M. Mohamed Azarudeen, G. Gowthami, R. Vinoth Rajan, N. Vijayarangan
Abstract:
Identification of water quality conditions in a river system based on limited observations is an essential task for meeting the goals of environmental management. The traditional method of water quality testing is to collect samples manually and then send them to a laboratory for analysis. However, this has been unable to meet today's demands for water quality monitoring, so an automatic water quality measurement and reporting system has been developed. In this project, water quality parameters collected by a multi-parameter water quality probe are transmitted to a data processing and monitoring center through the GPRS wireless mobile communication network. The multi-parameter sensor is placed directly above the water level. The monitoring center consists of a GPRS module and a microcontroller which monitor the data. The collected data can be monitored at any instant of time. At the pollution control board, the water quality sensor data are monitored on a computer using Visual Basic software. The system collects, transmits, and processes water quality parameters automatically, so production efficiency and economic benefit are greatly improved. GPRS technology performs well even in complex environments where water quality would otherwise go unmonitored; it is specifically applicable at the collection point, automatically handling data transmission and monitoring for field water analysis equipment.
Keywords: multiparameter sensor, GPRS, visual basic software, RS232
Procedia PDF Downloads 412
18101 Error Analysis of Pronunciation of French by Sinhala Speaking Learners
Authors: Chandeera Gunawardena
Abstract:
The present research analyzes the pronunciation errors encountered by thirty Sinhala-speaking learners of French, on the assumption that the pronunciation errors are systematic and reflect the interference of the learners' native language. The thirty participants were selected using a random sampling method. At the time of the study, the subjects were studying French as a foreign language for their Bachelor of Arts degree at the University of Kelaniya, Sri Lanka. The participants were from a homogeneous linguistic background. All participants speak the same native language (Sinhala); they had completed their secondary education in the Sinhala medium, during which they had also learnt French as a foreign language. A battery-operated audio tape recorder and 120-minute blank cassettes were used for recording. A list of 60 words representing all French phonemes was used to diagnose pronunciation difficulties. Before the recording process commenced, the subjects were requested to familiarize themselves with the words by reading them several times. The recording was conducted individually in a quiet classroom, and each recording took approximately fifteen minutes. Each subject was required to read at a normal speed. After the completion of recording, the recordings were replayed to identify common errors, which were immediately transcribed using the International Phonetic Alphabet. Results show that Sinhala-speaking learners face problems with French nasal vowels and French initial consonant clusters. The learners also exhibit errors which occur because of interference from their second language (English).
Keywords: error analysis, pronunciation difficulties, pronunciation errors, Sinhala speaking learners of French
Procedia PDF Downloads 210
18100 The Budget Impact of the DISCERN™ Diagnostic Test for Alzheimer's Disease in the United States
Authors: Frederick Huie, Lauren Fusfeld, William Burchenal, Scott Howell, Alyssa McVey, Thomas F. Goss
Abstract:
Alzheimer's Disease (AD) is a degenerative brain disease characterized by memory loss and cognitive decline that presents a substantial economic burden for patients and health insurers in the US. This study evaluates the payer budget impact of the DISCERN™ test in the diagnosis and management of patients with symptoms of dementia evaluated for AD. DISCERN™ comprises three assays that assess critical factors related to AD that regulate memory, formation of synaptic connections among neurons, and levels of amyloid plaques and neurofibrillary tangles in the brain, and it can provide a quicker, more accurate diagnosis than tests in the current diagnostic pathway (CDP). An Excel-based model with a three-year horizon was developed to assess the budget impact of DISCERN™ compared with the CDP in a Medicare Advantage plan with 1M beneficiaries. Model parameters were identified through a literature review and were verified through consultation with clinicians experienced in the diagnosis and management of AD. The model assesses direct medical costs/savings for patients based on the following categories: • Diagnosis: costs of diagnosis using DISCERN™ and the CDP. • False Negative (FN) diagnosis: incremental cost of care avoidable with a correct AD diagnosis and appropriately directed medication. • True Positive (TP) diagnosis: AD medication costs; the cost of a later TP diagnosis with the CDP versus DISCERN™ in the year of diagnosis; and savings from the delay in AD progression due to appropriate AD medication in patients who are correctly diagnosed after an FN diagnosis. • False Positive (FP) diagnosis: cost of AD medication for patients who do not have AD. A one-way sensitivity analysis was conducted to assess the effect of varying key clinical and cost parameters by ±10%. An additional scenario analysis was developed to evaluate the impact of individual inputs. In the base scenario, DISCERN™ is estimated to decrease costs by $4.75M over three years, equating to approximately $63.11 saved per test per year for a cohort followed over three years. While the diagnosis cost is higher with DISCERN™ than with CDP modalities, this cost is offset by the higher overall costs associated with the CDP due to the longer time needed to receive a TP diagnosis and the larger number of patients who receive an FN diagnosis and progress more rapidly than if they had received appropriate AD medication. The sensitivity analysis shows that the three parameters with the greatest impact on savings are: reduced sensitivity of DISCERN™, improved sensitivity of the CDP, and a reduction in the percentage of disease progression that is avoided with appropriate AD medication. A scenario analysis in which DISCERN™ reduces the utilization of computed tomography from 21% in the base case to 16%, magnetic resonance imaging from 37% to 27%, and cerebrospinal fluid biomarker testing, positron emission tomography, electroencephalograms, and polysomnography testing from 4%, 5%, 10%, and 8%, respectively, in the base case to 0%, results in an overall three-year net savings of $14.5M. DISCERN™ improves the rate of accurate, definitive diagnosis of AD earlier in the disease and may generate savings for Medicare Advantage plans.
Keywords: Alzheimer's disease, budget, dementia, diagnosis
Procedia PDF Downloads 138
18099 Analysis of Ionosphere Anomaly Before Great Earthquake in Java on 2009 Using GPS Tec Data
Authors: Aldilla Damayanti Purnama Ratri, Hendri Subakti, Buldan Muslim
Abstract:
Ionospheric anomalies as an effect of earthquake activity are a phenomenon now being studied under seismo-ionospheric coupling. Generally, the variation in the ionosphere caused by earthquake activity is weaker than the interference generated by other sources, such as geomagnetic storms. However, disturbances from geomagnetic storms show more global behavior, while seismo-ionospheric anomalies occur only locally, in an area largely determined by the magnitude of the earthquake. This makes earthquake activity unique, and because of this uniqueness much research has been done in the expectation that it can provide clues for early warning before an earthquake. One of the approaches developed to date is seismo-ionospheric coupling, which relates the state of the lithosphere, atmosphere, and ionosphere before and during an earthquake. This paper chooses the vertical total electron content (VTEC) of the ionosphere as the parameter. Total electron content (TEC) is defined as the number of electrons in a vertical column (cylinder) with a cross-section of 1 m² along the GPS signal trajectory in the ionosphere, at a height of around 350 km. Based on the analysis of data obtained from the LAPAN agency, in which abnormal signals were identified by statistical methods, an anomaly in the ionosphere was found, characterized by a decrease of 1 TECU in the electron content of the ionosphere before the earthquake occurred. This decrease in VTEC is not associated with a magnetic storm and is therefore indicated as an earthquake precursor; this is supported by the Dst index, which showed no magnetic disturbance.
Keywords: earthquake, DST Index, ionosphere, seismoionospheric coupling, VTEC
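A minimal sketch, not the authors' method, of how such a VTEC drop could be flagged against a running quiet-time baseline; the file name, window length, and the use of a 1 TECU threshold follow the abstract's description but are otherwise assumptions.

```python
# Flag hours where VTEC falls 1 TECU or more below a rolling baseline.
import pandas as pd

vtec = pd.read_csv("gps_vtec.csv", parse_dates=["time"]).set_index("time")["vtec_tecu"]

window = 15 * 24                                   # 15 days of hourly values as the baseline (assumed)
baseline = vtec.rolling(window, min_periods=window // 2).median()
deviation = vtec - baseline

anomalies = deviation[deviation <= -1.0]           # candidate precursor intervals
print(anomalies.tail())                            # to be cross-checked against the Dst index
```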
Procedia PDF Downloads 586
18098 West Nile Virus Outbreaks in Canada under Expected Climate Conditions
Authors: Jalila Jbilou, Salaheddine El Adlouni, Pierre Gosselin
Abstract:
Background: West Nile virus (WNV) is an increasingly important public health issue in North America. In Canada, WNV was officially reported in Toronto and Montréal for the first time in 2001. During the last decade, several WNV events have been reported in several Canadian provinces. The main objective of the present study is to update the frequency of the climate conditions favorable to WNV outbreaks in Canada. Method: Statistical frequency analysis has been used to estimate the return period of climate conditions associated with WNV outbreaks for the 1961–2050 period. The best fit is selected through the Akaike Information Criterion, and the parameters are estimated using the maximum likelihood approach. Results: The results show that the climate conditions related to the 2002 event, for Montreal and Toronto, are becoming more frequent. For Saskatoon, the highest DD20 events recorded over the last few decades were observed in 2003 and 2007; the estimated return periods are 30 years and 70 years, respectively. Conclusion: The emergence of WNV was related to extremely high DD values in the summer. However, some exceptions may be related to several factors such as virus persistence, vector migration, and improved diagnosis and reporting levels. It is clear that such climate conditions have become much more common in the last decade and will likely continue to do so over future decades.
Keywords: West Nile virus, climate, North America, statistical frequency analysis, risk estimation, public health, modeling, scenario, temperature, precipitation
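A minimal sketch of the frequency-analysis step described above (maximum likelihood fits, AIC-based model selection, and conversion of an observed value into a return period); the data file and the particular candidate distributions are assumptions, not the study's actual choices.

```python
# Fit candidate distributions to an annual series, pick the best by AIC, compute a return period.
import numpy as np
from scipy import stats

dd20_annual = np.loadtxt("saskatoon_dd20_annual.txt")    # hypothetical annual DD20 series

candidates = {"gumbel": stats.gumbel_r, "gev": stats.genextreme, "lognorm": stats.lognorm}
fits = {}
for name, dist in candidates.items():
    params = dist.fit(dd20_annual)                        # maximum likelihood estimates
    loglik = np.sum(dist.logpdf(dd20_annual, *params))
    fits[name] = (params, 2 * len(params) - 2 * loglik)   # AIC = 2k - 2 ln L

best = min(fits, key=lambda k: fits[k][1])
params, _ = fits[best]

event = dd20_annual.max()                                 # e.g. the record 2003 season
exceedance_prob = candidates[best].sf(event, *params)     # P(DD20 >= event) in any year
print(f"Best fit: {best}; return period ≈ {1 / exceedance_prob:.0f} years")
```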
Procedia PDF Downloads 346
18097 Numerical Analysis of Mandible Fracture Stabilization System
Authors: Piotr Wadolowski, Grzegorz Krzesinski, Piotr Gutowski
Abstract:
The aim of the presented work is to assess the impact of the mini-plate application approach on the stress and displacement within the stabilization devices and surrounding bones. The mini-plate osteosynthesis technique is widely used by craniofacial surgeons as an improved replacement for the wire connection approach. Many different types of metal plates and screws are used for the physical connection of fractured bones. The investigation below is based on clinical observation of a patient hospitalized with a mini-plate stabilization system. The analysis was conducted on a solid mandible geometry, which was modeled on the basis of the computed tomography scan of the hospitalized patient. In order to achieve the most realistic behavior of the connected system, cortical and cancellous bone layers were assumed. The temporomandibular joint was simplified to an elastic element to allow physiological movement of the loaded bone. The muscles of the mastication system were reduced to three pairs, modeled as shell structures. The finite element mesh was created with the ANSYS software, where hexahedral and tetrahedral variants of the SOLID185 element were used. A set of nonlinear contact conditions was applied on the common surfaces of the connecting devices and bone. The properties of a particular contact pair depend on the screw/mini-plate connection type and possible gaps between the fractured bone around the osteosynthesis region. Some of the investigated cases include prestress introduced to the mini-plate during application, which corresponds to the initial bending of the connecting device to fit the retromolar fossa region. The assumed bone fracture occurs within the mandible angle zone. Due to the significant deformation of the connecting plate in some of the assembly cases, an elastic-plastic model of the titanium alloy was assumed. The bone tissues were modeled with an orthotropic material. The loading was a gauge force of magnitude 100 N applied at three different locations. The conducted analysis shows a significant impact of the mini-plate application methodology on the stress distribution within the mini-plate. The prestress effect introduces additional loading, which locally exceeds the titanium alloy yield limit. Stress in the surrounding bone increases rapidly around the screw application region, exceeding the assumed bone yield limit, which indicates local bone destruction. The approach with a doubled mini-plate shows increased stress within the connector due to an overly rigid connection, where the main load path leads through the mini-plates instead of through the plates and the connected bones together. Clinical observations confirm more frequent plate failure in stiffer connections. Some of these failures could be an effect of decreased low-cycle fatigue capability caused by overloading. The executed analysis proves that the mini-plate system provides sufficient support for mandible fracture treatment; however, many of the applicable solutions push the entire system toward the allowable material limits. The results show that connector application with initial loading needs to be established carefully due to the small material capability margins. Comparison with clinical observations allows the entire connection to be optimized to prevent future incidents.
Keywords: mandible fracture, mini-plate connection, numerical analysis, osteosynthesis
Procedia PDF Downloads 275
18096 Performance Gap and near Zero Energy Buildings Compliance of Monitored Passivhaus in Northern Ireland, the Republic of Ireland and Italy
Authors: S. Colclough, V. Costanzo, K. Fabbri, S. Piraccini, P. Griffiths
Abstract:
The near Zero Energy Building (nZEB) standard is required for all buildings from 2020. The Passive House (PH) standard is a well-established low-energy building standard, having been designed over 25 years ago, and could potentially be used to achieve the nZEB standard in combination with renewables. By comparing measured performance with design predictions, this paper considers whether there is a performance gap for a number of monitored properties and assesses whether the nZEB standard can be achieved by following the well-established PH scheme. The analysis is carried out based on monitoring results from real buildings located in Northern Ireland, the Republic of Ireland, and Italy, with particular focus on indoor air quality, including the assumed and measured indoor temperatures and heating periods for both standards as recorded during a full annual cycle. An analysis is also carried out of the energy performance certificates of each of the dwellings to determine whether they meet the near Zero Energy Building primary energy consumption targets set in the respective jurisdictions. Each of the dwellings is certified as complying with the Passive House standard and accordingly has very good insulation levels, heat recovery ventilation systems of greater than 75% efficiency, and an airtightness of less than 0.6 air changes per hour at 50 Pa. It is found that indoor temperature and relative humidity were within the comfort boundaries set at the design stage, while carbon dioxide concentrations are sometimes higher than the values suggested by the EN 15251 standard for comfort class I, especially in bedrooms.
Keywords: monitoring campaign, nZEB (near zero energy buildings), Passivhaus, performance gap
Procedia PDF Downloads 152
18095 Gut Microbiota in Patients with Opioid Use Disorder: A 12-week Follow up Study
Authors: Sheng-Yu Lee
Abstract:
Aim: Opioid use disorder is often characterized by repetitive drug-seeking and drug-taking behaviors with severe public health consequences. Animal models have shown that opioid-induced perturbations in the gut microbiota causally relate to neuroinflammation, deficits in reward responding, and opioid tolerance. Therefore, we propose that dysbiosis of the gut microbiota may be associated with the pathogenesis of opioid dependence. In the current study, we explored the differences in gut microbiota between patients and normal controls, and in patients before and after 12 weeks of a methadone treatment program. Methods: Patients with opioid use disorder aged between 20 and 65 years were recruited from the methadone maintenance outpatient clinics of two medical centers in southern Taiwan. Healthy controls without any family history of major psychiatric disorders (schizophrenia, bipolar disorder, and major depressive disorder) were recruited from the community. After initial screening, 15 patients with opioid use disorder joined the study for the initial evaluation (Week 0); 12 of them completed the 12-week follow-up while receiving methadone treatment and ceasing heroin use (Week 12). Fecal samples were collected from the patients at baseline and at the end of the 12th week. A one-time fecal sample was collected from the healthy controls. The microbiota of the fecal samples was investigated using 16S rRNA V3V4 amplicon sequencing, followed by bioinformatic and statistical analyses. Results: We found no significant differences in species diversity in patients between Week 0 and Week 12, nor between patients at either time point and controls. For beta diversity, using principal component analysis, we found no significant differences between patients at Week 0 and Week 12; however, both patient groups showed significant differences compared to controls (P=0.011). Furthermore, linear discriminant analysis effect size (LEfSe) analysis was used to identify differentially enriched bacteria between opioid use patients and healthy controls. Compared to controls, the relative abundances of Lactobacillus (family Lactobacillaceae), Megasphaera hexanoica, and Caecibacter massiliensis were increased in patients at Week 0, while Atopobiaceae (order Coriobacteriales), Acidaminococcus intestini, and Tractidigestivibacter scatoligenes were increased in patients at Week 12. Conclusion: We suggest that the gut microbiome community may be linked to opioid use disorder, and such differences may not be altered even after 12 weeks of cessation of opioid use.
Keywords: opioid use disorder, gut microbiota, methadone treatment, follow up study
Procedia PDF Downloads 106
18094 Geoplanology Modeling and Applications Engineering of Earth in Spatial Planning Related with Geological Hazard in Cilegon, Banten, Indonesia
Authors: Muhammad L. A. Dwiyoga
Abstract:
The spatial condition of land in an industrial area needs special attention and deeper study. Geoplanology modeling can help arrange an area according to its land capability. The research method is to perform remote sensing analysis, Geographic Information System (GIS) analysis, and further comprehensive analysis to determine the geological characteristics and land capability of the research area and their relation to geological hazards. Cilegon is part of Banten Province, located in western Java; to the north lies the Strait of Borneo, while the southern part borders the Indian Ocean. The morphology of the study area ranges from highlands to lowlands. Potential landslide-prone zones were identified in the highlands, whereas the low-lying areas have flooding potential. Moreover, the study area is prone to earthquakes due to its proximity to Mount Krakatau and the subduction zone. The results of this study show that the study area is susceptible to landslides around Waringinkurung District, while potential flood areas lie in the District of Cilegon and surrounding areas. Based on the seismic data, this area includes events with magnitudes ranging from 1.5 to 5.5 at depths of 1 to 60 km. As for the land capability of the territory, the analyses and studies carried out indicate the need to update the existing Spatial Plan map, considering the fairly rapid development of the Cilegon area.
Keywords: geoplanology, spatial plan, geological hazard, cilegon, Indonesia
Procedia PDF Downloads 504
18093 The Internet of Things in Luxury Hotels: Generating Customized Multisensory Guest Experiences
Authors: Jean-Eric Pelet, Erhard Lick, Basma Taieb
Abstract:
Purpose: This research bridges the gap between sensory marketing and the use of the Internet of Things (IoT) in luxury hotels. We investigated how stimulating guests' senses through IoT devices influenced their emotions, affective experiences, eudaimonism (well-being), and, ultimately, guest behavior. We also examined potential moderating effects of gender. Design/methodology/approach: We adopted a mixed-method approach, combining qualitative research (semi-structured interviews) to explore hotel managers' perspectives on the potential use of IoT in luxury hotels with quantitative research (a survey of hotel guests; n=357). Findings: The results showed that while the senses of smell, hearing, and sight had an impact on guests' emotions, the senses of touch, hearing, and sight impacted guests' affective experiences. The senses of smell and taste influenced guests' eudaimonism. The sense of smell had a greater effect on eudaimonism and behavioral intentions among women compared to men. Originality: IoT can be applied in creating customized multi-sensory hotel experiences. For example, hotels may offer unique and diverse ambiences in their rooms and suites to improve guest experiences. Research limitations/implications: This study concentrated on luxury hotels located in Europe. Further research may explore the generalizability of the findings (e.g., in other cultures, or a comparison between high-end and low-end hotels). Practical implications: Context awareness and hyper-personalization, through intensive and continuous data collection (hyper-connectivity) and real-time processing, are key trends in the service industry. Therefore, big data plays a crucial role in the collection of information, since it allows hoteliers to retrieve, analyze, and visualize data to provide personalized services in real time. Together with their guests, hotels may co-create customized sensory experiences. For instance, if the hotel knows the guest's music preferences from social media, as well as their age and gender, and considers the temperature and size (standard, suite, etc.) of the guest room, this may determine the playlist on the concierge tablet made available in the guest room. Furthermore, the guest's voice may be recorded and used for voice command purposes once the guest arrives at the hotel. Based on our finding that the sense of smell has a greater impact on eudaimonism and behavioral intentions among women than men, hotels may deploy subtler scents with lower intensities, or even different scents, for female guests in comparison to male guests.
Keywords: affective experience, emotional value, eudaimonism, hospitality industry, Internet of Things, sensory marketing
Procedia PDF Downloads 57
18092 An Analytical Approach to Assess and Compare the Vulnerability Risk of Operating Systems
Authors: Pubudu K. Hitigala Kaluarachchilage, Champike Attanayake, Sasith Rajasooriya, Chris P. Tsokos
Abstract:
Operating system (OS) security is a key component of computer security. Assessing and improving an OS's strength to resist vulnerabilities and attacks is a mandatory requirement, given the rate at which new vulnerabilities are discovered and attacks occur. The frequency and number of different kinds of vulnerabilities found in an OS can be considered an index of its information security level. In the present study, five widely used OSs, Microsoft Windows (Windows 7, Windows 8, and Windows 10), Apple's Mac, and Linux, are assessed for their discovered vulnerabilities and the risk associated with each. Each discovered and reported vulnerability has an exploitability score assigned in the CVSS scores of the National Vulnerability Database. In this study, the risk from vulnerabilities in each of the five operating systems is compared. The risk indexes used are developed based on the Markov model to evaluate the risk of each vulnerability. The statistical methodology and underlying mathematical approach are described. Initially, parametric procedures were conducted; however, violations of some statistical assumptions were observed, so the need for non-parametric approaches was recognized. A total of 6,838 recorded vulnerabilities were considered in the analysis. Based on the risk associated with all the vulnerabilities considered, it was found that there is a statistically significant difference among average risk levels for some operating systems, indicating that, according to our method and given the assumptions and limitations, some operating systems have been more risk-vulnerable than others. Relevant test results revealing a statistically significant difference in the risk levels of different OSs are presented.
Keywords: cybersecurity, Markov chain, non-parametric analysis, vulnerability, operating system
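An illustrative sketch, not the authors' code, of the non-parametric comparison step: per-vulnerability risk indexes grouped by OS and compared with a Kruskal-Wallis test, followed by a pairwise Mann-Whitney U test. The file layout, column names, and OS labels are assumptions.

```python
# Compare per-vulnerability risk indexes across OS groups with non-parametric tests.
import pandas as pd
from scipy import stats

vulns = pd.read_csv("nvd_vulnerabilities.csv")   # assumed columns: os, cvss_exploitability, risk_index

groups = [g["risk_index"].values for _, g in vulns.groupby("os")]
h_stat, p_value = stats.kruskal(*groups)
print(f"Kruskal-Wallis H = {h_stat:.2f}, p = {p_value:.4f}")

# If significant, pairwise Mann-Whitney U tests can locate which OSs differ.
win7 = vulns.loc[vulns["os"] == "windows_7", "risk_index"]
linux = vulns.loc[vulns["os"] == "linux", "risk_index"]
print(stats.mannwhitneyu(win7, linux, alternative="two-sided"))
```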
Procedia PDF Downloads 183
18091 Finding the Longest Common Subsequence in Normal DNA and Disease Affected Human DNA Using Self Organizing Map
Authors: G. Tamilpavai, C. Vishnuppriya
Abstract:
Bioinformatics is an active research area which combines biological subject matter with computer science research. The longest common subsequence (LCSS) is one of the major challenges in various bioinformatics applications. The computation of the LCSS plays a vital role in biomedicine and is also an essential task in DNA sequence analysis in genetics, underlying a wide range of disease-diagnosis steps. The objective of the proposed system is to find the longest common subsequence present in normal and various disease-affected human DNA sequences using a Self-Organizing Map (SOM) and LCSS. The human DNA sequences are collected from the National Center for Biotechnology Information (NCBI) database. Initially, each human DNA sequence is separated into k-mers using the k-mer separation rule. Mean and median values are calculated for each separated k-mer. These calculated values are fed as input to the Self-Organizing Map for the purpose of clustering. The obtained clusters are then given to the longest common subsequence (LCSS) algorithm to find the common subsequence present in every cluster. It returns n×(n-1)/2 subsequences for each cluster, where n is the number of k-mers in a specific cluster. Experimental outcomes of the proposed system produce the possible number of longest common subsequences of normal and disease-affected DNA data. Thus, the proposed system will be a good initial aid for finding disease-causing sequences. Finally, a performance analysis is carried out for different DNA sequences. The obtained values show that the retrieval of the LCSS is done in a shorter time than in the existing system.
Keywords: clustering, k-mers, longest common subsequence, SOM
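A minimal sketch, not the authors' implementation, of two of the steps described: numeric mean/median k-mer features of the kind that could feed a SOM, and a classic dynamic-programming LCS between two sequences from one cluster. The value of k, the base encoding, and the example sequences are assumptions.

```python
# k-mer features for SOM input, plus a dynamic-programming longest common subsequence.
import numpy as np

BASE = {"A": 1, "C": 2, "G": 3, "T": 4}   # simple numeric encoding (assumed)

def kmer_features(seq, k=5):
    """Mean and median of the encoded bases of each k-mer; one row per k-mer."""
    codes = np.array([BASE[b] for b in seq if b in BASE])
    kmers = [codes[i:i + k] for i in range(0, len(codes) - k + 1, k)]
    return np.array([[km.mean(), np.median(km)] for km in kmers])

def lcs(a, b):
    """Classic O(len(a)*len(b)) longest common subsequence via dynamic programming."""
    dp = [[0] * (len(b) + 1) for _ in range(len(a) + 1)]
    for i, ca in enumerate(a, 1):
        for j, cb in enumerate(b, 1):
            dp[i][j] = dp[i - 1][j - 1] + 1 if ca == cb else max(dp[i - 1][j], dp[i][j - 1])
    out, i, j = [], len(a), len(b)   # backtrack to recover one LCS string
    while i and j:
        if a[i - 1] == b[j - 1]:
            out.append(a[i - 1]); i -= 1; j -= 1
        elif dp[i - 1][j] >= dp[i][j - 1]:
            i -= 1
        else:
            j -= 1
    return "".join(reversed(out))

print(kmer_features("ACGTACGTAC", k=5))
print(lcs("ACGTACGT", "AGGTAGT"))   # e.g. a normal vs. a disease-affected fragment
```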
Procedia PDF Downloads 267
18090 Sustainability of Vernacular Architecture in Zegalli Houses in Northern Iran with Emphasis on Their Seismic Behavior
Authors: Mona Zaryoun, Mahmood Hosseini, Seyed Mohammad Hassan Khalkhali, Haniyeh Okhovat
Abstract:
Zegalli houses in Guilan province, northern Iran, are a type of vernacular house whose foundation, skeleton, and walls are all made of wood. These were the only houses that survived the major Manjil-Rudbar earthquake of 1990, which had a magnitude of 7.2. Regarding this fact, some researchers began considering the type of foundation used in these houses as a way to benefit from rocking behavior. In addition, the relatively light weight of these houses has helped them withstand seismic excitations well. In this paper, a brief description of Zegalli houses and their architectural features, with emphasis on their foundations, is first presented. In the next stage, the foundation of one of these houses is modeled as a sample using a computer program developed in the MATLAB environment, and, using the horizontal and vertical accelerograms of a set of selected site-compatible earthquakes, a series of time history analyses (THA) is carried out to investigate the behavior of this type of house against earthquakes. Based on the numerical results of the THA, it can be said that even without sliding at the foundation timbers, the rocking that occurs at various levels of the foundation alone reduces the seismic response of the house significantly, which results in their stability when subjected to earthquakes with a peak ground acceleration of around 0.35 g. Therefore, Zegalli houses can be considered sustainable Iranian vernacular architecture, and it is recommended that the use of these houses, their architecture, and their structural merits be reconsidered by architects as well as civil and structural engineers.
Keywords: MATLAB software, rocking behavior, time history analysis, Zegalli houses
Procedia PDF Downloads 288
18089 Emoji, the Language of the Future: An Analysis of the Usage and Understanding of Emoji across User-Groups
Authors: Sakshi Bhalla
Abstract:
On the one hand, given their seemingly simplistic, near-universal usage and understanding, emoji are dismissed as a potential step back in the evolution of communication. On the other, their effectiveness, pervasiveness, and adaptability across and within contexts are undeniable. In this study, the responses of 40 people (categorized by age) were recorded using a uniform two-part questionnaire in which they were required to a) identify the meaning of 15 emoji when placed in isolation, and b) interpret the meaning of the same 15 emoji when placed in a context-defining posting on Twitter. Their responses were studied on the basis of deviation from their own responses identifying the emoji in isolation, as well as from the originally intended meaning ascribed to each emoji. Based on an analysis of these results, it was discovered that each of the five age categories uses, understands, and perceives emoji differently, which could be attributed to the degree of exposure they have undergone. For example, in the case of the youngest category (aged < 20), it was observed that they were the least accurate at correctly identifying emoji in isolation (~55%). Further, their proclivity to change their response with respect to the context was also the lowest (~31%). However, an analysis of their individual responses showed that these first-borns of social media seem to have reached a point where emoji no longer suggest their most literal meanings to them. The meaning and implication of these emoji have evolved to imply their context-derived meanings, even when placed in isolation. These trends carry forward meaningfully for the other four groups as well. In the case of the oldest category (aged > 35), however, the trends indicated inaccuracy and, therefore, a higher proclivity to change their responses. When studied as a continuum, the responses indicate that, slowly and steadily, emoji are evolving from pictograms to ideograms. That is to say, they do not just indicate a one-to-one relation between a singular form and a singular meaning; in fact, they communicate increasingly complicated ideas. This is much like the evolution of ancient hieroglyphics on papyrus reed or cuneiform on Sumerian clay tablets, which evolved from simple pictograms to progressively more complex ideograms. This evolution within communication is parallel to, and contingent on, the simultaneous evolution of communication itself. What is astounding is the capacity of humans to leverage different platforms to facilitate such changes. Twitterese, as it is now called, is one of the instances where language is adapting to the demands of the digital world. That it does not have a spoken component or an ostensible grammar, and that it lacks standardization of use and meaning, as some might suggest, may seem like impediments to qualifying it as the 'language' of the digital world. However, such a declaration remains a function of time, and time alone.
Keywords: communication, emoji, language, Twitter
Procedia PDF Downloads 95
18088 Effect of Hydroxy Propyl Methyl Cellulose (HPMC) Coating in Combination with MgSO4 on Some Guava Cultivars
Authors: Muhammad Randhawa, Muhammad Nadeem
Abstract:
Guava (Psidium guajava L.) is a vital source of minerals, vitamins, dietary fiber, and antioxidants. The present study was designed owing to guava's highly perishable nature and its proneness to chilling injury, diseases, insect pests, and physical damage, which are the main drawbacks of guava after harvesting. Due to its delicate physiology, its economic importance, and the effects of pre- and postharvest factors and maturity indices, guava fruit should be given prime importance with respect to good quality attributes. In this study, guava fruits were stored at 10°C with 80% relative humidity after treatment with different levels of the sulphate salt of magnesium, followed by dipping in a cellulose-based edible coating, hydroxypropyl methylcellulose (HPMC). The main objective of this coating was to enhance the shelf life of guava by inhibiting respiration and by binding the dissolved solids through the salt application. Characterization of quality attributes, including physical, physiological, and biochemical analyses, was performed at 7-day intervals for as long as the fruit remained edible during the 4-week storage period. Finally, the data obtained were subjected to statistical analysis. It was concluded on a statistical basis that the Surahi variety (treated with 5% MgSO4) showed the best storage stability and kept its original quality for almost 23 days of storage.
Keywords: edible coating, guava cultivars, physicochemical attributes, storage
Procedia PDF Downloads 326
18087 Ubiquitous Learning Environments in Higher Education: A Scoping Literature Review
Authors: Mari A. Virtanen, Elina Haavisto, Eeva Liikanen, Maria Kääriäinen
Abstract:
Ubiquitous learning and the use of ubiquitous learning environments herald a new era in higher education. Ubiquitous environments fuse together authentic learning situations and digital learning spaces where students can seamlessly immerse themselves in the learning process. Definitions of ubiquitous learning are broad and vary in the previous literature, and learning environments are not systematically described. The aim of this scoping review was to identify the criteria for and the use of ubiquitous learning environments in higher education contexts. The objective was to provide a clear scope and a wide view of this research area. The original studies were collected from nine electronic databases. Seven publications in total were judged eligible and included in the final review. An inductive content analysis was used for the data analysis. The reviewed publications described the use of ubiquitous learning environments (ULE) in higher education. Components, contents, and outcomes varied between studies, but there were also many similarities. In these studies, the concept of ubiquitousness was defined in terms of context-awareness, embeddedness, content personalization, location-based delivery, interactivity, and flexibility, and these were supported by the use of smart devices, wireless networks, and sensing technologies. Contents varied between studies and were customized to specific uses. The measured outcomes in these studies focused on multiple aspects such as learning effectiveness, cost-effectiveness, satisfaction, and usefulness. This study provides a clear scope for ULEs used in higher education. It also raises the need for transparent development and publication processes, and for practical implications of ubiquitous learning environments.
Keywords: higher education, learning environment, scoping review, ubiquitous learning, u-learning
Procedia PDF Downloads 263
18086 The Relationship between Coping Styles and Internet Addiction among High School Students
Authors: Adil Kaval, Digdem Muge Siyez
Abstract:
As the negative effects of internet use on people's lives have become apparent, internet use itself has become an issue, and this subject has mostly been studied under the heading of internet addiction. Several theoretical models have been proposed in the literature to explain internet addiction; in addition to these models, the style of coping with stressful events may be a predictor of internet addiction. This study aimed to test, with logistic regression, the effect of high school students' coping styles on their internet addiction levels. The sample consisted of 770 Turkish adolescents (471 girls, 299 boys) selected from high schools in İzmir province in the 2017-2018 academic year. The Internet Addiction Test, the Coping Scale for Child and Adolescents, and a demographic information form were used in this study. The logistic regression analysis indicated that the coping-styles model provided a statistically significant prediction of internet addiction. Gender did not predict internet addiction. The active coping style had no significant effect on internet addiction levels, whereas the avoidant and negative coping styles did. With this model, 79.1% of internet addiction cases among the high school students were correctly estimated, and the Nagelkerke pseudo R² indicated that the model accounted for 35% of the total variance. The results of this study on Turkish adolescents are similar to those of other studies in the literature. It can be argued that avoidant and negative coping styles are important risk factors in the development of internet addiction.
Keywords: adolescents, coping, internet addiction, regression analysis
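For readers who want to see how an analysis of this kind is typically set up, the sketch below fits a binary logistic regression of internet addiction on coping styles and gender and reports a Nagelkerke pseudo R² together with the classification rate, mirroring the statistics reported in the abstract. It is only a minimal illustration under stated assumptions: the data file, column names, gender coding, and the addiction cutoff are invented for the example, not details taken from the study.

```python
# Minimal, hypothetical sketch: logistic regression of internet addiction on
# coping styles and gender, with Nagelkerke pseudo R^2 and classification rate.
# File name, column names, gender coding (0/1) and the IAT cutoff are assumed.
import numpy as np
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("adolescent_survey.csv")              # placeholder data file
df["addicted"] = (df["iat_score"] >= 80).astype(int)   # illustrative cutoff only

predictors = ["gender", "active_coping", "avoidant_coping", "negative_coping"]
X = sm.add_constant(df[predictors])
result = sm.Logit(df["addicted"], X).fit(disp=False)
print(result.summary())

# Nagelkerke pseudo R^2 from the fitted and null log-likelihoods
n = result.nobs
cox_snell = 1 - np.exp((2 / n) * (result.llnull - result.llf))
nagelkerke = cox_snell / (1 - np.exp((2 / n) * result.llnull))
print(f"Nagelkerke pseudo R^2: {nagelkerke:.2f}")

# Share of students classified correctly at the conventional 0.5 threshold
accuracy = ((result.predict(X) >= 0.5).astype(int) == df["addicted"]).mean()
print(f"Correctly classified: {accuracy:.1%}")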
Procedia PDF Downloads 174
18085 Sociology Perspective on Emotional Maltreatment: Retrospective Case Study in a Japanese Elementary School
Authors: Nozomi Fujisaka
Abstract:
This sociological case study analyzes a sequence of student maltreatment in an elementary school in Japan, based on narratives from former students. Among various forms of student maltreatment, emotional maltreatment has received less attention. One reason for this is that emotional maltreatment is often considered part of education and is difficult to capture in surveys. To discuss the challenge of recognizing emotional maltreatment, it is necessary to consider the social background in which student maltreatment occurs. Therefore, from the perspective of the sociology of education, this study aims to clarify the process through which emotional maltreatment was embraced by students within a Japanese classroom. The focus of this study is a series of educational interactions by a homeroom teacher with 11- or 12-year-old students at a small public elementary school approximately 10 years ago. The research employs retrospective narrative data collected through interviews and autoethnography. The semi-structured interviews, lasting one to three hours each, were conducted with 11 young people who were enrolled in the same class as the researcher during their time in elementary school. Autoethnography, as a critical research method, contributes to existing theories and studies by providing a critical representation of the researcher's own experiences, and it enables researchers to collect detailed data that are often difficult to verbalize in interviews. These research methods are well suited to this study, which aims to shift the focus from teachers' educational intentions to students' perspectives and gain a deeper understanding of student maltreatment. The research results imply a pattern of emotional maltreatment that is challenging to differentiate from education. In this study's case, the teacher displayed calm and kind behavior toward students after a threat and an explosion of anger. Former students frequently mentioned this behavior of the teacher and perceived emotional maltreatment as part of education. It was not uncommon for former students to offer positive evaluations of the teacher despite experiencing emotional distress. These findings are analyzed and discussed in conjunction with deschooling theory and the cycle of violence theory. Deschooling theory provides a sociological explanation for how emotional maltreatment can be overlooked in society. The cycle of violence theory, originally developed within the context of domestic violence, explains how violence between romantic partners can be tolerated due to prevailing social norms. Analyzing the case in association with these two theories highlights the characteristics of teachers' behaviors that rationalize maltreatment as education and hinder students from escaping emotional maltreatment. This study deepens our understanding of the causes of student maltreatment and provides a new perspective for future qualitative and quantitative research. Furthermore, since this research is based on the sociology of education, it has the potential to expand research in the fields of pedagogy and sociology, in addition to psychology and social welfare.
Keywords: emotional maltreatment, education, student maltreatment, Japan
Procedia PDF Downloads 84
18084 Impact of Behavioral Biases on Indian Investors: Case Analysis of a Mutual Fund Investment Company
Authors: Priyal Motwani, Garvit Goel
Abstract:
In this study, we analysed the transaction data of investors of a mutual fund investment company based in India. Based on the data available, we identified, through regression analysis and three uniquely defined ratios, the top four biases that affect investors in emerging market economies. We found that the four most prominent biases affecting investment decisions in India are: chauffeur knowledge (investors tend to make ambitious decisions about sectors they know little about); the bandwagon effect (the response of market indices to macroeconomic events is more pronounced and seems to last longer than in Western markets); base-rate neglect (judgements about stocks rely too heavily on the most recent developments, ignoring the long-term fundamentals of the stock); and availability bias (a lack of proper communication channels for market information leads people to rely too much on the limited information they already have). After segregating the investors into six groups, the results were further examined to identify correlations between the demographics, gender, and unique cultural identity of the derived groups and the corresponding prevalent biases. On the basis of the results obtained from the derived groups, our study recommends six methods, specific to each group, to educate investors about the prevalent biases and their role in investment decision making.
Keywords: Bandwagon effect, behavioural biases, Chauffeur knowledge, demographics, investor literacy, mutual funds
Procedia PDF Downloads 230
18083 Impact of Drought on Agriculture in the Upper Middle Gangetic Plain in India
Authors: Reshmita Nath
Abstract:
In this study, we investigate the spatiotemporal characteristics of drought in India and its impact on agriculture during the summer season (April to September). For our analysis, we used Standardized Precipitation Evapotranspiration Index (SPEI) datasets between 1982 and 2012 at a six-month timescale. Based on the criterion SPEI < -1, we obtained a vulnerability map and found that the humid subtropical Upper Middle Gangetic Plain (UMGP) region is highly drought-prone, with an occurrence frequency of 40-45%. This UMGP region contributes at least 18-20% of India’s annual cereal production. Not only is the drought probability high, but the region has also become increasingly drought-prone in recent decades. Moreover, cereal production in the UMGP has shown a gradual declining trend from 2000 onwards, a feature consistent with the increase in drought-affected area from 20-25% before 2000 to 50-60% after 2000. The strong correlation coefficient (-0.69) between changes in cereal production and drought-affected area confirms that at least 50% of the agricultural (cereal) losses are associated with drought. Analyzing the individual impacts of precipitation and surface temperature anomalies on SPEI (6), we found that surface temperature plays the primary role in lowering SPEI in the UMGP region. This linkage is further confirmed by the correlation analysis between SPEI (6) and surface temperature rise, which exhibits strongly negative values in the UMGP region. Higher temperatures may have caused more evaporation and drying, thereby increasing the area affected by drought in the recent decade.
Keywords: drought, agriculture, SPEI, Indo-Gangetic plain
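As a rough illustration of the kind of computation described in this abstract, the sketch below applies the SPEI < -1 criterion to a gridded SPEI-6 series to derive summer drought frequency and annual drought-affected area, and then correlates the latter with year-to-year changes in cereal production. The file names, column layout, and gridding are assumptions made for the example, not details from the study.

```python
# Hypothetical sketch: drought frequency and drought-affected area from a
# monthly SPEI-6 table (one column per grid cell), correlated with changes
# in annual cereal production. Inputs and layout are assumed, not from the study.
import numpy as np
import pandas as pd

spei = pd.read_csv("umgp_spei6_monthly.csv", index_col="date", parse_dates=True)
production = pd.read_csv("umgp_cereal_production.csv", index_col="year")["cereal_mt"]

summer = spei[spei.index.month.isin(range(4, 10))]      # April to September

# Drought occurrence frequency per grid cell: share of summer months with SPEI-6 < -1
drought_frequency = (summer < -1).mean()
print(drought_frequency.describe())

# Annual drought-affected area: mean fraction of grid cells in drought each summer
affected_area = (summer < -1).mean(axis=1).groupby(summer.index.year).mean()

# Correlation between year-to-year changes in cereal production and affected area
changes = production.diff().dropna()
years = changes.index.intersection(affected_area.index)
r = np.corrcoef(changes.loc[years], affected_area.loc[years])[0, 1]
print(f"Correlation (production change vs. drought-affected area): {r:.2f}")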
Procedia PDF Downloads 258
18082 Formulation and Evaluation of Glimepiride (GMP)-Solid Nanodispersion and Nanodispersed Tablets
Authors: Ahmed. Abdel Bary, Omneya. Khowessah, Mojahed. al-jamrah
Abstract:
Introduction: The major challenge in the design of oral dosage forms lies in their poor bioavailability, and the most frequent causes of low oral bioavailability are poor solubility and low permeability. The aim of this study was to develop a solid nanodispersed tablet formulation of glimepiride to enhance its solubility and bioavailability. Methodology: Solid nanodispersions of glimepiride (GMP) were prepared using two different ratios of two carriers, PEG 6000 and Pluronic F127, and two different techniques, solvent evaporation and fusion. A 2³ full factorial design was adopted to investigate the influence of the formulation variables on the properties of the prepared nanodispersions. The best formula of the nanodispersed powder was formulated into tablets by direct compression. Differential Scanning Calorimetry (DSC) and Fourier Transform Infrared (FTIR) analyses were conducted to characterize thermal behavior and surface structure, respectively, and the zeta potential and particle size of the prepared glimepiride nanodispersions were determined. The prepared solid nanodispersions and solid nanodispersed tablets of GMP were evaluated in terms of pre-compression and post-compression parameters, respectively. Results: The DSC and FTIR studies revealed no interaction between GMP and any of the excipients used. Based on the pre-compression parameters, the prepared solid nanodispersion powder blends showed poor to excellent flow properties, and the other evaluated pre-compression parameters were within pharmacopoeial limits. The drug content of the prepared nanodispersions ranged from 89.6 ± 0.3% to 99.9 ± 0.5%, particle size ranged from 111.5 nm to 492.3 nm, and the zeta potential (ζ) values of the GMP solid nanodispersion formulae (F1-F8) ranged from -8.28 ± 3.62 mV to -78 ± 11.4 mV. The in vitro dissolution studies of the prepared solid nanodispersed tablets of GMP showed that the GMP-Pluronic F127 combination (F8) exhibited the greatest extent of drug release compared with the other formulations and with the marketed product. One-way ANOVA of the percent drug released after 20 and 60 minutes showed significant differences among the GMP nanodispersed tablet formulae (F1-F8) (P < 0.05). Conclusion: Preparing glimepiride as nanodispersed particles proved to be a promising tool for enhancing its poor solubility.
Keywords: glimepiride, solid nanodispersion, nanodispersed tablets, poorly water soluble drugs
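To make the experimental layout concrete, the sketch below builds a 2³ full factorial design over three formulation variables and runs a one-way ANOVA on replicate percent-drug-released values for the eight formulae (F1-F8), as the abstract describes. The carrier-to-drug ratio levels and all numeric release values are placeholders invented for illustration, not data from the study.

```python
# Hypothetical sketch of a 2^3 full factorial design and a one-way ANOVA on
# % drug released for eight formulae (F1-F8). Ratio levels and release values
# are placeholders, not data reported in the study.
from itertools import product

import pandas as pd
from scipy import stats

factors = {
    "carrier": ["PEG 6000", "Pluronic F127"],
    "drug_to_carrier_ratio": ["1:1", "1:2"],           # assumed levels
    "technique": ["solvent evaporation", "fusion"],
}
design = pd.DataFrame(list(product(*factors.values())), columns=list(factors))
design.index = [f"F{i}" for i in range(1, 9)]          # eight runs, F1-F8
print(design)

# One-way ANOVA on % drug released at a single time point (e.g. 20 minutes),
# with three hypothetical replicates per formula.
release_20min = {
    "F1": [52.1, 53.4, 51.8], "F2": [55.0, 54.2, 56.1],
    "F3": [60.3, 59.8, 61.0], "F4": [62.5, 63.1, 61.9],
    "F5": [70.2, 69.5, 71.0], "F6": [72.8, 73.4, 72.1],
    "F7": [80.6, 81.2, 79.9], "F8": [88.4, 89.0, 87.7],
}
f_stat, p_value = stats.f_oneway(*release_20min.values())
print(f"One-way ANOVA across F1-F8: F = {f_stat:.2f}, p = {p_value:.4f}")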
Procedia PDF Downloads 488