Search results for: data quality
27999 Clinical Response of Nuberol Forte® (Paracetamol 650 mg + Orphenadrine 50 mg) for Pain Management with Musculoskeletal Conditions in Routine Pakistani Practice (NFORTE-EFFECT)
Authors: Shahid Noor, Kazim Najjad, Muhammad Nasir, Irshad Bhutto, Abdul Samad Memon, Khurram Anwar, Tehseen Riaz, Mian Muhammad Hanif, Nauman A. Mallik, Saeed Ahmed, Israr Ahmed, Ali Yasir
Abstract:
Background: Musculoskeletal pain is the most common complaint presented to the health practitioner. It is well known that untreated or under-treated pain can have a significant negative impact on an individual’s quality of life (QoL). Objectives: This study was conducted across 10 sites in six (6) major cities of Pakistan to evaluate the tolerability, safety, and clinical response of Nuberol Forte® (Paracetamol 650 mg + Orphenadrine 50 mg) in musculoskeletal pain in routine Pakistani practice and its impact on improving patients’ QoL. Design & Methods: This NFORTE-EFFECT observational, prospective, multicenter study was conducted in compliance with Good Clinical Practice (GCP) guidelines and local regulatory requirements. The study sponsor was The Searle Company Limited, Pakistan. To maintain GCP compliance, the sponsor assigned a CRO for site and data management. Ethical approval was obtained from an independent ethics committee (IEC), which also reviewed the progress of the study. Written informed consent was obtained from the study participants, and their confidentiality was maintained throughout the study. A total of 399 patients with prescreened musculoskeletal conditions and pain who attended the study sites were recruited as per the inclusion/exclusion criteria (clinicaltrials.gov ID# NCT04765787). The recruited patients were then prescribed the Paracetamol (650 mg) and Orphenadrine (50 mg) combination (Nuberol Forte®) for 7 to 14 days at the investigator's discretion, based on pain intensity. After the initial screening (visit 1), a follow-up visit was conducted after 1-2 weeks of treatment (visit 2). Study Endpoints: The primary objective was to assess the pain-management response to Nuberol Forte treatment and the overall safety of the drug. The Visual Analogue Scale (VAS) was used to measure pain severity. As a secondary endpoint, the patients' health-related quality of life (HRQoL) was assessed using the Muscle, Joint Measure (MJM) scale. Safety was monitored from the first dose onwards. These assessments were done at each study visit. Results: Of the 399 enrolled patients, 49.4% were male and 50.6% were female, with a mean age of 47.24 ± 14.20 years. Most patients presented with knee osteoarthritis (OA) (148; 38%), followed by backache (70; 18.2%). A significant reduction in the mean pain score was observed after treatment with the combination of Paracetamol and Orphenadrine (p<0.05). Furthermore, an overall improvement in the patients’ QoL was also observed. During the study, only ten patients reported mild adverse events (AEs). Conclusion: The combination of Paracetamol and Orphenadrine (Nuberol Forte®) provided effective pain management in patients with musculoskeletal conditions and also improved their QoL.
Keywords: musculoskeletal pain, orphenadrine/paracetamol combination, pain management, quality of life, Pakistani population
Procedia PDF Downloads 169
27998 The Effect of Brand Recovery Communications on Embarrassed Consumers’ Cognitive Appraisal and Post-purchase Behavior
Authors: Kin Yan Ho
Abstract:
Negative brand news (such as Volkswagen’s faulty carbon emission reports, China’s Luckin Coffee scandal, and bribery in reputable US universities) influences how people perceive a company. German citizens described Volkswagen’s scandal as a national embarrassment, and their psychological damage could not be repaired through monetary or non-monetary compensation. The main research question is to examine how consumers evaluate and respond to embarrassing brand publicity. Cognitive appraisal theory is used as the theoretical foundation. This study describes the use of a scenario-based experiment. The findings suggest that consumers with different levels of embarrassment evaluate brand remedial offers from emotion-focused and task-focused restorative justice perspectives (newly derived from the well-established scales of perceived justice). When consumers face both negative and positive brand information (i.e., negative publicity news and a remedial offer), they change their appraisal criterion. The social situation in the cognitive reappraisal process influences the quality of the customer-brand relationship and the customer’s recovery from brand embarrassment. The results also show that the components of recovery compensation cause differences in emotion recovery, relationship quality, and repurchase intentions. This study extends the embarrassment literature to the context of embarrassing brand publicity. The emotional components of brand remedial tactics provide brand managers with insights on how to handle different consumers’ emotions, improve consumer satisfaction, and foster positive future behavior.
Keywords: brand relationship quality, cognitive appraisal, crisis communications, emotion, justice, social presence
Procedia PDF Downloads 134
27997 Presentation of the Model of Reliability of the Signaling System with Emphasis on Determining Best Time Schedule for Repairments and Preventive Maintenance in the Iranian Railway
Authors: Maziar Yazdani, Ahmad Khodaee, Fatemeh Hajizadeh
Abstract:
The purpose of this research was to analyse the reliability of the signaling system in the railway and to plan the repair and maintenance of its subsystems. To this end, practical strategies for activity control and appropriate planning of repair and preventive maintenance are introduced through statistical modeling of reliability. Modeling, evaluation, and improvement of the reliability of the signaling system therefore appear critical. Among the key goals of the railway is the provision of quality service for passengers, and this purpose is achieved by increasing reliability, availability, maintainability and safety (RAMS). In this research, data were analyzed, the reliability of the subsystems and of the entire system was calculated, and, with emphasis on preserving the performance of each subsystem at a reliability of 80%, a plan for the repair and preventive maintenance of the subsystems of the signaling system was introduced.
Keywords: reliability, modeling reliability, plan for repair and preventive maintenance, signaling system
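The abstract reports subsystem and whole-system reliabilities, with preventive maintenance scheduled so that each subsystem stays above 80% reliability. The paper does not state its failure model or data, so the sketch below is purely illustrative: it assumes a series system with exponentially distributed failures, and the failure rates are invented placeholders.

```python
import numpy as np

# Hypothetical subsystem failure rates (failures per 1000 hours); the actual
# values would come from the railway's recorded failure data.
failure_rates = {"track_circuit": 0.012, "interlocking": 0.004, "signal_lamp": 0.020}

def subsystem_reliability(rate, t):
    """Reliability of one subsystem at time t, assuming exponential failures."""
    return np.exp(-rate * t)

def maintenance_interval(rate, target=0.80):
    """Largest t for which R(t) >= target, i.e. the latest point at which
    preventive maintenance should be scheduled to keep reliability at 80%."""
    return -np.log(target) / rate

t = 10.0  # thousand hours
system_R = np.prod([subsystem_reliability(r, t) for r in failure_rates.values()])
print(f"Series-system reliability at t={t}: {system_R:.3f}")
for name, r in failure_rates.items():
    print(f"{name}: maintain every {maintenance_interval(r):.1f} thousand hours")
```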
Procedia PDF Downloads 184
27996 Accurate HLA Typing at High-Digit Resolution from NGS Data
Authors: Yazhi Huang, Jing Yang, Dingge Ying, Yan Zhang, Vorasuk Shotelersuk, Nattiya Hirankarn, Pak Chung Sham, Yu Lung Lau, Wanling Yang
Abstract:
Human leukocyte antigen (HLA) typing from next generation sequencing (NGS) data has the potential for applications in clinical laboratories and population genetic studies. Here we introduce a novel technique for HLA typing from NGS data based on read mapping using a comprehensive reference panel containing all known HLA alleles and de novo assembly of the gene-specific short reads. Accurate HLA typing at high-digit resolution was achieved when the method was tested on publicly available NGS data, outperforming other newly developed tools such as HLAminer and PHLAT.
Keywords: human leukocyte antigens, next generation sequencing, whole exome sequencing, HLA typing
Procedia PDF Downloads 664
27995 Early Childhood Education: Teachers Ability to Assess
Authors: Ade Dwi Utami
Abstract:
Pedagogic competence is the basic competence of teachers to perform their tasks as educators. The ability to assess has become one of the demands within teachers' pedagogic competence. Teachers' ability to assess is related to curriculum instructions and applications. This research is aimed at obtaining data concerning teachers' ability to assess, comprising understanding assessment; determining assessment type, tools and procedure; conducting the assessment process; and using assessment result information. It uses a mixed-method explanatory design in which qualitative data are used to verify the quantitative data obtained through a survey. The quantitative data were collected by test, whereas the qualitative data were collected by observation, interview and documentation. The analyzed data were then processed through a proportion study technique and categorized into high, medium and low. The results of the research show that teachers' ability to assess can be grouped into three levels: 2% high, 4% medium and 94% low. The data show that teachers' ability to assess is still relatively low. Teachers lack knowledge and comprehension in assessment application. This statement is verified by the qualitative data, which show that teachers did not state which aspect was assessed in learning, did not record children's behavior, and did not use the resulting data as a consideration in designing a program. Teachers have assessment documents, yet these only serve as a means of completing teachers' administration for the certification program. Thus, assessment documents were not used on the basis of acquired knowledge. This condition should become a consideration for teacher-education institutions and the government in improving teachers' pedagogic competence, including the ability to assess.
Keywords: assessment, early childhood education, pedagogic competence, teachers
Procedia PDF Downloads 246
27994 Creating Smart and Healthy Cities by Exploring the Potentials of Emerging Technologies and Social Innovation for Urban Efficiency: Lessons from the Innovative City of Boston
Authors: Mohammed Agbali, Claudia Trillo, Yusuf Arayici, Terrence Fernando
Abstract:
The widespread adoption of the Smart City concept has introduced a new era of computing paradigms, with opportunities for city administrators and stakeholders in various sectors to re-think the concept of urbanization and the development of healthy cities. With the world population rapidly becoming urban-centric, especially in the emerging economies, social innovation will greatly assist in deploying emerging technologies to address the development challenges in core sectors of future cities. In this context, sustainable health-care delivery and improved quality of life of the people are considered at the heart of the healthy city agenda. This paper examines the Boston innovation landscape from the perspective of smart services and the innovation ecosystem for sustainable development, especially in transportation and healthcare. It investigates the policy implementation process of the Healthy City agenda and eHealth economy innovation based on the experience of Massachusetts’s City of Boston initiatives. For this purpose, three emerging areas are emphasized, namely the eHealth concept, the innovation hubs, and the emerging technologies that drive innovation. This was carried out through empirical analysis of the results of public sector and industry-wide interviews/surveys about Boston’s current initiatives and the enabling environment. The paper highlights a few potential research directions for service integration and social innovation for deploying emerging technologies in the healthy city agenda. The study therefore suggests the need to prioritize social innovation as an overarching strategy for building sustainable Smart Cities in order to avoid technology lock-in. Finally, it concludes that the Boston example of an innovation economy is unique in view of the existing platforms for innovation and a proper understanding of its dynamics, which is imperative in building smart and healthy cities where the quality of life of the citizenry can be improved.
Keywords: computing paradigm, emerging technologies, equitable healthcare, healthy cities, open data, smart city, social innovation
Procedia PDF Downloads 336
27993 Perception of Public Transport Quality of Service among Regular Private Vehicle Users in Five European Cities
Authors: Juan de Ona, Esperanza Estevez, Rocío de Ona
Abstract:
Urban traffic levels can be reduced by drawing travelers away from private vehicles and over to public transport. This modal change can be achieved either by introducing restrictions on private vehicles or by introducing measures which increase people’s satisfaction with public transport. For public transport users, quality of service affects customer satisfaction, which, in turn, influences behavioral intentions towards the service. This paper aims to identify the main attributes which influence the perception private vehicle users have of the public transport services provided in five European cities: Berlin, Lisbon, London, Madrid and Rome. Ordinal logit models have been applied to an online panel survey with a sample size of 2,500 regular private vehicle users (approximately 500 inhabitants per city). To achieve a comprehensive analysis and to deal with heterogeneity in perceptions, 15 models have been developed: one for the entire sample and 14 for user segments. The results show differences between the cities and among the segments. Madrid was taken as the reference city, and the results indicate that its inhabitants are satisfied with public transport and that the most important public transport service attributes for private vehicle users are frequency, speed and intermodality. Frequency is an important attribute for all the segments, while speed and intermodality are important for most of them. An analysis by segments has identified attributes which, although not important in most cases, are relevant for specific segments. This study also points out important differences between the five cities. Findings from this study can be used to develop policies and recommendations for persuading private vehicle users to switch to public transport.
Keywords: service quality, satisfaction, public transportation, private vehicle users, car users, segmentation, ordered logit
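The modelling step described above (an ordinal logit fit on survey satisfaction ratings) can be sketched as follows; the variable names and toy data are assumptions for illustration, not the authors' panel data, which covered many more attributes and 2,500 respondents.

```python
import pandas as pd
from statsmodels.miscmodels.ordinal_model import OrderedModel

# Hypothetical survey extract: overall satisfaction (ordinal, 1-5) and ratings
# of three service attributes highlighted in the abstract.
df = pd.DataFrame({
    "satisfaction":  [1, 2, 3, 4, 5, 3, 4, 2, 5, 4, 3, 1],
    "frequency":     [1, 2, 3, 4, 5, 3, 4, 2, 5, 5, 2, 1],
    "speed":         [2, 2, 3, 4, 4, 3, 5, 1, 5, 4, 3, 2],
    "intermodality": [1, 3, 2, 4, 5, 4, 4, 2, 4, 5, 3, 1],
})

model = OrderedModel(df["satisfaction"],
                     df[["frequency", "speed", "intermodality"]],
                     distr="logit")
result = model.fit(method="bfgs", disp=False)
print(result.summary())  # positive coefficients -> the attribute raises satisfaction
```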
Procedia PDF Downloads 117
27992 Improvement of Water Quality of Al Asfar Lake Using Constructed Wetland System
Authors: Jamal Radaideh
Abstract:
Al-Asfar Lake is located about 14 km east of Al-Ahsa and is one of the most important wetland lakes in the Al-Ahsa/Eastern Province of Saudi Arabia. Al-Ahsa is perhaps the largest oasis in the world, with an area of 20,000 hectares; in addition, it is one of the largest and oldest agricultural centers in the region. The surplus farm irrigation water, together with additional water supplied as treated wastewater from the Al-Hofuf sewage station, is collected by a drainage network and discharged into Al-Asfar Lake. The lake has good wetlands, sand dunes as well as large expanses of open and shallow water. Salt-tolerant vegetation is present in some of the shallow areas around the lake, and huge stands of Phragmites reeds occur around the lake. The lake presents an important habitat for wildlife and birds, something one would not expect to find in a large desert. Although high evaporation rates in the range of 3250 mm are common, the water that remains in the evaporation lakes during all seasons of the year is used to supply cattle with drinking water and for aquifer recharge. Investigations showed that high concentrations of nitrogen (N), phosphorus (P), biological oxygen demand (BOD), chemical oxygen demand (COD) and salinity are discharged to Al-Asfar Lake from the D2 drain. It is expected that the majority of the BOD, COD and N originates from wastewater discharge and from leachate from surplus irrigation water, which also contributes the majority of the P and salinity. The significant content of nutrients and biological oxygen demand reduces the available oxygen in the water. The present project aims to improve the water quality of the lake using constructed wetland trains which will be built around the lake. Phragmites reeds, which already occur around the lake, will be used.
Keywords: Al Asfar lake, constructed wetland, water quality, water treatment
Procedia PDF Downloads 449
27991 Effect of Texturised Soy Protein and Yeast on the Instrumental and Sensory Quality of Hybrid Beef Meatballs
Authors: Simona Grasso, Gabrielle Smith, Sophie Bowers, Oluseyi Moses Ajayi, Mark Swainson
Abstract:
Hybrid meat analogues are meat products in which a proportion of the meat has been partially replaced by more sustainable protein sources. These products could bridge the gap between meat and meat-free products, providing convenience and allowing consumers to continue using meat products as they conventionally would, while lowering their overall meat intake. The study aimed to investigate the effect of introducing texturised soy protein (TSP) at different levels (15% and 30%), with and without nutritional yeast as a flavour enhancer, on the sensory and instrumental quality of beef meatballs, compared to a soy- and yeast-free control. Proximate analysis, yield, colour, instrumental texture, and sensory quality were investigated. The addition of soy and yeast did not have significant effects on the overall protein content, but the total fat and moisture content went down with increasing soy substitution. Samples with 30% TSP had a significantly higher yield than the other recipes. In terms of colour, a* redness values tended to go down and b* yellowness values tended to go up with increasing soy addition. The addition of increasing levels of soy and yeast modified the structure of the meatballs, resulting in a progressive decrease in hardness and chewiness compared to the control. Sixty participants assessed the samples using check-all-that-apply (CATA) questions and hedonic scales. The texture of all TSP-containing samples received significantly higher acceptability scores than the control, while 15% TSP with yeast received significantly higher flavour and overall acceptability scores than the control. Control samples were significantly more often associated than the other recipes with the term 'hard' and were the least associated with 'soft' and 'crumbly and easy to cut'. All recipes were similarly associated with the terms 'weak meaty', 'strong meaty', 'characteristic' and 'unusual'. Correspondence analysis separated the meatballs into three distinct groups: 1) control; 2) 30% TSP with yeast; and 3) 15% TSP, 15% TSP with yeast and 30% TSP, which were located together on the sensory map, showing similarity. Adding 15-30% TSP, with or without yeast inclusion, could be beneficial for the development of future meat hybrids with acceptable sensory quality. These results can provide encouragement for the use of the hybrid concept by the meat industry to promote the partial substitution of meat in flexitarians’ diets.
Keywords: CATA, hybrid meat products, texturised soy protein, yeast
Procedia PDF Downloads 165
27990 Application of Combined Cluster and Discriminant Analysis to Make the Operation of Monitoring Networks More Economical
Authors: Norbert Magyar, Jozsef Kovacs, Peter Tanos, Balazs Trasy, Tamas Garamhegyi, Istvan Gabor Hatvani
Abstract:
Water is one of the most important common resources, and as a result of urbanization, agriculture, and industry it is becoming more and more exposed to potential pollutants. Preventing the deterioration of water quality is a crucial task for environmental scientists. To achieve this aim, the operation of monitoring networks is necessary. In general, these networks have to meet many important requirements, such as representativeness and cost efficiency. However, existing monitoring networks often include sampling sites which are unnecessary. With the elimination of these sites the monitoring network can be optimized, and it can operate more economically. The aim of this study is to illustrate the applicability of CCDA (Combined Cluster and Discriminant Analysis) to the field of water quality monitoring and to optimize the monitoring networks of a river (the Danube), a wetland-lake system (Kis-Balaton & Lake Balaton), and two surface-subsurface water systems, on the watershed of Lake Neusiedl/Lake Fertő and in the Szigetköz area, over a period of approximately two decades. CCDA combines two multivariate data analysis methods: hierarchical cluster analysis and linear discriminant analysis. Its goal is to determine homogeneous groups of observations, in our case sampling sites, by comparing the goodness of preconceived classifications obtained from hierarchical cluster analysis with random classifications. The main idea behind CCDA is that if the ratio of correctly classified cases for a grouping is higher than at least 95% of the ratios for the random classifications, then at the given level of significance (α=0.05) the sampling sites do not form a homogeneous group. Because the sampling on Lake Neusiedl/Lake Fertő was conducted at the same time at all sampling sites, it was possible to visualize the differences between the sampling sites belonging to the same or different groups on scatterplots. Based on the results, the monitoring network of the Danube yields redundant information over certain sections, so that of 12 sampling sites, 3 could be eliminated without loss of information. In the case of the wetland (Kis-Balaton), one pair of sampling sites out of 12 could be discarded, and in the case of Lake Balaton, 5 out of 10. For the groundwater system of the catchment area of Lake Neusiedl/Lake Fertő, all 50 monitoring wells are necessary; there is no redundant information in the system. The number of sampling sites on Lake Neusiedl/Lake Fertő itself can be decreased to approximately half of the original number. Furthermore, neighbouring sampling sites were compared pairwise using CCDA, and the results were plotted on diagrams or isoline maps showing the location of the greatest differences. These results can help researchers decide where to place new sampling sites. The application of CCDA proved to be a useful tool in the optimization of monitoring networks for different types of water bodies. Based on the results obtained, the monitoring networks can be operated more economically.
Keywords: combined cluster and discriminant analysis, cost efficiency, monitoring network optimization, water quality
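A minimal sketch of the CCDA test described above, on synthetic data: sampling sites are grouped by hierarchical clustering of their mean water-quality values, linear discriminant analysis gives the ratio of correctly classified cases for that grouping, and that ratio is compared with the ratios obtained from random classifications (the 95% rule). The data, group count and number of random runs are assumptions for illustration only.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)
# Hypothetical data: rows = observations, columns = water quality parameters;
# `sites` records which of 12 sampling sites each observation belongs to.
X = rng.normal(size=(120, 6))
sites = rng.integers(0, 12, size=120)

# Step 1: preconceived grouping of the 12 sites via hierarchical clustering
site_means = np.array([X[sites == s].mean(axis=0) for s in range(12)])
groups_of_sites = fcluster(linkage(site_means, method="ward"), t=3, criterion="maxclust")
labels = groups_of_sites[sites]            # group label of every observation

def correct_ratio(y):
    lda = LinearDiscriminantAnalysis().fit(X, y)
    return (lda.predict(X) == y).mean()

observed = correct_ratio(labels)

# Step 2: compare with random classifications of the same group sizes.
# If the observed ratio beats at least 95% of the random ones, the grouped
# sites are not homogeneous at alpha = 0.05.
random_ratios = np.array([correct_ratio(rng.permutation(labels)) for _ in range(199)])
print(observed, (observed > random_ratios).mean())
```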
Procedia PDF Downloads 349
27989 Project-Based Learning Application: Applying Systems Thinking Concepts to Assure Continuous Improvement
Authors: Kimberley Kennedy
Abstract:
The major findings of this study concern the importance of understanding and applying systems thinking concepts to ensure an effective Project-Based Learning environment. A pilot project study of a major pedagogical change was conducted over a five-year period with the goal of giving students real-world, hands-on learning experiences and the opportunity to apply what they had learned over the previous two years of their business program. The first two weeks of the fifteen-week semester utilized teaching methods of lectures, guest speakers and design thinking workshops to prepare students for the project work. For the remaining thirteen weeks of the semester, the students worked with actual business owners and clients on projects and challenges. The first three years of the five-year study focused on student feedback to ensure a quality learning experience, and a continuous improvement process was developed. In the final two years of the study, the conceptual understanding and perception of learning and teaching by faculty using Project-Based Learning pedagogy, as compared to lectures and more traditional teaching methods, were examined. Relevant literature was reviewed and data collected from program faculty participants who completed pre- and post-semester interviews and surveys over a two-year period. Systems thinking concepts were applied to better understand the challenges for faculty using Project-Based Learning pedagogy as compared to more traditional teaching methods. Factors such as instructor and student fatigue, motivation, quality of work and enthusiasm were explored to better understand how to provide faculty with effective support and resources when using Project-Based Learning pedagogy as the main teaching method. This study provides value by presenting generalizable, foundational knowledge and by offering suggestions for practical solutions to assure student and teacher engagement in Project-Based Learning courses.
Keywords: continuous improvement, project-based learning, systems thinking, teacher engagement
Procedia PDF Downloads 119
27988 Statistical Analysis for Overdispersed Medical Count Data
Authors: Y. N. Phang, E. F. Loh
Abstract:
Many researchers have suggested the use of zero-inflated Poisson (ZIP) and zero-inflated negative binomial (ZINB) models in modeling over-dispersed medical count data with extra variation caused by excess zeros and unobserved heterogeneity. These studies indicate that ZIP and ZINB always provide a better fit than the standard Poisson and negative binomial models in modeling over-dispersed medical count data. In this study, we propose the use of zero-inflated inverse trinomial (ZIIT), zero-inflated Poisson inverse Gaussian (ZIPIG) and zero-inflated strict arcsine (ZISA) models in modeling over-dispersed medical count data. These proposed models are not widely used by researchers, especially in the medical field. The results show that these three suggested models can serve as alternative models in modeling over-dispersed medical count data. This is supported by the application of these suggested models to a real-life medical data set. The inverse trinomial, Poisson inverse Gaussian, and strict arcsine distributions are discrete distributions with a cubic variance function of the mean. Therefore, ZIIT, ZIPIG and ZISA are able to accommodate data with excess zeros and very heavy tails. They are recommended for modeling over-dispersed medical count data when ZIP and ZINB are inadequate.
Keywords: zero inflated, inverse trinomial distribution, Poisson inverse Gaussian distribution, strict arcsine distribution, Pearson’s goodness of fit
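ZIIT, ZIPIG and ZISA are not available in common statistical packages, so the sketch below only illustrates the general zero-inflated workflow the abstract builds on, fitting the ZIP baseline from statsmodels to simulated over-dispersed counts; the proposed models would be fitted and compared (for example by AIC) in the same fashion. The simulated data and parameters are assumptions.

```python
import numpy as np
import statsmodels.api as sm
from statsmodels.discrete.count_model import ZeroInflatedPoisson

rng = np.random.default_rng(1)
n = 500
x = rng.normal(size=n)
# Hypothetical over-dispersed medical counts: structural zeros added on top of
# a Poisson process to mimic excess zeros and unobserved heterogeneity.
y = rng.poisson(np.exp(0.5 + 0.8 * x))
y[rng.random(n) < 0.3] = 0                 # extra (structural) zeros

exog = sm.add_constant(x)
zip_fit = ZeroInflatedPoisson(y, exog, exog_infl=np.ones((n, 1))).fit(disp=False)
print(zip_fit.summary())
print("AIC:", zip_fit.aic)                 # compare with ZIIT / ZIPIG / ZISA fits
```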
Procedia PDF Downloads 544
27987 A Method to Identify Areas for Hydraulic Fracturing by Using Production Logging Tools
Authors: Armin Shirbazo, Hamed Lamei Ramandi, Mohammad Vahab, Jalal Fahimpour
Abstract:
Hydraulic fracturing, especially multi-stage hydraulic fracturing, is a practical solution for wells with uneconomic production, and its wide range of applications must be appraised appropriately to achieve stable well production. The production logging tool, known as the PLT in the oil and gas industry, is counted among the most reliable methods to evaluate the efficiency of fracturing jobs. This tool has a number of benefits and can be used to prevent subsequent production failure. It also distinguishes between different problems that occur during production. In this study, the effectiveness of hydraulic fracturing jobs is examined by using the PLT in various cases and situations. The performance of hydraulically fractured wells is investigated. The PLT is then employed to give more information about the properties of different layers. The PLT is also used to select an optimum fracturing design. The results show that one-stage and three-stage fractures behave differently. In general, the one-stage fracture should be created in high-quality areas of the reservoir to achieve better performance, and conversely, in three-stage fractures, low-quality areas are a better candidate for fracturing.
Keywords: multi-stage fracturing, horizontal well, PLT, fracture length, number of stages
Procedia PDF Downloads 194
27986 Monotone Rational Trigonometric Interpolation
Authors: Uzma Bashir, Jamaludin Md. Ali
Abstract:
This study is concerned with the visualization of monotone data using a piecewise C1 rational trigonometric interpolating scheme. Four positive shape parameters are incorporated in the structure of the rational trigonometric spline. Conditions on two of these parameters are derived to preserve the monotonicity of monotone data, while the other two are left free. Figures are used throughout to show that the proposed scheme produces graphically smooth monotone curves.
Keywords: trigonometric splines, monotone data, shape preserving, C1 monotone interpolant
Procedia PDF Downloads 271
27985 GPU-Based Back-Projection of Synthetic Aperture Radar (SAR) Data onto 3D Reference Voxels
Authors: Joshua Buli, David Pietrowski, Samuel Britton
Abstract:
Processing SAR data usually requires constraints on extent in the Fourier domain as well as approximations and interpolations onto a planar surface to form an exploitable image. This results in a potential loss of data, requires several interpolative techniques, and restricts visualization to two-dimensional plane imagery. The data can be interpolated into a ground plane projection, with or without terrain as a component, all to better view SAR data in an image domain comparable to what a human would view and to ease interpretation. An alternative but computationally heavy method that makes use of more of the data is the basis of this research. Pre-processing of the SAR data is completed first (matched filtering, motion compensation, etc.), the data are then range compressed, and lastly, the contribution from each pulse is determined for each specific point in space by searching the time-history data for the reflectivity values for each pulse, summed over the entire collection. This results in a per-3D-point reflectivity using the entire collection domain. New advances in GPU processing have finally allowed this rapid projection of acquired SAR data onto any desired reference surface (called backprojection). Mathematically, the computations are fast and easy to implement, despite limitations in SAR phase history data size and 3D point cloud size. Backprojection processing algorithms are embarrassingly parallel since each 3D point in the scene has the same reflectivity calculation applied for all pulses, independent of all other 3D points and pulse data under consideration. Therefore, given the simplicity of the single backprojection calculation, the work can be spread across thousands of GPU threads, allowing for accurate reflectivity representation of a scene. Furthermore, because reflectivity values are associated with individual three-dimensional points, a plane is no longer the sole permissible mapping base; a digital elevation model or even a cloud of points (collected from any sensor capable of measuring ground topography) can be used as a basis for the backprojection technique. This technique minimizes any interpolations and modifications of the raw data, maintaining maximum data integrity. This innovative processing will allow SAR data to be rapidly brought into a common reference frame for immediate exploitation and data fusion with other three-dimensional data and representations.
Keywords: backprojection, data fusion, exploitation, three-dimensional, visualization
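A serial NumPy sketch of the per-voxel backprojection sum described above; the GPU implementation parallelizes this same calculation across voxels. The array shapes, the phase-correction convention, and the parameters fc and fs are assumptions for illustration and do not reproduce the authors' processing chain.

```python
import numpy as np

def backproject(range_profiles, platform_pos, voxels, fc, fs, c=3e8):
    """Sum the range-compressed return of every pulse into every 3D voxel.

    range_profiles : (num_pulses, num_bins) complex range-compressed data
    platform_pos   : (num_pulses, 3) antenna position for each pulse
    voxels         : (num_voxels, 3) reference points (e.g. from a DEM)
    fc, fs         : carrier frequency and range-bin sampling rate (assumed)
    """
    num_pulses, num_bins = range_profiles.shape
    image = np.zeros(len(voxels), dtype=complex)
    bin_axis = np.arange(num_bins)
    for p in range(num_pulses):
        r = np.linalg.norm(voxels - platform_pos[p], axis=1)   # one-way range
        delay_bins = 2.0 * r / c * fs                          # two-way delay in bins
        # interpolate the complex range profile at each voxel's delay
        samples = (np.interp(delay_bins, bin_axis, range_profiles[p].real)
                   + 1j * np.interp(delay_bins, bin_axis, range_profiles[p].imag))
        image += samples * np.exp(1j * 4 * np.pi * fc * r / c)  # phase correction
    return np.abs(image)                                        # per-voxel reflectivity
```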
Procedia PDF Downloads 86
27984 Contraceptives: Experiences of Agency and Coercion of Young People Living in Colombia
Authors: Paola Montenegro, Maria de los Angeles Balaguera Villa
Abstract:
Contraceptive methods play a fundamental role in preventing unwanted pregnancies and protecting users from sexually transmitted infections (STIs). Despite being known to almost the entire population of reproductive age living in Colombia, there are barriers, practices and complex notions about contraceptives that affect their desired mass use and effectiveness. This work aims to analyse some of the perceptions and practices discussed with young people (13-28 years old) living in Colombia regarding the use of contraceptives in their daily lives, their preferences and needs, and the side effects they perceive. This research also examines the perceived paradox in autonomy that young people experience regarding contraceptive use: on one hand, its use (or lack of it) is interpreted as an act of self-determination and a primary example of reproductive agency; on the other hand, it was frequently associated with coercion and limited autonomy derived from the gaps in reliable information available to young people, the difficulty of accessing certain preferred methods, and sometimes the coercion exercised by doctors, partners and/or family members. The data and analysis discussed in this work stem from a research project whose objective was to provide information about the needs and preferences in sexual and reproductive health of young people living in Colombia in relation to a possible telehealth service that could close the gap in access to quality care and safe information. Through a mixed-methods approach, this study collected 5,736 responses to a virtual survey disseminated nationwide in Colombia and 47 in-person interviews (24 of them with people who were assigned female at birth and 21 with local key stakeholders in the abortion ecosystem). Quantitative data were analyzed using Stata SE Version 16.0, and qualitative analysis was completed through NVivo using thematic analysis. Key findings on contraception use among young people living in Colombia reveal that 85.8% of participants had used a contraceptive method in the last two years and that the most commonly used methods were condoms, contraceptive pills, the morning-after pill and the withdrawal method. The remaining 14.2% of respondents, who declared that they had not used contraceptives in the last two years, expressed that the main four barriers to access were: 'Lack of knowledge about contraceptive methods and where to obtain information and/or access them' (13.9%), 'Have had sex with people who have vaginas' (10.2%), 'Cost of contraceptive method' (8.4%) and 'Difficulties in obtaining medical authorisations' (7.6%). These barriers coincided with the ones used to explain the non-use of contraceptives among young people, which reveals that limitations in information, cost, and quality care represent structural issues that need to be addressed in programmes, services, and public policy. Finally, the interviews showed that young people perceive contraceptive use and non-use as an example of reaffirming reproductive agency, and that limitations to this can be explained through the widespread incomplete knowledge about how methods work and the prevalence of other social representations of contraception associated with trust, fidelity, and partner preferences, which in the end create limitations to young people’s autonomy.
Keywords: contraception, family planning, premarital fertility, unplanned pregnancy
Procedia PDF Downloads 76
27983 Effect of Chain Length on Skeletonema pseudocostatum as Probed by THz Spectroscopy
Authors: Ruqyyah Mushtaq, Chiacar Gamberdella, Roberta Miroglio, Fabio Novelli, Domenica Papro, M. Paturzo, A. Rubano, Angela Sardo
Abstract:
Microalgae, particularly diatoms, are well suited for monitoring environmental health, especially in assessing the quality of seas and rivers in terms of organic matter, nutrients, and heavy metal pollution. They respond rapidly to changes in habitat quality. In this study, we focused on Skeletonema pseudocostatum, a unicellular alga that forms chains depending on environmental conditions. Specifically, we explored whether metal toxicants could affect the growth of these algal chains, potentially serving as an ecotoxicological indicator of heavy metal pollution. We utilized THz spectroscopy in conjunction with standard optical microscopy to observe the formation of these chains and their response to toxicants. Despite the strong absorption of terahertz radiation in water, we demonstrate that changes in water absorption in the terahertz range due to the water-diatom interaction can provide insights into diatom chain length.
Keywords: THz-TDS spectroscopy, diatoms, marine ecotoxicology, marine pollution
Procedia PDF Downloads 31
27982 Elevating Environmental Impact Assessment through Remote Sensing in Engineering
Authors: Spoorthi Srupad
Abstract:
Environmental Impact Assessment (EIA) stands as a critical engineering application facilitated by Earth Resources and Environmental Remote Sensing. Employing advanced technologies, this process enables a systematic evaluation of potential environmental impacts arising from engineering projects. Remote sensing techniques, including satellite imagery and geographic information systems (GIS), play a pivotal role in providing comprehensive data for assessing changes in land cover, vegetation, water bodies, and air quality. This abstract delves into the significance of EIA in engineering, emphasizing its role in ensuring sustainable and environmentally responsible practices. The integration of remote sensing technologies enhances the accuracy and efficiency of impact assessments, contributing to informed decision-making and the mitigation of adverse environmental consequences associated with engineering endeavors.
Keywords: environmental impact assessment, engineering applications, sustainability, environmental monitoring, remote sensing, geographic information systems, environmental management
Procedia PDF Downloads 92
27981 Exergy Analysis of Reverse Osmosis for Potable Water and Land Irrigation
Authors: M. Sarai Atab, A. Smallbone, A. P. Roskilly
Abstract:
A thermodynamic study is performed on the reverse osmosis (RO) desalination process for brackish water. A detailed RO model of thermodynamic properties, with and without an energy recovery device, was built in Simulink/MATLAB and validated against reported measurement data. The efficiency of desalination plants can be estimated by both the first and second laws of thermodynamics. While the first law focuses on the quantity of energy, the second-law analysis (i.e., exergy analysis) introduces quality. This paper used the Main Outfall Drain in Iraq as a case study to conduct energy and exergy analyses of the RO process. The results show that it is feasible to use an energy recovery method for reverse osmosis with salinity less than 15,000 ppm, as the exergy efficiency doubles. Moreover, this analysis shows that the highest exergy destruction occurs in the rejected water and the lowest in the permeate flow, accounting for 37% and 4.3%, respectively.
Keywords: brackish water, exergy, irrigation, reverse osmosis (RO)
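For reference, the second-law (exergy) efficiency and exergy destruction referred to above are conventionally defined as below, where ψ is the specific flow exergy of each stream; the exact control volume, state points and property model used in the authors' Simulink/MATLAB implementation are not reproduced here.

```latex
% Conventional definitions (assumed; the paper's exact control volume may differ):
% \psi = specific flow exergy of a stream, \dot{W}_{pump} = actual pump work input.
\eta_{II} = \frac{\dot{W}_{\mathrm{min}}}{\dot{W}_{\mathrm{pump}}}
          = \frac{\sum_{\mathrm{out}} \dot{m}\,\psi - \sum_{\mathrm{in}} \dot{m}\,\psi}{\dot{W}_{\mathrm{pump}}},
\qquad
\dot{X}_{\mathrm{dest}} = \dot{W}_{\mathrm{pump}} + \sum_{\mathrm{in}} \dot{m}\,\psi - \sum_{\mathrm{out}} \dot{m}\,\psi .
```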
Procedia PDF Downloads 174
27980 Integration of Knowledge and Metadata for Complex Data Warehouses and Big Data
Authors: Jean Christian Ralaivao, Fabrice Razafindraibe, Hasina Rakotonirainy
Abstract:
This document constitutes a resumption of work carried out in the field of complex data warehouses (DW) relating to the management and formalization of knowledge and metadata. It offers a methodological approach for integrating two concepts, knowledge and metadata, within the framework of a complex DW architecture. The work considers the use of the technique of knowledge representation by description logics and the extension of the Common Warehouse Metamodel (CWM) specifications. This is expected to yield benefits in terms of the performance of a complex DW. Three essential aspects of this work are expected, including the representation of knowledge in description logics and the expression of this knowledge in consistent UML diagrams, while respecting or extending the CWM specifications and using XML as a pivot. The field of application is large but will be adapted to systems with heterogeneous, complex and unstructured content which, moreover, require a great (re)use of knowledge, such as medical data warehouses.
Keywords: data warehouse, description logics, integration, knowledge, metadata
Procedia PDF Downloads 138
27979 Evaluation of Model-Based Code Generation for Embedded Systems–Mature Approach for Development in Evolution
Authors: Nikolay P. Brayanov, Anna V. Stoynova
Abstract:
The model-based development approach is gaining more support and acceptance. Its higher abstraction level brings a simplification of the system description that allows domain experts to do their best without particular knowledge of programming. The different levels of simulation support rapid prototyping, verifying and validating the product even before it exists physically. Nowadays the model-based approach is beneficial for the modelling of complex embedded systems as well as for the generation of code for many different hardware platforms. Moreover, it can be applied in safety-relevant industries like automotive, which brings extra automation to the expensive device certification process and especially to software qualification. Using it, some companies report cost savings and quality improvements, but others claim no major changes or even cost increases. This publication demonstrates the level of maturity and autonomy of the model-based approach for code generation. It is based on a real-life automotive seat heater (ASH) module, developed using The MathWorks, Inc. tools. The model, created with Simulink, Stateflow and Matlab, is used for automatic generation of C code with Embedded Coder. To prove the maturity of the process, the Code Generation Advisor is used for automatic configuration. All additional configuration parameters are set to auto, when applicable, leaving the generation process to function autonomously. As a result of the investigation, the publication compares the quality of the generated embedded code with that of manually developed code. The measurements show that, in general, the code generated by the automatic approach is not worse than the manual one. A deeper analysis of the technical parameters enumerates the disadvantages, some of which are identified as topics for our future work.
Keywords: embedded code generation, embedded C code quality, embedded systems, model-based development
Procedia PDF Downloads 244
27978 Data Analytics in Energy Management
Authors: Sanjivrao Katakam, Thanumoorthi I., Antony Gerald, Ratan Kulkarni, Shaju Nair
Abstract:
With increasing energy costs and their impact on business, sustainability has today evolved from a social expectation to an economic imperative. Therefore, finding methods to reduce cost has become a critical directive for industry leaders. Effective energy management is the only way to cut costs. However, energy management has been a challenge because it requires a change in old habits and in legacy systems followed for decades. Today, exorbitant volumes of energy and operational data are being captured and stored by industries, but they are unable to convert these structured and unstructured data sets into meaningful business intelligence. It must be noted that, for quick decisions, organizations must learn to cope with large volumes of operational data in different formats. Energy analytics not only helps in extracting inferences from these data sets, but is also instrumental in the transformation from old approaches to energy management to new ones. This in turn assists in effective decision making for implementation. Organizations are required to have an established corporate strategy for reducing operational costs through visibility and optimization of energy usage. Energy analytics plays a key role in the optimization of operations. The paper describes how energy data analytics is today extensively used in different scenarios such as reducing operational costs, predicting energy demand, optimizing network efficiency, asset maintenance, improving customer insights and device data insights. The paper also highlights how analytics helps transform insights obtained from energy data into sustainable solutions. The paper utilizes data from an array of segments such as the retail, transportation, and water sectors.
Keywords: energy analytics, energy management, operational data, business intelligence, optimization
Procedia PDF Downloads 364
27977 Efficient Frequent Itemset Mining Methods over Real-Time Spatial Big Data
Authors: Hamdi Sana, Emna Bouazizi, Sami Faiz
Abstract:
In recent years, there has been a huge increase in the use of spatio-temporal applications where data and queries are continuously moving. As a result, the need to process real-time spatio-temporal data is clear, and real-time stream data management has become a hot topic. The sliding window model and frequent itemset mining over dynamic data are among the most important problems in the context of data mining. The sliding window model for frequent itemset mining is widely used in data stream mining due to its emphasis on recent data and its bounded memory requirement. Existing methods use the traditional transaction-based sliding window model, in which the window size is based on a fixed number of transactions. This model assumes that all transactions arrive at a constant rate, which is not suited to real-time applications, and its use in such applications endangers their performance. Based on these observations, this paper relaxes the notion of window size and proposes the use of a timestamp-based sliding window model. In our proposed frequent itemset mining algorithm, support conditions are used to differentiate frequent from infrequent patterns. Thereafter, a tree is developed to incrementally maintain the essential information. We evaluate our contribution, and the preliminary results are quite promising.
Keywords: real-time spatial big data, frequent itemset, transaction-based sliding window model, timestamp-based sliding window model, weighted frequent patterns, tree, stream query
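A minimal sketch of the timestamp-based sliding window idea described above: the window is defined by elapsed time rather than a fixed transaction count, and expired transactions are removed as new ones arrive. For brevity this sketch counts only 1- and 2-itemsets in a flat counter, whereas the paper maintains a tree structure; the class name, window length and support threshold are illustrative assumptions.

```python
from collections import Counter
from itertools import combinations

class TimestampWindowMiner:
    """Frequent-itemset counts over a timestamp-based sliding window."""

    def __init__(self, window_seconds, min_support):
        self.window = window_seconds
        self.min_support = min_support
        self.buffer = []                       # (timestamp, frozenset) pairs
        self.counts = Counter()

    def _itemsets(self, items):
        for k in (1, 2):                       # 1- and 2-itemsets only, for brevity
            yield from map(frozenset, combinations(sorted(items), k))

    def add(self, timestamp, items):
        self.buffer.append((timestamp, frozenset(items)))
        for s in self._itemsets(items):
            self.counts[s] += 1
        # expire transactions that fell out of the time window
        while self.buffer and self.buffer[0][0] <= timestamp - self.window:
            _, old = self.buffer.pop(0)
            for s in self._itemsets(old):
                self.counts[s] -= 1

    def frequent(self):
        n = max(len(self.buffer), 1)
        return {s: c for s, c in self.counts.items() if c / n >= self.min_support}

miner = TimestampWindowMiner(window_seconds=60, min_support=0.5)
miner.add(0,  {"a", "b"})
miner.add(30, {"a", "c"})
miner.add(90, {"a", "b"})                      # the first two transactions have expired
print(miner.frequent())
```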
Procedia PDF Downloads 162
27976 A Model of Teacher Leadership in History Instruction
Authors: Poramatdha Chutimant
Abstract:
The objective of the research was to propose a model of teacher leadership in history instruction for later utilization. Everett M. Rogers’ Diffusion of Innovations Theory is applied as the theoretical framework. A qualitative method is used in the study, with an interview protocol as the instrument to collect primary data from best-practice teachers awarded by the Office of the National Education Commission (ONEC). Open-ended questions are used in the interview protocol in order to gather varied data. Information on the international context of history instruction serves as the secondary data used to support the summarizing process (content analysis). A dendrogram is the key to interpreting and synthesizing the primary data, with the secondary data serving as supportive material for explanation and elaboration. In-depth interviews are used to collect information from seven experts in the educational field. The focal point, finally, is to validate a draft model in terms of its future utilization.
Keywords: history study, nationalism, patriotism, responsible citizenship, teacher leadership
Procedia PDF Downloads 280
27975 Telephone Health Service to Improve the Quality of Life of the People Living with AIDS in Eastern Nepal
Authors: Ram Sharan Mehta, Naveen Kumar Pandey, Binod Kumar Deo
Abstract:
Quality of life (QOL) is an important component in the evaluation of the well-being of people living with AIDS (PLWA). This study assessed the effectiveness of an education intervention programme in improving the QOL of PLWA on ART attending the ART clinics at B. P. Koirala Institute of Health Sciences (BPKIHS), Nepal. A pre-experimental research design was used to conduct the study among PLWA on ART at BPKIHS from June to August 2013, involving 60 randomly selected PLWA at pre-test. The mean age of the respondents was 36.70 ± 9.92 years, and the majority of them (80%) were in the 25-50 year age group and male (56.7%). After the education intervention programme there was a significant change in QOL in all four domains, i.e., physical (p=0.008), psychological (p=0.019), social (p=0.046) and environmental (p=0.032), using Student's t-test at the 0.05 level of significance. There is a significant (p=0.016) difference in the mean QOL scores of pre-test and post-test. The high QOL scores at post-test may be reflective of the effectiveness of the planned education intervention programme.
Keywords: telephone, AIDS, health service, Nepal
Procedia PDF Downloads 502
27974 The Effect of Institutions on Economic Growth: An Analysis Based on Bayesian Panel Data Estimation
Authors: Mohammad Anwar, Shah Waliullah
Abstract:
This study investigated panel data regression models. It used Bayesian and classical methods to study the impact of institutions on economic growth using data for 1990-2014, especially in developing countries. Under the classical and Bayesian methodologies, two panel data models were estimated: the common effects and the fixed effects models. For the Bayesian approach, prior information is used in this paper, with a normal-gamma prior specified for the panel data models. The analysis was done with the WinBUGS14 software. The estimated results of the study showed that panel data models are valid models in the Bayesian methodology. In the Bayesian approach, all independent variables had positive and significant effects on the dependent variable. Based on the standard errors of all models, the fixed effects model is the best model in the Bayesian estimation of panel data models. It was also shown that the fixed effects model has the lowest standard errors compared to the other models.
Keywords: Bayesian approach, common effect, fixed effect, random effect, Dynamic Random Effect Model
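The abstract's Bayesian estimation was done in WinBUGS14; a rough re-expression of a fixed-effects panel model with a normal-gamma prior (normal priors on the coefficients, gamma prior on the error precision) is sketched below in PyMC on toy data. The variable names, prior hyperparameters and simulated data are assumptions, not the study's specification.

```python
import numpy as np
import pymc as pm
import arviz as az

rng = np.random.default_rng(2)
n_countries, n_years = 8, 25
country = np.repeat(np.arange(n_countries), n_years)
inst = rng.normal(size=n_countries * n_years)            # institutions index (toy)
growth = 0.5 * inst + rng.normal(0, 1, size=inst.size)   # GDP growth (toy)

with pm.Model() as fixed_effects:
    tau = pm.Gamma("tau", alpha=2.0, beta=1.0)            # error precision (normal-gamma prior)
    beta = pm.Normal("beta", mu=0.0, sigma=10.0)          # effect of institutions
    alpha = pm.Normal("alpha", mu=0.0, sigma=10.0, shape=n_countries)  # country fixed effects
    mu = alpha[country] + beta * inst
    pm.Normal("growth", mu=mu, sigma=1.0 / pm.math.sqrt(tau), observed=growth)
    trace = pm.sample(1000, tune=1000, progressbar=False)

print(az.summary(trace, var_names=["beta", "tau"]))
```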
Procedia PDF Downloads 68
27973 Analysis of the Effects of Institutions on the Sub-National Distribution of Aid Using Geo-Referenced AidData
Authors: Savas Yildiz
Abstract:
The article assesses the performance of international aid donors in determining the sub-national distribution of their aid projects depending on recipient countries' governance. The present paper extends the scope from a cross-country perspective to a more detailed analysis by looking at the effects of institutional quality on the sub-national distribution of foreign aid. The analysis examines geo-referenced aid projects in 37 countries and 404 regions at the first administrative division level in Sub-Saharan Africa from the World Bank (WB) and the African Development Bank (ADB) that were approved between the years 2000 and 2011. To measure the influence of institutional quality on the distribution of aid, the following measures are used: control of corruption, government effectiveness, regulatory quality and rule of law from the Worldwide Governance Indicators (WGI), and the corruption perception index from Transparency International. Furthermore, to assess the importance of ethnic heterogeneity for the sub-national distribution of aid projects, the study also includes interaction terms measuring ethnic fragmentation. The regression results indicate a general skew of aid projects towards regions which hold capital cities; however, being an incumbent president's birth region does not increase the allocation of aid projects significantly. Nevertheless, with increasing quality of institutions, aid projects are less skewed towards capital regions, and the previously estimated coefficients lose significance in most cases. Higher ethnic fragmentation also seems to impede the tendency to allocate aid projects mainly in capital city regions and presidents' birth places. Additionally, to assess the performance of the WB against its own proclaimed goal of targeting the poor in a country, the study also includes sub-national wealth data from the Demographic and Health Surveys (DHS), and finds that, even with better institutional quality, regions with a larger share from the richest quintile receive significantly more aid than regions with a larger share of poor people. With increasing ethnic diversity, the allocation of aid projects towards regions where the richest citizens reside diminishes, but still remains high and significant. However, regions with a larger share of poor people still do not receive significantly more aid. This might imply that the sub-national distribution of aid projects increases in general with higher ethnic fragmentation, independent of diverse regional needs. The results provide evidence that institutional quality matters for undermining the influence of incumbent presidents on the allocation of aid projects towards their birth regions and capital regions. Moreover, even for countries with better institutional quality, the WB and the ADB do not seem to be able to target the poor in a country with their aid projects. Even if one considers need-based variables, such as infant mortality and child mortality rates, aid projects do not seem to be allocated to districts with a larger share of people in need. Therefore, the study provides further evidence, using more detailed information on the sub-national distribution of aid projects, that aid is not being allocated effectively towards regions with a larger share of poor people so as to alleviate poverty in recipient countries directly. Institutions do not have any significant influence on the sub-national distribution of aid towards the poor.
Keywords: aid allocation, georeferenced data, institutions, spatial analysis
Procedia PDF Downloads 119
27972 Taguchi Approach for the Optimization of the Stitching Defects of Knitted Garments
Authors: Adel El-Hadidy
Abstract:
For any industry, production and quality management, or the reduction of wastage, has a major impact on the overall factory economy. This work discusses quality improvement in the garment industry by applying Pareto analysis, a cause and effect diagram, and Taguchi experimental design. The main purpose of the work is to reduce stitching defects, which will also minimize the rejection and rework rates. Application of the Pareto chart, fishbone diagram and process sigma level and/or performance level tools helps solve these problems on a priority basis. Among all defect types, sewing defects alone are responsible for 69.3% to 97.3% of total defects. The process sigma level was improved from 0.79 to 1.3, and the performance rating improved from level F to level D. The results showed that the new set of sewing parameters was superior to the original one. It can be seen that fabric size has the largest effect on the sewing defects and that needle size has the smallest effect on the stitching defects.
Keywords: garment, sewing defects, cost of rework, DMAIC, sigma level, cause and effect diagram, Pareto analysis
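The process sigma levels quoted above (0.79 improving to 1.3) are conventionally derived from the defect rate; a small sketch of that conversion, assuming the usual 1.5-sigma long-term shift, is given below with illustrative numbers rather than the paper's data.

```python
from scipy.stats import norm

def process_sigma(defects, opportunities, shift=1.5):
    """Long-term process sigma from a defect count, using the conventional
    1.5-sigma shift. Inputs here are illustrative, not the paper's data."""
    dpmo = defects / opportunities * 1_000_000
    return norm.ppf(1 - dpmo / 1_000_000) + shift

# e.g. 290,000 defective stitches per million opportunities -> roughly sigma 2.05
print(round(process_sigma(290_000, 1_000_000), 2))
```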
Procedia PDF Downloads 165
27971 Nutritional Quality of Partially Processed Chicken Meat Products from Egyptian and Saudi Arabia Markets
Authors: Ali Meawad Ahmad, Hosny A. Abdelrahman
Abstract:
Chicken meat is a good source of protein of high biological value, containing most of the essential amino acids, with a high proportion of unsaturated fatty acids and a low cholesterol level. Besides, it contains many vitamins as well as minerals which are important for the human body. Therefore, a total of 150 frozen chicken meat product samples, 800 g each and within their shelf-life, were randomly collected from commercial markets in Egypt (75 samples) and Saudi Arabia (75 samples) for chemical evaluation. The mean values of fat% in the examined samples from the Egyptian and Saudi markets were 16.0% and 4.6% for chicken burger, 15.0% and 11% for nuggets, and 11% and 11% for strips, respectively. The mean values of moisture% in the examined samples from the Egyptian and Saudi markets were 67.0% and 81% for chicken burger, 66.0% and 78% for nuggets, and 71.0% and 72% for strips, respectively. The mean values of protein% in the examined samples from the Egyptian and Saudi markets were 15% and 17% for chicken burger, 16% and 16% for nuggets, and 16% and 17% for strips, respectively. The obtained results were compared with the Egyptian standards, and suggestions for improving the chemical quality of chicken products were given.
Keywords: chicken meat, nutrition, Egypt, markets
Procedia PDF Downloads 568
27970 Overview of Resources and Tools to Bridge Language Barriers Provided by the European Union
Authors: Barbara Heinisch, Mikael Snaprud
Abstract:
A common, well-understood language is crucial in critical situations like landing a plane. For e-Government solutions, a clear and common language is needed to allow users to successfully complete transactions online. Misunderstandings here may not risk a safe landing, but they can cause delays and resubmissions and drive up costs. This also holds true for higher education, where misunderstandings can arise due to inconsistent use of terminology. Thus, language barriers are a societal challenge that needs to be tackled. The major means of bridging language barriers is translation. However, achieving high-quality translation and making texts understandable and accessible require certain framework conditions. Therefore, the EU and individual projects take (strategic) actions. These actions include the identification, collection, processing, re-use and development of language resources. These language resources may be used for the development of machine translation systems and the provision of (public) services, including higher education. This paper outlines some of the existing resources and indicates directions for further development to increase the quality and usage of these resources.
Keywords: language resources, machine translation, terminology, translation
Procedia PDF Downloads 319