Search results for: customized
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 223

73 Mobile Phone Text Reminders and Voice Call Follow-ups Improve Attendance for Community Retail Pharmacy Refills: Learnings from the Lango Sub-region in Northern Uganda

Authors: Jonathan Ogwal, Louis H. Kamulegeya, John M. Bwanika, Davis Musinguzi

Abstract:

Introduction: Community retail pharmacy drug distribution points (CRPDDP) were implemented in the Lango sub-region as part of the Ministry of Health’s response to improving access and adherence to antiretroviral treatment (ART). Clients received their ART refills from nearby local pharmacies, creating a need for continuous engagement through mobile phone appointment reminders and health messages. We share learnings from the implementation of mobile text reminders and voice call follow-ups among ART clients attending the CRPDDP program in northern Uganda. Methods: A retrospective review of electronic medical records from the four pharmacies allocated for CRPDDP in the Lira and Apac districts of the Lango sub-region in Northern Uganda was done for February to August 2022. The process involved collecting phone contacts of eligible clients from the health facility appointment register and uploading them onto a messaging platform customized from RapidPro, open-source software. Client information, including code name, phone number, next appointment date, and the allocated pharmacy for the ART refill, was collected and kept confidential. Contacts received appointment reminder messages and other messages on positive living as an ART client. Routine voice call follow-ups were done to confirm that clients picked up their ART from the refill pharmacy. Findings: In total, 1,354 clients were reached from the four allocated pharmacies, all located in urban centers. 972 clients received short message service (SMS) appointment reminders, and 382 were followed up through voice calls. The majority (75%) of clients returned for refills on the appointed date, 20% returned within four days after the appointment date, and the remaining 5% needed follow-up, reporting that they were away from the district on the appointment date due to other engagements. Conclusion: The use of mobile text reminders and voice call follow-ups improves attendance for community retail pharmacy refills.
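
As a rough illustration of the reminder workflow described above, the following sketch (Python, with invented field names and records rather than the study's actual schema or the RapidPro API) selects clients whose refill falls on the next day and composes an SMS payload for each:

```python
from datetime import date, timedelta

# Hypothetical appointment records as they might appear in the facility
# register; field names are illustrative, not the actual CRPDDP schema.
clients = [
    {"code_name": "LRA-001", "phone": "+256700000001",
     "next_appointment": date(2022, 8, 15), "pharmacy": "Lira Pharmacy A"},
    {"code_name": "APC-002", "phone": "+256700000002",
     "next_appointment": date(2022, 8, 16), "pharmacy": "Apac Pharmacy B"},
]

def reminders_due(clients, days_ahead=1, today=None):
    """Select clients whose refill appointment falls `days_ahead` days
    from now and compose an SMS payload for each."""
    today = today or date.today()
    target = today + timedelta(days=days_ahead)
    for c in clients:
        if c["next_appointment"] == target:
            yield {
                "to": c["phone"],
                "text": (f"Reminder: your refill is due on "
                         f"{c['next_appointment']:%d %b %Y} at {c['pharmacy']}."),
            }

# In the study, composed messages were pushed through the RapidPro-based
# platform; here we simply print them as a stand-in.
for msg in reminders_due(clients, today=date(2022, 8, 14)):
    print(msg)
```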

Keywords: antiretroviral treatment, community retail drug distribution points, mobile text reminders, voice call follow-up

Procedia PDF Downloads 77
72 Customer Acquisition through Time-Aware Marketing Campaign Analysis in the Banking Industry

Authors: Harneet Walia, Morteza Zihayat

Abstract:

Customer acquisition has become one of the critical issues of any business in the 21st century; a healthy customer base is an essential asset of the banking business. Term deposits act as a major source of cheap funds for banks to invest and benefit from interest rate arbitrage. To attract customers, the marketing campaigns at most financial institutions consist of multiple outbound telephone calls, with more than one contact per customer, which is a very time-consuming process. Therefore, customized direct marketing has become more critical than ever for attracting new clients. As customer acquisition is becoming more difficult to achieve, an intelligent and refined contact list is necessary to sell a product smartly. The aim of this research is to increase the effectiveness of campaigns by predicting which customers will most likely subscribe to a fixed deposit and suggesting the most suitable month to reach out to them. We design a Time-Aware Upsell Prediction Framework (TAUPF) using two different approaches, with the aim of finding the best approach and technique to build the prediction model. TAUPF is implemented using the Upsell Prediction Approach (UPA) and the Clustered Upsell Prediction Approach (CUPA). We also address the data imbalance problem by examining and comparing different methods of sampling (up-sampling and down-sampling). Our results show that building such a model is quite feasible and profitable for financial institutions. TAUPF can be used in any industry, such as telecom, automobile, or tourism, where either CUPA or UPA holds valid. In our case, CUPA proved more reliable. As shown in our research, one of the most important challenges is to define measures with enough predictive power, as subscription to a fixed deposit depends on highly ambiguous situations and cannot be easily isolated. While we have shown the practicality of a time-aware upsell prediction model, where financial institutions can benefit from contacting customers in the specified month, further research is needed to understand the specific time of day. In addition, a further empirical/pilot study on real, live customers needs to be conducted to prove the effectiveness of the model in the real world.
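
The up-/down-sampling comparison mentioned above can be sketched as follows (Python with scikit-learn; the data are a synthetic stand-in, not the bank's campaign records):

```python
import numpy as np
from sklearn.utils import resample
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic stand-in for campaign data: ~10% subscribers, since term
# deposit campaigns are typically heavily imbalanced.
X = rng.normal(size=(2000, 6))
y = (rng.random(2000) < 0.1).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)
minority, majority = X_tr[y_tr == 1], X_tr[y_tr == 0]

# Up-sampling: replicate minority cases until classes are balanced.
minority_up = resample(minority, replace=True,
                       n_samples=len(majority), random_state=0)
X_up = np.vstack([majority, minority_up])
y_up = np.array([0] * len(majority) + [1] * len(minority_up))

# Down-sampling: discard majority cases instead.
majority_dn = resample(majority, replace=False,
                       n_samples=len(minority), random_state=0)
X_dn = np.vstack([majority_dn, minority])
y_dn = np.array([0] * len(majority_dn) + [1] * len(minority))

for name, (Xb, yb) in {"up": (X_up, y_up), "down": (X_dn, y_dn)}.items():
    clf = RandomForestClassifier(random_state=0).fit(Xb, yb)
    print(name, "test accuracy:", round(clf.score(X_te, y_te), 3))
```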

Keywords: customer acquisition, predictive analysis, targeted marketing, time-aware analysis

Procedia PDF Downloads 95
71 Students' Performance, Perception and Attitude towards Interactive Online Modules to Improve Undergraduate Quantitative Skills in Biological Science

Authors: C. Suphioglu, V. Simbag, J. Markham, C. Coady, S. Belward, G. Di Trapani, P. Chunduri, J. Chuck, Y. Hodgson, L. Lluka, L. Poladian, D. Watters

Abstract:

Advances in science have made quantitative skills (QS) an essential graduate outcome for undergraduate science programs in Australia and other parts of the world. However, many students entering degrees at Australian universities either lack these skills or have little confidence in their ability to apply them in their biological science units. It has been previously reported that integration of quantitative skills into life science programs appears to have a positive effect on student attitudes towards the importance of mathematics and statistics in biological sciences. It has also been noted that there is a deficiency in QS resources available and applicable to undergraduate science students in Australia. MathBench (http://mathbench.umd.edu) is a series of online modules involving quantitative biology scenarios developed by the University of Maryland. Through collaboration with Australian universities, a project was funded by the Australian government through its Office for Learning and Teaching (OLT) to develop customized MathBench biology modules to promote the quantitative skills of undergraduate biology students in Australia. This presentation focuses on the assessment of changes in performance, perception and attitude of students in a third-year Cellular Physiology unit after use of interactive online cellular diffusion modules modified for the Australian context. The modules have been designed to integrate QS into the biological science curriculum using familiar scenarios and informal language, providing students with the opportunity to review solutions to diffusion-related QS problems with interactive graphics. This paper discusses results of pre- and post-MathBench quizzes, composed of general and module-specific questions, that assessed change in student QS after MathBench. It also discusses pre- and post-surveys, administered before and after using the MathBench modules, which evaluated students' change in perception of the influence of the modules, their attitude towards QS, the development of their confidence in completing the inquiry-based activity, and changes in their appreciation of the relevance of mathematics to cellular processes. Results will be compared to changes reported by Thompson et al. (2010) at the University of Maryland, and implications for further integration of interactive online activities in the curriculum will be explored and discussed.

Keywords: quantitative skills, MathBench, maths in biology

Procedia PDF Downloads 353
70 TheAnalyzer: A Clustering-Based System for Improving Business Productivity by Analyzing User Profiles to Enhance Human-Computer Interaction

Authors: Dona Shaini Abhilasha Nanayakkara, Kurugamage Jude Pravinda Gregory Perera

Abstract:

E-commerce platforms have revolutionized the shopping experience, offering convenient ways for consumers to make purchases. To improve interactions with customers and optimize marketing strategies, it is essential for businesses to understand user behavior, preferences, and needs on these platforms. This paper focuses on recommending that businesses customize interactions with users based on their behavioral patterns, leveraging data-driven analysis and machine learning techniques. Businesses can improve engagement and boost the adoption of e-commerce platforms by aligning behavioral patterns with user goals of usability and satisfaction. We propose TheAnalyzer, a clustering-based system designed to enhance business productivity by analyzing user profiles and improving human-computer interaction. TheAnalyzer seamlessly integrates with business applications, collecting relevant data points based on users' natural interactions without additional burdens such as questionnaires or surveys. It defines five key user analytics as features for its dataset, which are easily captured through users' interactions with e-commerce platforms. This research presents a study demonstrating the successful distinction of users into specific groups based on the five key analytics considered by TheAnalyzer. With the assistance of domain experts, customized business rules can be attached to each group, enabling TheAnalyzer to influence business applications and provide an enhanced, personalized user experience. The outcomes are evaluated quantitatively and qualitatively, demonstrating that utilizing TheAnalyzer's capabilities can optimize business outcomes, enhance customer satisfaction, and drive sustainable growth. The findings of this research contribute to the advancement of personalized interactions in e-commerce platforms. By leveraging user behavioral patterns and analyzing both new and existing users, businesses can effectively tailor their interactions to improve customer satisfaction and loyalty and ultimately drive sales.
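
A minimal sketch of the pipeline suggested by the paper's keywords (standardization, dimensionality reduction, clustering) is shown below; the five per-user analytics are invented placeholders, since the abstract does not name them:

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

rng = np.random.default_rng(42)

# Stand-ins for the five per-user analytics (illustrative only):
# session length, pages per visit, cart adds, searches, purchases.
users = rng.normal(size=(500, 5))

scaled = StandardScaler().fit_transform(users)       # data standardization
reduced = PCA(n_components=2).fit_transform(scaled)  # dimensionality reduction
labels = KMeans(n_clusters=4, n_init=10, random_state=42).fit_predict(reduced)

# Each cluster can then be mapped by domain experts to a business rule,
# e.g. rules[0] = "show quick-reorder shortcuts".
for k in range(4):
    print(f"cluster {k}: {np.sum(labels == k)} users")
```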

Keywords: data clustering, data standardization, dimensionality reduction, human computer interaction, user profiling

Procedia PDF Downloads 36
69 Automated, Objective Assessment of Pilot Performance in Simulated Environment

Authors: Maciej Zasuwa, Grzegorz Ptasinski, Antoni Kopyt

Abstract:

Flight simulators offer tremendous possibilities for safe and cost-effective pilot training by utilizing powerful computational tools. Because technology has outpaced methodology, however, the vast majority of training-related work is still done by human instructors, which makes assessment inefficient and vulnerable to instructors' subjectivity. This research presents an Objective Assessment Tool (gOAT) developed at the Warsaw University of Technology and tested on an SW-4 helicopter flight simulator. The tool uses a database of predefined manoeuvres, defined and integrated into the virtual environment. These were implemented based on the Aeronautical Design Standard Performance Specification, Handling Qualities Requirements for Military Rotorcraft (ADS-33), with predefined Mission-Task-Elements (MTEs). The core element of gOAT is an enhanced algorithm that provides the instructor with a new set of information: objective flight parameters fused with a report on the psychophysical state of the pilot. While the pilot performs the task, the gOAT system automatically calculates performance using the embedded algorithms, data registered by the simulator software (position, orientation, velocity, etc.), and measurements of physiological changes in the pilot's psychophysiological state (temperature, sweating, heart rate). The complete set of measurements is presented online at the instructor's station and shown in a dedicated graphical interface. The tool is based on open-source solutions and is flexible to edit. Additional manoeuvres can easily be added using a guide developed by the authors, and MTEs can be changed by the instructor even during an exercise. The algorithms and measurements used allow not only basic stress-level measurement but also a significant reduction in the instructor's workload. The tool can be used for training purposes as well as periodic checks of aircrew. Its flexibility and ease of modification allow wide-ranging further development and customization. Depending on the simulation purpose, gOAT can be adjusted to support simulators of aircraft, helicopters, or unmanned aerial vehicles (UAVs).
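
The ADS-33-style scoring idea can be illustrated with a toy grader that checks telemetry against desired/adequate tolerance bands; the parameters and limits below are invented for the sketch and are not gOAT's actual algorithm:

```python
# Illustrative tolerance bands in the spirit of ADS-33 MTEs; the values
# and parameter names are assumptions, not taken from gOAT.
BANDS = {
    "altitude_dev_m":  {"desired": 1.0, "adequate": 3.0},
    "heading_dev_deg": {"desired": 5.0, "adequate": 10.0},
}

def score_sample(sample):
    """Grade one telemetry sample: 2 = desired, 1 = adequate, 0 = outside."""
    worst = 2
    for name, limits in BANDS.items():
        dev = abs(sample[name])
        if dev > limits["adequate"]:
            worst = 0
        elif dev > limits["desired"]:
            worst = min(worst, 1)
    return worst

telemetry = [
    {"altitude_dev_m": 0.4, "heading_dev_deg": 2.0},
    {"altitude_dev_m": 2.1, "heading_dev_deg": 6.5},
    {"altitude_dev_m": 4.0, "heading_dev_deg": 1.0},
]
grades = [score_sample(s) for s in telemetry]
print("per-sample grades:", grades)
print("manoeuvre score:", sum(grades) / (2 * len(grades)))  # 0..1
```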

Keywords: automated assessment, flight simulator, human factors, pilot training

Procedia PDF Downloads 124
68 Evaluation of Main Factors Affecting the Choice of a Freight Forwarder: A Sri Lankan Exporter’s Perspective

Authors: Ishani Maheshika

Abstract:

The intermediary role performed by freight forwarders in exportation has become significant in fulfilling businesses' supply chain needs in this dynamic world. Since the success of an exporter's business is now highly reliant on supply chain optimization, cost efficiency, profitability, consistent service and responsiveness, the decision of selecting the most beneficial freight forwarder has become crucial for exporters. Although similar studies exist in other countries, no prior research covers the Sri Lankan setting. Moreover, results vary with time, the nature of the industry, and business environment factors. Therefore, a study from the perspective of Sri Lankan exporters was identified as a requisite. In order to identify and prioritize key factors affecting exporters' decisions in selecting freight forwarders in the Sri Lankan context, the Sri Lankan export industry was stratified into 22 sectors based on commodity, using a stratified sampling technique. One exporter from each sector was then selected using judgmental sampling, giving a sample of 22. Factors identified through a pilot survey were organized under six main criteria. A questionnaire was developed as pairwise comparisons using a 9-point semantic differential scale, and comparisons were made within main criteria and subcriteria. After pre-testing, interviews and an e-mail questionnaire survey were conducted. Data were analyzed using the Analytic Hierarchy Process (AHP) to determine priority vectors of the criteria. Customer service was found to be the most important main criterion for Sri Lankan exporters, followed by reliability and operational efficiency, respectively. The criterion of least importance is company background and reputation. Whereas small-sized exporters pay more attention to rates, reliability is the major concern among medium- and large-scale exporters. Irrespective of the seniority of the exporter, reliability is given prominence. Responsiveness is the most important subcriterion among Sri Lankan exporters. Consistency of judgments with respect to the main criteria was verified through the consistency ratio, which was less than 10%. To be more competitive, freight forwarders should develop customized marketing strategies based on each target group's requirements and expectations in offering services, to retain existing exporters and attract new ones.
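
The AHP computation described above (priority vectors from pairwise comparisons, checked with a consistency ratio below 10%) can be reproduced in a few lines; the comparison matrix here is illustrative, not the study's data:

```python
import numpy as np

# Illustrative 3x3 pairwise comparison matrix on Saaty's 1-9 scale
# (the actual study compared six main criteria; these values are made up).
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

# The principal eigenvector gives the priority weights.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()

# Consistency: CI = (lambda_max - n)/(n - 1); CR = CI / RI, accept if < 0.10.
n = A.shape[0]
lambda_max = eigvals[k].real
CI = (lambda_max - n) / (n - 1)
RI = {3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24}[n]  # Saaty's random indices
CR = CI / RI

print("priority vector:", np.round(w, 3))
print("consistency ratio:", round(CR, 3))  # should be < 0.10
```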

Keywords: analytic hierarchy process, freight forwarders, main criteria, Sri Lankan exporters, subcriteria

Procedia PDF Downloads 380
67 Innovations in the Implementation of Preventive Strategies and Measuring Their Effectiveness Towards the Prevention of Harmful Incidents to People with Mental Disabilities who Receive Home and Community Based Services

Authors: Carlos V. Gonzalez

Abstract:

Background: Providers of in-home and community-based services strive to eliminate preventable harm to the people under their care as well as to the employees who support them. Traditional models of safety and protection from harm have assumed that the absence of incidents of harm is a good indicator of safe practices. However, this model creates an illusion of safety that is easily shaken by sudden and inadvertent harmful events. As an alternative, we have developed and implemented an evidence-based, resilient model of safety known as C.O.P.E. (Caring, Observing, Predicting and Evaluating). Within this model, safety is not defined by the absence of harmful incidents, but by the presence of continuous monitoring, anticipation, learning, and rapid response to events that may lead to harm. Objective: The objective was to evaluate the effectiveness of the C.O.P.E. model for the reduction of harm to individuals with mental disabilities who receive home and community-based services. Methods: Over the course of two years we counted the number of incidents of harm and near misses. We trained employees on strategies to eliminate incidents before they fully escalated, and to track patient status on a scale from 0 to 10. Additionally, we provided direct support professionals and supervisors with customized smartphone applications to track and notify the team of changes in that status every 30 minutes. Finally, the information collected was saved in a private computer network that analyzes and graphs the outcome of each incident. Results and conclusions: The use of the COPE model resulted in: a reduction in incidents of harm; a reduction in the use of restraints and other physical interventions; an increase in direct support professionals' ability to detect and respond to health problems; improved employee alertness, with less sleeping on duty; improved caring and positive interaction between direct support professionals and the person supported; and a method to globally measure and assess the effectiveness of harm prevention plans. Future applications of the COPE model for the reduction of harm to people who receive home and community-based services are discussed.
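
A minimal sketch of the 0-10 status tracking and half-hourly notification logic might look as follows; the threshold values and escalation rules are invented, since the abstract does not specify them:

```python
from dataclasses import dataclass, field

# Illustrative cut-offs on the 0-10 status scale described in the
# abstract; WATCH and ESCALATE values are assumptions for the sketch.
WATCH, ESCALATE = 4, 7

@dataclass
class StatusTracker:
    person_id: str
    history: list = field(default_factory=list)

    def record(self, status: int) -> str:
        """Record a 30-minute check-in and return the action it triggers."""
        self.history.append(status)
        rising = len(self.history) >= 2 and status > self.history[-2]
        if status >= ESCALATE:
            return "notify team: apply de-escalation plan now"
        if status >= WATCH and rising:
            return "notify team: status rising, increase observation"
        return "continue routine monitoring"

tracker = StatusTracker("client-17")
for s in [2, 3, 5, 6, 8]:   # successive half-hourly readings
    print(s, "->", tracker.record(s))
```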

Keywords: harm, patients, resilience, safety, mental illness, disability

Procedia PDF Downloads 424
66 Handling, Exporting and Archiving Automated Mineralogy Data Using TESCAN TIMA

Authors: Marek Dosbaba

Abstract:

Within the mining sector, SEM-based Automated Mineralogy (AM) has been the standard application for quickly and efficiently handling mineral processing tasks. Over the last decade, the trend has been to analyze larger numbers of samples, often at a higher level of detail. This has necessitated a shift from interactive sample analysis performed by an operator at the SEM to an increased reliance on offline processing to analyze and report the data. In response to this trend, the TESCAN TIMA Mineral Analyzer is designed to quickly create a virtual copy of the studied samples, preserving all the necessary information. Depending on the selected data acquisition mode, TESCAN TIMA can perform hyperspectral mapping, saving an X-ray spectrum for each pixel or segment, respectively. This approach allows the user to browse elemental distribution maps of all elements detectable by energy-dispersive spectroscopy. Re-evaluation of existing data for the presence of previously unconsidered elements is possible without repeating the analysis. Additional tiers of data, such as secondary electron or cathodoluminescence images, can also be recorded. To take full advantage of these information-rich datasets, TIMA utilizes a new archiving tool introduced by TESCAN. The dataset size can be reduced for long-term storage, and all information can be recovered on demand in case of renewed interest. TESCAN TIMA is optimized for network storage of its datasets because servers offer larger storage capacity than local drives and allow multiple users to access the data remotely. This goes hand in hand with support for remote control of the entire data acquisition process. TESCAN also brings a newly extended open-source data format that allows other applications to extract, process and report AM data. This offers the ability to link TIMA data to large databases feeding plant performance dashboards or geometallurgical models. The traditional tabular particle-by-particle or grain-by-grain export process is preserved and can be customized with scripts to include user-defined particle/grain properties.

Keywords: Tescan, electron microscopy, mineralogy, SEM, automated mineralogy, database, TESCAN TIMA, open format, archiving, big data

Procedia PDF Downloads 83
65 Carbon Based Wearable Patch Devices for Real-Time Electrocardiography Monitoring

Authors: Hachul Jung, Ahee Kim, Sanghoon Lee, Dahye Kwon, Songwoo Yoon, Jinhee Moon

Abstract:

We fabricated a wearable patch device, including a novel patch-type flexible dry electrode based on carbon nanofibers (CNFs) and a silicone-based elastomer (MED 6215), for real-time ECG monitoring. There are many methods to make a flexible conductive polymer by mixing in metal or carbon-based nanoparticles. In this study, CNFs were selected as the conductive nanoparticles because carbon nanotubes (CNTs) are difficult to disperse uniformly in elastomer compared with CNFs, and silver nanowires are relatively expensive and easily oxidized in air. The wearable patch is composed of two parts: a dry electrode part for recording biosignals and a sticky patch part for mounting on the skin. The dry electrode part was made by vortex mixing and baking in a prepared mold. To optimize electrical performance and the uniformity of dispersion, we developed a unique mixing and baking process. Secondly, the sticky patch part was made by patterning and detaching from a smooth-surface substrate after spin-coating a soft skin adhesive. In this process, the attachment and detachment strengths of the sticky patch were measured and optimized using a monitoring system. The assembled patch is flexible, stretchable, easily mountable on the skin, and connectable directly with the system. To evaluate the electrical characteristics and ECG (electrocardiography) recording performance, the wearable patch was tested with varying CNF concentrations and dry electrode thicknesses. The results show that CNF concentration and dry electrode thickness are important variables for obtaining high-quality ECG signals without incidental distractions. A cytotoxicity test was conducted to prove biocompatibility, and a long-term wearing test showed no skin reactions such as itching or erythema. To minimize noise from motion artifacts and line noise, we built a customized wireless, lightweight data acquisition system. ECG signals measured with this system are stable and were successfully monitored in real time. In summary, the fabricated wearable patch devices can readily be used for real-time ECG monitoring.
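
One standard way to suppress the line noise mentioned above is a narrow notch filter at the mains frequency; the sketch below uses synthetic data and assumed parameters, not the authors' actual DAQ pipeline:

```python
import numpy as np
from scipy.signal import iirnotch, filtfilt

fs = 500.0                      # sampling rate (Hz), an assumption
t = np.arange(0, 5, 1 / fs)

# Synthetic stand-in for a patch recording: a 1.2 Hz "heartbeat" tone
# plus 60 Hz line interference, purely to demonstrate the filtering idea.
ecg_like = np.sin(2 * np.pi * 1.2 * t)
noisy = ecg_like + 0.5 * np.sin(2 * np.pi * 60.0 * t)

# Narrow IIR notch at the mains frequency; filtfilt avoids phase shift.
b, a = iirnotch(w0=60.0, Q=30.0, fs=fs)
cleaned = filtfilt(b, a, noisy)

print("residual noise power (MSE vs. clean signal):",
      round(float(np.mean((cleaned - ecg_like) ** 2)), 5))
```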

Keywords: carbon nanofibers, ECG monitoring, flexible dry electrode, wearable patch

Procedia PDF Downloads 156
64 Unlocking Health Insights: Studying Data for Better Care

Authors: Valentina Marutyan

Abstract:

Healthcare data mining is a rapidly developing field at the intersection of technology and medicine that has the potential to change our understanding of and approach to providing healthcare. It is the process of examining huge amounts of data to extract useful information that can be applied to improve patient care, treatment effectiveness, and overall healthcare delivery. The field looks for patterns, trends, and correlations in a variety of healthcare datasets, such as electronic health records (EHRs), medical imaging, patient demographics, and treatment histories, using advanced analytical approaches. Predictive analysis using historical patient data is a major area of interest in healthcare data mining. It enables doctors to intervene early to prevent problems or improve outcomes for patients, and it assists in early disease detection and customized treatment planning for each person. Doctors can customize a patient's care by looking at their medical history, genetic profile, and current and previous therapies; in this way, treatments can be more effective and have fewer negative consequences. Beyond helping patients, it improves the efficiency of hospitals, for example, by helping them determine the number of beds or doctors they require given the number of patients they expect. This project uses models such as logistic regression, random forests, and neural networks for predicting diseases and analyzing medical images. Algorithms such as k-means were used to group patients, and connections between treatments and patient responses were identified by association rule mining. Time series techniques helped in resource management by predicting patient admissions. These methods improved healthcare decision-making and personalized treatment. Healthcare data mining must also deal with difficulties such as poor data quality, privacy challenges, managing large and complicated datasets, ensuring the reliability of models, managing biases, limited data sharing, and regulatory compliance. Ultimately, data mining in healthcare helps medical professionals and hospitals make better decisions, treat patients more effectively, and operate more efficiently. It comes down to using data to improve treatment, make better choices, and simplify hospital operations for all patients.
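
A minimal example of the predictive modeling described above, using a public scikit-learn dataset as a stand-in for clinical data (the project's own data and features are not described in the abstract):

```python
from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Public dataset standing in for an EHR-derived table.
X, y = load_breast_cancer(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Two of the model families the abstract names, compared side by side.
for model in (LogisticRegression(max_iter=5000),
              RandomForestClassifier(random_state=0)):
    model.fit(X_tr, y_tr)
    print(type(model).__name__, "accuracy:",
          round(model.score(X_te, y_te), 3))
```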

Keywords: data mining, healthcare, big data, large amounts of data

Procedia PDF Downloads 32
63 Evaporation Study of 1-Ethyl-3-methylimidazolium Chloride

Authors: Kirill D. Semavin, Norbert S. Chilingarov, Eugene V. Skokan

Abstract:

Ionic liquids (ILs) based on the imidazolium cation are well known nowadays. Changing the anions and substituents on the imidazolium ring can lead to different physical and chemical properties of ILs. Importantly, ILs with a halide anion are characterized by low thermal stability. The data on the thermal stability of 1-ethyl-3-methylimidazolium chloride are ambiguous: in recent works, the thermal stability of this IL was investigated by thermogravimetric analysis, and the results obtained are contradictory. Moreover, the most recent study showed that the observed onset temperature of decomposition depends significantly on the experimental conditions, for example, the heating rate of the sample. The vapor pressure of this IL has not been reported in the literature. In this study, the vapor pressure of 1-ethyl-3-methylimidazolium chloride was obtained by Knudsen effusion mass spectrometry (KEMS). The samples of [EMIm]Cl (purity > 98%) were supplied by Sigma-Aldrich and additionally dried under dynamic vacuum (T = 60 °C). Preliminary handling of the IL was carried out in a glove box. The evaporation studies of [EMIm]Cl were carried out by KEMS using original research equipment based on a commercial MI1201 magnetic mass spectrometer. The stainless steel effusion cell had an effective evaporation/effusion area ratio of more than 6000. The cell temperature, measured by a Pt/Pt-Rh (10%) thermocouple, was controlled by a Termodat 128K5 device with an accuracy of ±1 K. In the first step of this study, the optimal experimental temperature and sample heating rate were established: 449 K and 5 K/min, respectively. Under these conditions the sample decomposes, but experimental measurement of the vapor pressure is still possible. The thermodynamic activity of [EMIm]Cl is close to 1, and the decomposition products do not affect it during the first 50 hours of the experiment; therefore, the saturated vapor pressure of the IL can be determined. The electron ionization mass spectra show that the decomposition of [EMIm]Cl proceeds by two pathways. Nonetheless, the MALDI mass spectra of the starting sample and of the residue in the cell were similar, which means that the main decomposition products are gaseous under the experimental conditions. This result allows us to obtain information about the kinetics of [EMIm]Cl decomposition. Thus, the original KEMS-based procedure made it possible to determine the IL vapor pressure under decomposition conditions, and the loss of sample mass due to evaporation was also obtained.
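
For reference, KEMS derives the saturated vapor pressure from the measured mass loss through the Hertz-Knudsen effusion relation; the sketch below uses illustrative numbers, not the paper's measurements:

```python
import math

R = 8.314          # gas constant, J mol^-1 K^-1

def knudsen_pressure(dm_kg, t_s, a_m2, M_kg_mol, T_K, clausing=1.0):
    """Saturated vapor pressure from the Hertz-Knudsen effusion relation:
    p = (dm / (C * A * t)) * sqrt(2 * pi * R * T / M)."""
    return dm_kg / (clausing * a_m2 * t_s) * math.sqrt(
        2 * math.pi * R * T_K / M_kg_mol)

# Illustrative numbers only -- the measured values are not given in the
# abstract. M([EMIm]Cl) = 146.62 g/mol; suppose 1.0 mg effused through
# a 0.5 mm diameter orifice over 10 h at 449 K.
orifice_area = math.pi * (0.25e-3) ** 2
p = knudsen_pressure(dm_kg=1.0e-6, t_s=10 * 3600,
                     a_m2=orifice_area, M_kg_mol=0.14662, T_K=449.0)
print(f"vapor pressure ~ {p:.3f} Pa")
```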

Keywords: ionic liquids, Knudsen effusion mass spectrometry, thermal stability, vapor pressure

Procedia PDF Downloads 160
62 Optimizing Residential Housing Renovation Strategies at Territorial Scale: A Data Driven Approach and Insights from the French Context

Authors: Rit M., Girard R., Villot J., Thorel M.

Abstract:

In a scenario of extensive residential housing renovation, stakeholders need models that support decision-making through a deep understanding of the existing building stock and accurate energy demand simulations. To address this need, we have modified an optimization model using open data that enables the study of renovation strategies at both territorial and national scales. This approach provides (1) a strategy definition that simplifies decision trees from theoretical combinations, (2) input to decision-makers on real-world renovation constraints, (3) more reliable identification of energy-saving measures (changes in technology or behaviour), and (4) discrepancies between currently planned and actually achieved strategies. The main contribution of the studies described in this document is the geographic scale: all residential buildings in the areas of interest were modeled and simulated using national data (geometries and attributes). These buildings were then renovated, when necessary, in accordance with the environmental objectives, taking into account the constraints applicable to each territory (number of renovations per year) or at the national level (renovation of thermally deficient dwellings, i.e., Energy Performance Certificate ratings F and G). This differs from traditional approaches that focus on only a few buildings or archetypes. The model can also be used to analyze the evolution of a building stock as a whole, as it can take into account both the construction of new buildings and their demolition or sale. Using specific case studies of French territories, this paper highlights a significant discrepancy between the strategies currently advocated by decision-makers and those proposed by our optimization model. This discrepancy is particularly evident in critical metrics such as the relationship between the number of renovations per year and achievable climate targets, or between the financial support currently available to households and the remaining costs. In addition, users are free to seek optimizations for their building stock across a range of different metrics (e.g., financial, energy, environmental, or life cycle analysis). These results are a clear call to re-evaluate existing renovation strategies and take a more nuanced and customized approach. As the climate crisis moves inexorably forward, harnessing the potential of advanced technologies and data-driven methodologies is imperative.
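
The core of such a territorial optimization can be sketched as a small mixed-integer program (here with PuLP); the building stock, costs, and constraint values are invented, and the real model is far richer:

```python
from pulp import LpProblem, LpMinimize, LpVariable, lpSum, PULP_CBC_CMD

# Toy building stock: (id, renovation cost in EUR, kWh/m2.a saved, EPC).
# The real model runs on national open data; these rows are invented.
stock = [("b1", 25000, 120, "G"), ("b2", 18000, 90, "F"),
         ("b3", 30000, 150, "D"), ("b4", 12000, 40, "D")]
SAVINGS_TARGET = 240      # kWh/m2.a to save across the territory
CAP_PER_YEAR = 3          # renovations the territory can handle

prob = LpProblem("territorial_renovation", LpMinimize)
x = LpVariable.dicts("renovate", [b[0] for b in stock], cat="Binary")

prob += lpSum(cost * x[bid] for bid, cost, _, _ in stock)          # min cost
prob += lpSum(save * x[bid] for bid, _, save, _ in stock) >= SAVINGS_TARGET
prob += lpSum(x[bid] for bid, *_ in stock) <= CAP_PER_YEAR         # capacity
for bid, _, _, epc in stock:       # thermally deficient (F&G) come first
    if epc in ("F", "G"):
        prob += x[bid] == 1

prob.solve(PULP_CBC_CMD(msg=0))
print([bid for bid, *_ in stock if x[bid].value() == 1])
```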

Keywords: residential housing renovation, MILP, energy demand simulations, data-driven methodology

Procedia PDF Downloads 43
61 Variables, Annotation, and Metadata Schemas for Early Modern Greek

Authors: Eleni Karantzola, Athanasios Karasimos, Vasiliki Makri, Ioanna Skouvara

Abstract:

Historical linguistics unveils the historical depth of languages and traces variation and change by analyzing linguistic variables over time. This field of linguistics usually deals with a closed data set that can only be expanded by the (re)discovery of previously unknown manuscripts or editions. In some cases, it is possible to use (almost) the entire closed corpus of a language for research, as is the case with the Thesaurus Linguae Graecae digital library for Ancient Greek, which contains most of the extant ancient Greek literature. However, concerning 'dynamic' periods when the production and circulation of texts in printed as well as manuscript form have not been fully mapped, representative samples and corpora of texts are needed. Such material and tools are utterly lacking for Early Modern Greek (16th-18th c.). In this study, the principles of the creation of EMoGReC, a pilot representative corpus of Early Modern Greek (16th-18th c.), are presented. Its design follows the fundamental principles of historical corpora. The selection of texts aims to create a representative and balanced corpus that gives insight into diachronic, diatopic and diaphasic variation. The pilot sample includes data derived from fully machine-readable vernacular texts, which belong to four to five different textual genres and come from different geographical areas. We develop a hierarchical linguistic annotation scheme, customized to fit the characteristics of our text corpus. Regarding variables and their variants, we use as a point of departure the bundle of twenty-four features (or categories of features) identified for prose demotic texts of the 16th c. Tags are introduced bearing the variants [+old/archaic] or [+novel/vernacular]. In addition, further phenomena representing changes underway (cf. The Cambridge Grammar of Medieval and Early Modern Greek) are selected for tagging. The annotated texts are enriched with metalinguistic and sociolinguistic metadata to provide a testbed for the development of the first comprehensive set of tools for the Greek language of that period. Based on a relational management system with interconnection of data, annotations, and their metadata, the EMoGReC database aspires to join a state-of-the-art technological ecosystem for the research of observed language variation and change using advanced computational approaches.

Keywords: early modern Greek, variation and change, representative corpus, diachronic variables

Procedia PDF Downloads 31
60 The Study of Thai Millennial Attitude toward End-of-Life Planning, Opportunity of Service Design Development

Authors: Mawong R., Bussracumpakorn C.

Abstract:

Millions of young people around the world have been affected psychologically and socially by COVID-19. Millennials' stresses have been shaped by several global issues, including climate change, political instability, and financial crisis; in particular, the spread of COVID-19 has left psychological and socioeconomic scars on them. As end-of-life planning becomes more widely discussed, the stigma and taboos around the issue are greatly lessened. End-of-life planning is defined here as a future life plan covering, for example, financial, legacy, funeral, and memorial planning. Such a plan can help millennials discover the value and meaning of life. This study explores the attitudes of Thai Millennials toward end-of-life planning as a new-normal awareness of life, in order to initiate an innovative service concept that fits their values and meaning. The study conducted in-depth interviews with 12 potential participants who have awareness of, or have acted on, such a plan. A customer journey map framework was used to analyze the responses and examine trigger points, barriers, beliefs, and expectations. The findings point to a new end-of-life planning service concept suited to Thai Millennials in four different groups: 1. the Socially Conscious, who donate time and resources to make the world and society a better place; for them, the value of end-of-life planning lies in the social impact of giving something during or after life, and the service should offer a variety of preference-based ways to give to society; 2. Life Fulfillment, who set life goals for themselves and attempt to achieve them before the time comes; for them, the service should inspire life value with a customized plan and offer guidance; 3. Prevention of After-Death Effects, who plan in order to avoid the consequences of their death as patriarchs, heads of family, or anchors for someone; they want a plan that brings confidence and relief while they are still alive, and a reliable service to which they can entrust a will or assets; and 4. No-Guilt Planning, self-responsible planners who wish to be worry-free and want a plan that is easy to understand and access. Overall, the study contributes a new service concept for end-of-life planning that improves awareness of the worth of life rather than mere death planning, encouraging people to reassess their lives in a positive way and leading to higher self-esteem and intrinsic motivation for this generation in a time of global crisis.

Keywords: design management, end-of-life planning, millennial generation, service design solution

Procedia PDF Downloads 163
59 Exploring the Correlation between Body Constitution of an Individual as Per Ayurveda and Gut Microbiome in Healthy, Multi Ethnic Urban Population in Bangalore, India

Authors: Shalini TV, Gangadharan GG, Sriranjini S Jaideep, ASN Seshasayee, Awadhesh Pandit

Abstract:

Introduction: Prakriti, the body-mind constitution of an individual, is a traditional and individualized concept whose understanding is essential for the personalized medicine described in Ayurveda, the Indian system of medicine. Based on the Doshas (the functional, bio-humoral units of the body), individuals are categorized into three major Prakriti: Vata, Pitta, and Kapha. The human gut microbiome hosts a plethora of highly diverse and metabolically active microorganisms, dominated mainly by bacteria, which are known to influence the physiology of an individual. A few studies have shown correlations between Prakriti and biochemical parameters. In this study, an attempt was made to explore any correlation between Prakriti (the phenotype of an individual) and the genetic makeup of the gut microbiome in healthy individuals. Materials and methods: 270 multi-ethnic, healthy volunteers of both sexes, aged 18 to 40 years, with no history of antibiotic use in the previous six months, were recruited into the three groups of Vata, Pitta, and Kapha. The Prakriti of each individual was determined using Ayusoft, software designed by CDAC, Pune, India. The volunteers underwent initial screening assessing height, weight, body mass index, vital signs, and blood investigations to ensure they were healthy. Stool and saliva samples of the recruited volunteers were collected as per the standard operating procedure developed, and bacterial DNA was isolated using Qiagen kits. The extracted DNA was subjected to 16S rRNA sequencing using Illumina kits. The sequencing libraries targeted the variable V3 and V4 regions of the 16S rRNA gene. Paired-end sequencing was done on the MiSeq system, and data were analyzed using CLC Genomics Workbench 11. Results: The 16S rRNA sequencing of the V3 and V4 regions showed a diverse pattern in both the oral and stool microbial DNA. The study did not reveal any specific pattern of bacterial flora among the Prakriti. All p-values were greater than the effective alpha values for all OTUs in both the buccal cavity and stool samples; therefore, no significant enrichment of any OTU was observed in samples from either site. Conclusion: In healthy volunteers of multiple ethnicities, owing to the influence of various factors, no correlation between Prakriti and the gut microbiome was observed.
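
The OTU-level testing described above can be sketched as a per-OTU group comparison against a multiple-testing-corrected alpha; the abundances below are synthetic and drawn from one distribution, so no enrichment should be detected:

```python
import numpy as np
from scipy.stats import kruskal

rng = np.random.default_rng(1)

# Synthetic relative abundances for 50 OTUs across the three Prakriti
# groups (Vata, Pitta, Kapha) -- purely illustrative stand-in data.
n_otus, n_per_group = 50, 90
groups = {g: rng.dirichlet(np.ones(n_otus), size=n_per_group)
          for g in ("vata", "pitta", "kapha")}

alpha = 0.05 / n_otus   # Bonferroni-corrected "effective alpha"
hits = 0
for j in range(n_otus):
    stat, p = kruskal(groups["vata"][:, j],
                      groups["pitta"][:, j],
                      groups["kapha"][:, j])
    hits += p < alpha
print(f"OTUs enriched at corrected alpha: {hits} / {n_otus}")
```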

Keywords: gut microbiome, ayurveda Prakriti, sequencing, multi-ethnic urban population

Procedia PDF Downloads 107
58 TARF: Web Toolkit for Annotating RNA-Related Genomic Features

Authors: Jialin Ma, Jia Meng

Abstract:

Genomic features, i.e., genome-based coordinates, are commonly used to represent biological features such as genes, RNA transcripts, and transcription factor binding sites. For the analysis of RNA-related genomic features, such as RNA modification sites, a common task is to correlate these features with transcript components (5'UTR, CDS, 3'UTR) to explore their distribution characteristics in terms of transcriptomic coordinates, e.g., to examine whether a specific type of biological feature is enriched near transcription start sites. Existing approaches for performing these tasks involve the manipulation of a gene database, conversion from genome-based to transcript-based coordinates, and visualization methods capable of showing RNA transcript components and the distribution of the features. These steps are complicated and time-consuming, especially for researchers who are not familiar with the relevant tools. To overcome this obstacle, we developed TARF, a dedicated web toolkit for annotating RNA-related genomic features. TARF provides a web-based way to easily annotate and visualize RNA-related genomic features. Once a user has uploaded the features in BED format and specified a built-in transcript database, or uploaded a customized gene database in GTF format, the tool fulfills three main functions. First, it annotates gene and RNA transcript components. For every feature provided by the user, overlaps with RNA transcript components are identified, and the information is combined in one table available for copying and download; summary statistics on ambiguous assignments are also produced. Second, the tool provides convenient visualization of the features at the single gene/transcript level. For a selected gene, the tool shows the features together with the gene model in a genome-based view, and also maps the features to transcript-based coordinates, showing their distribution along a single spliced RNA transcript. Third, a global transcriptomic view of the genomic features is generated using the Guitar R/Bioconductor package. The distribution of features on RNA transcripts is normalized with respect to RNA transcript landmarks, and the enrichment of the features on different RNA transcript components is demonstrated. We tested the newly developed TARF toolkit with three different types of genomic features, related to chromatin H3K4me3, RNA N6-methyladenosine (m6A) and RNA 5-methylcytosine (m5C), obtained from ChIP-Seq, MeRIP-Seq and RNA BS-Seq data, respectively. TARF successfully revealed their respective distribution characteristics, i.e., H3K4me3, m6A and m5C are enriched near transcription start sites, stop codons and 5'UTRs, respectively. Overall, TARF is a useful web toolkit for the annotation and visualization of RNA-related genomic features, and should help simplify the analysis of such features, especially those related to RNA modifications.
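
TARF's first function, assigning features to transcript components by overlap, reduces to an interval intersection test; the toy transcript model and coordinates below are invented for illustration (TARF itself works from BED/GTF input):

```python
# Minimal sketch of feature-to-component annotation by interval overlap.
transcript = {          # genome-based coordinates, single "+" transcript
    "5UTR": (1000, 1200),
    "CDS":  (1200, 2800),
    "3UTR": (2800, 3300),
}

features = [            # e.g. m6A sites from MeRIP-Seq, BED-like rows
    ("site1", 1150, 1160),
    ("site2", 2790, 2810),   # straddles CDS/3'UTR -> ambiguous
    ("site3", 3100, 3110),
]

def annotate(start, end):
    hits = [name for name, (s, e) in transcript.items()
            if start < e and end > s]          # half-open overlap test
    return hits or ["intergenic"]

for fid, s, e in features:
    comps = annotate(s, e)
    flag = " (ambiguous)" if len(comps) > 1 else ""
    print(f"{fid}: {','.join(comps)}{flag}")
```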

Keywords: RNA-related genomic features, annotation, visualization, web server

Procedia PDF Downloads 181
57 Coupling of Microfluidic Droplet Systems with ESI-MS Detection for Reaction Optimization

Authors: Julia R. Beulig, Stefan Ohla, Detlev Belder

Abstract:

In contrast to off-line analytical methods, lab-on-a-chip technology delivers direct information about the observed reaction. Microfluidic devices therefore make an important scientific contribution, e.g., in the field of synthetic chemistry, where the rapid generation of analytical data can be applied to the optimization of chemical reactions. These devices enable fast changes of reaction conditions as well as a resource-saving mode of operation. In the presented work, we focus on the investigation of multiphase regimes, more specifically on biphasic microfluidic droplet systems, where every single droplet is a reaction container with customized conditions. The biggest challenge is the rapid qualitative and quantitative readout of information, as most detection techniques for droplet systems are non-specific, time-consuming or too slow. An exception is electrospray ionization mass spectrometry (ESI-MS). The combination of a reaction screening platform with a rapid and specific detection method is an important step in droplet-based microfluidics. In this work, we present a novel approach to synthesis optimization on the nanoliter scale with direct ESI-MS detection. We show the development of a droplet-based microfluidic device that enables the modification of different parameters while simultaneously monitoring their effect on the reaction within a single run. Using common soft- and photolithographic techniques, a polydimethylsiloxane (PDMS) microfluidic chip with different functionalities was developed. As an interface for MS detection, we use a steel capillary for ESI and improve spray stability with Teflon siphon tubing inserted underneath the steel capillary. By optimizing the flow rates, it is possible to screen the parameters of various reactions; this is exemplified by a domino Knoevenagel hetero-Diels-Alder reaction. Different starting materials, catalyst concentrations and solvent compositions were investigated. Due to the high repetition rate of droplet production, each set of reaction conditions is examined hundreds of times. As a result of the investigation, we obtain suitable reagents, the ideal water-methanol ratio of the solvent and the most effective catalyst concentration. The developed system can help determine the optimal parameters of a reaction within a short time. With this novel tool, we take an important step in the field of combining droplet-based microfluidics with organic reaction screening.

Keywords: droplet, mass spectrometry, microfluidics, organic reaction, screening

Procedia PDF Downloads 269
56 Additive Manufacturing of Microstructured Optical Waveguides Using Two-Photon Polymerization

Authors: Leonnel Mhuka

Abstract:

Background: The field of photonics has witnessed substantial growth, with an increasing demand for miniaturized and high-performance optical components. Microstructured optical waveguides have gained significant attention due to their ability to confine and manipulate light at the subwavelength scale. Conventional fabrication methods, however, face limitations in achieving intricate and customizable waveguide structures. Two-photon polymerization (TPP) emerges as a promising additive manufacturing technique, enabling the fabrication of complex 3D microstructures with submicron resolution. Objectives: This experiment aimed to utilize two-photon polymerization to fabricate microstructured optical waveguides with precise control over geometry and dimensions. The objective was to demonstrate the feasibility of TPP as an additive manufacturing method for producing functional waveguide devices with enhanced performance. Methods: A femtosecond laser system operating at a wavelength of 800 nm was employed for two-photon polymerization. A custom-designed CAD model of the microstructured waveguide was converted into G-code, which guided the laser focus through a photosensitive polymer material. The waveguide structures were fabricated using a layer-by-layer approach, with each layer formed by localized polymerization induced by non-linear absorption of the laser light. Characterization of the fabricated waveguides included optical microscopy, scanning electron microscopy, and optical transmission measurements. The optical properties, such as mode confinement and propagation losses, were evaluated to assess the performance of the additively manufactured waveguides. Results: The experiment successfully demonstrated the additive manufacturing of microstructured optical waveguides using two-photon polymerization. Optical microscopy and scanning electron microscopy revealed intricate 3D structures with submicron resolution. The measured optical transmission indicated efficient light propagation through the fabricated waveguides. The waveguides exhibited well-defined mode confinement and relatively low propagation losses, showcasing the potential of TPP-based additive manufacturing for photonics applications. The experiment highlighted the advantages of TPP in achieving high-resolution, customized, and functional microstructured optical waveguides. Conclusion: This experiment substantiates the viability of two-photon polymerization as an innovative additive manufacturing technique for producing complex microstructured optical waveguides. The successful fabrication and characterization of these waveguides open doors to further advancements in the field of photonics, enabling the development of high-performance integrated optical devices for various applications.

Keywords: additive manufacturing, microstructured optical waveguides, two-photon polymerization, photonics applications

Procedia PDF Downloads 68
55 Data Mining in Healthcare for Predictive Analytics

Authors: Ruzanna Muradyan

Abstract:

Medical data mining is a crucial field in contemporary healthcare that offers cutting-edge tactics with enormous potential to transform patient care. This abstract examines how sophisticated data mining techniques could transform the healthcare industry, with a special focus on how they might improve patient outcomes. Healthcare data repositories have evolved dynamically, producing a rich tapestry of diverse, multi-dimensional information that includes genetic profiles, lifestyle markers, electronic health records, and more. By applying data mining techniques to this vast repository, a variety of prospects for precision medicine, predictive analytics, and insight generation become visible. Predictive modeling for illness prediction, risk stratification, and therapy efficacy evaluation is a major point of focus. Healthcare providers may use this abundance of data to tailor treatment plans, identify high-risk patient populations, and forecast disease trajectories by applying machine learning algorithms and predictive analytics. Better patient outcomes, more efficient use of resources, and early interventions are made possible by this proactive strategy. Furthermore, data mining techniques act as catalysts, revealing complex relationships between apparently unrelated data items and providing enhanced insight into the causes of disease, genetic susceptibilities, and environmental factors. Healthcare practitioners can derive practical insights that guide disease prevention, customized patient counseling, and targeted therapies by analyzing these associations. The abstract also explores the problems and ethical issues that come with using data mining techniques in the healthcare industry. To use these approaches properly, it is essential to balance data privacy and security concerns with the interpretability of complex models. Finally, this abstract demonstrates the revolutionary power of modern data mining methodologies in transforming the healthcare sector. Healthcare practitioners and researchers can uncover unique insights, enhance clinical decision-making, and ultimately elevate patient care to unprecedented levels of precision and efficacy by employing cutting-edge methodologies.

Keywords: data mining, healthcare, patient care, predictive analytics, precision medicine, electronic health records, machine learning, predictive modeling, disease prognosis, risk stratification, treatment efficacy, genetic profiles, precision health

Procedia PDF Downloads 31
54 Sentiment Analysis of Creative Tourism Experiences: The Case of Girona, Spain

Authors: Ariadna Gassiot, Raquel Camprubi, Lluis Coromina

Abstract:

Creative tourism involves the participation of tourists in the co-creation of their own experiences in a tourism destination. Consequently, creative tourists move from passive to active behavior, and tourism destinations address this type of tourism by changing the scenario and letting tourists learn and participate while they travel, instead of merely offering tourism products and services to them. In creative tourism experiences, tourists are in close contact with locals and their culture. In destinations where culture (e.g., food, heritage) is the basis of the offer, such as Girona, Spain, tourism stakeholders must especially consider, analyze, and further foster the co-creation of authentic tourism experiences. They should focus on discovering more about these experiences, their main attributes, visitors' opinions, and so on. Creative tourists not only participate while they travel around the world but also show active post-travel behavior: they freely write about tourism experiences in different channels. User-generated content becomes crucial for any tourism destination when analyzing the market, making decisions, planning strategies, and addressing issues such as reputation and performance. Sentiment analysis is a methodology used to automatically analyze semantic relationships and meanings in texts, and thus a way to extract tourists' emotions and feelings. Tourists normally express their views and opinions regarding tourism products and services. They may express positive, neutral or negative feelings towards these products or services, for example, anger, love, hate, sadness or joy towards tourism services and products. They may also express feelings through verbs, nouns, adverbs, and adjectives, among other parts of speech. Sentiment analysis may help tourism professionals in a range of areas, from marketing to customer service. For example, sentiment analysis allows tourism stakeholders to forecast tourism expenditure and tourist arrivals, or to analyze tourists' profiles. While there is an increasing presence of creativity in tourists' experiences, there is also an increasing need to explore tourists' expressions about these experiences and to know how they feel about participating in specific tourism activities. Thus, the main objective of this study is to analyze the meanings, emotions and feelings that tourists express about their creative experiences in Girona, Spain, using sentiment analysis methodology. The results show the diversity of tourists who actively participate in tourism in Girona. Their opinions refer both to tangible aspects (e.g., food, museums) and to intangible aspects (e.g., friendliness, nightlife) of tourism experiences. Tourists express love, liking and other sentiments towards tourism products and services in Girona. This study can help tourism stakeholders understand tourists' experiences and feelings, so that they can offer more customized products and services and efficiently engage tourists in the co-creation of their own tourism experiences.
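
A minimal sentiment-scoring sketch in the spirit of the study, using NLTK's VADER on invented review snippets (the paper does not specify its actual toolchain or corpus):

```python
# Requires: pip install nltk; plus a one-time lexicon download.
import nltk
from nltk.sentiment import SentimentIntensityAnalyzer

nltk.download("vader_lexicon", quiet=True)
sia = SentimentIntensityAnalyzer()

# Invented review snippets standing in for Girona user-generated content.
reviews = [
    "Loved the cooking workshop in the old town, the host was so friendly!",
    "The museum was fine but the queue ruined the experience.",
    "Nightlife was disappointing and overpriced.",
]

for text in reviews:
    scores = sia.polarity_scores(text)   # neg/neu/pos plus compound
    print(f"{scores['compound']:+.2f}  {text}")
```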

Keywords: creative tourism, sentiment analysis, text mining, user-generated content

Procedia PDF Downloads 150
53 How Virtualization, Decentralization, and Network-Building Change the Manufacturing Landscape: An Industry 4.0 Perspective

Authors: Malte Brettel, Niklas Friederichsen, Michael Keller, Marius Rosenberg

Abstract:

The German manufacturing industry has to withstand increasing global competition on product quality and production costs. As labor costs are high, several industries have suffered severely from the relocation of production facilities to emerging countries, which have managed to close the productivity and quality gap substantially. Established manufacturing companies have recognized that customers are not willing to pay large price premiums for incremental quality improvements. As a consequence, many companies from the German manufacturing industry are adjusting their production, focusing on customized products and fast time to market. Leveraging the advantages of novel production strategies such as agile manufacturing and mass customization, manufacturing companies are transforming into integrated networks in which companies unite their core competencies. Here, virtualization of the process and supply chain ensures smooth inter-company operations, providing real-time access to relevant product and production information for all participating entities. The boundaries of companies deteriorate as autonomous systems exchange data gained by embedded systems throughout the entire value chain. With the inclusion of cyber-physical systems, advanced communication between machines becomes tantamount to their dialogue with humans. The increasing utilization of information and communication technology allows digital engineering of products and production processes alike. Modular simulation and modeling techniques allow decentralized units to flexibly alter products and thereby enable rapid product innovation. The present article describes the development of Industry 4.0 within the literature and reviews the associated research streams. We analyze eight scientific journals with regard to the following research fields: individualized production, end-to-end engineering in a virtual process chain, and production networks. We employ cluster analysis to assign sub-topics to the respective research fields. To assess the practical implications, we conducted face-to-face interviews with managers from industry as well as from the consulting business, using a structured interview guideline. The results reveal reasons for the adoption or refusal of Industry 4.0 practices from a managerial point of view. Our findings contribute to the emerging research stream of Industry 4.0 and support decision-makers in assessing their need for transformation towards Industry 4.0 practices.

Keywords: Industry 4.0, mass customization, production networks, virtual process chain

Procedia PDF Downloads 250
52 Computational Analysis of Thermal Degradation in Wind Turbine Spars' Equipotential Bonding Subjected to Lightning Strikes

Authors: Antonio A. M. Laudani, Igor O. Golosnoy, Ole T. Thomsen

Abstract:

Rotor blades of large, modern wind turbines are highly susceptible to downward lightning strikes and can also trigger upward lightning; consequently, it is necessary to equip them with an effective lightning protection system (LPS) in order to avoid damage. The performance of existing LPSs is affected by carbon fibre reinforced polymer (CFRP) structures, which lead to lightning-induced damage in the blades, e.g. via electrical sparks. A solution to prevent internal arcing is to electrically bond the LPS and the composite structures so that they are at the same electric potential. Nevertheless, elevated temperatures arise at the joint interfaces because of high contact resistance, which melts and vaporises some of the epoxy resin matrix around the bonding. The resulting high-pressure gases open up the bonding and can ignite thermal sparks. The objective of this paper is to predict the current density distribution and the temperature field in the adhesive joint cross-section, in order to check whether the resin pyrolysis temperature is reached and any damage is expected. The finite element method has been employed to solve both the current and heat transfer problems, which are considered weakly coupled. The mathematical model for the electric current includes the Maxwell-Ampere equation for the induced electric field, solved together with current conservation, while the thermal field is found from the heat diffusion equation; a sketch of the coupled system is given below. In this way, the current sub-model calculates the Joule heat release for a chosen bonding configuration, whereas the thermal analysis determines threshold values of voltage and current density that must not be exceeded if the temperature across the joint is to remain below the pyrolysis temperature, thereby preventing outgassing. In addition, it provides an indication of the minimal number of bonding points. It is worth mentioning that the numerical procedures presented in this study can be tailored and applied to joints other than adhesive ones for wind turbine blades; for instance, they can be applied to the lightning protection of aerospace bolted joints. Furthermore, they can even be customized to predict the electromagnetic response under lightning strikes of other wind turbine systems, such as nacelle and hub components.
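
Under a quasi-static assumption, the weakly coupled electro-thermal problem described above can be summarized as follows. This is a sketch in our own notation, omitting the Maxwell-Ampere induction term the authors also include:

```latex
% Weakly coupled electro-thermal sketch (quasi-static form; our notation)
\begin{align}
  \nabla \cdot \mathbf{J} &= 0, \qquad \mathbf{J} = \sigma \mathbf{E},
    && \text{current conservation in the joint,} \\
  \rho c_p \, \frac{\partial T}{\partial t} &=
    \nabla \cdot \left( k \nabla T \right) + \mathbf{J} \cdot \mathbf{E},
    && \text{heat diffusion with Joule source.}
\end{align}
```

The current sub-problem is solved first for the current density J; its Joule term J · E then enters the thermal sub-problem as the heat source, which is what makes the coupling weak.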

Keywords: carbon fibre reinforced polymer, equipotential bonding, finite element method (FEM), lightning protection system (LPS), wind turbine blades

Procedia PDF Downloads 135
51 Consequences to Financial Reporting by Implementing Sri Lanka Financial Reporting Standard 13 on Measuring the Fair Value of Financial Instruments: Evidence from Three Sri Lankan Organizations

Authors: Nayoma Ranawaka

Abstract:

The demand for high-quality, internationally comparable financial information has increased more than ever with the expansion of economic activities beyond national boundaries. Thus, the necessity of converging accounting practices across the world is now discussed with greater emphasis. Global convergence to International Financial Reporting Standards (IFRS) has been one of the main objectives of the International Accounting Standards Board (IASB) since its establishment in 2001. Accordingly, Sri Lanka adopted IFRS in 2012. Among the standards newly introduced by the IASB, IFRS 13 plays a pivotal role, as it deals with Fair Value Accounting (FVA). It is therefore valuable to obtain knowledge about the consequences of implementing IFRS 13 in Sri Lanka and to compare results across nations. In line with the IFRS jurisdictional provision for Sri Lanka, the Institute of Chartered Accountants of Sri Lanka took official steps to adopt IFRS 13 by introducing SLFRS 13, achieving de jure convergence. This study then examined the de facto convergence of SLFRS 13 in measuring the fair value of financial instruments in the Sri Lankan context. Accordingly, the objective of this study is to explore the consequences for financial reporting of implementing SLFRS 13 in measuring financial instruments. To achieve this objective, an expert interview and in-depth interviews with participants from the three selected case studies and their independent auditors were carried out, using three customized interview guides. The three cases were selected from three different industries: banking, manufacturing, and finance. NVivo version 10 was used to analyze the data collected through the in-depth interviews. Content analysis was then carried out, and conclusions were derived from the findings. This study contributes to knowledge in several ways. The findings help accounting practitioners get an overall picture of the application of the fair value standard in measuring financial instruments and identify the challenges and barriers in the adoption process. They also assist auditors in carrying out audit procedures that check the level of compliance with the fair value standard in measuring financial instruments. Moreover, the findings enable foreign investors to assess the reliability of the financial statements of their target investments, given the role of SLFRS 13 in measuring the fair values of financial instruments. The findings could also open new avenues of thinking for policy formulators to provide the necessary infrastructure to eliminate disparities existing among different regulatory bodies, facilitating full convergence and thereby the growth of the economy. Finally, the study provides insights into the dynamics of FVA implementation that are also relevant for other developing countries.

Keywords: convergence, fair value, financial instruments, IFRS 13

Procedia PDF Downloads 94
50 Analysis of Fuel Adulteration Consequences in Bangladesh

Authors: Mahadehe Hassan

Abstract:

In most countries, the manufacturing, trading, and distribution of gasoline and diesel fuels belong to the most important sectors of the national economy. For Bangladesh, a robust, well-functioning, secure, and smartly managed national fuel distribution chain is an essential precondition for achieving the Government's top priorities in developing and modernizing transportation infrastructure, protecting the national environment and population health and, very importantly, securing due tax revenue for the State Budget. Bangladesh is a developing country with a complex fuel supply network, a high incidence of fuel taxes, and, until now, limited possibilities for applying modern, automated technologies to national fuel market control. Such an environment allows dishonest physical and legal persons and organized criminals to build and profit from illegal fuel distribution schemes and illicit fuel trade. As a result, market transparency suffers significantly, as do the country's attractiveness for foreign investment, law-abiding economic operators, national consumers, the State Budget, the Government's ability to finance development projects, and the country at large. Research shows that over 50% of retail petrol stations in major agglomerations of Bangladesh sell adulterated fuels and/or cheat customers on the real volume of fuel pumped into their vehicles. Other detected forms of illicit fuel trade include misdeclaration of fuel quantitative and qualitative parameters during internal transit and the sale of non-declared and smuggled fuels. The aim of the study is to recommend the implementation of a National Fuel Distribution Integrity Program (FDIP) in Bangladesh to address and resolve fuel adulteration and illicit trade problems. The program should be customized to the specific needs of the country and implemented in partnership with providers of advanced technologies. FDIP should enable and further enhance the capacity of the respective Bangladesh Government authorities to identify and eliminate all forms of illicit fuel trade swiftly and resolutely. FDIP high-technology, IT, and automation systems and secure infrastructures should be aimed at the following areas: (1) fuel adulteration, misdeclaration, and non-declaration; (2) fuel quality; and (3) fuel volume manipulation at the retail level. Furthermore, the overall concept of FDIP delivery and its interaction with the reporting and management systems used by the Government should be aligned with, and support the objectives of, the Vision 2041 and Smart Bangladesh Government programs.

Keywords: fuel adulteration, octane, kerosene, diesel, petrol, pollution, carbon emissions

Procedia PDF Downloads 32
49 Modelling the Antecedents of Supply Chain Enablers in Online Groceries Using Interpretive Structural Modelling and MICMAC Analysis

Authors: Rose Antony, Vivekanand B. Khanapuri, Karuna Jain

Abstract:

Online groceries have transformed the way supply chains are managed, yet they face numerous challenges, including product wastage, low margins, long breakeven periods, and low market penetration. E-grocery chains need to overcome these challenges in order to survive the competition. The purpose of this paper is to carry out a structural analysis of the enablers in e-grocery chains by applying Interpretive Structural Modeling (ISM) and MICMAC analysis in the Indian context. The research design is descriptive-explanatory in nature. The enablers were identified from the literature and through semi-structured interviews with managers having relevant experience in e-grocery supply chains; the experts were contacted through professional and social networks using a purposive snowball sampling technique. The interviews were transcribed, and manual coding was carried out using open and axial coding methods. The key enablers were categorized into themes, and the contextual relationships between these themes and the performance measures were sought from industry veterans. Using ISM, a hierarchical model of the enablers was developed, and MICMAC analysis identified their driver and dependence powers. Based on driver-dependence power, the enablers were categorized into four clusters, namely independent, autonomous, dependent, and linkage; a sketch of this classification step is given below. The analysis found that information technology (IT) and manpower training act as key enablers in reducing lead time and enhancing online service quality. Many of the enablers fall into the linkage cluster, viz., frequent software updating, branding, the number of delivery boys, order processing, benchmarking, product freshness, and customized applications for different stakeholders, depicting these as critical in online food and grocery supply chains. Considering the perishable nature of the products being handled, the impact of the enablers on product quality was also identified. Hence, the study serves as a tool to identify and prioritize the vital enablers in the e-grocery supply chain. The work is perhaps unique in identifying the complex relationships among supply chain enablers in fresh food for e-groceries and linking them to performance measures. It contributes to the knowledge of supply chain management in general and e-retailing in particular. The approach focuses on fresh food supply chains in the Indian context and hence will be applicable in the context of developing economies, where supply chains are evolving.
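
To make the MICMAC step concrete, the sketch below computes driving and dependence powers as row and column sums of a final ISM reachability matrix and assigns each enabler to one of the four clusters. The enabler names and the matrix entries are hypothetical, not the study's data.

```python
# Sketch of MICMAC driver-dependence classification (illustrative only;
# the enablers and reachability matrix are hypothetical).
import numpy as np

enablers = ["IT", "manpower training", "branding", "order processing"]

# Final reachability matrix from ISM: R[i][j] = 1 if enabler i leads to j.
R = np.array([
    [1, 1, 1, 1],
    [0, 1, 1, 1],
    [0, 0, 1, 1],
    [0, 0, 0, 1],
])

driving = R.sum(axis=1)     # row sums: how many enablers each one influences
dependence = R.sum(axis=0)  # column sums: how many enablers influence it

mid = len(enablers) / 2     # simple midpoint split into the four quadrants
for name, d, p in zip(enablers, driving, dependence):
    if d > mid and p > mid:
        cluster = "linkage"
    elif d > mid:
        cluster = "independent (driver)"
    elif p > mid:
        cluster = "dependent"
    else:
        cluster = "autonomous"
    print(f"{name}: driving={d}, dependence={p} -> {cluster}")
```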

Keywords: interpretive structural modelling (ISM), India, online grocery, retail operations, supply chain management

Procedia PDF Downloads 179
48 Customized Temperature Sensors for Sustainable Home Appliances

Authors: Merve Yünlü, Nihat Kandemir, Aylin Ersoy

Abstract:

Temperature sensors are used in home appliances not only to monitor the basic functions of the machine but also to minimize energy consumption and ensure safe operation. In parallel with the development of smart home applications and IoT algorithms, these sensors produce important data, such as the frequency of use of the machine and user preferences, and enable the compilation of data critical to diagnostic processes for fault detection throughout an appliance's operational lifespan. Commercially available thin-film resistive temperature sensors have a well-established manufacturing procedure that allows them to operate over a wide temperature range. However, these sensors are over-designed for white goods applications: their operating temperature range is between -70°C and 850°C, while home appliance applications require only 23°C to 500°C. To ensure the operation of commercial sensors over this wide temperature range, a platinum coating of approximately 1 micron thickness is usually applied to the wafer. However, the use of platinum and the high coating thickness extend the sensor production process time and therefore increase sensor costs. In this study, an attempt was made to develop a low-cost temperature sensor design and production method that meets the technical requirements of white goods applications. For this purpose, a custom design was made, and the design parameters (length, width, trim points, and thin-film deposition thickness) were optimized using statistical methods to achieve the desired resistivity value; a rough sketch of the underlying resistance calculation is given below. The thin-film resistive temperature sensors were fabricated on a single-side-polished sapphire wafer. To enhance adhesion and insulation, 100 nm of silicon dioxide was deposited by the inductively coupled plasma chemical vapor deposition technique. The lithography was performed with a direct laser writer, and the lift-off process followed e-beam evaporation of a 10 nm titanium layer and a 280 nm platinum layer. Standard four-point-probe sheet resistance measurements were performed at room temperature. Resistivity was measured with a probe station before and after annealing at 600°C in a rapid thermal processing machine, and the temperature dependence between 25°C and 300°C was also tested. As a result of this study, a temperature sensor has been developed that has a lower coating thickness than commercial sensors but can produce reliable data over the white goods application temperature range. A relatively simple but optimized production method has also been developed to produce this sensor.
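
As a rough illustration of how the design parameters map to resistance, the following Python sketch uses the textbook relations R = rho * L / (w * t) for a thin-film trace and R(T) = R0 * (1 + alpha * (T - T0)) for a platinum RTD. The geometry values, bulk resistivity, and TCR are nominal assumptions (thin films typically show higher resistivity and lower TCR than bulk platinum), not the paper's optimized parameters.

```python
# Rough sanity check of a thin-film platinum RTD design (illustrative;
# the geometry below is hypothetical, not the paper's optimized design).
RHO_PT = 1.06e-7      # nominal bulk resistivity of platinum at 20 C, ohm*m
TCR = 3.85e-3         # nominal temperature coefficient of resistance, 1/C

def trace_resistance(length_m, width_m, thickness_m, rho=RHO_PT):
    """Resistance of a straight thin-film trace: R = rho * L / (w * t)."""
    return rho * length_m / (width_m * thickness_m)

def resistance_at(temp_c, r0, t0=20.0, tcr=TCR):
    """Linear RTD model: R(T) = R0 * (1 + alpha * (T - T0))."""
    return r0 * (1.0 + tcr * (temp_c - t0))

# Example: a 280 nm thick, 20 um wide meander with 10 mm total path length.
r0 = trace_resistance(10e-3, 20e-6, 280e-9)
print(f"R at 20 C: {r0:.1f} ohm, R at 300 C: {resistance_at(300, r0):.1f} ohm")
```

Such a calculation shows why length, width, and deposition thickness are the natural knobs for hitting a target resistance, which is what the statistical optimization in the study tunes.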

Keywords: thin film resistive sensor, temperature sensor, household appliance, sustainability, energy efficiency

Procedia PDF Downloads 45
47 Analysis and Quantification of Historical Drought for Basin Wide Drought Preparedness

Authors: Joo-Heon Lee, Ho-Won Jang, Hyung-Won Cho, Tae-Woong Kim

Abstract:

Drought is a recurrent climatic feature that occurs in virtually every climatic zone around the world. Korea experiences drought at the regional scale almost every year, mainly during the winter and spring seasons; moreover, extremely severe droughts at the national scale have occurred at a frequency of six to seven years. Various drought indices have been developed as tools to quantitatively monitor different types of drought and are utilized in the field of drought analysis. Since drought is closely related to the climatological and topographic characteristics of drought-prone areas, basins where droughts frequently occur need separate drought preparedness and contingency plans. In this study, a statistical analysis was carried out for the historical droughts that occurred in the five major river basins in Korea, so that drought characteristics could be quantitatively investigated. The study also aimed to provide information with which differentiated and customized drought preparedness plans can be established based on basin-level analysis results. Conventional methods quantify drought by applying various drought indices; however, the evaluation results for the same drought event differ across analysis techniques, depending in particular on how the severity or duration of the drought is viewed in the evaluation process. Therefore, a drought history for the five most severely affected major river basins of Korea was drawn by investigating a drought magnitude that simultaneously considers severity, duration, and the damaged areas, applying drought run theory to the SPI (Standardized Precipitation Index), which efficiently quantifies meteorological drought; a sketch of the run-theory step is given below. Further, a quantitative analysis of the historical extreme droughts from various viewpoints, such as average severity, duration, and magnitude, was attempted. At the same time, the historical drought events were quantitatively analyzed by estimating return periods from SDF (severity-duration-frequency) curves derived through parametric regional drought frequency analysis for the five major river basins. The analysis showed that the extremely severe drought years were 1962, 1988, 1994, and 2014 in the Han River basin; 1982 and 1988 in the Nakdong River basin; 1994 in the Geum River basin; 1988 and 1994 in the Youngsan River basin; and 1988, 1994, 1995, and 2000 in the Seomjin River basin. At the national level, the extremely severe drought years in the Korean Peninsula were 1988 and 1994. The most damaging droughts were those of 1981-1982 and 1994-1995, which lasted longer than two years. The return period of the most severe drought in each river basin turned out to be 50-100 years.
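
The run-theory step can be illustrated with a short Python sketch: a drought event is a run of consecutive months with SPI below a threshold, its severity is the accumulated deficit, and its intensity is severity divided by duration. The threshold, the toy SPI series, and the use of intensity as one notion of "magnitude" are illustrative conventions; definitions vary in the literature, and the study's magnitude also accounts for the damaged areas.

```python
# Sketch of drought run theory on an SPI series (illustrative; the
# threshold and series are hypothetical, not the study's data).
def drought_events(spi, threshold=-1.0):
    events, run = [], []
    for value in spi:
        if value < threshold:
            run.append(value)
        elif run:
            severity = sum(threshold - v for v in run)  # accumulated deficit
            events.append({"duration": len(run),
                           "severity": severity,
                           "intensity": severity / len(run)})
            run = []
    if run:  # close an event still open at the end of the record
        severity = sum(threshold - v for v in run)
        events.append({"duration": len(run),
                       "severity": severity,
                       "intensity": severity / len(run)})
    return events

spi_series = [0.3, -1.2, -1.8, -0.5, 0.8, -1.1, -1.4, -2.0, 0.1]
for e in drought_events(spi_series):
    print(e)
```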

Keywords: drought magnitude, regional frequency analysis, SPI, SDF (severity-duration-frequency) curve

Procedia PDF Downloads 375
46 Numerical Modelling of the Influence of Meteorological Forcing on Water-Level in the Head Bay of Bengal

Authors: Linta Rose, Prasad K. Bhaskaran

Abstract:

Water-level information along the coast is very important for disaster management, navigation, shoreline management planning, coastal engineering and protection works, port and harbour activities, and a better understanding of near-shore ocean dynamics. The water-level variation along a coast arises from various factors, such as astronomical tides and meteorological and hydrological forcing. The study area is the Head Bay of Bengal, which is highly vulnerable to flooding events caused by monsoons, cyclones, and sea-level rise. The study aims to explore the extent to which wind and surface pressure can influence water-level elevation, in view of the low-lying topography of the coastal zones in the region. The ADCIRC hydrodynamic model was customized for the Head Bay of Bengal, discretized using flexible finite elements, and validated against tide gauge observations. Monthly mean climatological wind and mean sea level pressure fields from the ERA-Interim reanalysis were used as input forcing, in addition to tidal forcing, to simulate the water-level variation in the Head Bay of Bengal. The output water-level was compared against that produced using tidal forcing alone, so as to quantify the contribution of meteorological forcing to the water-level; one way of computing this contribution is sketched below. The average contribution of the meteorological fields to the water-level in January is 5.5% at a deep-water location and 13.3% at a coastal location. During July, when the monsoon winds are strongest in this region, this increases to 10.7% and 43.1% at the deep-water and coastal locations, respectively. The model output was further tested by varying the input meteorological fields in an attempt to quantify the relative significance of wind speed and wind direction for the water-level. Under uniform wind conditions, the results showed a higher contribution of the meteorological fields for south-west winds than for north-east winds when the wind speed was higher. A comparison of the spectral characteristics of the output water-level with that generated by tidal forcing alone showed additional modes with seasonal and annual signatures. Moreover, the non-linear monthly mode was found to be weaker than in the tidal simulation. All of this points out that meteorological fields have little effect on the water-level at periods of less than a day and that they induce non-linear interactions between existing modes of oscillation. The study signifies the role of meteorological forcing under fair weather conditions and points out that a combination of multiple forcing fields, including tides, wind, atmospheric pressure, waves, precipitation, and river discharge, is essential for efficient and effective forecast modelling, especially during extreme weather events.
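
One simple way to express the meteorological contribution, sketched below in Python, is the RMS of the residual (fully forced minus tide-only water-level) relative to the RMS of the fully forced signal. The abstract does not state the authors' exact metric, and the series used here are synthetic.

```python
# Sketch of quantifying the meteorological contribution to water-level
# (illustrative; the paper's exact metric and data are not specified).
import numpy as np

def met_contribution_pct(eta_full, eta_tide):
    """Percent contribution of meteorological forcing to water-level."""
    eta_full = np.asarray(eta_full)
    residual = eta_full - np.asarray(eta_tide)
    rms = lambda x: np.sqrt(np.mean(np.square(x)))
    return 100.0 * rms(residual) / rms(eta_full)

# Hypothetical hourly series (metres) at a coastal node
t = np.arange(0, 30 * 24)                       # one month of hourly steps
eta_tide = 0.9 * np.sin(2 * np.pi * t / 12.42)  # M2-like tidal signal
eta_full = eta_tide + 0.12 * np.sin(2 * np.pi * t / (24 * 30))  # + slow surge
print(f"meteorological contribution: {met_contribution_pct(eta_full, eta_tide):.1f}%")
```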

Keywords: ADCIRC, Head Bay of Bengal, mean sea level pressure, meteorological forcing, water-level, wind

Procedia PDF Downloads 194
45 Efforts to Revitalize Piipaash Language: An Explorative Study to Develop Culturally Appropriate and Contextually Relevant Teaching Materials for Preschoolers

Authors: Shahzadi Laibah Burq, Gina Scarpete Walters

Abstract:

Piipaash, representing one large family of North American languages, Yuman, is reported to be one of the seriously endangered languages in the Salt River Pima-Maricopa Indian Community of Arizona. In a collaborative venture between Arizona State University (ASU) and the Salt River Pima-Maricopa Indian Community (SRPMIC), efforts have been made to revitalize and preserve the Piipaash language and its cultural heritage. The present study is one example of several language documentation and revitalization initiatives that the ASU Humanities Lab has taken. This study was approved to receive a “Beyond the lab” grant after the researchers successfully created a teaching guide for an early childhood Piipaash storybook during their time working in the Humanities Lab. The current research is an extension of the previous project and focuses on creating customized teaching materials and tools for the teachers and parents of the students in the Early Enrichment Program at SRPMIC. To determine and maximize the usefulness of the teaching materials with regard to their reliability, validity, and practicality in the given context, this research conducts an environmental analysis and a needs analysis: the environmental analysis evaluates the situation of the Early Enrichment Program, and the needs analysis investigates the specific, situated requirements of the teachers in helping students build target language skills. The study employs a qualitative approach to data collection, with multiple data collection strategies used concurrently to gather information from the participants. The research tools include semi-structured interviews with the program administrators and teachers, classroom observations, and teacher shadowing. The researchers use triangulation of the data to maintain validity in the process of data interpretation. The preliminary results of the study show a need for culturally appropriate materials that can further students' learning of the target language as well as the culture (e.g., clay pots and basket-making materials). It was found that the course and the teachers focus on developing the listening and speaking skills of the students. Moreover, to assist the young learners beyond the classroom, the teachers could make use of send-home teaching materials to reinforce learning (e.g., coloring books with illustrations of culturally relevant animals, food, and places). Audio language resources were also identified as helpful additional materials for parents to support their children's learning.

Keywords: indigenous education, materials development, needs analysis, Piipaash language revitalization

Procedia PDF Downloads 51
44 Leveraging Power BI for Advanced Geotechnical Data Analysis and Visualization in Mining Projects

Authors: Elaheh Talebi, Fariba Yavari, Lucy Philip, Lesley Town

Abstract:

The mining industry generates vast amounts of data, necessitating robust data management systems and advanced analytics tools to support better decision-making in developing mining production and maintaining safety. This paper highlights the advantages of Power BI, a powerful business intelligence tool, over traditional Excel-based approaches for effectively managing and harnessing mining data. Power BI enables professionals to connect and integrate multiple data sources, ensuring real-time access to up-to-date information, and its interactive visualizations and dashboards offer an intuitive interface for exploring and analyzing geotechnical data. Advanced analytics is a collection of data analysis techniques for improving decision-making; leveraging some of the most sophisticated techniques in data science, it is used for everything from detecting data errors and ensuring data accuracy to directing the development of future project phases. However, while Power BI is a robust tool, it has limitations for some of the specific visualizations required by geotechnical engineers. This paper therefore examines the use of Python or R programming within a Power BI dashboard to enable advanced analytics, additional functionality, and customized visualizations. The dashboard provides comprehensive tools for analyzing and visualizing key geotechnical data, including spatial representation on maps, field and laboratory test results, and subsurface rock and soil characteristics. Advanced visualizations such as borehole logs and stereonets were implemented using Python programming within the Power BI dashboard (a minimal sketch of such a Python visual is given below), enhancing the understanding and communication of geotechnical information. Moreover, the dashboard's flexibility allows for the incorporation of additional data and visualizations according to the project scope and available data, such as pit design, rockfall analyses, rock mass characterization, and drone data. This further enhances the dashboard's usefulness across project phases, including operation, development, closure, and rehabilitation, and helps minimize the need for multiple software programs within a project. The geotechnical dashboard in Power BI serves as a user-friendly solution for analyzing, visualizing, and communicating both new and historical geotechnical data, aiding informed decision-making and efficient project management throughout the various project stages. Its ability to generate dynamic reports and share them with clients in a collaborative manner further enhances decision-making processes and facilitates effective communication within geotechnical projects in the mining industry.
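
As a minimal sketch of such a Python visual, the script below plots a borehole-log-style profile. In a Power BI Python visual, the selected fields arrive as a pandas DataFrame named dataset, and the matplotlib figure produced by the script is rendered in the dashboard; the column names used here are hypothetical, not the paper's schema.

```python
# Minimal Power BI Python visual sketch (runs inside a Python visual, where
# Power BI supplies the selected fields as a DataFrame called `dataset`;
# the column names 'depth_m' and 'spt_n' are hypothetical).
import matplotlib.pyplot as plt

df = dataset.dropna(subset=["depth_m", "spt_n"]).sort_values("depth_m")

fig, ax = plt.subplots(figsize=(4, 8))
ax.plot(df["spt_n"], df["depth_m"], marker="o")
ax.invert_yaxis()              # depth increases downwards, as in a borehole log
ax.set_xlabel("SPT N-value")
ax.set_ylabel("Depth (m)")
ax.set_title("Borehole log sketch")
plt.show()                     # Power BI renders the current matplotlib figure
```

The same pattern extends to more specialised plots (e.g., stereonets via a plotting library), which is what makes scripted visuals a practical escape hatch when built-in Power BI charts fall short.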

Keywords: geotechnical data analysis, Power BI, visualization, decision-making, mining industry

Procedia PDF Downloads 53