Search results for: Simon Quick
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 647

227 Public Procurement and Innovation: A Municipal Approach

Authors: M. Moso-Diez, J. L. Moragues-Oregi, K. Simon-Elorz

Abstract:

Innovation procurement, designed to steer the development of solutions towards concrete public sector needs and to drive innovation from the demand side (in public services as well as in market opportunities for companies), is horizontally emerging as a new policy instrument. In 2014 the new EU public procurement directives 2014/24/EC and 2014/25/EC reinforced the support for Public Procurement for Innovation, dedicating funding instruments that can be used across all areas supported by Horizon 2020 and targeting potential buyers of innovative solutions: groups of public procurers with similar needs. Under this programme, new policy adopters and networks emerge, aiming to embed innovation criteria into new procurement processes. As these initiatives are still in progress, related research is scarce. We argue that Innovation Public Procurement can emerge as an innovative policy instrument for public procurement in different policy domains, in spite of existing institutional and cultural barriers (legal guarantee versus innovation). The presentation combines insights from public procurement and supply chain management in a sustainability and innovation policy arena, as a means of providing an understanding of: (1) the circumstances that emerge; (2) the relationship between public and private actors; and (3) the emerging capacities in the definition of the agenda. The policy adopters are contracting authorities, mainly at the municipal level, where they interact with the supply chain through the Competitive Dialogue procedure, interconnecting sustainability and climate measures with other policy priorities such as innovation and urban planning. We found that geography and territory affect both the level of the municipal budget (due to municipal income per capita) and its institutional competencies (due to demographic reasons). In spite of the relevance of institutional determinants for public procurement, other factors play an important role, such as human factors as well as both public policy and private intervention. The experience is a ‘city project’ (Bilbao) in the field of brownfield decontamination. Brownfield sites typically refer to abandoned or underused industrial and commercial properties—such as old process plants, mining sites, and landfills—that are available but contain low levels of environmental contaminants that may complicate reuse or redevelopment of the land. This article concludes that Innovation Public Procurement in sustainability and climate issues should be further developed, both as a policy instrument and as a policy research line, which could enable further relevant changes in public procurement as well as in climate innovation.

Keywords: innovation, city projects, public policy, public procurement

Procedia PDF Downloads 285
226 Reducing Flood Risk through Value Capture and Risk Communication: A Case Study in Cocody-Abidjan

Authors: Dedjo Yao Simon, Takahiro Saito, Norikazu Inuzuka, Ikuo Sugiyama

Abstract:

Abidjan city (Republic of Ivory Coast) is an emerging megacity and an urban coastal area where the number of reported floods is increasing rapidly due to climate change and unplanned urbanization. However, comprehensive disaster mitigation plans, policies, and financial resources are still lacking, and the population ignores the extent and location of the flood zones, making them unprepared to mitigate the damages. Considering the existing conditions, this paper discusses an approach for flood risk reduction in Cocody Commune through a value capture strategy and flood risk communication. Using geospatial techniques and hydrological simulation, we start our study by delineating flood zones and depths under several return periods in the study area. Then, a questionnaire-based field survey is conducted in order to validate the flood maps, to estimate the flood risk, and to sample residents' opinions on how the disclosure of flood risk information could affect the values of property located inside and outside the flood zones. The results indicate that the study area is highly vulnerable to floods of 5-year return period and above, which can cause serious harm to human lives and property, as demonstrated by the extent of the 5-year flood of 2014. It is also revealed that there is a high probability that the values of property located within flood zones could decline, and the values of surrounding property in the safe area could increase, when risk information disclosure commences. However, in order to raise public awareness of flood disaster and to prevent future housing development in prospective high-risk areas, flood risk information should be disseminated through the establishment of an early warning system. In order to reduce the effect of risk information disclosure and to protect the values of property within the high-risk zone, we propose that property tax increments in flood-free zones should be captured and utilized for infrastructure development and to maintain the early warning system that will benefit people living in flood-prone areas. Through this case study, it is shown that the combination of a value capture strategy and risk communication could be an effective tool to educate citizens and to invest in flood risk reduction in emerging countries.
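
The delineation of flood zones under several return periods rests on a frequency analysis of extreme rainfall or discharge. The snippet below is a minimal sketch of that step, assuming a Gumbel (EV1) distribution fitted to an invented series of annual maxima; it is not the paper's hydrological model, whose data and simulation chain are not reproduced here.

```python
# Hedged sketch: fit a Gumbel (EV1) distribution to annual maximum rainfall and
# estimate design storms for several return periods. The rainfall values are
# invented placeholders for illustration only.
import numpy as np
from scipy import stats

annual_max_rainfall_mm = np.array(
    [92.0, 110.5, 87.3, 134.2, 99.8, 121.4, 105.0, 143.7, 96.2, 118.9]
)  # hypothetical annual maxima (mm/day)

loc, scale = stats.gumbel_r.fit(annual_max_rainfall_mm)

for T in (2, 5, 10, 25, 50):  # return periods in years
    # The T-year event is exceeded with probability 1/T in any given year.
    design_rainfall = stats.gumbel_r.isf(1.0 / T, loc=loc, scale=scale)
    print(f"{T:>3}-year design rainfall: {design_rainfall:6.1f} mm/day")
```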

Keywords: Cocody-Abidjan, flood, geospatial techniques, risk communication, value capture

Procedia PDF Downloads 242
225 Effects of Food Habits on Road Accidents Due to Micro-Sleepiness and Analysis of Attitudes to Develop a Food Product as a Preventive Measure

Authors: Rumesh Liyanage, S. B. Nawaratne, K. K. D. S. Ranaweera, Indira Wickramasinghe, K. G. S. C. Katukurunda

Abstract:

This study attempted to identify the effect of food habits and the public's attitudes on micro-sleepiness, and to develop a food product as a preventive measure to combat it. Statistical data pertaining to road accidents were collected from the Sri Lanka Police Traffic Division, and a pre-tested questionnaire was used to collect data from 250 respondents. They were selected to represent drivers (especially highway drivers), private and public sector workers (shift based) and cramming students (university and school). Respondents were asked to complete the questionnaires independently and in person, and the collected data were analyzed statistically. Results revealed that 76.84, 96.39 and 80.93% of the total respondents, respectively, consumed rice for the three daily meals, which leads to ingesting high-glycemic meals. Taking two hyperglycemic meals before 14.00h was identified as a cause of micro-sleepiness among these respondents. The peak level of road accidents was observed at 14.00-20.00h (38.2%), and the intensity of micro-sleepiness falls in the same time period (37.36%); 14.00 to 16.00h was the peak time, 16.00 to 18.00h was the lowest, and at 18.00 to 20.00h it reappears slightly. Even though respondents of the survey expressed that the peak hours of micro-sleepiness are 14.00-16.00h, according to police reports, peak hours fall between 18.00 and 20.00h. Of the interviewees, 69.27% strongly wanted to avoid micro-sleepiness and intend to spend LKR 10-20 on a commercial product to combat micro-sleepiness. As age-old practices to suppress micro-sleepiness are time-consuming, modern-day respondents (51.64%) would like a quick solution through a drink. Therefore, the food habits of morning and noon may cause micro-sleepiness, while dinner may cause both natural and micro-sleepiness due to the heavy glycemic load of the food. According to the study, micro-sleepiness can be categorized into three zones: a low-risk zone (08.00-10.00h and 18.00-20.00h), a manageable zone (10.00-12.00h), and a high-risk zone (14.00-16.00h).

Keywords: food habits, glycemic load, micro-sleepiness, road accidents

Procedia PDF Downloads 521
224 Traditional Mechanisms of Conflict Resolution in Africa: A Pathway to Sustainable Peace in Nigeria

Authors: Ejovi Eghwubare Augustine

Abstract:

This study delved into the traditional mechanisms of conflict resolution in Africa as a pathway to sustainable peace in Nigeria. It deployed quantitative and qualitative methods of data collection and content analysis. The work adopted the Peace Process theory propounded by John Darby and Roger Mac Ginty. It ascertained that disputes or disagreements are unarguably and necessarily an inevitable part of human existence, flowing directly from communication, interaction, and relationships, and can occur at individual and national levels, and even at the international level in view of the current trend of globalization. The Alternative Dispute Resolution (ADR) mechanism is a basket of procedures outside the traditional process of litigation or the strict determination of legal rights. It may also be elucidated as a range of procedures that generally involve the intercession and assistance of a neutral and impartial third party. The traditional mechanisms of conflict resolution in Africa are largely unfamiliar to the Western world; this paper is therefore of utmost importance to the Western world and also enriches its pool of literature. Nigeria is a country that is dominated by various ethnic groups anchored on diverse cultures, customs, and traditions. It is, therefore, not surprising to see conflicts arise, and despite the various attempts at resolving these conflicts through litigation, they still remain unabated. The paper investigated the lessons learned from traditional mechanisms of conflict resolution; it also interrogated their impact and the way forward. In light of the lessons learned and the impact of the traditional mechanisms of conflict resolution, suggestions on how to attain a sustainable, peaceful society were proffered. In conclusion, the study crystallized reforms to alternative dispute resolution introduced through the traditional mechanism, including, amongst others, that constitutional recognition should be given to traditional institutions of conflict resolution to enable the quick dispensation of matters.

Keywords: traditional, conflict, peace, resolution

Procedia PDF Downloads 45
223 Machine Learning Model to Predict TB Bacteria-Resistant Drugs from TB Isolates

Authors: Rosa Tsegaye Aga, Xuan Jiang, Pavel Vazquez Faci, Siqing Liu, Simon Rayner, Endalkachew Alemu, Markos Abebe

Abstract:

Tuberculosis (TB) is a major cause of disease globally. In most cases, TB is treatable and curable, but only with the proper treatment. Drug-resistant TB occurs when the bacteria become resistant to the drugs that are used to treat TB. Current strategies to identify drug-resistant TB bacteria are laboratory-based, and it takes a long time to identify the drug-resistant bacteria and treat the patient accordingly. Machine learning (ML) and data science can offer new approaches to the problem. In this study, we propose to develop an ML-based model that predicts the antibiotic resistance phenotypes of TB isolates in minutes so that the right treatment can be given to the patient immediately. The study uses whole-genome sequences (WGS) of TB isolates, extracted from the NCBI repository and containing samples from different countries, as training data to build the ML models. Samples from different countries were included in order to generalize across the large group of TB isolates from different regions of the world; this lets the model learn the different behaviors of the TB bacteria and makes the model robust. Three pieces of information extracted from the WGS data were considered for training the model: all variants found within the candidate genes (F1), predetermined resistance-associated variants (F2), and resistance-associated gene information for the particular drug. Two major datasets were constructed using this information: F1 and F2 were treated as two independent datasets, and the third piece of information was used as the class label for both datasets. Five machine learning algorithms were considered to train the model: Support Vector Machine (SVM), Random Forest (RF), Logistic Regression (LR), Gradient Boosting, and AdaBoost. The models were trained on the datasets F1, F2, and F1F2, i.e., the F1 and F2 datasets merged. Additionally, an ensemble approach was used: the F1 and F2 datasets were each run through the Gradient Boosting algorithm, their outputs were combined into a single dataset, called the F1F2 ensemble dataset, and models were then trained on this dataset with the five algorithms. As the experiments show, the ensemble model trained with the Gradient Boosting algorithm outperformed the rest of the models. In conclusion, this study suggests the ensemble approach, that is, the RF + Gradient Boosting model, to predict the antibiotic resistance phenotypes of TB isolates, as it outperformed the rest of the models.
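
As a rough illustration of the two-stage ensemble described above (per-feature-set Gradient Boosting models whose outputs form a combined dataset for a second-level learner), here is a minimal scikit-learn sketch. The feature matrices, labels and the choice of Random Forest as the second-level model are assumptions for illustration; the authors' WGS feature extraction and exact stacking procedure are not given in the abstract.

```python
# Hedged sketch of a two-stage ensemble: one Gradient Boosting model per feature
# set (stand-ins for F1 and F2), stacked into a second-level Random Forest.
# X_f1, X_f2 and y are synthetic placeholders, not real TB isolate data.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

X_f1, y = make_classification(n_samples=600, n_features=200, random_state=0)  # stand-in for F1
X_f2, _ = make_classification(n_samples=600, n_features=50, random_state=1)   # stand-in for F2

X1_tr, X1_te, X2_tr, X2_te, y_tr, y_te = train_test_split(
    X_f1, X_f2, y, test_size=0.3, random_state=42, stratify=y
)

# Stage 1: one Gradient Boosting model per feature set.
gb_f1 = GradientBoostingClassifier(random_state=0).fit(X1_tr, y_tr)
gb_f2 = GradientBoostingClassifier(random_state=0).fit(X2_tr, y_tr)

# Stage 2: their predicted probabilities form the combined "ensemble" dataset,
# on which a second-level classifier is trained. (A careful implementation
# would use out-of-fold predictions here to avoid leakage.)
def stack(X1, X2):
    return np.column_stack([gb_f1.predict_proba(X1)[:, 1],
                            gb_f2.predict_proba(X2)[:, 1]])

meta = RandomForestClassifier(n_estimators=200, random_state=0).fit(stack(X1_tr, X2_tr), y_tr)
print("ensemble accuracy:", accuracy_score(y_te, meta.predict(stack(X1_te, X2_te))))
```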

Keywords: machine learning, MTB, WGS, drug resistant TB

Procedia PDF Downloads 24
222 Simons, Ehrlichs and the Case for Polycentricity – Why Growth-Enthusiasts and Growth-Sceptics Must Embrace Polycentricity

Authors: Justus Enninga

Abstract:

Enthusiasts and skeptics about economic growth do not have much in common in their preferences for institutional arrangements that solve ecological conflicts. This paper argues that agreement between the two opposing schools can be found in the Bloomington School's concept of polycentricity. Growth-enthusiasts, who will be referred to as Simons after the economist Julian Simon, and growth-skeptics, named Ehrlichs after the ecologist Paul R. Ehrlich, both profit from a governance structure in which many officials and decision structures are assigned limited and relatively autonomous prerogatives to determine, enforce and alter legal relationships. The paper advances this argument in four steps. First, it provides clarification of what Simons and Ehrlichs mean when they talk about growth and what the arguments for and against growth-enhancing or degrowth policies are for them and for the other side. Secondly, the paper advances the concept of polycentricity as first introduced by Michael Polanyi and later refined for the study of governance by the Bloomington School of institutional analysis around the Nobel Prize laureate Elinor Ostrom. The Bloomington School defines polycentricity as a non-hierarchical, institutional, and cultural framework that makes possible the coexistence of multiple centers of decision making with different objectives and values, and that sets the stage for an evolutionary competition between the complementary ideas and methods of those different decision centers. In the third and fourth parts, it is shown how the concept of polycentricity is of crucial importance for growth-enthusiasts and growth-skeptics alike. The shorter third part reviews the literature on growth-enhancing policies and argues that large parts of it already accept that polycentric forms of governance like markets, the rule of law and federalism are an important part of economic growth. Part four delves into the more nuanced question of how a stagnant steady-state economy, or even an economy that de-grows, will still find polycentric governance desirable. While the majority of degrowth proposals follow a top-down approach requiring direct governmental control, a contrasting bottom-up approach is advanced here. A decentralized, polycentric approach is desirable because it allows for the utilization of tacit information dispersed in society and an institutionalized discovery process for new solutions to the problem of ecological collective action – no matter whether you belong to the Simons or the Ehrlichs in a green political economy.

Keywords: degrowth, green political theory, polycentricity, institutional robustness

Procedia PDF Downloads 152
221 Biogas Enhancement Using Iron Oxide Nanoparticles and Multi-Wall Carbon Nanotubes

Authors: John Justo Ambuchi, Zhaohan Zhang, Yujie Feng

Abstract:

The quick development and use of nanotechnology have resulted in the massive use of various nanoparticles, such as iron oxide nanoparticles (IONPs) and multi-wall carbon nanotubes (MWCNTs). Thus, this study investigated the role of IONPs and MWCNTs in enhancing bioenergy recovery. Results show that IONPs at a concentration of 750 mg/L and MWCNTs at a concentration of 1500 mg/L induced faster substrate utilization and biogas production rates than the control. IONPs exhibited higher chemical oxygen demand (COD) removal efficiency than MWCNTs, while, on the contrary, the performance of MWCNTs in biogas generation was more remarkable than that of IONPs. Furthermore, scanning electron microscopy (SEM) investigation revealed that extracellular polymeric substances (EPS) excreted from anaerobic granular sludge (AGS) interacted with the nanoparticles. This interaction created a protective barrier around the microbial consortia, hence reducing their cytotoxicity. Microbial community analyses revealed the predominance of bacteria belonging to Anaerolineaceae and Longilinea; their role in the biodegradation of the substrate could have been highly boosted by the nanoparticles. The predominance of archaea of the genera Methanosaeta and Methanobacterium enhanced the methanation process. The presence of bacteria of the genus Geobacter was also reported; their presence might have significantly contributed to direct interspecies electron transfer in the system. Exposure of AGS to the nanoparticles promoted direct interspecies electron transfer between the anaerobic fermenting bacteria and their counterpart methanogens during the anaerobic digestion process. These results provide useful and insightful information for understanding the response of microorganisms to IONPs and MWCNTs in the complex natural environment.

Keywords: anaerobic granular sludge, extracellular polymeric substances, iron oxide nanoparticles, multi-wall carbon nanotubes

Procedia PDF Downloads 268
220 A Modular Reactor for Thermochemical Energy Storage Examination of Ettringite-Based Materials

Authors: B. Chen, F. Kuznik, M. Horgnies, K. Johannes, V. Morin, E. Gengembre

Abstract:

Renewable energy has received more attention since the adoption of the Paris Agreement against climate change. Solar-based technology is supposed to be one of the most promising green energy technologies for residential buildings, given its wide thermal use for hot water and heating. However, the seasonal mismatch between its production and consumption means that buildings need an energy storage system to improve the efficiency of renewable energy use. Indeed, different kinds of energy storage systems using sensible or latent heat already exist. Considering the energy dissipation during storage and the low energy density of these two methods, thermochemical energy storage is recommended instead. Recently, ettringite (3CaO∙Al₂O₃∙3CaSO₄∙32H₂O) based materials have been reported as potential thermochemical storage materials because of their high energy density (~500 kWh/m³), low material cost (700 €/m³) and low storage temperature (~60-70°C), compared to reported salt hydrates like SrBr₂·6H₂O (42 k€/m³, ~80°C), LaCl₃·7H₂O (38 k€/m³, ~100°C) and MgSO₄·7H₂O (5 k€/m³, ~150°C). Therefore, they could be widely used in the building sector, coupled to standard solar panel systems. On the other hand, the lack of extensive examination leads to poor knowledge of their thermal properties and limits the maturity of this technology. The aim of this work is to develop a modular reactor adapted to the thermal characterization of ettringite-based material particles of different sizes. The materials filled into the reactor can be self-compacted vertically to ensure that hot or humid air passes through homogeneously. Additionally, the quick assembly and modification of the reactor, like LEGO™ plastic blocks, make it suitable for distinct thermochemical energy storage material samples of different weights (from a few grams to several kilograms). In our case, the quantity of stored and released energy, the best working conditions and even the chemical durability of ettringite-based materials have been investigated.

Keywords: dehydration, ettringite, hydration, modular reactor, thermochemical energy storage

Procedia PDF Downloads 109
219 Development and Validation of Work Movement Task Analysis: Part 1

Authors: Mohd Zubairy Bin Shamsudin

Abstract:

Work-related Musculoskeletal Disorders (WMSDs) are one of the occupational health problems encountered by workers all over the world. In Malaysia, there is an increasing trend over the years, particularly in the manufacturing sectors. Current methods to observe workplace WMSDs are self-report questionnaires, observation and direct measurement. Observational methods are most frequently used by researchers and practitioners because they are simple, quick and versatile when applied at the worksite. However, some limitations have been identified, e.g. some approaches do not cover a wide spectrum of biomechanical activity and are not sufficiently sensitive to assess the actual risks. This paper elucidates the development of the Work Movement Task Analysis (WMTA), an observational tool for industrial practitioners, especially untrained personnel, to assess WMSD risk factors and provide a basis for suitable intervention. The first stage of the development protocol involved literature reviews, a practitioner survey, and tool validation and reliability testing. A total of six themes/comments were received in the face validity stage. The new revision of the WMTA consisted of four sections covering posture (neck, back, shoulder, arms, and legs) and the associated risk factors: movement, load, coupling and basic environmental factors (lighting, noise, odor, heat and slippery floors). The inter-rater reliability study shows substantial agreement among raters, with K = 0.70. Meanwhile, WMTA validation shows a significant association between WMTA scores and self-reported pain or discomfort for the back, shoulder & arms and knee & legs, with p<0.05. This tool is expected to provide a new workplace ergonomic observational tool to assess WMSDs for the next stage of the case study.
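
The reported inter-rater reliability (K = 0.70) corresponds to Cohen's kappa computed over paired ratings. A minimal sketch, with invented ratings from two hypothetical raters, is shown below; the study's own field ratings are not reproduced.

```python
# Hedged sketch: Cohen's kappa for two raters applying a WMTA-style checklist.
# The ratings are invented placeholders.
from sklearn.metrics import cohen_kappa_score

rater_a = [1, 0, 1, 1, 0, 1, 0, 0, 1, 1, 0, 1]  # 1 = risk factor judged present
rater_b = [1, 0, 1, 0, 0, 1, 0, 1, 1, 1, 0, 1]

kappa = cohen_kappa_score(rater_a, rater_b)
print(f"Cohen's kappa = {kappa:.2f}")  # 0.61-0.80 is usually read as 'substantial' agreement
```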

Keywords: assessment, biomechanics, musculoskeletal disorders, observational tools

Procedia PDF Downloads 448
218 Save Lives: The Application of Geolocation-Awareness Service in Iranian Pre-hospital EMS Information Management System

Authors: Somayeh Abedian, Pirhossein Kolivand, Hamid Reza Lornejad, Amin Karampour, Ebrahim Keshavarz Safari

Abstract:

For emergency and relief service providers such as pre-hospital emergency services, quick arrival at the scene of an accident or any EMS mission is one of the most important requirements of effective service delivery. Response time (the interval between the time of the call and the time of arrival on scene) is a critical factor in determining the quality of pre-hospital Emergency Medical Services (EMS). This is especially important for heart attack, stroke, or accident patients. Location-based e-services can be broadly defined as any service that provides information pertinent to the current location of an active mobile handset, or the precise address of a landline phone call, at a specific time window, regardless of the underlying delivery technology used to convey the information. According to research, one of the effective methods of meeting this goal is determining the location of the caller via the cooperation of landline and mobile phone operators in the country. The follow-up by the Communications Regulatory Authority (CRA) has resulted in the receipt of two separate secured electronic web services. Thus, to ensure human privacy, a secure technical architecture was required for launching the services in the pre-hospital EMS information management system. In addition, to quicken medics’ arrival at the patient's bedside, rescue vehicles should make use of an intelligent transportation system to estimate road traffic using a GPS-based mobile navigation system independent of the Internet. This paper seeks to illustrate the architecture of the practical national model used by the Iranian EMS organization.
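
To illustrate the dispatch side of the idea (not the actual GLIS web services, whose interfaces are not public), the sketch below assumes the caller's coordinates have already been returned by the location inquiry and simply selects the nearest available ambulance by great-circle distance; all coordinates are hypothetical.

```python
# Hedged sketch: pick the nearest ambulance to a geolocated caller using the
# haversine formula. Coordinates and unit names are hypothetical placeholders.
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometres."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))

caller = (35.6997, 51.3380)                      # hypothetical caller location
ambulances = {"unit_12": (35.7219, 51.3347),
              "unit_07": (35.6892, 51.3890),
              "unit_03": (35.7448, 51.3753)}

nearest = min(ambulances, key=lambda u: haversine_km(*caller, *ambulances[u]))
print("dispatch:", nearest,
      f"({haversine_km(*caller, *ambulances[nearest]):.1f} km away)")
```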

Keywords: response time, geographic location inquiry service (GLIS), location-based service (LBS), emergency medical services information system (EMSIS)

Procedia PDF Downloads 144
217 The Importance of Artificial Intelligence in Various Healthcare Applications

Authors: Joshna Rani S., Ahmadi Banu

Abstract:

Artificial Intelligence (AI) has a significant role to play in the healthcare offerings of the future. In the form of machine learning, it is the primary capability behind the development of precision medicine, widely agreed to be a sorely needed advance in care. Although early efforts at providing diagnosis and treatment recommendations have proven challenging, we anticipate that AI will ultimately master that domain as well. Given the rapid advances in AI for imaging analysis, it seems likely that most radiology and pathology images will eventually be examined by a machine. Speech and text recognition are already used for tasks like patient communication and the capture of clinical notes, and their use will increase. The greatest challenge for AI in these healthcare domains is not whether the technologies will be capable enough to be useful, but rather ensuring their adoption in daily clinical practice. For widespread adoption to happen, AI systems must be approved by regulators, integrated with EHR systems, standardized to a sufficient degree that similar products work similarly, taught to clinicians, paid for by public or private payer organizations, and updated over time in the field. These challenges will ultimately be overcome, but doing so will take much longer than it will take for the technologies themselves to mature. Therefore, we expect to see limited use of AI in clinical practice within 5 years and more extensive use within 10 years. It also seems increasingly evident that AI systems will not replace human clinicians on a large scale, but rather will augment their efforts to care for patients. Over time, human clinicians may move toward tasks and job designs that draw on uniquely human skills like empathy, persuasion, and big-picture integration. Perhaps the only healthcare providers who will risk their careers over time will be those who refuse to work alongside AI.

Keywords: artificial intelligence, health care, breast cancer, AI applications

Procedia PDF Downloads 158
216 Developing an Intervention Program to Promote Healthy Eating in a Catering System Based on Qualitative Research Results

Authors: O. Katz-Shufan, T. Simon-Tuval, L. Sabag, L. Granek, D. R. Shahar

Abstract:

Meals provided by catering systems are a common source of workers' nutrition and were found to contribute high amounts of calories and fat. Thus, eating catering food daily can lead to overweight and chronic diseases. On the other hand, the institutional dining room may be an ideal environment for the implementation of intervention programs that promote healthy eating. This may improve diners' lifestyle and reduce the prevalence of overweight, obesity and chronic diseases. The significance of this study is in developing an intervention program based on the diners’ dietary habits, preferences and their attitudes towards various intervention programs. In addition, a successful catering-based intervention program may have a significant effect simultaneously on a large group of diners, leading to improved nutrition, a healthier lifestyle, and disease prevention on a large scale. In order to develop the intervention program, we conducted a qualitative study. We interviewed 13 diners who eat regularly at catering systems, using a semi-structured interview. The interviews were recorded, transcribed and then analyzed by the thematic method, which identifies, analyzes and reports themes within the data. The interviews revealed several major themes, including the diners' expectation to be provided with healthy food choices; their request for nutrition-expert involvement in planning the meals; and the diners' feeling that there is a conflict between the sensory attractiveness of the food and its nutritional quality. In the context of catering-based intervention programs, the diners prefer scientific and clear messages focusing on labeling healthy dishes only, as opposed to labeling unhealthy dishes; they were also interested in a nutrition education program to accompany the intervention program. Based on these findings, we have developed an intervention program that includes changes in the food served, such as replacing several menu items and nutritionally improving some of the recipes, as well as environmental changes such as changing the location of some food items presented on the buffet, placing positive nutritional labels on healthy dishes and running an ongoing healthy nutrition campaign, all accompanied by a nutrition education program. The intervention program is currently being tested for its impact on health outcomes and its cost-effectiveness.

Keywords: catering system, food services, intervention, nutrition policy, public health, qualitative research

Procedia PDF Downloads 166
215 Easy Way of Optimal Process-Storage Network Design

Authors: Gyeongbeom Yi

Abstract:

The purpose of this study is to introduce an analytic solution for determining the optimal capacity (lot size) of a multiproduct, multistage production and inventory system to meet the finished product demand. Reasonable decision-making about the capacity of processes and storage units is an important subject for industry. The industrial solution for this subject is to use the classical economic lot sizing method, the EOQ/EPQ (Economic Order Quantity/Economic Production Quantity) model, combined with practical experience. However, the unrealistic material flow assumption of the EOQ/EPQ model is not suitable for chemical plant design with highly interlinked processes and storage units. This study overcomes the limitation of the classical lot sizing method, which was developed on the basis of the single-product, single-stage assumption. The superstructure of the plant considered consists of a network of serially and/or parallelly interlinked processes and storage units. The processes involve chemical reactions with multiple feedstock materials and multiple products, as well as mixing, splitting or transportation of materials. The objective function for optimization is to minimize the total cost, composed of setup and inventory holding costs as well as the capital costs of constructing processes and storage units. A novel production and inventory analysis method, the PSW (Periodic Square Wave) model, is applied. The advantage of the PSW model comes from the fact that it provides a set of simple analytic solutions in spite of a realistic description of the material flow between processes and storage units. The resulting simple analytic solutions can greatly enhance proper and quick investment decisions for the plant design and operation problems confronted in diverse economic situations.
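
For context on the classical baseline the PSW model improves upon, the sketch below evaluates the textbook EOQ formula and its setup-plus-holding cost with illustrative numbers; the paper's own analytic solutions for networked processes and storage units are not reproduced here.

```python
# Hedged sketch of the classical EOQ baseline: EOQ = sqrt(2*D*S/H), with demand
# rate D, setup (ordering) cost S and unit holding cost H. All values are
# illustrative assumptions only.
from math import sqrt

D = 12000.0   # annual demand (units/year), hypothetical
S = 450.0     # setup or ordering cost per lot
H = 2.5       # holding cost per unit per year

eoq = sqrt(2 * D * S / H)
total_cost = D / eoq * S + eoq / 2 * H   # setup cost + average inventory holding cost
print(f"economic lot size: {eoq:.0f} units, annual setup+holding cost: {total_cost:.0f}")
```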

Keywords: analytic solution, optimal design, process-storage network

Procedia PDF Downloads 303
214 Motivation in Online Instruction

Authors: David Whitehouse

Abstract:

Some of the strengths of online teaching include flexibility, creativity, and comprehensiveness. A challenge can be motivation. How can an instructor repeating the same lessons over and over, day in and day out, year after year, maintain motivation? Enthusiasm? Does motivating the student and creating enthusiasm in class build the same things inside the instructor? The answers lie in the adoption of what I label EUQ—The Empathy and Understanding Quotient. In the online environment, students who are adults have many demands on their time: civilian careers, families (spouse, children, older parents), and sometimes even military service. Empathetic responses on the part of the instructor will lead to open and honest communication on the part of the student, which will lead to understanding on the part of the instructor and a rise in motivation in both parties. Understanding the demands can inform an instructor’s relationship with the student throughout the temporal parameters of classwork. In practicing EUQ, instructors can build motivation in their students and find internal motivation in an enhanced classroom dynamic. The presentation will look at what motivates a student to accomplish more than the minimum required and how that can lead to excellent results for an instructor’s own motivation. Through direct experience of having students give high marks on post-class surveys and via direct messaging, the presentation will focus on how applying EUQ in granting extra time, searching for intent while grading, communicating with students via Quick Notes, responses in Forums, comments in Assignments, and comments in grading areas - - - how applying these things infuses enthusiasm and energy in the instructor which drive creativity in teaching. Three primary ways of communicating with students will be given as examples. The positive response and negative response each for a Forum, an Assignment, and a Message will be explored. If there is time, participants will be invited to craft their own EUQ responses in a role playing exercise involving two common classroom scenarios—late work and plagiarism.

Keywords: education, instruction, motivation, online, teaching

Procedia PDF Downloads 151
213 Towards Creative Movie Title Generation Using Deep Neural Models

Authors: Simon Espigolé, Igor Shalyminov, Helen Hastie

Abstract:

Deep machine learning techniques including deep neural networks (DNN) have been used to model language and dialogue for conversational agents to perform tasks, such as giving technical support and also for general chit-chat. They have been shown to be capable of generating long, diverse and coherent sentences in end-to-end dialogue systems and natural language generation. However, these systems tend to imitate the training data and will only generate the concepts and language within the scope of what they have been trained on. This work explores how deep neural networks can be used in a task that would normally require human creativity, whereby the human would read the movie description and/or watch the movie and come up with a compelling, interesting movie title. This task differs from simple summarization in that the movie title may not necessarily be derivable from the content or semantics of the movie description. Here, we train a type of DNN called a sequence-to-sequence model (seq2seq) that takes as input a short textual movie description and some information on, e.g., the genre of the movie. It then learns to output a movie title. The idea is that the DNN will learn certain techniques and approaches that the human movie titler may deploy that may not be immediately obvious to the human eye. To give an example of a generated movie title, for the movie synopsis: ‘A hitman concludes his legacy with one more job, only to discover he may be the one getting hit.’; the original, true title is ‘The Driver’ and the one generated by the model is ‘The Masquerade’. A human evaluation was conducted where the DNN output was compared to the true human-generated title, as well as a number of baselines, on three 5-point Likert scales: ‘creativity’, ‘naturalness’ and ‘suitability’. Subjects were also asked which of the two systems they preferred. The scores of the DNN model were comparable to the scores of the human-generated movie title, with means m=3.11, m=3.12, respectively. There is room for improvement in these models as they were rated significantly less ‘natural’ and ‘suitable’ when compared to the human title. In addition, the human-generated title was preferred overall 58% of the time when pitted against the DNN model. These results, however, are encouraging given the comparison with a highly considered, well-crafted human-generated movie title. Movie titles go through a rigorous process of assessment by experts and focus groups, who have watched the movie. This process is in place due to the large amount of money at stake and the importance of creating an effective title that captures the audiences’ attention. Our work shows progress towards automating this process, which in turn may lead to a better understanding of creativity itself.
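
A minimal sketch of the kind of encoder-decoder (seq2seq) model described above is given below in PyTorch, trained on a single toy description-title pair with teacher forcing. The vocabulary, dimensions and training loop are placeholder assumptions; the authors' actual architecture, genre conditioning and training corpus are not reproduced.

```python
# Hedged sketch of a GRU-based sequence-to-sequence model: it encodes a (toy)
# movie description and decodes a title token by token.
import torch
import torch.nn as nn

vocab = ["<pad>", "<sos>", "<eos>", "a", "hitman", "final", "job", "the", "masquerade", "driver"]
stoi = {w: i for i, w in enumerate(vocab)}

class Seq2Seq(nn.Module):
    def __init__(self, vocab_size, emb=32, hid=64):
        super().__init__()
        self.emb = nn.Embedding(vocab_size, emb)
        self.encoder = nn.GRU(emb, hid, batch_first=True)
        self.decoder = nn.GRU(emb, hid, batch_first=True)
        self.out = nn.Linear(hid, vocab_size)

    def forward(self, src, tgt_in):
        _, h = self.encoder(self.emb(src))        # hidden state summarising the description
        dec_out, _ = self.decoder(self.emb(tgt_in), h)
        return self.out(dec_out)                  # logits over the vocabulary

# One toy pair: description -> title, with teacher forcing on the decoder input.
src = torch.tensor([[stoi[w] for w in ["a", "hitman", "final", "job"]]])
tgt_in = torch.tensor([[stoi[w] for w in ["<sos>", "the", "masquerade"]]])
tgt_out = torch.tensor([[stoi[w] for w in ["the", "masquerade", "<eos>"]]])

model = Seq2Seq(len(vocab))
loss_fn = nn.CrossEntropyLoss()
opt = torch.optim.Adam(model.parameters(), lr=1e-2)

for _ in range(100):                              # deliberately overfit the single example
    logits = model(src, tgt_in)
    loss = loss_fn(logits.reshape(-1, len(vocab)), tgt_out.reshape(-1))
    opt.zero_grad(); loss.backward(); opt.step()
print("final training loss:", float(loss))
```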

Keywords: creativity, deep machine learning, natural language generation, movies

Procedia PDF Downloads 304
212 CO₂ Storage Capacity Assessment of Deep Saline Aquifers in Malaysia

Authors: Radzuan Junin, Dayang Zulaika A. Hasbollah

Abstract:

The increasing amount of greenhouse gases in the atmosphere has recently become one of the most discussed topics in relation to the world's concern about climate change. Emissions from developing countries (such as Malaysia) are now seen to surpass those of developed countries due to rapid economic development in recent decades. This paper presents an assessment of potential storage site suitability and storage capacity for CO2 sequestration in the sedimentary basins of Malaysia. This study is the first of its kind to identify potential storage sites and assess CO2 storage capacity within the deep saline aquifers of the country. The CO2 storage capacity assessment for saline formations was conducted based on the method for quick assessment of CO2 storage capacity in closed and semi-closed saline formations, modified to suit the geological setting of Malaysia. Then, an integrated approach involving geographic information system (GIS) analysis and field data assessment was adopted to identify the potential storage sites and their capacity for CO2 sequestration. This study concentrated on the assessment of the major sedimentary basins of Malaysia, both onshore and offshore, where potential geological formations in which CO2 could be stored exist below 800 meters and where suitable sealing formations are present. Based on the regional study and the amount of data available, 14 sedimentary basins around Malaysia have been identified as potential CO2 storage sites. Meanwhile, from the screening and ranking exercises, it is obvious that the Malay Basin, Central Luconia Province, West Baram Delta and Balingian Province are respectively ranked as the top four in the ranking system for CO2 storage. 27% of the sedimentary basins in Malaysia were evaluated as high-potential areas for CO2 storage. This study should provide a basis for further work to reduce the uncertainty in these estimates and also provide support to policy makers in the future planning of carbon capture and sequestration (CCS) projects in Malaysia.
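
Quick-assessment methods for closed saline formations of the kind cited above constrain capacity by the allowable pressure build-up and by pore and brine compressibility. The sketch below shows that form of calculation with entirely assumed parameter values; it is an illustration of the arithmetic, not a reproduction of the study's basin data or results.

```python
# Hedged sketch of a pressure-constrained capacity estimate for a closed saline
# formation: injected CO2 is accommodated by compressing pore water and pore
# space, so capacity scales with compressibility and allowable pressure rise.
# Every parameter value below is an assumption for illustration only.
rho_co2   = 700.0     # kg/m3, CO2 density at storage conditions (assumed)
area      = 2.0e9     # m2, formation area (assumed)
thickness = 50.0      # m, net formation thickness (assumed)
porosity  = 0.2       # - (assumed)
c_pore    = 4.5e-10   # 1/Pa, pore compressibility (assumed)
c_water   = 4.0e-10   # 1/Pa, brine compressibility (assumed)
dp_max    = 3.0e6     # Pa, allowable pressure build-up (assumed)

pore_volume = area * thickness * porosity
storage_volume = (c_pore + c_water) * dp_max * pore_volume   # m3 of CO2 at depth
storage_mass_mt = rho_co2 * storage_volume / 1e9             # megatonnes

print(f"indicative closed-system capacity: {storage_mass_mt:.0f} Mt CO2")
```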

Keywords: CO₂ storage, deep saline aquifer, GIS, sedimentary basin

Procedia PDF Downloads 326
211 A Bicycle Based Model of Prehospital Care Implanted in Northeast of the Brazil: Initial Experience

Authors: Odaleia de O. Farias, Suzelene C. Marinho, Ecleidson B. Fragoso, Daniel S. Lima, Francisco R. S. Lira, Lara S. Araújo, Gabriel dos S. D. Soares

Abstract:

In populous cities, prehospital care services that use vehicles other than ambulances are needed in order to reduce costs and improve response times to occurrences in areas with a large concentration of people, such as leisure and tourism spaces. In this context, a program called BIKE VIDA, an innovative quick-access and assistance program, was implemented. The aim of this study is to describe the implementation and the initial profile of occurrences attended by an urgency/emergency pre-hospital care service delivered by paramedics on bicycles. It is a cross-sectional, descriptive study carried out in the city of Fortaleza, Ceara, Brazil. The data included service records from July to August 2017. Ethical aspects were respected. The service covers a perimeter of 4.5 km, divided into three areas with a perimeter of 1.5 km for each paramedic, operating from 5 am to 9 pm. Materials transported by the bicycles include an automated external defibrillator, portable oxygen, an oximeter, a cervical collar, a stethoscope, a sphygmomanometer, dressing and immobilization materials, and personal protective equipment. Occurrences are requested directly by calling the emergency number 192 or by directly approaching the professional. In the first month of the program, there were 93 emergencies/urgencies, mainly in the daytime period (71.0%), in males (59.7%), and in the age range of 26 to 45 years (46.2%). The main nature of the occurrences was traumatic incidents (53.3%). Most of the cases (88.2%) did not require ambulance transport to the hospital, and there were two deaths. Pre-hospital service by bicycle is an innovative strategy in Brazil and has shown to be promising in terms of reducing costs and improving the quality of the services offered.

Keywords: emergency, response time, prehospital care, urgency

Procedia PDF Downloads 166
210 Quality Tools for Shaping Quality of Learning and Teaching in Education and Training

Authors: Renga Rao Krishnamoorthy, Raihan Tahir

Abstract:

The quality of classroom learning and teaching delivery has been, and will continue to be, debated at various levels worldwide. The regional cooperation programme to improve the quality and labour market orientation of Technical and Vocational Education and Training (RECOTVET) of the ‘Deutsche Gesellschaft für Internationale Zusammenarbeit’ (GIZ), in line with the Sustainable Development Goals (SDG), has taken the initiative in the development of quality TVET in the ASEAN region by developing the Quality Toolbox for Better TVET Delivery (Quality Toolbox). This initiative aims to provide quick and practical materials to trainers, instructors, and personnel involved in education and training at an institute to shape the quality of classroom learning and teaching. The Quality Toolbox for Better TVET Delivery was developed in three stages: literature review and development, validation, and finalization. The thematic areas in the Quality Toolbox were derived from the collective input of concerns and challenges raised in experts’ workshops through moderated sessions involving representatives of TVET institutes from nine ASEAN Member States (AMS). The sessions were facilitated by professional moderators and international experts. TVET practitioners representing the AMS further analysed and discussed the structure of the Quality Toolbox and the content of the thematic areas and outlined a set of specific requirements and recommendations. The application exercise of the Quality Toolbox was carried out by TVET institutes among the AMS. Experience-sharing sessions with the participating ASEAN countries were conducted virtually. The findings revealed that TVET institutes use two types of approaches in shaping the quality of learning and teaching, which can be described as inductive or deductive; that shaping the quality of learning and teaching is a non-linear process; and, finally, that Q-tools can be adopted and adapted to shape the quality of learning and teaching at TVET institutes in the following ways: improvement of institutional quality, improvement of teaching quality, and improvement of the organisation of learning and teaching for students and trainers. The Quality Toolbox has good potential to be used at education and training institutes to shape the quality of learning and teaching.

Keywords: AMS, GIZ, RECOTVET, quality tools

Procedia PDF Downloads 109
209 Evaluation of Dry Matter Yield of Panicum maximum Intercropped with Pigeonpea and Sesbania Sesban

Authors: Misheck Musokwa, Paramu Mafongoya, Simon Lorentz

Abstract:

Seasonal shortages of fodder during the dry season are a major constraint for smallholder livestock farmers in South Africa. To mitigate the shortage of fodder, legume trees can be intercropped with pastures, which can diversify the sources of feed and increase the amount of protein for grazing animals. The objective was to evaluate the dry matter yield of Panicum maximum and land productivity under different fodder production systems during the 2016/17-2017/18 seasons at Empangeni (28.6391° S and 31.9400° E). A randomized complete block design, replicated three times, was used; the treatments were sole Panicum maximum, Panicum maximum + Sesbania sesban, Panicum maximum + pigeonpea, sole Sesbania sesban, and sole pigeonpea. Three-month-old S. sesban seedlings were transplanted, whilst pigeonpea was direct seeded, both at a spacing of 1 m x 1 m. P. maximum seeds were drilled at a rate of 7.5 kg/ha with an inter-row spacing of 0.25 m, and in between the rows of trees P. maximum seeds were also drilled. The dry matter yield harvests were separated by six-month intervals. A 0.25 m² quadrat randomly placed at 3 points per plot was used as the sampling area when harvesting P. maximum. There was a significant difference (P < 0.05) across the 3 harvests and in total dry matter. P. maximum had a higher dry matter yield than both intercrops at the first harvest and in total, while the second and third harvests showed no significant difference from the pigeonpea intercrop. The results were in this order for all 3 harvests: P. maximum (541.2c, 1209.3b and 1557b kg ha⁻¹) ≥ P. maximum + pigeonpea (157.2b, 926.7b and 1129b kg ha⁻¹) > P. maximum + S. sesban (36.3a, 282a and 555a kg ha⁻¹). The total accumulated dry matter yield was P. maximum (3307c kg ha⁻¹) > P. maximum + pigeonpea (2212 kg ha⁻¹) ≥ P. maximum + S. sesban (874 kg ha⁻¹). There was a significant difference (P < 0.05) in seed yield for the trees: pigeonpea (1240.3 kg ha⁻¹) ≥ pigeonpea + P. maximum (862.7 kg ha⁻¹) > S. sesban (391.9 kg ha⁻¹) ≥ S. sesban + P. maximum. The Land Equivalent Ratio (LER) was in the following order: P. maximum + pigeonpea (1.37) > P. maximum + S. sesban (0.84) > pigeonpea (0.59) ≥ S. sesban (0.57) > P. maximum (0.26). The results indicate that it is beneficial to intercrop P. maximum with pigeonpea because of the higher land productivity. Planting grass with pigeonpea was more beneficial than S. sesban with grass or sole cropping in terms of easing the shortage of arable land; P. maximum + pigeonpea saves a substantial share of land (37%), which can subsequently be used for other crop production. Pigeonpea is recommended as an intercrop with P. maximum due to its higher LER and the combined production of livestock feed, human food, and firewood. Panicum grass is low in crude protein though high in carbohydrates, so there is a need to intercrop it with legume trees. A farmer who buys concentrates can reduce costs by combining P. maximum with pigeonpea, as this will provide a balanced diet at low cost.
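
The Land Equivalent Ratio behind the figures above sums each intercrop component's yield relative to its sole-crop yield. The sketch below redoes that arithmetic for the P. maximum + pigeonpea system using the totals reported in the abstract (grass dry matter and pigeonpea seed); treat it as an illustration of the formula rather than a re-analysis.

```python
# Hedged sketch of the Land Equivalent Ratio:
# LER = (intercrop grass yield / sole grass yield) + (intercrop legume yield / sole legume yield)
def ler(intercrop_a, sole_a, intercrop_b, sole_b):
    return intercrop_a / sole_a + intercrop_b / sole_b

# P. maximum + pigeonpea: grass dry matter (kg/ha) and pigeonpea seed (kg/ha) from the abstract
value = ler(intercrop_a=2212, sole_a=3307, intercrop_b=862.7, sole_b=1240.3)
print(f"P. maximum + pigeonpea LER ~ {value:.2f}")  # ~1.36, close to the reported 1.37
```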

Keywords: fodder, livestock, productivity, smallholder farmers

Procedia PDF Downloads 125
208 Optimization-Based Design Improvement of Synchronizer in Transmission System for Efficient Vehicle Performance

Authors: Sanyka Banerjee, Saikat Nandi, P. K. Dan

Abstract:

The synchronizer, as an integral part of the gearbox, is a key element of the transmission system in automotive applications. The performance of the synchronizer affects transmission efficiency and driving comfort. The synchronizing mechanism, as a major component of the transmission system, must be capable of preventing vibration and noise in the gears. Improving gear shifting efficiency, with the aim of achieving smooth, quick and energy-efficient power transmission, remains a challenge for the automotive industry. The performance of the synchronizer depends on the features and characteristics of its sub-components, and therefore an analysis of the contribution of such characteristics is necessary. An important exercise is to identify all the characteristics or factors associated with the modeling and analysis, and for this purpose the literature was reviewed, rather extensively, to study the mathematical models formulated from them. It has been observed that certain factors are rather common across models; however, there are a few factors which have specifically been selected for individual models, as reported. In order to obtain a more realistic model, an attempt has been made here to identify and assimilate practically all possible factors which may be considered in formulating the model more comprehensively. A simulation study for such analysis, formulated as a block model, has been carried out in a reliable environment like MATLAB. A lower synchronization time is desirable and hence has been considered here as the output factor in the simulation modeling for evaluating transmission efficiency. An improved synchronizer model requires optimized values of the sub-component design parameters. A parametric optimization utilizing Taguchi’s design-of-experiments-based response data and their analysis has been carried out for this purpose. The effectiveness of the optimized parameters for improved synchronizer performance has been validated by a simulation study of the synchronizer block model with the improved parameter values as input, for better transmission efficiency and driver comfort.
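
A common way to evaluate Taguchi design-of-experiments response data for a "smaller-the-better" response such as synchronization time is the signal-to-noise ratio S/N = -10·log10(mean(y²)). The sketch below applies it to invented run data; the paper's orthogonal array, factors and measured responses are not given in the abstract, so this is only an illustration of that analysis step.

```python
# Hedged sketch: smaller-the-better Taguchi S/N ratio over invented sync-time data.
import numpy as np

# rows: design-of-experiments runs; columns: repeated measurements of sync time (s)
sync_times = np.array([
    [0.42, 0.45, 0.44],
    [0.38, 0.36, 0.39],
    [0.51, 0.49, 0.53],
])

sn_ratio = -10.0 * np.log10(np.mean(sync_times**2, axis=1))
best_run = int(np.argmax(sn_ratio))   # higher S/N = shorter, more consistent sync time
print("S/N per run:", np.round(sn_ratio, 2), "-> best run:", best_run)
```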

Keywords: design of experiments, modeling, parametric optimization, simulation, synchronizer

Procedia PDF Downloads 279
207 Development of Map of Gridded Basin Flash Flood Potential Index: GBFFPI Map of QuangNam, QuangNgai, DaNang, Hue Provinces

Authors: Le Xuan Cau

Abstract:

Flash floods occur over short rainfall intervals, from 1 hour to 12 hours, in small and medium basins. Flash floods typically have two characteristics: large water flow and high flow velocity. A flash flood occurs at a hill valley site (a strip of lowland in the terrain) in a catchment with a large enough contributing area, a steep basin slope, and heavy rainfall. The risk of flash floods is determined through the Gridded Basin Flash Flood Potential Index (GBFFPI). The Flash Flood Potential Index (FFPI) is determined from the terrain slope flash flood index, soil erosion flash flood index, land cover flash flood index, land use flash flood index, and rainfall flash flood index. To determine the GBFFPI, each cell of the map is considered as the outlet of a water accumulation basin, and the GBFFPI of the cell is determined as the basin-averaged value of the FFPI over the corresponding water accumulation basin. Based on GIS, a tool is developed to compute the GBFFPI using the ArcObjects SDK for .NET. The GBFFPI maps are built in two types: GBFFPI including the rainfall flash flood index (for real-time flash flood warning) or GBFFPI excluding the rainfall flash flood index. The GBFFPI tool can be used to determine high flash flood potential sites in a large region as quickly as possible. The GBFFPI is an improvement over the conventional FFPI: its advantage is that it takes into account the basin response (the interaction of cells) and determines more realistic flash flood sites (strips of lowland in the terrain), while the conventional FFPI considers single cells and does not account for the interaction between cells. The GBFFPI map of QuangNam, QuangNgai, DaNang, and Hue is built and exported to Google Earth. The obtained map demonstrates the scientific basis of the GBFFPI.
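
A minimal sketch of the basin-averaging idea is given below: each cell is treated as the outlet of its upstream drainage area, and its GBFFPI is the mean FFPI over that area. The tiny grid, FFPI values and downstream-neighbour table are invented; the actual tool works on real DEM-derived flow directions with ArcObjects/.NET.

```python
# Hedged sketch: basin-averaged FFPI on a toy 3x3 grid with made-up flow directions.
import numpy as np

ffpi = np.array([[0.6, 0.7, 0.5],
                 [0.4, 0.8, 0.6],
                 [0.3, 0.9, 0.7]])

# downstream neighbour of each cell (None = grid outlet), a stand-in for D8 flow directions
downstream = {
    (0, 0): (1, 0), (0, 1): (1, 1), (0, 2): (1, 1),
    (1, 0): (2, 1), (1, 1): (2, 1), (1, 2): (2, 1),
    (2, 0): (2, 1), (2, 1): None,   (2, 2): (2, 1),
}

sums = np.zeros_like(ffpi)
counts = np.zeros_like(ffpi)
for cell, value in np.ndenumerate(ffpi):
    node = cell
    while node is not None:        # each cell contributes to every outlet downstream of it
        sums[node] += value
        counts[node] += 1
        node = downstream[node]

gbffpi = sums / counts             # basin-averaged FFPI at each cell
print(np.round(gbffpi, 2))
```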

Keywords: ArcObjects SDK for NET, basin average value of FFPI, gridded basin flash flood potential index, GBFFPI map

Procedia PDF Downloads 349
206 Selecting The Contractor using Multi Criteria Decision Making in National Gas Company of Lorestan Province of Iran

Authors: Fatemeh Jaferi, Moslem Parsa, Heshmatolah Shams Khorramabadi

Abstract:

In this modern, fluctuating world, organizations need to outsource some parts of their activities (projects) to providers in order to respond quickly to their changing requirements. In fact, a number of companies and institutes have contractors execute their projects and apply specific criteria in contractor selection. Therefore, a set of scientific tools is needed to select the best contractors to execute a project according to appropriate criteria. Multi-criteria decision making (MCDM) has been employed in the present study as a powerful tool for ranking and selecting the appropriate contractor. In this study, the assignment of a second-source (civil) project to contractors in the National Gas Company of Lorestan Province (Iran) was examined, and five civil companies were evaluated. The evaluation criteria include executive experience, qualification of technical staff, good experience and the company's rating, technical interview, affordability, and equipment and machinery. The criteria weights are found through experts' opinions using AHP, and the contractors are ranked through TOPSIS and AHP. The order in which contractors are ranked based on the MCDM methods differs when the formula used in the study is changed. In the next phase, the number of criteria and their weights were subjected to sensitivity analysis using AHP. Adding each criterion changed the contractors' ranking; similarly, changing the weights resulted in a change in the ranking. Adopting the stated strategy showed not only that an appropriate scientific method is available to select the most qualified contractors to execute gas projects, but also that great attention must be paid to choosing the criteria needed for contractor selection. Consequently, such projects are executed by the most qualified contractors, resulting in the optimal use of limited resources, accelerated project implementation, increased quality, and, finally, boosted organizational efficiency.
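
As an illustration of the ranking step, the sketch below runs TOPSIS with AHP-style weights over an invented 5-contractor decision matrix; the actual criteria scores and expert-derived weights used in the study are not reproduced.

```python
# Hedged sketch of TOPSIS ranking with assumed AHP weights. The decision matrix
# (5 contractors x 4 criteria) and the weights are placeholders.
import numpy as np

# rows: contractors C1..C5; columns: experience, technical staff, interview, price
X = np.array([[7.0, 8.0, 6.0, 3.2],
              [6.0, 7.0, 8.0, 2.9],
              [9.0, 6.0, 7.0, 3.5],
              [5.0, 9.0, 6.0, 2.7],
              [8.0, 7.0, 9.0, 3.1]])
weights = np.array([0.35, 0.25, 0.25, 0.15])    # assumed AHP weights, sum to 1
benefit = np.array([True, True, True, False])   # price is a cost criterion

V = weights * X / np.linalg.norm(X, axis=0)     # weighted, vector-normalised matrix
ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
anti  = np.where(benefit, V.min(axis=0), V.max(axis=0))

d_best  = np.linalg.norm(V - ideal, axis=1)
d_worst = np.linalg.norm(V - anti, axis=1)
closeness = d_worst / (d_best + d_worst)        # 1 = best possible contractor

for rank, i in enumerate(np.argsort(-closeness), start=1):
    print(f"rank {rank}: contractor C{i + 1} (closeness {closeness[i]:.3f})")
```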

Keywords: multi-criteria decision making, project, management, contractor selection, gas company

Procedia PDF Downloads 373
205 From Binary Solutions to Real Bio-Oils: A Multi-Step Extraction Story of Phenolic Compounds with Ionic Liquid

Authors: L. Cesari, L. Canabady-Rochelle, F. Mutelet

Abstract:

The thermal conversion of lignin produces bio-oils that contain many compounds with high added value, such as phenolic compounds. In order to extract these compounds efficiently, the possible use of the choline bis(trifluoromethylsulfonyl)imide [Choline][NTf2] ionic liquid was explored. To this end, a multistep approach was implemented. First, binary (phenolic compound and solvent) and ternary (phenolic compound, solvent and ionic liquid) solutions were investigated. Eight binary systems of phenolic compound and water were investigated at atmospheric pressure. These systems were quantified using the turbidity method and UV spectroscopy. Ternary systems (phenolic compound, water and [Choline][NTf2]) were investigated at room temperature and atmospheric pressure. After stirring, the solutions were left to settle, and a sample of each phase was collected. The analysis of the phases was performed using gas chromatography with an internal standard. These results were used to quantify the values of the interaction parameters of thermodynamic models. Then, extractions were performed on synthetic solutions to determine the influence of several operating conditions (temperature, kinetics, amount of [Choline][NTf2]). With this knowledge, it has been possible to design and simulate an extraction process composed of one extraction column and one flash. Finally, the extraction efficiency of [Choline][NTf2] was quantified with real bio-oils from lignin pyrolysis. Qualitative and quantitative analyses were performed using gas chromatography coupled to mass spectrometry and a flame ionization detector. The experimental measurements show that the extraction of phenolic compounds is efficient at room temperature, is quick, and does not require a large amount of [Choline][NTf2]. Moreover, the simulations of the extraction process demonstrate that the [Choline][NTf2] process requires less energy than an organic one. Finally, the efficiency of [Choline][NTf2] was confirmed in real situations through the experiments on lignin pyrolysis bio-oils.
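
To connect measured phase compositions to extraction performance, a single equilibrium stage with essentially immiscible phases gives an extracted fraction E = D·(S/F)/(1 + D·(S/F)), where D is the distribution coefficient and S/F the solvent-to-feed ratio. The sketch below evaluates this with assumed values; it is not the paper's measured data or its column simulation.

```python
# Hedged sketch: single-stage extraction yield from an assumed distribution coefficient.
def extracted_fraction(distribution_coeff, solvent_to_feed):
    extraction_factor = distribution_coeff * solvent_to_feed
    return extraction_factor / (1.0 + extraction_factor)

D = 12.0   # assumed distribution coefficient of a phenolic compound (IL phase / aqueous phase)
for ratio in (0.1, 0.25, 0.5):
    print(f"S/F = {ratio:4.2f} -> {100 * extracted_fraction(D, ratio):5.1f}% extracted in one stage")
```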

Keywords: bio-oils, extraction, lignin, phenolic compounds

Procedia PDF Downloads 84
204 A Game-Based Methodology to Discriminate Executive Function – a Pilot Study With Institutionalized Elderly People

Authors: Marlene Rosa, Susana Lopes

Abstract:

Few studies explore the potential of board games as a performance measure, despite the fact that they can be an interesting strategy in the context of frail populations. In fact, board games are immersive strategies that can reduce the pressure of being evaluated. This study aimed to test the ability of game-based strategies to assess executive function in the elderly population. Sixteen older participants were included: 10 with impaired executive function (G1 – 85.30±6.00 yrs old; 10 male) and 6 whose executive function showed no clinically important changes (G2 - 76.30±5.19 yrs old; 6 male). Executive function was assessed using the Frontal Assessment Battery (FAB), which is a quickly applicable cognitive screening test (a score < 12 indicates impairment). The board game used in this study was the TATI Hand Game, designed specifically for training rhythmic coordination of the upper limbs with multiple cognitive stimuli. This game features 1 table grid, 1 set of Single Game cards (to play with one hand), Double Game cards (to play simultaneously with two hands), 1 die to plan the Single Game mode, cards to plan the Double Game mode, 1 bell, and 2 cups. Each participant played 3 Single Game cards, and the following data were collected: (i) variability in time during the board game challenges (SD); (ii) number of errors; (iii) execution time (sec). G1 demonstrated high variability in execution time during the board game challenges (G1 – 13.0s vs G2 - 0.5s), a higher number of errors (1.40 vs 0.67), and longer execution times (607.80s vs 281.83s). These results demonstrate the potential of implementing board games as a functional assessment strategy in geriatric care. Future studies might include larger samples and statistical methodologies to find cut-off values for impairment of executive function during performance in the TATI game.
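
The three game-derived measures listed above can be computed directly from per-card records. The sketch below does so for one hypothetical participant; the numbers are placeholders, not study data.

```python
# Hedged sketch: the three game-based measures for one (hypothetical) participant.
import statistics

card_times_s = [210.0, 190.0, 207.8]   # time to complete each of the 3 Single Game cards
errors_per_card = [1, 0, 0]

variability = statistics.stdev(card_times_s)   # (i) variability of time across challenges (SD)
total_errors = sum(errors_per_card)            # (ii) number of errors
total_time = sum(card_times_s)                 # (iii) execution time (s)

print(f"SD of card times: {variability:.1f} s, errors: {total_errors}, total time: {total_time:.1f} s")
```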

Keywords: board game, aging, executive function, evaluation

Procedia PDF Downloads 120
203 Velma-ARC’s Rehabilitation of Repentant Cybercriminals in Nigeria

Authors: Umukoro Omonigho Simon, Ashaolu David ‘Diya, Aroyewun-Olaleye Temitope Folashade

Abstract:

The VELMA Action to Reduce Cybercrime (ARC) is an initiative, the first of its kind in Nigeria, designed to identify, rehabilitate, and empower repentant cybercrime offenders, popularly known as 'yahoo boys' in Nigerian parlance. Velma ARC provides social inclusion boot camps aimed at rehabilitating cybercriminals through psychotherapeutic interventions, improving their IT skills, and empowering them to make constructive contributions to society. This report highlights the psychological interventions provided for participants of the maiden edition of the Velma ARC boot camp and presents the outcomes of these interventions. The boot camp was held on hotel premises booked solely for the one-month event. Participants were selected and invited through an objective double-blind process from a pool of candidates who had signified interest via the Velma online recruitment portal. Before the individual and group sessions began, participants underwent psychological profiling (personality, symptomatology, and psychopathology) using the Minnesota Multiphasic Personality Inventory-2-Restructured Form (MMPI-2-RF), the latest version in its series. Individual psychotherapy sessions were conducted for all participants based on the interpretation of their profiles. A focus group discussion was later held on the movie 'Catch Me If You Can', directed by Steven Spielberg and starring Leonardo DiCaprio and Tom Hanks, which is based on the true life story of Frank Abagnale, a notorious scammer and con artist in his youth. Themes emerging from the movie were discussed as psycho-educative material for the participants. The overall evaluation of outcomes from the Velma ARC boot camp stemmed from a disaggregated assessment of observed changes, summarized in the clinical psychologist's final report, which was detailed enough to infer genuine repentance and a positive change in attitude towards cybercrime among the participants. Follow-up services were incorporated to validate the initial observations. This gives credence to the potency of the psycho-educative intervention provided during the Velma ARC boot camp. It is recommended that support and collaboration from government and other agencies and individuals would assist the VELMA foundation in expanding the scope and quality of the Velma ARC initiative, including as an additional requirement for cybercrime offenders following incarceration.

Keywords: Velma-ARC, cybercrime offenders, rehabilitation, Nigeria

Procedia PDF Downloads 121
202 A Multi-Agent System for Accelerating the Delivery Process of Clinical Diagnostic Laboratory Results Using GSM Technology

Authors: Ayman M. Mansour, Bilal Hawashin, Hesham Alsalem

Abstract:

Faster delivery of laboratory test results is one of the most visible signs of good laboratory service and is often used as a key performance indicator for laboratories. Despite the availability of technology, the delivery time of clinical laboratory test results continues to cause customer dissatisfaction, leaving patients frustrated and less motivated to collect their results. Clinical laboratory test results are highly time-sensitive and can harm patients, especially in severe cases, if they are delivered at the wrong time. Such results inform physicians' treatment only if they arrive on time; efforts should therefore be made to ensure faster delivery of lab test results using a trusted, robust, and fast system. In this paper, we propose a distributed multi-agent system to accelerate the delivery of laboratory test results using SMS. The developed system relies on SMS messages because of the wide availability of GSM networks compared with other networks. The software provides knowledge sharing between different units and different laboratory medical centers. The system was built in Java. Several techniques were possible for implementing the proposed system. One is the peer-to-peer (P2P) model, in which all peers are treated equally and the service is distributed among all peers in the network. However, in a pure P2P model it is difficult to maintain the coherence of the network, discover new peers, and ensure security, since each node can join the network without any control mechanism. We therefore adopted a hybrid P2P model, intermediate between the client/server model and the pure P2P model, using GSM technology through SMS messages. This model satisfies our needs. A GUI was developed to give laboratory staff a simple and easy way to interact with the system. The system provides quick response times and faster decisions than manual methods, which can save patients' lives.
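
To make the notification step concrete, here is a minimal sketch in which an agent queues an SMS for dispatch through a GSM gateway stub once a result is flagged as ready. The class names, message wording, and phone number are illustrative assumptions and do not reflect the authors' Java implementation.

```python
# Minimal sketch of a result-notification agent; the GSM gateway is a stub.
from dataclasses import dataclass
from queue import Queue

@dataclass
class LabResult:
    patient_phone: str
    test_name: str
    status: str  # e.g. "READY"

class SmsGateway:
    """Stand-in for a GSM modem / SMS service."""
    def send(self, phone: str, text: str) -> None:
        print(f"SMS to {phone}: {text}")

class NotificationAgent:
    def __init__(self, gateway: SmsGateway):
        self.gateway = gateway
        self.outbox = Queue()

    def on_result_ready(self, result: LabResult) -> None:
        if result.status != "READY":
            return
        # Queue a short, non-sensitive notification rather than the full report.
        text = f"Your {result.test_name} result is ready. Please contact the laboratory."
        self.outbox.put((result.patient_phone, text))

    def dispatch(self) -> None:
        # Drain the outbox and hand each message to the gateway.
        while not self.outbox.empty():
            phone, text = self.outbox.get()
            self.gateway.send(phone, text)

agent = NotificationAgent(SmsGateway())
agent.on_result_ready(LabResult("+962790000000", "CBC", "READY"))
agent.dispatch()
```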

Keywords: multi-agent system, delivery process, GSM technology, clinical laboratory results

Procedia PDF Downloads 228
201 Mathematical Modeling and Analysis of COVID-19 Pandemic

Authors: Thomas Wetere

Abstract:

Background: Coronavirus disease 2019 (COVID-19) is a severe infectious disease with highly transmissible variants that has become a global public health threat, having so far taken the lives of more than 4 million people. What makes the disease particularly serious is that no specific effective treatment is available and its dynamics are not yet well researched or understood. Methodology: Ending the global COVID-19 pandemic requires the implementation of multiple population-wide strategies, including vaccination, environmental measures, government action, testing, and contact tracing. In this article, a new mathematical model incorporating both temperature and government action is developed and comprehensively analysed to study the dynamics of the COVID-19 pandemic. The model considers eight stages of infection: susceptible (S), infected asymptomatic and undetected (IAU), infected asymptomatic and detected (IAD), infected symptomatic and undetected (ISU), infected symptomatic and detected (ISD), hospitalized or threatened (H), recovered (R), and dead (D). Results: The existence and non-negativity of the model's solution are verified, and the basic reproduction number is calculated. Stability conditions are also checked, and simulation results are compared with real data. The results demonstrate that effective government action will need to be combined with vaccination to end the ongoing COVID-19 pandemic. Conclusion: Vaccination and government action are crucial measures for controlling the COVID-19 pandemic. Since the cost of vaccination may be high, we recommend an optimal control approach to reduce both the cost and the number of infected individuals. Moreover, the analysis of the model indicates that, to prevent the pandemic, the government must strictly manage and implement its COVID-19 policy. This in turn supports health campaigns and raises health literacy, which helps control the rapid spread of the disease. We believe that this study will contribute to the current effort to control the pandemic.
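
For concreteness, the following is a minimal sketch of an eight-compartment ODE system using the stage labels listed above (S, IAU, IAD, ISU, ISD, H, R, D). The transition structure and all parameter values are illustrative assumptions; they do not reproduce the authors' calibrated model, which additionally incorporates temperature and government action.

```python
# Minimal sketch of an eight-compartment model with the stage labels from the
# abstract. Flow structure and parameters are illustrative assumptions only.
import numpy as np
from scipy.integrate import solve_ivp

def rhs(t, y, beta, theta, sigma, eta, gamma, mu):
    S, IAU, IAD, ISU, ISD, H, R, D = y
    N = S + IAU + IAD + ISU + ISD + H + R
    # Assumed: detected cases are half as infectious as undetected ones.
    infection = beta * S * (IAU + ISU + 0.5 * (IAD + ISD)) / N
    dS = -infection
    dIAU = infection - (theta + sigma + gamma) * IAU       # detection, symptom onset, recovery
    dIAD = theta * IAU - (sigma + gamma) * IAD             # symptom onset, recovery
    dISU = sigma * IAU - (theta + gamma) * ISU             # detection, recovery
    dISD = sigma * IAD + theta * ISU - (eta + gamma) * ISD # hospitalization, recovery
    dH = eta * ISD - (gamma + mu) * H                      # recovery or death
    dR = gamma * (IAU + IAD + ISU + ISD + H)
    dD = mu * H
    return [dS, dIAU, dIAD, dISU, dISD, dH, dR, dD]

# Illustrative rates (per day): transmission, detection, symptom onset,
# hospitalization, recovery, death.
params = (0.35, 0.10, 0.15, 0.05, 0.12, 0.01)
y0 = [1e6 - 10, 10, 0, 0, 0, 0, 0, 0]
sol = solve_ivp(rhs, (0, 180), y0, args=params, t_eval=np.linspace(0, 180, 181))
print("Peak hospitalized:", sol.y[5].max())
```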

Keywords: modeling, COVID-19, MCMC, stability

Procedia PDF Downloads 84
200 Cardiothoracic Ratio in Postmortem Computed Tomography: A Tool for the Diagnosis of Cardiomegaly

Authors: Alex Eldo Simon, Abhishek Yadav

Abstract:

This study evaluated the utility of postmortem computed tomography (PMCT) and heart weight measurement for assessing cardiomegaly in cases of sudden death of cardiac origin by comparing the results of the two diagnostic methods. PMCT data from 54 cases of sudden natural death were retrospectively analyzed and compared with autopsy findings. The cardiothoracic ratio (CTR) was measured on coronal CT images, and the actual cardiac weight was determined by weighing the heart during autopsy. The inclusion criterion was sudden death suspected to be of cardiac origin; exclusion criteria were death from unnatural causes such as trauma or poisoning, diagnosed natural causes of death related to organs other than the heart, and decomposed bodies. Sensitivity, specificity, and diagnostic accuracy were calculated, and receiver operating characteristic (ROC) curves were generated to evaluate the accuracy of the CTR in detecting an enlarged heart. The CTR is a radiological measure of cardiomegaly obtained by relating the maximum cardiac diameter to the maximum transverse diameter of the chest wall. The clinical CTR threshold was modified from 0.50 to 0.57 for the postmortem setting; a CTR of 0.57 or higher is suggestive of hypertrophy but not conclusive. Similarly, heart weight was measured during the traditional autopsy, with a cardiac weight greater than 450 grams defined as hypertrophy. Of the 54 cases, 22 (40.7%) had a CTR above 0.50 and up to 0.57, and 12 (22.2%) had a CTR greater than 0.57, which was classified as hypertrophy. The mean CTR was 0.52 ± 0.06, and the mean heart weight was 369.4 ± 99.9 grams. Twelve cases showed hypertrophy as defined by PMCT, whereas only 9 cases were identified as hypertrophic at traditional autopsy. The sensitivity of the PMCT criterion was 55.56% (95% CI: 26.66–81.12), the specificity 84.44% (95% CI: 71.22–92.25), and the diagnostic accuracy 79.63% (95% CI: 67.1–88.23). A limitation of the study is the small sample of 54 cases, which may limit the generalizability of the findings. The comparison of the cardiothoracic ratio with heart weight suggests that PMCT may serve as a screening tool in medico-legal autopsies when interpreted by forensic pathologists. However, the low sensitivity (55.56%) may limit its diagnostic accuracy, and further studies with larger and more diverse samples are needed to validate these findings.
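
As a minimal sketch of the quantities reported above, the snippet below computes the CTR from coronal measurements and the sensitivity, specificity, and accuracy from a 2x2 table against the autopsy reference. The diameter values are hypothetical, and the 2x2 counts are not taken from the paper; they are illustrative numbers chosen to be consistent with the reported percentages.

```python
# Minimal sketch: cardiothoracic ratio from PMCT and test metrics vs. autopsy.
# Measurements and 2x2 counts below are illustrative only.

def cardiothoracic_ratio(max_cardiac_diameter_mm, max_thoracic_diameter_mm):
    """CTR = maximum cardiac diameter / maximum transverse thoracic diameter."""
    return max_cardiac_diameter_mm / max_thoracic_diameter_mm

# Postmortem threshold used in the study (CTR >= 0.57 suggests hypertrophy).
ctr_flags_hypertrophy = cardiothoracic_ratio(152.0, 255.0) >= 0.57  # example measurements

# 2x2 table vs. the autopsy reference (heart weight > 450 g).
tp, fn = 5, 4    # PMCT-positive / PMCT-negative among 9 autopsy-confirmed cases
fp, tn = 7, 38   # among the 45 autopsy-negative cases

sensitivity = tp / (tp + fn)                 # ~55.56%
specificity = tn / (tn + fp)                 # ~84.44%
accuracy = (tp + tn) / (tp + fn + fp + tn)   # ~79.63%
print(f"Sensitivity {sensitivity:.2%}, specificity {specificity:.2%}, accuracy {accuracy:.2%}")
```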

Keywords: PMCT, virtopsy, CTR, cardiothoracic ratio

Procedia PDF Downloads 55
199 Evaluation of Bucket Utility Truck In-Use Driving Performance and Electrified Power Take-Off Operation

Authors: Robert Prohaska, Arnaud Konan, Kenneth Kelly, Adam Ragatz, Adam Duran

Abstract:

In an effort to evaluate the in-use performance of electrified power take-off (PTO) systems on bucket utility trucks operating under real-world conditions, data from 20 medium- and heavy-duty vehicles operating in California, USA were collected, compiled, and analyzed by the National Renewable Energy Laboratory's (NREL) Fleet Test and Evaluation team. In this paper, duty-cycle statistical analyses of class 5 medium-duty quick-response trucks and class 8 heavy-duty material handler trucks are performed to examine and characterize vehicle dynamics trends and relationships based on collected in-use field data. With more than 100,000 kilometers of driving data collected over 880+ operating days, the researchers developed a robust methodology for identifying PTO operation from in-field vehicle data and applied it to evaluate the performance and utilization of the conventional and electric PTO systems. The researchers also created custom representative drive cycles for each vehicle configuration and performed modeling and simulation to evaluate the potential fuel and emissions savings from hybridizing the tractive driveline of these vehicles. The results of these analyses statistically and objectively define the vehicle dynamic and kinematic requirements of each vehicle configuration and show the potential for further system optimization through driveline hybridization. Results are presented in both graphical and tabular formats, illustrating a number of key relationships between parameters observed within the data set that relate specifically to medium- and heavy-duty utility vehicles operating under real-world conditions.
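
As an illustration of this kind of duty-cycle post-processing, the sketch below computes basic drive-cycle statistics from time-series data and flags candidate PTO events using a simple stationary-with-engine-on heuristic. This is an assumed, simplified approach for illustration only, not NREL's actual identification methodology.

```python
# Minimal sketch of duty-cycle statistics plus a heuristic for candidate PTO
# events (engine on, vehicle stationary for a sustained period). Illustrative only.
import numpy as np

def duty_cycle_stats(time_s, speed_kph, engine_on, min_idle_s=120):
    speed_ms = np.asarray(speed_kph) / 3.6
    dt = np.diff(time_s, prepend=time_s[0])
    distance_km = float(np.sum(speed_ms * dt) / 1000.0)
    driving = speed_ms > 0.5
    avg_driving_speed = float(np.mean(np.asarray(speed_kph)[driving])) if driving.any() else 0.0

    # Count stationary, engine-on stretches longer than min_idle_s as candidate PTO events.
    stationary = (~driving) & np.asarray(engine_on, dtype=bool)
    pto_events, run = 0, 0.0
    for flag, step in zip(stationary, dt):
        run = run + step if flag else 0.0
        if flag and run >= min_idle_s and run - step < min_idle_s:
            pto_events += 1
    return {"distance_km": distance_km,
            "avg_driving_speed_kph": avg_driving_speed,
            "candidate_pto_events": pto_events}

# Example with 1 Hz data: 10 min of driving followed by 5 min stationary, engine on.
t = np.arange(0, 900)
v = np.where(t < 600, 40.0, 0.0)
eng = np.ones_like(t, dtype=bool)
print(duty_cycle_stats(t, v, eng))
```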

Keywords: drive cycle, heavy-duty (HD), hybrid, medium-duty (MD), PTO, utility

Procedia PDF Downloads 364
198 Acquisition of French (L3) Direct Object by Persian (L1) Speakers of English (L2) as EFL Learners

Authors: Ali Akbar Jabbari

Abstract:

The present study assessed the acquisition of L3 French direct objects by Persian speakers who had already learned English as their L2. The ultimate goal of this paper is to extend current knowledge about the cross-linguistic influence (CLI) phenomenon in third language acquisition by examining the role of Persian and English as background languages, and of the learners' English proficiency level, in their performance on the French direct object. To this end, the assumptions of three L3 hypotheses, namely L1 Transfer, the L2 Status Factor, and the Cumulative Enhancement Model, were examined. The sample comprised 40 undergraduate students in English language and literature and translation studies at Birjand University in Iran. Based on the English proficiency level revealed by the Quick Oxford English Placement Test, the participants were grouped as upper intermediate and lower intermediate. A grammaticality judgment test and a translation test were administered to gather data on learners' comprehension and production of the target structure in French. The rate of positive transfer from previously learned languages was shown to be greater than the rate of negative transfer. A comparison of the groups' performances revealed a significant difference between the upper and lower intermediate groups in placing French direct objects correctly; however, the upper intermediate group did not differ significantly from the lower intermediate group in negative transfer. It can be concluded that, as learners' L2 proficiency increases, they can use their previous linguistic knowledge more efficiently. Although further research is needed, the current study contributes to a better characterization of cross-linguistic influence in third language acquisition. The findings can help French teachers and learners exploit prior knowledge of Persian and English and apply it in the multilingual context of teaching and learning French direct objects.
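
As a minimal sketch of the group comparison described above, the snippet below runs an independent-samples t-test on grammaticality judgment accuracy for the two proficiency groups. The scores are hypothetical placeholders, not the study's data, and the specific statistical test is an assumption rather than something reported in the abstract.

```python
# Minimal sketch: compare hypothetical grammaticality judgment accuracy
# between proficiency groups with an independent-samples t-test.
from scipy import stats

upper_intermediate = [0.85, 0.90, 0.78, 0.88, 0.92, 0.81, 0.87, 0.84]
lower_intermediate = [0.65, 0.72, 0.70, 0.61, 0.68, 0.74, 0.66, 0.71]

t_stat, p_value = stats.ttest_ind(upper_intermediate, lower_intermediate)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")  # p < .05 would indicate a significant group difference
```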

Keywords: cross-linguistic influence, Persian, French and English direct object, third language acquisition, language transfer

Procedia PDF Downloads 47