Search results for: Web 2.0 tools
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3842

3182 A Design Methodology and Tool to Support Ecodesign Implementation in Induction Hobs

Authors: Anna Costanza Russo, Daniele Landi, Michele Germani

Abstract:

Nowadays, the European Ecodesign Directive has emerged as a new approach to integrating environmental concerns into product design and related processes. Ecodesign aims to minimize environmental impacts throughout the product life cycle without compromising performance or cost. In addition, the recent Ecodesign Directives require products that are increasingly eco-friendly and eco-efficient while preserving high performance. It is very important for producers to measure the performance of electric cooking ranges, hobs, ovens, and grills for household use, and low power consumption represents a powerful selling point, also in terms of ecodesign requirements. The Ecodesign Directive provides a clear framework for the sustainable design of products, and in 2009 it was extended to all energy-related products, i.e., products with an impact on energy consumption during use. The European Regulation establishes ecodesign measures for domestic ovens, hobs, and kitchen hoods; a product's energy efficiency is a significant environmental aspect in the use phase, which is the most impactful phase of the life cycle. It is important that product parameters and performance are not affected by ecodesign requirements from a user's point of view, and the benefits of reducing energy consumption in the use phase should offset any environmental impact added in the production stage. Accurate measurements of cooking appliance performance are essential to help the industry produce more energy-efficient appliances. The development of eco-driven products requires eco-innovation and ecodesign tools to support sustainability improvement. Ecodesign tools should be practical and focused on specific eco-objectives in order to be widely adopted. The main scope of this paper is the development, implementation, and testing of an innovative tool to improve the sustainable design of induction hobs.
In particular, a prototypical software tool is developed to simulate the energy performance of induction hobs. The tool is based on a multiphysics model able to simulate the energy performance and efficiency of induction hobs starting from the design data. The multiphysics model is composed of an electromagnetic simulation and a thermal simulation. The electromagnetic simulation calculates the eddy currents induced in the pot, which lead to Joule heating of the material. The thermal simulation estimates the energy consumption during the operational phase. The Joule heating caused by the eddy currents is thus the output of the electromagnetic simulation and the input of the thermal one. The aim of the paper is the development of integrated tools and methodologies of virtual prototyping in the context of ecodesign. This tool could be a revolutionary instrument in the field of industrial engineering: it gives consideration to the environmental aspects of product design and focuses on the ecodesign of energy-related products, in order to achieve a reduced environmental impact.
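The coupling described above, with the electromagnetic stage's Joule-heating output feeding the thermal stage, can be sketched in a few lines. This is an illustrative simplification, not the authors' tool, and every parameter value below (coil current, coupling factor, pot resistance, thermal coefficients) is a hypothetical placeholder.

```python
# Illustrative sketch of an electromagnetic-thermal coupling for an
# induction hob. All numeric parameters are hypothetical placeholders.

def joule_heating(coil_current_a, coupling_factor, pot_resistance_ohm):
    """Electromagnetic stage: power dissipated in the pot by eddy currents.

    The induced eddy current is approximated as a fraction
    (coupling_factor) of the coil current; dissipation follows P = I^2 * R.
    """
    induced_current = coupling_factor * coil_current_a
    return induced_current ** 2 * pot_resistance_ohm

def simulate_energy(power_w, mass_kg, specific_heat, loss_coeff,
                    ambient_c, target_c, dt=1.0, max_t=3600):
    """Thermal stage: lumped-capacitance model fed by the EM stage output.

    Integrates dT/dt = (P - k*(T - T_ambient)) / (m*c) until the target
    temperature is reached, returning (energy_used_J, elapsed_s).
    """
    temp, elapsed, energy = ambient_c, 0.0, 0.0
    while temp < target_c and elapsed < max_t:
        losses = loss_coeff * (temp - ambient_c)
        temp += (power_w - losses) * dt / (mass_kg * specific_heat)
        energy += power_w * dt    # electrical energy drawn during dt
        elapsed += dt
    return energy, elapsed

# EM output becomes thermal input, as in the multiphysics model above:
power = joule_heating(coil_current_a=30.0, coupling_factor=0.8,
                      pot_resistance_ohm=2.5)
energy_j, seconds = simulate_energy(power, mass_kg=1.5, specific_heat=4186,
                                    loss_coeff=5.0, ambient_c=20, target_c=90)
```

The point of the sketch is only the data flow: design parameters enter the electromagnetic stage, and its computed heating power is the sole input the thermal stage needs to estimate energy use.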

Keywords: ecodesign, energy efficiency, induction hobs, virtual prototyping

Procedia PDF Downloads 242
3181 Outcomes of Pain Management for Patients in Srinagarind Hospital: Acute Pain Indicator

Authors: Chalermsri Sorasit, Siriporn Mongkhonthawornchai, Darawan Augsornwan, Sudthanom Kamollirt

Abstract:

Background: Although knowledge of pain and pain management is improving, pain management is still inadequate for many patients. The Nursing Division of Srinagarind Hospital is responsible for the pain management system, including work instruction development and pain management indicators. We developed an information technology program for monitoring pain quality indicators, which was implemented in all nursing departments in April 2013. Objective: To study the outcomes of acute pain management in terms of process and outcome indicators. Method: This is a retrospective descriptive study. The sample population was patients who had acute pain 24-48 hours after receiving a procedure while admitted to Srinagarind Hospital in 2014. Data were collected from the information technology program; 2,709 patients with acute pain from 10 nursing departments were recruited. The research tools were 1) a demographic questionnaire, 2) a pain management questionnaire for process indicators, and 3) a pain management questionnaire for outcome indicators. Data were analyzed and presented as percentages and means. Results: The process indicators show that nurses used a pain assessment tool and recorded the results in 99.19% of cases; pain was reassessed after intervention in 96.09% of cases. Of the patients, 80.15% received opioids for pain medication, and the most frequently used non-pharmacological intervention was positioning (76.72%). For the outcome indicators, nearly half of the patients (49.90%) had moderate to severe pain; the mean worst pain score was 6.48, and the mean overall pain score was 4.08. Patient satisfaction with pain management was good (49.17%) or very good (46.62%). Conclusion: Nurses used pain assessment tools and pain documents, which met the goal of the pain management process, and patient satisfaction with pain management was high. However, patients still had moderate to severe pain.
Nurses should adhere more strictly to the guidelines of pain management, using acute pain guidelines especially when pain intensity is moderate to high. Nurses should also develop and practice a non-pharmacological pain management program to continually improve the quality of pain management. The information technology program should include more details about non-pharmacological pain management techniques.

Keywords: outcome, pain management, acute pain, Srinagarind Hospital

Procedia PDF Downloads 214
3180 ROSgeoregistration: Aerial Multi-Spectral Image Simulator for the Robot Operating System

Authors: Andrew R. Willis, Kevin Brink, Kathleen Dipple

Abstract:

This article describes a software package called ROSgeoregistration, intended for use with the Robot Operating System (ROS) and the Gazebo 3D simulation environment. ROSgeoregistration provides tools for the simulation, test, and deployment of aerial georegistration algorithms and is available at github.com/uncc-visionlab/rosgeoregistration. A model creation package is provided which downloads multi-spectral images from the Google Earth Engine database and, if necessary, incorporates these images into a single, possibly very large, reference image. Additionally, a Gazebo plugin is provided which uses the real-time sensor pose and image formation model to generate simulated imagery from the specified reference image, along with related plugins for UAV-relevant data. The novelty of this work is threefold: (1) this is the first system to link the massive multi-spectral imaging database of Google's Earth Engine to the Gazebo simulator; (2) this is the first example of a system that can simulate geospatially and radiometrically accurate imagery from multiple sensor views of the same terrain region; and (3) integration with other UAS tools creates a new holistic UAS simulation environment to support UAS system and subsystem development where real-world testing would generally be prohibitive. Sensed imagery and ground-truth registration information are published to client applications, which can receive imagery synchronously with telemetry from other payload sensors, e.g., IMU, GPS/GNSS, barometer, and windspeed sensors. To highlight functionality, we demonstrate ROSgeoregistration for simulating Electro-Optical (EO) and Synthetic Aperture Radar (SAR) image sensors and an example use case for developing and evaluating image-based UAS position feedback, i.e., pose for image-based Guidance, Navigation, and Control (GNC) applications.
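The core bookkeeping behind georegistration of this kind, mapping a geographic coordinate onto pixel coordinates of a north-up reference image, can be illustrated with a six-parameter affine geotransform (the convention GDAL uses). This sketch is not part of the ROSgeoregistration API; the tile origin and pixel size are invented for illustration.

```python
# Hedged sketch: geographic-to-pixel mapping for a north-up reference
# image via a GDAL-style six-parameter affine geotransform.

def geo_to_pixel(lon, lat, geotransform):
    """Map (lon, lat) to fractional (col, row) in the reference image.

    geotransform = (origin_lon, px_w, 0, origin_lat, 0, -px_h) for a
    north-up image: longitude grows with columns, latitude shrinks
    with rows (hence the negative vertical pixel size).
    """
    origin_lon, px_w, _, origin_lat, _, neg_px_h = geotransform
    col = (lon - origin_lon) / px_w
    row = (lat - origin_lat) / neg_px_h
    return col, row

# A hypothetical 0.0001 deg/pixel reference tile anchored at (-81.0, 35.3):
gt = (-81.0, 1e-4, 0.0, 35.3, 0.0, -1e-4)
col, row = geo_to_pixel(-80.95, 35.25, gt)  # 500 px east, 500 px south
```

A simulator publishing ground-truth registration needs exactly this inverse relationship between the sensor footprint's geographic coordinates and the reference image's pixel grid.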

Keywords: EO-to-EO, EO-to-SAR, flight simulation, georegistration, image generation, robot operating system, vision-based navigation

Procedia PDF Downloads 89
3179 Exploring the Design of Prospective Human Immunodeficiency Virus Type 1 Reverse Transcriptase Inhibitors through a Comprehensive Approach of Quantitative Structure Activity Relationship Study, Molecular Docking, and Molecular Dynamics Simulations

Authors: Mouna Baassi, Mohamed Moussaoui, Sanchaita Rajkhowa, Hatim Soufi, Said Belaaouad

Abstract:

The objective of this paper is to address the challenging task of targeting Human Immunodeficiency Virus type 1 Reverse Transcriptase (HIV-1 RT) in the treatment of AIDS. Reverse Transcriptase inhibitors (RTIs) have limitations due to the development of Reverse Transcriptase mutations that lead to treatment resistance. In this study, a combination of statistical analysis and bioinformatics tools was adopted to develop a mathematical model that relates the structure of compounds to their inhibitory activities against HIV-1 Reverse Transcriptase. Our approach was based on a series of compounds recognized for their HIV-1 RT enzymatic inhibitory activities. These compounds were designed via software, and their descriptors were computed using multiple tools. The most statistically promising model was chosen, and its domain of applicability was ascertained. Furthermore, compounds exhibiting biological activity comparable to existing drugs were identified as potential inhibitors of HIV-1 RT. The compounds were evaluated based on their absorption, distribution, metabolism, excretion, and toxicity properties and their adherence to Lipinski's rule. Molecular docking techniques were employed to examine the interaction between the Reverse Transcriptase (wild type and mutant type) and the ligands, including a known drug available on the market. Molecular dynamics simulations were also conducted to assess the stability of the RT-ligand complexes. Our results reveal some of the new compounds as promising candidates for effectively inhibiting HIV-1 Reverse Transcriptase, matching the potency of the established drug; this necessitates further experimental validation. Beyond its immediate results, this study provides a methodological foundation for future endeavors aiming to discover and design new inhibitors targeting HIV-1 Reverse Transcriptase.
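The Lipinski screen mentioned above can be sketched as a simple rule count over precomputed descriptors. In practice the descriptors would come from a cheminformatics toolkit; the candidate names and descriptor values here are hypothetical.

```python
# Illustrative sketch of a Lipinski rule-of-five screen applied to
# precomputed molecular descriptors. Candidates and values are hypothetical.

def lipinski_violations(mw, logp, h_donors, h_acceptors):
    """Count violations of Lipinski's rule of five:
    MW <= 500, logP <= 5, H-bond donors <= 5, H-bond acceptors <= 10."""
    return sum([mw > 500, logp > 5, h_donors > 5, h_acceptors > 10])

def passes_lipinski(descr, max_violations=1):
    """The rule of five tolerates at most one violation for drug-likeness."""
    return lipinski_violations(**descr) <= max_violations

candidates = {
    "cand_A": dict(mw=431.5, logp=3.8, h_donors=2, h_acceptors=6),
    "cand_B": dict(mw=612.7, logp=6.2, h_donors=4, h_acceptors=9),
}
druglike = [name for name, d in candidates.items() if passes_lipinski(d)]
```

Here cand_B fails on both molecular weight and logP, so only cand_A would move forward to the docking stage.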

Keywords: QSAR, ADMET properties, molecular docking, molecular dynamics simulation, reverse transcriptase inhibitors, HIV type 1

Procedia PDF Downloads 69
3178 Reinventing Education Systems: Towards an Approach Based on Universal Values and Digital Technologies

Authors: Ilyes Athimni, Mouna Bouzazi, Mongi Boulehmi, Ahmed Ferchichi

Abstract:

The principles of good governance, universal values, and digitization are among the tools to fight corruption and improve the quality of service delivery. In recent years, these tools have become one of the most discussed topics in the field of education and a concern of many international organizations and institutions fighting the problem of corruption. Corruption in the education sector, particularly in higher education, has negative impacts on the quality of education systems and on the quality of administrative and educational services. Currently, the health crisis due to the spread of the COVID-19 pandemic reveals the difficulties encountered by education systems in most countries of the world: due to the poor governance of these systems, many educational institutions were unable to continue working remotely. To respond to these problems, our initiative is to propose a methodology to reinvent education systems based on universal values and digital technologies. This methodology includes a work strategy for educational institutions, both in the provision of administrative services and in the teaching method, based on information and communication technologies (ICTs), artificial intelligence, and intelligent agents. In addition, we propose a supervisory law that will be implemented and monitored by intelligent agents to improve accountability and transparency in educational institutions. We will also implement and evaluate a field experiment, applying the proposed methodology in the operation of an educational institution and comparing it to the traditional methodology through the results of teaching an educational program. With these specifications, we can reinvent quality education systems. We also expect the results of our proposal to play an important role at the local, regional, and international levels in motivating governments around the world to change their university governance policies.

Keywords: artificial intelligence, corruption in education, distance learning, education systems, ICTs, intelligent agents, good governance

Procedia PDF Downloads 192
3177 Information Technology Service Management System Measurement Using ISO20000-1 and ISO15504-8

Authors: Imam Asrowardi, Septafiansyah Dwi Putra, Eko Subyantoro

Abstract:

Process assessments can improve IT service management system (IT SMS) processes, but the assessment method is not always transparent. This paper outlines a project to develop a solution-mediated process assessment tool to enable transparent and objective SMS process assessment. Using the international standards for SMS and process assessment, the tool is being developed following the international standard approach, in collaboration with and evaluated by expert judgment from committee members and ITSM practitioners.
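One way to make such an assessment transparent and objective is to encode the rating scale of the process assessment standard directly. The sketch below uses the ISO/IEC 15504 N-P-L-F achievement bands and the rule that a capability level is achieved when its attributes rate Largely or Fully and all lower levels rate Fully; the process scores themselves are invented for illustration.

```python
# Hedged sketch of an ISO/IEC 15504-style capability-level calculation.
# The N-P-L-F bands follow the standard; the input scores are hypothetical.

def rate(pct):
    """Map a percentage of achievement to the N-P-L-F ordinal scale:
    Not (<=15), Partially (<=50), Largely (<=85), Fully achieved."""
    if pct <= 15:
        return "N"
    if pct <= 50:
        return "P"
    if pct <= 85:
        return "L"
    return "F"

def capability_level(attr_scores):
    """attr_scores: per-capability-level lists of attribute percentages,
    level 1 first. A level counts as achieved when its own attributes rate
    L or F and every lower level's attributes rate F."""
    level = 0
    for i, scores in enumerate(attr_scores):
        lower_full = all(rate(s) == "F" for lvl in attr_scores[:i] for s in lvl)
        if lower_full and all(rate(s) in ("L", "F") for s in scores):
            level = i + 1
        else:
            break
    return level

# Hypothetical process: PA1.1 fully achieved, PA2.1/PA2.2 largely achieved,
# PA3.1/PA3.2 only partially achieved -> capability level 2.
assessed = capability_level([[90], [70, 60], [40, 30]])
```

Making the bands and aggregation rule explicit like this is what lets different assessors reproduce the same rating from the same evidence.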

Keywords: SMS, tools evaluation, ITIL, ISO service

Procedia PDF Downloads 460
3176 A Contemporary Advertising Strategy on Social Networking Sites

Authors: M. S. Aparna, Pushparaj Shetty D.

Abstract:

Nowadays, social networking sites have become so popular that producers and sellers look to these sites as one of the best options for targeting the right audience to market their products. Several tools are available to monitor and analyze social networks. Our task is to identify the right community web pages, analyze the behavior of their members using these tools, and formulate an appropriate strategy to market products or services and achieve the set goals. Advertising becomes more effective when information about a product or service comes from a known source; the strategy therefore exploits the strong buying influence of referral marketing on the audience. Our methodology proceeds with a critical budget analysis and promotes viral influence propagation. In this context, we address the vital elements of budget evaluation, such as the number of optimal seed nodes (primary influential users activated at the onset), an estimate of the coverage spread of nodes, and the maximum influence propagation distance from an initial seed to an end node. Our proposed Buyer Prediction mathematical model arises from the need to perform complex analysis when the probability density estimates of reliable factors are unknown or difficult to calculate. Order statistics and the Buyer Prediction mapping function guarantee the selection of optimal influential users at each level. We apply efficient tactics, drawing on community pages and user behavior, to determine the product enthusiasts on social networks. Our approach is promising and should be an elementary choice when there is little or no prior knowledge of the distribution of potential buyers on social networks. In this strategy, product news propagates to influential users on or surrounding networks. By applying the same technique, a user can search for friends who are able to advise or give referrals when a product interests them.
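A standard baseline for the seed-selection problem described above is greedy selection under the independent cascade model, estimating expected spread by Monte-Carlo simulation. The following sketch is that textbook baseline on a toy network, not the paper's Buyer Prediction model; the graph, propagation probability, and budget are illustrative.

```python
# Hedged sketch: greedy influence maximization under a seed budget using
# Monte-Carlo simulation of the independent cascade model (a standard
# baseline, not the paper's Buyer Prediction model).
import random

def cascade(graph, seeds, p, rng):
    """One independent-cascade run; returns the set of activated nodes."""
    active, frontier = set(seeds), list(seeds)
    while frontier:
        node = frontier.pop()
        for nbr in graph.get(node, []):
            if nbr not in active and rng.random() < p:
                active.add(nbr)
                frontier.append(nbr)
    return active

def expected_spread(graph, seeds, p, runs, rng):
    """Average activated-set size over many simulated cascades."""
    return sum(len(cascade(graph, seeds, p, rng)) for _ in range(runs)) / runs

def greedy_seeds(graph, budget, p=0.3, runs=200, seed=0):
    """Pick `budget` seed nodes, each maximizing marginal expected spread."""
    rng = random.Random(seed)
    chosen = []
    for _ in range(budget):
        best = max((n for n in graph if n not in chosen),
                   key=lambda n: expected_spread(graph, chosen + [n],
                                                 p, runs, rng))
        chosen.append(best)
    return chosen

# Toy follower network: who can influence whom.
net = {"a": ["b", "c"], "b": ["d"], "c": ["d", "e"], "d": [], "e": []}
seeds = greedy_seeds(net, budget=2)
```

The budget constraint enters as the fixed number of greedy picks; a real campaign would replace the uniform propagation probability with per-edge estimates learned from user behavior.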

Keywords: viral marketing, social network analysis, community web pages, buyer prediction, influence propagation, budget constraints

Procedia PDF Downloads 243
3175 Sustainable Zero Carbon Communities: The Role of Community-Based Interventions in Reducing Carbon Footprint

Authors: Damilola Mofikoya

Abstract:

Developed countries account for a large proportion of greenhouse gas emissions. In the last decade, countries including the United States and China have committed to cutting carbon emissions by signing the Paris Climate Agreement. However, carbon neutrality is a challenging issue to tackle at the country level because of the scale of the problem; to overcome this challenge, cities are at the forefront of these efforts. Many cities in the United States are taking strategic actions and proposing programs and initiatives focused on renewable energy, green transportation, reduced use of fossil fuel vehicles, and similar measures. There have, however, been concerns about the implications of those strategies and a lack of community engagement. This paper focuses on community-based efforts that help actualize the reduction of carbon footprint through sustained and inclusive action. Existing zero-carbon assessment tools are examined to understand the variables and indicators associated with zero-carbon goals. Based on a broad, systematic review of the literature on community strategies and existing zero-carbon assessment tools, a dashboard was developed to help simplify and demystify carbon neutrality goals at a community level. The literature sheds light on the key factors responsible for the success of community efforts toward carbon neutrality. Stakeholder education is discussed as one of the strategies to help communities take action and generate momentum. Community-based efforts involving individuals and residents, such as reducing food wastage, shopping preferences, transit mode choices, and healthy diets, play an important role in the context of zero-carbon initiatives. The proposed community-based dashboard emphasizes the importance of sustained, structured, and collective efforts at a communal scale. Finally, the present study discusses the relationship between life expectancy and quality of life and how it affects carbon neutrality in communities.
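A dashboard of this kind ultimately rests on a simple aggregation: per-category community activity multiplied by emission factors, tracked against a baseline. The sketch below illustrates only that arithmetic; the categories, emission factors, and activity figures are hypothetical, not values from the study.

```python
# Illustrative sketch of a community carbon-footprint aggregation.
# Emission factors (kg CO2e per unit of activity) are hypothetical.

EMISSION_FACTORS_KG_CO2E = {
    "food_waste_kg": 2.5,      # per kg of wasted food
    "car_km": 0.19,            # per km driven by car
    "transit_km": 0.04,        # per km by public transit
    "electricity_kwh": 0.4,    # per kWh of grid electricity
}

def community_footprint(activity):
    """Sum kg CO2e across activity categories for one community."""
    return sum(EMISSION_FACTORS_KG_CO2E[k] * v for k, v in activity.items())

def progress_to_target(baseline_kg, current_kg):
    """Fraction of the reduction toward zero carbon already achieved."""
    return 1.0 - current_kg / baseline_kg

activity = {"food_waste_kg": 100, "car_km": 5000, "transit_km": 2000,
            "electricity_kwh": 3000}
total = community_footprint(activity)  # kg CO2e for the period
```

Transparent per-category totals like these are what let residents see which behaviors (food waste, mode choice, energy use) dominate their community's footprint.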

Keywords: carbon footprint, communities, life expectancy, quality of life

Procedia PDF Downloads 72
3174 International E-Learning for Assuring Ergonomic Working Conditions of Orthopaedic Surgeons: First Research Outcomes from Train4OrthoMIS

Authors: J. Bartnicka, J. A. Piedrabuena, R. Portilla, L. Moyano - Cuevas, J. B. Pagador, P. Augat, J. Tokarczyk, F. M. Sánchez Margallo

Abstract:

Orthopaedic surgeries are characterized by a high degree of complexity. This is reflected by four main groups of resources: 1) the surgical team, which consists of people with different competencies, educational backgrounds, and positions; 2) information and knowledge about the medical and technical aspects of surgery; 3) medical equipment, including surgical tools and materials; and 4) spatial infrastructure, which is important from an operating room layout point of view. All these components must be integrated into a homogeneous whole to achieve an efficient and ergonomically correct surgical workflow. Against this background, an international project was formulated, called "Online Vocational Training course on ergonomics for orthopaedic Minimally Invasive" (Train4OrthoMIS), whose aim is to develop an e-learning tool available in four languages (English, Spanish, Polish, and German). This article presents the first project research outcomes, focused on three aspects: 1) the ergonomic needs of surgeons who work in hospitals in different European countries; 2) the concept and structure of the e-learning course; and 3) the definition of tools and methods for knowledge assessment adjusted to users' expectations. The methodology was based on expert panels and two types of surveys: 1) on training needs and 2) on evaluation and self-assessment preferences. The major findings of the study allowed describing the subjects of the four training modules and learning sessions. According to respondents' opinions, the most expected test methods were single-choice tests, followed by "True or False" and "Link elements" quizzes. The first project outcomes confirmed the necessity of creating a universal training tool for orthopaedic surgeons regardless of the country in which they work. Because surgeons have limited time, the e-learning course should be closely adjusted to their expectations in order to be useful.

Keywords: international e-learning, ergonomics, orthopaedic surgery, Train4OrthoMIS

Procedia PDF Downloads 164
3173 Accelerating Personalization Using Digital Tools to Drive Circular Fashion

Authors: Shamini Dhana, G. Subrahmanya VRK Rao

Abstract:

The fashion industry is advancing towards a mindset of zero waste, personalization, creativity, and circularity. The next generation is demanding the upcycling of clothing and materials into personalized fashion, and a digital tool is needed to accelerate the process towards mass customization. Dhana's D/Sphere fashion technology platform uses digital tools to accelerate upcycling. In essence, advanced fashion garments can be designed and developed via reuse, repurposing, and recreation activities, using existing fabric and circulating materials. The D/Sphere platform has the following objectives: to provide (1) an opportunity to develop modern fashion using existing, finished materials and clothing without chemicals or water consumption; (2) the potential for everyday customers and designers to use the medium of fashion for creative expression; (3) a solution to address the global textile waste generated by pre- and post-consumer fashion; (4) a solution to reduce carbon emissions, water, and energy consumption with the participation of all stakeholders; and (5) an opportunity for brands, manufacturers, and retailers to work towards zero-waste designs and an alternative revenue stream. Other benefits of this alternative approach include sustainability metrics, trend prediction, facilitation of disassembly and remanufacture, deep learning, and hyperheuristics for high accuracy. A design tool for mass personalization and customization utilizing existing circulating materials and deadstock, targeted at fashion stakeholders, will lower environmental costs, increase revenues through up-to-date upcycled apparel, produce less textile waste during the cut-sew-stitch process, and provide a real design solution for the end customer to be part of circular fashion.
The broader impact of this technology will be a different mindset towards circular fashion: increasing the value of the product through multiple life cycles, finding alternatives towards zero waste, and reducing the textile waste that ends up in landfills. This technology platform will be of interest to brands and companies that have a responsibility to reduce their environmental impact and contribution to climate change as it pertains to the fashion and apparel industry. Today, over 70% of the output of the $3 trillion fashion and apparel industry ends up in landfills. The industry therefore needs such alternative techniques both to address global textile waste and to provide an opportunity to include all stakeholders and drive circular fashion with new personalized products. This type of modern systems thinking is currently being explored around the world by the private sector, organizations, research institutions, and governments. This technological innovation using digital tools has the potential to revolutionize the way we look at communication, capabilities, and collaborative opportunities amongst stakeholders in the development of new personalized and customized products, as well as to have positive impacts on society, our environment, and global climate change.

Keywords: circular fashion, deep learning, digital technology platform, personalization

Procedia PDF Downloads 43
3172 A Comprehensive Methodology for Voice Segmentation of Large Sets of Speech Files Recorded in Naturalistic Environments

Authors: Ana Londral, Burcu Demiray, Marcus Cheetham

Abstract:

Speech recording is a methodology used in many studies related to cognitive and behaviour research. Modern advances in digital equipment have brought the possibility of continuously recording hours of speech in naturalistic environments and building rich sets of sound files. Speech analysis can then extract multiple features from these files for different scopes of research in language and communication. However, tools for analysing a large set of sound files and automatically extracting relevant features are often inaccessible to researchers who are not familiar with programming languages, and manual analysis is a common alternative, with a high cost in time and efficiency. In the analysis of long sound files, the first step is voice segmentation, i.e., detecting and labelling the segments that contain speech. We present a comprehensive methodology aiming to support researchers in voice segmentation as the first step in the analysis of a large set of sound files. Praat, an open-source software package, is suggested as a tool to run a voice detection algorithm, label segments and files, and extract other quantitative features over a structure of folders containing a large number of sound files. We validated our methodology with a set of 5,000 sound files collected in the daily life of a group of voluntary participants aged over 65. A smartphone device was used to collect sound using the Electronically Activated Recorder (EAR), an app programmed to record 30-second sound samples randomly distributed throughout the day. Results demonstrated that automatic segmentation and labelling of files containing speech segments was 74% faster than a manual analysis performed by two independent coders. Furthermore, the methodology allows manual adjustment of voiced segments with visualisation of the sound signal and the automatic extraction of quantitative information on speech.
In conclusion, we propose a comprehensive methodology for voice segmentation to be used by researchers who have to work with large sets of sound files and are not familiar with programming tools.
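The voice detection step at the heart of the methodology can be illustrated with a crude short-time-energy detector; the actual pipeline relies on Praat's algorithm, so the following is only a self-contained stand-in, run here on a synthetic signal.

```python
# Hedged sketch of energy-based voice detection: frame the signal,
# compute short-time energy, and report frame ranges above a threshold.
# This is a simplified stand-in for Praat's voice detection, on a
# synthetic three-frame signal (silence, tone burst, silence).
import math

def frame_energies(samples, frame_len):
    """Mean squared amplitude per non-overlapping frame."""
    return [sum(s * s for s in samples[i:i + frame_len]) / frame_len
            for i in range(0, len(samples) - frame_len + 1, frame_len)]

def voiced_segments(samples, frame_len, factor=0.5):
    """Return (start_frame, end_frame) pairs whose energy exceeds
    factor * max energy -- a crude stand-in for a real voice detector."""
    energies = frame_energies(samples, frame_len)
    threshold = factor * max(energies)
    segments, start = [], None
    for i, e in enumerate(energies):
        if e >= threshold and start is None:
            start = i
        elif e < threshold and start is not None:
            segments.append((start, i))
            start = None
    if start is not None:
        segments.append((start, len(energies)))
    return segments

# Synthetic signal: 100 samples of silence, a 220 Hz burst, silence.
sil = [0.0] * 100
tone = [math.sin(2 * math.pi * 220 * t / 8000) for t in range(100)]
segs = voiced_segments(sil + tone + sil, frame_len=100)
```

The labelled frame ranges are exactly what a tool then turns into time-stamped segment annotations for each sound file.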

Keywords: automatic speech analysis, behavior analysis, naturalistic environments, voice segmentation

Procedia PDF Downloads 268
3171 Pedagogical Tools in the 21st Century

Authors: M. Aherrahrou

Abstract:

Moroccan education is currently facing many difficulties due to traditional methods of teaching. Neuro-Linguistic Programming (NLP) appears to hold much potential for education at all levels. The major aim of this paper is to explore the effect of certain Neuro-Linguistic Programming techniques in one educational institution in Morocco. Quantitative and qualitative methods are used. The findings prove the effectiveness of this new approach in Moroccan education and show it to be a promising tool for improving the quality of learning.

Keywords: learning and teaching environment, Neuro-Linguistic Programming, education, quality of learning

Procedia PDF Downloads 337
3170 Development of Pre-Mitigation Measures and Its Impact on Life-Cycle Cost of Facilities: Indian Scenario

Authors: Mahima Shrivastava, Soumya Kar, B. Swetha Malika, Lalu Saheb, M. Muthu Kumar, P. V. Ponambala Moorthi

Abstract:

Natural hazards and manmade destruction cause both economic and societal losses. Generalized pre-mitigation strategies introduced and adopted for disaster prevention all over the world are capable of augmenting resiliency and optimizing the life-cycle cost of facilities. Countries like India, with their varied topographical features, require location-specific mitigation measures and strategies, pursued through both event-driven and code-driven approaches. The present state of mitigation measures followed and adopted lags behind in accomplishing the required development. In addition, serious concern and debate over climate change play a vital role in enhancing the need for time-bound adaptive mitigation measures: for the development of long-term sustainable policies, the incorporation of future climatic variation is inevitable. This will further assist in assessing the impact of climate change on the life-cycle cost of facilities. This paper develops more definite region-specific and time-bound pre-mitigation measures by reviewing the present state of mitigation measures in India and around the world for improving the life-cycle cost of facilities. For the development of region-specific adaptive measures, Indian regions were divided based on multiple-calamity-prone regions, and geo-referencing tools were used to incorporate the effect of climate change on life-cycle cost assessment. This study puts forward a significant effort in establishing sustainable policies and helps decision makers plan pre-mitigation measures for different regions. It will further contribute towards evaluating the life-cycle cost of facilities under the developed measures.
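The trade-off the study evaluates can be sketched as a discounted life-cycle cost comparison: pre-mitigation raises the initial cost but lowers expected annual hazard losses. All monetary figures, rates, and horizons below are hypothetical.

```python
# Illustrative sketch: present-value life-cycle cost of a facility with
# and without pre-mitigation. All figures are hypothetical placeholders.

def life_cycle_cost(initial, annual_upkeep, annual_expected_loss,
                    years, discount_rate):
    """Initial cost plus the present value of recurring upkeep and
    expected annual hazard losses over the facility's life."""
    lcc = initial
    for year in range(1, years + 1):
        lcc += (annual_upkeep + annual_expected_loss) / \
               (1 + discount_rate) ** year
    return lcc

# Without mitigation: cheaper to build, higher expected hazard losses.
baseline = life_cycle_cost(initial=1_000_000, annual_upkeep=20_000,
                           annual_expected_loss=50_000, years=50,
                           discount_rate=0.05)
# With pre-mitigation: higher up-front cost, much lower expected losses.
mitigated = life_cycle_cost(initial=1_150_000, annual_upkeep=22_000,
                            annual_expected_loss=10_000, years=50,
                            discount_rate=0.05)
```

Mitigation pays off over the facility's life whenever the mitigated present value falls below the baseline; region-specific hazard data would replace the flat expected-loss figures used here.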

Keywords: climate change, geo-referencing tools, life-cycle cost, multiple-calamity prone regions, pre-mitigation strategies, sustainable policies

Procedia PDF Downloads 360
3169 The Impact of the Use of Some Multiple Intelligence-Based Teaching Strategies on Developing Moral Intelligence and Inferential Jurisprudential Thinking among Secondary School Female Students in Saudi Arabia

Authors: Sameerah A. Al-Hariri Al-Zahrani

Abstract:

The current study aims at determining the impact of the use of some multiple intelligence-based teaching strategies on developing moral intelligence and inferential jurisprudential thinking among secondary school female students. The study endeavors to answer the following question: What is the impact of the use of some multiple intelligence-based teaching strategies on developing inferential jurisprudential thinking and moral intelligence among first-year secondary school female students? Within the frame of this main research question, the study seeks to answer the following sub-questions: (i) What are the inferential jurisprudential thinking skills among first-year secondary school female students? (ii) What are the components of moral intelligence among first-year secondary school female students? (iii) What is the impact of the use of some multiple intelligence-based teaching strategies (such as the strategies of analyzing values, modeling, Socratic discussion, collaborative learning, peer collaboration, collective stories, building emotional moments, role play, and one-minute observation) on moral intelligence among first-year secondary school female students? (iv) What is the impact of the use of the same strategies on developing the capacity for inferential jurisprudential thinking about juristic rules among first-year secondary school female students? The study used the descriptive-analytical methodology in surveying, analyzing, and reviewing the literature of previous studies in order to benefit from them in building the tools of the study and the materials of the experimental treatment.
The study also used the experimental method to study the impact of the independent variable (multiple intelligence strategies) on the two dependent variables (moral intelligence and inferential jurisprudential thinking) in first-year secondary school female students' learning. The sample of the study is made up of 70 female students divided into two groups: an experimental group of 35 students taught through multiple intelligence strategies, and a control group of 35 students taught conventionally. The two tools of the study (an inferential jurisprudential thinking test and a moral intelligence scale) were administered to both groups as a pre-test. The researcher taught the experimental group and then administered the two tools of the study. After the eight-week experiment, the study showed the following results: (i) There are statistically significant differences (0.05) between the mean of the control group and that of the experimental group on the inferential jurisprudential thinking test (recognition of the evidence of a jurisprudential rule, recognition of the motive for a jurisprudential rule, jurisprudential inferencing, analogical jurisprudence) in favor of the experimental group. (ii) There are statistically significant differences (0.05) between the mean of the control group and that of the experimental group on the components of the moral intelligence scale (sympathy, conscience, moral wisdom, tolerance, justice, respect) in favor of the experimental group. The study has thus demonstrated the impact of the use of some multiple intelligence-based teaching strategies on developing moral intelligence and inferential jurisprudential thinking.

Keywords: moral intelligence, teaching, inferential jurisprudential thinking, secondary school

Procedia PDF Downloads 145
3168 I²C Master-Slave Integration

Authors: Rozita Borhan, Lam Kien Sieng

Abstract:

This paper describes an I²C slave implementation that works with an I²C master obtained from the OpenCores website, which provides free Verilog and VHDL code to users. The I²C slave is implemented in Verilog and verified with ModelSim from Mentor Graphics, an EDA tool used in ASIC design for simulation and verification purposes. A common application of this I²C master-slave integration is also included, and the paper addresses the advantages and limitations of the design.
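The paper's design itself is Verilog RTL; purely as an illustrative sketch of the protocol behaviour the slave must implement, the following Python model shows how the first byte after an I²C START condition packs a 7-bit slave address with a read/write flag, and how a slave acknowledges only its own address. The names `address_byte` and `I2CSlave` are invented for this example.

```python
# Behavioral model of I2C address framing (illustration only; the actual
# implementation described in the paper is Verilog RTL, not Python).

def address_byte(addr7: int, read: bool) -> int:
    """Pack a 7-bit slave address and R/W flag into the first byte
    sent after the START condition (R/W = 1 means a read transfer)."""
    assert 0 <= addr7 < 0x80, "I2C addresses are 7 bits wide"
    return (addr7 << 1) | (1 if read else 0)

class I2CSlave:
    """A slave ACKs (returns True) only when the address byte matches."""
    def __init__(self, addr7: int):
        self.addr7 = addr7

    def on_address(self, byte: int) -> bool:
        return (byte >> 1) == self.addr7

slave = I2CSlave(0x50)                 # 0x50 is a typical EEPROM address
wr = address_byte(0x50, read=False)    # write transfer to 0x50
rd = address_byte(0x50, read=True)     # read transfer from 0x50
print(hex(wr), hex(rd), slave.on_address(wr), slave.on_address(0x33 << 1))
# → 0xa0 0xa1 True False
```

In the Verilog slave this address comparison would be a register compare inside the state machine clocked by SCL; the Python model only captures the framing logic.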

Keywords: I²C, master, OpenCores, slave, Verilog, verification

Procedia PDF Downloads 424
3167 BFDD-S: Big Data Framework to Detect and Mitigate DDoS Attack in SDN Network

Authors: Amirreza Fazely Hamedani, Muzzamil Aziz, Philipp Wieder, Ramin Yahyapour

Abstract:

In recent years, software-defined networking has attracted the attention of many network designers as a successor to traditional networking. Unlike traditional networks, where the control and data planes reside together within a single device in the network infrastructure such as a switch or router, the two planes are kept separate in software-defined networks (SDNs). All critical decisions about packet routing are made on the network controller, and the data-plane devices forward packets based on these decisions. This type of network is vulnerable to DDoS attacks, which degrade the overall functioning and performance of the network by continuously injecting fake flows into it. This places a substantial burden on the controller side and ultimately leads to the inaccessibility of the controller and a lack of network service for legitimate users. Thus, protecting this novel network architecture against denial-of-service attacks is essential. In the world of cybersecurity, attacks and new threats emerge every day, so it is essential to have tools capable of managing and analyzing all this new information to detect possible attacks in real time. These tools should provide a comprehensive solution to automatically detect, predict, and prevent abnormalities in the network. Big data encompasses a wide range of studies, but it mainly refers to the massive amounts of structured and unstructured data that organizations deal with on a regular basis. It concerns not only the volume of the data but also how data-driven information can be used to enhance decision-making processes, security, and the overall efficiency of a business. This paper presents an intelligent big data framework as a solution to handle the illegitimate traffic burden created on the SDN network by numerous DDoS attacks.
The framework entails an efficient defence and monitoring mechanism against DDoS attacks by employing state-of-the-art machine learning techniques.
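The abstract does not specify the detection features; as a hedged illustration of one common flow-level signal such frameworks exploit (the paper itself uses Spark/Kafka pipelines with machine-learning models), a volumetric DDoS attack typically collapses the entropy of destination addresses in a traffic window, because many flows converge on one victim. The function names and the threshold below are invented for this sketch.

```python
import math
from collections import Counter

def shannon_entropy(items):
    """Shannon entropy (in bits) of the empirical distribution of items."""
    counts = Counter(items)
    n = len(items)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def looks_like_ddos(dst_ips, threshold=1.0):
    """Flag a window of flows whose destination-IP entropy is abnormally
    low, i.e. most flows are converging on a single victim host."""
    return shannon_entropy(dst_ips) < threshold

normal = ["10.0.0.%d" % (i % 8) for i in range(64)]   # spread over 8 hosts
attack = ["10.0.0.1"] * 60 + ["10.0.0.2"] * 4         # almost all hit one host
print(looks_like_ddos(normal), looks_like_ddos(attack))
# → False True
```

In a production framework this statistic would be computed per window over a Kafka-fed Spark stream and combined with learned features rather than a fixed cutoff.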

Keywords: apache spark, apache kafka, big data, DDoS attack, machine learning, SDN network

Procedia PDF Downloads 153
3166 Well-Being of Elderly with Nanonutrients

Authors: Naqvi Shraddha Rathi

Abstract:

During the aging process, physical frailty may develop. A more sedentary lifestyle, a reduction in metabolic cell mass and, consequently, lower energy expenditure and dietary intake are important contributors to the progression of frailty. A decline in intake is in turn associated with the risk of developing a suboptimal nutritional state or multiple micronutrient deficiencies. The tantalizing potential of nanotechnology is to fabricate and combine nanoscale approaches and building blocks to make useful tools and, ultimately, interventions for medical science, including nutritional science, at the scale of ∼1–100 nm.

Keywords: aging, cells frailty, micronutrients, biochemical reactivity

Procedia PDF Downloads 379
3165 A Research on Determining the Viability of a Job Board Website for Refugees in Kenya

Authors: Prince Mugoya, Collins Oduor Ondiek, Patrick Kanyi Wamuyu

Abstract:

The Refugee Job Board Website is a web-based application that provides a platform for organizations to post jobs specifically for refugees. Organizations upload job opportunities, and refugees can view them on the website; the website also allows refugees to input their skills and qualifications. The methodology used to develop this system is the waterfall (traditional) methodology. Software development tools include the Brackets editor, used to code the website, and a MySQL database, administered through phpMyAdmin, to store all the data.
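The data model implied by the description above (organizations, their job posts, and refugee skill profiles) can be sketched in a few tables. The sketch below uses Python's built-in sqlite3 rather than the MySQL/phpMyAdmin stack the project used, and all table and column names are invented for illustration.

```python
import sqlite3

# Hypothetical minimal schema for the job board described above.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE organization (
        id   INTEGER PRIMARY KEY,
        name TEXT NOT NULL
    );
    CREATE TABLE job (
        id     INTEGER PRIMARY KEY,
        org_id INTEGER NOT NULL REFERENCES organization(id),
        title  TEXT NOT NULL,
        description TEXT
    );
    CREATE TABLE refugee_skill (
        id           INTEGER PRIMARY KEY,
        refugee_name TEXT NOT NULL,
        skill        TEXT NOT NULL
    );
""")
con.execute("INSERT INTO organization (name) VALUES ('Example Partner Org')")
con.execute("INSERT INTO job (org_id, title) VALUES (1, 'Interpreter')")
jobs = con.execute("SELECT title FROM job").fetchall()
print(jobs)   # → [('Interpreter',)]
```

Matching refugees to jobs would then be a join between `job` and `refugee_skill` on skill keywords; the production site would express the same schema in MySQL.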

Keywords: information technology, refugee, skills, utilization, economy, jobs

Procedia PDF Downloads 144
3164 Precise CNC Machine for Multi-Tasking

Authors: Haroon Jan Khan, Xian-Feng Xu, Syed Nasir Shah, Anooshay Niazi

Abstract:

CNC machines are not only used on a large scale but have also become a prominent necessity among households and smaller businesses. Printed circuit boards manufactured by the chemical process are not only risky and unsafe but also expensive and time-consuming. A precise 3-axis CNC machine has been developed that not only fabricates PCBs but has also been used for multiple tasks simply by changing the materials and tools used, making it versatile. The machine takes its data from CAM software, and a TB-6560 controller is used to drive motion along the X, Y, and Z axes. The machine performs automatic drilling, engraving, and cutting efficiently.
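The CAM-to-machine handoff described above is G-code; a minimal sketch of how a list of PCB drill coordinates might be turned into generic G-code (the G0/G1/G21/G90 commands are standard, but the feed rate, depth, and function name below are invented placeholders, not values from the paper):

```python
def drill_gcode(holes, safe_z=5.0, depth=-1.6, feed=100):
    """Emit simple G-code that drills each (x, y) point in `holes`:
    rapid to the hole at safe height, plunge at the given feed, retract."""
    lines = ["G21 ; millimetres", "G90 ; absolute coordinates"]
    for x, y in holes:
        lines.append(f"G0 Z{safe_z} ; retract to safe height")
        lines.append(f"G0 X{x} Y{y} ; rapid to hole")
        lines.append(f"G1 Z{depth} F{feed} ; plunge through the board")
    lines.append(f"G0 Z{safe_z}")
    lines.append("M2 ; end of program")
    return "\n".join(lines)

print(drill_gcode([(0, 0), (10.5, 3.2)]))
```

A controller board such as the TB-6560 does not parse G-code itself; host software interprets lines like these and emits step/direction pulses to the three axes.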

Keywords: CNC, G-code, CAD, CAM, Proteus, FLATCAM, Easel

Procedia PDF Downloads 137
3163 Microbial Bioproduction with Design of Metabolism and Enzyme Engineering

Authors: Tomokazu Shirai, Akihiko Kondo

Abstract:

Technologies of metabolic engineering and synthetic biology are essential for effective microbial bioproduction. It is especially important to develop in silico tools for designing metabolic pathways that produce non-natural, valuable chemicals such as fuels or plastics currently derived from fossil resources. We here demonstrate two in silico tools for designing novel metabolic pathways, BioProV and HyMeP, and furthermore, we succeeded in creating an artificial metabolic pathway by enzyme engineering.

Keywords: bioinformatics, metabolic engineering, synthetic biology, genome scale model

Procedia PDF Downloads 322
3162 Lexical Bundles in the Alexiad of Anna Comnena: Computational and Discourse Analysis Approach

Authors: Georgios Alexandropoulos

Abstract:

The purpose of this study is to examine the historical text of the Alexiad by Anna Comnena, using computational tools to extract lexical bundles containing the name of her father, Alexius Comnenus. To this end, we apply corpus linguistics techniques for the automatic extraction of lexical bundles and, through them, draw conclusions about how these bundles serve to express the support she provides to her father.
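Lexical bundles are simply recurrent n-grams above a frequency threshold, so the extraction step the study automates can be sketched in a few lines. The toy example below runs on an invented English string rather than the Greek text of the Alexiad, and the window size and threshold are arbitrary choices for illustration.

```python
from collections import Counter

def lexical_bundles(text, n=3, min_freq=2):
    """Return the n-grams (as word tuples) that occur at least
    min_freq times in the text, i.e. candidate lexical bundles."""
    words = text.lower().split()
    grams = Counter(tuple(words[i:i + n]) for i in range(len(words) - n + 1))
    return {g: c for g, c in grams.items() if c >= min_freq}

toy = ("the emperor alexius comnenus led the army and "
       "the emperor alexius comnenus defended the city")
print(lexical_bundles(toy))
```

A discourse-analysis step would then classify the recovered bundles (here, those around the emperor's name) by the stance or evaluation they encode.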

Keywords: lexical bundles, computational literature, critical discourse analysis, Alexiad

Procedia PDF Downloads 603
3161 Review of Concepts and Tools Applied to Assess Risks Associated with Food Imports

Authors: A. Falenski, A. Kaesbohrer, M. Filter

Abstract:

Introduction: Risk assessments can be performed in various ways and at different degrees of complexity. In order to assess risks associated with imported foods, additional information needs to be taken into account compared to a risk assessment of regional products. The present review is an overview of currently available best-practice approaches and data sources used for food import risk assessments (IRAs). Methods: A literature review was performed. PubMed was searched for articles about food IRAs published in the years 2004 to 2014 (English and German texts only, search string “(English [la] OR German [la]) (2004:2014 [dp]) import [ti] risk”). Titles and abstracts were screened for import risks in the context of IRAs. The selected publications were analysed according to a predefined questionnaire extracting the following information: risk assessment guidelines followed, modelling methods used, data and software applied, and the existence of an analysis of uncertainty and variability. IRAs cited in these publications were also included in the analysis. Results: The PubMed search returned 49 publications, 17 of which contained information about import risks and risk assessments. Within these, 19 cross-references were identified as relevant to the present study, including original articles, reviews, and guidelines. Each IRA referenced at least one of the guidelines of the World Organisation for Animal Health (OIE), for the import of animals, or of the Codex Alimentarius Commission, for imports concerning foods. Interestingly, a combination of both was also used to assess the risk associated with the import of live animals serving as a source of food. Methods ranged from fully quantitative IRAs using probabilistic models and dose-response models to qualitative IRAs in which decision trees or severity tables were set up using parameter estimates based on expert opinion. Calculations were done using @Risk, R, or Excel.
Most heterogeneous was the type of data used, ranging from general information on imported goods (food, live animals) to pathogen prevalence in the country of origin. These data were either publicly available in databases or lists (e.g., OIE WAHID and Handystatus II, FAOSTAT, Eurostat, TRACES), accessible at a national level (e.g., herd information), or open only to a small group of people (flight-passenger import data at a national airport customs office). An uncertainty analysis was mentioned in some of the IRAs, but actual calculations were performed only in a few. Conclusion: The current state of the art in assessing the risks of imported foods is characterized by great heterogeneity in both general methodology and the data used. Often, information is gathered on a case-by-case basis and reformatted by hand in order to perform the IRA. This analysis therefore illustrates the need for a flexible, modular framework supporting the connection of existing data sources with data analysis and modelling tools. Such an infrastructure could pave the way to IRA workflows that can be applied ad hoc, e.g., in a crisis situation.
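A fully quantitative IRA of the kind surveyed propagates uncertain inputs (such as pathogen prevalence in the country of origin) through a stochastic model of the import volume. As a minimal Monte Carlo sketch in Python, with entirely invented parameter values (the reviewed studies used @Risk, R, or Excel instead):

```python
import random

def expected_contaminated_lots(n_sims=2_000, seed=42):
    """Monte Carlo estimate of contaminated import lots per year.
    The uniform prevalence bounds and the annual volume of 500 lots
    are invented placeholders; each lot is contaminated independently."""
    rng = random.Random(seed)
    totals = []
    for _ in range(n_sims):
        prevalence = rng.uniform(0.001, 0.01)   # uncertain input parameter
        lots = 500                              # annual import volume
        totals.append(sum(rng.random() < prevalence for _ in range(lots)))
    return sum(totals) / n_sims

print(round(expected_contaminated_lots(), 2))  # analytic mean: 500 * 0.0055 = 2.75
```

Reporting the full distribution of the simulated totals, rather than only the mean, is what distinguishes the probabilistic IRAs in the review from point-estimate approaches.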

Keywords: import risk assessment, review, tools, food import

Procedia PDF Downloads 290
3160 Application of Data Driven Based Models as Early Warning Tools of High Stream Flow Events and Floods

Authors: Mohammed Seyam, Faridah Othman, Ahmed El-Shafie

Abstract:

The early warning of high stream flow (HSF) events and floods is an important aspect of the management of surface water and river systems. This process can be performed using either process-based models or data-driven models such as artificial intelligence (AI) techniques. The main goal of this study is to develop an efficient AI-based model for predicting real-time hourly stream flow (Q) and to apply it as an early warning tool for HSF events and floods in the downstream area of the Selangor River basin, taken here as a paradigm of humid tropical rivers in Southeast Asia. The performance of the AI-based models has been improved through the integration of lag time (Lt) estimation into the modelling process. A total of 8753 patterns of hourly Q, water level, and rainfall (RF) records, representing a one-year period (2011), were utilized in the modelling process. Six hydrological scenarios were arranged through hypothetical cases of input variables to investigate how changes in RF intensity at upstream stations can lead to the formation of floods; the initial stream flow was varied for each scenario in order to cover a wide range of hydrological situations. The performance evaluation of the developed AI-based model shows that a high correlation coefficient (R) between the observed and predicted Q is achieved. The AI-based model has been successfully employed for early warning through the advance detection of the hydrological conditions that could lead to the formation of floods and HSF events, represented by three levels of severity (i.e., alert, warning, and danger). Based on the results of the scenarios, reaching the danger level in the downstream area required high RF intensity in at least two upstream areas. It can be concluded that AI-based models are beneficial tools for local authorities for flood control and awareness.
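The three-level early-warning logic described above amounts to mapping the model's predicted hourly stream flow to a severity band. A sketch of that final step, with placeholder thresholds that are not values from the study:

```python
def warning_level(q_predicted, alert=150.0, warning=250.0, danger=350.0):
    """Map a predicted stream flow Q (m^3/s; the three thresholds here
    are invented placeholders) to one of the study's severity levels."""
    if q_predicted >= danger:
        return "danger"
    if q_predicted >= warning:
        return "warning"
    if q_predicted >= alert:
        return "alert"
    return "normal"

for q in (100, 180, 300, 400):
    print(q, warning_level(q))
```

In practice the thresholds would be calibrated from the basin's flood history, and `q_predicted` would come from the AI model driven by lagged upstream rainfall and water-level inputs.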

Keywords: floods, stream flow, hydrological modelling, hydrology, artificial intelligence

Procedia PDF Downloads 228
3159 Investigations on the Application of Avalanche Simulations: A Survey Conducted among Avalanche Experts

Authors: Korbinian Schmidtner, Rudolf Sailer, Perry Bartelt, Wolfgang Fellin, Jan-Thomas Fischer, Matthias Granig

Abstract:

This study focuses on the evaluation of snow avalanche simulations, based on a survey that was carried out among avalanche experts. In recent decades, the application of avalanche simulation tools has gained recognition within the realm of hazard management. Traditionally, avalanche runout models were used to predict extreme avalanche runout and prepare avalanche maps. This has changed rather dramatically with the application of numerical models. For safety applications such as road safety, simulation tools are now being coupled with real-time meteorological measurements to predict frequent avalanche hazard. That places new demands on model accuracy and requires the simulation of physical processes that previously could be ignored. These simulation tools are based on a deterministic description of avalanche movement, allowing certain quantities of the avalanche flow (e.g. pressure, velocities, flow heights, runout lengths) to be predicted. Because of the highly variable regimes of flowing snow, no uniform rheological law describing the motion of an avalanche is known; therefore, analogies are drawn to the fluid-dynamical laws of other materials. To transfer these constitutive laws to snow flows, certain assumptions and adjustments have to be imposed. Besides these limitations, there are high uncertainties regarding the initial and boundary conditions. Further challenges arise when implementing the underlying flow-model equations in an algorithm executable by a computer: the implementation is constrained by the choice of adequate numerical methods and their computational feasibility, compelling model development to introduce further simplifications and related uncertainties. In the light of these issues, many questions arise about avalanche simulations: their assets and drawbacks, their potential for improvement, and their application in practice.
To address these questions, a survey was conducted among experts in the field of avalanche science (e.g. researchers, practitioners, engineers) from various countries. In the questionnaire, special attention is given to the experts' opinions regarding the influence of certain variables on the simulation result, their uncertainty, and the reliability of the results. Furthermore, it was tested to what degree a simulation result influences decision making in a hazard assessment. A discrepancy was found between the large uncertainty of the simulation input parameters and the comparatively high reliability attributed to the results. This contradiction can be explained by taking into account how the experts employ the simulations: the credibility of the simulations is the result of a rather thorough simulation study, in which different assumptions are tested and the results of different flow models are compared, along with the use of supplemental data such as chronicles, field observations, and silent witnesses, among others, which are regarded as essential for the hazard assessment and for sanctioning simulation results. As the importance of avalanche simulations within hazard management grows along with their further development, studies focusing on modeling practice could contribute to a better understanding of how knowledge of the avalanche process can be gained by running simulations.

Keywords: expert interview, hazard management, modeling, simulation, snow avalanche

Procedia PDF Downloads 311
3158 Creating a Dementia-Friendly Community

Authors: Annika Kjallman Alm, Ove Hellzen, Malin Rising-Homlstrom

Abstract:

The concept of dementia-friendly communities focuses on the lived experience of people who have dementia and is most relevant to addressing their needs and the needs of those who live with and provide support for them. The goal of communities becoming dementia-friendly is for dementia to be normalized and recognized as a disabling condition. People with dementia find being connected to self, to others, and to the environment through meaningful activities important. According to the concept underlying dementia-friendly communities, people with dementia or cognitive decline can continue to live in the community if their residential community has sufficiently strong social capital. The aim of this study is to explore staff and leaders' experiences in implementing interventions to create a more inclusive, dementia-friendly community. A municipality in northern Sweden with a population of approx. 100,000 inhabitants decided to create a dementia-friendly municipality. As part of the initiative, a Centre for Support was established. The Centre offered support for both individuals and groups, made home visits, and provided information about dementia. Interviews were conducted with staff who had undergone training in a structured form of multidimensional support, the PER-model®, and worked at the Centre for Support. The staff consisted of registered nurses, occupational therapists, and specialized nurses who had worked there for more than five years, and all had training in dementia care. All interviews were audio-recorded and transcribed verbatim. The transcribed data were analyzed using qualitative content analysis. The results suggest that implementing the PER-model® of support for persons in the early stages of dementia and their next of kin added a much-needed form of support and perceived possibilities to enhance daily life in the early stages of dementia. The staff appreciated that the structure of the PER-model® was evidence based.
They also realized that they had never even considered that the person with dementia also needs support in the early stages, but that they now had tools for that as well. Creating a dementia-friendly municipality offering different kinds of support for all stages of dementia is a challenge. However, evidence-based tools and a broad spectrum of different types of support, whether individual or group based, are needed to meet everyone's needs. A conviction that all citizens are equal and should be involved in the community is a strong motivator.

Keywords: dementia, dementia-friendly, municipality, support

Procedia PDF Downloads 141
3157 An Overview of the SIAFIM Connected Resources

Authors: Tiberiu Boros, Angela Ionita, Maria Visan

Abstract:

Wildfires are among the frequent and uncontrollable phenomena that currently affect large areas of the world, where climate, geographic, and social conditions make it impossible to prevent and control such events. In this paper, we introduce the concepts that underlie the SIAFIM (Satellite Image Analysis for Fire Monitoring) project in order to create a context, and we present a set of newly created tools that are external to the project but inherently useful in interventions and complex decision making based on geospatial information and spatial data infrastructures.

Keywords: wildfire, forest fire, natural language processing, mobile applications, communication, GPS

Procedia PDF Downloads 563
3156 Effectiveness of High-Intensity Interval Training in Overweight Individuals between 25-45 Years of Age Registered in Sports Medicine Clinic, General Hospital Kalutara

Authors: Dimuthu Manage

Abstract:

Introduction: The prevalence of obesity and obesity-related non-communicable diseases is becoming a massive health concern across the whole world. Physical activity is recognized as an effective solution to this problem, but published data on the effectiveness of high-intensity interval training (HIIT) in improving health parameters in overweight and obese individuals in Sri Lanka are sparse; hence this study was conducted. Methodology: This quasi-experimental study was conducted at the Sports Medicine Clinic, General Hospital, Kalutara. Participants engaged in a programme of HIIT three times per week for six weeks. Data collection was based on precise measurements using structured and validated methods, and ethical clearance was obtained. Results: Forty-eight participants registered for the study, and only 52% completed it. The mean age was 32 (SD = 6.397) years, with 64% males. All the anthropometric measurements assessed (i.e., waist circumference (P<0.001), weight (P<0.001), and BMI (P<0.001)), body fat percentage (P<0.001), VO2 max (P<0.001), and the lipid profile (i.e., HDL (P=0.016), LDL (P<0.001), cholesterol (P<0.001), triglycerides (P<0.010), and LDL:HDL ratio (P<0.001)) showed statistically significant improvement after the intervention with the HIIT programme. Conclusions: This study confirms HIIT as a time-saving and effective exercise method that helps in preventing obesity as well as non-communicable diseases. HIIT markedly improves body anthropometry, fat percentage, cardiopulmonary status, and the lipid profile in overweight and obese individuals. As with the majority of studies, the design of the current study is subject to some limitations. First, the study was conducted on a single cohort without a comparison group; contrasting HIIT with other training programmes would have given the findings more validity.
Second, although validated tools were used to measure the variables, and the same tools were used on the pre- and post-exercise occasions within the available facilities, it would have been better to measure some of the variables using gold-standard methods. This evidence should therefore be further assessed in larger-scale trials using comparison groups in order to generalize the efficacy of the HIIT exercise programme.
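The pre/post significance values reported above rest on paired comparisons of the same participants before and after the programme. As an illustrative sketch of the underlying statistic only (pure Python, with made-up sample values rather than the study's data), the paired t statistic is:

```python
import math

def paired_t(pre, post):
    """Paired t statistic for pre/post measurements on the same subjects:
    t = mean(d) / (sd(d) / sqrt(n)), where d = post - pre."""
    d = [b - a for a, b in zip(pre, post)]
    n = len(d)
    mean = sum(d) / n
    var = sum((x - mean) ** 2 for x in d) / (n - 1)   # sample variance of d
    return mean / math.sqrt(var / n)

# Made-up example: weight (kg) before and after a six-week programme
pre = [92.0, 88.5, 101.2, 95.4, 90.3]
post = [89.1, 86.0, 98.0, 93.9, 88.2]
print(round(paired_t(pre, post), 3))
```

A large negative t on the difference post minus pre (weight fell consistently) is what drives P-values like those reported for the anthropometric measures.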

Keywords: HIIT, lipid profile, BMI, VO2 max

Procedia PDF Downloads 53
3155 Animations for Teaching Food Chemistry: A Design Approach for Linking Chemistry Theory to Everyday Food

Authors: Paulomi (Polly) Burey, Zoe Lynch

Abstract:

In STEM education, students often have difficulty linking static images and words from textbooks or online resources to the underlying mechanisms of the topic of study. This can dissuade some students from pursuing study in the physical and chemical sciences. A growing movement among present-day students demonstrates that the YouTube generation feels they learn best from video or dynamic, interactive learning tools, and will seek these out as alternatives to their textbooks and the classroom learning environment. Chemistry, and in particular the visualization of molecular structures in everyday materials, can prove difficult to comprehend without significant interaction with the teacher of the content and concepts beyond the timeframe of a typical class. This can create a learning hurdle for distance education students, so it is necessary to provide strong electronic tools and resources to aid their learning. As one such electronic resource, an animation design approach linking everyday materials to their underlying chemistry would be beneficial for student learning, the focus here being on food. The animations were designed and storyboarded with a scaling approach: they commence with a focus on the food material itself and its component parts, followed by animated transitions to its underlying microstructure and identifying features, and finally show the molecules responsible for these microstructural features. Each animation ends with a reverse transition back through the molecular structure and microstructure to the original food material, and also animates some reactions that may occur during food processing to demonstrate the purpose of the underlying chemistry and how it affects the food we eat.
This cyclical approach, which uses students' existing knowledge of food to guide them towards more complex knowledge and then reinforces their learning by linking back to that prior knowledge, enhances student understanding. Food is also an ideal material system for students to interact with in a hands-on manner to further reinforce their learning. The animations were launched this year in a second-year university food chemistry course, with improved learning outcomes for the cohort.

Keywords: chemistry, food science, future pedagogy, STEM Education

Procedia PDF Downloads 136
3154 A Computerized Tool for Predicting Future Reading Abilities in Pre-Readers Children

Authors: Stephanie Ducrot, Marie Vernet, Eve Meiss, Yves Chaix

Abstract:

Learning to read is a key topic of debate today, both in terms of its implications for school failure and illiteracy and regarding which teaching methods are best to develop. It is estimated today that four to six percent of school-age children suffer from specific developmental disorders that impair learning. Findings from people with dyslexia and from typically developing readers suggest that the problems children experience in learning to read are related to the preliteracy skills they bring with them from kindergarten. Most tools available to professionals are designed for the evaluation of children's language problems; in comparison, there are very few tools for assessing the relations between visual skills and the process of learning to read. Recent literature reports that visual-motor skills and visual-spatial attention in preschoolers are important predictors of reading development. The main goal of this study was therefore to improve screening for future reading difficulties in preschool children. We used a prospective, longitudinal approach in which oculomotor processes (assessed with the DiagLECT test) were measured in pre-readers, and the impact of these skills on future reading development was explored. The DiagLECT test specifically measures the online time taken to name numbers arranged irregularly in horizontal rows (horizontal time, HT) and the time taken to name numbers arranged in vertical columns (vertical time, VT). A total of 131 preschoolers took part in this study. At Time 0 (kindergarten), the mean VT, HT, and errors were recorded; one year later, at Time 1, the reading level of the same children was evaluated. Firstly, this study allowed us to provide normative data for a standardized evaluation of oculomotor skills in 5- and 6-year-old children. The data also revealed that 25% of our sample of preschoolers showed oculomotor impairments (without any clinical complaints).
Finally, the results of this study assessed the validity of the DiagLECT test for predicting reading outcomes: the better a child's oculomotor skills, the better his or her future reading abilities.
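Screening against normative data of the kind this study provides typically reduces to a z-score cutoff on the naming times. The sketch below illustrates that logic only: the normative means, standard deviations, and the 1.65 SD cutoff are invented for the example and are not the study's published norms.

```python
def z_score(value, mean, sd):
    """Standardized deviation of a measurement from an age norm."""
    return (value - mean) / sd

def is_impaired(ht, vt, norms):
    """Flag possible oculomotor impairment when either naming time is
    more than 1.65 SD slower than the age norm (invented cutoff)."""
    return (z_score(ht, *norms["HT"]) > 1.65 or
            z_score(vt, *norms["VT"]) > 1.65)

# Invented normative values (seconds): (mean, sd) per measure
norms = {"HT": (32.0, 6.0), "VT": (40.0, 8.0)}
print(is_impaired(ht=45.0, vt=41.0, norms=norms))   # slow HT → flagged
```

With real norms per age band, the same two-line rule would implement the screening use of the test described above.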

Keywords: vision, attention, oculomotor processes, reading, preschoolers

Procedia PDF Downloads 129
3153 When Food Cultures Meet: The Fur Trade Era on the North American Plains

Authors: C. Thomas Shay

Abstract:

When cultures meet, so do their foods. Beginning in the seventeenth century, European explorers, missionaries, and fur traders entered the North American Great Plains, bringing with them deadly weapons, metal tools, and a host of trade goods. Over time, they also brought barrels of their favorite comestibles, even candied ginger. While Indigenous groups actively bartered for the material goods, there was limited interest in European foods, mainly because these groups possessed a rich cuisine of their own.

Keywords: Native Americans, Europeans, Great Plains, fur trade, food

Procedia PDF Downloads 99