Search results for: ICT Deployment
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 446

146 A Literature Review on Emotion Recognition Using Wireless Body Area Network

Authors: Christodoulou Christos, Politis Anastasios

Abstract:

The utilization of Wireless Body Area Network (WBAN) is experiencing a notable surge in popularity as a result of its widespread implementation in the field of smart health. WBANs utilize small sensors implanted within the human body to monitor and record physiological indicators. These sensors transmit the collected data to hospitals and healthcare facilities through designated access points. Bio-sensors exhibit a diverse array of shapes and sizes, and their deployment can be tailored to the condition of the individual. Multiple sensors may be strategically placed within, on, or around the human body to effectively observe, record, and transmit essential physiological indicators. These measurements serve as a basis for subsequent analysis, evaluation, and therapeutic interventions. In conjunction with physical health concerns, numerous smartwatches are engineered to employ artificial intelligence techniques for the purpose of detecting mental health conditions such as depression and anxiety. The utilization of smartwatches serves as a secure and cost-effective solution for monitoring mental health. Physiological signals are widely regarded as a highly dependable method for the recognition of emotions due to the inherent inability of individuals to deliberately influence them over extended periods of time. The techniques that WBANs employ to recognize emotions are thoroughly examined in this article.

Keywords: emotion recognition, wireless body area network, WBAN, ERC, wearable devices, physiological signals, emotion, smartwatch, prediction

Procedia PDF Downloads 13
145 Digital Manufacturing: Evolution and a Process Oriented Approach to Align with Business Strategy

Authors: Abhimanyu Pati, Prabir K. Bandyopadhyay

Abstract:

The paper intends to highlight the significance of Digital Manufacturing (DM) strategy in supporting and achieving the business strategy and goals of any manufacturing organization. Towards this end, DM initiatives have been given a process perspective, while not undermining their technological significance, with a view to linking their benefits directly with the fulfilment of customer needs and expectations in a responsive and cost-effective manner. A digital process model has been proposed to categorize digitally enabled organizational processes with a view to creating synergistic groups, which adopt and use digital tools having similar characteristics and functionalities. This opens up future opportunities for researchers and developers to create a unified technology environment for the integration and orchestration of processes. Secondly, an effort has been made to apply the “what” and “how” features of the Quality Function Deployment (QFD) framework to establish the relationship between customers’ needs – for both external and internal customers – and the features of the various digital processes which support the achievement of these customer expectations. The paper finally concludes that in the present highly competitive environment, business organizations cannot sustain themselves unless they understand the significance of digital strategy and integrate it with their business strategy with a clearly defined implementation roadmap. A process-oriented approach to DM strategy will help business executives and leaders appreciate its value propositions and its direct link to an organization’s competitiveness.
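
As a hedged illustration of the “what”/“how” mapping described above, the sketch below scores hypothetical digital processes against weighted customer needs using the classic 0/1/3/9 QFD relationship scale; all names, weights and scores are illustrative assumptions, not data from the paper.

```python
# Hypothetical QFD "what vs. how" scoring: customer needs ("whats") carry
# importance weights, digital processes ("hows") are scored against each need,
# and a weighted priority per process is computed.

needs = {                     # "whats" with relative importance weights (assumed)
    "responsiveness": 5,
    "cost_effectiveness": 4,
    "product_quality": 3,
}

# Relationship matrix: strength of support (0, 1, 3, 9 as in classic QFD).
relationships = {
    "digital_planning":      {"responsiveness": 9, "cost_effectiveness": 3, "product_quality": 1},
    "shopfloor_analytics":   {"responsiveness": 3, "cost_effectiveness": 9, "product_quality": 9},
    "virtual_commissioning": {"responsiveness": 1, "cost_effectiveness": 3, "product_quality": 9},
}

def process_priorities(needs, relationships):
    """Weighted importance of each digital process ('how')."""
    return {
        proc: sum(needs[n] * strength for n, strength in row.items())
        for proc, row in relationships.items()
    }

print(process_priorities(needs, relationships))
```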

Keywords: digital manufacturing, digital process model, quality function deployment, business strategy

Procedia PDF Downloads 282
144 The DAQ Debugger for iFDAQ of the COMPASS Experiment

Authors: Y. Bai, M. Bodlak, V. Frolov, S. Huber, V. Jary, I. Konorov, D. Levit, J. Novy, D. Steffen, O. Subrt, M. Virius

Abstract:

In general, state-of-the-art Data Acquisition Systems (DAQ) in high energy physics experiments must satisfy high requirements in terms of reliability, efficiency and data rate capability. This paper presents the development and deployment of a debugging tool named DAQ Debugger for the intelligent, FPGA-based Data Acquisition System (iFDAQ) of the COMPASS experiment at CERN. Utilizing a hardware event builder, the iFDAQ is designed to read out data at the experiment's average maximum rate of 1.5 GB/s. In complex software such as the iFDAQ, comprising thousands of lines of code, debugging is essential to reveal all software issues. Unfortunately, conventional debugging of the iFDAQ is not possible during real data taking. The DAQ Debugger is a tool for identifying a problem, isolating its source, and then either correcting the problem or determining a way to work around it. It provides a layer that is easily integrated into any process and has no impact on process performance. Based on the handling of system signals, the DAQ Debugger represents an alternative to the conventional debuggers provided by most integrated development environments. Whenever a problem occurs, it generates reports containing all information needed for deeper investigation and analysis. The DAQ Debugger was fully incorporated into all iFDAQ processes during the 2016 run. It helped to reveal the remaining software issues and significantly improved the stability of the system in comparison with the previous run. In the paper, we present the DAQ Debugger from several perspectives and discuss it in detail.
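
The report-on-signal mechanism can be pictured with the minimal Python sketch below; the actual DAQ Debugger is a C++/Qt component of the iFDAQ, and the report contents, file name and handled signals here are assumptions for illustration only.

```python
# Minimal sketch of signal-based report generation, illustrating the idea behind
# the DAQ Debugger (not the real implementation).
import os, signal, sys, traceback, datetime

def write_report(signum, frame):
    """On receiving a signal, dump a report with the current stack and exit."""
    report = (
        f"process: {os.getpid()}\n"
        f"time:    {datetime.datetime.now().isoformat()}\n"
        f"signal:  {signal.Signals(signum).name}\n"
        f"stack:\n{''.join(traceback.format_stack(frame))}"
    )
    with open(f"daq_report_{os.getpid()}.txt", "w") as f:
        f.write(report)
    sys.exit(1)

# Register handlers for the signals we want to intercept.
for sig in (signal.SIGTERM, signal.SIGABRT, signal.SIGUSR1):
    signal.signal(sig, write_report)
```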

Keywords: DAQ Debugger, data acquisition system, FPGA, system signals, Qt framework

Procedia PDF Downloads 254
143 Numerical Investigation of a Spiral Bladed Tidal Turbine

Authors: Mohammad Fereidoonnezhad, Seán Leen, Stephen Nash, Patrick McGarry

Abstract:

From the perspective of research innovation, the tidal energy industry is still in its early stages. While a very small number of turbines have progressed to utility-scale deployment, blade breakage is commonly reported due to the enormous hydrodynamic loading applied to devices. The aim of this study is the development of computer simulation technologies for the design of next-generation fibre-reinforced composite tidal turbines. This will require significant technical advances in the areas of tidal turbine testing and multi-scale computational modelling. The complex turbine blade profiles are designed to incorporate non-linear distributions of airfoil sections to optimize power output and self-starting capability while reducing power fluctuations. A number of candidate blade geometries are investigated, ranging from spiral geometries to parabolic geometries, with blades arranged in both cylindrical and spherical configurations on a vertical axis turbine. A combined blade element theory (BET-start-up model) is developed in MATLAB to perform computationally efficient parametric design optimisation for a range of turbine blade geometries. Finite element models are developed to identify optimal fibre-reinforced composite designs to increase blade strength and fatigue life. Advanced fluid-structure-interaction models are also carried out to compute blade deflections following design optimisation.

Keywords: tidal turbine, composite materials, fluid-structure-interaction, start-up capability

Procedia PDF Downloads 86
142 Integrating Cost-Benefit Assessment and Contract Design to Support Industrial Symbiosis Deployment

Authors: Robin Molinier

Abstract:

Industrial symbiosis (I.S.) is the realization of Industrial Ecology (I.E.) principles in operating production systems. I.S. consists of the use of waste materials, unavoidable ("fatal") energy, recirculated utilities and infrastructure/service sharing as resources for production. Environmental benefits can be achieved from resource conservation, but economic profitability is required by the participating actors. I.S. indeed involves several actors with their own objectives and resources, so that each one must be satisfied by ex-ante arrangements before committing to I.S. execution (investments and transactions). Following the resource-based view of transactions, we build a modular framework to assess global I.S. profitability and to specify each actor's contributions to costs and benefits in line with their resource endowments and performance requirements. The specificities of I.S. projects implied by the need for customization (asset specificity, non-homogeneity) induce the use of long-term contracts for transactions, following transaction cost economics arguments. Thus we first propose a taxonomy of cost and value drivers for I.S. and assign to each actor the I.S.-specific risks that we identified: load profile mismatch, quality problems and value fluctuations. Then appropriate contractual guidelines (pricing, cost sharing and warranties) that support mutual profitability are derived from the detailed identification of contributions by the cost-benefit model. This analytical framework helps identify what to focus on when bargaining over contracting for transactions and investments. Our methodology is applied to I.S. archetypes drawn from a literature survey of eco-industrial park initiatives and practitioner interviews.
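
A toy sketch of the actor-level cost/benefit assignment idea follows: each actor's contributions (investments, transaction costs) and benefits are aggregated to check that every participant remains profitable ex ante. All figures and actor names are illustrative assumptions, not values or categories from the paper.

```python
from dataclasses import dataclass

@dataclass
class Actor:
    name: str
    investments: float = 0.0          # specific assets financed by the actor
    transaction_costs: float = 0.0    # contracting, monitoring, logistics
    benefits: float = 0.0             # avoided disposal costs, cheaper inputs, revenues

    @property
    def net_value(self) -> float:
        return self.benefits - self.investments - self.transaction_costs

actors = [
    Actor("waste_producer", investments=50, transaction_costs=10, benefits=90),
    Actor("waste_user", investments=120, transaction_costs=15, benefits=160),
]

for a in actors:
    print(f"{a.name}: net value = {a.net_value} -> {'commit' if a.net_value > 0 else 'renegotiate'}")
print("global I.S. value:", sum(a.net_value for a in actors))
```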

Keywords: contracts, cost-benefit analysis, industrial symbiosis, risks

Procedia PDF Downloads 309
141 Use of Galileo Advanced Features in Maritime Domain

Authors: Olivier Chaigneau, Damianos Oikonomidis, Marie-Cecile Delmas

Abstract:

GAMBAS (Galileo Advanced features for the Maritime domain: Breakthrough Applications for Safety and security) is a project funded by the European Union Agency for the Space Programme (EUSPA) aiming at identifying the search-and-rescue and ship security alert system needs of maritime users (including operators and fishing stakeholders) and developing operational concepts to answer these needs. The general objective of the GAMBAS project is to support the deployment of Galileo exclusive features in the maritime domain in order to improve safety and security at sea, the detection of illegal activities and the associated surveillance means, and resilience to natural and human-induced emergency situations, and to develop, integrate, demonstrate, standardize and disseminate these new associated capabilities. The project aims to demonstrate: improvement of the SAR (Search and Rescue) and SSAS (Ship Security Alert System) detection and response to maritime distress through the integration of new features into the SSAS beacon, in terms of cost optimization, user-friendliness, integration of Galileo and OSNMA (Open Service Navigation Message Authentication) reception for improved authenticated localization performance and reliability, and at-sea triggering capabilities; optimization of the responsiveness of RCCs (Rescue Co-ordination Centres) to distress situations affecting vessels; and the adaptation of the MCCs (Mission Control Centres) and MEOLUT (Medium Earth Orbit Local User Terminal) to the data distribution of SSAS alerts.

Keywords: Galileo new advanced features, maritime, safety, security

Procedia PDF Downloads 56
140 Digital Platform for Psychological Assessment Supported by Sensors and Efficiency Algorithms

Authors: Francisco M. Silva

Abstract:

Technology is evolving, creating an impact on our everyday lives and on the telehealth industry. Telehealth encapsulates the provision of healthcare services and information via a technological approach. There are several benefits of using web-based methods to provide healthcare help. Nonetheless, few health and psychological help approaches combine this method with wearable sensors. This paper aims to create an online platform for users to receive self-care help and information using wearable sensors; in addition, researchers developing similar projects obtain a solid foundation as a reference. The study provides descriptions and analyses of the software and hardware architecture. It exhibits and explains a dynamic and efficient heart rate algorithm that continuously calculates the desired sensor values, and presents diagrams that illustrate the website deployment process and the way the web server handles the sensors' data. The goal is to create a working project using Arduino-compatible hardware. Heart rate sensors send their data values to an online platform. A microcontroller board uses an algorithm to calculate the heart rate values from the sensor and outputs them to a web server. The platform visualizes the sensor's data, summarizes it in a report, and creates alerts for the user. Results showed a solid project structure and communication between the hardware and software. The web server displays the conveyed heart rate sensor data on the online platform, presenting observations and evaluations.
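
A minimal sketch of the heart-rate path described above is given below: compute BPM from a window of inter-beat intervals and post the value to the web platform. The endpoint URL and payload format are assumptions (the actual firmware runs on an Arduino-compatible board); only the BPM calculation is exercised here.

```python
import statistics
import urllib.request, json

def bpm_from_intervals(intervals_ms):
    """Beats per minute from a rolling window of inter-beat intervals (ms)."""
    return 60000.0 / statistics.mean(intervals_ms)

def send_reading(bpm, url="http://example.com/api/heart-rate"):  # hypothetical endpoint
    payload = json.dumps({"bpm": round(bpm, 1)}).encode()
    req = urllib.request.Request(url, data=payload,
                                 headers={"Content-Type": "application/json"})
    return urllib.request.urlopen(req, timeout=5)

intervals = [820, 790, 805, 810]          # example inter-beat intervals in ms
print(f"heart rate: {bpm_from_intervals(intervals):.1f} BPM")
```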

Keywords: Arduino, heart rate BPM, microcontroller board, telehealth, wearable sensors, web-based healthcare

Procedia PDF Downloads 96
139 Load Balancing Technique for Energy Efficiency in Cloud Computing

Authors: Rani Danavath, V. B. Narsimha

Abstract:

Cloud computing is emerging as a new paradigm of large-scale distributed computing. Cloud computing is a model for enabling ubiquitous, convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, and services) that can be rapidly provisioned and released with minimal management effort or service provider interaction. This cloud model is composed of five essential characteristics, three service models, and four deployment models. Load balancing is one of the main challenges in cloud computing; it is required to distribute the dynamic workload across multiple nodes to ensure that no single node is overloaded. It helps in the optimal utilization of resources, enhancing the performance of the system. The goal of load balancing is to minimize resource consumption and the carbon emission rate, which is a direct need of cloud computing. This determines the need for new metrics, energy consumption and carbon emission, for energy-efficient load balancing techniques in cloud computing. Existing load balancing techniques mainly focus on reducing overhead, service response time and improving performance, etc., but none of them have considered energy consumption and carbon emission. Therefore, in this paper we introduce a load balancing technique oriented towards energy efficiency. This energy-efficient load balancing technique can be used to improve the performance of cloud computing by balancing the workload across all the nodes in the cloud with minimum resource utilization, in turn reducing energy consumption and carbon emission to an extent, which will help to achieve green computing.
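
The general idea of energy-aware placement can be sketched as follows: each task goes to the node that incurs the smallest estimated additional energy while staying within capacity. The energy model (a fixed power cost per unit load) and all numbers are illustrative assumptions, not the paper's technique.

```python
def assign_tasks(tasks, nodes):
    """tasks: list of loads; nodes: dict name -> {'capacity', 'load', 'watts_per_unit'}."""
    placement = {}
    for i, load in enumerate(tasks):
        candidates = [n for n, s in nodes.items() if s["load"] + load <= s["capacity"]]
        # Pick the feasible node with the cheapest energy for this task.
        best = min(candidates, key=lambda n: load * nodes[n]["watts_per_unit"])
        nodes[best]["load"] += load
        placement[i] = best
    return placement

nodes = {
    "n1": {"capacity": 100, "load": 0, "watts_per_unit": 1.2},
    "n2": {"capacity": 100, "load": 0, "watts_per_unit": 0.8},   # more energy-efficient node
}
print(assign_tasks([30, 20, 50, 40], nodes))
```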

Keywords: cloud computing, distributed computing, energy efficiency, green computing, load balancing, energy consumption, carbon emission

Procedia PDF Downloads 414
138 Sustainability of Photovoltaic Recycling Planning

Authors: Jun-Ki Choi

Abstract:

The usage of valuable resources and the potential for waste generation at the end of the life cycle of photovoltaic (PV) technologies necessitate proactive planning for a PV recycling infrastructure. To ensure the sustainability of PV at large scales of deployment, it is vital to develop and institute low-cost recycling technologies and infrastructure for the emerging PV industry in parallel with the rapid commercialization of these new technologies. There are various issues involved in the economics of PV recycling, and this research examines them at macro and micro levels, developing a holistic interpretation of the economic viability of PV recycling systems. This study developed mathematical models to analyze the profitability of recycling technologies and to guide tactical decisions for allocating the optimal locations of PV take-back centers (PVTBC), necessary for the collection of end-of-life products. The economic decision is usually based on the marginal capital cost of each PVTBC, the cost of reverse logistics, the distance traveled, and the amount of PV waste collected from various locations. Results illustrated that reverse logistics costs comprise a major portion of the cost of a PVTBC; PV recycling centers can be constructed in optimally selected locations to minimize the total reverse logistics cost for transporting PV waste from the various collection facilities to the recycling center. At the micro process level, automated recycling processes should be developed to handle the large amount of growing PV waste economically. The market prices of the reclaimed materials are important factors in deciding the profitability of the recycling process, and this illustrates the importance of recovering the glass and expensive metals from PV modules.
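
A minimal sketch of the macro-level siting decision is shown below: the candidate take-back centre (PVTBC) location that minimises total reverse-logistics cost, modelled here simply as cost per km per tonne times distance times collected waste. All distances, tonnages and the unit cost are illustrative assumptions, not the study's data or full model.

```python
def total_logistics_cost(candidate, collection_sites, cost_per_km_tonne=0.15):
    return sum(cost_per_km_tonne * dist[candidate] * tonnes
               for dist, tonnes in collection_sites)

# Each collection site: (distances in km to each candidate PVTBC, PV waste in tonnes)
collection_sites = [
    ({"A": 40, "B": 120}, 300),
    ({"A": 90, "B": 30},  150),
    ({"A": 60, "B": 70},  220),
]

costs = {c: total_logistics_cost(c, collection_sites) for c in ("A", "B")}
best = min(costs, key=costs.get)
print(costs, "-> open PVTBC at", best)
```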

Keywords: photovoltaic, recycling, mathematical models, sustainability

Procedia PDF Downloads 216
137 Coherent All-Fiber and Polarization Maintaining Source for CO2 Range-Resolved Differential Absorption Lidar

Authors: Erwan Negre, Ewan J. O'Connor, Juha Toivonen

Abstract:

The need for CO2 monitoring technologies grows together with worldwide concerns regarding environmental challenges. To that purpose, we developed a compact coherent all-fiber range-resolved Differential Absorption Lidar (RR-DIAL). It has been designed around a tunable 2x1 fiber-optic switch toggling at a frequency of 1 Hz between two Distributed FeedBack (DFB) lasers emitting in continuous-wave mode at 1571.41 nm (a CO2 absorption line) and 1571.25 nm (a CO2 absorption-free line), with a linewidth and tuning range of respectively 1 MHz and 3 nm over the operating wavelength. A three-stage amplification through erbium and erbium-ytterbium doped fibers, coupled to a Radio Frequency (RF) driven Acousto-Optic Modulator (AOM), generates 100 ns pulses at a repetition rate from 10 to 30 kHz with a peak power up to 2.5 kW and a spatial resolution of 15 m, allowing fast and highly resolved CO2 profiles. The same afocal collection system is used for the output of the laser source and for the backscattered light, which is then directed to a circulator before being mixed with the local oscillator for heterodyne detection. Packaged in an easily transportable box which also includes a server and a Field Programmable Gate Array (FPGA) card for on-line data processing and storage, our setup allows effective and quick deployment for versatile in-situ analysis, whether it be vertical atmospheric monitoring, large field mapping or continuous oversight of sequestration sites. Setup operation and results from initial field measurements will be discussed.
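
For reference, the textbook range-resolved DIAL retrieval that such on-line/off-line measurements feed into can be sketched as follows; this is the standard formula, not necessarily the authors' exact processing chain, and the cross-section value and synthetic returns below are placeholders.

```python
import numpy as np

def dial_number_density(p_on, p_off, delta_r, delta_sigma):
    """N(R) = ln[(P_off(R+dR)*P_on(R)) / (P_on(R+dR)*P_off(R))] / (2*delta_sigma*delta_r)."""
    ratio = (p_off[1:] * p_on[:-1]) / (p_on[1:] * p_off[:-1])
    return np.log(ratio) / (2.0 * delta_sigma * delta_r)

# Synthetic backscatter returns just to show the call signature (arbitrary units).
p_on = np.array([1.00, 0.70, 0.48, 0.33])    # on-line (1571.41 nm) returns per 15 m gate
p_off = np.array([1.00, 0.80, 0.64, 0.51])   # off-line (1571.25 nm) returns per 15 m gate
print(dial_number_density(p_on, p_off, delta_r=15.0, delta_sigma=5e-27))  # molecules per m^3
```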

Keywords: CO2 profiles, coherent DIAL, in-situ atmospheric sensing, near infrared fiber source

Procedia PDF Downloads 103
136 Flirting with Ephemerality and the Daily Production of the Fleeting City

Authors: Rafael Martinez

Abstract:

Our view of cities is dominated by the built environment. Buildings, streets, avenues, bridges, flyovers, and so on virtually exclude anything not fixed, permanently alterable or indefinitely temporal. Yet, city environments can also be shaped by temporarily produced structures which, regardless of their transience, act as thresholds separating or segregating people and spaces. Academic works on cities conceptualize them, whether temporary or permanent, as tangible environments. This paper considers the idea of the ephemeral city, a city purposely produced and lived in as an impermanent, fluid and transitional environment resulting from an alignment of different forces. In particular, the paper proposes to observe how certain performative practices inform the emergence of ephemeral spaces in the city's daily life. With Singapore as its backdrop and focusing on foreign workers, the paper aims at documenting how everyday life practices, such as flirting, result in the production of transitional space, informed by semiotic blurs, and yet material, perceptible, human and tangible for some. In this paper, it is argued that flirting for Singapore's foreign workers entails a skillful understanding of what is proposed as the 'flirting cartography.' Thus, spatially, flirtation becomes not only a matter to be taken for granted but also a form of producing a fleeting space that requires the deployment of various techniques drawn upon particular knowledge. The paper is based upon a performative methodology which seeks to understand the praxis and rationale of the ephemerality of some spaces produced by foreign workers within this cosmopolitan city. By resorting to this methodological approach, the paper aims to establish how usually marginalized populations gain visibility through their ephemeral reclamation of public spaces in the city.

Keywords: ephemeral, flirting, Singapore, space

Procedia PDF Downloads 76
135 Development of Mobile Application for Internship Program Management Using the Concept of Model View Controller (MVC) Pattern

Authors: Shutchapol Chopvitayakun

Abstract:

Nowadays, and especially over the last five years, mobile devices, mobile applications and mobile users have been growing significantly bigger and stronger through the deployment of wireless communication and mobile cellular networks. They are being integrated with each other for multiple purposes and pervasive deployment in every business and non-business sector, such as education, medicine, travel, finance, real estate and many more. The objective of this study was to develop a mobile application for seniors, or last-year students, who enroll in the internship program at a tertiary school (undergraduate school) and do onsite practice at real field sites, real organizations and real workspaces. During the internship session, all student interns are required to practice, drill and train onsite at specific locations on specific tasks, possibly with some assignments from their supervisor. Their workplaces include both private and government corporations and enterprises. This mobile application is developed under the schema of a transactional processing system that enables users to keep a daily work or practice log, monitor true working locations, and follow the daily tasks of each trainee. Moreover, it provides useful guidance from each intern's advisor in case of emergency. Finally, it can summarize all transactional data and then calculate each intern's cumulated hours from the field practice session.
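
The Model-View-Controller split behind such an application can be pictured with the minimal sketch below; the real application targets Android, so the Python class and field names here are assumptions used only to show the separation of concerns and the cumulated-hours summary.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class WorkLog:                      # Model: one daily practice record
    day: date
    hours: float
    location: str
    task: str

class InternshipModel:              # Model: transactional store + summary
    def __init__(self):
        self.logs = []
    def add(self, log: WorkLog):
        self.logs.append(log)
    def total_hours(self) -> float:
        return sum(l.hours for l in self.logs)

class InternshipView:               # View: renders data for the intern/advisor
    def show_summary(self, total):
        print(f"Cumulated internship hours: {total}")

class InternshipController:         # Controller: receives UI events, updates model
    def __init__(self, model, view):
        self.model, self.view = model, view
    def log_day(self, **kwargs):
        self.model.add(WorkLog(**kwargs))
        self.view.show_summary(self.model.total_hours())

ctrl = InternshipController(InternshipModel(), InternshipView())
ctrl.log_day(day=date(2016, 6, 1), hours=8, location="Bangkok office", task="daily report")
```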

Keywords: internship, mobile application, Android OS, smart phone devices, mobile transactional processing system, guidance and monitoring, tertiary education, senior students, model view controller (MVC)

Procedia PDF Downloads 277
134 Simulation, Optimization, and Analysis Approach of Microgrid Systems

Authors: Saqib Ali

Abstract:

Energy sources are classified into two types depending upon whether they can be replenished. Those sources which cannot be revived into their original form once they are consumed are considered nonrenewable energy resources, i.e., coal and fuel. Those energy resources which return to their original condition even after being consumed are known as renewable energy resources (RES), i.e., wind, solar and hydel. Renewable energy is a cost-effective way to generate clean and green electrical energy, and nowadays the majority of countries are paying heed to energy generation from RES. Pakistan mostly relies on conventional energy resources, which are largely nonrenewable in nature; coal and fuel are among the major resources, and with the passage of time their prices are increasing. On the other hand, RES have great potential in the country, and with the deployment of RES, greater reliability and a more effective power system can be obtained. In this thesis, a similar concept is used and a hybrid power system is proposed which is composed of a mix of renewable and nonrenewable sources. The source side is composed of solar, wind and fuel cells, which are used in an optimal manner to serve the load. The goal is to provide an economical, reliable, uninterruptible power supply. This is achieved by optimal controllers (PI, PD, PID, FOPID). Optimization techniques are applied to the controllers to achieve the desired results; advanced algorithms (particle swarm optimization, the flower pollination algorithm) are used to extract the desired output from the controllers. A detailed comparison in the form of tables and results is provided, which highlights the efficiency of the proposed system.
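
The controller-tuning idea can be illustrated with the hedged sketch below: a tiny particle swarm searches PID gains that minimise the ITAE of a simple first-order plant. The plant model, gain bounds and PSO settings are assumptions for demonstration, not the thesis configuration (which also covers PI, PD, FOPID and the flower pollination algorithm).

```python
import numpy as np

def itae_cost(gains, tau=2.0, dt=0.01, t_end=10.0, setpoint=1.0):
    """Simulate a first-order plant under PID control and return the ITAE."""
    kp, ki, kd = gains
    y, integral, prev_e, cost = 0.0, 0.0, setpoint, 0.0
    for k in range(int(t_end / dt)):
        e = setpoint - y
        integral += e * dt
        u = kp * e + ki * integral + kd * (e - prev_e) / dt
        prev_e = e
        y += dt * (-y + u) / tau          # first-order plant dynamics
        cost += (k * dt) * abs(e) * dt    # integral of time-weighted absolute error
    return cost

rng = np.random.default_rng(0)
bounds = np.array([[0.0, 10.0], [0.0, 5.0], [0.0, 1.0]])    # kp, ki, kd ranges (assumed)
x = rng.uniform(bounds[:, 0], bounds[:, 1], size=(15, 3))   # particle positions
v = np.zeros_like(x)
pbest, pbest_cost = x.copy(), np.array([itae_cost(p) for p in x])
gbest = pbest[pbest_cost.argmin()]

for _ in range(30):                                         # PSO iterations
    r1, r2 = rng.random(x.shape), rng.random(x.shape)
    v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (gbest - x)
    x = np.clip(x + v, bounds[:, 0], bounds[:, 1])
    costs = np.array([itae_cost(p) for p in x])
    improved = costs < pbest_cost
    pbest[improved], pbest_cost[improved] = x[improved], costs[improved]
    gbest = pbest[pbest_cost.argmin()]

print("tuned gains (kp, ki, kd):", gbest, "ITAE:", pbest_cost.min())
```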

Keywords: distributed generation, demand-side management, hybrid power system, micro grid, renewable energy resources, supply-side management

Procedia PDF Downloads 64
133 Global Healthcare Village Based on Mobile Cloud Computing

Authors: Laleh Boroumand, Muhammad Shiraz, Abdullah Gani, Rashid Hafeez Khokhar

Abstract:

Cloud computing, being the use of hardware and software delivered as a service over a network, has applications in the area of health care. The emergency cases reported in most medical centers call for an efficient scheme to make health data available with a short response time. To this end, we propose a mobile global healthcare village (MGHV) model that combines the components of three deployment models, namely country, continent and global health clouds, to help solve the problem mentioned above. In the creation of the continent model, two (2) data centers are created, of which one is local and the other is global. The local one serves the requests of residents within the continent, whereas the global one serves the requests of others. With the methods adopted, there is an assurance of the availability of relevant medical data to patients, specialists, and emergency staff regardless of location and time. From our intensive experiments using a simulation approach, it was observed that the service broker policy with optimized response time yields very good performance in terms of reduction in response time. Moreover, our results remain comparable to others when the number of virtual machines increases (80-640 virtual machines); the proportional increase in response time stays within 9%. The results obtained from our simulation experiments show that utilizing MGHV leads to a reduction in health care expenditures and helps in solving the problems of unqualified medical staff faced by both developed and developing countries.
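
The broker-policy idea can be pictured with the simplified sketch below: each request is routed to the data centre with the lowest estimated response time, preferring the continent-local centre when it is not overloaded. The latency figures, load term and scopes are illustrative assumptions, not the MGHV simulation parameters.

```python
def route(request_continent, datacenters):
    """Pick the data centre minimising a latency + queueing estimate."""
    def estimate(dc):
        queueing = dc["pending"] / dc["vm_count"]          # crude load term
        locality = 0 if dc["scope"] in (request_continent, "global") else float("inf")
        return dc["latency_ms"] + 10 * queueing + locality
    return min(datacenters, key=estimate)["name"]

datacenters = [
    {"name": "africa-local", "scope": "africa", "latency_ms": 20, "pending": 40, "vm_count": 80},
    {"name": "global",       "scope": "global", "latency_ms": 90, "pending": 10, "vm_count": 640},
]
print(route("africa", datacenters))
```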

Keywords: mobile cloud computing (MCC), e-healthcare, availability, response time, service broker policy

Procedia PDF Downloads 332
132 A Novel Rapid Well Control Technique Modelled in Computational Fluid Dynamics Software

Authors: Michael Williams

Abstract:

The ability to control a flowing well is of the utmost importance. During the kill phase, heavy-weight kill mud is circulated around the well. While this increases bottom-hole pressure, the damage to the near-wellbore formation is also increased. The addition of high-density spherical objects has the potential to minimise this near-wellbore damage, increase bottom-hole pressure and reduce the operational time needed to kill the well. This time saving comes from the rapid deployment of high-density spherical objects instead of building high-density drilling fluid. The research aims to model the well kill process using computational fluid dynamics software. A model has been created as a proof of concept to analyse the flow of micron-sized spherical objects in the drilling fluid. Initial results show that this new methodology of spherical objects in drilling fluid agrees with the traditional streamlines seen in particle-free flow. Additional models have been created to demonstrate that areas of higher flow rate around the bit can lead to an increased probability of wash-out of formations but do not affect the flow of the micron-sized spherical objects. Interestingly, areas that experience dimensional changes, such as tool joints and various BHA components, do not appear at this initial stage to experience increased velocity or create areas of turbulent flow, which could contribute to improved borehole stability. In conclusion, the initial models of this novel well control methodology have not demonstrated any adverse flow patterns, which suggests that the approach may be viable under field conditions.

Keywords: well control, fluid mechanics, safety, environment

Procedia PDF Downloads 133
131 Integrating Knowledge Distillation of Multiple Strategies

Authors: Min Jindong, Wang Mingxia

Abstract:

With the widespread use of artificial intelligence in daily life, computer vision, and especially deep convolutional neural network models, has developed rapidly. With the increasing complexity of real visual target detection tasks and the improvement of recognition accuracy, target detection network models have also become very large. Huge deep neural network models are not conducive to deployment on edge devices with limited resources, and the timeliness of network model inference is poor. In this paper, knowledge distillation is used to compress the huge and complex deep neural network model, and the knowledge contained in the complex network model is comprehensively transferred to another lightweight network model. Different from traditional knowledge distillation methods, we propose a novel knowledge distillation approach that incorporates multi-faceted features, called M-KD. When training and optimizing the deep neural network model for target detection, the knowledge of the soft target output of the teacher network, the relationships between the layers of the teacher network, and the feature attention maps of the hidden layers of the teacher network are all transferred to the student network. At the same time, we also introduce an intermediate transition layer, that is, an intermediate guidance layer, between the teacher network and the student network to make up for the huge difference between them. Finally, this paper adds an exploration module to the traditional knowledge distillation teacher-student network model, so that the student network model not only inherits the knowledge of the teacher network but also explores some new knowledge and characteristics. Comprehensive experiments using different distillation parameter configurations across multiple datasets and convolutional neural network models demonstrate that our proposed network model achieves substantial improvements in speed and accuracy.
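
A generic sketch of a multi-component distillation loss combining the ingredients named above (soft-target KL distillation, hidden-feature attention matching, and the task loss) is shown below; this is a PyTorch illustration under common formulations, not the authors' M-KD code, and the temperature and weighting factors are assumptions.

```python
import torch
import torch.nn.functional as F

def attention_map(feat):
    """Spatial attention map from a feature tensor (N, C, H, W)."""
    att = feat.pow(2).mean(dim=1, keepdim=True)            # channel-wise energy
    return F.normalize(att.flatten(1), dim=1)

def distillation_loss(student_logits, teacher_logits,
                      student_feat, teacher_feat,
                      targets, T=4.0, alpha=0.5, beta=100.0):
    soft = F.kl_div(F.log_softmax(student_logits / T, dim=1),
                    F.softmax(teacher_logits / T, dim=1),
                    reduction="batchmean") * (T * T)        # soft-target knowledge
    attn = F.mse_loss(attention_map(student_feat),
                      attention_map(teacher_feat))          # hidden-layer attention transfer
    task = F.cross_entropy(student_logits, targets)         # ground-truth supervision
    return task + alpha * soft + beta * attn

# Shapes only, to show usage:
s_logits, t_logits = torch.randn(8, 20), torch.randn(8, 20)
s_feat, t_feat = torch.randn(8, 64, 14, 14), torch.randn(8, 64, 14, 14)
labels = torch.randint(0, 20, (8,))
print(distillation_loss(s_logits, t_logits, s_feat, t_feat, labels))
```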

Keywords: object detection, knowledge distillation, convolutional network, model compression

Procedia PDF Downloads 242
130 Russia’s Role in Resolving the Nagorno-Karabakh Conflict 1990-2020

Authors: Friba Haidari

Abstract:

The aim of the study is to identify Russia's role in managing the Nagorno-Karabakh conflict between Armenia and Azerbaijan during the years 1990 to 2020. The Nagorno-Karabakh crisis cannot be considered a mere territorial conflict; it is also a crossroads of the interests of foreign actors. Geopolitical rivalries and access to energy by regional and trans-regional actors have complicated the crisis and created a security challenge in the region, which is likely to escalate into a full-blown war between the parties involved. The geopolitical situation of Nagorno-Karabakh and its current situation have affected all peripheral states in some way. Russia, as one of the main actors on this scene, has been actively involved since the beginning of the crisis. The Russians have always sought to strengthen their influence and presence in the Nagorno-Karabakh crisis. Russia's efforts to weaken the role of the Minsk Group, the presence of Western actors, and the deployment of Russian forces in the disputed area can be assessed in this context. This study seeks to answer the question of what role Russia played in managing the Nagorno-Karabakh conflict between Armenia and Azerbaijan between 1990 and 2020. The study hypothesizes that Russia has prevented the escalation of the Nagorno-Karabakh conflict through mediation and some coercion. The study is divided into four parts: conflict management as a theoretical framework; an examination of the competition and the role of actors in the Caucasus region, especially the role of the Minsk Group; the approach, tools and methods Russia has used in its foreign policy to manage the conflict; and finally, the relations between the countries involved and Russia's likely future role. The study analyzes and synthesizes ideas and information from authoritative international sources using an explanatory method and shares its results openly.

Keywords: Russia, conflict, Nagorno-Karabakh, management

Procedia PDF Downloads 56
129 Monitoring of the Chillon Viaducts after Rehabilitation with Ultra High Performance Fiber Reinforced Cement-Based Composite

Authors: Henar Martín-Sanz García, Eleni Chatzi, Eugen Brühwiler

Abstract:

Located on the shore of Lake Geneva, in Switzerland, the Chillon Viaducts are two parallel structures consisting of post-tensioned concrete box girders, with a total length of 2 kilometers and 100 m spans. Built in 1969, the bridges currently accommodate a traffic load of 50,000 vehicles per day, thereby holding a key role both in terms of historic value and socio-economic significance. Although several improvements have been carried out in the past two decades, recent inspections demonstrate an alkali-aggregate reaction in the concrete deck and piers, reducing the concrete strength. In order to prevent further expansion of this issue, a 40 mm layer of Ultra High Performance Fiber Reinforced cement-based Composite (UHPFRC), incorporating rebars, was cast over the slabs, acting as a waterproof membrane and providing a significant increase in the resistance of the bridge structure through UHPFRC-RC composite action, in particular of the deck slab. After completing the rehabilitation works, a structural monitoring campaign was installed on the deck slab of one representative span, based on accelerometers, strain gauges, and thermal and humidity sensors. This campaign seeks to reveal information on the behavior of UHPFRC-concrete composite systems, such as the increase in stiffness, fatigue strength, durability and long-term performance. Consequently, the structural monitoring is expected to last for at least three years. A first insight into the analyzed results from the initial months of measurements is presented herein, along with future improvements or necessary changes to the deployment.

Keywords: composite materials, rehabilitation, structural health monitoring, UHPFRC

Procedia PDF Downloads 249
128 Human Vibrotactile Discrimination Thresholds for Simultaneous and Sequential Stimuli

Authors: Joanna Maj

Abstract:

Body machine interfaces (BMIs) afford users a non-invasive way to coordinate movement. Vibrotactile stimulation has been incorporated into BMIs to provide real-time feedback and guide movement control, to the benefit of patients with cognitive deficits, such as stroke survivors. To advance research in this area, we examined vibrational discrimination thresholds at four body locations to determine suitable application sites for future multi-channel BMIs that use vibration cues to guide movement planning and control. Twelve healthy adults had a pair of small vibrators (tactors) affixed to the skin at each location: forearm, shoulders, torso, and knee. A "standard" stimulus (186 Hz; 750 ms) and "probe" stimuli (11 levels ranging from 100 Hz to 235 Hz; 750 ms) were delivered. Probe and test stimulus pairs could occur sequentially or simultaneously (timing). Participants verbally indicated which stimulus felt more intense. Stimulus order was counterbalanced across tactors and body locations. The probabilities that probe stimuli felt more intense than the standard stimulus were computed and fit with a cumulative Gaussian function; the discrimination threshold was defined as one standard deviation of the underlying distribution. Threshold magnitudes depended on stimulus timing and location. Discrimination thresholds were better for stimuli applied sequentially vs. simultaneously at the torso as well as the knee. Thresholds were small (better) and relatively insensitive to timing differences for vibrations applied at the shoulder. BMI applications requiring multiple channels of simultaneous vibrotactile stimulation should therefore consider the shoulder as a deployment site for a vibrotactile BMI interface.
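
The threshold estimation described above can be sketched as follows: fit a cumulative Gaussian to the probability that the probe felt more intense than the 186 Hz standard, and take one standard deviation of the fitted distribution as the discrimination threshold. The response proportions below are synthetic placeholders, not the study's data.

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

probe_hz = np.array([100, 127, 154, 172, 186, 200, 217, 235])
p_probe_more_intense = np.array([0.05, 0.10, 0.25, 0.40, 0.50, 0.68, 0.85, 0.95])

def cumulative_gaussian(x, mu, sigma):
    return norm.cdf(x, loc=mu, scale=sigma)

(mu, sigma), _ = curve_fit(cumulative_gaussian, probe_hz, p_probe_more_intense,
                           p0=[186.0, 30.0])
print(f"point of subjective equality: {mu:.1f} Hz, discrimination threshold: {sigma:.1f} Hz")
```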

Keywords: body machine interface, vibrotactile stimulation, discrimination threshold, psychometric function, wearable feedback

Procedia PDF Downloads 28
127 The Security Trade-Offs in Resource Constrained Nodes for IoT Application

Authors: Sultan Alharby, Nick Harris, Alex Weddell, Jeff Reeve

Abstract:

The concept of the Internet of Things (IoT) has received much attention over the last five years. It is predicted that the IoT will influence every aspect of our lifestyles in the near future. Wireless Sensor Networks (WSNs) are one of the key enablers of the operation of the IoT, allowing data to be collected from the surrounding environment. However, due to limited resources, the nature of deployment and unattended operation, a WSN is vulnerable to various types of attack. Security is paramount for reliable and safe communication between IoT embedded devices, but it comes at a cost in resources. Nodes are usually equipped with small batteries, which makes energy conservation crucial to IoT devices. Nevertheless, the security cost in terms of energy consumption has not been studied sufficiently. Previous research has used the security specification of IEEE 802.15.4 for IoT applications, but the energy cost of each security level and the impact on quality of service (QoS) parameters remain unknown. This research focuses on the cost of security at the IoT media access control (MAC) layer. It begins by studying the energy consumption of the IEEE 802.15.4 security levels, which is followed by an evaluation of the impact of security on data latency and throughput; it then presents the impact of transmission power on security overhead, and finally shows the effects of security on memory footprint. The results show that the security overhead in terms of energy consumption with a payload of 24 bytes ranges from 31.5% at the minimum security level, relative to non-secure packets, to 60.4% at the top security level of the 802.15.4 specification. They also show that the security cost has less impact at longer packet lengths and more at smaller packet sizes. In addition, the results depict a significant impact on data latency and throughput: overall, the maximum authentication length decreases throughput by almost 53%, and encryption and authentication together by almost 62%.
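
A back-of-the-envelope sketch of where part of this overhead comes from is given below: the 802.15.4 security levels append a 4-, 8- or 16-byte MIC (MIC-32/64/128) plus an auxiliary security header, and the extra bytes on air translate into extra transmit energy. The header size and per-byte energy figure are simplifying assumptions, and the real measurements above also include cryptographic computation, so the numbers here are only indicative.

```python
MIC_BYTES = {"none": 0, "MIC-32": 4, "MIC-64": 8, "MIC-128": 16,
             "ENC": 0, "ENC-MIC-32": 4, "ENC-MIC-64": 8, "ENC-MIC-128": 16}
AUX_HEADER = 6              # assumed auxiliary security header length (bytes)
MAC_OVERHEAD = 11           # assumed unsecured MAC header + FCS (bytes)
ENERGY_PER_BYTE_UJ = 1.0    # assumed radio energy per transmitted byte (microjoules)

def tx_energy(payload, level):
    sec = 0 if level == "none" else AUX_HEADER + MIC_BYTES[level]
    return (MAC_OVERHEAD + sec + payload) * ENERGY_PER_BYTE_UJ

payload = 24
base = tx_energy(payload, "none")
for level in ("MIC-32", "ENC-MIC-64", "ENC-MIC-128"):
    overhead = 100 * (tx_energy(payload, level) - base) / base
    print(f"{level}: +{overhead:.1f}% transmit energy over non-secure")
```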

Keywords: energy consumption, IEEE 802.15.4, IoT security, security cost evaluation

Procedia PDF Downloads 127
126 Fast Aerodynamic Evaluation of Transport Aircraft in Early Phases

Authors: Xavier Bertrand, Alexandre Cayrel

Abstract:

The early phase of aircraft development is instrumental, as it really drives the potential of a new concept. Any weakness in the high-level design (wing planform, moveable surfaces layout, etc.) will be extremely difficult and expensive to recover later in the aircraft development process. Aerodynamic evaluation in this very early development phase is driven by two main criteria: a short lead time to allow quick iterations of the geometrical design, and a high quality of the calculations to get an accurate and reliable assessment of the current status. These two criteria are usually quite contradictory. A short lead time of a couple of hours end-to-end can be obtained with very simple tools (semi-empirical methods for instance), although their accuracy is limited, whereas higher-quality calculations require heavier and more complex tools, which obviously need more complex inputs as well, and a significantly longer lead time. At this point, a choice has to be made between accuracy and lead time. A brand new approach has been developed within Airbus, aiming at quickly obtaining high-quality evaluations of the aerodynamics of an aircraft. This methodology is based on the joint use of surrogate modelling and a lifting line code. The surrogate modelling is used to get the wing section characteristics (e.g. lift coefficient vs. angle of attack), whatever the airfoil geometry, the status of the moveable surfaces (ailerons/spoilers) or the high-lift device deployment. From these characteristics, the lifting line code is used to get the 3D effects on the wing, whatever the flow conditions (low/high Mach numbers, etc.). This methodology has been applied successfully to a medium-range aircraft concept.
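
A toy illustration of the two-step idea follows: a surrogate provides the 2D section lift behaviour from a few precomputed samples, and a simple finite-wing (lifting-line style) correction turns it into a 3D wing lift estimate. This uses textbook formulas with assumed numbers, not the Airbus tool chain.

```python
import numpy as np

# Surrogate step: fit the section lift coefficient vs. angle of attack (deg)
# from a few precomputed samples (e.g. 2D results for a given flap setting).
alpha_samples = np.array([-4.0, 0.0, 4.0, 8.0])
cl_samples = np.array([-0.20, 0.25, 0.70, 1.12])
m, b = np.polyfit(alpha_samples, cl_samples, 1)      # cl ≈ m*alpha_deg + b

# Lifting-line style step: correct the 2D lift slope for finite aspect ratio.
a0 = m * 180.0 / np.pi                               # 2D lift slope per radian
alpha_L0 = np.deg2rad(-b / m)                        # zero-lift angle of attack
AR, e = 9.0, 0.9                                     # aspect ratio, Oswald factor (assumed)
a3d = a0 / (1.0 + a0 / (np.pi * e * AR))             # finite-wing lift-curve slope

alpha = np.deg2rad(5.0)
print(f"wing CL at 5 deg angle of attack: {a3d * (alpha - alpha_L0):.3f}")
```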

Keywords: aerodynamics, lifting line, surrogate model, CFD

Procedia PDF Downloads 313
125 Importance of Field Hospitals in Trauma Management: An Experience from Nepal Earthquake

Authors: Krishna Gopal Lageju

Abstract:

On 25th April 2015, a 7.6 magnitude earthquake struck the Gorkha district of Nepal, which resulted in over 8,790 deaths and 22,300 injuries. In addition, almost one-third of the country's healthcare services were disrupted. A total of 1,211 health facilities became non-operational, 446 completely damaged and another 765 partially damaged. Nearly 84 percent (375 out of 446) of the completely damaged health facilities are in the 14 most affected districts. As a result, the ability of health facilities to respond to health care needs was harshly affected. In addition, 18 health workers lost their lives and 75 were injured, which added further challenges to the delivery of health services. Thus, to address the immediate health needs in the most devastated areas, the Nepal Red Cross Society (NRCS), in coordination with the IFRC and the Government of Nepal, established 8 field hospitals with surgical capacities, for which around 492 international Emergency Response Unit (ERU) members were mobilized for a 3-month period. More than 54,000 patients were treated in the Red Cross operated field hospitals. Trauma cases accounted for 9,180 (17%) of the total patients, of which 1,285 (14%) were major surgical cases. Most of the caseload, 44,830 (83%), were outpatients, and 9,180 patients received inpatient services. Similarly, 112 births took place in the field hospitals. The inpatient mortality rate remained at 1.5% (21 deaths), many of these patients having presented with critical injuries or illnesses. No outbreak was seen during the ERU operation. The deployment of ERUs together with national health workers is very important to address the immediate health needs of affected communities. This eases the transition and handover of emergency services and equipment to local providers. Likewise, capacity building of local staff, such as on-the-job training on various clinical topics, is another important issue to consider before phasing out such services.

Keywords: trauma management, critical injuries, earthquake, health

Procedia PDF Downloads 214
124 Game-Theory-Based Downlink Spectrum Allocation in Two-Tier Networks

Authors: Yu Zhang, Ye Tian, Fang Ye, Yixuan Kang

Abstract:

The capacity of conventional cellular networks has reached its upper bound, and this can be well handled by introducing femtocells, which are low-cost and easy to deploy. The spectrum interference issue becomes more critical as value-added multimedia services grow increasingly in two-tier cellular networks. Spectrum allocation is one of the effective methods of interference mitigation. This paper proposes a game-theory-based OFDMA downlink spectrum allocation scheme aiming at reducing co-channel interference in two-tier femtocell networks. The framework is formulated as a non-cooperative game, wherein the femto base stations are the players and the available frequency channels are the strategies. The scheme takes full account of competitive behavior and fairness among stations. In addition, the utility function essentially reflects the interference from the standpoint of channels. This work focuses on co-channel interference and puts forward a negative-logarithm interference function of the distance weight ratio aiming at suppressing co-channel interference in the same network layer. This scenario is more suitable for actual network deployment, and the system possesses high robustness. According to the proposed mechanism, interference exists only when players employ the same channel for data communication. This paper focuses on implementing spectrum allocation in a distributed fashion. Numerical results show that the signal to interference and noise ratio can be obviously improved through the spectrum allocation scheme and that the users' downlink quality of service can be satisfied. Besides, the average spectrum efficiency in the cellular network can be significantly promoted, as the simulation results show.
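
A toy best-response dynamic for such a non-cooperative channel-selection game is sketched below: each femto base station (player) repeatedly picks the channel that maximises a negative-logarithm utility of its co-channel interference, which here decays with distance. The positions, channel count and exact interference weight are illustrative assumptions, not the paper's utility function.

```python
import math

stations = {"f1": (0, 0), "f2": (10, 0), "f3": (2, 2), "f4": (9, 3)}
channels = [0, 1]
choice = {s: 0 for s in stations}              # start with everyone on channel 0

def utility(station, channel, choice):
    # Interference only from co-channel stations, weighted by inverse distance.
    interf = sum(1.0 / math.dist(stations[station], stations[o])
                 for o, ch in choice.items() if o != station and ch == channel)
    return -math.log(1e-6 + interf)            # less co-channel interference -> higher utility

for _ in range(10):                            # iterate best responses until (likely) stable
    for s in stations:
        choice[s] = max(channels, key=lambda c: utility(s, c, choice))

print(choice)
```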

Keywords: femtocell networks, game theory, interference mitigation, spectrum allocation

Procedia PDF Downloads 122
123 Stressors Faced by Border Security Officers: The Singapore Experience

Authors: Jansen Ang, Andrew Neo, Dawn Chia

Abstract:

Border security is unlike mainstream policing in that officers are essentially in static deployment, working round the clock every day and every hour of the year looking for the illegitimate entry of persons and goods. In Singapore, border security officers perform multiple functions to ensure the nation's safety and security. They are responsible for safeguarding the borders of Singapore to prevent threats from entering the country. Being the first line of defence in ensuring the nation's border security, officers are entrusted with the responsibility of screening travellers inbound and outbound of Singapore daily. They examined 99 million arrivals and departures at the various checkpoints in 2014, which is a considerable volume compared to most immigration agencies. The officers' work scope also includes cargo clearance and the protective and security functions of checkpoints. The officers work in very demanding environments, which can range from the smog at the land checkpoints to the harshness of the ports at the sea checkpoints. In addition, all immigration checkpoints are located at the boundaries, posing commuting challenges for officers. At the land checkpoints, festive seasons and school breaks are peak periods, given the surge of inbound and outbound travellers at the various checkpoints. Such work provides unique challenges in comparison to other law enforcement duties. This paper assesses the current stressors faced by officers of a border security agency through ground observations and a perceived stress survey, and offers recommendations for combating the stressors faced by border security officers. The findings from the field observations and surveys indicate organisational and operational stressors that are unique to border security, and interventions for managing these stressors are recommended. Understanding these stressors will better inform border security agencies on the interventions needed to enhance the resilience of border security officers.

Keywords: border security, Singapore, stress, operations

Procedia PDF Downloads 290
122 Comparative Analysis of Data Gathering Protocols with Multiple Mobile Elements for Wireless Sensor Network

Authors: Bhat Geetalaxmi Jairam, D. V. Ashoka

Abstract:

Wireless sensor networks are used in many applications to collect sensed data from different sources. Sensed data have to be delivered through the sensors' wireless interface using multi-hop communication towards the sink. Data collection in wireless sensor networks consumes energy, and energy consumption is the major constraint in WSNs. Reducing the energy consumption while increasing the amount of generated data is a great challenge. In this paper, we have implemented two data gathering protocols with multiple mobile sinks/elements to collect data from sensor nodes. The first is Energy-Efficient Data Gathering with Tour Length-Constrained Mobile Elements in Wireless Sensor Networks (EEDG), in which the mobile sinks use a vehicle routing protocol to collect data. The second is An Intelligent Agent-based Routing Structure for Mobile Sinks in WSNs (IAR), in which the mobile sinks use Prim's algorithm to collect data. The authors have implemented the concepts common to both protocols, such as the deployment of mobile sinks, generation of the visiting schedule, and collection of data from the cluster members. The authors have compared the performance of both protocols by taking statistics based on performance parameters such as delay, packet drop, packet delivery ratio, available energy, and control overhead. The authors conclude this paper by showing that EEDG is more efficient than the IAR protocol, but with a few limitations, which include unaddressed issues like redundancy removal, idle listening, and the mobile sink's pause/wait state at the node. In future work, we plan to concentrate more on these limitations to arrive at a new energy-efficient protocol which will help in improving the lifetime of the WSN.
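
The routing-structure step attributed to IAR above can be pictured with the sketch below: Prim's algorithm builds a minimum spanning tree over the sensor nodes, which a mobile sink can then traverse to collect data. The coordinates are arbitrary example values, and the real protocol adds agent-based scheduling on top of this structure.

```python
import heapq, math

nodes = {"sink": (0, 0), "a": (2, 1), "b": (5, 0), "c": (4, 4), "d": (1, 5)}

def prim_mst(nodes, root="sink"):
    """Return MST edges (parent, child) using Euclidean distances as weights."""
    visited, edges, heap = {root}, [], []
    def push(u):
        for v, pos in nodes.items():
            if v not in visited:
                heapq.heappush(heap, (math.dist(nodes[u], pos), u, v))
    push(root)
    while len(visited) < len(nodes):
        w, u, v = heapq.heappop(heap)
        if v in visited:
            continue
        visited.add(v)
        edges.append((u, v))
        push(v)
    return edges

print(prim_mst(nodes))
```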

Keywords: aggregation, consumption, data gathering, efficiency

Procedia PDF Downloads 457
121 Cognitive Science Based Scheduling in Grid Environment

Authors: N. D. Iswarya, M. A. Maluk Mohamed, N. Vijaya

Abstract:

The grid is an infrastructure that allows the deployment of large distributed data sets from multiple locations to reach a common goal. Scheduling data-intensive applications becomes challenging as the data sets are very large. Only two solutions exist to tackle this challenging issue. First, the computation which requires huge data sets to be processed can be transferred to the data site. Second, the required data sets can be transferred to the computation site. In the former scenario, the computation cannot be transferred since the servers are storage/data servers with little or no computational capability. Hence, the second scenario can be considered for further exploration. During scheduling, transferring huge data sets from one site to another requires more network bandwidth. In order to mitigate this issue, this work focuses on incorporating cognitive science into scheduling. Cognitive science is the study of the human brain and its related activities. Current research mainly focuses on incorporating cognitive science into various computational modeling techniques. In this work, the problem-solving approach of the human brain is studied and incorporated into data-intensive scheduling in grid environments. Here, a cognitive engine (CE) is designed and deployed at various grid sites. The intelligent agents present in the CE help in analyzing the request and creating the knowledge base. Depending upon the link capacity, a decision is taken on whether to transfer the data sets or to partition them. The agents also predict the next request so as to serve the requesting site with data sets in advance, which reduces the data availability time and data transfer time. The replica catalog and metadata catalog created by the agents assist in the decision-making process.
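
The transfer-versus-partition decision attributed to the cognitive engine can be sketched as a simple rule: if the full data set can reach the computation site over the available link within a deadline, transfer it whole; otherwise split it into link-sized chunks. The deadline, chunk size and link figures are illustrative assumptions, not the paper's decision logic.

```python
def plan_transfer(dataset_gb, link_gbps, deadline_s, chunk_gb=10):
    transfer_time = dataset_gb * 8 / link_gbps        # seconds to move the whole set
    if transfer_time <= deadline_s:
        return {"action": "transfer_whole", "estimated_s": transfer_time}
    n_parts = -(-dataset_gb // chunk_gb)              # ceiling division
    return {"action": "partition", "parts": int(n_parts),
            "estimated_s_per_part": chunk_gb * 8 / link_gbps}

print(plan_transfer(dataset_gb=500, link_gbps=1.0, deadline_s=600))
```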

Keywords: data grid, grid workflow scheduling, cognitive artificial intelligence

Procedia PDF Downloads 361
120 Study on the Situation between France and the South China Sea from the Perspective of Balance of Power Theory

Authors: Zhenyi Chen

Abstract:

With the rise of China and the escalation of tension between China and the United States, European countries led by Great Britain, France, and Germany pay increasing attention to the regional situation in the Asia-Pacific (now known as the "Indo-Pacific"). Among them, the South China Sea (SCS) is one of the main areas disputed by China, the United States, Southeast Asian countries and some European countries. Western countries are worried that the rise of China's military power will break the stability of the situation in the SCS and alter the balance of power among major powers. Therefore, they have tried to balance China's rise through alliances. In its Indo-Pacific strategy, France aims to build a regional order with the alliance of France, India and Australia as its core, and regularly carries out military exercises targeting the SCS with the United States, Japan and Southeast Asian countries. For China, instability in the SCS could also threaten the security of the southeast coastal areas and Taiwan, affect China's peaceful development process, and pose a threat to China's territorial sovereignty. This paper aims to study the activities and motivations of France in the South China Sea and to place the situation in the SCS within the perspective of balance of power theory, focusing on China, America and France. More specifically, this paper will first briefly introduce balance of power theory, then describe France's new trends in recent years, follow with an analysis of the motivations behind France's increasing involvement in the SCS, and finally analyze the situation in the SCS from the perspective of balance of power theory. It will be argued that the great powers are carefully maintaining the balance of military power in the SCS, and it is highly possible that this trend will last in the medium and long term, particularly via military deployment and strategic alliances.

Keywords: South China Sea, France, China, balance of power theory, Indo-Pacific

Procedia PDF Downloads 139
119 Reinforcement-Learning Based Handover Optimization for Cellular Unmanned Aerial Vehicles Connectivity

Authors: Mahmoud Almasri, Xavier Marjou, Fanny Parzysz

Abstract:

The demand for services provided by Unmanned Aerial Vehicles (UAVs) is increasing pervasively across several sectors, including public safety, economic, and delivery services. As the number of applications using UAVs grows rapidly, more and more powerful, quality-of-service-aware, and power-efficient computing units are necessary. Recently, cellular technology has drawn more attention for connectivity that can ensure reliable and flexible communications services for UAVs. In cellular technology, flying at high speed and altitude is subject to several key challenges, such as frequent handovers (HOs), high interference levels, connectivity coverage holes, etc. Additional HOs may lead to "ping-pong" between the UAVs and the serving cells, resulting in a decrease of the quality of service and an increase in energy consumption. In order to optimize the number of HOs, we develop in this paper a Q-learning-based algorithm. While existing works focus on adjusting the number of HOs in a static network topology, we take into account the impact of cell deployment for three different simulation scenarios (rural, semi-rural and urban areas). We also consider the impact of the decision distance, at which the drone has the choice to make a switching decision, on the number of HOs. Our results show that a Q-learning-based algorithm allows the average number of HOs to be significantly reduced compared to a baseline case in which the drone always selects the cell with the highest received signal. Moreover, we also identify which hyper-parameters have the largest impact on the number of HOs in the three tested environments, i.e. rural, semi-rural, and urban.
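
A minimal tabular Q-learning sketch for such a handover decision follows: states encode the serving cell and signal condition, actions are "stay" or "switch", and the reward penalises each handover while rewarding good signal. The state encoding, environment step and reward weights are illustrative assumptions, not the paper's simulation setup.

```python
import random
from collections import defaultdict

alpha, gamma, epsilon = 0.1, 0.9, 0.1
Q = defaultdict(float)                       # Q[(state, action)]

def choose_action(state, actions=("stay", "switch")):
    if random.random() < epsilon:            # epsilon-greedy exploration
        return random.choice(actions)
    return max(actions, key=lambda a: Q[(state, a)])

def update(state, action, reward, next_state, actions=("stay", "switch")):
    best_next = max(Q[(next_state, a)] for a in actions)
    Q[(state, action)] += alpha * (reward + gamma * best_next - Q[(state, action)])

# One illustrative learning step along the flight path:
state = ("cell_17", "good_signal")
action = choose_action(state)
reward = (1.0 if state[1] == "good_signal" else -1.0) - (0.5 if action == "switch" else 0.0)
update(state, action, reward, ("cell_42", "weak_signal"))
print(dict(Q))
```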

Keywords: drones connectivity, reinforcement learning, handovers optimization, decision distance

Procedia PDF Downloads 69
118 The Social Psychology of Illegal Game Room Addiction in the Historic Chinatown District of Honolulu, Hawaii: Illegal Compulsive Gambling, Chinese-Polynesian Organized Crime Syndicates, Police Corruption, and Loan Sharking Rings

Authors: Gordon James Knowles

Abstract:

Historically, the Chinatown district in the Sandwich Islands has been plagued with the traditional vice crimes of illegal drugs, gambling, and prostitution since the early 1800s. However, a new form of psychologically addictive arcade-style table gambling machine has become the dominant source of illegal revenue in Honolulu, Hawaii. This study attempts to document the drive, desire, or will to play and wager on arcade-style video gaming and to understand the role of illegal game rooms in facilitating pathological gambling addiction. Indicators of police corruption by Chinese organized crime syndicates related to protection rackets, bribery, and pay-offs were revealed. Information fusion from a police science and sociological intelligence perspective indicates insurgent warfare is being waged on the streets of Honolulu by the People's Republic of China. This state-sponsored communist terrorism in the Hawaiian Islands used "contactless" irregular warfare entailing: (1) the deployment of psychologically addictive gambling machines, (2) the distribution of the physically addictive drug fentanyl as a lethal chemical weapon, and (3) psychological warfare by circulating pro-China, anti-American propaganda newspapers targeted at the small island populace.

Keywords: Chinese and Polynesian organized crime, China Daily newspaper, electronic arcade-style table games, gaming technology addiction, illegal compulsive gambling, police intelligence

Procedia PDF Downloads 26
117 Wake Effects of Wind Turbines and Its Impacts on Power Curve Measurements

Authors: Sajan Antony Mathew, Bhukya Ramdas

Abstract:

The impetus of wind energy deployment over the last few decades has seen potential sites being harvested very actively for wind farm development. Due to the scarce availability of highly potential sites, turbines are being located ever more tightly, wherein minimum spacing between the turbines is resorted to without compromising the optimization of their energy yield. The optimization of the energy yield from a wind turbine is achieved by effective micrositing techniques. These time-tested techniques, which are applied from site to site on terrain conditions that meet the requirements of the international standard for power performance measurements of wind turbines, result in the positioning of wind turbines for optimized energy yields. The international standard for power curve measurements has rules of procedure and a methodology to evaluate the terrain, obstacles and sector for measurements. There are many challenges at the sites in complying with the requirements for terrain, obstacles and sector for measurements. Attempts are being made to carry out these measurements within the scope of the international standard, as various other procedures specified in alternate standards, and the integration of LIDAR for power curve measurements, are at a nascent stage. The paper strives to assist in the understanding that if the positioning of a wind turbine at a site is based on an optimized output, then no wake effects are seen on the power curve of an adjacent wind turbine. The paper also demonstrates that an otherwise invalid sector for measurements could be used in the analysis, in deviation from the requirements of the international standard for power performance measurements. Therefore the paper strives, firstly, to demonstrate that if a wind turbine is optimally positioned, no wake effects are seen, and secondly, that the sector for measurements in such a case could include sectors which would otherwise have to be excluded as per the requirements of the international standard for power performance measurements.

Keywords: micrositing, optimization, power performance, wake effects

Procedia PDF Downloads 435