Search results for: non-uniform deployment
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 470

140 Digital Platform for Psychological Assessment Supported by Sensors and Efficiency Algorithms

Authors: Francisco M. Silva

Abstract:

Technology is evolving and reshaping both our everyday lives and the telehealth industry. Telehealth is the provision of healthcare services and information through technological means. Web-based methods of delivering healthcare help offer several benefits, yet few health and psychological support approaches combine them with wearable sensors. This paper presents an online platform through which users receive self-care help and information from wearable sensors; it also offers researchers developing similar projects a solid foundation to build on. The study describes and analyses the software and hardware architecture, presents a dynamic and efficient heart-rate algorithm that continuously computes the desired sensor values, and provides diagrams illustrating the website deployment process and the way the web server handles the sensor data. The goal is a working project built on Arduino-compatible hardware: heart-rate sensors send their readings to a microcontroller board, which runs an algorithm to compute heart-rate values and forwards them to a web server. The platform visualizes the sensor data, summarizes it in a report, and raises alerts for the user. Results showed a solid project structure and reliable communication between hardware and software: the web server displays the transmitted heart-rate data on the online platform, together with observations and evaluations.
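The abstract does not publish the algorithm itself, but the behavior it describes, a microcontroller continuously turning detected beats into a BPM value for the web server, can be sketched as follows. The `HeartRateMonitor` class, the window size, and the beat timestamps are all illustrative assumptions, not the paper's implementation:

```python
from collections import deque

class HeartRateMonitor:
    """Sketch of a running BPM calculation from beat timestamps (ms).

    Assumes the board records the time of each detected pulse and
    averages the most recent inter-beat intervals (hypothetical design).
    """
    def __init__(self, window=5):
        self.beats = deque(maxlen=window)  # timestamps of recent beats, in ms

    def on_beat(self, t_ms):
        self.beats.append(t_ms)

    def bpm(self):
        if len(self.beats) < 2:
            return None
        # mean inter-beat interval over the window, in ms
        ibi = (self.beats[-1] - self.beats[0]) / (len(self.beats) - 1)
        return 60_000.0 / ibi

monitor = HeartRateMonitor()
for t in (0, 800, 1600, 2400, 3200):  # one beat every 800 ms -> 75 BPM
    monitor.on_beat(t)
rate = monitor.bpm()
```

On real hardware the same logic would run in the Arduino loop, with `bpm()` results posted to the web server at each update.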

Keywords: Arduino, heart rate BPM, microcontroller board, telehealth, wearable sensors, web-based healthcare

Procedia PDF Downloads 102
139 Load Balancing Technique for Energy Efficiency in Cloud Computing

Authors: Rani Danavath, V. B. Narsimha

Abstract:

Cloud computing is emerging as a new paradigm of large-scale distributed computing: a model for enabling ubiquitous, convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, and services) that can be rapidly provisioned and released with minimal management effort or service-provider interaction. The cloud model comprises five essential characteristics, three service models, and four deployment models. Load balancing is one of the main challenges in cloud computing: the dynamic workload must be distributed across multiple nodes so that no single node is overloaded, which gives optimal utilization of resources and enhances system performance. The goal of load balancing here is to minimize resource consumption and the carbon emission rate, a direct need of cloud computing, which motivates new metrics, energy consumption and carbon emission, for energy-efficient load balancing techniques. Existing load balancing techniques focus mainly on reducing overhead and response time and on improving performance, but none of them has considered energy consumption and carbon emission. In this paper we therefore introduce a load balancing technique oriented towards energy efficiency. It improves the performance of cloud computing by balancing the workload across all nodes in the cloud with minimum resource utilization, in turn reducing energy consumption and carbon emission to an extent that helps achieve green computing.
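The abstract does not detail its technique, but the baseline it builds on, spreading dynamic workload so no node is overloaded, can be sketched as a greedy least-loaded placement. Task demands and node names below are invented for illustration:

```python
def balance(tasks, nodes):
    """Greedy least-loaded placement: each task goes to the node with the
    smallest current load, so no single node is overloaded."""
    loads = {n: 0.0 for n in nodes}
    placement = {}
    for task, demand in tasks:
        target = min(nodes, key=lambda n: loads[n])  # least-loaded node
        loads[target] += demand
        placement[task] = target
    return placement, loads

tasks = [("t1", 4.0), ("t2", 2.0), ("t3", 3.0), ("t4", 1.0)]
placement, loads = balance(tasks, ["node-a", "node-b", "node-c"])
```

An energy-aware variant would additionally power down nodes left idle by such a placement, which is where the paper's energy-consumption metric would enter.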

Keywords: cloud computing, distributed computing, energy efficiency, green computing, load balancing, energy consumption, carbon emission

Procedia PDF Downloads 419
138 Sustainability of Photovoltaic Recycling Planning

Authors: Jun-Ki Choi

Abstract:

The usage of valuable resources and the potential for waste generation at the end of the life cycle of photovoltaic (PV) technologies necessitate proactive planning for a PV recycling infrastructure. To ensure the sustainability of PV at large scales of deployment, it is vital to develop and institute low-cost recycling technologies and infrastructure for the emerging PV industry in parallel with the rapid commercialization of these new technologies. There are various issues involved in the economics of PV recycling, and this research examines them at the macro and micro levels, developing a holistic interpretation of the economic viability of PV recycling systems. The study developed mathematical models to analyze the profitability of recycling technologies and to guide tactical decisions for allocating optimal locations of PV take-back centers (PVTBC), necessary for the collection of end-of-life products. The economic decision is usually based on the marginal capital cost of each PVTBC, the cost of reverse logistics, the distance traveled, and the amount of PV waste collected from various locations. Results illustrated that reverse logistics costs comprise a major portion of the cost of a PVTBC; PV recycling centers should be constructed in optimally selected locations to minimize the total reverse logistics cost of transporting PV waste from the various collection facilities to the recycling center. At the micro process level, automated recycling processes should be developed to handle the growing amount of PV waste economically. The market prices of the reclaimed materials are important factors in deciding the profitability of the recycling process, which illustrates the importance of recovering the glass and the expensive metals from PV modules.
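As an illustration of the siting decision these models support, a toy single-center version might weigh candidate locations by waste volume times distance. The coordinates, volumes, and Euclidean distances below are illustrative assumptions; the paper's actual formulation is richer (marginal capital costs, multiple centers):

```python
import math

def total_logistics_cost(center, collection_sites):
    """Reverse-logistics cost: waste volume x distance, summed over sites."""
    return sum(w * math.dist(center, site) for site, w in collection_sites.items())

def best_takeback_center(candidates, collection_sites):
    """Choose the candidate PVTBC location with the lowest transport cost."""
    return min(candidates, key=lambda c: total_logistics_cost(c, collection_sites))

sites = {(0.0, 0.0): 10.0, (10.0, 0.0): 1.0}        # site -> tonnes of PV waste
candidates = [(0.0, 0.0), (5.0, 0.0), (10.0, 0.0)]  # possible PVTBC locations
best = best_takeback_center(candidates, sites)
```

Here the heavy collection site dominates, so the center is placed on top of it, mirroring the finding that reverse logistics drives the siting decision.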

Keywords: photovoltaic, recycling, mathematical models, sustainability

Procedia PDF Downloads 222
137 Coherent All-Fiber and Polarization Maintaining Source for CO2 Range-Resolved Differential Absorption Lidar

Authors: Erwan Negre, Ewan J. O'Connor, Juha Toivonen

Abstract:

The need for CO2 monitoring technologies grows together with worldwide concern over environmental challenges. To that purpose, we developed a compact coherent all-fiber range-resolved Differential Absorption Lidar (RR-DIAL). It is built around a tunable 2x1 fiber-optic switch, toggled at 1 Hz between two Distributed FeedBack (DFB) lasers emitting in continuous-wave mode at 1571.41 nm (a CO2 absorption line) and 1571.25 nm (a CO2 absorption-free line), with a linewidth of 1 MHz and a tuning range of 3 nm around the operating wavelength. Three amplification stages through erbium and erbium-ytterbium doped fibers, coupled to a Radio Frequency (RF) driven Acousto-Optic Modulator (AOM), generate 100 ns pulses at a repetition rate of 10 to 30 kHz with a peak power of up to 2.5 kW and a spatial resolution of 15 m, allowing fast and highly resolved CO2 profiles. The same afocal collection system is used for the output of the laser source and for the backscattered light, which is then directed to a circulator before being mixed with the local oscillator for heterodyne detection. Packaged in an easily transportable box that also includes a server and a Field Programmable Gate Array (FPGA) card for online data processing and storage, our setup allows effective and quick deployment for versatile in-situ analysis, whether vertical atmospheric monitoring, large field mapping or continuous oversight of a sequestration site. Setup operation and results from initial field measurements will be discussed.
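The retrieval behind any range-resolved DIAL, including this one, is the standard two-wavelength ratio of on-line and off-line returns in adjacent range gates. The sketch below applies it to synthetic signals; the differential cross-section and number density are illustrative values, not the instrument's calibration, though the 15 m gate matches the stated resolution:

```python
import math

def dial_number_density(p_on, p_off, dr, dsigma):
    """Standard range-resolved DIAL retrieval:
    N(R) = ln[P_off(R+dR) P_on(R) / (P_on(R+dR) P_off(R))] / (2 dsigma dR)."""
    n = []
    for i in range(len(p_on) - 1):
        ratio = (p_off[i + 1] * p_on[i]) / (p_on[i + 1] * p_off[i])
        n.append(math.log(ratio) / (2.0 * dsigma * dr))
    return n

# Synthetic returns: flat off-line signal, Beer-Lambert attenuated on-line signal.
sigma_diff = 0.01   # differential absorption cross-section (arbitrary units)
n_true = 5.0        # "true" number density to recover (arbitrary units)
dr = 15.0           # 15 m range gate, as in the instrument
p_off = [1.0, 1.0, 1.0]
p_on = [math.exp(-2 * sigma_diff * n_true * r) for r in (0.0, 15.0, 30.0)]
n_est = dial_number_density(p_on, p_off, dr, sigma_diff)
```

With noise-free synthetic data the retrieval returns the true density in every gate; real profiles would be averaged over many pulses before this step.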

Keywords: CO2 profiles, coherent DIAL, in-situ atmospheric sensing, near infrared fiber source

Procedia PDF Downloads 107
136 Flirting with Ephemerality and the Daily Production of the Fleeting City

Authors: Rafael Martinez

Abstract:

Our view of cities is dominated by the built environment. Buildings, streets, avenues, bridges, flyovers, and so on virtually exclude anything not fixed, permanently alterable or indefinitely temporal. Yet city environments can also be shaped by temporarily produced structures which, regardless of their transience, act as thresholds separating or segregating people and spaces. Academic works on cities conceptualize them, whether temporary or permanent, as tangible environments. This paper considers the idea of the ephemeral city: a city purposely produced and lived in as an impermanent, fluid and transitional environment resulting from an alignment of different forces. In particular, the paper observes how certain performative practices inform the emergence of ephemeral spaces in the city’s daily life. With Singapore as its backdrop, and focusing on foreign workers, the paper documents how everyday practices such as flirting result in the production of a transitional space, informed by semiotic blurs and yet material, perceptible, human and tangible for some. It is argued that flirting, for Singapore's foreign workers, entails a skillful understanding of what is proposed here as the 'flirting cartography.' Spatially, flirtation thus becomes not merely something taken for granted but a way of producing a fleeting space, one that requires the deployment of various techniques drawn from a particular knowledge. The paper rests on a performative methodology that seeks to understand the praxis and rationale of the ephemerality of spaces produced by foreign workers within this cosmopolitan city. Through this methodological approach, the paper establishes the connection between the visibility gained by usually marginalized populations and their ephemeral reclamation of public spaces in the city.

Keywords: ephemeral, flirting, Singapore, space

Procedia PDF Downloads 81
135 Development of Mobile Application for Internship Program Management Using the Concept of Model View Controller (MVC) Pattern

Authors: Shutchapol Chopvitayakun

Abstract:

Over roughly the last five years, mobile devices, mobile applications and mobile users, supported by the deployment of wireless communication and cellular networks, have grown significantly in number and capability. They are being integrated with one another to serve multiple purposes and are pervasively deployed in every business and non-business sector, such as education, medicine, travel, finance and real estate. The objective of this study was to develop a mobile application for seniors, last-year undergraduate students who enroll in the internship program at their tertiary school and practice on-site in real organizations and workplaces. During the internship session, all student interns are required to drill and train on-site at specific locations on specific tasks, possibly with assignments from their supervisors; their workplaces include both private and government corporations and enterprises. The mobile application is designed as a transactional processing system that enables users to keep a daily work or practice log, records true working locations, and makes it possible to follow the daily tasks of each trainee. Moreover, it provides useful guidance from each intern’s advisor in case of emergency. Finally, it summarizes all transactional data and calculates each intern’s cumulated hours from the field practice session.
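The MVC split behind the application can be sketched as follows. Class and method names are hypothetical, and a real Android app would use Java/Kotlin activities and adapters, but the separation of concerns, model holding log entries, view rendering reports, controller mediating, is the same:

```python
class InternshipLogModel:
    """Model: stores daily practice-log entries (hypothetical schema)."""
    def __init__(self):
        self.entries = []                       # list of (intern_id, hours)

    def add(self, intern_id, hours):
        self.entries.append((intern_id, hours))

    def total_hours(self, intern_id):
        return sum(h for i, h in self.entries if i == intern_id)

class InternshipLogView:
    """View: renders a cumulated-hours report for one intern."""
    def render(self, intern_id, total):
        return f"Intern {intern_id}: {total} cumulated hours"

class InternshipLogController:
    """Controller: receives user actions, updates the model, asks the view."""
    def __init__(self, model, view):
        self.model, self.view = model, view

    def log_day(self, intern_id, hours):
        self.model.add(intern_id, hours)

    def report(self, intern_id):
        return self.view.render(intern_id, self.model.total_hours(intern_id))

controller = InternshipLogController(InternshipLogModel(), InternshipLogView())
controller.log_day("s01", 8.0)
controller.log_day("s01", 7.5)
summary = controller.report("s01")
```

Swapping the view (say, for a supervisor dashboard) requires no change to the model or controller, which is the practical payoff of the pattern in this application.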

Keywords: internship, mobile application, Android OS, smart phone devices, mobile transactional processing system, guidance and monitoring, tertiary education, senior students, model view controller (MVC)

Procedia PDF Downloads 284
134 Simulation, Optimization, and Analysis Approach of Microgrid Systems

Authors: Saqib Ali

Abstract:

Energy sources are classified into two types depending on whether they can be replenished. Sources that cannot be restored to their original form once consumed are nonrenewable energy resources (e.g., coal and fuel oil), whereas resources that are replenished even after being consumed are renewable energy resources (e.g., wind, solar and hydel power). Renewable energy is a cost-effective way to generate clean and green electrical energy, and nowadays the majority of countries are paying heed to energy generation from renewable energy sources (RES). Pakistan mostly relies on conventional, largely nonrenewable energy resources; coal and fuel oil are among the major resources, and their prices increase over time. On the other hand, RES have great potential in the country, and with their deployment greater reliability and a more effective power system can be obtained. In this work a similar concept is used: a hybrid power system is proposed that intermixes renewable and nonrenewable sources. The source side is composed of solar, wind and fuel cells, which are used in an optimal manner to serve the load. The goal is to provide an economical, reliable, uninterruptible power supply, achieved through optimal controllers (PI, PD, PID, FOPID). Optimization techniques are applied to the controllers to achieve the desired results, and advanced algorithms (particle swarm optimization, the flower pollination algorithm) are used to extract the desired output from the controller. A detailed comparison in the form of tables and results is provided to highlight the efficiency of the proposed system.
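As a sketch of the controller family being compared, here is a textbook discrete PID loop driving a first-order plant to its setpoint. The gains, plant model and time step are arbitrary illustrative values, and FOPID and the swarm-based tuning from the abstract are beyond this sketch:

```python
class PID:
    """Textbook discrete PID controller."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def step(self, error):
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Illustrative first-order plant x' = -x + u, driven to a unit setpoint.
pid = PID(kp=2.0, ki=5.0, kd=0.0, dt=0.01)
x, setpoint = 0.0, 1.0
for _ in range(2000):                # 20 s of simulated time
    u = pid.step(setpoint - x)
    x += 0.01 * (-x + u)             # forward-Euler plant update
```

The integral term removes the steady-state error that a pure P controller would leave; PSO or the flower pollination algorithm would then search the (kp, ki, kd) space for the best transient response.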

Keywords: distributed generation, demand-side management, hybrid power system, micro grid, renewable energy resources, supply-side management

Procedia PDF Downloads 70
133 Global Healthcare Village Based on Mobile Cloud Computing

Authors: Laleh Boroumand, Muhammad Shiraz, Abdullah Gani, Rashid Hafeez Khokhar

Abstract:

Cloud computing, the use of hardware and software delivered as a service over a network, has applications in health care. The emergency cases reported in most medical centers call for an efficient scheme that makes health data available with low response time. To this end, we propose a mobile global healthcare village (MGHV) model that combines the components of three deployment models, country, continent and global health clouds, to help solve this problem. In the continent model, two data centers are created, one local and one global: the local data center serves requests originating within the continent, whereas the global one serves the requests of all others. With the methods adopted, relevant medical data are assured to be available to patients, specialists and emergency staff regardless of location and time. In intensive simulation experiments, we observed that a service broker policy optimized for response time yields very good performance in terms of response-time reduction. Our results remain comparable to others as the number of virtual machines increases (80-640 virtual machines), with the proportional increase in response time staying within 9%. The simulation results also show that utilizing MGHV reduces health care expenditures and helps address the shortage of qualified medical staff faced by both developed and developing countries.
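The continent-level routing decision, prefer the local data center for in-continent requests, fall back to the global one, while optimizing response time, can be sketched like this. The dictionary fields and latency figures are invented for illustration and do not come from the paper's simulation:

```python
def route_request(origin_continent, datacenters):
    """Service-broker sketch: among the global data center and any local one
    matching the request's continent, pick the lowest estimated response time."""
    candidates = [dc for dc in datacenters
                  if dc["scope"] == "global" or dc["continent"] == origin_continent]
    return min(candidates, key=lambda dc: dc["latency_ms"] + dc["queue_ms"])

datacenters = [
    {"name": "eu-local", "scope": "local",  "continent": "eu",
     "latency_ms": 20, "queue_ms": 40},
    {"name": "global",   "scope": "global", "continent": None,
     "latency_ms": 90, "queue_ms": 10},
]
local_pick = route_request("eu", datacenters)      # in-continent request
fallback_pick = route_request("asia", datacenters) # no local center available
```

In a fuller simulation the `queue_ms` term would vary with the number of virtual machines, which is where the 80-640 VM scaling experiment comes in.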

Keywords: cloud computing (MCC), e-healthcare, availability, response time, service broker policy

Procedia PDF Downloads 337
132 A Novel Rapid Well Control Technique Modelled in Computational Fluid Dynamics Software

Authors: Michael Williams

Abstract:

The ability to control a flowing well is of the utmost importance. During the kill phase, heavy-weight kill mud is circulated around the well; while this increases bottom-hole pressure, it also increases damage to the near-wellbore formation. The addition of high-density spherical objects has the potential to minimise this near-wellbore damage, increase bottom-hole pressure and reduce the operational time needed to kill the well. The time saving comes from the rapid deployment of high-density spherical objects instead of building a high-density drilling fluid. This research models the well kill process in Computational Fluid Dynamics (CFD) software. A model has been created as a proof of concept to analyse the flow of micron-sized spherical objects in the drilling fluid. Initial results show that this new methodology of spherical objects in drilling fluid agrees with the traditional streamlines seen in particle-free flow. Additional models demonstrate that areas of higher flow rate around the bit can increase the probability of washing out formations but do not affect the flow of the micron-sized spherical objects. Interestingly, areas that experience dimensional changes, such as tool joints and various BHA components, do not at this initial stage appear to experience increased velocity or to create areas of turbulent flow, which bodes well for borehole stability. In conclusion, the initial models of this novel well control methodology have not demonstrated any adverse flow patterns, suggesting that the approach may be viable under field conditions.

Keywords: well control, fluid mechanics, safety, environment

Procedia PDF Downloads 136
131 Integrating Knowledge Distillation of Multiple Strategies

Authors: Min Jindong, Wang Mingxia

Abstract:

With the widespread use of artificial intelligence in daily life, computer vision, and especially deep convolutional neural network models, has developed rapidly. As the complexity of real visual target detection tasks and the required recognition accuracy increase, target detection network models also grow very large. Huge deep neural network models are not conducive to deployment on edge devices with limited resources, and the timeliness of their inference is poor. In this paper, knowledge distillation is used to compress a huge and complex deep neural network model, comprehensively transferring the knowledge it contains to a lightweight network model. Unlike traditional knowledge distillation methods, we propose a novel knowledge distillation scheme that incorporates multi-faceted features, called M-KD. When training and optimizing the deep neural network model for target detection, the soft-target outputs of the teacher network, the relationships between the teacher network's layers, and the feature attention maps of the teacher network's hidden layers are all transferred to the student network as knowledge. At the same time, we introduce an intermediate transition layer, an intermediate guidance layer between the teacher network and the student network, to bridge the large gap between them. Finally, we add an exploration module to the traditional teacher-student knowledge distillation model, so that the student network not only inherits the knowledge of the teacher network but also explores new knowledge and characteristics. Comprehensive experiments using different distillation parameter configurations across multiple datasets and convolutional neural network models demonstrate that our proposed network model achieves substantial improvements in both speed and accuracy.
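The soft-target component of the transferred knowledge is the classic temperature-scaled distillation loss; a minimal pure-Python sketch is below. M-KD's layer-relation, attention-map and exploration terms are not reproduced here, and the three-class logits are toy values:

```python
import math

def softmax(logits, T=1.0):
    """Temperature-softened softmax: higher T spreads probability mass."""
    exps = [math.exp(z / T) for z in logits]
    s = sum(exps)
    return [e / s for e in exps]

def distillation_loss(student_logits, teacher_logits, T=4.0):
    """Hinton-style soft-target loss: cross-entropy between the
    temperature-softened teacher and student distributions, scaled by T^2."""
    p_t = softmax(teacher_logits, T)
    p_s = softmax(student_logits, T)
    return -sum(pt * math.log(ps) for pt, ps in zip(p_t, p_s)) * T * T

teacher = [3.0, 1.0, 0.5]                          # toy teacher logits
matched = distillation_loss(teacher, teacher)      # student mimics teacher
mismatched = distillation_loss([0.5, 1.0, 3.0], teacher)
```

By Gibbs' inequality the loss is smallest when the student matches the teacher's softened distribution, so minimizing it pulls the lightweight student towards the teacher's "dark knowledge" about class similarities.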

Keywords: object detection, knowledge distillation, convolutional network, model compression

Procedia PDF Downloads 248
130 Russia’s Role in Resolving the Nagorno-Karabakh Conflict 1990-2020

Authors: Friba Haidari

Abstract:

The aim of this study is to identify Russia's role in managing the Nagorno-Karabakh conflict between Armenia and Azerbaijan from 1990 to 2020. The Nagorno-Karabakh crisis cannot be considered a mere territorial conflict; it is also a crossroads of the interests of foreign actors. Geopolitical rivalries and access to energy by regional and trans-regional actors have complicated the crisis and created a security challenge in the region, one likely to escalate into full-blown war between the parties involved. The geopolitical situation of Nagorno-Karabakh and its current state have affected all peripheral states in some way. Russia, as one of the main actors on this scene, has been actively involved since the beginning of the crisis. The Russians have always sought to strengthen their influence and presence in the Nagorno-Karabakh crisis; Russia's efforts to weaken the role of the Minsk Group, the presence of Western actors, and the deployment of Russian forces in the disputed area can all be assessed in this context. This study asks: what role did Russia play in managing the Nagorno-Karabakh conflict between Armenia and Azerbaijan between 1990 and 2020? It hypothesizes that Russia prevented the escalation of the conflict through mediation and some coercion. The study is divided into four parts: conflict management as a theoretical framework; the competition and role of actors in the Caucasus region, especially the Minsk Group; the approaches, tools and methods Russia has used in its foreign policy to manage the conflict; and, finally, the relations between the countries involved and Russia's likely future role. The analysis draws on authoritative international sources using an explanatory method.

Keywords: Russia, conflict, nagorno-karabakh, management

Procedia PDF Downloads 61
129 Monitoring of the Chillon Viaducts after Rehabilitation with Ultra High Performance Fiber Reinforced Cement-Based Composite

Authors: Henar Martín-Sanz García, Eleni Chatzi, Eugen Brühwiler

Abstract:

Located on the shore of Lake Geneva in Switzerland, the Chillon Viaducts are two parallel structures consisting of post-tensioned concrete box girders, with a total length of 2 kilometers and 100 m spans. Built in 1969, the bridges currently carry a traffic load of 50,000 vehicles per day, holding a key role both in terms of historic value and socio-economic significance. Although several improvements have been carried out in the past two decades, recent inspections demonstrate an alkali-aggregate reaction in the concrete deck and piers that reduces the concrete strength. To prevent further progression of this issue, a 40 mm layer of Ultra High Performance Fiber Reinforced cement-based Composite (UHPFRC), incorporating rebars, was cast over the slabs, acting as a waterproof membrane and providing a significant increase in the resistance of the bridge structure through UHPFRC-RC composite action, in particular of the deck slab. After completion of the rehabilitation works, a structural monitoring campaign was installed on the deck slab of one representative span, based on accelerometers, strain gauges, and thermal and humidity sensors. The campaign seeks to reveal information on the behavior of UHPFRC-concrete composite systems, such as the increase in stiffness, fatigue strength, durability and long-term performance; it is consequently expected to last at least three years. A first insight into the analyzed results from the initial months of measurements is presented herein, along with future improvements or necessary changes to the deployment.

Keywords: composite materials, rehabilitation, structural health monitoring, UHPFRC

Procedia PDF Downloads 254
128 Human Vibrotactile Discrimination Thresholds for Simultaneous and Sequential Stimuli

Authors: Joanna Maj

Abstract:

Body-machine interfaces (BMIs) afford users a non-invasive way to coordinate movement. Vibrotactile stimulation has been incorporated into BMIs to provide real-time feedback and guide movement control, benefiting patients with cognitive deficits such as stroke survivors. To advance research in this area, we examined vibration discrimination thresholds at four body locations to determine suitable application sites for future multi-channel BMIs that use vibration cues to guide movement planning and control. Twelve healthy adults had a pair of small vibrators (tactors) affixed to the skin at each location: forearm, shoulders, torso, and knee. A "standard" stimulus (186 Hz; 750 ms) and "probe" stimuli (11 levels ranging from 100 Hz to 235 Hz; 750 ms) were delivered; probe and standard stimuli could occur sequentially or simultaneously. Participants verbally indicated which stimulus felt more intense. Stimulus order was counterbalanced across tactors and body locations. The probability that each probe stimulus felt more intense than the standard was computed and fit with a cumulative Gaussian function; the discrimination threshold was defined as one standard deviation of the underlying distribution. Threshold magnitudes depended on stimulus timing and location: discrimination thresholds were better for stimuli applied sequentially rather than simultaneously at the torso and the knee, while thresholds at the shoulder were small (better) and relatively insensitive to timing differences. BMI applications requiring multiple channels of simultaneous vibrotactile stimulation should therefore consider the shoulder as a deployment site for a vibrotactile BMI interface.
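The threshold-estimation step can be sketched as a cumulative-Gaussian fit. This toy version uses a coarse grid search on noise-free synthetic data; the study presumably used a proper maximum-likelihood fit, and only the 186 Hz standard and 11 probe levels are taken from the abstract (the 20 Hz "true" threshold is invented):

```python
import math

def gauss_cdf(x, mu, sigma):
    """Cumulative Gaussian psychometric function."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def fit_threshold(freqs, probs):
    """Grid-search fit of (mu, sigma); the discrimination threshold is
    one standard deviation of the fitted underlying distribution."""
    lo, hi = min(freqs), max(freqs)
    best = None
    for i in range(101):                         # mu grid over probe range
        mu = lo + i * (hi - lo) / 100.0
        for s in range(1, 101):                  # sigma grid: 0.5 .. 50 Hz
            sigma = s / 2.0
            err = sum((gauss_cdf(f, mu, sigma) - p) ** 2
                      for f, p in zip(freqs, probs))
            if best is None or err < best[0]:
                best = (err, mu, sigma)
    return best[1], best[2]   # point of subjective equality, threshold

freqs = [100.0 + i * 13.5 for i in range(11)]    # 11 probe levels, 100-235 Hz
probs = [gauss_cdf(f, 186.0, 20.0) for f in freqs]  # synthetic responses
pse, threshold = fit_threshold(freqs, probs)
```

Smaller fitted sigmas mean steeper psychometric functions, i.e. better discrimination, which is how "better" thresholds at the shoulder would show up in this analysis.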

Keywords: electromyography, electromyogram, neuromuscular disorders, biomedical instrumentation, controls engineering

Procedia PDF Downloads 35
127 The Security Trade-Offs in Resource Constrained Nodes for IoT Application

Authors: Sultan Alharby, Nick Harris, Alex Weddell, Jeff Reeve

Abstract:

The concept of the Internet of Things (IoT) has received much attention over the last five years, and the IoT is predicted to influence every aspect of our lifestyles in the near future. Wireless Sensor Networks (WSNs) are one of the key enablers of IoT operation, allowing data to be collected from the surrounding environment; however, due to limited resources, the nature of their deployment and unattended operation, a WSN is vulnerable to various types of attack. Security is paramount for reliable and safe communication between IoT embedded devices, but it comes at a cost in resources. Nodes are usually equipped with small batteries, which makes energy conservation crucial for IoT devices; nevertheless, the security cost in terms of energy consumption has not been studied sufficiently. Previous research has used the 802.15.4 security specification for IoT applications, but the energy cost of each security level and its impact on quality-of-service (QoS) parameters remain unknown. This research focuses on the cost of security at the IoT media access control (MAC) layer. It begins by studying the energy consumption of the IEEE 802.15.4 security levels, followed by an evaluation of the impact of security on data latency and throughput; it then presents the impact of transmission power on security overhead and finally shows the effects of security on memory footprint. The results show that the security overhead in terms of energy consumption with a payload of 24 bytes ranges from 31.5% over non-secure packets at the minimum security level to 60.4% at the top security level of the 802.15.4 specification. They also show that the security cost is lower for longer packets and higher for smaller ones. In addition, the results depict a significant impact on data latency and throughput: overall, maximum-length authentication decreases throughput by almost 53%, and encryption together with authentication by almost 62%.
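The headline trend, security overhead shrinking as the payload grows, follows directly from the fixed per-packet security bytes (e.g. a MIC). The model below only illustrates that arithmetic; the per-byte energy, 25-byte header and uniform per-byte cost are assumptions, not the paper's measured values:

```python
def security_energy_overhead(payload_bytes, security_bytes,
                             header_bytes=25, e_byte_uj=1.0):
    """Relative transmit-energy overhead (percent) of adding a fixed number
    of security bytes to each packet, under a flat energy-per-byte model."""
    base = (header_bytes + payload_bytes) * e_byte_uj
    secured = base + security_bytes * e_byte_uj
    return 100.0 * (secured - base) / base

small_packet = security_energy_overhead(24, 16)    # 24-byte payload, 16-byte MIC
large_packet = security_energy_overhead(100, 16)   # same MIC, bigger payload
```

The fixed 16-byte cost is diluted over the larger frame, reproducing the qualitative finding that security costs less, proportionally, on longer packets (the paper's measured 31.5-60.4% figures also include radio and CPU effects this toy model omits).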

Keywords: energy consumption, IEEE 802.15.4, IoT security, security cost evaluation

Procedia PDF Downloads 134
126 Fast Aerodynamic Evaluation of Transport Aircraft in Early Phases

Authors: Xavier Bertrand, Alexandre Cayrel

Abstract:

The early phase of an aircraft development is instrumental, as it really drives the potential of a new concept: any weakness in the high-level design (wing planform, moveable-surfaces layout, etc.) will be extremely difficult and expensive to recover later in the development process. Aerodynamic evaluation in this very early phase is driven by two main criteria: a short lead time, to allow quick iterations of the geometrical design, and a high quality of the calculations, to get an accurate and reliable assessment of the current status. These two criteria are usually contradictory. A short end-to-end lead time of a couple of hours can be obtained with very simple tools (semi-empirical methods, for instance), although their accuracy is limited, whereas higher-quality calculations require heavier, more complex tools, which need more complex inputs as well and have a significantly longer lead time; at this point, a choice normally has to be made between accuracy and lead time. A brand-new approach has been developed within Airbus, aiming at quickly obtaining high-quality evaluations of the aerodynamics of an aircraft. The methodology is based on the joint use of surrogate modelling and a lifting-line code. The surrogate model provides the wing-section characteristics (e.g., lift coefficient vs. angle of attack), whatever the airfoil geometry, the status of the moveable surfaces (ailerons/spoilers) or the deployment of the high-lift devices. From these characteristics, the lifting-line code computes the 3D effects on the wing, whatever the flow conditions (low/high Mach numbers, etc.). The methodology has been applied successfully to a medium-range aircraft concept.
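The coupling described here, surrogate-provided section characteristics fed into a lifting-line step, can be caricatured in a few lines. A lookup table stands in for the surrogate model, and the single-slope correction below is the simplest textbook lifting-line result for an elliptically loaded wing, not Airbus's code; the table values, aspect ratio and Oswald factor are illustrative:

```python
import math

def section_cl(alpha_deg, table):
    """Surrogate stand-in: linear interpolation in a precomputed
    (angle of attack -> 2D lift coefficient) table for a wing section."""
    pts = sorted(table.items())
    for (a0, c0), (a1, c1) in zip(pts, pts[1:]):
        if a0 <= alpha_deg <= a1:
            t = (alpha_deg - a0) / (a1 - a0)
            return c0 + t * (c1 - c0)
    raise ValueError("alpha outside table range")

def finite_wing_cl(alpha_deg, table, aspect_ratio, e=0.95):
    """Classic lifting-line correction: the 2D lift-curve slope is reduced
    by the induced angle of attack on a finite wing (symmetric section)."""
    a2d = (section_cl(1.0, table) - section_cl(0.0, table)) / math.radians(1.0)
    a3d = a2d / (1.0 + a2d / (math.pi * e * aspect_ratio))
    return a3d * math.radians(alpha_deg)

# Toy section table: Cl = 0.11 per degree (close to the thin-airfoil slope).
table = {-5.0: -0.55, 0.0: 0.0, 5.0: 0.55, 10.0: 1.1}
cl3d = finite_wing_cl(5.0, table, aspect_ratio=9.0)
```

The 3D wing lifts less than the 2D section at the same angle of attack, which is exactly the kind of effect the real lifting-line code resolves span-wise, and the surrogate's job is to keep the section table accurate for any flap or spoiler setting.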

Keywords: aerodynamics, lifting line, surrogate model, CFD

Procedia PDF Downloads 320
125 Importance of Field Hospitals in Trauma Management: An Experience from Nepal Earthquake

Authors: Krishna Gopal Lageju

Abstract:

On 25 April 2015, a 7.6-magnitude earthquake struck the Gorkha district of Nepal, resulting in over 8,790 deaths and 22,300 injuries. In addition, almost one-third of the country's healthcare service was disrupted: a total of 1,211 health facilities became non-operational, 446 completely damaged and another 765 partially damaged. Nearly 84 percent (375 of 446) of the completely damaged health facilities are in the 14 most affected districts. As a result, the ability of health facilities to respond to health care needs was severely affected; 18 health workers also lost their lives and 75 were injured, adding further challenges to the delivery of health services. To address the immediate health needs in the most devastated areas, the Nepal Red Cross Society (NRCS), in coordination with the IFRC and the Government of Nepal, established 8 field hospitals with surgical capacity, for which around 492 members of international Emergency Response Units (ERUs) were mobilized for a 3-month period. More than 54,000 patients were treated in the Red Cross-operated field hospitals. Trauma cases accounted for 9,180 (17%) of all patients, of which 1,285 (14%) were major surgical cases. Most of the caseload, 44,830 patients (83%), was outpatient, while 9,180 patients received inpatient service; likewise, 112 deliveries were performed in the field hospitals. The inpatient mortality rate was 1.5% (21 deaths), many of these patients presenting with critical injuries or illnesses, and no disease outbreak was observed during the ERU operation. Deploying ERUs together with national health workers is very important for addressing the immediate health needs of affected communities, as it eases the transition and the handover of emergency services and equipment to local providers. Likewise, building the capacity of local staff through on-the-job training in various clinical skills is another important issue to address before phasing out such services.

Keywords: trauma management, critical injuries, earthquake, health

Procedia PDF Downloads 217
124 Game-Theory-Based Downlink Spectrum Allocation in Two-Tier Networks

Authors: Yu Zhang, Ye Tian, Fang Ye, Yixuan Kang

Abstract:

The capacity of conventional cellular networks has reached its upper bound, which can be addressed by introducing low-cost, easy-to-deploy femtocells. Spectrum interference becomes more critical as value-added multimedia services grow in two-tier cellular networks, and spectrum allocation is one of the effective interference mitigation techniques. This paper proposes a game-theory-based OFDMA downlink spectrum allocation scheme aimed at reducing co-channel interference in two-tier femtocell networks. The framework is formulated as a non-cooperative game wherein the femto base stations are players and the available frequency channels are strategies. The scheme takes full account of competitive behavior and fairness among stations. In addition, the utility function essentially reflects the interference from the standpoint of channels. This work focuses on co-channel interference and puts forward a negative-logarithm interference function of the distance weight ratio, aimed at suppressing co-channel interference within the same network layer. This scenario is well suited to actual network deployment, and the system possesses high robustness. Under the proposed mechanism, interference exists only when players employ the same channel for data communication. This paper focuses on implementing spectrum allocation in a distributed fashion. Numerical results show that the signal to interference and noise ratio can be markedly improved through the spectrum allocation scheme and that users' downlink quality of service can be satisfied. Besides, the simulation results show that the average spectrum efficiency in the cellular network can be significantly improved.
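As a rough illustration of the non-cooperative formulation, the sketch below lets each femto base station iteratively best-respond by choosing the channel that minimizes a negative-logarithm interference function of the distance ratio to its co-channel neighbours. All names and the exact utility form are assumptions for illustration, not the paper's code; since the pairwise co-channel costs are symmetric, sequential best responses form an exact potential game and settle at a Nash equilibrium.

```python
import math
import random

def interference(i, c, assign, pos, d_max):
    """Co-channel interference seen by station i on channel c: each other
    station on c contributes -log(d/d_max), so nearer neighbours weigh more.
    (Illustrative form inspired by the abstract, not the paper's exact utility.)"""
    total = 0.0
    for j, cj in assign.items():
        if j != i and cj == c:
            total += -math.log(math.dist(pos[i], pos[j]) / d_max)
    return total

def best_response_allocation(pos, channels, max_rounds=100, seed=0):
    """Sequential best-response dynamics over the channel-selection game."""
    rng = random.Random(seed)
    # d_max slightly above the largest pairwise distance keeps every ratio < 1
    d_max = 1.01 * max(math.dist(a, b) for a in pos for b in pos if a != b)
    assign = {i: rng.choice(channels) for i in range(len(pos))}
    for _ in range(max_rounds):
        changed = False
        for i in range(len(pos)):
            best = min(channels,
                       key=lambda c: interference(i, c, assign, pos, d_max))
            if (interference(i, best, assign, pos, d_max)
                    < interference(i, assign[i], assign, pos, d_max)):
                assign[i] = best
                changed = True
        if not changed:          # no station can improve: Nash equilibrium
            break
    return assign
```

With two close station pairs and two channels, each pair ends up split across the channels, which is the interference-suppressing outcome the scheme targets.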

Keywords: femtocell networks, game theory, interference mitigation, spectrum allocation

Procedia PDF Downloads 128
123 Stressors Faced by Border Security Officers: The Singapore Experience

Authors: Jansen Ang, Andrew Neo, Dawn Chia

Abstract:

Border security is unlike mainstream policing in that officers are essentially in static deployment, working round the clock every day and every hour of the year looking for illegitimate entry of persons and goods. In Singapore, border security officers perform multiple functions to ensure the nation’s safety and security. They are responsible for safeguarding the borders of Singapore to prevent threats from entering the country. As the first line of defence in ensuring the nation’s border security, officers are entrusted with the responsibility of screening travellers inbound and outbound of Singapore daily. They examined 99 million arrivals and departures at the various checkpoints in 2014, a considerable volume compared to most immigration agencies. The officers’ work scope also includes cargo clearance and the protective and security functions of checkpoints. The officers work in very demanding environments, ranging from the smog at the land checkpoints to the harshness of the ports at the sea checkpoints. In addition, all immigration checkpoints are located at the boundaries, posing commuting challenges for officers. At the land checkpoints, festive seasons and school breaks are peak periods, given the surge of inbound and outbound travellers at the various checkpoints. Such work presents unique challenges in comparison to other law enforcement duties. This paper assesses the current stressors faced by officers of a border security agency through ground observations and a perceived stress survey, and offers recommendations for combating the stressors faced by border security officers. The findings from the field observations and surveys indicate organisational and operational stressors that are unique to border security, and the paper recommends interventions for managing these stressors. Understanding these stressors would better inform border security agencies on the interventions needed to enhance the resilience of border security officers.

Keywords: border security, Singapore, stress, operations

Procedia PDF Downloads 296
122 Comparative Analysis of Data Gathering Protocols with Multiple Mobile Elements for Wireless Sensor Network

Authors: Bhat Geetalaxmi Jairam, D. V. Ashoka

Abstract:

Wireless sensor networks are used in many applications to collect sensed data from different sources. Sensed data has to be delivered through the sensors' wireless interface using multi-hop communication towards the sink. Data collection in wireless sensor networks consumes energy, and energy consumption is the major constraint in WSNs; reducing energy consumption while increasing the amount of generated data is a great challenge. In this paper, we have implemented two data gathering protocols with multiple mobile sinks/elements to collect data from sensor nodes. The first is Energy-Efficient Data Gathering with Tour Length-Constrained Mobile Elements in Wireless Sensor Networks (EEDG), in which mobile sinks use a vehicle routing protocol to collect data. The second is An Intelligent Agent-based Routing Structure for Mobile Sinks in WSNs (IAR), in which mobile sinks use Prim's algorithm to collect data. We have implemented the concepts common to both protocols, such as deployment of mobile sinks, generation of the visiting schedule, and collection of data from cluster members, and we have compared the performance of both protocols using statistics on performance parameters such as delay, packet drop, packet delivery ratio, available energy, and control overhead. We conclude that EEDG is more efficient than the IAR protocol, but with a few limitations, including unaddressed issues such as redundancy removal, idle listening, and the mobile sink's pause/wait state at the node. In future work, we plan to concentrate on these limitations to arrive at a new energy-efficient protocol that will help improve the lifetime of the WSN.
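The IAR scheme's use of Prim's algorithm can be sketched as follows. This is a generic textbook implementation over node coordinates, not the authors' code; in an IAR-style scheme, the resulting minimum spanning tree would serve as the routing structure the mobile sink traverses.

```python
import math

def prim_mst(points):
    """Prim's algorithm over node coordinates: grows a minimum spanning
    tree from node 0 and returns its edges as (parent, child) index pairs."""
    n = len(points)
    in_tree = [False] * n
    dist = [math.inf] * n      # cheapest known connection cost to the tree
    parent = [-1] * n
    dist[0] = 0.0
    edges = []
    for _ in range(n):
        # pick the cheapest node not yet in the tree
        u = min((i for i in range(n) if not in_tree[i]), key=lambda i: dist[i])
        in_tree[u] = True
        if parent[u] != -1:
            edges.append((parent[u], u))
        # relax connection costs of the remaining nodes through u
        for v in range(n):
            if not in_tree[v]:
                d = math.dist(points[u], points[v])
                if d < dist[v]:
                    dist[v], parent[v] = d, u
    return edges
```

For a line of sensor nodes at (0,0), (1,0), (2,0), (10,0), the tree simply chains the nodes left to right, giving the sink a short traversal structure.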

Keywords: aggregation, consumption, data gathering, efficiency

Procedia PDF Downloads 467
121 Cognitive Science Based Scheduling in Grid Environment

Authors: N. D. Iswarya, M. A. Maluk Mohamed, N. Vijaya

Abstract:

A grid is an infrastructure that allows the deployment of large distributed data sets from multiple locations to reach a common goal. Scheduling data-intensive applications becomes challenging as the data sets involved are very large. Only two solutions exist to tackle this issue: either the computation that requires huge data sets is transferred to the data site, or the required data sets are transferred to the computation site. In the former scenario the computation cannot be transferred, since the servers are storage/data servers with little or no computational capability; hence the second scenario is considered for further exploration. During scheduling, transferring huge data sets from one site to another requires considerable network bandwidth. To mitigate this issue, this work focuses on incorporating cognitive science into scheduling. Cognitive science is the study of the human brain and its related activities, and current research mainly focuses on incorporating it into various computational modeling techniques. In this work, the problem-solving approach of the human brain is studied and incorporated into data-intensive scheduling in grid environments. A cognitive engine (CE) is designed and deployed at various grid sites. The intelligent agents in the CE help analyze each request and build the knowledge base. Depending upon the link capacity, a decision is taken on whether to transfer the data sets or to partition them. The agents also predict the next request so as to serve the requesting site with data sets in advance, which reduces data availability time and data transfer time. A replica catalog and a metadata catalog created by the agents assist in the decision-making process.
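A toy version of the transfer-versus-partition decision might look like the following. The function name, units, deadline criterion, and partition count are all illustrative assumptions, since the abstract does not give the cognitive engine's actual rule.

```python
def plan_transfer(dataset_gb, link_gbps, deadline_s, partition_sites=4):
    """Toy decision rule in the spirit of the cognitive engine: if the link
    can move the whole data set within the deadline, transfer it whole;
    otherwise partition it across several sites and move pieces in parallel."""
    transfer_time = dataset_gb * 8 / link_gbps      # GB -> Gb, then seconds
    if transfer_time <= deadline_s:
        return ("transfer", transfer_time)
    # parallel partitioned transfer over several replica sites
    return ("partition", transfer_time / partition_sites)
```

A 10 GB set over a 1 Gbps link fits a 100 s deadline and is transferred whole; a 100 GB set does not, so it is partitioned.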

Keywords: data grid, grid workflow scheduling, cognitive artificial intelligence

Procedia PDF Downloads 368
120 Study on the Situation between France and the South China Sea from the Perspective of Balance of Power Theory

Authors: Zhenyi Chen

Abstract:

With the rise of China and the escalation of tension between China and the United States, European countries led by Great Britain, France, and Germany are paying increasing attention to the regional situation in the Asia-Pacific (now known as the "Indo-Pacific"). Among them, the South China Sea (SCS) is one of the main areas disputed by China, the United States, Southeast Asian countries, and some European countries. Western countries worry that the rise of China's military power will break the stability of the situation in the SCS and alter the balance of power among major powers; therefore, they try to balance China's rise through alliances. In its Indo-Pacific strategy, France aims to build a regional order with the alliance of France, India, and Australia at its core, and regularly carries out military exercises targeting the SCS with the United States, Japan, and Southeast Asian countries. For China, instability in the SCS could also threaten the security of the southeast coastal areas and Taiwan, affect China's peaceful development process, and pose a threat to China's territorial sovereignty. This paper studies the activities and motivations of France in the South China Sea and places the situation in the SCS within the perspective of balance of power theory, focusing on China, the United States, and France. More specifically, the paper first briefly introduces balance of power theory, then describes France's new trends in recent years, followed by an analysis of the motivations behind France's increasing involvement in the SCS, and finally analyzes the situation in the SCS from the perspective of balance of power theory. It is argued that the great powers are carefully maintaining the balance of military power in the SCS and that this trend is highly likely to persist in the medium and long term, particularly via military deployment and strategic alliances.

Keywords: South China Sea, France, China, balance of power theory, Indo-Pacific

Procedia PDF Downloads 149
119 Reinforcement-Learning Based Handover Optimization for Cellular Unmanned Aerial Vehicles Connectivity

Authors: Mahmoud Almasri, Xavier Marjou, Fanny Parzysz

Abstract:

The demand for services provided by Unmanned Aerial Vehicles (UAVs) is increasing pervasively across several sectors, including public safety, economic, and delivery services. As the number of applications using UAVs grows rapidly, ever more powerful, QoS-aware, and power-efficient computing units are necessary. Recently, cellular technology has drawn attention as connectivity that can ensure reliable and flexible communication services for UAVs. In cellular networks, flying at high speed and altitude is subject to several key challenges, such as frequent handovers (HOs), high interference levels, and connectivity coverage holes. Excess HOs may lead to "ping-pong" between the UAV and the serving cells, resulting in decreased quality of service and increased energy consumption. In order to optimize the number of HOs, we develop in this paper a Q-learning-based algorithm. While existing works focus on adjusting the number of HOs in a static network topology, we take into account the impact of cell deployment for three different simulation scenarios (rural, semi-rural, and urban areas). We also consider the impact of the decision distance, at which the drone may make a switching decision that affects the number of HOs. Our results show that a Q-learning-based algorithm significantly reduces the average number of HOs compared to a baseline case in which the drone always selects the cell with the highest received signal. Moreover, we identify which hyper-parameters have the largest impact on the number of HOs in the three tested environments, i.e., rural, semi-rural, and urban.
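The core tabular update behind such a Q-learning scheme can be sketched as follows. This is a generic formulation; the state/action encoding and the handover-penalty reward the authors actually use are not specified in the abstract and are assumptions here.

```python
import random
from collections import defaultdict

def q_update(Q, state, action, reward, next_state, actions, alpha=0.1, gamma=0.9):
    """One tabular Q-learning step:
    Q(s,a) += alpha * (r + gamma * max_a' Q(s',a') - Q(s,a))."""
    best_next = max(Q[(next_state, a)] for a in actions)
    Q[(state, action)] += alpha * (reward + gamma * best_next - Q[(state, action)])

def choose_cell(Q, state, actions, eps, rng):
    """Epsilon-greedy serving-cell choice. A reward that subtracts a penalty
    per handover (rather than always taking the strongest cell, the baseline
    in the abstract) is what discourages ping-pong behaviour."""
    if rng.random() < eps:
        return rng.choice(actions)
    return max(actions, key=lambda a: Q[(state, a)])
```

With `eps=0` the policy is purely greedy, matching how a trained agent would be deployed along the drone's route.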

Keywords: drones connectivity, reinforcement learning, handovers optimization, decision distance

Procedia PDF Downloads 73
118 The Social Psychology of Illegal Game Room Addiction in the Historic Chinatown District of Honolulu, Hawaii: Illegal Compulsive Gambling, Chinese-Polynesian Organized Crime Syndicates, Police Corruption, and Loan Sharking Rings

Authors: Gordon James Knowles

Abstract:

Historically, the Chinatown district in the Sandwich Islands has been plagued with the traditional vice crimes of illegal drugs, gambling, and prostitution since the early 1800s. However, a new form of psychologically addictive arcade-style table gambling machine has become the dominant source of illegal revenue in Honolulu, Hawaii. This study attempts to document the drive, desire, or will to play and wager on arcade-style video gaming and to understand the role of illegal game rooms in facilitating pathological gambling addiction. Indicators of police corruption by Chinese organized crime syndicates related to protection rackets, bribery, and pay-offs were revealed. Information fusion from a police science and sociological intelligence perspective indicates that insurgent warfare is being waged on the streets of Honolulu by the People’s Republic of China. This state-sponsored communist terrorism in the Hawaiian Islands uses "contactless" irregular warfare entailing: (1) the deployment of psychologically addictive gambling machines, (2) the distribution of the physically addictive drug fentanyl as a lethal chemical weapon, and (3) psychological warfare through the circulation of pro-China, anti-American propaganda newspapers targeted at the small island populace.

Keywords: Chinese and Polynesian organized crime, China Daily newspaper, electronic arcade-style table games, gaming technology addiction, illegal compulsive gambling, police intelligence

Procedia PDF Downloads 33
117 Wake Effects of Wind Turbines and Its Impacts on Power Curve Measurements

Authors: Sajan Antony Mathew, Bhukya Ramdas

Abstract:

The impetus of wind energy deployment over the last few decades has seen potential sites being harvested very actively for wind farm development. Owing to the scarcity of highly potential sites, turbine siting is increasingly optimized, with minimum spacing between turbines adopted without compromising the optimization of energy yield. The optimization of the energy yield from a wind turbine is achieved by effective micrositing techniques. These time-tested techniques, applied from site to site under terrain conditions that meet the requirements of the international standard for power performance measurements of wind turbines, result in turbine positions with optimized energy yields. The international standard for power curve measurements provides rules of procedure and a methodology for evaluating the terrain, obstacles, and measurement sector. At many sites there are challenges in complying with these requirements for terrain, obstacles, and the measurement sector. Studies are being attempted to carry out these measurements within the scope of the international standard, as alternative procedures specified in other standards and the integration of LIDAR for power curve measurements are still at a nascent stage. This paper strives, first, to demonstrate that if a wind turbine is optimally positioned for energy yield, no wake effects are seen on the power curve of an adjacent wind turbine; and second, that the sector for measurements in such a case could include sectors which would otherwise have to be excluded under the requirements of the international standard for power performance measurements.

Keywords: micrositing, optimization, power performance, wake effects

Procedia PDF Downloads 440
116 World Peace and Conflict Resolution: A Solution from a Buddhist Point of View

Authors: Samitharathana R. Wadigala

Abstract:

Peace will not be established until self-consciousness is revealed in human beings. In this nuclear age, the establishment of a lasting peace on earth represents the primary condition for the preservation of human civilization and the survival of human beings. Nothing, perhaps, is so important and indispensable as the achievement and maintenance of peace in the modern world. Peace in today’s world implies much more than the mere absence of war and violence. In the interdependent world of today, the United Nations needs to be representative of the modern world and democratic in its functioning, because it came into existence to save the generations from the scourge of war and conflict. Buddhism is a religion of peaceful co-existence and a philosophy of enlightenment. Violence and conflict, from the perspective of the Buddhist theory of interdependent origination (Paṭiccasamuppāda), are, like everything else in the world, products of causes and conditions. Buddhism is fully compatible with a congenial and peaceful global order. The canonical literature, doctrines, and philosophy of Buddhism are well suited for inter-faith dialogue, harmony, and universal peace. Even today, Buddhism can resurrect universal brotherhood, peaceful co-existence, and harmonious surroundings in the comity of nations. With its increasing vitality in regions around the world, many people today turn to Buddhism for relief and guidance at a time when peace seems to be a deferred dream more than ever. From a Buddhist point of view, the roots of all unwholesome actions, i.e., greed, hatred, and delusion, are viewed as the root causes of human conflict. Conflict often emanates from attachment to material things: pleasures, property, territory, wealth, economic dominance, or political superiority. Buddhism has some particularly rich resources for deployment in dissolving conflict. This paper addresses the Buddhist perspective on the causes of conflict and ways to resolve conflict so as to realize world peace. The world has enough to satisfy everybody's needs, but not everybody's greed.

Keywords: Buddhism, conflict-violence, peace, self-consciousness

Procedia PDF Downloads 186
115 Developing an Automated Protocol for the Wristband Extraction Process Using Opentrons

Authors: Tei Kim, Brooklynn McNeil, Kathryn Dunn, Douglas I. Walker

Abstract:

To better characterize the relationship between complex chemical exposures and disease, our laboratory uses an approach that combines low-cost polydimethylsiloxane (silicone) wristband samplers, which absorb many of the chemicals we are exposed to, with untargeted high-resolution mass spectrometry (HRMS) to characterize thousands of chemicals at a time. In studies with human populations, these wristbands can provide an important measure of our environment; however, there is a need to use this approach in large cohorts to study exposures associated with disease. To facilitate the use of silicone samplers in large-scale population studies, the goal of this research project was to establish automated sample preparation methods that improve the throughput, robustness, and scalability of analytical methods for silicone wristbands. Using the Opentrons OT-2 automated liquid handling platform, which provides a low-cost and open-source framework for automated pipetting, we created two separate workflows that translate the manual wristband preparation method into a fully automated protocol requiring only minor intervention by the operator. These protocols include a sequence generation step, which defines the location of all plates and labware according to user-specified settings, and a transfer protocol that includes all necessary instrument parameters and instructions for automated solvent extraction of wristband samplers. The protocols were written in Python and uploaded to GitHub for use by others in the research community. Results from this project show it is possible to establish automated, open-source methods for the preparation of silicone wristband samplers to support profiling of many environmental exposures. Ongoing studies include deployment in longitudinal cohort studies to investigate the relationship between personal chemical exposure and disease.
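As a small illustration of what the sequence generation step might look like, the hypothetical helper below (not the published GitHub code) lays out well positions on a standard 96-well plate, column by column, for a given number of wristband extracts; the layout convention is an assumption.

```python
import string

def well_sequence(n_samples, rows=8, cols=12):
    """Generate well positions for n_samples wristband extracts in
    column-major order (A1..H1, then A2..H2, ...) on a 96-well plate."""
    if n_samples > rows * cols:
        raise ValueError("more samples than wells on one plate")
    wells = []
    for c in range(1, cols + 1):
        for r in string.ascii_uppercase[:rows]:
            wells.append(f"{r}{c}")
    return wells[:n_samples]
```

Such a sequence could then be fed to the transfer protocol so the liquid handler visits each destination well in a deterministic order.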

Keywords: bioinformatics, automation, opentrons, research

Procedia PDF Downloads 81
114 Cosmic Muon Tomography at the Wylfa Reactor Site Using an Anti-Neutrino Detector

Authors: Ronald Collins, Jonathon Coleman, Joel Dasari, George Holt, Carl Metelko, Matthew Murdoch, Alexander Morgan, Yan-Jie Schnellbach, Robert Mills, Gareth Edwards, Alexander Roberts

Abstract:

At the Wylfa Magnox power plant, the VIDARR prototype anti-neutrino detector was deployed between 2014 and 2016. It is composed of extruded plastic scintillating bars measuring 4 cm × 1 cm × 152 cm and utilises wavelength-shifting (WLS) fibres and multi-pixel photon counters (MPPCs) to detect and quantify radiation. During deployment, it took cosmic muon data in accidental coincidence with the anti-neutrino measurements, with the power plant site buildings obscuring the muon sky. Cosmic muons have a significantly higher probability of being attenuated and/or absorbed by denser objects, so one-sided cosmic muon tomography was utilised to image the reactor site buildings. In order to achieve clear building outlines, a control data set with minimal occlusion of the cosmic muon flux by dense objects was taken at the University of Liverpool from 2016 to 2018. By taking the ratio of these two data sets and using GEANT4 simulations, it is possible to perform a one-sided cosmic muon tomography analysis. This analysis can discern specific buildings, building heights, and features at the Wylfa reactor site, including the reactor core and its shielding, using ∼3 hours' worth of cosmic-ray detector live time. This result demonstrates the feasibility of using cosmic muon analysis to determine a segmented detector's location with respect to surrounding buildings, assisted by aerial photography or satellite imagery.
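The ratio analysis can be illustrated with a toy per-angular-bin comparison. The function name and the 0.5 threshold are assumptions for illustration; the real analysis also normalises by live time and folds in GEANT4-simulated acceptance.

```python
def occlusion_map(deployed_counts, control_counts, threshold=0.5):
    """Per-angular-bin ratio of on-site muon counts to open-sky control
    counts; bins with a ratio below `threshold` are flagged as occluded
    by dense material (e.g. reactor shielding). Returns (ratio, flag) pairs."""
    flags = []
    for dep, ctl in zip(deployed_counts, control_counts):
        ratio = dep / ctl if ctl > 0 else float("nan")
        flags.append((ratio, ratio < threshold))
    return flags
```

A bin with equal counts in both data sets is unoccluded, while a bin whose on-site rate collapses relative to the control is flagged as sitting behind dense structure.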

Keywords: anti-neutrino, GEANT4, muon, tomography, occlusion

Procedia PDF Downloads 160
113 Deployment of Information and Communication Technology (ICT) to Reduce Occurrences of Terrorism in Nigeria

Authors: Okike Benjamin

Abstract:

Terrorism is the use of violence and threats to intimidate or coerce a person, group, society, or even a government, especially for political purposes. Terrorism may be a way of resisting government by a group who feel marginalized, or a way of expressing displeasure over the activities of government. On 26th December 2009, the US designated Nigeria a terrorist nation. Recently, the occurrences of terrorism in Nigeria have increased considerably. In Jos, Plateau State, Nigeria, a bomb blast claimed many lives on the eve of Christmas 2010. Similarly, there was another bomb blast at the Mogadishu (Sani Abacha) Barracks Mammy market on the eve of the 2011 New Year. For some time now, it has no longer been news that bombs have exploded in some northern parts of Nigeria. About 25 years ago, stopping terrorism in America relied on old-fashioned tools such as strict physical security at vulnerable places, intelligence gathering by government agents or individuals, vigilance on the part of all citizens, and a sense of community in which citizens do what can be done to protect each other. Just as technology has been used to improve the way many other things are done, this powerful new weapon called computer technology can be used to detect and prevent terrorism, not only in Nigeria but all over the world. This paper examines the possible causes and effects of bomb blasts, which are acts of terrorism, and suggests ways in which Explosive Detection Devices (EDDs) and computer software technology could be deployed to reduce the occurrences of terrorism in Nigeria. This became necessary with the abduction of over 200 schoolgirls in Chibok, Borno State, from their hostel by members of the Boko Haram sect on 14th April 2014. Presently, Barack Obama and other world leaders have sent some of their military personnel to help rescue those innocent schoolgirls, whose only offence is seeking to acquire the western education that the sect strongly believes is forbidden.

Keywords: terrorism, bomb blast, computer technology, explosive detection devices, Nigeria

Procedia PDF Downloads 239
112 A Wearable Device to Overcome Post-Stroke Learned Non-Use; The Rehabilitation Gaming System for Wearables: Methodology, Design and Usability

Authors: Javier De La Torre Costa, Belen Rubio Ballester, Martina Maier, Paul F. M. J. Verschure

Abstract:

After a stroke, a great number of patients experience persistent motor impairments such as hemiparesis or weakness in one entire side of the body. As a result, the lack of use of the paretic limb might be one of the main contributors to functional loss after clinical discharge. We aim to reverse this cycle by promoting the use of the paretic limb during activities of daily living (ADLs). To do so, we describe the key components of a system composed of a wearable bracelet (i.e., a smartwatch) and a mobile phone, designed to bring a set of neurorehabilitation principles that promote the acquisition, retention, and generalization of skills into the home of the patient. A fundamental question is whether the loss in motor function derived from learned non-use may emerge as a consequence of decision-making processes for motor optimization. Our system is based on well-established rehabilitation strategies that aim to reverse this behaviour by increasing the reward associated with action execution as well as implicitly reducing the expected cost associated with the use of the paretic limb, following the notion of reinforcement-induced movement therapy (RIMT). Here we validate an accelerometer-based measure of arm use and its capacity to discriminate different activities that require increasing movement of the arm. We also show how the system can act as a personalized assistant by providing specific goals and adjusting them depending on the performance of the patient. The usability and acceptance of the device as a rehabilitation tool are tested using a battery of self-reported and objective measurements obtained from acute/subacute patients and healthy controls. We believe that an extension of these technologies will allow the deployment of unsupervised rehabilitation paradigms during and beyond the hospitalization period.
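An accelerometer-based arm-use measure of the kind described can be sketched as a simple laterality index: the fraction of time each wrist is "active". The function name and threshold value are illustrative assumptions, not the validated measure from the study.

```python
def arm_use_ratio(paretic_mag, nonparetic_mag, active_threshold=0.05):
    """Count samples where each wrist's acceleration magnitude (gravity
    removed, in g) exceeds a threshold, and return the paretic/non-paretic
    activity ratio: values well below 1 suggest learned non-use."""
    active_p = sum(m > active_threshold for m in paretic_mag)
    active_n = sum(m > active_threshold for m in nonparetic_mag)
    if active_n == 0:
        return float("inf") if active_p else 0.0
    return active_p / active_n
```

In a deployed system, this ratio could feed the personalized goal-setting loop, e.g. nudging the patient when the paretic limb falls below a daily target.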

Keywords: stroke, wearables, learned non-use, hemiparesis, ADLs

Procedia PDF Downloads 179
111 Low Overhead Dynamic Channel Selection with Cluster-Based Spatial-Temporal Station Reporting in Wireless Networks

Authors: Zeyad Abdelmageid, Xianbin Wang

Abstract:

Choosing the operational channel for a WLAN access point (AP) has traditionally been a static channel assignment process initiated by the user when deploying the AP, which fails to cope with the subsequent dynamic conditions of the assigned channel at the station side. However, the dramatically growing number of Wi-Fi APs and stations operating in the unlicensed band has led to dynamic, distributed, and often severe interference. This highlights the urgent need for the AP to dynamically select the best overall channel of operation for the basic service set (BSS) by considering the distributed and changing channel conditions at all stations. Consequently, dynamic channel selection algorithms that consider feedback from the station side have been developed. Despite the significant performance improvement, existing channel selection algorithms suffer from very high feedback overhead. Feedback latency from the STAs, due to this high overhead, can cause the eventually selected channel to no longer be optimal for operation, given the dynamic sharing nature of the unlicensed band. This has inspired us to develop our own dynamic channel selection algorithm with reduced overhead through the proposed low-overhead, cluster-based station reporting mechanism. The main idea behind cluster-based station reporting is the observation that STAs which are very close to each other tend to have very similar channel conditions. Instead of requesting each STA to report on every candidate channel, incurring high overhead, the AP divides the STAs into clusters and assigns each STA in each cluster one channel on which to report feedback. With proper design of the cluster-based reporting, the AP loses no information about the channel conditions at the station side while reducing the feedback overhead. The simulation results show equal and, at times, better performance with a fraction of the overhead. We believe that this algorithm has great potential for the design of future dynamic channel selection algorithms with low overhead.
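The clustering-plus-assignment idea can be sketched as follows. Since the keywords mention DBSCAN, a minimal DBSCAN over STA positions is used here, with a round-robin channel assignment inside each cluster; all parameter values and names are illustrative assumptions, not the paper's implementation.

```python
import math

def dbscan(points, eps, min_pts):
    """Minimal DBSCAN over 2-D points: returns one cluster label per point
    (-1 marks noise). Neighbourhoods include the point itself."""
    labels = [None] * len(points)

    def neighbours(i):
        return [j for j in range(len(points))
                if math.dist(points[i], points[j]) <= eps]

    cluster = -1
    for i in range(len(points)):
        if labels[i] is not None:
            continue
        nbrs = neighbours(i)
        if len(nbrs) < min_pts:
            labels[i] = -1           # noise (may later become a border point)
            continue
        cluster += 1
        labels[i] = cluster
        queue = [j for j in nbrs if j != i]
        while queue:
            j = queue.pop()
            if labels[j] == -1:
                labels[j] = cluster  # border point reached from a core point
            if labels[j] is not None:
                continue
            labels[j] = cluster
            jn = neighbours(j)
            if len(jn) >= min_pts:   # j is itself a core point: keep expanding
                queue.extend(jn)
    return labels

def assign_report_channels(labels, channels):
    """Round-robin within each cluster: nearby STAs (with similar channel
    conditions) split the candidate channels between them, so each cluster
    covers every channel with a single report per STA."""
    counters = {}
    out = []
    for lab in labels:
        k = counters.get(lab, 0)
        out.append(channels[k % len(channels)])
        counters[lab] = k + 1
    return out
```

With two spatial clusters of three STAs and three candidate channels, each cluster ends up covering all three channels with one report apiece, which is the overhead reduction the mechanism relies on.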

Keywords: channel assignment, Wi-Fi networks, clustering, DBSCAN, overhead

Procedia PDF Downloads 85