Search results for: ICT deployment
138 Importance of Field Hospitals in Trauma Management: An Experience from Nepal Earthquake
Authors: Krishna Gopal Lageju
Abstract:
On 25 April 2015, a 7.6-magnitude earthquake struck the Gorkha district of Nepal, resulting in over 8,790 deaths and 22,300 injuries. In addition, almost one-third of the country’s healthcare services were disrupted. A total of 1,211 health facilities became non-operational: 446 were completely damaged and another 765 partially damaged. Nearly 84 percent (375 out of 446) of the completely damaged health facilities were in the 14 most affected districts. As a result, the ability of health facilities to respond to health care needs was severely affected. In addition, 18 health workers lost their lives and 75 were injured, which added further challenges to the delivery of health services. Thus, to address the immediate health needs in the most devastated areas, the Nepal Red Cross Society (NRCS), in coordination with the IFRC and the Government of Nepal, established 8 field hospitals with surgical capacity, to which around 492 international Emergency Response Unit (ERU) members were mobilized for a 3-month period. More than 54,000 patients were treated in the Red Cross-operated field hospitals. Trauma cases accounted for 9,180 (17%) of the total patients, of which 1,285 (14%) were major surgical cases. Most of the caseload, 44,830 (83%), were outpatients, while 9,180 patients received inpatient services. Similarly, 112 deliveries were performed in the field hospitals. The inpatient mortality rate remained at 1.5% (21 deaths), many of whom presented with critical injuries or illnesses. No outbreak was seen during the ERU operation. Deploying ERUs together with national health workers is very important for addressing the immediate health needs of the affected communities, and it eases the transition and handover of emergency services and equipment to local providers. 
Likewise, capacity building of local staff, such as on-the-job training in various clinical skills, is another important issue to address before phasing out such services. Keywords: trauma management, critical injuries, earthquake, health
Procedia PDF Downloads 241
137 Game-Theory-Based Downlink Spectrum Allocation in Two-Tier Networks
Authors: Yu Zhang, Ye Tian, Fang Ye, Yixuan Kang
Abstract:
The capacity of conventional cellular networks has reached its upper bound, a limitation that can be addressed by introducing low-cost, easy-to-deploy femtocells. Spectrum interference becomes more critical as value-added multimedia services grow in two-tier cellular networks. Spectrum allocation is one of the most effective interference mitigation techniques. This paper proposes a game-theory-based OFDMA downlink spectrum allocation scheme aimed at reducing co-channel interference in two-tier femtocell networks. The framework is formulated as a non-cooperative game, wherein the femto base stations are the players and the available frequency channels are the strategies. The scheme takes full account of competitive behavior and fairness among stations. In addition, the utility function essentially reflects interference from the standpoint of channels. This work focuses on co-channel interference and puts forward a negative-logarithm interference function of the distance weight ratio aimed at suppressing co-channel interference within the same network tier. This scenario is well suited to actual network deployment, and the system possesses high robustness. Under the proposed mechanism, interference exists only when players employ the same channel for data communication. The paper implements spectrum allocation in a distributed fashion. Numerical results show that the signal-to-interference-and-noise ratio can be noticeably improved by the spectrum allocation scheme and that users' downlink quality of service can be satisfied. Besides, the simulation results show that the average spectrum efficiency of the cellular network can be significantly improved. Keywords: femtocell networks, game theory, interference mitigation, spectrum allocation
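As a rough illustration of the kind of non-cooperative game the abstract describes, the sketch below runs best-response dynamics over femto base stations whose strategies are the available channels. The negative-logarithm interference function of the distance ratio, and all names and parameters, are illustrative assumptions, not the authors' actual formulation:

```python
import math
import random

def interference(d, d_max):
    # Assumed negative-log interference on the distance ratio:
    # closer co-channel stations contribute more interference.
    return -math.log(d / d_max)

def best_response_allocation(positions, n_channels, iters=50):
    """Distributed channel selection: each femto base station repeatedly
    switches to the channel minimizing its co-channel interference, until
    no player wants to deviate (a Nash equilibrium of the toy game)."""
    n = len(positions)
    d = [[math.dist(positions[i], positions[j]) for j in range(n)] for i in range(n)]
    d_max = max(max(row) for row in d) or 1.0
    channel = [random.randrange(n_channels) for _ in range(n)]
    for _ in range(iters):
        changed = False
        for i in range(n):
            # Cost of each candidate channel = sum of co-channel interference.
            costs = [sum(interference(d[i][j], d_max)
                         for j in range(n) if j != i and channel[j] == c)
                     for c in range(n_channels)]
            best = min(range(n_channels), key=lambda c: costs[c])
            if best != channel[i]:
                channel[i] = best
                changed = True
        if not changed:   # equilibrium reached
            break
    return channel
```

For four stations on a unit square with two channels, the dynamics settle into the checkerboard allocation where diagonal (most distant) pairs share a channel.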
Procedia PDF Downloads 156
136 Stressors Faced by Border Security Officers: The Singapore Experience
Authors: Jansen Ang, Andrew Neo, Dawn Chia
Abstract:
Border security is unlike mainstream policing in that officers are essentially in static deployment, working round the clock every day of the year looking for illegitimate entry of persons and goods. In Singapore, border security officers perform multiple functions to ensure the nation’s safety and security. They are responsible for safeguarding the borders of Singapore to prevent threats from entering the country. As the first line of defence in ensuring the nation’s border security, officers are entrusted with the responsibility of screening travellers inbound and outbound of Singapore daily. They examined 99 million arrivals and departures at the various checkpoints in 2014, a considerable volume compared to most immigration agencies. The officers’ work scope also includes cargo clearance and the protective and security functions of checkpoints. The officers work in very demanding environments, ranging from the smog at the land checkpoints to the harshness of the ports at the sea checkpoints. In addition, all immigration checkpoints are located at the boundaries, posing commuting challenges for officers. At the land checkpoints, festive seasons and school breaks are peak periods, given the surge of inbound and outbound travellers. Such work presents unique challenges in comparison to other law enforcement duties. This paper assesses the current stressors faced by officers of a border security agency through ground observations and a perceived stress survey, and offers recommendations for combating the stressors faced by border security officers. The findings from the field observations and surveys indicate organisational and operational stressors that are unique to border security, and the paper recommends interventions for managing these stressors. 
Understanding these stressors would better inform border security agencies on the interventions needed to enhance the resilience of border security officers. Keywords: border security, Singapore, stress, operations
Procedia PDF Downloads 325
135 Development of Mixed Matrix Membranes by Using NH₂-Functionalized UiO-66 and [APTMS][AC] Ionic Liquid for the Separation of CO₂
Authors: Hafiza Mamoona Khalid, Afshan Mujahid, Asif Ali, Asim Laeeq Khan, Mahmood Saleem, Rafael M. Santos
Abstract:
The ever-escalating CO₂ concentration in the atmosphere calls for accelerated development and deployment of carbon capture processes to reduce emissions. Mixed matrix membranes (MMMs), fabricated by incorporating highly selective inorganic fillers into a polymer matrix, have shown significant progress and the ability to enhance membrane performance for gas separation. In this research, an amine-based ionic liquid (IL), [APTMS][AC], was prepared; its amine moiety gives it high CO₂ affinity and solubility. The metal–organic framework (MOF) UiO-66, with a multidimensional crystalline structure, was used as a filler due to its appropriate porosity and tunable properties, and it was functionalized with NH₂. The MOFs were further modified with the IL to prepare UiO-66@IL and UiO-66-NH₂@IL, and MMMs incorporating each MOF were fabricated with the polymer Pebax-1657. All the prepared membranes and MOFs were characterized to predict their separation efficiency. Several characterization techniques, namely FTIR spectroscopy, XRD, and SEM, confirmed the successful synthesis of the UiO-66@IL and UiO-66-NH₂@IL composites as well as proper dispersion and excellent polymer–filler compatibility at filler loadings ranging from 0 to 30 wt.%. The separation performances were investigated, and the results showed that combining the RTIL with the highly crystalline structure and large surface area of UiO-66 enhanced the separation efficiency of the membrane. The CO₂ permeability of all fabricated membranes increased continuously with increasing filler concentration, and it was comparatively high for the UiO-66-NH₂ MMMs. 
Compared to simple UiO-66, the CO₂/CH₄ selectivity improved by 35%, 54%, and 60% for the UiO-66@IL, UiO-66-NH₂, and UiO-66-NH₂@IL MMMs, respectively, and the CO₂/N₂ selectivity by 28%, 36%, and 63%, respectively, with increasing filler loading in the MMMs. Keywords: gas separation, mixed matrix membranes, CO₂ sequestration, climate change, global warming
Procedia PDF Downloads 13
134 Comparative Analysis of Data Gathering Protocols with Multiple Mobile Elements for Wireless Sensor Network
Authors: Bhat Geetalaxmi Jairam, D. V. Ashoka
Abstract:
Wireless sensor networks are used in many applications to collect sensed data from different sources. Sensed data has to be delivered through the sensors’ wireless interface using multi-hop communication towards the sink. Data collection in wireless sensor networks consumes energy, and energy consumption is the major constraint in a WSN. Reducing energy consumption while increasing the amount of generated data is a great challenge. In this paper, we have implemented two data gathering protocols with multiple mobile sinks/elements to collect data from sensor nodes. The first is Energy-Efficient Data Gathering with Tour Length-Constrained Mobile Elements in Wireless Sensor Networks (EEDG), in which the mobile sinks use a vehicle routing protocol to collect data. The second is An Intelligent Agent-based Routing Structure for Mobile Sinks in WSNs (IAR), in which the mobile sinks use Prim’s algorithm to collect data. The authors have implemented the concepts common to both protocols, such as deployment of the mobile sinks, generation of the visiting schedule, and collection of data from the cluster members. The authors have compared the performance of both protocols using statistics based on performance parameters such as delay, packet drop, packet delivery ratio, energy available, and control overhead. The authors conclude that EEDG is more efficient than the IAR protocol, but with a few limitations, including unaddressed issues like redundancy removal, idle listening, and the mobile sink’s pause/wait state at the node. In future work, we plan to concentrate on these limitations to arrive at a new energy-efficient protocol that will help improve the lifetime of the WSN. Keywords: aggregation, consumption, data gathering, efficiency
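The abstract notes that IAR builds its routing structure with Prim's algorithm. As a generic illustration (not the authors' implementation), the sketch below grows a minimum spanning tree over sensor-node coordinates, which a mobile sink could traverse when visiting cluster members:

```python
import math

def prim_mst(coords):
    """Prim's algorithm over node coordinates with Euclidean edge weights.
    Returns the tree as a list of (parent, child) edges rooted at node 0."""
    n = len(coords)
    in_tree = [False] * n
    dist = [math.inf] * n    # cheapest connection cost to the growing tree
    parent = [-1] * n
    dist[0] = 0.0
    edges = []
    for _ in range(n):
        # Pick the cheapest node not yet in the tree.
        u = min((i for i in range(n) if not in_tree[i]), key=lambda i: dist[i])
        in_tree[u] = True
        if parent[u] >= 0:
            edges.append((parent[u], u))
        # Relax connection costs of the remaining nodes through u.
        for v in range(n):
            if not in_tree[v]:
                w = math.dist(coords[u], coords[v])
                if w < dist[v]:
                    dist[v] = w
                    parent[v] = u
    return edges
```

This naive version is O(n²), which is adequate for the node counts typical of a single cluster.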
Procedia PDF Downloads 497
133 Cognitive Science Based Scheduling in Grid Environment
Authors: N. D. Iswarya, M. A. Maluk Mohamed, N. Vijaya
Abstract:
A grid is an infrastructure that allows large volumes of distributed data from multiple locations to be brought together toward a common goal. Scheduling data-intensive applications becomes challenging as the data sets are very large. Only two solutions exist to tackle this challenging issue. First, the computation that requires huge data sets can be transferred to the data site. Second, the required data sets can be transferred to the computation site. In the former scenario, the computation cannot be transferred since the servers are storage/data servers with little or no computational capability. Hence, the second scenario is considered for further exploration. During scheduling, transferring huge data sets from one site to another requires substantial network bandwidth. To mitigate this issue, this work focuses on incorporating cognitive science in scheduling. Cognitive science is the study of the human brain and its related activities. Current research mainly focuses on incorporating cognitive science into various computational modeling techniques. In this work, the problem-solving approach of the human brain is studied and incorporated into data-intensive scheduling in grid environments. Here, a cognitive engine (CE) is designed and deployed at various grid sites. The intelligent agents present in the CE help in analyzing the request and creating the knowledge base. Depending on the link capacity, a decision is taken on whether to transfer the data sets or to partition them. The agents also predict the next request in order to serve the requesting site with data sets in advance, which reduces the data availability time and data transfer time. The replica catalog and metadata catalog created by the agents assist in the decision-making process. Keywords: data grid, grid workflow scheduling, cognitive artificial intelligence
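The transfer-versus-partition decision described above can be sketched with a simple rule. The threshold logic, units, and function name below are assumptions for illustration; the paper's cognitive engine presumably uses a richer, knowledge-base-driven decision:

```python
import math

def plan_data_movement(size_gb, link_gb_per_s, deadline_s):
    """Hypothetical decision rule: ship the data set whole if the link can
    deliver it within the deadline, otherwise partition it so that each
    piece individually fits within the deadline."""
    transfer_s = size_gb / link_gb_per_s
    if transfer_s <= deadline_s:
        return ("transfer", 1)
    # Number of partitions needed so each piece transfers within the deadline.
    return ("partition", math.ceil(transfer_s / deadline_s))
```

For example, a 100 GB data set over an effective 2 GB/s link with a 10 s deadline would be split into 5 partitions.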
Procedia PDF Downloads 394
132 Study on the Situation between France and the South China Sea from the Perspective of Balance of Power Theory
Authors: Zhenyi Chen
Abstract:
With the rise of China and the escalation of tension between China and the United States, European countries led by Great Britain, France, and Germany are paying increasing attention to the regional situation in the Asia-Pacific (now often called the "Indo-Pacific"). Among the contested areas, the South China Sea (SCS) is one of the main regions disputed by China, the United States, Southeast Asian countries, and some European countries. Western countries worry that the rise of China's military power will break the stability of the situation in the SCS and alter the balance of power among major powers; therefore, they have tried to balance China's rise through alliances. In France's Indo-Pacific strategy, France aims to build a regional order with the alliance of France, India, and Australia at its core, and it regularly carries out military exercises targeting the SCS with the United States, Japan, and Southeast Asian countries. For China, instability in the SCS could also threaten the security of the southeast coastal areas and Taiwan, affect China's peaceful development process, and pose a threat to China's territorial sovereignty. This paper aims to study the activities and motivation of France in the South China Sea and to place the situation in the SCS within the perspective of balance of power theory, focusing on China, America, and France. More specifically, this paper first briefly introduces balance of power theory, then describes France's new trends in recent years, followed by an analysis of the motivation behind France's increasing involvement in the SCS, and finally analyzes the situation in the SCS from the perspective of balance of power theory. It is argued that the great powers are carefully maintaining the balance of military power in the SCS and that this trend is highly likely to last in the medium and long term, particularly via military deployment and strategic alliances. Keywords: South China Sea, France, China, balance of power theory, Indo-Pacific
Procedia PDF Downloads 175
131 Reinforcement-Learning Based Handover Optimization for Cellular Unmanned Aerial Vehicles Connectivity
Authors: Mahmoud Almasri, Xavier Marjou, Fanny Parzysz
Abstract:
The demand for services provided by Unmanned Aerial Vehicles (UAVs) is increasing pervasively across several sectors, including public safety, economic, and delivery services. As the number of applications using UAVs grows rapidly, ever more powerful and power-efficient computing units with quality-of-service guarantees are necessary. Recently, cellular technology has drawn increasing attention as a means of ensuring reliable and flexible communications services for UAVs. In cellular networks, flying at high speed and altitude is subject to several key challenges, such as frequent handovers (HOs), high interference levels, and connectivity coverage holes. Unnecessary HOs may lead to "ping-pong" between the UAVs and the serving cells, resulting in a decrease in quality of service and increased energy consumption. In order to optimize the number of HOs, we develop in this paper a Q-learning-based algorithm. While existing works focus on adjusting the number of HOs in a static network topology, we take into account the impact of cell deployment in three different simulation scenarios (rural, semi-rural, and urban areas). We also consider the impact of the decision distance, at which the drone makes its switching decision. Our results show that a Q-learning-based algorithm significantly reduces the average number of HOs compared to a baseline case in which the drone always selects the cell with the highest received signal. Moreover, we also identify which hyper-parameters have the largest impact on the number of HOs in the three tested environments, i.e., rural, semi-rural, and urban. Keywords: drones connectivity, reinforcement learning, handovers optimization, decision distance
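A minimal sketch of the idea, assuming a toy setting rather than the paper's simulator: tabular Q-learning along a fixed flight path, where the reward is the next waypoint's signal minus a handover penalty, so the learned policy avoids ping-pong switches that a greedy strongest-signal rule would make. All parameter values and names are illustrative assumptions:

```python
import random

def train_ho_policy(signals, n_cells, episodes=500, alpha=0.5, gamma=0.9,
                    eps=0.1, ho_penalty=2.0):
    """Tabular Q-learning for handover decisions.
    State: (waypoint index, serving cell); action: cell to serve next.
    signals[t][c] is the received signal of cell c at waypoint t."""
    T = len(signals)
    Q = {}
    q = lambda s, a: Q.get((s, a), 0.0)
    for _ in range(episodes):
        cell = max(range(n_cells), key=lambda c: signals[0][c])
        for t in range(T - 1):
            s = (t, cell)
            a = (random.randrange(n_cells) if random.random() < eps
                 else max(range(n_cells), key=lambda c: q(s, c)))
            # Reward: signal at the next waypoint, minus a switching penalty.
            r = signals[t + 1][a] - (ho_penalty if a != cell else 0.0)
            nxt = max(q((t + 1, a), c) for c in range(n_cells)) if t + 1 < T - 1 else 0.0
            Q[(s, a)] = q(s, a) + alpha * (r + gamma * nxt - q(s, a))
            cell = a
    return Q

def rollout_handovers(Q, signals, n_cells):
    # Greedy rollout of the learned policy; returns the handover count.
    cell = max(range(n_cells), key=lambda c: signals[0][c])
    hos = 0
    for t in range(len(signals) - 1):
        a = max(range(n_cells), key=lambda c: Q.get(((t, cell), c), 0.0))
        hos += int(a != cell)
        cell = a
    return hos
```

With a signal trace where a neighbouring cell is briefly stronger by less than the penalty, the learned policy stays on the serving cell, while a strongest-signal baseline would switch away and back.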
Procedia PDF Downloads 108
130 The Social Psychology of Illegal Game Room Addiction in the Historic Chinatown District of Honolulu, Hawaii: Illegal Compulsive Gambling, Chinese-Polynesian Organized Crime Syndicates, Police Corruption, and Loan Sharking Rings
Authors: Gordon James Knowles
Abstract:
Historically, the Chinatown district in the Sandwich Islands has been plagued with the traditional vice crimes of illegal drugs, gambling, and prostitution since the early 1800s. However, a new form of psychologically addictive arcade-style table gambling machine has become the dominant source of illegal revenue in Honolulu, Hawaii. This study attempts to document the drive, desire, or will to play and wager on arcade-style video gaming and to understand the role of illegal game rooms in facilitating pathological gambling addiction. Indicators of police corruption by Chinese organized crime syndicates related to protection rackets, bribery, and pay-offs were revealed. Information fusion from a police science and sociological intelligence perspective indicates insurgent warfare is being waged on the streets of Honolulu by the People’s Republic of China. This state-sponsored communist terrorism in the Hawaiian Islands used “contactless” irregular warfare entailing: (1) the deployment of psychologically addictive gambling machines, (2) the distribution of the physically addictive drug fentanyl as a lethal chemical weapon, and (3) psychological warfare by circulating pro-China, anti-American propaganda newspapers targeted at the small island populace. Keywords: Chinese and Polynesian organized crime, China Daily newspaper, electronic arcade style table games, gaming technology addiction, illegal compulsive gambling, and police intelligence
Procedia PDF Downloads 74
129 Wake Effects of Wind Turbines and Their Impacts on Power Curve Measurements
Authors: Sajan Antony Mathew, Bhukya Ramdas
Abstract:
The impetus of wind energy deployment over the last few decades has seen potential sites being harvested very actively for wind farm development. Due to the scarce availability of highly potential sites, turbine locations are increasingly optimized, with minimum spacing between turbines adopted without compromising the optimization of energy yield. The optimization of the energy yield from a wind turbine is achieved by effective micrositing techniques. These time-tested techniques, applied from site to site under terrain conditions that meet the requirements of the international standard for power performance measurements of wind turbines, result in the positioning of wind turbines for optimized energy yields. The international standard for power curve measurements specifies rules of procedure and a methodology to evaluate the terrain, obstacles, and measurement sector. There are many challenges at sites in complying with the requirements for terrain, obstacles, and the measurement sector. Studies are being attempted to carry out these measurements within the scope of the international standard, as various other procedures specified in alternate standards and the integration of LIDAR into power curve measurements are still at a nascent stage. The paper strives to show that if the positioning of a wind turbine at a site is based on optimized output, then no wake effects are seen on the power curve of an adjacent wind turbine. The paper also demonstrates that a sector that is invalid for measurements could be used in the analysis, in deviation from the requirements of the international standard for power performance measurements. 
The paper therefore strives, firstly, to demonstrate that if a wind turbine is optimally positioned, no wake effects are seen and, secondly, that the measurement sector in such a case could include sectors which would otherwise have to be excluded under the requirements of the international standard for power performance measurements. Keywords: micrositing, optimization, power performance, wake effects
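The sector-validity assessment the abstract refers to can be sketched numerically. The formula below is the commonly cited disturbed-sector estimate associated with IEC 61400-12-1 (reproduced here from memory as an assumption; verify against the current edition of the standard before any real use):

```python
import math

def disturbed_sector_deg(rotor_diameter_m, separation_m):
    # Estimated disturbed (excluded) sector, in degrees, caused by a
    # neighbouring operating turbine at distance L with rotor diameter D:
    #   alpha = 1.3 * atan(2.5 * D/L + 0.15) + 10
    d_over_l = rotor_diameter_m / separation_m
    return 1.3 * math.degrees(math.atan(2.5 * d_over_l + 0.15)) + 10.0

def excluded_directions(neighbour_bearing_deg, rotor_diameter_m, separation_m):
    # Wind-direction interval (degrees) to exclude from the measurement
    # sector, centred on the bearing towards the neighbouring turbine.
    half = disturbed_sector_deg(rotor_diameter_m, separation_m) / 2.0
    return ((neighbour_bearing_deg - half) % 360.0,
            (neighbour_bearing_deg + half) % 360.0)
```

For an 80 m rotor at only two rotor diameters' separation, the estimate excludes a sector of roughly 80 degrees, which illustrates why closely spaced turbines make standard-compliant power curve measurement difficult.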
Procedia PDF Downloads 461
128 World Peace and Conflict Resolution: A Solution from a Buddhist Point of View
Authors: Samitharathana R. Wadigala
Abstract:
Peace will not be established until self-consciousness awakens in human beings. In this nuclear age, the establishment of a lasting peace on earth represents the primary condition for the preservation of human civilization and the survival of human beings. Perhaps nothing is so important and indispensable as the achievement and maintenance of peace in the modern world. Peace in today’s world implies much more than the mere absence of war and violence. In today’s interdependent world, the United Nations needs to be representative of the modern world and democratic in its functioning, because it came into existence to save the generations from the scourge of war and conflict. Buddhism is a religion of peaceful co-existence and a philosophy of enlightenment. Violence and conflict, from the perspective of the Buddhist theory of interdependent origination (Paṭiccasamuppāda), are, like everything else in the world, products of causes and conditions. Buddhism is fully compatible with a congenial and peaceful global order. The canonical literature, doctrines, and philosophy of Buddhism are well suited to inter-faith dialogue, harmony, and universal peace. Even today, Buddhism can resurrect universal brotherhood, peaceful co-existence, and harmonious surroundings in the comity of nations. With its increasing vitality in regions around the world, many people today turn to Buddhism for relief and guidance at a time when peace seems to be a deferred dream more than ever. From a Buddhist point of view, the roots of all unwholesome actions (conflict), i.e., greed, hatred, and delusion, are viewed as the root cause of all human conflicts. Conflict often emanates from attachment to material things: pleasures, property, territory, wealth, economic dominance, or political superiority. Buddhism has some particularly rich resources for deployment in dissolving conflict. 
Buddhism addresses the causes of conflict and the ways to resolve it in order to realize world peace. The world has enough to satisfy everybody’s needs, but not everybody’s greed. Keywords: Buddhism, conflict-violence, peace, self-consciousness
Procedia PDF Downloads 208
127 Developing an Automated Protocol for the Wristband Extraction Process Using Opentrons
Authors: Tei Kim, Brooklynn McNeil, Kathryn Dunn, Douglas I. Walker
Abstract:
To better characterize the relationship between complex chemical exposures and disease, our laboratory uses an approach that combines low-cost polydimethylsiloxane (silicone) wristband samplers, which absorb many of the chemicals we are exposed to, with untargeted high-resolution mass spectrometry (HRMS) to characterize thousands of chemicals at a time. In studies with human populations, these wristbands can provide an important measure of our environment; however, there is a need to apply this approach in large cohorts to study exposures associated with disease. To facilitate the use of silicone samplers in large-scale population studies, the goal of this research project was to establish automated sample preparation methods that improve the throughput, robustness, and scalability of analytical methods for silicone wristbands. Using the Opentrons OT-2 automated liquid handling platform, which provides a low-cost and open-source framework for automated pipetting, we created two separate workflows that translate the manual wristband preparation method into a fully automated protocol requiring only minor intervention by the operator. These protocols include a sequence generation step, which defines the locations of all plates and labware according to user-specified settings, and a transfer protocol that includes all necessary instrument parameters and instructions for automated solvent extraction of the wristband samplers. These protocols were written in Python and uploaded to GitHub for use by others in the research community. Results from this project show it is possible to establish automated, open-source methods for the preparation of silicone wristband samplers to support profiling of many environmental exposures. Ongoing studies include deployment in longitudinal cohort studies to investigate the relationship between personal chemical exposure and disease. Keywords: bioinformatics, automation, opentrons, research
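A minimal sketch of the sequence-generation idea described above. This is purely illustrative: the well ordering, field names, and defaults are assumptions, and an actual deployment would emit these steps through the Opentrons Python protocol API rather than as plain dictionaries:

```python
def generate_transfer_sequence(n_samples, solvent_ul, wells_per_plate=96):
    """Map each wristband sample to a destination plate and well, and emit
    the solvent-transfer steps an automated protocol would later execute."""
    rows = "ABCDEFGH"
    # Column-major well order on a 96-well plate: A1, B1, ..., H1, A2, ..., H12.
    wells = [f"{r}{c}" for c in range(1, 13) for r in rows]
    steps = []
    for i in range(n_samples):
        steps.append({
            "sample": i,
            "plate": i // wells_per_plate + 1,   # overflow onto extra plates
            "well": wells[i % wells_per_plate],
            "volume_ul": solvent_ul,
        })
    return steps
```

With 100 samples, the first 96 fill plate 1 and the remaining 4 wrap onto plate 2, so the same transfer logic scales to cohort-sized batches.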
Procedia PDF Downloads 115
126 Cosmic Muon Tomography at the Wylfa Reactor Site Using an Anti-Neutrino Detector
Authors: Ronald Collins, Jonathon Coleman, Joel Dasari, George Holt, Carl Metelko, Matthew Murdoch, Alexander Morgan, Yan-Jie Schnellbach, Robert Mills, Gareth Edwards, Alexander Roberts
Abstract:
The VIDARR prototype anti-neutrino detector was deployed at the Wylfa Magnox Power Plant between 2014 and 2016. It comprises extruded plastic scintillating bars measuring 4 cm × 1 cm × 152 cm and utilises wavelength-shifting fibres (WLS) and multi-pixel photon counters (MPPCs) to detect and quantify radiation. During deployment, it took cosmic muon data in accidental coincidence with the anti-neutrino measurements, with the power plant site buildings obscuring the muon sky. Cosmic muons have a significantly higher probability of being attenuated and/or absorbed by denser objects, so one-sided cosmic muon tomography was utilised to image the reactor site buildings. In order to achieve clear building outlines, a control data set was taken at the University of Liverpool from 2016 to 2018, with minimal occlusion of the cosmic muon flux by dense objects. By taking the ratio of these two data sets and using GEANT4 simulations, it is possible to perform a one-sided cosmic muon tomography analysis. This analysis can be used to discern specific buildings, building heights, and features at the Wylfa reactor site, including the reactor core and its shielding, using ∼3 hours’ worth of cosmic-ray detector live time. This result demonstrates the feasibility of using cosmic muon analysis to determine a segmented detector’s location with respect to surrounding buildings, assisted by aerial photography or satellite imagery. Keywords: anti-neutrino, GEANT4, muon, tomography, occlusion
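The site-to-control ratio at the heart of the analysis can be sketched as follows. The binning scheme, threshold, and names are illustrative assumptions; the actual analysis additionally folds in GEANT4 simulations of the detector response:

```python
def occlusion_map(site_counts, control_counts, threshold=0.7):
    """Per-bin muon transmission: ratio of counts seen at the deployment
    site to counts from the unobstructed control run, keyed by
    (azimuth, zenith) bin. Bins with transmission below the threshold are
    flagged as occluded by dense structures (e.g. reactor shielding)."""
    ratios, occluded = {}, []
    for direction, n_ctrl in control_counts.items():
        if n_ctrl == 0:
            continue  # no reference flux in this bin; skip it
        r = site_counts.get(direction, 0) / n_ctrl
        ratios[direction] = r
        if r < threshold:
            occluded.append(direction)
    return ratios, occluded
```

Mapping the flagged bins back to sight lines from the detector is what produces the one-sided image of the surrounding buildings.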
Procedia PDF Downloads 186
125 Deployment of Information and Communication Technology (ICT) to Reduce Occurrences of Terrorism in Nigeria
Authors: Okike Benjamin
Abstract:
Terrorism is the use of violence and threats to intimidate or coerce a person, group, society, or even a government, especially for political purposes. Terrorism may be a way of resisting government by a group that feels marginalized, or a way of expressing displeasure over the activities of government. On 26th December 2009, the US listed Nigeria as a terrorist nation. Recently, the occurrences of terrorism in Nigeria have increased considerably. In Jos, Plateau State, Nigeria, a bomb blast claimed many lives on the eve of Christmas 2010. Similarly, there was another bomb blast in the Mugadishi (Sani Abacha) Barracks Mammy market on the eve of the 2011 New Year. For some time now, it is no longer news that bombs explode in some northern parts of Nigeria. About 25 years ago, stopping terrorism in America relied on old-fashioned tools such as strict physical security at vulnerable places; intelligence gathering by government agents or individuals; vigilance on the part of all citizens; and a sense of community in which citizens do what they can to protect each other. Just as technology has improved the way many other things are done, so also can this powerful new weapon called computer technology be used to detect and prevent terrorism, not only in Nigeria but all over the world. This paper examines the possible causes and effects of bomb blasts, which are acts of terrorism, and suggests ways in which explosive detection devices (EDDs) and computer software technology could be deployed to reduce the occurrences of terrorism in Nigeria. This became necessary with the abduction of over 200 schoolgirls in Chibok, Borno State, from their hostel by members of the Boko Haram sect on 14th April 2014. 
Presently, Barack Obama and other world leaders have sent some of their military personnel to help rescue those innocent schoolgirls, whose only offence is seeking to acquire the western education that the sect strongly believes is forbidden. Keywords: terrorism, bomb blast, computer technology, explosive detection devices, Nigeria
Procedia PDF Downloads 261
124 A Wearable Device to Overcome Post-Stroke Learned Non-Use; The Rehabilitation Gaming System for Wearables: Methodology, Design and Usability
Authors: Javier De La Torre Costa, Belen Rubio Ballester, Martina Maier, Paul F. M. J. Verschure
Abstract:
After a stroke, a great number of patients experience persistent motor impairments such as hemiparesis, or weakness on one entire side of the body. As a result, the lack of use of the paretic limb may be one of the main contributors to functional loss after clinical discharge. We aim to reverse this cycle by promoting the use of the paretic limb during activities of daily living (ADLs). To do so, we describe the key components of a system composed of a wearable bracelet (i.e., a smartwatch) and a mobile phone, designed to bring a set of neurorehabilitation principles that promote the acquisition, retention, and generalization of skills into the patient's home. A fundamental question is whether the loss in motor function derived from learned non-use may emerge as a consequence of decision-making processes for motor optimization. Our system is based on well-established rehabilitation strategies that aim to reverse this behaviour by increasing the reward associated with action execution as well as implicitly reducing the expected cost associated with use of the paretic limb, following the notion of reinforcement-induced movement therapy (RIMT). Here we validate an accelerometer-based measure of arm use and its capacity to discriminate between different activities that require increasing movement of the arm. We also show how the system can act as a personalized assistant by providing specific goals and adjusting them depending on the patients' performance. The usability and acceptance of the device as a rehabilitation tool are tested using a battery of self-reported and objective measurements obtained from acute/subacute patients and healthy controls. We believe that an extension of these technologies will allow for the deployment of unsupervised rehabilitation paradigms during and beyond the hospitalization period. Keywords: stroke, wearables, learned non-use, hemiparesis, ADLs
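An accelerometer-based arm-use measure of the kind validated above can be sketched very simply. The gravity-subtraction approach, threshold value, and ratio definition below are illustrative assumptions, not the authors' validated metric:

```python
import math

def arm_use_ratio(paretic_acc, nonparetic_acc, g=9.81, thresh=0.5):
    """Toy arm-use proxy from two wrist-worn accelerometers.
    Counts samples whose gravity-subtracted acceleration magnitude exceeds
    a movement threshold, then returns the paretic / non-paretic ratio
    (a value near 1.0 suggests symmetric arm use)."""
    def active_samples(samples):
        return sum(1 for (x, y, z) in samples
                   if abs(math.sqrt(x * x + y * y + z * z) - g) > thresh)
    p = active_samples(paretic_acc)
    n = active_samples(nonparetic_acc)
    return p / n if n else float("inf")
```

Tracking this ratio over the day is one plausible way such a system could set and adapt the personalized arm-use goals mentioned in the abstract.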
Procedia PDF Downloads 217
123 Low Overhead Dynamic Channel Selection with Cluster-Based Spatial-Temporal Station Reporting in Wireless Networks
Authors: Zeyad Abdelmageid, Xianbin Wang
Abstract:
Choosing the operating channel for a WLAN access point (AP) has traditionally been a static channel assignment made by the user when the AP is deployed, an approach that fails to cope with the subsequently changing channel conditions at the station (STA) side. The dramatically growing number of Wi-Fi APs and stations operating in the unlicensed band has led to dynamic, distributed, and often severe interference. This highlights the urgent need for the AP to dynamically select the best overall channel of operation for the basic service set (BSS) by considering the distributed and changing channel conditions at all stations. Consequently, dynamic channel selection algorithms that consider feedback from the station side have been developed. Despite the significant performance improvement, existing channel selection algorithms suffer from very high feedback overhead. Feedback latency from the STAs, due to the high overhead, can cause the eventually selected channel to no longer be optimal for operation, given the dynamic sharing nature of the unlicensed band. This has inspired us to develop our own dynamic channel selection algorithm with reduced overhead through the proposed low-overhead, cluster-based station reporting mechanism. The main idea behind cluster-based station reporting is the observation that STAs which are very close to each other tend to have very similar channel conditions. Instead of requesting each STA to report on every candidate channel, which causes high overhead, the AP divides the STAs into clusters and then assigns each STA in each cluster one channel to report feedback on. With a proper design of the cluster-based reporting, the AP loses no information about the channel conditions at the station side while reducing the feedback overhead. The simulation results show equal performance and, at times, better performance with a fraction of the overhead. 
We believe that this algorithm has great potential for the design of future dynamic channel selection algorithms with low overhead. Keywords: channel assignment, Wi-Fi networks, clustering, DBSCAN, overhead
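The cluster-based reporting idea can be sketched in a few lines. The following is a minimal illustration, not the authors' implementation: a hand-rolled DBSCAN (named in the keywords) groups stations by assumed (x, y) positions, and candidate channels are then assigned round-robin within each cluster, so every channel is monitored by some nearby STA. Coordinates, eps, and channel numbers are all hypothetical.

```python
import math

def dbscan(points, eps, min_pts):
    """Minimal DBSCAN: returns one cluster label per point (-1 = noise)."""
    labels = [None] * len(points)
    cluster = -1

    def neighbours(i):
        return [j for j in range(len(points))
                if math.dist(points[i], points[j]) <= eps]

    for i in range(len(points)):
        if labels[i] is not None:
            continue
        nbrs = neighbours(i)
        if len(nbrs) < min_pts:
            labels[i] = -1          # provisionally noise; may become a border point
            continue
        cluster += 1
        labels[i] = cluster
        queue = list(nbrs)
        while queue:
            j = queue.pop()
            if labels[j] == -1:
                labels[j] = cluster  # border point: relabel, do not expand
            if labels[j] is not None:
                continue
            labels[j] = cluster
            j_nbrs = neighbours(j)
            if len(j_nbrs) >= min_pts:
                queue.extend(j_nbrs)  # core point: expand the cluster
    return labels

def assign_reporting_channels(labels, channels):
    """Round-robin within each cluster: each STA reports on one candidate channel."""
    counters, assignment = {}, {}
    for sta, c in enumerate(labels):
        k = counters.get(c, 0)
        assignment[sta] = channels[k % len(channels)]
        counters[c] = k + 1
    return assignment

# Two tight groups of stations (coordinates in metres) and three candidate channels.
stas = [(0, 0), (1, 0), (0, 1), (1, 1), (50, 50), (51, 50), (50, 51)]
labels = dbscan(stas, eps=2.0, min_pts=2)
plan = assign_reporting_channels(labels, channels=[1, 6, 11])
```

With these toy positions the two groups form two clusters, and within each cluster the three candidate channels are all covered, which is the property the AP relies on to lose no channel information.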
Procedia PDF Downloads 118
122 Ensuring Quality in DevOps Culture
Authors: Sagar Jitendra Mahendrakar
Abstract:
Integrating quality assurance (QA) practices into DevOps culture has become increasingly important in modern software development environments. Collaboration, automation, and continuous feedback characterize the seamless integration of development and operations teams in DevOps, enabling rapid and reliable software delivery. In this context, quality assurance plays a key role in ensuring that software products meet the highest standards of quality, performance, and reliability throughout the development life cycle. This brief explores key principles, challenges, and best practices related to quality assurance in a DevOps culture. It emphasizes shifting quality left in the development process, with quality control integrated into every step of the DevOps pipeline. Automation is the cornerstone of DevOps quality assurance, enabling continuous testing, integration and deployment and providing rapid feedback for early problem identification and resolution. The abstract also addresses the cultural and organizational challenges of implementing QA within DevOps, emphasizing the need to foster collaboration, break down silos, and nurture a culture of continuous improvement, as well as the importance of toolchain integration and skills development to support effective QA practices in DevOps environments. Overall, this work sits at the intersection of QA and DevOps culture, providing insights into how organizations can use DevOps principles to improve software quality, accelerate delivery, and meet the changing demands of today's dynamic software landscape. Keywords: quality engineer, devops, automation, tool
Procedia PDF Downloads 58
121 Factors Affecting M-Government Deployment and Adoption
Authors: Saif Obaid Alkaabi, Nabil Ayad
Abstract:
Governments constantly seek to offer faster, more secure, efficient and effective services for their citizens. Recent changes and developments in communication services and technologies, mainly due to the Internet, have led to immense improvements in the way governments of advanced countries carry out their internal operations. Therefore, advances in e-government services have been broadly adopted and used in various developed countries, as well as being adapted to developing countries. The implementation of these advances depends on the utilization of the most innovative structures of data techniques, mainly in web-dependent applications, to enhance the main functions of governments. These functions, in turn, have spread to mobile and wireless techniques, generating a new advanced direction called m-government. This paper discusses a selection of available m-government applications and several business modules and frameworks in various fields. Practically, m-government models, techniques and methods have become the improved version of e-government. M-government offers the potential for applications which will work better, providing citizens with services utilizing mobile communication and data models incorporating several government entities. Developing countries can benefit greatly from this innovation, both because a large percentage of their population is young and can adapt to new technology and because mobile computing devices are more affordable. The use of models of mobile transactions encourages effective participation through the use of mobile portals by businesses, various organizations, and individual citizens. Although the application of m-government has great potential, it does have major limitations.
The limitations include: the implementation of wireless networks and relative communications, the encouragement of mobile diffusion, the administration of complicated tasks concerning the protection of security (including the ability to offer privacy for information), and the management of the legal issues concerning mobile applications and the utilization of services. Keywords: e-government, m-government, system dependability, system security, trust
Procedia PDF Downloads 381
120 Neuropsychological Disabilities in Executive Functions and Visuospatial Skills of Juvenile Offenders in a Half-Open Program in Santiago De Chile
Authors: Gabriel Sepulveda Navarro
Abstract:
Traditional interventions for young offenders are necessary but not sufficient to tackle the multiple causes of juvenile crime. For instance, interventions offered to young offenders are often verbally mediated and dialogue based, requiring important metacognitive abilities as well as abstract thinking, and assuming average performance in a wide variety of skills. It seems necessary to assess a broader set of abilities and functions in order to increase the efficiency of interventions while addressing offending. To examine these assumptions, the Stroop Test and the Rey-Osterrieth Complex Figure Test were applied to juvenile offenders tried and sentenced for violent crimes in Santiago de Chile. A random sample was drawn from La Cisterna Half-Open Program, consisting of 50 young males between 18 and 24 years old, residing in different districts of Santiago de Chile. The analysis of results suggests a disproportionately elevated incidence of impairments in executive functions and visuospatial skills. Over 40% of the sample shows significantly low performance in both assessments, exceeding four times the prevalence rates among young people in the general population. While executive functions entail working memory (being able to retain information and use it in some way), cognitive flexibility (being able to think about something in more than one way) and inhibitory control (being able to exercise self-control, ignore distractions and delay immediate gratification), visuospatial skills permit one to orient and organize planned conduct. All of these abilities are fundamental to avoiding violent behaviour and abiding by social rules.
Understanding the relevance of neurodevelopmental impairments in the onset of violent and criminal behaviour, as well as in recidivism, may eventually guide the deployment of a more comprehensive assessment and treatment for juvenile offenders. Keywords: executive functions, half-open program, juvenile offenders, neurodisabilities, visuospatial skills
Procedia PDF Downloads 148
119 Transformer-Driven Multi-Category Classification for an Automated Academic Strand Recommendation Framework
Authors: Ma Cecilia Siva
Abstract:
This study introduces a Bidirectional Encoder Representations from Transformers (BERT)-based machine learning model aimed at improving educational counseling by automating the process of recommending academic strands for students. The framework is designed to streamline and enhance the strand selection process by analyzing students' profiles and suggesting suitable academic paths based on their interests, strengths, and goals. Data was gathered from a sample of 200 grade 10 students, which included personal essays and survey responses relevant to strand alignment. After thorough preprocessing, the text data was tokenized, label-encoded, and input into a fine-tuned BERT model set up for multi-label classification. The model was optimized for balanced accuracy and computational efficiency, featuring a multi-category classification layer with sigmoid activation for independent strand predictions. Performance metrics showed an F1 score of 88%, indicating a well-balanced model with precision at 80% and recall at 100%, demonstrating its effectiveness in providing reliable recommendations while reducing irrelevant strand suggestions. To facilitate practical use, the final deployment phase created a recommendation framework that processes new student data through the trained model and generates personalized academic strand suggestions. This automated recommendation system presents a scalable solution for academic guidance, potentially enhancing student satisfaction and alignment with educational objectives. The study's findings indicate that expanding the data set, integrating additional features, and refining the model iteratively could improve the framework's accuracy and broaden its applicability in various educational contexts. Keywords: tokenized, sigmoid activation, transformer, multi category classification
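The multi-label decision step described above (a sigmoid activation producing independent per-strand outputs) can be illustrated without the BERT backbone. In this sketch the strand labels and the logits are hypothetical; in the actual framework the logits would come from the fine-tuned BERT classification layer.

```python
import math

# Hypothetical strand labels; the study does not list the actual strands.
STRANDS = ["STEM", "ABM", "HUMSS", "GAS"]

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def predict_strands(logits, threshold=0.5):
    """Independent per-strand decisions, as in a sigmoid multi-label head.

    Unlike softmax, each strand's probability is computed separately, so a
    student can be recommended zero, one, or several strands.
    """
    probs = [sigmoid(z) for z in logits]
    recommended = [s for s, p in zip(STRANDS, probs) if p >= threshold]
    return recommended, probs

# One student's made-up raw scores from the classification layer.
recommended, probs = predict_strands([2.1, -0.7, 0.3, -1.5])
```

The 0.5 threshold is the usual default; tuning it per strand is one way to trade precision against recall, as the reported 80%/100% split suggests the authors did implicitly.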
Procedia PDF Downloads 8
118 Safeguarding the Construction Industry: Interrogating and Mitigating Emerging Risks from AI in Construction
Authors: Abdelrhman Elagez, Rolla Monib
Abstract:
This empirical study investigates the observed risks associated with adopting Artificial Intelligence (AI) technologies in the construction industry and proposes potential mitigation strategies. While AI has transformed several industries, the construction industry is slowly adopting advanced technologies like AI, introducing new risks that lack critical analysis in the current literature. A comprehensive literature review identified a research gap, highlighting the lack of critical analysis of risks and the need for a framework to measure and mitigate the risks of AI implementation in the construction industry. Consequently, an online survey was conducted with 24 project managers and construction professionals, possessing experience ranging from 1 to 30 years (with an average of 6.38 years), to gather industry perspectives and concerns relating to AI integration. The survey results yielded several significant findings. Firstly, respondents exhibited a moderate level of familiarity (66.67%) with AI technologies, while the industry's readiness for AI deployment and current usage rates remained low at 2.72 out of 5. Secondly, the top-ranked barriers to AI adoption were identified as lack of awareness, insufficient knowledge and skills, data quality concerns, high implementation costs, absence of prior case studies, and the uncertainty of outcomes. Thirdly, the most significant risks associated with AI use in construction were perceived to be a lack of human control (decision-making), accountability, algorithm bias, data security/privacy, and lack of legislation and regulations. Additionally, the participants acknowledged the value of factors such as education, training, organizational support, and communication in facilitating AI integration within the industry. 
These findings emphasize the necessity for tailored risk assessment frameworks, guidelines, and governance principles to address the identified risks and promote the responsible adoption of AI technologies in the construction sector. Keywords: risk management, construction, artificial intelligence, technology
Procedia PDF Downloads 98
117 Genetic and Virulence Diversity among Alternaria carthami Isolates of India
Authors: Garima Anand, Rupam Kapoor
Abstract:
Alternaria leaf spot caused by Alternaria carthami is one of the most devastating diseases of safflower. It has resulted in huge losses in crop production and cultivation, leading to India falling from its rank as the leading producer of safflower in the world. Understanding the diversity of any pathogen is essential for its management and for the development of disease control strategies. The diversity of A. carthami was therefore analysed along biochemical, pathogenicity, and genetic lines, the latter using ISSR markers. Collections and isolations of 95 isolates of A. carthami were made from the major safflower-producing states of India. Virulence was analysed to evaluate the pathogenic potential of these isolates. The isolates from Bijapur and Dharwad districts (Karnataka), and Parbhani and Solapur districts (Maharashtra) were found to be highly virulent. The virulence assays showed low virulence levels (42%) for the largest part of the population. Biochemical characterization to assess the aggressiveness of these isolates was done by estimating the activity of cell wall degrading enzymes, where isolates from the districts of Dharwad and Bijapur in Karnataka and the districts of Parbhani and Latur in Maharashtra were found to be most aggressive. Genetic diversity among isolates of A. carthami was determined using eighteen ISSR markers. Distance analysis using the neighbour-joining method and PCoA analysis of the ISSR profiles divided the isolates into three sub-populations. The most virulent isolates clustered in one group in the dendrogram. The study provided no evidence for geographical clustering, indicating that isolates are randomly spread across the states and signifying the high potential of the fungus to adapt to diverse regions. The study can, therefore, aid in the breeding and deployment of A. carthami-resistant safflower varieties and in the management of Alternaria leaf spot disease. Keywords: alternaria leaf spot, genetic diversity, pathogenic potential, virulence
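Distance analysis of ISSR profiles, as used above, typically starts from a matrix of pairwise distances between binary band-presence profiles; the neighbour-joining tree or PCoA is then built on that matrix. A minimal sketch follows, with hypothetical isolate names and bands; Jaccard distance is one common choice for dominant markers such as ISSR, though the study does not state which coefficient was used.

```python
def jaccard_distance(a, b):
    """Distance between two binary band profiles (1 = band present)."""
    both = sum(1 for x, y in zip(a, b) if x == 1 and y == 1)
    either = sum(1 for x, y in zip(a, b) if x == 1 or y == 1)
    return 1.0 if either == 0 else 1.0 - both / either

# Hypothetical presence/absence profiles for three isolates over six ISSR bands.
profiles = {
    "Dharwad-1": [1, 1, 0, 1, 0, 1],
    "Bijapur-3": [1, 1, 0, 1, 1, 1],
    "Latur-7":   [0, 1, 1, 0, 1, 0],
}
names = list(profiles)
# Full pairwise distance matrix, the input to neighbour-joining or PCoA.
dist = {(i, j): jaccard_distance(profiles[i], profiles[j])
        for i in names for j in names}
```

In this toy matrix the two Karnataka-style profiles are close (distance 0.2) while the third isolate is distant, which is the kind of structure that produces the sub-populations seen in the dendrogram.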
Procedia PDF Downloads 254
116 Explanatory Variables for Crash Injury Risk Analysis
Authors: Guilhermina Torrao
Abstract:
An extensive number of studies have been conducted to determine the factors which influence crash injury risk (CIR); however, uncertainties inherent to selected variables have been neglected. A review of existing literature is required to not only obtain an overview of the variables and measures but also ascertain the implications when comparing studies without a systematic view of variable taxonomy. Therefore, the aim of this literature review is to examine and report on peer-reviewed studies in the field of crash analysis and to understand the implications of broad variations in variable selection in CIR analysis. The objective of this study is to demonstrate the variance in variable selection and classification when modeling injury risk involving occupants of light vehicles by presenting an analytical review of the literature. Based on data collected from 64 journal publications reported over the past 21 years, the analytical review discusses the variables selected by each study across an organized list of predictors for CIR analysis and provides a better understanding of the contribution of accident and vehicle factors to injuries acquired by occupants of light vehicles. A cross-comparison analysis demonstrates that almost half the studies (48%) did not consider vehicle design specifications (e.g., vehicle weight), whereas, for those that did, the vehicle age/model year was the most selected explanatory variable used by 41% of the literature studies. For those studies that included speed risk factor in their analyses, the majority (64%) used the legal speed limit data as a ‘proxy’ of vehicle speed at the moment of a crash, imposing limitations for CIR analysis and modeling. Despite the proven efficiency of airbags in minimizing injury impact following a crash, only 22% of studies included airbag deployment data. 
A major contribution of this study is to highlight the uncertainty linked to explanatory variable selection and to identify opportunities for improvement in future studies in the field of road injuries. Keywords: crash, exploratory, injury, risk, variables, vehicle
Procedia PDF Downloads 135
115 Valorisation of Mango Seed: Response Surface Methodology Based Optimization of Starch Extraction from Mango Seeds
Authors: Tamrat Tesfaye, Bruce Sithole
Abstract:
Box-Behnken response surface methodology was used to determine the optimum processing conditions that give maximum extraction yield and whiteness index from mango seed. The steeping time ranged from 2 to 12 hours, and slurrying of the steeped seed in sodium metabisulphite solution (0.1 to 0.5 w/v) was carried out. Experiments were designed according to a Box-Behnken design with these three factors, and a total of 15 experimental runs were analyzed. At the linear level, the concentration of sodium metabisulphite had a significant positive influence on percentage yield and whiteness index at p<0.05. At the quadratic level, sodium metabisulphite concentration and sodium metabisulphite concentration² had a significant negative influence on starch yield; sodium metabisulphite concentration and steeping time × temperature had a significant (p<0.05) positive influence on whiteness index. The adjusted R² above 0.8 for starch yield (0.906465) and whiteness index (0.909268) showed a good fit of the model to the experimental data. With maximum starch yield (66.428%) and whiteness index (85%) set as goals for optimization, the optimum sodium metabisulphite concentration, steeping time, and temperature for starch isolation were 0.255 w/v, 2 h, and 50 °C respectively, with a desirability of 0.91939. The experimentally determined value of each response under the optimal conditions was statistically in accordance with the predicted levels at p<0.05. Mango seeds are by-products obtained during mango processing and pose a disposal problem if not handled properly. The substitution of food-based sizing agents with mango seed starch can contribute to pertinent resource deployment for value-added product manufacturing and waste utilization, which might play a significant role in food security in Ethiopia. Keywords: mango, synthetic sizing agent, starch, extraction, textile, sizing
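The combined desirability score reported above (0.91939) comes from Derringer-style desirability functions: each response is rescaled to [0, 1] against an acceptable range and a target, and the overall score is the geometric mean. This sketch shows the mechanics for the two larger-is-better responses; the acceptable/target ranges here are assumptions for illustration, so the resulting score differs from the paper's.

```python
def desirability(y, low, target):
    """Larger-is-better Derringer desirability, clipped to [0, 1]."""
    if y <= low:
        return 0.0
    if y >= target:
        return 1.0
    return (y - low) / (target - low)

def overall(ds):
    """Geometric mean of individual desirabilities (zero if any goal fails)."""
    prod = 1.0
    for d in ds:
        prod *= d
    return prod ** (1.0 / len(ds))

# Assumed acceptable/target ranges for the two responses (not from the paper).
d_yield = desirability(66.428, low=40.0, target=70.0)  # starch yield, %
d_white = desirability(85.0, low=70.0, target=90.0)    # whiteness index
D = overall([d_yield, d_white])
```

In a full optimization these functions would be evaluated over the fitted quadratic model across the factor space (concentration, time, temperature), and the settings maximizing D would be reported, which is how the 0.255 w/v / 2 h / 50 °C optimum arises.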
Procedia PDF Downloads 231
114 The LNG Paradox: The Role of Gas in the Energy Transition
Authors: Ira Joseph
Abstract:
The LNG paradox addresses the issue of how the most expensive form of gas supply, LNG, will grow in the end-user market where demand is most competitive: power generation. Here, LNG demand growth is under siege from two entirely different directions. At one end is price; it will be extremely difficult for gas to replace coal in Asia due to the low price of coal and the age of the generation plants. Asia's coal fleet, on average, is less than two decades old and will need significant financial incentives to retire before the end of its stated lifespan. While gas would cut emissions in half relative to coal, it would also more than double the price of the fuel source for power generation, which puts it in a precarious position. In most countries in Asia other than China, this cost increase, particularly from imports, is simply not realistic when it is also necessary to focus on economic growth and social welfare. At the other end, renewables are growing at an exponential rate for three reasons: prices are dropping; policy incentives are driving deployment; and China is forcing renewables infrastructure into the market to take a political seat at the global energy table with Saudi Arabia, the US, and Russia. Moreover, more renewables will lower import growth of oil and gas in China, if not end it altogether. Renewables are the predator at the gate of gas demand in power generation, and with every year that passes, renewables cut into demand growth projections for gas, in particular the type of gas that is most expensive: LNG. Gas does have a role in the future, particularly within a domestic market. Once it crosses borders in the form of LNG or even pipeline gas, it quickly becomes a premium fuel and must be marketed and used this way.
Our research shows that gas will be able to compete with batteries as an intermittency and storage tool and does offer a way to harmonize with renewables as part of the energy transition. As a baseload fuel, however, the role of gas will be limited by cost once it needs to cross a border. Gas converted into blue or green hydrogen or ammonia is also an option for storage, depending on the location. While this role is much reduced from the primary baseload role to which gas once aspired, it still offers a credible option for decades to come. Keywords: natural gas, LNG, demand, price, intermittency, storage, renewables
Procedia PDF Downloads 61
113 Monitoring Deforestation Using Remote Sensing And GIS
Authors: Tejaswi Agarwal, Amritansh Agarwal
Abstract:
The forest ecosystem plays a very important role in the global carbon cycle. It stores about 80% of all above-ground and 40% of all below-ground terrestrial organic carbon. There is much interest in the extent of tropical forests and their rates of deforestation for two reasons: greenhouse gas contributions and the profoundly negative impact on biodiversity. Deforestation has many ecological, social and economic consequences, one of which is the loss of biological diversity. The rapid deployment of remote sensing (RS) satellites and the development of RS analysis techniques over the past three decades have provided a reliable, effective, and practical way to characterize terrestrial ecosystem properties. Global estimates of tropical deforestation vary widely, ranging from 50,000 to 170,000 km²/yr; recent FAO tropical deforestation estimates for 1990–1995 cite 116,756 km²/yr globally. Remote sensing can prove to be a very useful tool in the monitoring of forests and associated deforestation to a sufficient level of accuracy without the need to physically survey the forest areas, as many of them are physically inaccessible. The methodology for the assessment of forest cover using digital image processing (ERDAS) has been followed. The satellite data for the study were procured from the Indian Institute of Remote Sensing (IIRS), Dehradun, in digital format. While procuring the satellite data, care was taken to ensure that the data were cloud free and did not belong to the dry and leafless season. The Normalized Difference Vegetation Index (NDVI) has been used as a numerical indicator of the reduction in ground biomass: NDVI = (NIR − Red) / (NIR + Red). After calculating the NDVI variations and the associated mean, we analysed the change in ground biomass. Through this paper, we have tried to indicate the rate of deforestation over a given period of time by comparing the forest cover at different time intervals.
With the help of remote sensing and GIS techniques, it is clearly shown that the total forest cover is continuously degrading and transforming into various land use/land cover categories. Keywords: remote sensing, deforestation, supervised classification, NDVI, change detection
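The NDVI formula quoted above is computed per pixel and then summarized, e.g. as a scene mean. A minimal sketch, with hypothetical reflectance values standing in for the NIR and Red bands:

```python
def ndvi(nir, red):
    """NDVI = (NIR - Red) / (NIR + Red); the undefined 0/0 case maps to 0."""
    return 0.0 if nir + red == 0 else (nir - red) / (nir + red)

# Hypothetical reflectance pairs (NIR, Red) for a handful of pixels:
# dense vegetation, vegetation, bare soil, water-like.
pixels = [(0.50, 0.10), (0.45, 0.12), (0.20, 0.18), (0.05, 0.04)]
values = [ndvi(n, r) for n, r in pixels]
mean_ndvi = sum(values) / len(values)
```

NDVI is bounded in [−1, 1] by construction; healthy vegetation typically sits well above bare soil, which is why a falling mean NDVI between dates is read as a loss of ground biomass.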
Procedia PDF Downloads 1202
112 NDVI as a Measure of Change in Forest Biomass
Authors: Amritansh Agarwal, Tejaswi Agarwal
Abstract:
The forest ecosystem plays a very important role in the global carbon cycle. It stores about 80% of all above-ground and 40% of all below-ground terrestrial organic carbon. There is much interest in the extent of tropical forests and their rates of deforestation for two reasons: greenhouse gas contributions and the profoundly negative impact on biodiversity. Deforestation has many ecological, social and economic consequences, one of which is the loss of biological diversity. The rapid deployment of remote sensing (RS) satellites and the development of RS analysis techniques over the past three decades have provided a reliable, effective, and practical way to characterize terrestrial ecosystem properties. Global estimates of tropical deforestation vary widely, ranging from 50,000 to 170,000 km²/yr; recent FAO tropical deforestation estimates for 1990–1995 cite 116,756 km²/yr globally. Remote sensing can prove to be a very useful tool in the monitoring of forests and associated deforestation to a sufficient level of accuracy without the need to physically survey the forest areas, as many of them are physically inaccessible. The methodology for the assessment of forest cover using digital image processing (ERDAS) has been followed. The satellite data for the study were procured from the USGS website in digital format. While procuring the satellite data, care was taken to ensure that the data were cloud and aerosol free by making use of the FLAASH atmospheric correction technique. The Normalized Difference Vegetation Index (NDVI) has been used as a numerical indicator of the reduction in ground biomass: NDVI = (NIR − Red) / (NIR + Red). After calculating the NDVI variations and the associated mean, we analysed the change in ground biomass. Through this paper, we have tried to indicate the rate of deforestation over a given period of time by comparing the forest cover at different time intervals.
With the help of remote sensing and GIS techniques, it is clearly shown that the total forest cover is continuously degrading and transforming into various land use/land cover categories. Keywords: remote sensing, deforestation, supervised classification, NDVI, change detection
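The change-detection step described above amounts to comparing the scene-average NDVI between acquisition dates. A sketch with hypothetical reflectances for the same pixels on two dates; a negative difference is the signal read as biomass loss:

```python
def mean_ndvi(nir_band, red_band):
    """Scene-average NDVI from matched NIR and Red reflectance lists."""
    vals = [(n - r) / (n + r) for n, r in zip(nir_band, red_band)]
    return sum(vals) / len(vals)

# Hypothetical reflectances for the same three pixels on two acquisition dates.
t1 = mean_ndvi(nir_band=[0.52, 0.48, 0.50], red_band=[0.10, 0.11, 0.09])
t2 = mean_ndvi(nir_band=[0.35, 0.30, 0.35], red_band=[0.18, 0.20, 0.16])
change = t2 - t1   # negative change suggests a reduction in ground biomass
```

In practice the two scenes would first be co-registered and atmospherically corrected (the abstract uses FLAASH for the latter) so that the NDVI difference reflects vegetation change rather than sensor or atmosphere effects.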
Procedia PDF Downloads 402
111 Technology Futures in Global Militaries: A Forecasting Method Using Abstraction Hierarchies
Authors: Mark Andrew
Abstract:
Geopolitical tensions are at a thirty-year high, and the pace of technological innovation is driving asymmetry in force capabilities between nation states and between non-state actors. Technology futures are a vital component of defence capability growth, and investments in technology futures need to be informed by accurate and reliable forecasts of the options for ‘systems of systems’ innovation, development, and deployment. This paper describes a method for forecasting technology futures developed through an analysis of four key systems’ development stages, namely: technology domain categorisation, scanning results examining novel systems’ signals and signs, potential system-of-systems implications in warfare theatres, and political ramifications in terms of funding and development priorities. The method has been applied to several technology domains, including physical systems (e.g., nano weapons, loitering munitions, inflight charging, and hypersonic missiles), biological systems (e.g., molecular virus weaponry, genetic engineering, brain-computer interfaces, and trans-human augmentation), and information systems (e.g., sensor technologies supporting situation awareness, cyber-driven social attacks, and goal-specification challenges to proliferation and alliance testing). Although the current application of the method has been team-centred using paper-based rapid prototyping and iteration, the application of autonomous language models (such as GPT-3) is anticipated as a next-stage operating platform. The importance of forecasting accuracy and reliability is considered a vital element in guiding technology development to afford stronger contingencies as ideological changes are forecast to expand threats to ecology and earth systems, possibly eclipsing the traditional vulnerabilities of nation states.
The early results from the method will be subjected to ground truthing using longitudinal investigation. Keywords: forecasting, technology futures, uncertainty, complexity
Procedia PDF Downloads 114
110 Scenario Analysis to Assess the Competitiveness of Hydrogen in Securing the Italian Energy System
Authors: Gianvito Colucci, Valeria Di Cosmo, Matteo Nicoli, Orsola Maria Robasto, Laura Savoldi
Abstract:
The deployment of the hydrogen value chain is likely to be boosted in the near term by the energy security measures planned by European countries to face the recent energy crisis. In this context, some countries are recognized to have a crucial role in the geopolitics of hydrogen as importers, consumers and exporters. According to the European Hydrogen Backbone initiative, Italy would be part of one of the five corridors that will shape the European hydrogen market. However, the targets set are very ambitious and require large investments to rapidly develop effective hydrogen policies: in this regard, scenario analysis is becoming increasingly important to support energy planning, and energy system optimization models appear to be suitable tools to quantitatively carry out that kind of analysis. This work aims to assess the competitiveness of hydrogen in contributing to Italian energy security in the coming years, under different price and import conditions, using the energy system model TEMOA-Italy. A wide spectrum of hydrogen technologies is included in the analysis, covering the production, storage, delivery, and end-use stages. National production from fossil fuels with and without CCS, as well as electrolysis and the import of low-carbon hydrogen from North Africa, are the supply solutions that would compete with others, such as the natural gas, biomethane and electricity value chains, to satisfy sectoral energy needs (transport, industry, buildings, agriculture). Scenario analysis is then used to study this competition under different price and import conditions.
The use of TEMOA-Italy allows the work to capture the interaction between economics and technological detail, which is much needed in energy policy assessment, while the transparency of the analysis and of the results is ensured by the full accessibility of the TEMOA open-source modeling framework. Keywords: energy security, energy system optimization models, hydrogen, natural gas, open-source modeling, scenario analysis, TEMOA
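Energy system optimization models such as TEMOA choose among competing supply technologies by minimizing total system cost subject to demand and capacity constraints. As a toy illustration only (the route names, costs, and capacities below are invented and are not TEMOA-Italy inputs or results), a greedy merit-order fill reproduces the basic mechanism by which a cheap route crowds out a more expensive one until its capacity binds:

```python
# Illustrative (made-up) levelized costs in EUR/kg for hydrogen supply routes.
ROUTES = {
    "SMR + CCS (domestic)":         2.1,
    "Electrolysis (domestic PV)":   3.4,
    "Import via North Africa pipe": 2.6,
}

def merit_order(routes, demand_kt, capacity_kt):
    """Fill demand from the cheapest route upward, as a cost-minimising
    model would for a single period with no other constraints.
    Cost comes out in millions of EUR, since kt x EUR/kg = MEUR."""
    supplied, cost = {}, 0.0
    for name in sorted(routes, key=routes.get):
        take = min(capacity_kt[name], demand_kt)
        supplied[name] = take
        cost += take * routes[name]
        demand_kt -= take
        if demand_kt <= 0:
            break
    return supplied, cost

caps = {"SMR + CCS (domestic)": 300,
        "Electrolysis (domestic PV)": 500,
        "Import via North Africa pipe": 400}
mix, total_cost = merit_order(ROUTES, demand_kt=600, capacity_kt=caps)
```

A real run of TEMOA solves this as a linear program across decades, sectors, and the full technology network simultaneously; the scenario analysis in the abstract then varies the price and import assumptions and compares the resulting mixes.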
Procedia PDF Downloads 116
109 Radio Regulation Development and Radio Spectrum Analysis of Earth Station in Motion Service
Authors: Fei Peng, Jun Yuan, Chen Fan, Fan Jiang, Qian Sun, Yudi Liu
Abstract:
Although Earth Station in Motion (ESIM) services are widely used and there is huge market demand around the world, the International Telecommunication Union (ITU) does not yet have a unified conclusion on the use of ESIM. ESIM is a Mobile Satellite Service (MSS) by virtue of its mobile attributes, while multiple administrations want to use ESIM within the Fixed Satellite Service (FSS). However, the Radio Regulations (RR) draw a strict distinction between MSS and FSS. This issue has therefore been very controversial within ITU, because such an application would violate the RR Articles, and the conflict brings risks to global deployment. Thus, this paper illustrates the development of the rules, regulations, and standards concerning ESIM and the radio spectrum usage of ESIM in different regions around the world. Firstly, the basic rules, standards and definitions of ITU's Radiocommunication Sector (ITU-R) are introduced. Secondly, the World Radiocommunication Conference (WRC) agenda item on radio spectrum allocation for ESIM, e.g. in the C/Ku/Ka bands, is introduced and the multiple views on radio spectrum allocation are elaborated, especially on 19.7-20.2 GHz & 29.5-30.0 GHz. Then, some ITU-R Recommendations and Reports are analyzed on the specific techniques enabling ESIM to communicate with geostationary-orbit (GSO) space stations in the FSS without causing interference at levels in excess of that caused by conventional FSS earth stations. Meanwhile, the opposing opinion against allocating ESIM service in FSS frequency bands is also elaborated. Finally, based on ESIM's future applications, the ITU-R standards development trend is forecast. In conclusion, using radio spectrum resources in an equitable, rational and efficient manner is the basic guideline of ITU.
Although obstructing the revision of the RR is not a good approach when there is large demand for radio spectrum resources in the satellite industry, the advancement and global rollout of the whole industry may still face difficulties given the unclear application of modified RR rules. Keywords: earth station in motion, ITU standards, radio regulations, radio spectrum, satellite communication
Procedia PDF Downloads 288