Search results for: radar warning receiver
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 809

269 Using Coupled Oscillators for Implementing Frequency Diverse Array

Authors: Maryam Hasheminasab, Ahmed Cheldavi, Ahmed Kishk

Abstract:

Frequency-diverse arrays (FDAs) have garnered significant attention from researchers due to their ability to combine frequency diversity with the inherent spatial diversity of an array. The introduction of frequency diversity in FDAs enables the generation of auto-scanning patterns that are range-dependent, which has advantageous applications in communication and radar systems. However, the main challenge in implementing FDAs lies in determining the technique for distributing frequencies among the array elements. One approach to this challenge is to utilize coupled oscillators, a technique commonly employed in active microwave theory. Nevertheless, the limited stability range of coupled oscillators poses another obstacle to effectively utilizing this technique. In this paper, we explore the possibility of employing a coupled oscillator array in the mode-locked state (MLS) for implementing frequency distribution in FDAs. Additionally, we propose and simulate the use of a digital phase-locked loop (DPLL) as a backup technique to stabilize the oscillators. Through simulations, we validate the functionality of this technique, which holds great promise for advancing the implementation of phased arrays and overcoming current scan-rate and phase-shifter limitations, especially at millimeter wave frequencies.
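As a rough, paper-independent sketch of why a linear frequency increment across elements makes the beampattern range-dependent, the standard uniform-FDA array factor can be evaluated numerically (element count, carrier, increment, and spacing below are illustrative, not taken from the paper):

```python
import numpy as np

def fda_array_factor(n_elems, f0, delta_f, d, theta, r, t, c=3e8):
    """Normalized array factor magnitude of a uniform linear FDA in
    which element n radiates at carrier f0 + n*delta_f."""
    n = np.arange(n_elems)
    phase = 2 * np.pi * (n * delta_f * t           # time-varying term
                         - n * delta_f * r / c     # range-dependent term
                         + f0 * n * d * np.sin(theta) / c)  # angle term
    return np.abs(np.exp(1j * phase).sum()) / n_elems

# Unlike a conventional phased array, the pattern depends on range:
af_near = fda_array_factor(16, 10e9, 3e3, 0.015, 0.0, 10e3, 0.0)
af_far = fda_array_factor(16, 10e9, 3e3, 0.015, 0.0, 40e3, 0.0)
```

With delta_f = 0 the range dependence vanishes and the expression collapses to the usual phased-array factor, which is the design intuition behind the FDA.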

Keywords: angle-changing rate, auto scanning beam, pull-in range, hold-in range, locking range, mode locked state, frequency locked state

Procedia PDF Downloads 58
268 Communication Aesthetics of Techno-Scenery and Lighting in Bolanle Austen-Peters Queen Moremi the Musical

Authors: Badeji Adebayo John

Abstract:

Technology has made immense contributions to every aspect of human endeavor; it has not only made work easier but also left exhilarating impressions in the minds of people. Theatre is not exempt from this multifaceted influence. Contemporary theatre performances have embraced technology to such a degree that audiences leave with unforgettable experiences. Among the technological advancements that have amplified the aesthetics of theatre performances are techno-scenery (3D mapping) and lighting. In view of this, the objective of this study is to explore how techno-scenery and lighting technologies were used to communicate messages in the performance of Queen Moremi the Musical. To this end, the participant-observation method and content analysis are adopted, and Berlo's model of communication is employed to explain the communicative aesthetics of these theatre technologies in the performance. Techno-scenery and lighting are communication media modifiers that facilitate audiences' comprehension of the messages in the performance of Queen Moremi the Musical. They also create clear moving pictures of the setting, which the performers cannot convey through acting, dance, and song alone, easing the audience's decoding of the messages the performers send. Therefore, consistent incorporation of these technologies into theatre performances will facilitate an easy flow of communication between the performers (the sender), the performance (the message), and the audience (the receiver).

Keywords: communication, aesthetics, techno-scenery, lighting, musical

Procedia PDF Downloads 56
267 Designing Electronic Kanban in the Tailboom Assembly Line at XYZ Corp to Reduce Lead Time

Authors: Nadhifah A. Nugraha, Dida D. Damayanti, Widia Juliani

Abstract:

Airplane manufacturing is growing along with increasing demand from consumers. The Tailboom, a helicopter's tail section, is a product of the helicopter division at XYZ Corp, where the Tailboom assembly line operates as a pull system. Observations of existing conditions at XYZ Corp show that production is still unable to meet consumer demand; the lead time exceeds the plan agreed upon with the consumers. In the assembly process, each workstation experiences shortages of the parts and components needed for assembly. This happens because the required part information arrives late and there is no warning about the availability of needed parts, leaving some parts unavailable in the assembly warehouse. The lack of parts and components from the previous workstation causes the assembly process to stop, and the assembly line also stops at the next station. As a result, production finishes late and off schedule. Resolving these problems requires a controlling process that ensures the assembly line receives all components and subassemblies in the right amount and at the right time. This study applies one of the Just-in-Time tools, Kanban, with automation added so that the communication line becomes an efficient and effective electronic Kanban (e-Kanban). The problem can be solved by reducing non-value-added time, such as waiting time and idle time. The proposed control of the Tailboom assembly line results in a smooth assembly line without waiting, reduced lead time, and production completed according to the schedule agreed with the consumers.

Keywords: kanban, e-Kanban, lead time, pull system

Procedia PDF Downloads 85
266 The Ontological Memory in Bergson as a Conceptual Tool for the Analysis of the Digital Conjuncture

Authors: Douglas Rossi Ramos

Abstract:

The current digital conjuncture, called by some authors the 'Internet of Things' (IoT), 'Web 2.0', or even 'Web 3.0', consists of a network that encompasses communication among objects and entities of all kinds, such as data, information, technologies, and people. At this juncture, especially characterized by an "object socialization," communication can no longer be represented as a simple informational flow of messages from a sender, crossing a channel or medium, and reaching a receiver. The idea of communication must therefore be thought of more broadly, so that the communicative process can be analyzed through interactions between humans and nonhumans. To think about this complexity, a communicative process that encompasses both humans and other communicating beings or entities (objects and things), it is necessary to constitute a new epistemology of communication and to rethink concepts and notions commonly attributed only to humans, such as 'memory.' This research aims to contribute to this epistemological constitution through a discussion of the notion of memory according to the complex ontology of Henri Bergson. Among the results (the notion of memory in Bergson presents itself as a conceptual tool for the analysis of posthumanism and the anthropomorphic conjuncture of the new digital advent), there emerged the need to think about an ontological memory, analyzed as a being in itself (the being-in-itself of memory), as a strategy for understanding the forms of interaction and communication that constitute the new digital conjuncture, in which communicating beings or entities tend to interact with each other. Rethinking the idea of communication beyond the dimension of transmission in informative sequences paves the way for an ecological perspective on the condition of digital dwelling.

Keywords: communication, digital, Henri Bergson, memory

Procedia PDF Downloads 132
265 Driver Readiness in Autonomous Vehicle Take-Overs

Authors: Abdurrahman Arslanyilmaz, Salman Al Matouq, Durmus V. Doner

Abstract:

Level 3 autonomous vehicles are able to take full responsibility for the control of the vehicle unless a system boundary is reached or a system failure occurs, in which case the driver is expected to take over control. When this happens, the driver is often not aware of the traffic situation or is engaged in a secondary task. Factors affecting the duration and quality of take-overs in these situations include the type and nature of the secondary task, traffic density, take-over request (TOR) time, and TOR warning type and modality. However, to the best of the authors' knowledge, no prior study has examined the time buffer for TORs when a system failure occurs immediately before an intersection. The first objective of this study is therefore to investigate the effect of time buffer (3 and 7 seconds) on the duration and quality of take-overs when a system failure occurs just prior to an intersection. In addition, eye-tracking has become one of the most popular methods of reporting what individuals view, in what order, for how long, and how often, and it has been utilized in driving simulations with various objectives. However, no study has compared drivers' eye gaze behavior under the two time buffers in order to examine drivers' attention to and comprehension of salient information. The second objective is to understand drivers' attentional focus on, and comprehension of, salient traffic-related information presented on different parts of the dashboard and on the road.

Keywords: autonomous vehicles, driving simulation, eye gaze, attention, comprehension, take-over duration, take-over quality, time buffer

Procedia PDF Downloads 106
264 Comparative Diagnostic Performance of Diffusion-Weighted Imaging Combined with Microcalcifications on Mammography for Discriminating Malignant from Benign BI-RADS 4 Lesions, Compared with the Kaiser Score

Authors: Wangxu Xia

Abstract:

BACKGROUND: BI-RADS 4 lesions raise the possibility of malignancy and warrant further clinical and radiologic work-up. This study aimed to evaluate the performance of diffusion-weighted imaging (DWI) and microcalcifications on mammography for predicting malignancy of BI-RADS 4 lesions. In addition, the predictive performance of DWI combined with microcalcifications was compared with the Kaiser score. METHODS: Between January 2021 and June 2023, 144 patients with 178 BI-RADS 4 lesions who underwent conventional MRI, DWI, and mammography were included. The lesions were dichotomized into benign or malignant according to the pathological results from core needle biopsy or surgical mastectomy. DWI was performed with b values of 0 and 800 s/mm² and analyzed using the apparent diffusion coefficient, and a Kaiser score > 4 was considered to suggest malignancy. The diagnostic performance of each test was evaluated with the receiver operating characteristic (ROC) curve. RESULTS: The area under the curve (AUC) for DWI was significantly higher than that of mammography (0.86 vs 0.71, P<0.001) but comparable with that of the Kaiser score (0.86 vs 0.84, P=0.58). However, the AUC for DWI combined with mammography was significantly higher than that of the Kaiser score (0.93 vs 0.84, P=0.007). The sensitivity for discriminating malignant from benign BI-RADS 4 lesions was highest at 89% for the Kaiser score, but the highest specificity, 83%, was achieved with DWI combined with mammography. CONCLUSION: DWI combined with microcalcifications on mammography can discriminate malignant BI-RADS 4 lesions from benign ones with a high AUC and specificity; however, the Kaiser score had better sensitivity.
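The AUC comparisons above rest on the Mann-Whitney interpretation of the ROC area: the probability that a randomly chosen malignant lesion scores higher than a randomly chosen benign one. As an illustrative, library-free sketch (not the authors' code), AUC can be computed directly from scores and pathology-confirmed labels:

```python
import numpy as np

def auc(scores, labels):
    """ROC AUC via the Mann-Whitney U statistic: the probability that
    a randomly chosen positive (label 1) case outranks a randomly
    chosen negative (label 0) case; ties count half."""
    scores = np.asarray(scores, dtype=float)
    labels = np.asarray(labels)
    pos = scores[labels == 1]
    neg = scores[labels == 0]
    greater = (pos[:, None] > neg[None, :]).sum()
    ties = (pos[:, None] == neg[None, :]).sum()
    return (greater + 0.5 * ties) / (len(pos) * len(neg))
```

A perfectly separating marker yields 1.0; an uninformative one yields 0.5, which is why 0.86 vs 0.71 is a meaningful gap.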

Keywords: MRI, DWI, mammography, breast disease

Procedia PDF Downloads 31
263 Highly Accurate Target Motion Compensation Using Entropy Function Minimization

Authors: Amin Aghatabar Roodbary, Mohammad Hassan Bastani

Abstract:

One of the defects of stepped frequency radar systems is their sensitivity to target motion. In such systems, target motion causes range cell shift, false peaks, Signal-to-Noise Ratio (SNR) reduction, and range profile spreading, because the power spectrum of each range cell interferes with adjacent range cells; this distorts the High Resolution Range Profile (HRRP) and disrupts the target recognition process. Thus, compensation for the effects of Target Motion Parameters (TMPs) should be employed. In this paper, such a method for estimating the TMPs (velocity and acceleration), and consequently eliminating or suppressing their unwanted effects on the HRRP, is proposed based on entropy minimization. The method is carried out in two major steps: in the first step, a discrete search is performed over the whole acceleration-velocity lattice within a specific interval, seeking a coarse minimum point of the entropy function. In the second step, a 1-D search over velocity is done in the neighborhood of that minimum, along several constant-acceleration lines, in order to refine the minimum found in the first step. The provided simulation results demonstrate the effectiveness of the proposed method.
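The two-step search can be sketched generically. The entropy function below is the usual Shannon entropy of a normalized range profile (a sharper HRRP has lower entropy); the grid sizes are illustrative, and in practice `entropy_fn` would evaluate the entropy of the HRRP re-compensated with the candidate velocity and acceleration:

```python
import numpy as np

def profile_entropy(profile):
    """Shannon entropy of a normalized range-profile magnitude;
    a sharper (better focused) HRRP has lower entropy."""
    p = np.abs(np.asarray(profile, dtype=float))
    p = p / p.sum()
    p = p[p > 0]
    return float(-(p * np.log(p)).sum())

def two_step_search(entropy_fn, v_grid, a_grid, n_fine=201):
    """Two-step minimization of entropy_fn(velocity, acceleration)."""
    # Step 1: coarse discrete search over the whole
    # acceleration-velocity lattice.
    _, v0, a0 = min((entropy_fn(v, a), v, a)
                    for v in v_grid for a in a_grid)
    # Step 2: finer 1-D search over velocity along the
    # constant-acceleration lines adjacent to the coarse minimum.
    dv = v_grid[1] - v_grid[0]
    v_fine = np.linspace(v0 - dv, v0 + dv, n_fine)
    i0 = int(np.argmin(np.abs(np.asarray(a_grid) - a0)))
    lines = np.asarray(a_grid)[max(0, i0 - 1): i0 + 2]
    return min((entropy_fn(v, a), v, a)
               for v in v_fine for a in lines)
```

The coarse pass keeps the lattice small, and the 1-D refinement recovers the velocity accuracy lost to the coarse step size.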

Keywords: automatic target recognition (ATR), high resolution range profile (HRRP), motion compensation, stepped frequency waveform technique (SFW), target motion parameters (TMPs)

Procedia PDF Downloads 131
262 Applying the Regression Technique for Prediction of the Acute Heart Attack

Authors: Paria Soleimani, Arezoo Neshati

Abstract:

Myocardial infarction is one of the leading causes of death in the world. Some of these deaths occur even before the patient reaches the hospital. Myocardial infarction occurs as a result of impaired blood supply. Because most of these deaths are due to coronary artery disease, awareness of the warning signs of a heart attack is essential. Some heart attacks are sudden and intense, but most start slowly, with mild pain or discomfort; early detection and successful treatment of these symptoms are vital to saving patients. The importance and usefulness of a system designed to assist physicians in the early diagnosis of acute heart attacks is therefore obvious. The purpose of this study is to determine how well a predictive model would perform based only on patient-reportable clinical history factors, without diagnostic tests or physical exams. This type of prediction model might have application outside the hospital setting, giving patients accurate advice that influences them to seek care in appropriate situations. For this purpose, data were collected on 711 heart patients in Iranian hospitals, and 28 clinical factors that can be reported by patients were studied. Three logistic regression models were built on the basis of these 28 features to predict the risk of a heart attack. The best-performing logistic regression model had a C-index of 0.955 and an accuracy of 94.9%. The variables severe chest pain, back pain, cold sweats, shortness of breath, nausea, and vomiting were selected as the main features.
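As a minimal sketch of the modeling approach (synthetic data, not the Iranian cohort; the three binary symptom indicators are hypothetical), a logistic regression can be fitted by plain gradient descent and scored with the C-index used in the abstract:

```python
import numpy as np

def fit_logistic(X, y, lr=0.1, epochs=2000):
    """Gradient-descent logistic regression (numpy only); returns
    weights w and bias b for P(event) = sigmoid(X @ w + b)."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
        w -= lr * (X.T @ (p - y)) / len(y)
        b -= lr * (p - y).mean()
    return w, b

def c_index(scores, y):
    """C-index (= AUC): P(a case outranks a non-case)."""
    pos, neg = scores[y == 1], scores[y == 0]
    return (pos[:, None] > neg[None, :]).mean()

# Toy data: three binary symptom indicators; the outcome here is
# driven entirely by the first one (e.g. severe chest pain).
rng = np.random.default_rng(0)
X = rng.integers(0, 2, size=(400, 3)).astype(float)
y = (X[:, 0] == 1).astype(float)
w, b = fit_logistic(X, y)
p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
```

On real symptom data the C-index and accuracy would of course be estimated on held-out patients rather than the training set.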

Keywords: coronary heart disease, acute heart attacks, prediction, logistic regression

Procedia PDF Downloads 430
261 Working Memory Growth from Kindergarten to First Grade: Considering Impulsivity, Parental Discipline Methods and Socioeconomic Status

Authors: Ayse Cobanoglu

Abstract:

Working memory can be defined as a workspace that holds and regulates active information in the mind. This study investigates individual changes in children's working memory from kindergarten to first grade. The main purpose of the study is to examine whether parental discipline methods and children's impulsive/overactive behaviors affect the initial status and growth rate of working memory, controlling for gender, minority status, and socioeconomic status (SES). A linear growth curve model is fitted to the first four waves of the Early Childhood Longitudinal Study-Kindergarten Cohort of 2011 (ECLS-K:2011) to analyze children's working memory growth longitudinally (N=3915). Results revealed significant variation among students' initial status in the kindergarten fall semester as well as in the growth rate during the first two years of schooling. While minority status, SES, and children's overactive/impulsive behaviors influenced children's initial status, only SES and minority status were significantly associated with the growth rate of working memory. Parental discipline methods such as giving a warning and ignoring the child's negative behavior are also negatively associated with initial working memory scores. Examining the growth rate further, students with lower SES as well as minority students showed a faster growth pattern during the first two years of schooling. However, the findings on parental disciplinary methods and working memory growth rates were mixed. It can be concluded that schooling helps low-SES minority students develop their working memory.
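A full growth curve model is normally fitted with mixed-effects software; as a simplified stand-in (synthetic data, not ECLS-K:2011), a two-stage approximation conveys the idea: fit each child's linear working-memory trajectory over the waves, then regress the person-level intercepts and slopes on a covariate such as SES:

```python
import numpy as np

def two_stage_growth(times, scores, covariate):
    """Two-stage approximation to a linear growth-curve model.

    Stage 1: per-child OLS over the measurement waves gives an
    (intercept, slope) pair per row of `scores`.
    Stage 2: regress those person-level estimates on `covariate`.
    """
    X = np.column_stack([np.ones_like(times), times])
    coefs = np.linalg.lstsq(X, scores.T, rcond=None)[0].T  # (n, 2)
    Z = np.column_stack([np.ones_like(covariate), covariate])
    effects = np.linalg.lstsq(Z, coefs, rcond=None)[0]     # (2, 2)
    # effects[1] = covariate's effect on (intercept, slope)
    return coefs, effects
```

Unlike a true multilevel model, this two-stage shortcut ignores the differing reliability of each child's trajectory estimate, but it isolates the "initial status vs growth rate" distinction the abstract describes.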

Keywords: growth curve modeling, impulsive/overactive behaviors, parenting, working memory

Procedia PDF Downloads 107
260 Outdoor Visible Light Communication Channel Modeling under Fog and Smoke Conditions

Authors: Véronique Georlette, Sebastien Bette, Sylvain Brohez, Nicolas Point, Veronique Moeyaert

Abstract:

Visible light communication (VLC) is a communication technology that is part of the optical wireless communication (OWC) family. It uses the visible and infrared spectrums to send data. Until now, this technology has mostly been studied for indoor use-cases, but it is sufficiently mature nowadays to consider its outdoor potential. The main outdoor challenges are meteorological conditions and the presence of smoke due to fire or pollutants in urban areas. This paper proposes a methodology to assess the robustness of an outdoor VLC system given the outdoor conditions. The methodology is put into practice in two realistic scenarios, a VLC bus stop and a VLC streetlight. It consists of computing the power margin available in the system, given all the characteristics of the VLC system and its surroundings. This is done with an outdoor VLC communication channel simulator developed in Python, which quantifies the effects of fog and smoke, using models taken from the environmental and fire engineering scientific literature, as well as the optical power reaching the receiver. These two phenomena impact the communication by increasing the total attenuation of the medium. The main conclusion drawn in this paper is that the levels of attenuation due to fog and smoke are of the same order of magnitude, with fog attenuation being the highest once visibility drops below 1 km. This gives a promising prospect for the deployment of outdoor VLC use-cases in the near future.
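One common way to turn meteorological visibility into a specific attenuation figure in the optical wireless literature is the Kim model; a sketch follows (the paper's simulator may use different models, and the 3.91/V factor assumes the conventional 2% contrast threshold):

```python
import math

def kim_q(v_km):
    """Particle size-distribution exponent q in the Kim model,
    selected by visibility range."""
    if v_km > 50:
        return 1.6
    if v_km > 6:
        return 1.3
    if v_km > 1:
        return 0.16 * v_km + 0.34
    if v_km > 0.5:
        return v_km - 0.5
    return 0.0

def fog_attenuation_db_per_km(v_km, wavelength_nm=550.0):
    """Specific attenuation (dB/km) from meteorological visibility
    v_km, using the Kim model referenced to 550 nm."""
    beta = (3.91 / v_km) * (wavelength_nm / 550.0) ** (-kim_q(v_km))
    return 10.0 * beta / math.log(10.0)  # nepers/km -> dB/km
```

At 550 nm and 1 km visibility this gives roughly 17 dB/km, which illustrates why sub-kilometre visibility dominates the outdoor VLC link budget.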

Keywords: channel modeling, fog modeling, meteorological conditions, optical wireless communication, smoke modeling, visible light communication

Procedia PDF Downloads 125
259 Correlation between Speech Emotion Recognition Deep Learning Models and Noises

Authors: Leah Lee

Abstract:

This paper examines the correlation between deep learning models and emotions under added noise, to determine whether or not noise masks emotions. The deep learning models used are a plain convolutional neural network (CNN), an auto-encoder, long short-term memory (LSTM), and Visual Geometry Group-16 (VGG-16). The emotion datasets used are the Ryerson Audio-Visual Database of Emotional Speech and Song (RAVDESS), the Crowd-sourced Emotional Multimodal Actors Dataset (CREMA-D), the Toronto Emotional Speech Set (TESS), and Surrey Audio-Visual Expressed Emotion (SAVEE). To make the dataset four times larger, the original audio files are combined with stretch and pitch augmentations. From the augmented datasets, five different features are extracted as inputs to the models. Eight different emotions are to be classified. The noise variations are white noise, dog barking, and cough sounds, and the signal-to-noise ratio (SNR) is varied over 0, 20, and 40 dB. In total, for each deep learning model, nine different sets with noise and SNR variations, plus the augmented audio files without any noise, are used in the experiment. To compare the results of the deep learning models, accuracy and the receiver operating characteristic (ROC) are checked.
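Mixing a noise recording into speech at a prescribed SNR reduces to a power-ratio scaling; a minimal sketch of this kind of preprocessing (not the authors' code) is:

```python
import numpy as np

def add_noise_at_snr(signal, noise, snr_db):
    """Scale `noise` so that the mix has the requested SNR in dB,
    where SNR = 10*log10(P_signal / P_noise)."""
    noise = np.resize(noise, signal.shape)  # loop/trim to signal length
    p_sig = np.mean(signal ** 2)
    p_noise = np.mean(noise ** 2)
    scale = np.sqrt(p_sig / (p_noise * 10 ** (snr_db / 10)))
    return signal + scale * noise
```

An SNR of 0 dB means signal and noise have equal power, while 40 dB leaves the noise barely audible, which is what makes the three SNR levels a masking sweep.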

Keywords: auto-encoder, convolutional neural networks, long short-term memory, speech emotion recognition, visual geometry group-16

Procedia PDF Downloads 47
258 The Impact of Regulatory Changes on the Development of Mobile Medical Apps

Authors: M. McHugh, D. Lillis

Abstract:

Mobile applications are being used to perform a wide variety of tasks in day-to-day life, ranging from checking email to controlling home heating. Application developers have recognized the potential to transform a smart device, i.e., a mobile phone or a tablet, into a medical device by using a mobile medical application. When initially conceived, these mobile medical applications performed basic functions, e.g., BMI calculation or accessing reference material; however, increasing complexity now offers clinicians and patients a wide range of functionality. As this complexity and functionality increase, so too does the potential risk associated with using such an application. Examples include applications that inflate and deflate blood pressure cuffs, as well as applications that use patient-specific parameters to calculate dosage or create a dosage plan for radiation therapy. If an unapproved mobile medical application is marketed by a medical device organization, the organization faces significant penalties, such as an FDA warning letter to cease the prohibited activity, fines, and the possibility of criminal conviction. Regulatory bodies have finalized guidance intended to help mobile application developers establish whether their applications are subject to regulatory scrutiny. However, regulatory controls appear to contradict the approaches taken by mobile application developers, who generally work with short development cycles and very little documentation; as such, there is the potential for these regulations to stifle further improvements. The research presented in this paper details how, by adopting development techniques such as agile software development, mobile medical application developers can meet regulatory requirements whilst still fostering innovation.

Keywords: agile, applications, FDA, medical, mobile, regulations, software engineering, standards

Procedia PDF Downloads 340
257 Adding a Few Language-Level Constructs to Improve OOP Verifiability of Semantic Correctness

Authors: Lian Yang

Abstract:

Object-oriented programming (OOP) is the dominant programming paradigm in today's software industry, and it has enabled average software developers to build millions of commercial-strength software applications during the Internet era of the past three decades. On the other hand, the lack of a strict mathematical model and of domain-constraint features at the language level has long perplexed the computer science academia and the OOP engineering community. This situation has resulted in inconsistent system quality and hard-to-understand designs in some OOP projects. The difficulties of fixing the current situation are also well known. Although the power of OOP lies in its unbridled flexibility and enormously rich data modeling capability, we argue that the ambiguity and the implicit facade surrounding the conceptual model of a class and an object should be eliminated as much as possible. We list the five major usages of a class and propose to separate them with new language constructs. Using the well-established theories of sets and finite state machines (FSMs), we propose applying simple, generic, and yet effective constraints at the OOP language level in an attempt to address the above-mentioned issues. The goal is to make OOP more theoretically sound, to help programmers uncover warning signs of irregularities and domain-specific issues early in the development stage, and to catch semantic mistakes at runtime, improving the correctness verifiability of software programs. That said, the aim of this paper is more practical than theoretical.

Keywords: new language constructs, set theory, FSM theory, user defined value type, function groups, membership qualification attribute (MQA), check-constraint (CC)

Procedia PDF Downloads 220
256 Predictive Analytics in Oil and Gas Industry

Authors: Suchitra Chnadrashekhar

Abstract:

Information technology, earlier viewed as a support function in an organization, has now become a critical utility for managing daily operations. Organizations are processing amounts of data that were unimaginable a few decades ago. This has opened the opportunity for the IT sector to help industries across domains handle data in the most intelligent manner. IT has been a lever for the oil and gas industry to store, manage, and process data in the most efficient way possible, thus deriving economic value from day-to-day operations. Proper synchronization between operational data systems and information technology systems is the need of the hour. Predictive analytics supports oil and gas companies by addressing the challenges of critical equipment performance, life cycle, integrity, security, and utilization. Predictive analytics goes beyond early warning by providing insights into the roots of problems. To reach their full potential, oil and gas companies need to take a holistic, systems approach towards asset optimization and thus have functional information at all levels of the organization in order to make the right decisions. This paper discusses how the use of predictive analytics in the oil and gas industry is redefining the dynamics of this sector. The paper is supported by real-time data and an evaluation of the data for a given oil production asset using an application tool, SAS. The reason for using SAS for our analysis is that it provides an analytics-based framework to improve uptime, performance, and availability of crucial assets while reducing the amount of unscheduled maintenance, thus minimizing maintenance-related costs and operational disruptions. With state-of-the-art analytics and reporting, maintenance problems can be predicted before they happen, and root causes can be determined in order to update processes for future prevention.

Keywords: hydrocarbon, information technology, SAS, predictive analytics

Procedia PDF Downloads 324
255 Flood Hazard Impact Based on Simulation Model of Potential Flood Inundation in Lamong River, Gresik Regency

Authors: Yunita Ratih Wijayanti, Dwi Rahmawati, Turniningtyas Ayu Rahmawati

Abstract:

Gresik is one of the districts in East Java Province, Indonesia. Gresik Regency has three major rivers, namely the Bengawan Solo River, the Brantas River, and the Lamong River, the last of which is a tributary of the Bengawan Solo. Flood disasters in Gresik Regency are often caused by the overflow of the Lamong River, and the losses caused by these floods are very large and certainly detrimental to the affected people. Therefore, to minimize the impact of flooding, preventive action is necessary. Before taking preventive action, however, information is needed on potential inundation areas and water levels at various points, for which a flood simulation model is required. In this study, the simulation was carried out using the Geographic Information System (GIS) method with the help of Global Mapper software. The simulation takes a topographical approach based on Digital Elevation Model (DEM) data, which have been widely used in hydrological analyses. The results of the flood simulation are the distribution of flood inundation and the water level. The inundation extent indicates the area flooded with reference to the 50-100 year design flood, while the water level provides early warning information. Both will be very useful for estimating future losses from flooding in Gresik Regency, so that the Gresik Regency Regional Disaster Management Agency can take precautions before a flood disaster strikes.

Keywords: flood hazard, simulation model, potential inundation, global mapper, Gresik Regency

Procedia PDF Downloads 62
254 Survey of Hawke's Bay Tourism Based Businesses: Tsunami Understanding and Preparation

Authors: V. A. Ritchie

Abstract:

The loss of life and livelihood experienced after the magnitude 9.3 Sumatra earthquake and tsunami on 26 December 2004 and the magnitude 9 earthquake and tsunami in northeastern Japan on 11 March 2011 raised global awareness and brought tsunami phenomenology, nomenclature, and representation into sharp focus. At the same time, travel and tourism continue to increase, contributing around 1 in 11 jobs worldwide. This increase is especially pronounced in coastal zones, placing pressure on decision-makers to downplay tsunami risks while at the same time providing adequate tsunami warning so that holidaymakers will feel confident enough to visit places of high tsunami risk. This study investigates how well tsunami preparedness messages are getting through to tourism-based businesses in Hawke's Bay, New Zealand, a region of frequent seismic activity and a high probability of experiencing a nearshore tsunami. The aim of this study is to investigate whether tourism-based businesses are well informed about tsunamis, how well they understand that information, and to what extent their clients are included in awareness-raising and evacuation processes. In high-risk tsunami zones such as Hawke's Bay, tourism-based businesses face a competitive tension between short-term profitability and longer-term reputational issues related to preventable loss of life from natural hazards such as tsunamis. This study will address ways to accommodate culturally and linguistically relevant tourist awareness measures without discouraging tourists or being too costly to implement.

Keywords: tsunami risk and response, travel and tourism, business preparedness, cross cultural knowledge transfer

Procedia PDF Downloads 133
253 Application of RS and GIS Technique for Identifying Groundwater Potential Zone in Gomukhi Nadhi Sub Basin, South India

Authors: Punitha Periyasamy, Mahalingam Sudalaimuthu, Sachikanta Nanda, Arasu Sundaram

Abstract:

India holds 17.5% of the world's population but only 2% of its total geographical area, and 27.35% of that area is categorized as wasteland due to a lack of, or scarce, groundwater. There is therefore heavy demand for groundwater for agricultural and non-agricultural activities to balance the country's growth rate. With this in mind, an attempt is made to find the groundwater potential zones in the Gomukhi river sub-basin of the Vellar River basin, Tamil Nadu, India, covering an area of 1146.6 sq. km and consisting of 9 blocks from Peddanaickanpalayam to Villupuram. Thematic maps of geology, geomorphology, lineaments, land use and land cover, and drainage are prepared for the study area using IRS P6 data. Collateral data, including rainfall, water level, and soil maps, are collected for analysis and inference. The digital elevation model (DEM) is generated from the Shuttle Radar Topography Mission (SRTM), and the slope of the study area is obtained from it. ArcGIS 10.1 serves as a powerful spatial analysis tool to delineate the groundwater potential zones in the study area by means of weighted overlay analysis. Each parameter of the thematic maps is ranked and weighted according to its influence on groundwater recharge. The potential zones in the study area are classified as Very Good, Good, Moderate, and Poor, with areal extents of 15.67, 381.06, 575.38, and 174.49 sq. km, respectively.
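The weighted overlay step can be sketched with plain arrays: each thematic layer is ranked per cell, the layers are combined with influence weights, and the composite score is binned into potential classes. The weights, ranks, and class breaks below are illustrative, not the study's values:

```python
import numpy as np

def weighted_overlay(layers, weights):
    """Weighted overlay: per-cell weighted sum of ranked thematic
    rasters; weights reflect each theme's influence and sum to 1."""
    weights = np.asarray(weights, dtype=float)
    assert np.isclose(weights.sum(), 1.0), "weights must sum to 1"
    stack = np.stack([np.asarray(l, dtype=float) for l in layers])
    return np.tensordot(weights, stack, axes=1)

def classify(score, breaks=(2.0, 3.0, 4.0),
             labels=("Poor", "Moderate", "Good", "Very Good")):
    """Bin the composite suitability score into potential classes."""
    return np.asarray(labels)[np.digitize(score, breaks)]

# Two tiny 2x2 "rasters" of per-cell ranks (e.g. geology, drainage):
potential = weighted_overlay(
    [[[1, 5], [3, 4]], [[2, 5], [3, 1]]], [0.6, 0.4])
classes = classify(potential)
```

In ArcGIS this is the Weighted Overlay tool applied to the reclassified thematic rasters; the numpy version simply makes the arithmetic explicit.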

Keywords: ArcGIS, DEM, groundwater, recharge, weighted overlay

Procedia PDF Downloads 419
252 Applications of Out-of-Sequence Thrust Movement for Earthquake Mitigation: A Review

Authors: Rajkumar Ghosh

Abstract:

The study presents an overview of approaches for estimating out-of-sequence thrust movement and of its many uses in earthquake mitigation, investigating how understanding and forecasting thrust movement during seismic events can support effective mitigation measures. The review begins by defining out-of-sequence thrust movement and its importance in mitigation strategies. It shows how conventional techniques for estimating thrust movement may not capture the full complexity of seismic events and emphasizes the benefits of including out-of-sequence data in the analysis. A thorough survey of existing research on out-of-sequence thrust movement estimation follows, demonstrating how such movement can be estimated from multiple data sources, including GPS measurements, satellite imagery, and seismic recordings. The study then examines the use of these estimates in mitigation measures: precise calculation of thrust movement can improve structural design, inform infrastructure risk analysis, and support early warning systems, and the potential advantages of out-of-sequence data in these applications are discussed. The difficulties and limits of the approach are also addressed, including data quality issues, modelling uncertainties, and computational complications; to overcome these obstacles and increase the accuracy and reliability of out-of-sequence thrust movement estimates, the authors recommend topics for further study. The study is a helpful resource for researchers, engineers, and policymakers working on seismic monitoring and earthquake risk assessment, supporting innovations in mitigation measures based on a better understanding of thrust movement dynamics.

Keywords: earthquake mitigation, out-of-sequence thrust, satellite imagery, seismic recordings, GPS measurements

Procedia PDF Downloads 62
251 Security Analysis of Mode S Transponder Technology and Attack Examples

Authors: M. Rutkowski, J. Cwiklak, M. Grzegorzewski, M. Adamski

Abstract:

All Class A aircraft must be equipped with a Mode S transponder for ATC surveillance purposes. The technology was designed to provide a robust and dependable means to localize, identify, and exchange data with an airplane. The purpose of this paper is to analyze the potential hazards that result from the absence of any security or encryption at the design level. Secondary Surveillance Radar (SSR) relies on an active response from the airplane: the ground installation broadcasts a directional interrogation signal to aircraft in range at 1030 MHz with DPSK modulation. If the interrogation is correctly received by the transponder on board, a reply is sent at 1090 MHz with PPM modulation containing the aircraft’s squawk code, barometric altitude, GPS coordinates, and 24-bit unique address. No encryption of any kind is used, and the full specification is readily available on the internet. Since there is no security measure to establish the credibility of the sender or the message, relying on this technology alone for air traffic safety is highly hazardous. The only element identifying the airplane is the 24-bit unique address, and most aircraft have already been logged by aviation enthusiasts and cataloged in public web databases. At the time of writing, PoFung Technologies has announced an all-band SDR transceiver; such a device would be more than sufficient to build a counterfeit Mode S transponder. With a fake transponder, an attacker can identify as a different airplane: by swapping transponder identities in poorly controlled airspace, hijackers could enter another airspace while identifying as another aircraft and land in a desired area.
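The lack of any cryptographic protection is easy to see from how little code is needed to read the identity out of a reply. The sketch below decodes the downlink format and the 24-bit ICAO address from a raw Mode S frame; the bit layout (5-bit downlink format, address at bits 9-32 for DF 11 all-call replies and DF 17 extended squitter) follows the published specification, and the example frame is a widely used public ADS-B sample.

```python
def parse_mode_s(frame_hex: str):
    """Extract the downlink format and 24-bit ICAO address from a Mode S frame.

    Valid for frames where the address field directly follows the header,
    e.g. DF 11 all-call replies and DF 17 extended squitter."""
    bits = bin(int(frame_hex, 16))[2:].zfill(len(frame_hex) * 4)
    df = int(bits[0:5], 2)           # downlink format (first 5 bits)
    icao = hex(int(bits[8:32], 2))   # 24-bit unique aircraft address
    return df, icao

# Example: a 112-bit DF 17 extended squitter. The first byte 0x8D encodes
# DF=17 and CA=5; the next three bytes are the ICAO address 0x4840D6.
df, icao = parse_mode_s("8D4840D6202CC371C32CE0576098")
print(df, icao)  # 17 0x4840d6
```

Nothing in the frame authenticates the sender, which is exactly the weakness the paper exploits: any transmitter capable of producing PPM at 1090 MHz can emit a frame carrying someone else’s address.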

Keywords: flight safety, hijack, Mode S transponder, security analysis

Procedia PDF Downloads 274
250 The Survey of Sports Injuries in Ten Sports

Authors: Najmeh Arabnejad, Mohammad Hossein Yousefi

Abstract:

The risk of injury exists in most sports and is unavoidable in contact sports. Since sports injuries create financial, physical, physiological, and social problems for athletes and endanger their professional future, the occurrence of sports injuries is an important research topic, one that can be approached from psychological, pathological, social, managerial, and other perspectives. The present descriptive, documentary study therefore surveyed sports injuries in ten sports from 2006 to 2011. Data on sports insurance and injuries in soccer, volleyball, basketball, handball, badminton, karate, track and field, taekwondo, gymnastics, and wrestling were collected from the Sports Medical Board of Kerman Province, the largest province in Iran, using a library-based collection method. Records of 210,406 insured athletes were analyzed using descriptive statistics in SPSS 20. The findings showed that in most sports, and across the years studied, more male athletes were injured than female athletes. Soccer, karate, volleyball, wrestling, handball, taekwondo, gymnastics, basketball, track and field, and badminton had the most injuries, in that order. The number of injured athletes and their ratio to insured athletes over the six years were also examined, and in general an increase in the ratio of sports injuries was observed. This upward trend across sports, confirmed by the results of this study, is a warning of the loss of young athletes and the waste of sporting potential in Iran.

Keywords: sports, sports injuries, survey, Kerman

Procedia PDF Downloads 346
249 Integration of GIS with Remote Sensing and GPS for Disaster Mitigation

Authors: Sikander Nawaz Khan

Abstract:

Natural disasters such as floods, earthquakes, cyclones, and volcanic eruptions cause immense losses of property and life every year. The current status and actual losses from natural hazards can be determined, and the next probable disasters predicted, using remote sensing and mapping technologies. The Global Positioning System (GPS) pinpoints the exact location of damage and can communicate with wireless sensor nodes embedded in potentially dangerous places; it provides emergency responders with precise, accurate locations along with related information such as the speed, track, direction, and distance of a target object. Remote sensing makes it possible to map damage without physical contact with the target area, and with the addition of more remote sensing satellites and other advances, early warning systems now operate very efficiently at both local and global scales. High-resolution satellite imagery (HRSI), airborne remote sensing, and space-borne remote sensing all play a vital role in disaster management. Geographic Information Systems (GIS) were initially used to collect, arrange, and map spatial information, but now also have the capability to analyze spatial data; this analytical ability is the main reason GIS has been adopted by emergency service providers such as police and ambulance services. The full potential of these so-called 3S technologies cannot be realized when each is used alone: integrating GPS and remote sensing techniques with GIS has opened new horizons in modeling earth science activities. Several remote sensing cases are described in this paper, including the Indian Ocean tsunami in 2004, the Mount Mangart landslides, and the Pakistan-India earthquake in 2005.

Keywords: disaster mitigation, GIS, GPS, remote sensing

Procedia PDF Downloads 441
248 The Relationship between Human Neutrophil Elastase Levels and Acute Respiratory Distress Syndrome in Patients with Thoracic Trauma

Authors: Wahyu Purnama Putra, Artono Isharanto

Abstract:

Thoracic trauma is trauma to the thoracic wall or intrathoracic organs, caused by either blunt or penetrating injury. It often impairs ventilation and perfusion through damage to the lung parenchyma, compromising tissue oxygenation, which is one cause of acute respiratory distress syndrome (ARDS). These changes are driven by the release of pro-inflammatory mediators, plasma proteins, and proteases into the alveolar space, together with ongoing edema and oxidative products that ultimately cause severe inhibition of the surfactant system. This study aims to predict the incidence of ARDS from human neutrophil elastase levels, examining the relationship between plasma elastase levels and the incidence of ARDS in thoracic trauma patients in Malang. It is an observational cohort study; data were analyzed with the Pearson correlation test and the receiver operating characteristic (ROC) curve. A significant relationship (p < 0.001, r = -0.988) was found between elastase levels and BGA-3. Patients with elastase levels around 23.79 ± 3.95 went on to develop mild ARDS, those with levels around 57.68 ± 18.55 developed moderate ARDS, and those with levels around 107.85 ± 5.04 were likely to develop severe ARDS. Neutrophil elastase levels thus correlate with the severity of ARDS.
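The ROC analysis used above can be sketched with the Mann-Whitney formulation of the area under the curve: the AUC equals the probability that a randomly chosen positive case scores higher than a randomly chosen negative one. The elastase values below are hypothetical, illustrative numbers, not the study’s data.

```python
def roc_auc(scores_pos, scores_neg):
    """AUC via the Mann-Whitney formulation: the fraction of (positive,
    negative) pairs in which the positive case scores higher (ties = 0.5)."""
    wins = sum((p > n) + 0.5 * (p == n)
               for p in scores_pos for n in scores_neg)
    return wins / (len(scores_pos) * len(scores_neg))

# Hypothetical elastase levels for patients who did / did not develop ARDS.
ards    = [55.0, 61.2, 48.9, 107.1]
no_ards = [21.3, 25.0, 19.8, 30.4]
print(roc_auc(ards, no_ards))  # 1.0: the two groups separate perfectly
```

An AUC of 1.0 corresponds to perfect discrimination, 0.5 to chance; a threshold-based predictor like the elastase cut-offs in the abstract is read off the ROC curve at the desired sensitivity/specificity trade-off.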

Keywords: ARDS, human neutrophil elastase, severity, thoracic trauma

Procedia PDF Downloads 115
247 Prognostic Value of C-Reactive Protein (CRP) in SARS-CoV-2 Infection: A Simplified Biomarker of COVID-19 Severity in Sub-Saharan Africa

Authors: Teklay Gebrecherkos, Mahmud Abdulkader, Tobias Rinke De Wit, Britta C. Urban, Feyissa Chala, Yazezew Kebede, Dawit Welday

Abstract:

Background: C-reactive protein (CRP) levels are a reliable surrogate for interleukin-6 bioactivity, which plays a pivotal role in the pathogenesis of the cytokine storm associated with severe COVID-19; there is a lack of data on the role of CRP as a determinant of COVID-19 severity in the African context. Methods: We determined the longitudinal kinetics of CRP levels in 78 RT-PCR-confirmed COVID-19 patients (49 non-severe and 29 severe cases) and 50 PCR-negative controls. Results: COVID-19 patients had significantly elevated CRP at baseline compared with PCR-negative controls [median 11.1 (IQR: 2.0-127.8) mg/L vs. 0.9 (IQR: 0.5-1.9) mg/L; p=0.0004]. Moreover, severe COVID-19 patients had significantly higher median CRP levels than non-severe cases [166.1 (IQR: 48.6-332.5) mg/L vs. 2.4 (IQR: 1.2-7.6) mg/L; p<0.00001]. Persistently elevated CRP was also seen in patients with comorbidities and in older age groups. Area under the receiver operating characteristic curve (AUC) analysis of CRP levels distinguished PCR-confirmed COVID-19 patients from PCR-negative individuals, with an AUC of 0.77 (95% CI: 0.68-0.84; p=0.001), and clearly distinguished severe from non-severe COVID-19 patients, with an AUC of 0.83 (95% CI: 0.73-0.91). After adjusting for age and the presence of comorbidities, CRP levels above 30 mg/L were significantly associated with an increased risk of developing severe COVID-19 (adjusted relative risk 3.99, 95% CI: 1.35-11.82; p=0.013). Conclusions: Determining CRP levels in COVID-19 patients in African settings may provide a simple, prompt, and inexpensive assessment of severity status at baseline and for monitoring treatment outcomes.
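The relative-risk measure reported above can be illustrated with the crude calculation from a 2x2 table. Note the study reports an *adjusted* relative risk from a model controlling for age and comorbidities; the sketch below shows only the unadjusted version, on hypothetical counts.

```python
def relative_risk(exposed_events, exposed_total,
                  unexposed_events, unexposed_total):
    """Crude relative risk from a 2x2 table:
    risk in the exposed group divided by risk in the unexposed group."""
    risk_exposed = exposed_events / exposed_total
    risk_unexposed = unexposed_events / unexposed_total
    return risk_exposed / risk_unexposed

# Hypothetical counts: severe COVID-19 among patients with baseline
# CRP > 30 mg/L (exposed) vs. CRP <= 30 mg/L (unexposed).
print(relative_risk(20, 30, 10, 48))  # (20/30) / (10/48) = 3.2
```

Adjustment for confounders, as done in the paper, would require a regression model (e.g. log-binomial or modified Poisson) rather than this direct ratio.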

Keywords: CRP, COVID-19, SARS-CoV-2, biomarker

Procedia PDF Downloads 52
246 Predicting the Human Impact of Natural Onset Disasters Using Pattern Recognition Techniques and Rule Based Clustering

Authors: Sara Hasani

Abstract:

This research focuses on sudden-onset natural disasters, characterised as ‘occurring with little or no warning and often cause excessive injuries far surpassing the national response capacities’. Based on panel analysis of the historic record of 4,252 natural onset disasters between 1980 and 2015, a predictive method was developed to estimate the human impact of a disaster (fatalities, injured, homeless) with an error below 3%. The geographical coverage includes every country for which data were available and could be cross-examined against various humanitarian sources; the records were filtered to those 4,252 disasters for which the five predictive variables (disaster type, HDI, DRI, population, and population density) were clearly stated. The procedure combines pattern recognition techniques with rule-based clustering for prediction, with discriminant analysis used to validate the results further. The results indicate a relationship between a disaster’s human impact and the five socio-economic characteristics of the affected country listed above. A framework was therefore put forward that can predict a disaster’s human impact from its severity rank in the early hours after a disaster strikes. The predictions in this model are outlined as worst- and best-case scenarios, which respectively inform the lower and upper ranges of the prediction. The need for such a framework is underlined by the fact that, despite existing research in the literature, a framework for predicting human impact and estimating needs at the time of a disaster is yet to be developed; the present framework can further be used to allocate resources in the response phase of a disaster, when data are scarce.
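The rule-based clustering step can be sketched as follows. The records, variable names, and thresholds here are purely illustrative; they are not the rules derived in the study.

```python
# Minimal sketch: assign historical disasters to severity clusters by simple
# rules on the predictive variables, then report each cluster's impact range
# as a best/worst-case pair, mirroring the two-scenario framing above.
records = [
    {"type": "earthquake", "pop_density": 900, "hdi": 0.5, "fatalities": 3200},
    {"type": "earthquake", "pop_density": 850, "hdi": 0.5, "fatalities": 2900},
    {"type": "storm",      "pop_density": 120, "hdi": 0.9, "fatalities": 12},
    {"type": "storm",      "pop_density": 150, "hdi": 0.9, "fatalities": 20},
]

def severity_rule(rec):
    """Illustrative rule: dense population plus low HDI -> high impact."""
    if rec["pop_density"] > 500 and rec["hdi"] < 0.7:
        return "high-impact"
    return "low-impact"

clusters = {}
for rec in records:
    clusters.setdefault(severity_rule(rec), []).append(rec["fatalities"])

# Best-case / worst-case prediction per cluster from the historical range.
for label, fatalities in clusters.items():
    print(label, "best:", min(fatalities), "worst:", max(fatalities))
```

A new disaster would be routed through the same rules in the early hours after impact, inheriting its cluster’s historical best/worst range as the prediction interval.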

Keywords: disaster management, natural disaster, pattern recognition, prediction

Procedia PDF Downloads 133
245 Global Navigation Satellite System and Precise Point Positioning as Remote Sensing Tools for Monitoring Tropospheric Water Vapor

Authors: Panupong Makvichian

Abstract:

The Global Navigation Satellite System (GNSS) is now a common technology that improves navigation in everyday life, and it is increasingly being employed as an accurate atmospheric sensor as well. Meteorology is a practical application of GNSS that goes unnoticed in the background of daily life. GNSS Precise Point Positioning (PPP) is a positioning method that requires data from a single dual-frequency receiver together with precise information about satellite positions and satellite clocks; careful attention must also be paid to mitigating various error sources. All of these data are combined in a sophisticated mathematical algorithm. This work demonstrates how GNSS and the PPP method can provide high-precision estimates such as 3D positions and zenith tropospheric delays (ZTDs). ZTDs combined with pressure and temperature information allow the water vapor in the atmosphere to be estimated as precipitable water vapor (PWV). If the process is replicated for a network of GNSS sensors, thematic maps can be created from which water content information can be extracted at any location within the network area. All of this is possible thanks to advances in GNSS data processing, so GNSS data can be used for climatic trend analysis and for acquiring further knowledge about the atmospheric water content.
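The ZTD-to-PWV conversion mentioned above is commonly done by subtracting a modelled hydrostatic delay and scaling the remaining wet delay. The sketch below uses the Saastamoinen hydrostatic model and the Bevis-style conversion factor; the constants (k3 = 3.739e5 K²/hPa, k2' = 22.1 K/hPa, Rv = 461.5 J kg⁻¹ K⁻¹) are one common published set, and the paper may use different values.

```python
import math

def zhd_saastamoinen(pressure_hpa, lat_deg, height_km):
    """Zenith hydrostatic delay (m) from surface pressure, Saastamoinen model."""
    return 0.0022768 * pressure_hpa / (
        1 - 0.00266 * math.cos(2 * math.radians(lat_deg))
        - 0.00028 * height_km)

def pwv_from_ztd(ztd_m, pressure_hpa, tm_kelvin, lat_deg=0.0, height_km=0.0):
    """Precipitable water vapor (mm) from a PPP-estimated ZTD.

    ZWD = ZTD - ZHD; PWV = Pi * ZWD, where Pi depends on the
    water-vapor-weighted mean temperature Tm of the atmosphere."""
    zwd = ztd_m - zhd_saastamoinen(pressure_hpa, lat_deg, height_km)
    k = (3.739e5 / tm_kelvin + 22.1) / 100.0   # K/hPa -> K/Pa
    pi_factor = 1e6 / (1000.0 * 461.5 * k)     # dimensionless, ~0.15
    return pi_factor * zwd * 1000.0            # metres -> millimetres

# Example: ZTD of 2.45 m at sea level, 1013 hPa, mean temperature 275 K.
print(round(pwv_from_ztd(2.45, 1013.0, 275.0, lat_deg=15.0), 1))
```

Tm itself is usually estimated from surface temperature or a numerical weather model, which is where the "pressure and temperature information" in the abstract enters.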

Keywords: GNSS, precise point positioning, Zenith tropospheric delays, precipitable water vapor

Procedia PDF Downloads 173
244 Aluminum-Based Hexaferrite and Reduced Graphene Oxide: A Suitable Microwave Absorber for Microwave Applications

Authors: Sanghamitra Acharya, Suwarna Datar

Abstract:

The extensive use of digital and smart communication creates prolonged exposure to unwanted electromagnetic (EM) radiation. This harmful radiation not only causes malfunctions in nearby electronic gadgets but also severely affects human beings, so a suitable microwave absorbing material (MAM) has become an urgent need in stealth and radar technology. Aluminum-based hexaferrite was first prepared by the sol-gel technique, the carbon-derived composite was prepared by a simple one-pot chemical reduction method, and composite films of poly(vinylidene fluoride) (PVDF) were finally prepared by a simple gel-casting technique. The present work shows that an aluminum-based hexaferrite phase conjugated with graphene in a PVDF matrix is a suitable candidate in both the commercially important X and Ku bands. The structure and morphology were characterized by X-ray diffraction (XRD), field-emission scanning electron microscopy (FESEM), and Raman spectroscopy, which confirm that 30-40 nm particles are well decorated over the graphene sheets. Magnetic force microscopy (MFM) and conductive force microscopy (CFM) studies further confirm the magnetic and conducting nature of the composite. Finally, the shielding effectiveness (SE) of the composite film was measured with a vector network analyzer (VNA) in the X- and Ku-band frequency ranges and found to exceed 30 dB and 40 dB, respectively. The as-prepared composite films are excellent microwave absorbers.
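The SE figures quoted above follow from the VNA’s measured transmission coefficient: total shielding effectiveness is conventionally computed as SE(dB) = -20·log10|S21|, so 30 dB means only 0.1% of incident power is transmitted. A minimal sketch of that conversion:

```python
import math

def shielding_effectiveness_db(s21_linear):
    """Total EMI shielding effectiveness (dB) from the linear-magnitude
    transmission coefficient S21 measured on a VNA."""
    return -20.0 * math.log10(abs(s21_linear))

def transmitted_power_fraction(se_db):
    """Fraction of incident power that passes through a shield of given SE."""
    return 10 ** (-se_db / 10)

print(shielding_effectiveness_db(0.01))        # |S21| = 0.01 -> 40 dB
print(transmitted_power_fraction(30.0))        # 30 dB -> 0.001 of power
```

Separating the absorption and reflection contributions additionally requires |S11|, which the VNA measures alongside |S21|.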

Keywords: carbon nanocomposite, microwave absorbing material, electromagnetic shielding, hexaferrite

Procedia PDF Downloads 155
243 Protecting Migrants at Risk as Internally Displaced Persons: State Responses to Foreign Immigrants Displaced by Natural Disasters in Thailand, The United States, and Japan

Authors: Toake Endoh

Abstract:

Cross-border migration of people is a critical driver of sustainable economic development in the Asia-Pacific region. Meanwhile, the region is susceptible to mega-scale natural disasters, such as tsunamis, earthquakes, and typhoons. When migrants are stranded in a foreign country by a disaster, who should be responsible for their safety and security? What legal or moral foundation is there to advocate for the protection and assistance of “migrants at risk (M@R)”? How can states practice “good governance” in their response to the displacement of foreign migrants? This paper asks how foreign migrants displaced by a natural disaster can be protected under international law and proposes protective actions to be taken by migrant-receiving governments. First, the paper discusses the theoretical foundation for the protection of M@R and argues that nation-states are charged with the responsibility to protect at-risk foreigners as “internally displaced persons” in light of the United Nations’ Guiding Principles on Internal Displacement (1998). Second, through case studies of the Kobe earthquake in Japan (1995), the tsunami in Thailand (2004), and Hurricane Katrina in the U.S. (2005), the paper evaluates how effectively (or poorly) institutions and state actors addressed the specific vulnerability of M@R in these crises.

Keywords: internal displaced persons, natural disaster, international migration, responsibility to protect

Procedia PDF Downloads 290
242 GIS Database Creation for Impacts of Domestic Wastewater Disposal on Bida Town, Niger State, Nigeria

Authors: Ejiobih Hyginus Chidozie

Abstract:

A Geographic Information System (GIS) is a configuration of computer hardware and software specifically designed to effectively capture, store, update, manipulate, analyse, and display all forms of spatially referenced information. The GIS database is referred to as the heart of a GIS: it holds location data, attribute data, and the spatial relationships between objects and their attributes. Sewage and wastewater management have assumed increased importance lately as a result of general concern expressed worldwide about pollution of the environment: contamination of the atmosphere, rivers, lakes, oceans, and groundwater. In this research, a GIS database was created to study the impacts of domestic wastewater disposal methods on Bida town, Niger State, as a model for investigating similar impacts in other Nigerian cities. Results from the GIS database are very useful to decision-makers and researchers. Bida town was subdivided into four regions, eight zones, and 24 sectors based on the prevailing natural morphology of the town. A GPS receiver and a structured questionnaire were used to collect location information and attribute data from 240 households in the study area, and domestic wastewater samples were collected from the 24 sectors for laboratory analysis. ArcView 3.2a GIS software was used to create the GIS databases for the ecological, health, and socioeconomic impacts of domestic wastewater disposal methods in Bida town.

Keywords: environment, GIS, pollution, software, wastewater

Procedia PDF Downloads 395
241 Intelligent Fault Diagnosis for the Connection Elements of Modular Offshore Platforms

Authors: Jixiang Lei, Alexander Fuchs, Franz Pernkopf, Katrin Ellermann

Abstract:

Within the Space@Sea project, funded by the Horizon 2020 programme, an island consisting of multiple platforms was designed. The platforms are connected by ropes and fenders, and this connection is critical to the safety of the whole system. Fault detection systems are therefore investigated that could detect early warning signs of a possible failure in the connection elements. Previously, a model-based method using an extended Kalman filter was developed to detect reductions in rope stiffness. This method detected several types of faults reliably, but other types were much more difficult to detect. Furthermore, the model-based method is sensitive to environmental noise: when the wave height is low, a long time is needed to detect a fault and the accuracy is not always satisfactory. A more accurate and robust technique is therefore needed that can detect all rope faults under a wide range of operational conditions. Building on this work on the Space@Sea design, we introduce a fault diagnosis method based on deep neural networks. Our method can not only detect rope degradation from the acceleration data of each platform but also estimate the contributions of the individual acceleration sensors using methods from explainable AI. To adapt to different operational conditions, the domain adaptation technique DANN (domain-adversarial neural network) is applied. The proposed model accurately estimates rope degradation under a wide range of environmental conditions and helps users understand the relationship between the output and the contribution of each acceleration sensor.
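The core mechanism of DANN is a gradient reversal layer between the feature extractor and a domain classifier: the forward pass is the identity, but on the backward pass the gradient is multiplied by -λ, so the features are trained to *confuse* the domain classifier and thereby become domain-invariant. A minimal, framework-free sketch of just that layer (λ and the vectors are illustrative):

```python
import numpy as np

class GradientReversal:
    """DANN gradient reversal layer: identity on the forward pass,
    gradient scaled by -lambda on the backward pass."""
    def __init__(self, lam=1.0):
        self.lam = lam

    def forward(self, x):
        return x                      # features pass through unchanged

    def backward(self, grad_output):
        return -self.lam * grad_output  # flip and scale the gradient

grl = GradientReversal(lam=0.5)
x = np.array([1.0, -2.0])
assert np.allclose(grl.forward(x), x)
print(grl.backward(np.array([0.2, 0.4])))  # [-0.1 -0.2]
```

In the full method, the same feature extractor feeds both the fault/degradation head (normal gradients) and the domain head (reversed gradients), which is what lets one trained model cover a wide range of sea states.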

Keywords: fault diagnosis, deep learning, domain adaptation, explainable AI

Procedia PDF Downloads 155
240 Electroencephalography (EEG) Analysis of Alcoholic and Control Subjects Using Multiscale Permutation Entropy

Authors: Lal Hussain, Wajid Aziz, Sajjad Ahmed Nadeem, Saeed Arif Shah, Abdul Majid

Abstract:

Brain electrical activity, as reflected in the electroencephalogram (EEG), has been analyzed and used for diagnosis with various techniques. Among them, measures of complexity, nonlinearity, disorder, and unpredictability play a vital role because of the nonlinear interconnections between functional and anatomical subsystems that emerge in the brain in the healthy state and during various diseases. Alcohol abuse has many social and economic consequences, including impairments of memory, decision-making, and concentration; alcoholism not only damages the brain but is also associated with emotional, behavioral, and cognitive impairments affecting both the white and gray matter. A recently developed signal analysis method, multiscale permutation entropy (MPE), is proposed here to estimate the complexity of long-range temporally correlated EEG time series of alcoholic and control subjects, acquired from the UCI Machine Learning Repository, and the results are compared with multiscale sample entropy (MSE). In MPE, a coarse-grained series is first generated, and the permutation entropy (PE) is computed for each coarse-grained time series for the electrodes O1, O2, C3, C4, F2, F3, F4, F7, F8, Fp1, Fp2, P3, P4, T7, and T8. For each electrode, MPE yields more significant values than MSE, as well as correspondingly larger mean rank differences. Likewise, the ROC curves and the areas under them give better separation for each electrode with MPE than with MSE.
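The two-step procedure described above (coarse-graining, then permutation entropy per scale) can be sketched as follows; the embedding order, delay, and number of scales are illustrative defaults, not necessarily those used in the study.

```python
import math

def coarse_grain(series, scale):
    """Non-overlapping window averages of the series at the given scale."""
    n = len(series) // scale
    return [sum(series[i * scale:(i + 1) * scale]) / scale for i in range(n)]

def permutation_entropy(series, order=3, delay=1):
    """Normalized permutation entropy over ordinal patterns of given order."""
    counts = {}
    for i in range(len(series) - (order - 1) * delay):
        window = series[i:i + order * delay:delay]
        pattern = tuple(sorted(range(order), key=window.__getitem__))
        counts[pattern] = counts.get(pattern, 0) + 1
    total = sum(counts.values())
    h = -sum(c / total * math.log(c / total) for c in counts.values())
    return h / math.log(math.factorial(order))  # normalize to [0, 1]

def multiscale_permutation_entropy(series, max_scale=3, order=3):
    """PE of the coarse-grained series at scales 1..max_scale."""
    return [permutation_entropy(coarse_grain(series, s), order)
            for s in range(1, max_scale + 1)]

# A monotone ramp has a single ordinal pattern, so entropy is zero at
# every scale; an irregular signal gives values closer to 1.
print(multiscale_permutation_entropy(list(range(30))))
```

For the EEG comparison in the abstract, this computation would be repeated per electrode for both subject groups, and the per-scale values compared with rank tests and ROC analysis.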

Keywords: electroencephalogram (EEG), multiscale permutation entropy (MPE), multiscale sample entropy (MSE), permutation entropy (PE), Mann-Whitney test, receiver operating characteristic (ROC) curve, complexity measure

Procedia PDF Downloads 465