Search results for: missile warning receiver
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 572


212 Communication Aesthetics of Techno-Scenery and Lighting in Bolanle Austen-Peters Queen Moremi the Musical

Authors: Badeji Adebayo John

Abstract:

Technology has contributed immensely to every aspect of human endeavor; it has not only made work easier but also created exhilarating impressions in people's minds. Theatre is not exempt from this multifaceted influence. Theatre performances in the contemporary era have benefited from technology to such an extent that audiences leave with unforgettable experiences. Among the technological advancements that have amplified the aesthetics of theatre performances are techno-scenery (3D mapping) and lighting. In view of this, the objective of this study is to explore how techno-scenery and lighting technologies were used to communicate messages in the performance of Queen Moremi the Musical. To this end, the participant-observation method and content analysis are adopted, and Berlo’s model of communication is employed to explain the communicative aesthetics of these theatre technologies in the performance. Techno-scenery and lighting are communication media modifiers that facilitate the audience's comprehension of the messages in the performance of Queen Moremi the Musical. They also create clear motion pictures of the setting, which the performers cannot communicate through acting, dance and song, easing the audience's decoding of the messages the performers are sending. Therefore, the consistent incorporation of these technologies into theatre performances will facilitate an easy flow of communication between the performers (the sender), the performance (the message) and the audience (the receiver).

Keywords: communication, aesthetics, techno-scenery, lighting, musical

Procedia PDF Downloads 59
211 Designing Electronic Kanban in Assembly Line Tailboom at XYZ Corp to Reduce Lead Time

Authors: Nadhifah A. Nugraha, Dida D. Damayanti, Widia Juliani

Abstract:

Airplane manufacturing is growing along with increasing demand from consumers. The helicopter tail, called the Tailboom, is a product of the helicopter division at XYZ Corp, where the Tailboom assembly line is a pull system. Based on observations of existing conditions at XYZ Corp, production is still unable to meet consumer demand; the actual lead time is greater than the plan agreed upon with the consumers. In the assembly process, each workstation experiences a lack of the parts and components needed for assembly. This happens because of delays in getting the required part information and because there is no warning about the availability of needed parts, which leaves some parts unavailable in the assembly warehouse. The lack of parts and components from the previous workstation causes the assembly process to stop, and the assembly line also stops at the next station. As a result, production runs late and off schedule. Resolving these problems requires a controlling process: controlling the assembly line so that all components and subassemblies arrive in the right amount and at the right time. This study applies one of the Just-In-Time tools, namely Kanban, and adds automation so that the communication line becomes an efficient and effective electronic Kanban. The problem can be solved by reducing non-value-added time, such as waiting time and idle time. The proposed controls for the Tailboom assembly line result in a smooth assembly line without waiting, reduced lead time, and production on the schedule agreed with the consumers.

Keywords: kanban, e-Kanban, lead time, pull system

Procedia PDF Downloads 88
210 Driver Readiness in Autonomous Vehicle Take-Overs

Authors: Abdurrahman Arslanyilmaz, Salman Al Matouq, Durmus V. Doner

Abstract:

Level 3 autonomous vehicles are able to take full responsibility for the control of the vehicle unless a system boundary is reached or a system failure occurs, in which case the driver is expected to take over control. When this happens, the driver is often not aware of the traffic situation or is engaged in a secondary task. Factors affecting the duration and quality of take-overs in these situations include secondary task type and nature, traffic density, take-over request (TOR) time, and TOR warning type and modality. However, to the best of the authors’ knowledge, no prior study has examined the time buffer for TORs when a system failure occurs immediately before intersections. The first objective of this study is therefore to investigate the effect of time buffer (3 and 7 seconds) on the duration and quality of take-overs when a system failure occurs just prior to intersections. In addition, eye-tracking has become one of the most popular methods of reporting what individuals view, in what order, for how long, and how often, and it has been utilized in driving simulations with various objectives. However, to the authors’ knowledge, no study has compared drivers’ eye gaze behavior under the two different time buffers to examine drivers’ attention to and comprehension of salient information. The second objective is to understand drivers’ attention to, and comprehension of, salient traffic-related information presented on different parts of the dashboard and on the road.

Keywords: autonomous vehicles, driving simulation, eye gaze, attention, comprehension, take-over duration, take-over quality, time buffer

Procedia PDF Downloads 108
209 The Ontological Memory in Bergson as a Conceptual Tool for the Analysis of the Digital Conjuncture

Authors: Douglas Rossi Ramos

Abstract:

The current digital conjuncture, called by some authors the 'Internet of Things' (IoT), 'Web 2.0' or even 'Web 3.0', consists of a network that encompasses any communication of objects and entities, such as data, information, technologies, and people. At this juncture, especially characterized by an "object socialization," communication can no longer be represented as a simple informational flow of messages from a sender, crossing a channel or medium, and reaching a receiver. The idea of communication must therefore be thought of more broadly, so that the communicative process can be analyzed in terms of interactions between humans and nonhumans. To think about this complexity, a communicative process that encompasses both humans and other communicating beings or entities (objects and things), it is necessary to constitute a new epistemology of communication and to rethink concepts commonly attributed to humans, such as 'memory.' This research aims to contribute to this epistemological constitution through a discussion of the notion of memory according to the complex ontology of Henri Bergson. Among the results (the notion of memory in Bergson presents itself as a conceptual tool for the analysis of posthumanism and the anthropomorphic conjuncture of the new digital advent), there emerged the need to think about an ontological memory, analyzed as a being in itself (the being-in-itself of memory), as a strategy for understanding the forms of interaction and communication that constitute the new digital conjuncture, in which communicating beings or entities tend to interact with each other. Rethinking the idea of communication beyond the dimension of transmission in informative sequences paves the way for an ecological perspective on the condition of digital dwelling.

Keywords: communication, digital, Henri Bergson, memory

Procedia PDF Downloads 134
208 Comparative Diagnostic Performance of Diffusion-Weighted Imaging Combined with Microcalcifications on Mammography for Discriminating Malignant from Benign BI-RADS 4 Lesions with the Kaiser Score

Authors: Wangxu Xia

Abstract:

BACKGROUND: BI-RADS 4 lesions raise the possibility of malignancy and warrant further clinical and radiologic work-up. This study aimed to evaluate the predictive performance of diffusion-weighted imaging (DWI) and microcalcifications on mammography for predicting malignancy of BI-RADS 4 lesions. In addition, the predictive performance of DWI combined with microcalcifications was compared with the Kaiser score. METHODS: Between January 2021 and June 2023, 144 patients with 178 BI-RADS 4 lesions who underwent conventional MRI, DWI, and mammography were included. The lesions were dichotomized into benign or malignant according to the pathological results from core needle biopsy or surgical mastectomy. DWI was performed with b values of 0 and 800 s/mm2 and analyzed using the apparent diffusion coefficient, and a Kaiser score > 4 was considered to suggest malignancy. The diagnostic performance of each test was evaluated with the receiver operating characteristic (ROC) curve. RESULTS: The area under the curve (AUC) for DWI was significantly higher than that of mammography (0.86 vs 0.71, P<0.001) but comparable with that of the Kaiser score (0.86 vs 0.84, P=0.58). However, the AUC for DWI combined with mammography was significantly higher than that of the Kaiser score (0.93 vs 0.84, P=0.007). The sensitivity for discriminating malignant from benign BI-RADS 4 lesions was highest (89%) for the Kaiser score, but the highest specificity (83%) was achieved with DWI combined with mammography. CONCLUSION: DWI combined with microcalcifications on mammography can discriminate malignant BI-RADS 4 lesions from benign ones with a high AUC and specificity. However, the Kaiser score had better sensitivity.
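The AUC comparisons above rest on the ROC curve; the AUC is equivalent to the Mann-Whitney probability that a randomly chosen malignant lesion scores higher than a benign one. The sketch below illustrates that computation on synthetic scores (the data, score names, and noise levels are illustrative assumptions, not the study's dataset):

```python
import numpy as np

def auc(y_true, scores):
    """ROC AUC via the Mann-Whitney statistic: the probability that a
    randomly chosen positive case scores higher than a negative one."""
    pos = scores[y_true == 1]
    neg = scores[y_true == 0]
    greater = (pos[:, None] > neg[None, :]).mean()
    ties = (pos[:, None] == neg[None, :]).mean()
    return greater + 0.5 * ties

rng = np.random.default_rng(0)
y = rng.integers(0, 2, size=200)            # 1 = malignant, 0 = benign (synthetic)
dwi = y + rng.normal(0.0, 0.8, size=200)    # stand-in for a DWI-derived score
mammo = y + rng.normal(0.0, 1.5, size=200)  # stand-in for a mammography score

print(f"AUC DWI: {auc(y, dwi):.2f}, AUC combined: {auc(y, dwi + mammo):.2f}")
```

Comparing AUCs between correlated tests on the same lesions, as the study does, would additionally require a paired test such as DeLong's method.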

Keywords: MRI, DWI, mammography, breast disease

Procedia PDF Downloads 37
207 The Sea Striker: The Relevance of Small Assets Using an Integrated Conception with Operational Performance Computations

Authors: Gaëtan Calvar, Christophe Bouvier, Alexis Blasselle

Abstract:

This paper presents the Sea Striker, a compact hydrofoil designed to address some of the issues raised by the recent evolution of naval missions, threats and operation theatres in modern warfare. Able to perform a wide range of operations, the Sea Striker is a 40-meter stealth surface combatant equipped with a gas turbine and aft and forward foils to reach high speeds. The Sea Striker's stealthiness derives from the combination of a composite structure, its exterior design, and the advanced integration of sensors. The ship is fitted with a powerful and adaptable combat system, ensuring a versatile and efficient response to modern threats. Lightly manned with a core crew of 10, the hydrofoil is highly automated and can be remotely piloted for special forces operations or transit. This kind of ship is not new: it has been used in the past by different navies, for example by the US Navy with the USS Pegasus. Nevertheless, recent evolutions in science and technology on the one hand, and the emergence of new missions, threats and operation theatres on the other, put its concept forward as an answer to today's operational challenges. Indeed, even if multiple opinions and analyses can be offered regarding modern warfare and naval surface operations, general tendencies can be drawn, such as the major increase in the types, ranges and, more generally, capabilities of sensors and weapons; the emergence of new versatile and evolving threats and enemies, such as asymmetric groups, drone swarms or hypersonic missiles; and the growing number of operation theatres located in coastal and shallow waters. This research comprised a complete study of the ship followed by several operational performance computations in order to justify the relevance of using ships like the Sea Striker in naval surface operations.
For the selected scenarios, the conception process enabled the performance, namely a “Measure of Efficiency” in the NATO framework, to be measured for two different kinds of models: a centralized, classic model using large and powerful ships, and a distributed model relying on several Sea Strikers. After this stage, a comparison of the two models was performed. Lethal, agile, stealthy, compact and fitted with a complete set of sensors, the Sea Striker is a new major player in modern warfare and constitutes a very attractive option between the naval unit and the combat helicopter, enabling high operational performance at a reduced cost.

Keywords: surface combatant, compact, hydrofoil, stealth, velocity, lethal

Procedia PDF Downloads 96
206 Applying the Regression Technique for Prediction of Acute Heart Attacks

Authors: Paria Soleimani, Arezoo Neshati

Abstract:

Myocardial infarction is one of the leading causes of death in the world. Some of these deaths occur even before the patient reaches the hospital. Myocardial infarction occurs as a result of impaired blood supply. Because most of these deaths are due to coronary artery disease, awareness of the warning signs of a heart attack is essential. Some heart attacks are sudden and intense, but most start slowly, with mild pain or discomfort; early detection and successful treatment of these symptoms is vital to saving patients. The importance and usefulness of a system designed to assist physicians in the early diagnosis of acute heart attacks is therefore obvious. The purpose of this study is to determine how well a predictive model would perform based only on patient-reportable clinical history factors, without using diagnostic tests or physical exams. This type of prediction model might have application outside the hospital setting, giving patients accurate advice that influences them to seek care in appropriate situations. For this purpose, data were collected on 711 heart patients in Iranian hospitals, and 28 clinical attributes that patients can report themselves were studied. Three logistic regression models were built on the basis of the 28 features to predict the risk of heart attack. The best logistic regression model in terms of performance had a C-index of 0.955 and an accuracy of 94.9%. The variables severe chest pain, back pain, cold sweats, shortness of breath, nausea, and vomiting were selected as the main features.
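The modeling step described above, logistic regression on patient-reportable symptoms, can be sketched as follows. Everything here is a synthetic illustration (the symptom names, coefficients, and sample are our assumptions, not the study's 711-patient dataset), fitted with plain gradient descent rather than the authors' software:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 1000
# Hypothetical patient-reportable binary symptoms (1 = present):
# columns stand in for chest pain, cold sweats, nausea.
X = rng.integers(0, 2, size=(n, 3)).astype(float)
true_w = np.array([2.0, 1.0, 0.5])                   # assumed true effects
logits = X @ true_w - 1.5
y = (rng.random(n) < 1 / (1 + np.exp(-logits))).astype(float)

# Fit logistic regression by gradient descent on the mean log-loss gradient.
w = np.zeros(3)
b = 0.0
for _ in range(2000):
    p = 1 / (1 + np.exp(-(X @ w + b)))
    w -= 0.5 * (X.T @ (p - y) / n)
    b -= 0.5 * (p - y).mean()

pred = 1 / (1 + np.exp(-(X @ w + b))) > 0.5
accuracy = (pred == y.astype(bool)).mean()
print(f"learned weights: {np.round(w, 2)}, accuracy: {accuracy:.2f}")
```

The fitted weights recover the relative importance of the symptoms, mirroring how the study identifies severe chest pain and similar variables as the main features.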

Keywords: coronary heart disease, acute heart attacks, prediction, logistic regression

Procedia PDF Downloads 431
205 Working Memory Growth from Kindergarten to First Grade: Considering Impulsivity, Parental Discipline Methods and Socioeconomic Status

Authors: Ayse Cobanoglu

Abstract:

Working memory can be defined as a workspace that holds and regulates active information in mind. This study investigates individual changes in children's working memory from kindergarten to first grade. Its main purpose is to determine whether parental discipline methods and children's impulsive/overactive behaviors affect the initial status and growth rate of children's working memory, controlling for gender, minority status, and socioeconomic status (SES). A linear growth curve model with the first four waves of the Early Childhood Longitudinal Study-Kindergarten Cohort of 2011 (ECLS-K:2011) is used to analyze the individual growth of children's working memory longitudinally (N=3915). Results revealed significant variation among students both in their initial status in the kindergarten fall semester and in their growth rate during the first two years of schooling. While minority status, SES, and children's overactive/impulsive behaviors influenced children's initial status, only SES and minority status were significantly associated with the growth rate of working memory. Parental discipline methods, such as giving a warning and ignoring the child's negative behavior, are also negatively associated with initial working memory scores. Examining students' working memory growth rates further, students with lower SES, as well as minorities, showed a faster growth pattern during the first two years of schooling. However, the findings on parental disciplinary methods and working memory growth rates were mixed. It can be concluded that schooling helps low-SES minority students develop their working memory.
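A linear growth curve model estimates, for each child, an initial status (intercept) and a growth rate (slope) across the measurement waves, then relates those to covariates such as SES. As a simplified stand-in for that model, the sketch below simulates four waves of scores (all numbers hypothetical, constructed to mirror the reported pattern in which lower-SES children grow faster) and recovers per-child slopes with ordinary least squares:

```python
import numpy as np

rng = np.random.default_rng(7)
n_children, waves = 500, 4          # four waves: K fall through grade 1 spring

ses = rng.normal(0, 1, n_children)                          # standardized SES
intercept = 50 + 3 * ses + rng.normal(0, 2, n_children)     # initial status
slope = 5 - 1 * ses + rng.normal(0, 1, n_children)          # growth per wave
# (lower-SES children are given faster growth, as the study reports)

t = np.arange(waves)
scores = intercept[:, None] + slope[:, None] * t + rng.normal(0, 1, (n_children, waves))

# Per-child OLS slope: a simple stand-in for the growth-rate estimates
# that a full linear growth curve model would produce.
tc = t - t.mean()
est_slope = (tc * (scores - scores.mean(axis=1, keepdims=True))).sum(axis=1) / (tc ** 2).sum()

r = np.corrcoef(ses, est_slope)[0, 1]
print(f"correlation between SES and estimated growth rate: {r:.2f}")
```

A proper analysis would fit intercepts and slopes jointly in a multilevel model (e.g., with random effects), rather than the two-stage OLS shortcut used here for clarity.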

Keywords: growth curve modeling, impulsive/overactive behaviors, parenting, working memory

Procedia PDF Downloads 109
204 Outdoor Visible Light Communication Channel Modeling under Fog and Smoke Conditions

Authors: Véronique Georlette, Sebastien Bette, Sylvain Brohez, Nicolas Point, Veronique Moeyaert

Abstract:

Visible light communication (VLC) is a communication technology that belongs to the optical wireless communication (OWC) family. It uses the visible and infrared spectrums to send data. Until now, this technology has mostly been studied for indoor use-cases, but it is sufficiently mature to consider its outdoor potential. The main outdoor challenges are meteorological conditions and the presence of smoke due to fire or pollutants in urban areas. This paper proposes a methodology to assess the robustness of an outdoor VLC system under these conditions. The methodology is put into practice in two realistic scenarios: a VLC bus stop and a VLC streetlight. It consists of computing the power margin available in the system, given all the characteristics of the VLC system and its surroundings. This is done with an outdoor VLC communication channel simulator developed in Python. The simulator quantifies the effects of fog and smoke, using models taken from the environmental and fire engineering literature, as well as the optical power reaching the receiver. These two phenomena impair communication by increasing the total attenuation of the medium. The main conclusion of this paper is that the levels of attenuation due to fog and smoke are of the same order of magnitude, with fog attenuation being the highest at visibilities below 1 km. This is a promising prospect for the deployment of outdoor VLC use-cases in the near future.
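The paper's own simulator is not reproduced here, but the fog contribution to the power margin can be illustrated with the widely used Kruse model from the optical-wireless literature, which ties specific attenuation to meteorological visibility. A minimal sketch, in which all link parameters (powers, sensitivity, distances) are hypothetical values of our choosing:

```python
def fog_attenuation_db_per_km(visibility_km, wavelength_nm=550):
    """Kruse model: specific attenuation of fog in dB/km.

    The exponent q depends on visibility; the common piecewise form is used.
    """
    v = visibility_km
    if v > 50:
        q = 1.6
    elif v > 6:
        q = 1.3
    else:
        q = 0.585 * v ** (1 / 3)
    beta = (3.91 / v) * (wavelength_nm / 550) ** (-q)  # extinction, 1/km
    return 4.343 * beta                                # Np/km -> dB/km

def power_margin_db(tx_power_dbm, rx_sensitivity_dbm, distance_km, visibility_km):
    """Link margin left after fog attenuation (geometric losses omitted)."""
    loss_db = fog_attenuation_db_per_km(visibility_km) * distance_km
    return tx_power_dbm - rx_sensitivity_dbm - loss_db

# A streetlight-to-receiver link of 50 m in moderate fog (1 km visibility):
margin = power_margin_db(tx_power_dbm=20, rx_sensitivity_dbm=-30,
                         distance_km=0.05, visibility_km=1.0)
print(f"remaining margin: {margin:.1f} dB")
```

At 1 km visibility the model gives roughly 17 dB/km of fog attenuation, which over a 50 m link costs under 1 dB, consistent with the paper's conclusion that short outdoor links remain workable.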

Keywords: channel modeling, fog modeling, meteorological conditions, optical wireless communication, smoke modeling, visible light communication

Procedia PDF Downloads 127
203 Correlation between Speech Emotion Recognition Deep Learning Models and Noises

Authors: Leah Lee

Abstract:

This paper examines the correlation between deep learning models and emotions under noise, to see whether or not noise masks emotions. The deep learning models used are a plain convolutional neural network (CNN), an auto-encoder, long short-term memory (LSTM), and Visual Geometry Group-16 (VGG-16). The emotion datasets used are the Ryerson Audio-Visual Database of Emotional Speech and Song (RAVDESS), the Crowd-sourced Emotional Multimodal Actors Dataset (CREMA-D), the Toronto Emotional Speech Set (TESS), and Surrey Audio-Visual Expressed Emotion (SAVEE). To make the dataset four times bigger, the original audio files together with stretch and pitch augmentations are utilized. From the augmented datasets, five different features are extracted as inputs to the models, and eight different emotions are classified. The noise variations are white noise, dog barking, and cough sounds, and the signal-to-noise ratio (SNR) is varied over 0, 20, and 40. In total, for each deep learning model, nine different sets with noise and SNR variations, plus the augmented audio files without any noise, are used in the experiment. To compare the results of the deep learning models, accuracy and the receiver operating characteristic (ROC) are examined.
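Mixing a noise clip into speech at a controlled SNR is the key preprocessing step behind the 0/20/40 variations described above. A minimal sketch (a synthetic tone stands in for a speech clip, and the function name is ours, not from the paper):

```python
import numpy as np

def mix_at_snr(speech, noise, snr_db):
    """Scale `noise` so the mixture has the requested signal-to-noise ratio."""
    speech_power = np.mean(speech ** 2)
    noise_power = np.mean(noise ** 2)
    # Noise power required for the target SNR, then the matching scale factor.
    target_noise_power = speech_power / (10 ** (snr_db / 10))
    scale = np.sqrt(target_noise_power / noise_power)
    return speech + scale * noise

rng = np.random.default_rng(1)
speech = np.sin(2 * np.pi * 220 * np.linspace(0, 1, 16000))  # stand-in clip
noise = rng.normal(0, 1, 16000)                               # white noise

mixed = mix_at_snr(speech, noise, snr_db=20)
achieved = 10 * np.log10(np.mean(speech ** 2) / np.mean((mixed - speech) ** 2))
print(f"achieved SNR: {achieved:.1f} dB")
```

The same routine applies unchanged to recorded noises such as dog barking or coughing; only the `noise` array differs.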

Keywords: auto-encoder, convolutional neural networks, long short-term memory, speech emotion recognition, visual geometry group-16

Procedia PDF Downloads 51
202 The Impact of Regulatory Changes on the Development of Mobile Medical Apps

Authors: M. McHugh, D. Lillis

Abstract:

Mobile applications are used to perform a wide variety of tasks in day-to-day life, ranging from checking email to controlling home heating. Application developers have recognized the potential to transform a smart device, i.e. a mobile phone or a tablet, into a medical device by using a mobile medical application. When initially conceived, these mobile medical applications performed basic functions, e.g. BMI calculation or accessing reference material; however, increasing complexity now offers clinicians and patients a wide range of functionality. As this complexity and functionality increase, so does the potential risk associated with using such applications. Examples include applications that provide the ability to inflate and deflate blood pressure cuffs, as well as applications that use patient-specific parameters to calculate dosage or create a dosage plan for radiation therapy. If an unapproved mobile medical application is marketed by a medical device organization, the organization faces significant penalties, such as receiving an FDA warning letter to cease the prohibited activity, fines, and the possibility of criminal conviction. Regulatory bodies have finalized guidance intended to help mobile application developers establish whether their applications are subject to regulatory scrutiny. However, regulatory controls appear to contradict the approaches taken by mobile application developers, who generally work with short development cycles and very little documentation; as such, these regulations have the potential to stifle further improvement. The research presented in this paper details how, by adopting development techniques such as agile software development, mobile medical application developers can meet regulatory requirements while still fostering innovation.

Keywords: agile, applications, FDA, medical, mobile, regulations, software engineering, standards

Procedia PDF Downloads 341
201 Adding a Few Language-Level Constructs to Improve OOP Verifiability of Semantic Correctness

Authors: Lian Yang

Abstract:

Object-oriented programming (OOP) is the dominant programming paradigm in today’s software industry, and over the past three decades of the Internet revolution it has enabled average software developers to build millions of commercial-strength software applications. On the other hand, the lack of a strict mathematical model and of domain constraint features at the language level has long perplexed the computer science academia and the OOP engineering community. This situation has resulted in inconsistent system quality and hard-to-understand designs in some OOP projects. The difficulties involved in fixing the current situation are also well known. Although the power of OOP lies in its unbridled flexibility and enormously rich data modeling capability, we argue that the ambiguity and the implicit facade surrounding the conceptual model of a class and an object should be eliminated as much as possible. We list five major usages of classes and propose to separate them with new language constructs. Using the well-established theories of sets and finite state machines (FSMs), we propose applying certain simple, generic, and yet effective constraints at the OOP language level in an attempt to address the above-mentioned issues. The goal is to make OOP more theoretically sound, to help programmers uncover warning signs of irregularities and domain-specific issues early in the development stage, and to catch semantic mistakes at runtime, improving the correctness verifiability of software programs. That said, the aim of this paper is more practical than theoretical.

Keywords: new language constructs, set theory, FSM theory, user defined value type, function groups, membership qualification attribute (MQA), check-constraint (CC)

Procedia PDF Downloads 222
200 Predictive Analytics in Oil and Gas Industry

Authors: Suchitra Chnadrashekhar

Abstract:

Information technology, earlier seen as a support function in an organization, has now become a critical utility for managing daily operations. Organizations are processing amounts of data that were unimaginable a few decades ago. This has opened the opportunity for the IT sector to help industries across domains handle data in the most intelligent manner. IT has given the oil and gas industry the leverage to store, manage and process data in the most efficient way possible, deriving economic value from day-to-day operations. Proper synchronization between operational data systems and information technology systems is the need of the hour. Predictive analytics supports oil and gas companies by addressing the challenges of critical equipment performance, life cycle, integrity, and security, and by increasing utilization. Predictive analytics goes beyond early warning by providing insights into the roots of problems. To reach their full potential, oil and gas companies need to take a holistic, systems approach to asset optimization, making functional information available at all levels of the organization so that the right decisions can be made. This paper discusses how the use of predictive analytics in the oil and gas industry is redefining the dynamics of this sector. The paper is supported by real-time data and the evaluation of data for a given oil production asset in an application tool, SAS. The reason for using SAS for our analysis is that it provides an analytics-based framework to improve the uptime, performance and availability of crucial assets while reducing unscheduled maintenance, thus minimizing maintenance-related costs and operational disruptions. With state-of-the-art analytics and reporting, maintenance problems can be predicted before they happen, and root causes determined in order to update processes for future prevention.

Keywords: hydrocarbon, information technology, SAS, predictive analytics

Procedia PDF Downloads 327
199 Flood Hazard Impact Based on Simulation Model of Potential Flood Inundation in Lamong River, Gresik Regency

Authors: Yunita Ratih Wijayanti, Dwi Rahmawati, Turniningtyas Ayu Rahmawati

Abstract:

Gresik is one of the districts in East Java Province, Indonesia. Gresik Regency has three major rivers, namely the Bengawan Solo, Brantas, and Lamong Rivers; the Lamong River is a tributary of the Bengawan Solo. Flood disasters in Gresik Regency are often caused by the overflow of the Lamong River. The losses caused by flooding are very large and certainly detrimental to the affected people. Therefore, to minimize the impact of flooding, preventive action is necessary. Before taking preventive action, however, information is needed about potential inundation areas and water levels at various points, and for this a flood simulation model is required. In this study, the simulation was carried out using a Geographic Information System (GIS) method with the help of Global Mapper software. The approach used in this simulation is topographical, based on Digital Elevation Model (DEM) data, which has been widely used in hydrological research. The results of the flood simulation are the distribution of flood inundation and the water level. The inundation extent serves to determine the area of flooding, with reference to the 50-100 year design flood, while the water level serves to provide early warning information. Both will be very useful for estimating future flood losses in Gresik Regency, so that the Gresik Regency Regional Disaster Management Agency can take precautions before a flood disaster strikes.
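The study's workflow runs in Global Mapper, but conceptually the simplest "bathtub" form of a DEM-based inundation simulation just compares each elevation cell with a simulated water surface. A toy sketch (the DEM values, cell size, and flood level below are entirely hypothetical, not Lamong River data):

```python
import numpy as np

# Hypothetical toy DEM (elevations in metres); a real analysis would load
# a raster covering the Lamong River floodplain instead.
dem = np.array([
    [4.0, 3.5, 3.0, 2.5],
    [3.5, 2.8, 2.2, 2.0],
    [3.0, 2.0, 1.5, 1.2],
    [2.8, 1.8, 1.0, 0.8],
])
cell_area_m2 = 100.0   # assuming 10 m x 10 m cells
flood_level = 2.0      # simulated water surface elevation, metres

inundated = dem < flood_level                        # cells under water
depth = np.where(inundated, flood_level - dem, 0.0)  # water depth per cell

print(f"inundated cells: {inundated.sum()}, "
      f"flooded area: {inundated.sum() * cell_area_m2:.0f} m2, "
      f"max depth: {depth.max():.1f} m")
```

A bathtub model ignores hydraulic connectivity and flow dynamics, so it overstates inundation relative to the GIS tooling the study uses, but the depth grid it produces is the same kind of output: extent for planning, depth for early warning.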

Keywords: flood hazard, simulation model, potential inundation, global mapper, Gresik Regency

Procedia PDF Downloads 65
198 Survey of Hawke's Bay Tourism Based Businesses: Tsunami Understanding and Preparation

Authors: V. A. Ritchie

Abstract:

The loss of life and livelihood experienced after the magnitude 9.3 Sumatra earthquake and tsunami on 26 December 2004 and the magnitude 9 earthquake and tsunami in northeastern Japan on 11 March 2011 has raised global awareness and brought tsunami phenomenology, nomenclature, and representation into sharp focus. At the same time, travel and tourism continue to increase, contributing around 1 in 11 jobs worldwide. This increase in tourism is especially marked in coastal zones, placing pressure on decision-makers to downplay tsunami risks while at the same time providing adequate tsunami warning so that holidaymakers will feel confident enough to visit places of high tsunami risk. This study investigates how well tsunami preparedness messages are getting through to tourism-based businesses in Hawke’s Bay, New Zealand, a region of frequent seismic activity with a high probability of experiencing a nearshore tsunami. The aim of this study is to investigate whether tourism-based businesses are well informed about tsunamis, how well they understand that information, and to what extent their clients are included in awareness-raising and evacuation processes. In high-risk tsunami zones such as Hawke’s Bay, tourism-based businesses face competitive tension between short-term business profitability and longer-term reputational issues related to preventable loss of life from natural hazards such as tsunamis. The study will address ways to implement culturally and linguistically relevant tourist awareness measures without discouraging tourists or being too costly.

Keywords: tsunami risk and response, travel and tourism, business preparedness, cross cultural knowledge transfer

Procedia PDF Downloads 134
197 Applications of Out-of-Sequence Thrust Movement for Earthquake Mitigation: A Review

Authors: Rajkumar Ghosh

Abstract:

This study presents an overview of the many uses of, and approaches for estimating, out-of-sequence thrust movement in earthquake mitigation. It investigates how understanding and forecasting thrust movement during seismic occurrences can support effective earthquake mitigation measures. The review begins by discussing out-of-sequence thrust movement and its importance in earthquake mitigation strategies. It explores how typical techniques for estimating thrust movement may not capture the full complexity of seismic occurrences and emphasizes the benefits of including out-of-sequence data in the analysis. A thorough review of existing research on out-of-sequence thrust movement estimation follows, demonstrating how such movement can be estimated from multiple data sources, including GPS measurements, satellite imagery, and seismic recordings. The study also examines the use of out-of-sequence thrust movement estimates in mitigation measures, investigating how precise estimation can help improve structural design, analyze infrastructure risk, and develop early warning systems, and highlighting the potential advantages of using out-of-sequence data to improve the efficiency of earthquake mitigation techniques. The difficulties and limits of such estimation are addressed as well, including data quality issues, modelling uncertainties, and computational complications; to overcome these obstacles and increase the accuracy and reliability of out-of-sequence thrust movement estimates, the authors recommend topics for additional study. The study is a helpful resource for researchers, engineers, and policymakers in seismic monitoring and earthquake risk assessment, supporting innovations in earthquake mitigation based on a better knowledge of thrust movement dynamics.

Keywords: earthquake mitigation, out-of-sequence thrust, satellite imagery, seismic recordings, GPS measurements

Procedia PDF Downloads 63
196 The Survey of Sports Injuries in Ten Sports

Authors: Najmeh Arabnejad, Mohammad Hossein Yousefi

Abstract:

The risk of injury exists in most sports, and in contact sports injuries are inevitable. Since sports injuries result in financial, physical, physiological and social problems for most athletes and endanger their professional future, studying the occurrence of sports injuries becomes an important issue. Such a study can be conducted from different perspectives, including psychological, pathological, social, and managerial ones. The present study was therefore designed and conducted with the aim of surveying sports injuries in ten sports from 2006 to 2011. This descriptive study was carried out in documentary form: data on sports insurance and on injuries occurring in soccer, volleyball, basketball, handball, badminton, karate, track and field, taekwondo, gymnastics and wrestling were collected from the Sports Medical Board of Kerman Province, the largest province in Iran, and then analyzed. Data were collected through library research. Information on 210,406 insured athletes was analyzed using descriptive statistical indexes (means) in SPSS 20. The findings showed that in most sports, across the years studied, the number of injured male athletes was higher than that of female athletes. Soccer, karate, volleyball, wrestling, handball, taekwondo, gymnastics, basketball, track and field, and badminton had the most injuries, in that order. The number of injured athletes and their ratio to insured ones over the six years were also studied; in general, the ratio of sports injuries increased. This upward trend in sports injuries across different sports, confirmed by the results of this study, is a warning sign of the loss of young athletes and the waste of sporting potential in Iran.

Keywords: sports, sports injuries, survey, Kerman

Procedia PDF Downloads 347
195 Integration of GIS with Remote Sensing and GPS for Disaster Mitigation

Authors: Sikander Nawaz Khan

Abstract:

Natural disasters such as floods, earthquakes, cyclones and volcanic eruptions cause immense losses of property and lives every year. The current status and actual losses from natural hazards can be determined, and predictions for probable future disasters can be made, using remote sensing and mapping technologies. The Global Positioning System (GPS) calculates the exact position of damage and can communicate with wireless sensor nodes embedded in potentially dangerous places. GPS provides emergency responders with precise and accurate locations and related information such as the speed, track, direction and distance of a target object. Remote sensing makes it possible to map damage without physical contact with the target area, and with the addition of more remote sensing satellites and other advancements, early warning systems now operate very efficiently at both local and global scales. High Resolution Satellite Imagery (HRSI), airborne remote sensing and space-borne remote sensing all play a vital role in disaster management. Early on, Geographic Information Systems (GIS) were used to collect, arrange, and map spatial information, but GIS now has the capability to analyze spatial data. This analytical ability is the main reason for its adoption by emergency service providers such as the police and ambulance services. The full potential of these so-called 3S technologies cannot be realized by any of them alone: the integration of GPS and other remote sensing techniques with GIS has opened new horizons in the modeling of earth science activities. Several remote sensing cases, including the Indian Ocean tsunami in 2004, the Mount Mangart landslides, and the Pakistan-India earthquake in 2005, are described in this paper.

Keywords: disaster mitigation, GIS, GPS, remote sensing

Procedia PDF Downloads 445
194 The Relationship between Human Neutrophil Elastase Levels and Acute Respiratory Distress Syndrome in Patients with Thoracic Trauma

Authors: Wahyu Purnama Putra, Artono Isharanto

Abstract:

Thoracic trauma is trauma to the thoracic wall or intrathoracic organs, caused by either blunt or sharp injury. It often impairs ventilation and perfusion through damage to the lung parenchyma, compromising tissue oxygenation, which is one of the causes of acute respiratory distress syndrome (ARDS). These changes are driven by the release of pro-inflammatory mediators, plasmatic proteins, and proteases into the alveolar space, together with ongoing edema and oxidative products that ultimately cause severe inhibition of the surfactant system. This study aims to predict the incidence of ARDS from human neutrophil elastase levels, examining the relationship between plasma elastase levels and the incidence of ARDS in thoracic trauma patients in Malang. It is an observational cohort study; data were analyzed with the Pearson correlation test and a receiver operating characteristic (ROC) curve. There was a significant (p=0.000, r=-0.988) relationship between elastase levels and BGA-3. Patients with elastase levels around 23.79 ± 3.95 went on to experience mild ARDS, those with levels around 57.68 ± 18.55 moderate ARDS, and those with levels around 107.85 ± 5.04 severe ARDS. Neutrophil elastase levels thus correlate with the degree of severity of ARDS.
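The severity bands reported above can be sketched as a simple threshold rule. The cut-offs below are illustrative midpoints between the reported group means, not values stated by the authors, and the measurement unit is assumed:

```python
def ards_severity_from_elastase(elastase_level: float) -> str:
    """Band a plasma neutrophil elastase value into the ARDS severity
    groups reported in the abstract (mild ~23.79, moderate ~57.68,
    severe ~107.85). Cut-offs are illustrative midpoints between the
    reported group means, not author-specified thresholds."""
    if elastase_level < 40.7:    # midpoint of 23.79 and 57.68
        return "mild"
    if elastase_level < 82.8:    # midpoint of 57.68 and 107.85
        return "moderate"
    return "severe"
```

A rule of this form would typically be calibrated against the ROC analysis rather than hand-picked midpoints.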

Keywords: ARDS, human neutrophil elastase, severity, thoracic trauma

Procedia PDF Downloads 118
193 Prognostic Value of C-Reactive Protein (CRP) in SARS-CoV-2 Infection: A Simplified Biomarker of COVID-19 Severity in Sub-Saharan Africa

Authors: Teklay Gebrecherkos, Mahmud Abdulkader, Tobias Rinke De Wit, Britta C. Urban, Feyissa Chala, Yazezew Kebede, Dawit Welday

Abstract:

Background: C-reactive protein (CRP) levels are a reliable surrogate for interleukin-6 bioactivity, which plays a pivotal role in the pathogenesis of the cytokine storm associated with severe COVID-19. There is a lack of data on the role of CRP as a determinant of COVID-19 severity status in the African context. Methods: We determined the longitudinal kinetics of CRP levels in 78 RT-PCR-confirmed COVID-19 patients (49 non-severe and 29 severe cases) and 50 PCR-negative controls. Results: COVID-19 patients had significantly elevated CRP at baseline compared to PCR-negative controls [median 11.1 (IQR: 2.0-127.8) mg/L vs. 0.9 (IQR: 0.5-1.9) mg/L; p=0.0004]. Moreover, severe COVID-19 patients had significantly higher median CRP levels than non-severe cases [166.1 (IQR: 48.6-332.5) mg/L vs. 2.4 (IQR: 1.2-7.6) mg/L; p<0.00001]. Persistently elevated CRP levels were also observed among those with comorbidities and in higher age groups. Analysis of the area under the receiver operating characteristic curve (AUC) showed that CRP levels distinguished PCR-confirmed COVID-19 patients from PCR-negative non-COVID-19 individuals, with an AUC of 0.77 (95% CI: 0.68-0.84; p=0.001), and clearly distinguished severe from non-severe COVID-19 patients, with an AUC of 0.83 (95% CI: 0.73-0.91). After adjusting for age and the presence of comorbidities, CRP levels above 30 mg/L were significantly associated with an increased risk of developing severe COVID-19 [adjusted relative risk 3.99 (95% CI: 1.35-11.82); p=0.013]. Conclusions: Determining CRP levels in COVID-19 patients in African settings may provide a simple, prompt, and inexpensive assessment of severity status at baseline and for monitoring treatment outcomes.
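The AUC statistic used above has a simple rank interpretation: the probability that a randomly chosen severe case has a higher CRP value than a randomly chosen non-severe case. A minimal sketch with hypothetical CRP values (the per-patient data are not given in the abstract):

```python
def roc_auc(pos_scores, neg_scores):
    # Mann-Whitney formulation of the AUC: fraction of (positive, negative)
    # pairs in which the positive case scores higher (ties count half)
    wins = 0.0
    for p in pos_scores:
        for n in neg_scores:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(pos_scores) * len(neg_scores))

# hypothetical CRP values (mg/L) for severe vs. non-severe patients
severe = [166.1, 120.0, 332.5, 48.6, 90.0]
non_severe = [2.4, 1.2, 7.6, 55.0, 5.0]
print(roc_auc(severe, non_severe))  # 0.96 for these illustrative values
```

An AUC of 0.83, as reported, means this ranking holds for about 83% of severe/non-severe patient pairs.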

Keywords: CRP, COVID-19, SARS-CoV-2, biomarker

Procedia PDF Downloads 53
192 Predicting the Human Impact of Natural Onset Disasters Using Pattern Recognition Techniques and Rule Based Clustering

Authors: Sara Hasani

Abstract:

This research focuses on natural sudden-onset disasters, characterised as ‘occurring with little or no warning and often cause excessive injuries far surpassing the national response capacities’. Based on a panel analysis of the historic record of 4,252 natural onset disasters between 1980 and 2015, a predictive method was developed to estimate the human impact of a disaster (fatalities, injured, homeless) with less than 3% error. The geographical coverage includes every country for which data were available and could be cross-examined from various humanitarian sources. The records were filtered down to the 4,252 disasters in which the five predictive variables (disaster type, HDI, DRI, population, and population density) were clearly stated. The procedure combines pattern recognition techniques with rule-based clustering for prediction, with discriminant analysis used to further validate the results. The results indicate a relationship between a disaster’s human impact and the five socio-economic characteristics of the affected country mentioned above. A framework was therefore put forward that can predict a disaster’s human impact from its severity rank in the early hours after a disaster strikes. The predictions in this model are outlined as worst- and best-case scenarios, giving the lower and upper ranges of the prediction. The need for such a predictive framework is highlighted by the fact that, despite the existing research in the literature, a framework for predicting human impact and estimating needs at the time of a disaster has yet to be developed. It can further be used to allocate resources in the response phase of a disaster, when data are scarce.

Keywords: disaster management, natural disaster, pattern recognition, prediction

Procedia PDF Downloads 136
191 Global Navigation Satellite System and Precise Point Positioning as Remote Sensing Tools for Monitoring Tropospheric Water Vapor

Authors: Panupong Makvichian

Abstract:

The Global Navigation Satellite System (GNSS) is nowadays a common technology that supports navigation in everyday life, but it is also increasingly employed as an accurate atmospheric sensor. Meteorology is a practical, if often unnoticed, application of GNSS. GNSS Precise Point Positioning (PPP) is a positioning method that requires data from a single dual-frequency receiver together with precise information about satellite positions and satellite clocks; in addition, various error sources must be carefully mitigated. All of these data are combined in a sophisticated mathematical algorithm. This research demonstrates how GNSS and the PPP method can provide high-precision estimates such as 3D positions or zenith tropospheric delays (ZTDs). ZTDs combined with pressure and temperature information allow us to estimate the water vapor in the atmosphere as precipitable water vapor (PWV). If the process is replicated for a network of GNSS sensors, thematic maps can be created from which water content information can be extracted at any location within the network area. All of this is possible thanks to advances in GNSS data processing, so GNSS data can be used for climatic trend analysis and for acquiring further knowledge about atmospheric water content.
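The ZTD-to-PWV step described above is commonly done by removing a modeled hydrostatic delay and scaling the remaining wet delay. A sketch under standard assumptions (Saastamoinen hydrostatic model, Bevis-style conversion factor); the constants and the weighted mean temperature Tm are typical textbook values, not parameters given in the abstract:

```python
from math import cos, radians

def zhd_saastamoinen(pressure_hpa, lat_deg, height_m):
    # zenith hydrostatic delay (m), Saastamoinen model
    return 0.0022768 * pressure_hpa / (
        1 - 0.00266 * cos(2 * radians(lat_deg)) - 0.28e-6 * height_m
    )

def pwv_from_ztd(ztd_m, pressure_hpa, tm_kelvin, lat_deg=45.0, height_m=0.0):
    """Convert a PPP-estimated ZTD to precipitable water vapor (mm).

    Bevis-style conversion: subtract the modeled hydrostatic delay, then
    scale the wet delay by a factor that depends on the weighted mean
    temperature Tm of the atmosphere. Constants are in SI units (K/Pa).
    """
    zwd = ztd_m - zhd_saastamoinen(pressure_hpa, lat_deg, height_m)  # wet delay (m)
    rho_w, r_v = 1000.0, 461.5          # water density, vapor gas constant
    k2_prime, k3 = 0.1652, 3776.0       # refractivity constants
    pi_factor = 1.0e6 / (rho_w * r_v * (k3 / tm_kelvin + k2_prime))
    return pi_factor * zwd * 1000.0     # m -> mm
```

For a ZTD of 2.4 m at sea-level pressure, this yields a PWV on the order of 15 mm, a plausible mid-latitude value.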

Keywords: GNSS, precise point positioning, Zenith tropospheric delays, precipitable water vapor

Procedia PDF Downloads 177
190 Protecting Migrants at Risk as Internally Displaced Persons: State Responses to Foreign Immigrants Displaced by Natural Disasters in Thailand, The United States, and Japan

Authors: Toake Endoh

Abstract:

Cross-border migration is a critical driver of sustainable economic development in the Asia-Pacific region. Meanwhile, the region is susceptible to mega-scale natural disasters such as tsunamis, earthquakes, and typhoons. When migrants are stranded in a foreign country by a disaster, who should be responsible for their safety and security? What legal or moral foundation is there for advocating the protection and assistance of “migrants at risk” (M@R)? How can states practice “good governance” in their response to the displacement of foreign migrants? This paper asks how foreign migrants displaced by a natural disaster can be protected under international law and proposes protective actions to be taken by migrant-receiving governments. First, the paper discusses the theoretical foundation for the protection of M@R and argues that nation-states are charged with the responsibility to protect at-risk foreigners as “internally displaced persons” in light of the United Nations’ Guiding Principles on Internal Displacement (1998). Second, through case studies of the Kobe earthquake in Japan (1995), the tsunami in Thailand (2004), and Hurricane Katrina in the U.S. (2005), the paper evaluates how effectively (or poorly) institutions and state actors addressed the specific vulnerabilities of M@R in these crises.

Keywords: internal displaced persons, natural disaster, international migration, responsibility to protect

Procedia PDF Downloads 292
189 GIS Database Creation for Impacts of Domestic Wastewater Disposal on Bida Town, Niger State, Nigeria

Authors: Ejiobih Hyginus Chidozie

Abstract:

A Geographic Information System (GIS) is a configuration of computer hardware and software specifically designed to effectively capture, store, update, manipulate, analyse and display all forms of spatially referenced information. The GIS database is referred to as the heart of a GIS: it holds location data, attribute data and the spatial relationships between objects and their attributes. Sewage and wastewater management have assumed increased importance lately as a result of the general concern expressed worldwide about the pollution of the environment: contamination of the atmosphere, rivers, lakes, oceans and groundwater. In this research, a GIS database was created to study the impacts of domestic wastewater disposal methods on Bida town, Niger State, as a model for investigating similar impacts on other cities in Nigeria. Results from such a database are very useful to decision makers and researchers. Bida town was subdivided into four regions, eight zones, and 24 sectors based on the prevailing natural morphology of the town. A GPS receiver and a structured questionnaire were used to collect location information and attribute data from 240 households in the study area, and domestic wastewater samples were collected from the 24 sectors for laboratory analysis. ArcView 3.2a GIS software was used to create the GIS databases for the ecological, health and socioeconomic impacts of domestic wastewater disposal methods in Bida town.

Keywords: environment, GIS, pollution, software, wastewater

Procedia PDF Downloads 400
188 Intelligent Fault Diagnosis for the Connection Elements of Modular Offshore Platforms

Authors: Jixiang Lei, Alexander Fuchs, Franz Pernkopf, Katrin Ellermann

Abstract:

Within the Space@Sea project, funded by the Horizon 2020 program, an island consisting of multiple platforms was designed. The platforms are connected by ropes and fenders, and this connection is critical to the safety of the whole system. Therefore, fault detection systems are investigated that could detect early warning signs of a possible failure in the connection elements. Previously, a model-based method, an Extended Kalman Filter, was developed to detect reductions in rope stiffness. This method detected several types of faults reliably, but other fault types were much more difficult to detect. Furthermore, the model-based method is sensitive to environmental noise: when the wave height is low, a long time is needed to detect a fault and the accuracy is not always satisfactory. A more accurate and robust technique is therefore needed that can detect all rope faults under a wide range of operational conditions. Building on this work on the Space@Sea design, we introduce a fault diagnosis method based on deep neural networks. Our method can not only detect rope degradation from the acceleration data of each platform but also estimate the contributions of the individual acceleration sensors using methods from explainable AI. To adapt to different operational conditions, the domain adaptation technique DANN is applied. The proposed model accurately estimates rope degradation under a wide range of environmental conditions and helps users understand the relationship between the output and the contribution of each acceleration sensor.

Keywords: fault diagnosis, deep learning, domain adaptation, explainable AI

Procedia PDF Downloads 156
187 Electroencephalography (EEG) Analysis of Alcoholic and Control Subjects Using Multiscale Permutation Entropy

Authors: Lal Hussain, Wajid Aziz, Sajjad Ahmed Nadeem, Saeed Arif Shah, Abdul Majid

Abstract:

Brain electrical activity, as reflected in the electroencephalogram (EEG), has been analyzed and diagnosed using various techniques. Among them, measures of complexity, nonlinearity, disorder, and unpredictability play a vital role, owing to the nonlinear interconnections between the functional and anatomical subsystems that emerge in the brain both in the healthy state and during disease. Alcohol abuse causes many social and economic problems, such as memory weakness and impairments of decision making and concentration; alcoholism not only damages the brain but is also associated with emotional, behavioral, and cognitive impairments, degrading the white and gray brain matter. A recently developed signal analysis method, Multiscale Permutation Entropy (MPE), is proposed here to estimate the complexity of long-range temporal correlations in EEG time series of alcoholic and control subjects acquired from the University of California machine learning repository, and the results are compared with Multiscale Sample Entropy (MSE). With MPE, a coarse-grained series is first generated, and the permutation entropy (PE) is then computed for each coarse-grained time series at the electrodes O1, O2, C3, C4, F2, F3, F4, F7, F8, Fp1, Fp2, P3, P4, T7, and T8. For each electrode, MPE yields more significant values than MSE, as well as larger mean rank differences. Likewise, the ROC curve and the area under the ROC show better separation for each electrode with MPE than with MSE.
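The coarse-graining and permutation-entropy steps described above can be sketched in a few lines. This is a generic MPE implementation, not the authors' code; the embedding dimension m and the set of scales are assumed parameters:

```python
from collections import Counter
from math import factorial, log

def coarse_grain(x, scale):
    # average consecutive, non-overlapping windows of length `scale`
    n = len(x) // scale
    return [sum(x[i * scale:(i + 1) * scale]) / scale for i in range(n)]

def permutation_entropy(x, m=3):
    # Shannon entropy of the ordinal-pattern distribution, normalised
    # by log(m!) so the result lies in [0, 1]
    patterns = Counter(
        tuple(sorted(range(m), key=lambda k: x[i + k]))
        for i in range(len(x) - m + 1)
    )
    total = sum(patterns.values())
    h = -sum((c / total) * log(c / total) for c in patterns.values())
    return h / log(factorial(m))

def multiscale_permutation_entropy(x, m=3, max_scale=5):
    # one PE value per coarse-graining scale, as in MPE
    return [permutation_entropy(coarse_grain(x, s), m)
            for s in range(1, max_scale + 1)]
```

A strictly monotone signal has a single ordinal pattern and hence zero entropy, while an irregular signal approaches 1; MPE tracks how this changes across temporal scales.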

Keywords: electroencephalogram (EEG), multiscale permutation entropy (MPE), multiscale sample entropy (MSE), permutation entropy (PE), mann whitney test (MMT), receiver operator curve (ROC), complexity measure

Procedia PDF Downloads 469
186 Polarisation in Latin America: Examining the Role of Social Media in Ideological Positioning Based on 2018 Census Data

Authors: Sarah Ledoux

Abstract:

This paper analyses the quantitative effects of political content consumption in social media platforms on self-reported ideological preference across the Latin American region. Initially praising the democratic potential of the internet and its social networking websites, digital politics scholars have transitioned their discourse to warning against the undemocratic side-effects it cultivates, such as hate speech, filter bubbles, and ideological polarisation. Holding technology solely responsible for political trends worldwide is an oversimplification of the factors influencing social change. Nonetheless, widespread use of social media in new democracies raises questions on the reproduction of recent trends that have been observed in the US and Western Europe. Through the analysis of ordered logistic regressions on data from the 2018 AmericasBarometer survey, this study examines the extent to which the relationship between the consumption of political content on social media is related to ideological polarisation in Latin America. The findings indicate that there is a close link between consumption of political information on social media, specifically on Facebook and WhatsApp, and ideological positioning on the extremes of the political left- and right-wings. This relation holds when controlling for individual-level demographic and attitudinal factors, as well as country-level effects. These results demonstrate with empirical evidence that viewing political content on social media has a significant positive effect on the likelihood that citizens position themselves on the extreme ends of the left-right ideological spectrum and implies that political polarisation is a phenomenon that accompanies politically driven social media use.

Keywords: Latin America, polarisation, political consumption, political ideology, social media, survey

Procedia PDF Downloads 126
185 Research on Pilot Sequence Design Method of Multiple Input Multiple Output Orthogonal Frequency Division Multiplexing System Based on High Power Joint Criterion

Authors: Linyu Wang, Jiahui Ma, Jianhong Xiang, Hanyu Jiang

Abstract:

For pilot design in the sparse channel estimation model of Multiple Input Multiple Output Orthogonal Frequency Division Multiplexing (MIMO-OFDM) systems, the observation matrices constructed according to the matrix cross-correlation criterion, the total correlation criterion and other optimization criteria are not optimal, resulting in inaccurate channel estimation and a high bit error rate at the receiver. This paper proposes a pilot design method combining high-power-sum and high-power-variance criteria, which estimates the channel more accurately. First, the pilot insertion positions are designed according to the high-power-variance criterion under an equal-power condition. Then, according to the high-power-sum criterion, the pilot power allocation is converted into a cone programming problem and solved. Finally, the optimal pilot is determined by calculating the weighted sum of the high power sum and the high power variance. Under the same conditions, a MIMO-OFDM system using the optimal pilots for channel estimation achieves a bit error rate gain of 6-7 dB over traditional pilots.

Keywords: MIMO-OFDM, pilot optimization, compressed sensing, channel estimation

Procedia PDF Downloads 127
184 Rank-Based Chain-Mode Ensemble for Binary Classification

Authors: Chongya Song, Kang Yen, Alexander Pons, Jin Liu

Abstract:

In machine learning, ensembles are a common methodology for improving performance over multiple base classifiers. However, true predictions are often canceled out by false ones during consensus due to a phenomenon called the “curse of correlation”, which manifests as strong interference among the predictions produced by the base classifiers. In addition, existing practices still fail to effectively mitigate the problem of imbalanced classification. Based on the analysis of our experimental results, we conclude that both problems are caused by inherent deficiencies in the consensus approach. We therefore propose an enhanced ensemble algorithm that adopts a rank-based chain-mode consensus designed to overcome them. To evaluate the proposed algorithm, we use the well-known benchmark data set NSL-KDD (the improved version of the KDDCup99 dataset produced by the University of New Brunswick) to compare it against 8 common ensemble algorithms. Notably, each compared ensemble classifier uses the same 22 base classifiers, so that differences in the improvements in accuracy and reliability over the base classifiers are truly revealed. The proposed rank-based chain-mode consensus proves to be a more effective ensemble solution than the traditional consensus approach, outperforming the 8 ensemble algorithms by 20% on almost all compared metrics, which include accuracy, precision, recall, F1-score and the area under the receiver operating characteristic curve.

Keywords: consensus, curse of correlation, imbalance classification, rank-based chain-mode ensemble

Procedia PDF Downloads 113
183 The Neutrophil-to-Lymphocyte Ratio after Surgery for Hip Fracture Is a New, Simple, and Objective Score to Predict Postoperative Mortality

Authors: Philippe Dillien, Patrice Forget, Harald Engel, Olivier Cornu, Marc De Kock, Jean Cyr Yombi

Abstract:

Introduction: Hip fracture commonly precedes death in elderly people. Identifying high-risk patients may help target those in whom optimal management, resource allocation and trial efficiency are most needed. The aim of this study is to construct a predictive score for mortality after hip fracture on the basis of the objective prognostic factors available: neutrophil-to-lymphocyte ratio (NLR), age, and sex. C-reactive protein (CRP) is also considered as an alternative to the NLR. Patients and methods: After IRB approval, we analyzed our prospective database of 286 consecutive patients with hip fracture. A score was constructed combining age (1 point per decade above 74 years), sex (1 point for males), and NLR at postoperative day 5 (1 point if >5). A receiver operating characteristic (ROC) curve analysis was performed. Results: Of the 286 patients included, 235 were analyzed (72 males and 163 females, 30.6%/69.4%), with a median age of 84 (range: 65 to 102) years and a mean NLR of 6.47+/-6.07. At one year, 82/280 patients had died (29.3%). Graphical analysis and the log-rank test confirm a highly statistically significant difference (P<0.001). Performance analysis shows an AUC of 0.72 [95% CI 0.65-0.79]. CRP shows no advantage over the NLR. Conclusion: We have developed a score based on age, sex and the NLR to predict the risk of mortality at one year in elderly patients after surgery for a hip fracture. After external validation, it may be used in clinical practice as well as in clinical research to stratify the risk of postoperative mortality.
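The score construction described in Patients and methods is simple enough to sketch directly. The decade binning below is one plausible reading of "1 point per decade above 74 years" (75-84 → 1, 85-94 → 2, and so on); the abstract does not spell out the exact bin edges:

```python
def nlr_mortality_score(age: int, is_male: bool, nlr_day5: float) -> int:
    # 1 point per decade above 74 years (assumed binning), 1 point for
    # male sex, 1 point for NLR > 5 at postoperative day 5
    age_points = max(0, (age - 65) // 10)   # 74 -> 0, 75-84 -> 1, 85-94 -> 2
    sex_points = 1 if is_male else 0
    nlr_points = 1 if nlr_day5 > 5 else 0
    return age_points + sex_points + nlr_points

# an 84-year-old man with a day-5 NLR of 6.2 scores 1 + 1 + 1 = 3
print(nlr_mortality_score(84, True, 6.2))  # 3
```

The appeal of such an additive score is that it can be computed at the bedside from routine laboratory and demographic data, with the ROC analysis quantifying its discrimination (AUC 0.72 here).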

Keywords: neutrophil-to-lymphocyte ratio, hip fracture, postoperative mortality, medical and health sciences

Procedia PDF Downloads 393