Search results for: R data science
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 26780

25100 Optimum Design of Dual-Purpose Outriggers in Tall Buildings

Authors: Jiwon Park, Jihae Hur, Kukjae Kim, Hansoo Kim

Abstract:

In this study, outriggers, which are horizontal structures connecting a building core to distant columns to increase the lateral stiffness of a tall building, are also used to reduce differential axial shortening. The outriggers therefore serve the dual purposes of reducing the lateral displacement and reducing the differential axial shortening. Since the location of an outrigger greatly affects its effectiveness in terms of the lateral displacement at the top of the building and the maximum differential axial shortening, the optimum locations of the dual-purpose outriggers can be determined by an optimization method. Because the floors where the outriggers are installed are given as integer numbers, conventional gradient-based optimization methods cannot be directly used. In this study, a piecewise quadratic interpolation method is used to resolve the integrality requirement posed by the optimum locations of the dual-purpose outriggers. The optimal solutions for the dual-purpose outriggers are searched by linear scalarization, a popular method for multi-objective optimization problems. It was found that increasing the number of outriggers reduced both the maximum lateral displacement and the maximum differential axial shortening. It was also noted that the optimum locations for reducing the lateral displacement and for reducing the differential axial shortening were different. Acknowledgment: This research was supported by the Basic Science Research Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Science and ICT (NRF-2017R1A2B4010043) and financially supported by the Korea Ministry of Land, Infrastructure and Transport (MOLIT) as the U-City Master and Doctor Course Grant Program.
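As a minimal illustration of the linear scalarization step, the sketch below enumerates integer outrigger floors and minimizes a weighted sum of two surrogate objectives. The objective functions, floor count, and weights are hypothetical placeholders; the paper's piecewise quadratic interpolation over the integer design variables is not reproduced here.

```python
import itertools

N_FLOORS = 60  # hypothetical number of floors

def lateral_displacement(floors):
    # Surrogate objective: an outrigger is most effective for drift
    # somewhere in the mid-height range (placeholder model only).
    return 1.0 - sum(2.0 * (f / N_FLOORS) * (1 - f / N_FLOORS) for f in floors)

def axial_shortening(floors):
    # Surrogate objective: shortening is reduced most by higher outriggers
    # (placeholder model only).
    return 1.0 - sum(0.5 * (f / N_FLOORS) ** 2 for f in floors)

def scalarized(floors, w):
    # Linear scalarization of the two objectives: minimize w*f1 + (1-w)*f2.
    return w * lateral_displacement(floors) + (1 - w) * axial_shortening(floors)

def best_locations(n_outriggers, w):
    # The floor numbers are integers, so candidates are enumerated directly
    # here instead of using the paper's piecewise quadratic interpolation.
    candidates = itertools.combinations(range(5, N_FLOORS), n_outriggers)
    return min(candidates, key=lambda floors: scalarized(floors, w))

for w in (0.0, 0.5, 1.0):  # sweep the weight to trace the trade-off
    print(w, best_locations(2, w))
```

Sweeping the weight reproduces the abstract's qualitative finding: the optimum floors for the drift-dominated case differ from those for the shortening-dominated case.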

Keywords: concrete structure, optimization, outrigger, tall building

Procedia PDF Downloads 177
25099 Measurement of Qashqaeian Sheep Fetus Parameters by Ultrasonography

Authors: Aboozar Dehghan, S. Sharifi, S. A. Dehghan, Ali Aliabadi, Arash Esfandiari

Abstract:

Ultrasonography is a safe, widely available, and practical method in diagnostic imaging science. In ultrasonography, most body soft tissues are imaged in B-mode display. The Iranian Qashqaeian sheep is an old domestic breed from the Zagros mountain area in the central plateau of Iran. The population of this breed in Fars province (the study location) is 250,000 animals. Gestational age detection in sheep by ultrasonography was performed in the Kivircik breed in Turkey in 2010. In this study, 5 adult, clinically healthy Iranian ewes and 1 Iranian ram were selected. We measured the biparietal diameter (BPD), the thickest part of the fetal skull; trunk diameter (TD); fetal heart diameter (FHD); fetal intercostal space (ICS); and fetal heart rate per minute (FHR), weekly from day 60 of pregnancy. The inguinal area on both sides was shaved, cleaned with 70% alcohol, and covered with sufficient coupling gel. Transabdominal ultrasonography was performed with a convex multi-frequency transducer at 2.5-5 MHz. Data were collected and analyzed by the one-way ANOVA method in SPSS 15 software. The means of BPD, TD, FHD, and ICS on day 60 were 14.58, 25.92, 3.53, and 2.3 mm, respectively. FHR could be measured from day 109 to 150. After day 109, the TD could no longer be displayed in a single frame during scanning. Ultrasonography is a practical method in sheep pregnancy. These results can help in managing theriogenological diseases that affect fetal growth. Differentiating between various sheep breeds is a further practical outcome of this study.
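A minimal sketch of the one-way ANOVA step, using SciPy in place of SPSS. The weekly BPD readings below are hypothetical placeholders; the real values come from the scans.

```python
from scipy import stats

# Hypothetical weekly BPD readings (mm) for three measurement days.
bpd_day60 = [14.2, 14.6, 14.9, 14.5, 14.7]
bpd_day67 = [17.1, 17.8, 16.9, 17.5, 17.3]
bpd_day74 = [20.4, 21.0, 20.2, 20.8, 20.6]

f_stat, p_value = stats.f_oneway(bpd_day60, bpd_day67, bpd_day74)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")  # small p: BPD differs across weeks
```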

Keywords: qashqaeian sheep, fetometry, ultrasonography

Procedia PDF Downloads 545
25098 Meteorological Risk Assessment for Ships with Fuzzy Logic Designer

Authors: Ismail Karaca, Ridvan Saracoglu, Omer Soner

Abstract:

Fuzzy logic, an advanced method to support decision-making, is used by scientists in many disciplines. Fuzzy programming is a product of fuzzy logic, fuzzy rules, and implication. In marine science, fuzzy programming for ships is increasing dramatically together with autonomous ship studies. In this paper, a program to support the decision-making process for ship navigation has been designed. The program is built on fuzzy logic and rules, taking marine accidents and expert opinions into account. After the program was designed, it was tested against 46 ship accidents reported by the Transportation Safety Investigation Center of Turkey. Wind speed, sea condition, visibility, and day/night status were used as input data. They were converted into a risk factor within the Fuzzy Logic Designer application using fuzzy rules set by marine experts. Finally, the experts' meteorological risk factor for each accident was compared with the program's risk factor, and the error rate was calculated. The main objective of this study is to improve the navigational safety of ships by using an advanced decision support model. According to the study results, fuzzy programming is a robust model that supports safe navigation.
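A sketch of the same idea in scikit-fuzzy, with two of the four inputs. The universes, membership functions, and rules below are illustrative assumptions; the actual rule base comes from the marine experts and the Fuzzy Logic Designer application.

```python
import numpy as np
import skfuzzy as fuzz
from skfuzzy import control as ctrl

wind = ctrl.Antecedent(np.arange(0, 41, 1), 'wind')              # knots
visibility = ctrl.Antecedent(np.arange(0, 11, 1), 'visibility')  # nautical miles
risk = ctrl.Consequent(np.arange(0, 11, 1), 'risk')

wind.automf(3)        # automf labels: 'poor' (low values) ... 'good' (high values)
visibility.automf(3)
risk['low'] = fuzz.trimf(risk.universe, [0, 0, 5])
risk['high'] = fuzz.trimf(risk.universe, [5, 10, 10])

rules = [
    # 'good' wind means a HIGH wind value here, hence high risk.
    ctrl.Rule(wind['good'] | visibility['poor'], risk['high']),
    ctrl.Rule(wind['poor'] & visibility['good'], risk['low']),
]
sim = ctrl.ControlSystemSimulation(ctrl.ControlSystem(rules))
sim.input['wind'] = 30        # strong wind
sim.input['visibility'] = 2   # poor visibility
sim.compute()
print(sim.output['risk'])     # defuzzified meteorological risk factor
```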

Keywords: calculation of risk factor, fuzzy logic, fuzzy programming for ship, safety navigation of ships

Procedia PDF Downloads 189
25097 Data Security and Privacy Challenges in Cloud Computing

Authors: Amir Rashid

Abstract:

Cloud computing frameworks empower organizations to cut expenses by outsourcing computation resources on demand. At present, customers of cloud service providers have no means of verifying the privacy and ownership of their information and data. To address this issue, we propose a Trusted Cloud Computing Platform (TCCP). TCCP enables Infrastructure as a Service (IaaS) providers, such as Amazon EC2, to offer a closed-box execution environment that guarantees confidential execution of guest virtual machines. It also permits clients to attest to the IaaS provider and determine whether the service is secure before they launch their virtual machines. This paper proposes the TCCP for guaranteeing the privacy and integrity of computed data that are outsourced to IaaS providers. The TCCP provides the abstraction of a closed-box execution environment for a client's VM, ensuring that no privileged administrator at the cloud provider can inspect or tamper with its data. Furthermore, before launching the VM, the TCCP allows a client to reliably and remotely verify that the provider's backend is running a trusted TCCP. This capability extends attestation to the entire service and hence permits a client to confirm that its data operations are performed in a secure mode.
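A minimal, hypothetical sketch of the client-side attestation check described above: launch the VM only if the node's reported measurement matches a trusted value. The measurement source, names, and launch call are all placeholders, not the TCCP protocol itself.

```python
import hashlib
import hmac

# Hypothetical trusted measurement of the TCCP node software stack,
# obtained out of band (e.g., from a trusted coordinator).
TRUSTED_MEASUREMENT = hashlib.sha256(b"tccp-node-image-v1").hexdigest()

def attest_node(reported_measurement: str) -> bool:
    # Constant-time comparison of the node's reported measurement
    # against the value the client trusts.
    return hmac.compare_digest(reported_measurement, TRUSTED_MEASUREMENT)

def launch_vm(node_measurement: str, vm_image: bytes) -> str:
    if not attest_node(node_measurement):
        raise RuntimeError("node failed attestation; refusing to launch VM")
    return "vm-launched"  # placeholder for the real IaaS launch call
```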

Keywords: cloud security, IaaS, cloud data privacy and integrity, hybrid cloud

Procedia PDF Downloads 299
25096 Graph Neural Network-Based Classification for Disease Prediction in Health Care Heterogeneous Data Structures of Electronic Health Record

Authors: Raghavi C. Janaswamy

Abstract:

In the healthcare sector, heterogeneous data elements such as patients, diagnoses, symptoms, conditions, observation text from physician notes, and prescriptions form the essentials of the Electronic Health Record (EHR). The data, in the form of clear text and images, are stored or processed in a relational format in most systems. However, the intrinsic structural restrictions and complex joins of relational databases limit their widespread utility. In this regard, the design and development of realistic mappings and deep connections as real-time objects offer unparalleled advantages. Herein, a graph neural network-based classification of EHR data has been developed. Patient conditions have been predicted as a node classification task using graph-based open-source EHR data from the Synthea database, stored in TigerGraph. The Synthea dataset is leveraged because it closely represents real-world data and is voluminous. The graph model is built from the heterogeneous EHR data using Python modules: pyTigerGraph to get nodes and edges from the TigerGraph database, PyTorch to tensorize the nodes and edges, and PyTorch Geometric (PyG) to train the Graph Neural Network (GNN), adopt self-supervised learning techniques with autoencoders to generate the node embeddings, and eventually perform the node classification using those embeddings. The model predicts patient conditions ranging from common to rare. The outcome is deemed to open up opportunities for data querying toward better predictions and accuracy.
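A minimal PyG sketch of the supervised node classification step only. The graph, feature dimensions, and class count are synthetic stand-ins for the TigerGraph/Synthea export; the pyTigerGraph extraction and autoencoder embedding stages are not reproduced here.

```python
import torch
import torch.nn.functional as F
from torch_geometric.data import Data
from torch_geometric.nn import GCNConv

# Toy stand-in for the patient/condition graph:
# 100 nodes with 16-dim features, random edges, 5 condition classes.
x = torch.randn(100, 16)
edge_index = torch.randint(0, 100, (2, 400))
y = torch.randint(0, 5, (100,))
data = Data(x=x, edge_index=edge_index, y=y)

class GCN(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.conv1 = GCNConv(16, 32)
        self.conv2 = GCNConv(32, 5)

    def forward(self, data):
        h = F.relu(self.conv1(data.x, data.edge_index))
        return self.conv2(h, data.edge_index)

model = GCN()
opt = torch.optim.Adam(model.parameters(), lr=0.01)
for epoch in range(100):
    opt.zero_grad()
    loss = F.cross_entropy(model(data), data.y)
    loss.backward()
    opt.step()

pred = model(data).argmax(dim=1)  # predicted condition per patient node
```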

Keywords: electronic health record, graph neural network, heterogeneous data, prediction

Procedia PDF Downloads 86
25095 A Corporate Social Responsibility Project to Improve the Democratization of Scientific Education in Brazil

Authors: Denise Levy

Abstract:

Nuclear technology is part of our everyday life, and its beneficial applications help to improve the quality of our lives. Nevertheless, in Brazil, the media and social networks most often tend to associate radiation with nuclear weapons and major accidents, and there is still great misunderstanding about the peaceful applications of nuclear science. The educational portal Radioatividades (Radioactivities) is a corporate social responsibility initiative that takes advantage of the growing impact of the Internet to offer high-quality scientific information for teachers and students throughout Brazil. This web-based initiative focuses on the positive applications of nuclear technology, presenting the several contributions of ionizing radiation in different contexts, such as nuclear medicine, agricultural techniques, food safety, and electric power generation, showing nuclear technology as part of modern life and essential to improving the quality of our lifestyle. This educational project aims to contribute to the democratization of scientific education and social inclusion, bringing society closer to scientific knowledge, promoting critical thinking, and inspiring further reflection. The website offers a wide variety of ludic activities such as curiosities, interactive exercises, and short courses. Moreover, teachers are offered free web-based material with full instructions to be used in class. Since 2013, the project has been developed and improved according to a comprehensive study of the realistic scenario of ICT infrastructure in Brazilian schools and in full compliance with the best national and international e-learning recommendations.

Keywords: information and communication technologies, nuclear technology, science communication, society and education

Procedia PDF Downloads 326
25094 A Proposal to Tackle Security Challenges of Distributed Systems in the Healthcare Sector

Authors: Ang Chia Hong, Julian Khoo Xubin, Burra Venkata Durga Kumar

Abstract:

Distributed systems offer many benefits to the healthcare industry. From big data analysis to business intelligence, the increased computational power and efficiency of distributed systems serve as an invaluable resource for the healthcare sector. However, as the usage of these distributed systems increases, many issues arise. The main focus of this paper is security. Many security issues stem from distributed systems in the healthcare industry, particularly information security. Personal data are especially sensitive in healthcare. If important information gets leaked (e.g., identity card number, credit card number, address), a person's identity, financial status, and safety might be compromised. The responsible organization then loses a lot of money compensating these people, and even more resources are expended trying to fix the fault. Therefore, a framework for a blockchain-based healthcare data management system is proposed. In this framework, a blockchain network is used to store the encryption key of the patient's data. The data themselves are encrypted, and the resulting ciphertext is stored on a cloud storage platform. Furthermore, some issues have to be emphasized and tackled for future improvements, such as proposing a multi-user scheme, addressing authentication, or migrating the backend processes into the blockchain network. Due to the nature of blockchain technology, the data will be tamper-proof, and its read-only function can only be accessed by authorized users such as doctors and nurses. This guarantees the confidentiality and immutability of the patient's data.
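A minimal sketch of the key/ciphertext split described above, using symmetric Fernet encryption. The blockchain and cloud interactions are reduced to comments; the record content is a placeholder.

```python
from cryptography.fernet import Fernet

# The split described in the framework: ciphertext goes to cloud storage,
# while the encryption key is what would be recorded on the blockchain.
key = Fernet.generate_key()                 # -> store on the blockchain network
record = b'{"patient": "P001", "diagnosis": "..."}'
ciphertext = Fernet(key).encrypt(record)    # -> store on the cloud platform

# An authorized reader (e.g., a doctor) retrieves the key from the chain
# and the ciphertext from the cloud, then decrypts locally.
assert Fernet(key).decrypt(ciphertext) == record
```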

Keywords: distributed, healthcare, efficiency, security, blockchain, confidentiality and immutability

Procedia PDF Downloads 184
25093 Design and Implementation of a Geodatabase and WebGIS

Authors: Sajid Ali, Dietrich Schröder

Abstract:

The merging of the Internet and the Web has created many disciplines, and Web GIS is one of them, dealing with geospatial data in a proficient way. Web GIS technologies provide easy access to and sharing of geospatial data over the Internet. However, the European Caribbean Association (Europaische Karibische Gesellschaft - EKG) lacks a single platform for easy, shared access to its data to assist its members and the wider research community. The technique presented in this paper covers the design of a geodatabase using PostgreSQL/PostGIS as an object-relational database management system (ORDBMS) for efficient dissemination and management of spatial data, and a Web GIS using OpenGeo Suite for fast sharing and distribution of the data over the Internet. The characteristics required of the geodatabase design have been studied, and a specific methodology is given for designing the Web GIS. Finally, this web-based geodatabase was validated with two desktop GIS packages and a web map application, and it is shown that the contribution has all the desired modules to expedite further research in the area as per the requirements.
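A minimal PostGIS sketch of the kind of table such a geodatabase would hold, accessed from Python via psycopg2. The connection parameters, table name, and sample coordinates are hypothetical.

```python
import psycopg2

# Hypothetical connection; the schema mirrors a minimal PostGIS-enabled table.
conn = psycopg2.connect(dbname="ekg_geodb", user="gis", password="...")
cur = conn.cursor()
cur.execute("CREATE EXTENSION IF NOT EXISTS postgis;")
cur.execute("""
    CREATE TABLE IF NOT EXISTS study_sites (
        id SERIAL PRIMARY KEY,
        name TEXT,
        geom GEOMETRY(Point, 4326)   -- WGS84 point geometry
    );
""")
cur.execute(
    "INSERT INTO study_sites (name, geom) "
    "VALUES (%s, ST_SetSRID(ST_MakePoint(%s, %s), 4326));",
    ("Sample site", -61.0, 14.6),
)
# GeoJSON output is what a web map client (e.g., one built on OpenGeo Suite)
# would typically consume.
cur.execute("SELECT name, ST_AsGeoJSON(geom) FROM study_sites;")
print(cur.fetchall())
conn.commit()
```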

Keywords: desktop GIS software, European Caribbean Association, geodatabase, OpenGeo Suite, PostgreSQL/PostGIS, web GIS, web map application

Procedia PDF Downloads 341
25092 Industrial Ergonomics Improvement at a Refrigerator Manufacturing Company in Iran: An Approach on Interventional Ergonomics

Authors: Hassan S. Naeini

Abstract:

Nowadays, many people work in various industrial sectors where risk factors threaten human health, especially in developing countries. One of the main problems affecting workers' health relates to ergonomics. Ergonomics, as a multidisciplinary science, concerns workers' health and safety in both somatic and mental terms. Ergonomic interventions and improvements create better conditions for workers and improve the quality of working life. In this study, a factory in Iran producing small and medium-sized refrigerators was chosen as the sample. A preliminary ergonomic observation of the factory showed risk factors in terms of ergonomic aspects, so an ergonomic intervention was defined, and ergonomic assessment methods such as the NMQ, OWAS, and environmental ergonomic assessment were used. Anthropometric measurements were also taken. This study shows that some workstations and plants suffer from ergonomic problems to varying degrees. Based on the gathered data, illumination, noise control, and workstation design in the metal workstation were identified as the priority actions. Some of these interventions are ongoing. The interventions and workstation redesign appear to create better conditions for workers, because ergonomics makes environments safer and more sustainable for human beings.

Keywords: anthropometry, ergonomics, health, NMQ, OWAS

Procedia PDF Downloads 755
25091 Optimizing Data Transfer and Processing in Multi-Cloud Environments for Big Data Workloads

Authors: Gaurav Kumar Sinha

Abstract:

In an era defined by the proliferation of data and the utilization of cloud computing environments, the efficient transfer and processing of big data workloads across multi-cloud platforms have emerged as critical challenges. This research paper embarks on a comprehensive exploration of the complexities associated with managing and optimizing big data in a multi-cloud ecosystem. The foundation of this study is rooted in the recognition that modern enterprises increasingly rely on multiple cloud providers to meet diverse business needs, enhance redundancy, and reduce vendor lock-in. As a consequence, managing data across these heterogeneous cloud environments has become intricate, necessitating innovative approaches to ensure data integrity, security, and performance. The primary objective of this research is to investigate strategies and techniques for enhancing the efficiency of data transfer and processing in multi-cloud scenarios. It recognizes that big data workloads are characterized by their sheer volume, variety, velocity, and complexity, making traditional data management solutions insufficient for harnessing the full potential of multi-cloud architectures. The study commences by elucidating the challenges posed by multi-cloud environments in the context of big data. These challenges encompass data fragmentation, latency, security concerns, and cost optimization. To address these challenges, the research explores a range of methodologies and solutions. One of the key areas of focus is data transfer optimization. The paper delves into techniques for minimizing data movement latency, optimizing bandwidth utilization, and ensuring secure data transmission between different cloud providers. It evaluates the applicability of dedicated data transfer protocols, intelligent data routing algorithms, and edge computing approaches in reducing transfer times. Furthermore, the study examines strategies for efficient data processing across multi-cloud environments. It acknowledges that big data processing requires distributed and parallel computing capabilities that span across cloud boundaries. The research investigates containerization and orchestration technologies, serverless computing models, and interoperability standards that facilitate seamless data processing workflows. Security and data governance are paramount concerns in multi-cloud environments. The paper explores methods for ensuring data security, access control, and compliance with regulatory frameworks. It considers encryption techniques, identity and access management, and auditing mechanisms as essential components of a robust multi-cloud data security strategy. The research also evaluates cost optimization strategies, recognizing that the dynamic nature of multi-cloud pricing models can impact the overall cost of data transfer and processing. It examines approaches for workload placement, resource allocation, and predictive cost modeling to minimize operational expenses while maximizing performance. Moreover, this study provides insights into real-world case studies and best practices adopted by organizations that have successfully navigated the challenges of multi-cloud big data management. It presents a comparative analysis of various multi-cloud management platforms and tools available in the market.

Keywords: multi-cloud environments, big data workloads, data transfer optimization, data processing strategies

Procedia PDF Downloads 68
25090 Human-Centred Data Analysis Method for Future Design of Residential Spaces: Coliving Case Study

Authors: Alicia Regodon Puyalto, Alfonso Garcia-Santos

Abstract:

This article presents a method to analyze the use of indoor spaces based on data analytics obtained from inbuilt digital devices. The study uses the data generated by in-place devices, such as smart locks, Wi-Fi routers, and electrical sensors, to gain additional insights on space occupancy, user behaviour, and comfort. Those devices, originally installed to facilitate remote operations, report data through the internet that the research uses to analyze information on real-time human use of spaces. Using an in-place Internet of Things (IoT) network enables a faster, more affordable, seamless, and scalable solution to analyze building interior spaces without incorporating external data collection systems such as dedicated sensors. The methodology is applied to a real case study of coliving: a residential building of 3000 m², 7 floors, and 80 users in the centre of Madrid. The case study applies the method to classify the IoT devices and to assess, clean, and analyze the collected data within the analysis framework. The information is collected remotely through the devices' different platforms; the first step is to curate the data and understand what insights each device can provide according to the objectives of the study. This generates an analysis framework that can be scaled to future building assessments, even beyond the residential sector. The method adjusts the parameters to be analyzed to the dataset available from each building's IoT network. The research demonstrates how human-centred data analytics can improve the future spatial design of indoor spaces.
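A minimal pandas sketch of the curation step: deduplicate device events and derive an occupancy proxy. The event log, column names, and resampling window are hypothetical placeholders for whatever each device platform actually exports.

```python
import pandas as pd

# Hypothetical smart-lock event log pulled from a device platform's API.
events = pd.DataFrame({
    "timestamp": ["2021-03-01 08:12", "2021-03-01 08:12", "2021-03-01 19:40"],
    "room": ["A101", "A101", "A101"],
    "event": ["unlock", "unlock", "unlock"],
})
events["timestamp"] = pd.to_datetime(events["timestamp"])
events = events.drop_duplicates()   # curation: remove duplicate reports

# Derive an occupancy proxy: unlock events per room per hour.
occupancy = (events.set_index("timestamp")
                   .groupby("room")
                   .resample("1h")["event"].count())
print(occupancy)
```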

Keywords: in-place devices, IoT, human-centred data-analytics, spatial design

Procedia PDF Downloads 197
25089 A Unique Multi-Class Support Vector Machine Algorithm Using MapReduce

Authors: Aditi Viswanathan, Shree Ranjani, Aruna Govada

Abstract:

With data sizes constantly expanding, and with classical machine learning algorithms that analyze such data requiring ever larger amounts of computation time and storage space, the need to distribute computation and memory requirements among several computers has become apparent. Although substantial work has been done in developing distributed binary SVM algorithms and multi-class SVM algorithms individually, the field of multi-class distributed SVMs remains largely unexplored. This research seeks to develop an algorithm that implements the Support Vector Machine over a multi-class data set and is efficient in a distributed environment. For this, we recursively choose the best binary split of a set of classes using a greedy technique, much like a divide-and-conquer approach. Our algorithm has shown better computation time during the testing phase than the traditional sequential SVM methods (One vs. One, One vs. Rest) and outperforms them as the size of the data set grows. This approach also classifies the data with higher accuracy than the traditional multi-class algorithms.
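A single-machine sketch of the recursive binary-split idea, without the MapReduce distribution. The split heuristic here (halving the sorted class list) is a naive stand-in for the paper's greedy "best binary split" criterion, and the data are synthetic.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.svm import SVC

class BinarySplitSVM:
    """Recursive binary splitting of the class set: one binary SVM per node."""
    def __init__(self, classes):
        self.classes = list(classes)
        self.svm, self.left, self.right = None, None, None

    def fit(self, X, y):
        if len(self.classes) == 1:
            return self
        half = len(self.classes) // 2
        left_set = set(self.classes[:half])           # naive split heuristic
        mask = np.isin(y, list(left_set))
        self.svm = SVC(kernel="linear").fit(X, mask)  # binary SVM at this node
        self.left = BinarySplitSVM(sorted(left_set)).fit(X[mask], y[mask])
        self.right = BinarySplitSVM(
            sorted(set(self.classes) - left_set)).fit(X[~mask], y[~mask])
        return self

    def predict_one(self, x):
        node = self
        while len(node.classes) > 1:   # descend the split tree
            node = node.left if node.svm.predict([x])[0] else node.right
        return node.classes[0]

X, y = make_classification(n_samples=300, n_classes=4, n_informative=6)
tree = BinarySplitSVM(np.unique(y)).fit(X, y)
print(tree.predict_one(X[0]), y[0])
```

Testing then requires only log2(k) binary SVM evaluations per sample rather than the k(k-1)/2 of One vs. One, which is where the reported speedup comes from.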

Keywords: distributed algorithm, MapReduce, multi-class, support vector machine

Procedia PDF Downloads 401
25088 Information Management Approach in the Prediction of Acute Appendicitis

Authors: Ahmad Shahin, Walid Moudani, Ali Bekraki

Abstract:

This research presents a predictive data mining model for the accurate diagnosis of acute appendicitis, with the aim of maximizing health service quality, minimizing morbidity/mortality, and reducing cost. Acute appendicitis is a very common disease that requires timely, accurate diagnosis and surgical intervention. Although the treatment of acute appendicitis is simple and straightforward, its diagnosis is still difficult because no single sign, symptom, laboratory test, or imaging examination accurately confirms the diagnosis in all cases. This contributes to increased morbidity and negative appendectomy rates. In this study, the authors propose to generate an accurate model for predicting acute appendicitis based, firstly, on a segmentation technique associated with the ABC algorithm to segment the patients; secondly, on applying fuzzy logic to process the massive volume of heterogeneous and noisy data (age, sex, fever, white blood cell count, neutrophilia, CRP, urine, ultrasound, CT, appendectomy, etc.) in order to express knowledge and analyze the relationships among the data in a comprehensive manner; and thirdly, on applying a dynamic programming technique to reduce the number of data attributes. The proposed model is evaluated against a set of benchmark techniques and on a set of benchmark classification problems (osteoporosis, diabetes, and heart disease) obtained from the UCI repository and other data sources.
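The pipeline itself (ABC segmentation, fuzzy processing, dynamic programming) is not reproduced here; as a simple baseline matching the "decision tree" keyword, the sketch below trains a decision tree on a public UCI-style benchmark, standing in for the appendicitis data, which are not public.

```python
from sklearn.datasets import load_breast_cancer  # stand-in benchmark dataset
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# Shallow tree: interpretable splits, analogous to clinical decision rules.
clf = DecisionTreeClassifier(max_depth=4, random_state=0).fit(X_tr, y_tr)
print(classification_report(y_te, clf.predict(X_te)))
```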

Keywords: healthcare management, acute appendicitis, data mining, classification, decision tree

Procedia PDF Downloads 350
25087 Robust Inference with a Skew T Distribution

Authors: M. Qamarul Islam, Ergun Dogan, Mehmet Yazici

Abstract:

There is a growing body of evidence that non-normal data are more prevalent in nature than normal data. Examples can be quoted from, but are not restricted to, the areas of Economics, Finance, and Actuarial Science. The non-normality considered here is expressed in terms of the fat-tailedness and asymmetry of the relevant distribution. In this study, a skew t distribution that can be used to model data exhibiting inherently non-normal behavior is considered. This distribution has tails fatter than a normal distribution and also exhibits skewness. Although maximum likelihood estimates can be obtained by iteratively solving the likelihood equations, which are non-linear in form, this can be problematic in terms of convergence and in many other respects as well. Therefore, it is preferred to use the method of modified maximum likelihood, in which the likelihood estimates are derived by expressing the intractable non-linear likelihood equations in terms of standardized ordered variates and replacing the intractable terms by their linear approximations obtained from the first two terms of a Taylor series expansion about the quantiles of the distribution. These estimates, called modified maximum likelihood estimates, are obtained in closed form. Hence, they are easy to compute and to manipulate analytically. In fact, the modified maximum likelihood estimates are asymptotically equivalent to maximum likelihood estimates. Even in small samples, the modified maximum likelihood estimates are found to be approximately the same as the maximum likelihood estimates obtained iteratively. It is shown in this study that the modified maximum likelihood estimates are not only unbiased but substantially more efficient than the commonly used moment estimates or the least squares estimates, which are known to be biased and inefficient in such cases. Furthermore, in conventional regression analysis, it is assumed that the error terms are distributed normally and, hence, the well-known least squares method is considered a suitable and preferred method for making the relevant statistical inferences. However, a number of empirical studies have shown that non-normal errors are more prevalent, and even transforming and/or filtering techniques may not produce normally distributed residuals. Here, a study is done for multiple linear regression models with random errors having a non-normal pattern. Through extensive simulation, it is shown that the modified maximum likelihood estimates of the regression parameters are plausibly robust to the distributional assumptions and to various data anomalies, as compared to the widely used least squares estimates. Relevant tests of hypothesis are developed and explored for desirable properties in terms of their size and power. The tests based upon modified maximum likelihood estimates are found to be substantially more powerful than the tests based upon least squares estimates. Several examples are provided from the areas of Economics and Finance where such distributions are interpretable in terms of the efficient market hypothesis with respect to asset pricing, portfolio selection, risk measurement, and capital allocation.
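The closed-form MMLE is not reproduced here; as a rough stand-in, the sketch below contrasts OLS with a Huber M-estimator (statsmodels RLM) on simulated skewed, fat-tailed regression errors, illustrating the robustness gap the abstract describes. All data and parameters are synthetic, and the error distribution only mimics a skew t.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 200
x = rng.uniform(0, 10, n)
X = sm.add_constant(x)

# Skewed, fat-tailed errors: a t(3) variate with its positive half inflated.
e = rng.standard_t(3, n)
e = np.where(e > 0, 1.8 * e, e)
y = 2.0 + 0.5 * x + e   # true slope is 0.5

ols = sm.OLS(y, X).fit()
rlm = sm.RLM(y, X, M=sm.robust.norms.HuberT()).fit()
print("OLS slope:", ols.params[1], " Huber slope:", rlm.params[1])
```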

Keywords: least square estimates, linear regression, maximum likelihood estimates, modified maximum likelihood method, non-normality, robustness

Procedia PDF Downloads 397
25086 Methodology for the Multi-Objective Analysis of Data Sets in Freight Delivery

Authors: Dale Dzemydiene, Aurelija Burinskiene, Arunas Miliauskas, Kristina Ciziuniene

Abstract:

Data flows and the purposes for reporting data differ and depend on business needs. Different parameters are reported and transferred regularly during freight delivery. These business practices form the dataset constructed for each time point, containing all the information required for freight-moving decisions. As a significant amount of these data is used for various purposes, an integrated methodological approach must be developed to respond to this problem. The proposed methodology contains several steps: (1) collecting context data sets and data validation; (2) multi-objective analysis for optimizing freight transfer services. For data validation, the study involves Grubbs outlier analysis, particularly for data cleaning and for identifying the statistical significance of data reporting event cases. The Grubbs test is often used because it tests one extreme value at a time against the bounds of the standard normal distribution. In the study area, the test has not been widely applied by authors, except where the Grubbs test was used to identify outliers in fuel consumption data. In this study, the authors applied the method with a confidence level of 99%. For the multi-objective analysis, the authors select forms of genetic algorithm construction that are more likely to extract the best solution. For freight delivery management, genetic algorithm schemas are used as a more effective technique; accordingly, an adaptive genetic algorithm is applied to describe the process of choosing an effective transportation corridor. In this study, multi-objective genetic algorithm methods are used to optimize the data evaluation and select the appropriate transport corridor. The authors suggest a methodology for multi-objective analysis that evaluates collected context data sets and uses this evaluation to determine a delivery corridor for freight transfer services in the multi-modal transportation network. In the multi-objective analysis, the authors include safety components, the number of accidents per year, and freight delivery time in the multi-modal transportation network. The proposed methodology has practical value for the management of multi-modal transportation processes.
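A minimal sketch of the data-validation step: a single-outlier Grubbs test at the 99% confidence level mentioned above. The fuel consumption values are hypothetical.

```python
import numpy as np
from scipy import stats

def grubbs_test(x, alpha=0.01):
    """Two-sided Grubbs test for a single outlier (99% confidence by default).
    Returns the suspect value and whether it is flagged as an outlier."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    g = np.max(np.abs(x - x.mean())) / x.std(ddof=1)
    # Critical value derived from the t distribution.
    t = stats.t.ppf(1 - alpha / (2 * n), n - 2)
    g_crit = ((n - 1) / np.sqrt(n)) * np.sqrt(t**2 / (n - 2 + t**2))
    suspect = x[np.argmax(np.abs(x - x.mean()))]
    return suspect, g > g_crit

fuel = [31.2, 30.8, 31.5, 30.9, 44.0, 31.1]   # hypothetical l/100km reports
print(grubbs_test(fuel))                       # flags 44.0
```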

Keywords: multi-objective, analysis, data flow, freight delivery, methodology

Procedia PDF Downloads 180
25085 A Literature Review on the Use of Information and Communication Technology within and between Emergency Medical Teams during a Disaster

Authors: Badryah Alshehri, Kevin Gormley, Gillian Prue, Karen McCutcheon

Abstract:

In a disaster event, sharing patient information between pre-hospital Emergency Medical Services (EMS) and hospital Emergency Departments (ED) is a complex process during which important information may be altered or lost due to poor communication. The aim of this study was to critically discuss the current evidence base on communication between pre-hospital EMS and hospital ED professionals through the use of Information and Communication Technology (ICT). This study followed a systematic approach: six electronic databases (CINAHL, Medline, Embase, PubMed, Web of Science, and IEEE Xplore Digital Library) were comprehensively searched in January 2018, and a second search was completed in April 2020 to capture more recent publications. The study selection process was undertaken independently by the study authors. Both qualitative and quantitative studies were chosen that focused on factors positively or negatively associated with coordinated communication between pre-hospital EMS and ED teams in a disaster event. These studies were assessed for quality, and the data were analysed according to the key screening themes that emerged from the literature search. Twenty-two studies were included: eleven employed quantitative methods, seven used qualitative methods, and four used mixed methods. Four themes emerged on communication between EMTs (pre-hospital EMS and ED staff) in a disaster event using ICT. (1) Disaster preparedness plans and coordination. This theme reported that disaster plans are in place in hospitals and that, in some cases, there are interagency agreements with pre-hospital services and relevant stakeholders. However, the findings showed that the disaster plans highlighted in these studies lacked information regarding coordinated communication within and between the pre-hospital services and hospitals. (2) Communication systems used in disasters. This theme highlighted that although various communication systems are used between and within hospitals and pre-hospital services, technical issues have hampered communication between teams during disasters. (3) Integrated information management systems. This theme suggested the need for an integrated health information system that can help pre-hospital and hospital staff record patient data and ensure the data are shared. (4) Disaster training and drills. While some studies analysed disaster drills and training, the majority focused on hospital departments other than EMTs. These studies suggest the need for simulation-based disaster training and drills that include EMTs. This review demonstrates that considerable gaps remain in the understanding of communication between EMS and hospital ED staff in disaster response. The review shows that although different types of ICT are used, various issues remain that affect coordinated communication among the relevant professionals.

Keywords: communication, emergency communication services, emergency medical teams, emergency physicians, emergency nursing, paramedics, information and communication technology, communication systems

Procedia PDF Downloads 86
25084 Minimization of Denial of Services Attacks in Vehicular Adhoc Networking by Applying Different Constraints

Authors: Amjad Khan

Abstract:

The security of vehicular ad hoc networks is of great importance, as failures can involve serious threats to life. Conventional security systems are not enough to provide secure communication among vehicles on the road. It is necessary to prevent the waste of network resources and to protect them against malicious nodes, ensuring data and bandwidth availability for the legitimate nodes of the network. This work provides a non-conventional security system by introducing constraints that minimize denial of service (DoS) attacks, especially on data and bandwidth. The data packets received by a node in the network pass through a number of tests, and if any test fails, the node drops those packets and does not forward them. Also, if a node claims to be the nearest node for forwarding emergency messages, the sender can effectively identify whether the claim is true or false by using these constraints. Consequently, DoS attacks are minimized and data remain instantly available without wasting network resources.
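A minimal sketch of the constraint-chain idea: each received packet must pass every test or it is dropped. The specific tests, field names, and thresholds are hypothetical illustrations, not the paper's actual constraints.

```python
MAX_HOPS = 10
MAX_PAYLOAD = 2048  # bytes; hypothetical bandwidth guard

def plausible_position(packet):
    # A claimed "nearest node" must report a position within radio range.
    return packet.get("claimed_distance_m", 0) <= 300

TESTS = [
    lambda p: p.get("hops", 0) <= MAX_HOPS,               # loop/flood control
    lambda p: len(p.get("payload", b"")) <= MAX_PAYLOAD,  # bandwidth guard
    lambda p: p.get("signature_valid", False),            # authenticity
    plausible_position,                                   # nearest-node claim check
]

def accept(packet):
    # all() short-circuits, so the packet is dropped at the first failed test.
    return all(test(packet) for test in TESTS)

pkt = {"hops": 3, "payload": b"emergency", "signature_valid": True,
       "claimed_distance_m": 120}
print(accept(pkt))
```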

Keywords: black hole attack, grey hole attack, intransient traffic tempering, networking

Procedia PDF Downloads 284
25083 Traffic Prediction with Raw Data Utilization and Context Building

Authors: Zhou Yang, Heli Sun, Jianbin Huang, Jizhong Zhao, Shaojie Qiao

Abstract:

Traffic prediction is essential in a multitude of ways in modern urban life. Earlier work in this domain carries out the investigation chiefly with two major focuses: (1) the accurate forecast of future values in multiple time series and (2) knowledge extraction from spatial-temporal correlations. However, two key considerations for traffic prediction are often missed: the completeness of raw data and the full context of the prediction timestamp. Concentrating on these two drawbacks of earlier work, we devise an approach that addresses both issues in a two-phase framework. First, we utilize raw trajectories to a greater extent by building a VLA table and applying data compression. We obtain intra-trajectory features with graph-based encoding and inter-trajectory features with a grid-based model and a back-projection technique that restores their surrounding high-resolution spatial-temporal environment. To the best of our knowledge, we are the first to study direct feature extraction from raw trajectories for traffic prediction and to attempt the use of raw data with the least degree of reduction. In the prediction phase, we provide a broader context for the prediction timestamp by taking into account the information around it in the training dataset. Extensive experiments on several well-known datasets have verified the effectiveness of our solution, which combines the strength of raw trajectory data and prediction context. In terms of performance, our approach surpasses several state-of-the-art methods for traffic prediction.

Keywords: traffic prediction, raw data utilization, context building, data reduction

Procedia PDF Downloads 128
25082 Seismic Interpretation and Petrophysical Evaluation of SM Field, Libya

Authors: Abdalla Abdelnabi, Yousf Abushalah

Abstract:

The G Formation is a major gas-producing reservoir in the SM Field, eastern Libya. It is called the G limestone because it consists of shallow marine limestone. Well data and 3D seismic data, in conjunction with the results of a previous study, were used to delineate the hydrocarbon reservoir of the Middle Eocene G Formation in the SM Field area. The data include three-dimensional seismic data acquired in 2009, covering an area of approximately 75 mi² with more than 9 wells penetrating the reservoir. The seismic data are used to identify stratigraphic and structural features, such as channels and faults, which may play a significant role in hydrocarbon trapping. The well data are used for the petrophysical analysis of the field. The average porosity of the Middle Eocene G Formation is very good, reaching 24%, especially around well W6. Average water saturation was calculated for each well from porosity and resistivity logs using Archie's formula; the average water saturation for the whole field is 25%. Structural mapping of the top and bottom of the Middle Eocene G Formation revealed that the highest area in the SM Field is at 4800 ft subsea, around wells W4, W5, W6, and W7, and the deepest point is at 4950 ft subsea. Correlation between wells using well data and structural maps created from the seismic data revealed that the net thickness of the G Formation ranges from 0 ft in the northern part of the field to 235 ft in the southwest and south. The gas-water contact was found at 4860 ft using the resistivity log. The net isopach map was used with both the trapezoidal and pyramidal rules to calculate the total bulk volume. The original gas in place and the recoverable gas were calculated volumetrically to be 890 Billion Standard Cubic Feet (BSCF) and 630 BSCF, respectively.
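A worked sketch of the two calculations named above: Archie water saturation and a volumetric gas-in-place estimate. The Archie constants (a, m, n), resistivities, and trap geometry are illustrative assumptions, not field values; only the 24% porosity comes from the abstract.

```python
# Archie water saturation: Sw = ((a * Rw) / (phi**m * Rt)) ** (1/n)
a, m, n = 1.0, 2.0, 2.0        # Archie parameters (assumed)
rw, rt = 0.04, 20.0            # formation water / true resistivity (ohm-m)
phi = 0.24                     # average porosity reported in the abstract

sw = ((a * rw) / (phi**m * rt)) ** (1 / n)
print(f"Sw = {sw:.2f}")

# Volumetric original gas in place: G = 43560 * A * h * phi * (1 - Sw) / Bg
area_acres = 3000              # hypothetical trap area (acres)
h_ft = 150                     # hypothetical average net pay (ft)
bg = 0.004                     # gas formation volume factor (rcf/scf)
g_scf = 43560 * area_acres * h_ft * phi * (1 - sw) / bg
print(f"OGIP = {g_scf / 1e9:.0f} BSCF")
```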

Keywords: 3D seismic data, well logging, petrel, kingdom suite

Procedia PDF Downloads 150
25081 Analysis of Spatial and Temporal Data Using Remote Sensing Technology

Authors: Kapil Pandey, Vishnu Goyal

Abstract:

Spatial and temporal data analysis is well known in the field of satellite image processing. When spatial data are correlated with time-series analysis, it yields significant results in change detection studies. In this paper, GIS and remote sensing techniques have been used for change detection using time-series satellite imagery of Uttarakhand state over the years 1990-2010. Natural vegetation, urban area, and forest cover, among others, were chosen as the main land use classes to study. Land use/land cover classes for several years were prepared using satellite images. The maximum likelihood supervised classification technique was adopted in this work; finally, a land use change index was generated, and graphical models were used to present the changes.
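Maximum likelihood classification fits one Gaussian per class and assigns each pixel to the most likely class; scikit-learn's quadratic discriminant analysis implements exactly that model. The band values and class means below are synthetic stand-ins for training pixels digitized from the imagery.

```python
import numpy as np
from sklearn.discriminant_analysis import QuadraticDiscriminantAnalysis

# Synthetic stand-in for pixel spectra: rows are pixels, columns are bands
# (here, red and near-infrared reflectance).
rng = np.random.default_rng(1)
vegetation = rng.normal([0.05, 0.40], 0.03, (200, 2))   # low red, high NIR
urban = rng.normal([0.25, 0.25], 0.04, (200, 2))
water = rng.normal([0.03, 0.02], 0.01, (200, 2))
X = np.vstack([vegetation, urban, water])
y = np.repeat([0, 1, 2], 200)   # 0=vegetation, 1=urban, 2=water

# One Gaussian per class: the maximum likelihood classifier used for
# supervised land cover mapping.
mlc = QuadraticDiscriminantAnalysis(store_covariance=True).fit(X, y)
print(mlc.predict([[0.06, 0.42]]))   # -> vegetation
```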

Keywords: GIS, landuse/landcover, spatial and temporal data, remote sensing

Procedia PDF Downloads 433
25080 A Study of the Effect of the Flipped Classroom on Mixed Abilities Classes in Compulsory Secondary Education in Italy

Authors: Giacoma Pace

Abstract:

The research seeks to evaluate whether students with impairments can achieve enhanced academic progress by actively engaging in collaborative problem-solving activities with teachers and peers, in order to overcome obstacles rooted in socio-economic disparities. Furthermore, the research underscores the significance of fostering students' self-awareness regarding their learning process and encourages teachers to adopt a more interactive teaching approach. The research also posits that reducing conventional face-to-face lessons can motivate students to explore alternative learning methods, such as collaborative teamwork and peer education within the classroom. To address socio-cultural barriers, it is imperative to assess students' internet access and possession of technological devices, as these factors can contribute to a digital divide. The research features a case study of a flipped classroom learning unit administered to six third-year high school classes in the city of Turin, Italy: a Scientific Lyceum, a Technical School, and a Vocational School. Data cover the teachers and students involved in the case study, including some impaired students in each class: entry level, students' performance and attitudes before using the flipped classroom, motivation, family involvement, teachers' attitudes toward the flipped classroom, goals achieved, the pros and cons of such activities, and technology availability. The selected schools were contacted, and meetings were held with the English teachers to gather information about their attitudes toward and knowledge of the flipped classroom approach. Questionnaires were administered to teachers and IT staff. The information gathered was used to outline the profile of the subjects involved in the study and was further compared in the second step, a study conducted with the classes of the selected schools. The learning unit is the same for all classes; its structure and content were decided together with the English teachers of the classes involved. The pacing and content are matched in every lesson, and all the classes participate in the same labs, use the same materials and homework, and undergo the same assessment through summative and formative testing. Each step follows a precise scheme in order to be as reliable as possible. The outcomes of the case study will be statistically organised. The case study is accompanied by a study of the literature concerning EFL approaches and the flipped classroom. The document analysis method was employed, i.e., a qualitative research method in which printed and/or electronic documents containing information about the research subject are reviewed and evaluated with a systematic procedure. Articles in the Web of Science Core Collection, Education Resources Information Center (ERIC), Scopus, and ScienceDirect databases were searched in order to determine the documents to be examined (years considered: 2000-2022).

Keywords: flipped classroom, impaired, inclusivity, peer instruction

Procedia PDF Downloads 53
25079 The Importance of Value Added Services Provided by Science and Technology Parks to Boost Entrepreneurship Ecosystem in Turkey

Authors: Faruk Inaltekin, Imran Gurakan

Abstract:

This paper discusses the importance of the value-added services provided by Science and Technology Parks for entrepreneurship development in Turkey. Entrepreneurship is a vital subject for all countries: it not only fosters economic development but also promotes innovation at local and international levels. To foster a high-tech entrepreneurship ecosystem, the technopark (Science and Technology Park, STP) concept was initiated with the establishment of Silicon Valley in the 1950s. The success and rise of Silicon Valley led to the spread of technopark activities, and developed economies have been setting up projects to plan and build STPs since the 1960s and 1970s. To promote the establishment of STPs in Turkey, the Ministry of Science, Industry, and Technology enacted the Technology Development Zones Law (No. 4691) in 2001; it was revised in 2016 to provide further support. The basic aim of STPs is to provide tenants with high-quality office space and various 'value-added services' such as business development, network connections, cooperation programs, investor/customer meetings, and internationalization services. To this end, STPs should help startups deal with difficulties in the early stages and support mature companies' export activities in foreign markets. STPs should support the production, commercialization, and, more significantly, the internationalization of technology-intensive businesses and foster company growth. Among these value-added services, internationalization is currently especially prominent worldwide: most STPs design cluster or accelerator programs to support their companies' penetration of foreign markets. If startups are not ready for international competition, STPs should prepare them for foreign markets with training and mentoring sessions. These sessions should take a goal-based approach to working with companies. Each company has different needs and goals, so the definition of 'success' varies for each company. For this reason, it is very important to create customized value-added services that meet the needs of startups. After local support, STPs should also be able to support their startups in foreign markets; organizing a well-defined international accelerator program plays an important role in this mission. Turkey is strategically placed between key markets in Europe, Russia, Central Asia, and the Middle East, and its population is young and well educated, so both government agencies and the private sector endeavor to foster and encourage the entrepreneurship ecosystem with many forms of support. In sum, the task of technoparks, through these and similar value-added services, is very important for developing the entrepreneurship ecosystem. The priority of all value-added services is to identify the commercialization and growth obstacles faced by entrepreneurs and remove them through one-to-one customized services. Also, in order to have a healthy startup ecosystem and create sustainable entrepreneurship, stakeholders (technoparks, incubators, accelerators, investors, universities, governmental organizations, etc.) should fulfill their roles and duties and collaborate with each other. STPs play an important role as a bridge between these stakeholders and entrepreneurs, and they should continually benchmark and renew the services they offer to help startups survive, develop their business, and benefit from these stakeholders.

Keywords: accelerator, cluster, entrepreneurship, startup, technopark, value added services

Procedia PDF Downloads 143
25078 Effects of Transcranial Direct Current Stimulation on Post-Stroke Dysphagia

Authors: Ehsan Kaviani, Azin Golmoradizade

Abstract:

Introduction: In this study, we investigate the effect of transcranial direct current stimulation (tDCS) combined with swallowing training on post-stroke dysphagia. Methods: This review covers articles on the effects of tDCS on post-stroke dysphagia extracted from the ScienceDirect, ProQuest, and PubMed databases. Fifteen articles from 2014 to 2019 were selected according to the inclusion criteria, and six of them were removed by the exclusion criteria. Results: The results of our systematic review suggest that tDCS may represent a promising novel treatment for post-stroke dysphagia. However, to date, little is known about the optimal parameters of tDCS for relieving post-stroke dysphagia. Further studies are warranted to refine this promising intervention by exploring the optimal parameters of tDCS. Conclusion: Anodal tDCS over the affected hemisphere may be as effective as cathodal tDCS over the unaffected hemisphere in enhancing recovery after subacute ischemic stroke, and anodal tDCS applied over the affected pharyngeal motor cortex can enhance the outcome of swallowing training in post-stroke dysphagia.

Keywords: dysphagia, stroke, cortical stimulation, transcranial direct current stimulation

Procedia PDF Downloads 135
25077 An Empirical Investigation of the Challenges of Secure Edge Computing Adoption in Organizations

Authors: Hailye Tekleselassie

Abstract:

Edge computing is a distributed computing paradigm that brings enterprise applications closer to data sources such as IoT devices or local edge servers, but potential security incidents could hinder the adoption of this new technology. This investigation examines the awareness of information technology and communications workers, and of computer users who support cloud services, regarding secure edge computing. Surveys were used to achieve these objectives, and questions about trust were a key component. Problems such as data privacy, integrity, and availability are the factors affecting organizations' acceptance of the cloud service.

Keywords: IoT, data, security, edge computing

Procedia PDF Downloads 83
25076 Multi Tier Data Collection and Estimation, Utilizing Queue Model in Wireless Sensor Networks

Authors: Amirhossein Mohajerzadeh, Abolghasem Mohajerzadeh

Abstract:

In this paper, a target parameter is estimated with the desired precision in hierarchical wireless sensor networks (WSNs), while the proposed algorithm also tries to prolong network lifetime as much as possible using an efficient data collection algorithm. The target parameter's distribution function is considered unknown. Sensor nodes sense the environment and send the data to the base station, called the fusion center (FC), using a hierarchical data collection algorithm. The FC reconstructs the underlying phenomenon from the collected data. Considering the aggregation level x, the goal is to provide the essential infrastructure to find the best value for the aggregation level in order to prolong network lifetime as much as possible while the desired accuracy is guaranteed (the required sample size depends entirely on the desired precision). First, the sample size calculation algorithm is discussed; second, the average queue length based on the M/M[x]/1/K queue model is determined and used for the energy consumption calculation. Nodes can decrease transmission cost by aggregating incoming data. Furthermore, the performance of the new algorithm is evaluated in terms of lifetime and estimation accuracy.
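A minimal sketch of the queueing step, using the standard single-arrival M/M/1/K mean-queue-length formula; the paper's bulk-arrival M/M[x]/1/K model generalizes this, and the rates below are hypothetical.

```python
def mm1k_avg_number_in_system(lam, mu, K):
    """Average number in an M/M/1/K system (single arrivals)."""
    rho = lam / mu
    if abs(rho - 1.0) < 1e-12:
        return K / 2.0   # limiting value when rho == 1
    num = rho * (1 - (K + 1) * rho**K + K * rho**(K + 1))
    return num / ((1 - rho) * (1 - rho**(K + 1)))

# Example: aggregating x readings per packet cuts the packet arrival rate,
# which in turn shrinks the average queue and the energy spent transmitting.
for x in (1, 2, 4):
    lam = 0.8 / x
    print(x, round(mm1k_avg_number_in_system(lam, mu=1.0, K=10), 3))
```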

Keywords: aggregation, estimation, queuing, wireless sensor network

Procedia PDF Downloads 186
25075 Research and Application of Consultative Committee for Space Data Systems Wireless Communications Standards for Spacecraft

Authors: Cuitao Zhang, Xiongwen He

Abstract:

According to the new requirements of the future spacecraft, such as networking, modularization and non-cable, this paper studies the CCSDS wireless communications standards, and focuses on the low data-rate wireless communications for spacecraft monitoring and control. The application fields and advantages of wireless communications are analyzed. Wireless communications technology has significant advantages in reducing the weight of the spacecraft, saving time in spacecraft integration, etc. Based on this technology, a scheme for spacecraft data system is put forward. The corresponding block diagram and key wireless interface design of the spacecraft data system are given. The design proposal of the wireless node and information flow of the spacecraft are also analyzed. The results show that the wireless communications scheme is reasonable and feasible. The wireless communications technology can meet the future spacecraft demands in networking, modularization and non-cable.

Keywords: Consultative Committee for Space Data Systems (CCSDS) standards, information flow, non-cable, spacecraft, wireless communications

Procedia PDF Downloads 329
25074 Inversion of Electrical Resistivity Data: A Review

Authors: Shrey Sharma, Gunjan Kumar Verma

Abstract:

High-density electrical prospecting has been widely used in groundwater investigation, civil engineering, and environmental surveying. For efficient inversion, the forward modeling routine, sensitivity calculation, and inversion algorithm must all be efficient. This paper attempts to provide a brief summary of the past and ongoing developments of the method. It includes reviews of the procedures used for the acquisition, processing, and inversion of electrical resistivity data, based on a compilation of the academic literature. In recent times there has been a significant evolution in field survey designs and data inversion techniques for the resistivity method. In general, 2-D inversion of resistivity data is carried out using the linearized least-squares method with a local optimization technique. Multi-electrode and multi-channel systems have made it possible to conduct large 2-D, 3-D, and even 4-D surveys efficiently to resolve complex geological structures that were not possible with traditional 1-D surveys. 3-D surveys play an increasingly important role in very complex areas where 2-D models suffer from artifacts due to off-line structures. Continued developments in computation technology, as well as fast data inversion techniques and software, have made it possible to use optimization techniques to obtain model parameters to a higher accuracy. A brief discussion of the limitations of the electrical resistivity method is also presented.
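A minimal sketch of the linearized least-squares update at the heart of such inversions, with Marquardt damping. The forward operator here is a made-up smooth function standing in for the real resistivity forward modeling routine; no regularization beyond damping is included.

```python
import numpy as np

def damped_least_squares(forward, jacobian, d_obs, m0, lam=0.1, iters=10):
    """Linearized least-squares inversion with Marquardt damping:
    m_{k+1} = m_k + (J^T J + lam*I)^{-1} J^T (d_obs - f(m_k))."""
    m = m0.copy()
    for _ in range(iters):
        J = jacobian(m)
        r = d_obs - forward(m)
        m = m + np.linalg.solve(J.T @ J + lam * np.eye(len(m)), J.T @ r)
    return m

# Tiny synthetic example: recover two model parameters from noisy data.
def forward(m):
    return np.array([m[0] + 0.3 * m[1], 0.3 * m[0] + m[1], m[0] * m[1]])

def jacobian(m):
    return np.array([[1.0, 0.3], [0.3, 1.0], [m[1], m[0]]])

m_true = np.array([2.0, 1.0])
d_obs = forward(m_true) + np.random.default_rng(0).normal(0, 0.01, 3)
print(damped_least_squares(forward, jacobian, d_obs, m0=np.array([1.0, 1.0])))
```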

Keywords: inversion, limitations, optimization, resistivity

Procedia PDF Downloads 365
25073 Analysis of Lesotho Wool Production and Quality Trends 2008-2018

Authors: Papali Maqalika

Abstract:

Lesotho farmers produce significant quantities of Merino wool of a quality competitive on the global market, making a substantial impact on the economy of Lesotho. However, even with this economic contribution, the production and quality information and trends of this fibre have been neither recognised nor documented. This is a serious shortcoming, as Lesotho wool is unknown on international markets. The situation is worsened by the fact that Lesotho wool is auctioned together with South African wool, making it difficult to trade and benchmark Lesotho wool, let alone to advance its production and quality. Accordingly, available data on Lesotho wool covering 10 years were collected and analysed for trends to be used in benchmarking where applicable. The fibre properties analysed include fibre diameter (fineness), vegetable matter, yield, application, and price. These were selected because they are fundamental in determining fibre quality and price. Production of wool in Lesotho has increased slightly over the ten years covered by this study. It also became apparent that the production and quality trends of Lesotho wool are greatly influenced by farming practices, sheep breed, and climatic conditions. Greater adoption of the Merino breed, of sheds/barns, and of sheep coats is suggested to reduce the mortality rate (due to extremely cold temperatures), to reduce the vegetable matter in the fibre (thus improving quality), and to increase yield per sheep and production as a whole. Some farming practices, such as the lack of barns, supplementary feeding, and veterinary care, constrain wool production. The districts in the Highlands region were found to have the highest production, mostly of wool, ascribed to better pastures and to climatic, social, and other conditions conducive to wool production. The production and quality of Lesotho wool can be improved further, possibly through the interventions the Ministry of Agriculture introduced through the Small Agricultural and Development Project (SADP) and other appropriate initiatives by the National Wool and Mohair Growers Association (NWMGA). The challenge, however, remains the lack of direct involvement of the wool growers (farmers) in decision making and policy development; this may lead to reluctance to adopt the strategies. In some cases, the wool growers do not receive the benefits associated with the interventions immediately. Based on these findings, it is recommended that the relevant educators and researchers in wool and textile science, as well as the local wool farmers in Lesotho, be represented in policy and other decision-making forums relating to these interventions. In this way, educational campaigns and training workshops will be demand-driven, with a better chance of adoption and success, because the direct beneficiaries will have been involved from inception and will have a sense of ownership as well as the intent to see the interventions through successfully.

Keywords: lesotho wool, wool quality, wool production, lesotho economy, global market, apparel wool, database, textile science, exports, animal farming practices, intimate apparel, interventions

Procedia PDF Downloads 91
25072 Exploring the Correlation between Population Distribution and Urban Heat Island under Urban Data: Taking Shenzhen Urban Heat Island as an Example

Authors: Wang Yang

Abstract:

Shenzhen is a modern city born of China's reform and opening-up policy, and the development of its urban morphology has been shaped by government administration. The city's planning paradigm is primarily affected by its spatial structure and human behavior, with the urban agglomeration divided into several groups and centers; under this paradigm, the city's intrinsic laws of development tend to be neglected. With the continuous development of the Internet, big data technology has been introduced in China, and data mining and data analysis have become important tools in municipal research. Data mining has been used to improve the collection and cleaning of data such as business, traffic, and population data. Before data mining, government data were collected by traditional means and then analyzed through city-relationship research, delaying the timeliness of urban analysis; this is a problem for the contemporary city, where data update speed is very fast and Internet-based. The city's points of interest (POIs) mined from the web serve as a data source affecting city design, while satellite remote sensing is used as a reference object; city analysis is thus conducted in both directions, moving beyond the purely administrative paradigm of government and grounding urban research in observed data. Therefore, the use of data mining in urban analysis is very important. Satellite remote sensing data for Shenzhen from July 2018, acquired by the MODIS sensor, were used to perform land surface temperature inversion and analyze the distribution of Shenzhen's urban heat island. This article acquired and classified POI data for Shenzhen using web crawler technology. The Shenzhen heat island and POI data were overlaid and analyzed on a GIS platform to discover the main features of the influence of functional-area distribution. Shenzhen extends in an east-west direction, and the city's main streets follow the direction of city development; accordingly, the functional areas of the city are also distributed in the east-west direction. The urban heat island map corresponds to the functional urban areas, and regional POIs show the same correspondence. The results clearly show that the distribution of the urban heat island and the distribution of urban POIs are in one-to-one correspondence. The urban heat island is primarily influenced by the properties of the underlying surface, setting aside the impact of urban climate. Taking urban POIs as the object of analysis, the distribution of municipal POIs and population aggregation are closely connected, so that the distribution of the population corresponds with the distribution of the urban heat island.

Keywords: POI, satellite remote sensing, the population distribution, urban heat island thermal map

Procedia PDF Downloads 104
25071 A Proposal of Ontology about Brazilian Government Transparency Portal

Authors: Estela Mayra de Moura Vianna, Thiago José Tavares Ávila, Bruno Morais Silva, Diego Henrique Bezerra, Paulo Henrique Gomes Silva, Alan Pedro da Silva

Abstract:

The Brazilian Federal Constitution defines access to information as a crucial right of the citizen, a right regulated by the Law on Access to Public Information. Accordingly, the Fiscal Responsibility Act of 2000, amended in 2009 by the 'Law of Transparency', began demanding wider disclosure of public accounts to society, including electronic media for public access. Thus, public entities began to create 'transparency portals', which aim to gather a diversity of data and information. However, this information is generally still published in formats that do not make the data easy for citizens to understand and that could be made more readily available, especially for audit purposes. In this context, a proposed ontology for the Brazilian government transparency portal can play a key role in making these data more accessible. This study aims to identify the data model of the transparency portal ecosystem and implement it as an ontology, with emphasis on activities that use these data for applications such as audits, press activities, and social control of government.
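A minimal rdflib sketch of what such an ontology fragment could look like. The namespace, class names, and the expense instance are hypothetical; the actual vocabulary would come from the data model identified in the transparency portal ecosystem.

```python
from rdflib import Graph, Literal, Namespace, RDF, RDFS

TP = Namespace("http://example.org/transparency#")  # hypothetical namespace
g = Graph()
g.bind("tp", TP)

# Schema: expenses are paid by agencies.
g.add((TP.Expense, RDF.type, RDFS.Class))
g.add((TP.Agency, RDF.type, RDFS.Class))
g.add((TP.paidBy, RDF.type, RDF.Property))
g.add((TP.paidBy, RDFS.domain, TP.Expense))
g.add((TP.paidBy, RDFS.range, TP.Agency))

# An instance, as an auditor's query would see it.
g.add((TP.expense001, RDF.type, TP.Expense))
g.add((TP.expense001, TP.paidBy, TP.ministryOfHealth))
g.add((TP.expense001, TP.amountBRL, Literal(15000.00)))

print(g.serialize(format="turtle"))
```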

Keywords: audit, government transparency, ontology, public sector

Procedia PDF Downloads 506