Search results for: multivariate failure-time data
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 25437

24087 Comparison of Data Reduction Algorithms for Image-Based Point Cloud Derived Digital Terrain Models

Authors: M. Uysal, M. Yilmaz, I. Tiryakioğlu

Abstract:

A Digital Terrain Model (DTM) is a digital numerical representation of the Earth's surface. DTMs have been applied to a diverse range of tasks, such as urban planning, military applications, glacier mapping, and disaster management. Expressing the Earth's surface as a mathematical model would require an infinite number of point measurements. Since this is impossible, points are measured at regular intervals to characterize the surface, and a DTM of the Earth is generated from them. Classical surveying techniques and photogrammetry have long been the most widespread methods for DTM construction. At present, RADAR, LiDAR, and stereo satellite images are also used. In recent years, Airborne Light Detection and Ranging (LiDAR), which creates a 3D point cloud by acquiring a large number of point measurements, has seen increased use in DTM applications because of its advantages. More recently, developments in image-matching methods and the use of unmanned aerial vehicles (UAVs) for photogrammetric data acquisition have increased DTM generation from image-based point clouds. The accuracy of a DTM depends on factors such as the data collection method, the distribution of elevation points, the point density, the properties of the surface, and the interpolation method. In this study, a random data reduction method is evaluated for DTMs generated from image-based point cloud data. The original image-based point cloud data set (100%) is reduced to a series of subsets using a random algorithm, representing 75, 50, 25, and 5% of the original data set. Over the ANS campus of Afyon Kocatepe University as the test area, the DTM constructed from the original image-based point cloud data set is compared with DTMs interpolated from the reduced data sets by the Kriging interpolation method. The results show that the random data reduction method can reduce image-based point cloud datasets to the 50% density level while still maintaining DTM quality.
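The random reduction step described in the abstract can be sketched as follows. This is a minimal illustration only: the synthetic point coordinates and the fixed seed are assumptions for reproducibility, not the authors' actual dataset or implementation.

```python
import random

def reduce_point_cloud(points, fraction, seed=42):
    """Randomly retain `fraction` of a point cloud (list of (x, y, z) tuples)."""
    k = round(len(points) * fraction)
    rng = random.Random(seed)  # fixed seed so the subset is reproducible
    return rng.sample(points, k)

# Synthetic stand-in for an image-based point cloud
cloud = [(float(i), float(i % 7), float(i % 13)) for i in range(1000)]

# Subsets at the density levels used in the study: 75, 50, 25 and 5%
subsets = {pct: reduce_point_cloud(cloud, pct / 100) for pct in (75, 50, 25, 5)}

for pct in sorted(subsets):
    print(pct, len(subsets[pct]))
```

Each subset would then be interpolated (e.g. by Kriging) and the resulting DTM compared against the full-density reference.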

Keywords: DTM, Unmanned Aerial Vehicle (UAV), uniform, random, kriging

Procedia PDF Downloads 156
24086 Exploring Influence Range of Tainan City Using Electronic Toll Collection Big Data

Authors: Chen Chou, Feng-Tyan Lin

Abstract:

Big Data has attracted a lot of attention in many fields for analyzing research issues based on large volumes of source data. Electronic Toll Collection (ETC) is one of the Intelligent Transportation System (ITS) applications in Taiwan, used to record the starting point, end point, distance, and travel time of vehicles on the national freeway. Taking advantage of ETC big data and combining it with urban planning theory, this study attempts to explore various phenomena of inter-city transportation activities. ETC data, part of the government's open data, are voluminous, complete, and frequently updated. Traditionally, living areas have been delimited by location, population, area, and subjective consciousness; however, these factors cannot appropriately reflect people's daily movement paths. In this study, the concept of "living area" is replaced by "influence range" to capture the dynamics and variation of activities with time and purpose. The study uses data mining with Python and Excel, and visualizes trip counts with GIS, to explore the influence range of Tainan City and the purposes of trips, and to discuss how living areas are currently delimited. It creates a dialogue between the concepts of "Central Place Theory" and "living area", presents a new point of view, and integrates the application of big data with urban planning and transportation. The findings will be valuable for resource allocation and land apportionment in spatial planning.

Keywords: Big Data, ITS, influence range, living area, central place theory, visualization

Procedia PDF Downloads 279
24085 Performance Analysis of Hierarchical Agglomerative Clustering in a Wireless Sensor Network Using Quantitative Data

Authors: Tapan Jain, Davender Singh Saini

Abstract:

Clustering is a useful mechanism in wireless sensor networks that helps to cope with scalability and data transmission problems. The basic aim of our research is to provide efficient clustering using hierarchical agglomerative clustering (HAC). When the distance between sensing nodes is calculated from their locations, the clustering is termed quantitative HAC. This paper compares various agglomerative clustering techniques applied in a wireless sensor network using quantitative data. The simulations are done in MATLAB, and the protocols are compared using dendrograms.
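The quantitative HAC idea can be sketched as below: nodes are clustered purely by the Euclidean distance between their locations. This is a minimal single-linkage sketch with hypothetical node coordinates, not the MATLAB protocols compared in the paper.

```python
import math

def quantitative_hac(nodes, n_clusters):
    """Single-linkage agglomerative clustering of sensor nodes by location.
    `nodes` is a list of (x, y) coordinates; returns clusters of node indices."""
    clusters = [[i] for i in range(len(nodes))]

    def dist(a, b):  # Euclidean distance between two node locations
        return math.hypot(nodes[a][0] - nodes[b][0], nodes[a][1] - nodes[b][1])

    while len(clusters) > n_clusters:
        # Find the pair of clusters with the smallest single-linkage distance
        best = None
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                d = min(dist(a, b) for a in clusters[i] for b in clusters[j])
                if best is None or d < best[0]:
                    best = (d, i, j)
        _, i, j = best
        clusters[i] += clusters.pop(j)  # merge the closest pair
    return clusters

# Toy WSN: two spatial groups of sensing nodes
positions = [(0, 0), (1, 0), (0, 1), (10, 10), (11, 10), (10, 11)]
print(quantitative_hac(positions, 2))
```

Recording the merge order and distances of this loop is exactly the information a dendrogram displays.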

Keywords: routing, hierarchical clustering, agglomerative, quantitative, wireless sensor network

Procedia PDF Downloads 615
24084 A Novel Hybrid Deep Learning Architecture for Predicting Acute Kidney Injury Using Patient Record Data and Ultrasound Kidney Images

Authors: Sophia Shi

Abstract:

Acute kidney injury (AKI) is the sudden onset of kidney damage in which the kidneys cannot filter waste from the blood, requiring emergency hospitalization. The mortality rate of AKI patients in the ICU is high, and onset is virtually impossible for doctors to predict because it is so unexpected. Currently, there is no hybrid model for predicting AKI that takes advantage of both types of data. De-identified patient data from the MIMIC-III database, along with de-identified kidney images and corresponding patient records from the Beijing Hospital of the Ministry of Health, were collected. Using data features including serum creatinine, two numeric models were built from the MIMIC and Beijing Hospital data, and an image-only model was built from the hospital ultrasounds. Convolutional neural networks (CNNs) were used: VGG and ResNet for the numeric data and ResNet for the image data. They were combined into a hybrid model by concatenating the feature maps of both types of models to create a new input, which enters another CNN block and then two fully connected layers, ending in a binary output after a softmax layer. The hybrid model successfully predicted AKI, achieving a best AUROC of 0.953, an accuracy of 90%, and an F1-score of 0.91. This model can be implemented in urgent clinical settings such as the ICU to aid doctors by assessing the risk of AKI shortly after a patient's admission, so that doctors can take preventative measures and diminish the risks of mortality and severe kidney damage.
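The fusion step (concatenate the branch feature maps, pass them through a fully connected layer, and apply softmax for the binary output) can be illustrated with a plain-Python toy. The feature values and random weights here are purely hypothetical; a real implementation would use trained CNN branches.

```python
import math, random

def softmax(z):
    m = max(z)  # subtract the max for numerical stability
    exps = [math.exp(v - m) for v in z]
    s = sum(exps)
    return [e / s for e in exps]

def fuse_and_classify(numeric_features, image_features, weights, biases):
    """Concatenate the numeric-branch and image-branch feature vectors,
    apply one fully connected layer, and softmax into two class scores."""
    fused = numeric_features + image_features  # feature-level concatenation
    logits = [sum(w * x for w, x in zip(row, fused)) + b
              for row, b in zip(weights, biases)]
    return softmax(logits)

# Illustrative branch outputs (not real model features)
numeric_branch = [0.2, 0.7, 0.1]  # e.g. derived from serum creatinine etc.
image_branch = [0.5, 0.9]         # e.g. from the ultrasound CNN

rng = random.Random(0)
W = [[rng.uniform(-1, 1) for _ in range(5)] for _ in range(2)]  # 2 classes x 5 fused features
b = [0.0, 0.0]

probs = fuse_and_classify(numeric_branch, image_branch, W, b)
print(probs)  # [P(no AKI), P(AKI)] for this toy parameterization
```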

Keywords: acute kidney injury, convolutional neural network, hybrid deep learning, patient record data, ResNet, ultrasound kidney images, VGG

Procedia PDF Downloads 132
24083 Development of a Numerical Model to Predict Wear in Grouted Connections for Offshore Wind Turbine Generators

Authors: Paul Dallyn, Ashraf El-Hamalawi, Alessandro Palmeri, Bob Knight

Abstract:

In order to better understand the long-term implications of the grout wear failure mode in large-diameter plain-sided grouted connections, a numerical model has been developed and calibrated that can take advantage of existing operational plant data to predict wear accumulation for the actual load conditions experienced over a given period, thus limiting the need for expensive monitoring systems. The model has been derived and calibrated from site structural condition monitoring (SCM) data and supervisory control and data acquisition (SCADA) data for two operational wind turbine generator substructures afflicted by this failure mode, along with experimentally derived wear rates.

Keywords: grouted connection, numerical model, offshore structure, wear, wind energy

Procedia PDF Downloads 454
24082 Multimodal Deep Learning for Human Activity Recognition

Authors: Ons Slimene, Aroua Taamallah, Maha Khemaja

Abstract:

In recent years, human activity recognition (HAR) has been a key area of research due to its diverse applications, and it has garnered increasing attention in the field of computer vision. HAR plays an important role in people's daily lives through its ability to learn advanced knowledge about human activities from data. In HAR, activities are usually represented by exploiting different types of sensors, such as embedded sensors or visual sensors. However, these sensors have limitations, such as local obstacles, image-related obstacles, sensor unreliability, and consumer concerns. Recently, several deep learning-based approaches have been proposed for HAR; these approaches fall into two categories based on the type of data used: vision-based and sensor-based. This paper highlights the importance of multimodal fusion of skeleton data obtained from videos and data generated by embedded sensors, using deep neural networks, for achieving HAR. We propose a deep multimodal fusion network based on a two-stream architecture. Each stream uses a Convolutional Neural Network combined with a Bidirectional LSTM (CNN-BiLSTM) to process skeleton data or data generated by embedded sensors, and fusion is performed at the feature level. The proposed model was evaluated on the public OPPORTUNITY++ dataset and produced an accuracy of 96.77%.

Keywords: human activity recognition, action recognition, sensors, vision, human-centric sensing, deep learning, context-awareness

Procedia PDF Downloads 101
24081 Impact of Foreign Trade on Economic Growth: A Panel Data Analysis for OECD Countries

Authors: Burcu Guvenek, Duygu Baysal Kurt

Abstract:

The impact of foreign trade on economic growth has been discussed since the Classical economists. Today, with increasing globalization, foreign trade has become more important for national economies. Foreign trade policies, ranging from protectionism to free trade, vary from country to country and from time to time. The positive effect of foreign trade on economic growth is generally asserted; however, alongside studies supporting this view, the economics literature also contains studies reaching the opposite conclusion. In this paper, the impact of foreign trade on economic growth is investigated with the help of panel data analysis, using GDP and foreign trade data for 24 OECD countries covering the period 1990–2010.

Keywords: foreign trade, economic growth, OECD countries, panel data analysis

Procedia PDF Downloads 386
24080 Data-Driven Decision Making: A Reference Model for Organizational, Educational and Competency-Based Learning Systems

Authors: Emanuel Koseos

Abstract:

Data-Driven Decision Making (DDDM) refers to making decisions based on historical data in order to inform practice, develop strategies, and implement policies that benefit organizational settings. In educational technology, DDDM facilitates the implementation of differentiated educational learning approaches such as Educational Data Mining (EDM) and Competency-Based Education (CBE), which commonly target university classrooms. There is a current need for DDDM models applied to middle and secondary schools, arising from concern for assessing the needs, progress, and performance of students and educators with respect to regional standards, policies, and the evolution of curriculums. To address these concerns, we propose a DDDM reference model that uses educational key process initiatives as inputs to a machine learning framework implemented with statistical software (SAS, R), providing a best-practices, low-complexity, and automated approach for educators at the regional level. We assessed the efficiency of the model over a six-year period using data from 45 schools, grades K-12, in the Langley, BC, Canada regional school district. We concluded that the model has wider applicability, for example to business learning systems.

Keywords: competency-based learning, data-driven decision making, machine learning, secondary schools

Procedia PDF Downloads 174
24079 Data about Loggerhead Sea Turtle (Caretta caretta) and Green Turtle (Chelonia mydas) in Vlora Bay, Albania

Authors: Enerit Sacdanaku, Idriz Haxhiu

Abstract:

This study was conducted in the area of Vlora Bay, Albania. Data on the sea turtles Caretta caretta and Chelonia mydas from two time periods (1984–1991; 2008–2014) are given. All data gathered were analyzed using recent methodologies. For all turtles captured (as bycatch), the Curved Carapace Length (CCL) and Curved Carapace Width (CCW) were measured. These data were statistically analyzed; the mean was 67.11 cm for CCL and 57.57 cm for CCW across all individuals studied (n=13). All untagged marine turtles were tagged using metallic tags (Stockbrand's titanium tags) with an Albanian address. Sex determination showed that 45.4% of individuals were female, 27.3% male, and 27.3% juvenile. All turtles were examined for the presence of epibionts. The area of Vlora Bay is used by marine turtles (Caretta caretta) as a migratory corridor for passing from the Mediterranean to the northern part of the Adriatic Sea.

Keywords: Caretta caretta, Chelonia mydas, CCL, CCW, tagging, Vlora Bay

Procedia PDF Downloads 179
24078 Computational Modeling of Load Limits of Carbon Fibre Composite Laminates Subjected to Low-Velocity Impact Utilizing Convolution-Based Fast Fourier Data Filtering Algorithms

Authors: Farhat Imtiaz, Umar Farooq

Abstract:

In this work, we developed a computational model to predict ply-level failure in impacted composite laminates. Data obtained from physical testing of flat- and round-nose impacts on 8-, 16-, and 24-ply laminates were considered. Routine inspections of the tested laminates were carried out to approximate the damage inflicted ply by ply. Plots of load–time, load–deflection, and energy–time history were drawn to approximate the inflicted damage. Unwanted data logged during impact testing, due to restrictions of the testing and logging systems, were also filtered. Conventional filters (built-in, statistical, and numerical) reliably predicted load thresholds for relatively thin laminates such as the eight- and sixteen-ply panels. However, relatively thick laminates, such as the twenty-four-ply laminates impacted by the flat nose, generated clipped data that can only be de-noised using oscillatory algorithms. A literature search reveals that modern oscillatory data filtering and extrapolation algorithms have scarcely been utilized. This investigation reports applications of filtering and extrapolation of the clipped data utilizing a fast Fourier convolution algorithm to predict load thresholds. Some of the results were related to the impact-induced damage areas identified with ultrasonic C-scans and found to be in acceptable agreement. Based on these consistent findings, applying modern data filtering and extrapolation algorithms to data logged by existing machines efficiently enhanced data interpretation without requiring extra resources. The algorithms could be useful for approximating impact-induced damage in similar cases.
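The convolution-based de-noising idea can be illustrated with a direct moving-average convolution over a synthetic load trace. This is a simplified stand-in: the actual work convolves via the fast Fourier transform, and the signal values below are invented, not test data.

```python
def moving_average_filter(signal, window=5):
    """De-noise a load-time trace by convolving it with a uniform kernel
    (a direct-convolution stand-in for FFT-based convolution filtering)."""
    half = window // 2
    # Pad the edges by repeating the end samples so output length matches input
    padded = [signal[0]] * half + list(signal) + [signal[-1]] * half
    return [sum(padded[i:i + window]) / window for i in range(len(signal))]

# Synthetic load history with logging noise and a clipped (saturated) peak
raw = [0, 2, 4, 9, 3, 8, 10, 10, 10, 10, 8, 5, 6, 2, 1, 0]
smooth = moving_average_filter(raw, window=5)
print([round(v, 2) for v in smooth])
```

For long traces, the same kernel would be applied via FFT convolution, which reduces the cost from O(n·w) to O(n log n).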

Keywords: fibre reinforced laminates, fast Fourier algorithms, mechanical testing, data filtering and extrapolation

Procedia PDF Downloads 135
24077 Return to Bowel Function after Right versus Extended Right Hemicolectomy: A Retrospective Review

Authors: Zak Maas, Daniel Carson, Rachel McIntyre, Mark Omundsen, Teresa Holm

Abstract:

Aim: After hemicolectomy, a period of obligatory bowel dysfunction, termed postoperative ileus (POI), is expected. Prolonged postoperative ileus (PPOI), typically four or more days, is associated with higher morbidity and extended inpatient stay, leading to significant financial and resource-related burdens on healthcare systems. Several studies, including a meta-analysis, have compared rates of PPOI in left vs right hemicolectomy and suggest that right-sided resections may be more likely to result in PPOI. Our study aims to further investigate whether significant differences in PPOI and obligatory POI exist between right and extended right hemicolectomy. Methods: This is a retrospective review assessing rates of PPOI in patients who underwent right vs extended right hemicolectomy at Tauranga Hospital. Patients were divided and compared by approach (open versus laparoscopic) and acuity (acute versus elective). Exclusion criteria included synchronous major operations and patients on preoperative parenteral nutrition. The primary outcome was PPOI as pre-defined in the contemporary literature. Secondary outcomes were time to passage of flatus, passage of stool, and toleration of oral diet, and the rate of complications. Results: There were 669 patients identified for analysis (507 laparoscopic vs 162 open; 194 acute vs 475 elective). Early analysis indicates that the rate of PPOI was significantly increased in patients undergoing extended right hemicolectomy. Factors including age, gender, ethnicity, preoperative haemoglobin, preoperative albumin, and diagnosis of inflammatory bowel disease were examined by multivariate analysis to determine correlation with PPOI. Conclusion: PPOI is a common complication of hemicolectomy surgery. The higher rate of PPOI in extended right vs right hemicolectomy warrants further research into its cause. This study examines other factors that may contribute to PPOI.

Keywords: hemicolectomy, colorectal, complications, postoperative ileus

Procedia PDF Downloads 88
24076 Design of Incident Information System in IoT Virtualization Platform

Authors: Amon Olimov, Umarov Jamshid, Dae-Ho Kim, Chol-U Lee, Ryum-Duck Oh

Abstract:

This paper proposes an incident information system based on an IoT virtualization platform. The platform was developed to collect a variety of data by managing regionally scattered IoT devices easily and conveniently, in addition to analyzing data collected from roads, and it is configured here to provide incident information based on sensed data. It provides the same input/output interface as UNIX and Linux by mapping IoT devices to file-system directories and files, and it supports a variety of access methods for the devices. It can therefore be applied not only to incident information but also to other platforms. This paper proposes an incident information system that identifies and provides various data in real time on urgent matters on roads, based on the existing USN/M2M and IoT virtualization platform.

Keywords: incident information system, IoT, virtualization platform, USN, M2M

Procedia PDF Downloads 351
24075 Social Network Analysis as a Research and Pedagogy Tool in Problem-Focused Undergraduate Social Innovation Courses

Authors: Sean McCarthy, Patrice M. Ludwig, Will Watson

Abstract:

This exploratory case study examines the deployment of Social Network Analysis (SNA) to map community assets in an interdisciplinary, undergraduate, team-taught course focused on income-insecure populations in a rural area of the US. Specifically, it analyzes how students were taught to collect data on community assets and to visualize the connections between those assets using Kumu, an SNA data visualization tool. Further, the case study shows how social network data were also collected about student teams via their written communications in Slack, an enterprise messaging tool, which enabled instructors to manage and guide student research activity throughout the semester. The discussion presents how SNA methods can simultaneously inform both community-based research and social innovation pedagogy through the use of data visualization and collaboration-focused communication technologies.
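A basic SNA measure of the kind such asset maps support, degree centrality, can be computed directly from an edge list. The asset names and ties below are hypothetical, not the study's actual Kumu data.

```python
from collections import defaultdict

def degree_centrality(edges):
    """Normalized degree centrality for an undirected asset network.
    `edges` is a list of (node, node) pairs; returns {node: centrality}."""
    adj = defaultdict(set)
    for a, b in edges:
        adj[a].add(b)
        adj[b].add(a)
    n = len(adj)
    # Degree divided by the maximum possible degree (n - 1)
    return {node: len(neigh) / (n - 1) for node, neigh in adj.items()}

# Hypothetical community-asset ties of the kind students might map
ties = [
    ("food_bank", "church"),
    ("food_bank", "school"),
    ("school", "clinic"),
    ("church", "clinic"),
    ("food_bank", "clinic"),
]
centrality = degree_centrality(ties)
print(sorted(centrality.items(), key=lambda kv: -kv[1]))
```

High-centrality assets are the hubs a visualization tool like Kumu would surface as focal points of the community network.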

Keywords: social innovation, social network analysis, pedagogy, problem-based learning, data visualization, information communication technologies

Procedia PDF Downloads 147
24074 Mobile Learning: Toward Better Understanding of Compression Techniques

Authors: Farouk Lawan Gambo

Abstract:

Data compression shrinks files into fewer bits than their original representation. It is particularly advantageous on the internet because the smaller a file, the faster it can be transferred. However, most concepts in data compression are abstract in nature, making them difficult for some students (engineering students in particular) to digest. To determine the best approach to learning data compression techniques, this paper first studies the learning preferences of engineering students, who tend to have strong active, sensing, visual, and sequential learning preferences, and then examines the advantages of mobile learning: learning at the point of interest, efficiency, connection, and more. A survey of a reasonable number of students, selected through random sampling, was carried out to see whether accounting for learning preferences and the advantages of mobile learning gives a promising improvement over the traditional way of learning. Data analysis in MS Excel shows a significant difference in students after using learning content provided on smartphones, and the findings, presented in bar charts and pie charts, indicate that mobile learning is a promising mode of learning.

Keywords: data analysis, compression techniques, learning content, traditional learning approach

Procedia PDF Downloads 347
24073 Human Immunodeficiency Virus (HIV) Test Predictive Modeling and Identify Determinants of HIV Testing for People with Age above Fourteen Years in Ethiopia Using Data Mining Techniques: EDHS 2011

Authors: S. Abera, T. Gidey, W. Terefe

Abstract:

Introduction: Testing for HIV is the key entry point to HIV prevention, treatment, and care and support services. Predictive data mining techniques can therefore greatly help analyze and discover new patterns in huge datasets such as the EDHS 2011 data. Objectives: The objective of this study is to build a predictive model for HIV testing and identify determinants of HIV testing for adults above fourteen years of age using data mining techniques. Methods: The Cross-Industry Standard Process for Data Mining (CRISP-DM) was used to build the predictive model for HIV testing and to explore association rules between HIV testing and the selected attributes among adult Ethiopians. Decision tree, naïve Bayes, logistic regression, and artificial neural network data mining techniques were used to build the predictive models. Results: The target dataset contained 30,625 study participants, of whom 16,515 (53.9%) were women. Nearly three-fifths, 17,719 (58%), had never been tested for HIV, while the remaining 12,906 (42%) had been tested. Ethiopians with a higher wealth index, a higher educational level, age between 20 and 29 years, no stigmatizing attitude towards HIV-positive persons, urban residence, HIV-related knowledge, exposure to family planning information in the mass media, and knowledge of a place to get tested for HIV showed increased patterns of HIV testing. Conclusion and Recommendation: Public health interventions should consider the identified determinants to encourage people to get tested for HIV.
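One of the techniques named above, naïve Bayes, can be sketched from scratch for categorical survey attributes. The two attributes, their values, and the six toy records are invented for illustration; they are not EDHS 2011 data.

```python
from collections import Counter, defaultdict

def train_naive_bayes(rows, labels):
    """Fit a categorical naive Bayes model; rows are attribute tuples and
    labels are class values (e.g. tested / not tested)."""
    class_counts = Counter(labels)
    cond = defaultdict(Counter)  # (attribute index, class) -> value counts
    for row, lab in zip(rows, labels):
        for i, v in enumerate(row):
            cond[(i, lab)][v] += 1
    # Distinct values per attribute, needed for Laplace smoothing
    n_vals = [len({r[i] for r in rows}) for i in range(len(rows[0]))]
    return class_counts, cond, n_vals

def predict(model, row):
    class_counts, cond, n_vals = model
    total = sum(class_counts.values())
    best, best_p = None, -1.0
    for cls, cy in class_counts.items():
        p = cy / total  # class prior
        for i, v in enumerate(row):
            p *= (cond[(i, cls)][v] + 1) / (cy + n_vals[i])  # Laplace-smoothed
        if p > best_p:
            best, best_p = cls, p
    return best

# Hypothetical toy records: (residence, education) -> HIV-testing status
X = [("urban", "higher"), ("urban", "higher"), ("rural", "none"),
     ("rural", "none"), ("urban", "primary"), ("rural", "primary")]
y = ["tested", "tested", "not", "not", "tested", "not"]

model = train_naive_bayes(X, y)
print(predict(model, ("urban", "higher")))
```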

Keywords: data mining, HIV, testing, Ethiopia

Procedia PDF Downloads 497
24072 Assessing Flood Risk and Mapping Inundation Zones in the Kelantan River Basin: A Hydrodynamic Modeling Approach

Authors: Fatemehsadat Mortazavizadeh, Amin Dehghani, Majid Mirzaei, Nurulhuda Binti Mohammad Ramli, Adnan Dehghani

Abstract:

Flooding is Malaysia's most common and serious natural disaster. The Kelantan River Basin is a tropical basin that experiences a rainy season during the North-East Monsoon from November to March, and it is one of the hardest-hit areas in Peninsular Malaysia during heavy monsoon rainfall. Considering the consequences of flood events, it is essential to develop flood inundation maps as part of the mitigation approach. In this study, the flood inundation zone in the Kelantan River Basin is delineated with a hydrodynamic model using HEC-RAS, QGIS, and ArcMap. Streamflow data were generated with a weather generator based on observational data and then statistically analyzed with the Extreme Value Type 1 (EV1) method for 2-, 5-, 25-, 50-, and 100-year return periods. The minimum depth, maximum depth, mean depth, and standard deviation of all scenarios, including the observed (OBS) data, were examined and analyzed. The results show that, in general, values increase with the return period for all scenarios; however, in certain scenarios not all values increase with the return period. The OBS data fell in the middle of the range spanned by Scenarios 1 to 40.
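The EV1 (Gumbel) return-period calculation can be sketched with a method-of-moments fit. The annual maxima below are hypothetical placeholders, not the Kelantan record, and the paper does not state which fitting method it used.

```python
import math
from statistics import mean, stdev

def ev1_quantile(annual_maxima, return_period):
    """Estimate the flood magnitude for a given return period using the
    Extreme Value Type 1 (Gumbel) distribution, fitted by the method of
    moments: alpha = sqrt(6)*s/pi, u = mean - 0.5772*alpha."""
    xbar, s = mean(annual_maxima), stdev(annual_maxima)
    alpha = math.sqrt(6) * s / math.pi
    u = xbar - 0.5772 * alpha
    p = 1 - 1 / return_period  # non-exceedance probability
    return u - alpha * math.log(-math.log(p))

# Hypothetical annual maximum streamflows (m^3/s)
peaks = [2100, 2500, 1800, 3200, 2700, 2300, 2900, 2000, 2600, 2400]

for T in (2, 5, 25, 50, 100):
    print(T, round(ev1_quantile(peaks, T), 1))
```

The design flows grow with the return period, matching the general trend reported in the abstract.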

Keywords: flood inundation, Kelantan River Basin, hydrodynamic model, extreme value analysis

Procedia PDF Downloads 70
24071 The Use of Network Tool for Brain Signal Data Analysis: A Case Study with Blind and Sighted Individuals

Authors: Cleiton Pons Ferreira, Diana Francisca Adamatti

Abstract:

Advancements in computer technology have made it possible to obtain information for research in biology and neuroscience. Networks have long been used to represent important biological processes, and the use of these tools has shifted from purely illustrative and didactic to more analytic, including interaction analysis and hypothesis formulation. Many studies have involved this application, but not directly for interpreting data obtained from brain function, calling for new development perspectives in neuroinformatics that use models and tools already disseminated by bioinformatics. This study includes an analysis of neurological data from electroencephalogram (EEG) signals using Cytoscape, an open-source software tool for visualizing complex networks in biological databases. The data were obtained from a comparative case study at the University of Rio Grande (FURG), using EEG signals from a 32-electrode Brain-Computer Interface (BCI) recorded from a blind individual and a sighted individual during the execution of an activity that stimulated spatial ability. The study intends to present results that lead to better ways of using and adapting techniques for the treatment of brain-signal data, in order to elevate understanding and learning in neuroscience.

Keywords: neuroinformatics, bioinformatics, network tools, brain mapping

Procedia PDF Downloads 182
24070 Analysis of the Impact of Climate Change on Maize (Zea Mays) Yield in Central Ethiopia

Authors: Takele Nemomsa, Girma Mamo, Tesfaye Balemi

Abstract:

Climate change refers to a change in the state of the climate that can be identified (e.g., using statistical tests) by changes in the mean and/or variance of its properties and that persists for an extended period, typically decades or longer. In Ethiopia, maize production in relation to climate change at regional and sub-regional scales has not been studied in detail. Thus, this study aimed to analyse the impact of climate change on maize yield in Ambo District, Central Ethiopia. To this end, weather data, soil data, and maize experimental data for the Arganne hybrid were used. APSIM software was used to investigate the response of maize (Zea mays) yield to different agronomic management practices using current and future (2020s–2080s) climate data. Climate change projection data downscaled using SDSM served as the climate input for the impact analysis. Compared to agronomic practices, the impact of climate change on Arganne in Central Ethiopia is small. However, in the Ambo area the yield of the Arganne hybrid is projected to decrease by 1.06% to 2.02% in the 2020s and by 1.56% in the 2050s, while in the 2080s it is projected to increase by 1.03% to 2.07%. Thus, to adapt to the changing climate, farmers should consider increasing plant density and fertilizer rate per hectare.

Keywords: APSIM, downscaling, response, SDSM

Procedia PDF Downloads 383
24069 Aerodynamic Modeling Using Flight Data at High Angle of Attack

Authors: Rakesh Kumar, A. K. Ghosh

Abstract:

The paper presents the modeling of linear and nonlinear longitudinal aerodynamics using real flight data of the Hansa-3 aircraft gathered at low and high angles of attack. The Neural-Gauss-Newton (NGN) method has been applied to model the linear and nonlinear longitudinal dynamics and to estimate parameters from flight data. Unsteady aerodynamics due to flow separation at high angles of attack near stall has been included in the aerodynamic model using Kirchhoff's quasi-steady stall model. The NGN method is an algorithm that combines a Feed-Forward Neural Network (FFNN) with Gauss-Newton optimization to estimate the parameters; it requires neither an a priori postulated mathematical model nor the solving of equations of motion. The NGN method was validated on real flight data generated at moderate angles of attack before being applied to the data at high angles of attack. The estimates obtained from compatible flight data using the NGN method were validated by comparison with wind tunnel values and maximum likelihood estimates. Validation was also carried out by comparing the measured motion variables with the response generated from the estimates for a different control input. Next, the NGN method was applied to real flight data generated by executing a well-designed quasi-steady stall maneuver. The results obtained, in terms of stall characteristics and aerodynamic parameters, were encouraging and reasonably accurate, establishing NGN as a method for modeling nonlinear aerodynamics from real flight data at high angles of attack.
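The Gauss-Newton half of the NGN idea can be sketched on a toy nonlinear model. The model y = a·exp(-b·x), the synthetic data, and the starting guess are all assumptions for illustration; they are not the Hansa-3 aerodynamic model or its flight data.

```python
import math

def gauss_newton(xs, ys, a, b, iters=20):
    """Estimate parameters (a, b) of the toy model y = a * exp(-b * x)
    by Gauss-Newton iteration with an analytic Jacobian."""
    for _ in range(iters):
        r = [yv - a * math.exp(-b * x) for x, yv in zip(xs, ys)]  # residuals
        Ja = [math.exp(-b * x) for x in xs]            # d(model)/da
        Jb = [-a * x * math.exp(-b * x) for x in xs]   # d(model)/db
        # Normal equations (J^T J) delta = J^T r, solved in closed form (2x2)
        aa = sum(v * v for v in Ja)
        ab = sum(u * v for u, v in zip(Ja, Jb))
        bb = sum(v * v for v in Jb)
        ra = sum(u * v for u, v in zip(Ja, r))
        rb = sum(u * v for u, v in zip(Jb, r))
        det = aa * bb - ab * ab
        a += (bb * ra - ab * rb) / det
        b += (aa * rb - ab * ra) / det
    return a, b

# Noise-free synthetic data generated from a = 2.0, b = 0.5
xs = [0.5, 1.0, 2.0, 3.0, 4.0]
ys = [2.0 * math.exp(-0.5 * x) for x in xs]

a_hat, b_hat = gauss_newton(xs, ys, a=1.8, b=0.6)
print(round(a_hat, 4), round(b_hat, 4))
```

In NGN, the model evaluated here would instead be an FFNN's output, with the same normal-equation update applied to its parameters.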

Keywords: parameter estimation, NGN method, linear and nonlinear, aerodynamic modeling

Procedia PDF Downloads 446
24068 Big Data’s Mechanistic View of Human Behavior May Displace Traditional Library Missions That Empower Users

Authors: Gabriel Gomez

Abstract:

The very concept of information seeking behavior, and the means by which librarians teach users to gain information, that is information literacy, are at the heart of how libraries deliver information, but big data will forever change human interaction with information and the way such behavior is both studied and taught. Just as importantly, big data will orient the study of behavior towards commercial ends because of a tendency towards instrumentalist views of human behavior, something one might also call a trend towards behaviorism. This oral presentation seeks to explore how the impact of big data on understandings of human behavior might impact a library information science (LIS) view of human behavior and information literacy, and what this might mean for social justice aims and concomitant community action normally at the center of librarianship. The methodology employed here is a non-empirical examination of current understandings of LIS in regards to social justice alongside an examination of the benefits and dangers foreseen with the growth of big data analysis. The rise of big data within the ever-changing information environment encapsulates a shift to a more mechanistic view of human behavior, one that can easily encompass information seeking behavior and information use. As commercial aims displace the important political and ethical aims that are often central to the missions espoused by libraries and the social sciences, the very altruism and power relations found in LIS are at risk. In this oral presentation, an examination of the social justice impulses of librarians regarding power and information demonstrates how such impulses can be challenged by big data, particularly as librarians understand user behavior and promote information literacy. 
The creeping behaviorist impulse inherent in big data's emphasis on specific solutions, that is, answers to questions that ask how, as opposed to larger questions that hint at an understanding of why people learn or use information, threatens library and information science ideals. Together with the commercial nature of most big data, this existential threat can harm the social justice nature of librarianship.

Keywords: big data, library information science, behaviorism, librarianship

Procedia PDF Downloads 383
24067 Data Collection with Bounded-Sized Messages in Wireless Sensor Networks

Authors: Min Kyung An

Abstract:

In this paper, we study the data collection problem in Wireless Sensor Networks (WSNs) under two interference models: the graph model and the more realistic physical interference model, known as the Signal-to-Interference-plus-Noise Ratio (SINR) model. The main issue is to compute schedules with the minimum number of timeslots, that is, minimum latency schedules, such that data from every node can be collected at a sink node without any collision or interference. While existing works studied the problem with unit-sized and unbounded-sized message models, we investigate it with the bounded-sized message model and introduce a constant factor approximation algorithm. To the best of our knowledge, this is the first result for the data collection problem with the bounded-sized message model under both interference models.
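
The graph-model side of the scheduling problem can be illustrated with a small sketch. The code below is a hypothetical greedy heuristic, not the paper's constant-factor algorithm: it assigns each link of a routing tree the earliest timeslot free of graph-model conflicts; a full collection schedule would additionally repeat slots for each relayed bounded-size message.

```python
# Hypothetical sketch (not the paper's algorithm): greedy timeslot
# assignment for tree links under the graph interference model, where
# two links conflict if they share an endpoint (half-duplex radios)
# or the receiver of one is a neighbor of the sender of the other.

def conflicts(l1, l2, neighbors):
    (s1, r1), (s2, r2) = l1, l2
    if {s1, r1} & {s2, r2}:          # shared endpoint
        return True
    return r1 in neighbors[s2] or r2 in neighbors[s1]  # cross interference

def greedy_collection_schedule(tree_parent, neighbors, sink):
    """Assign each link (node -> parent) the earliest timeslot in which
    it conflicts with no already-scheduled link."""
    links = [(v, p) for v, p in tree_parent.items() if v != sink]
    slots = []  # slots[t] = links transmitting in timeslot t
    for link in links:
        t = 0
        while t < len(slots) and any(conflicts(link, other, neighbors)
                                     for other in slots[t]):
            t += 1
        if t == len(slots):
            slots.append([])
        slots[t].append(link)
    return slots
```

For a four-node star-plus-leaf topology, the two links sharing the sink land in different slots while independent links share one.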

Keywords: data collection, collision-free, interference-free, physical interference model, SINR, approximation, bounded-sized message model, wireless sensor networks

Procedia PDF Downloads 222
24066 Neuropsychological Deficits in Drug-Resistant Epilepsy

Authors: Timea Harmath-Tánczos

Abstract:

Drug-resistant epilepsy (DRE) is defined as the persistence of seizures despite at least two syndrome-adapted antiseizure drugs (ASDs) used at efficacious daily doses. About a third of patients with epilepsy suffer from drug resistance. Cognitive assessment has a crucial role in the diagnosis and clinical management of epilepsy. Previous studies have addressed the clinical targets and indications for measuring neuropsychological functions; to the best of our knowledge, however, no studies have examined them in a Hungarian drug-resistant population. To fill this gap, we investigated the Hungarian diagnostic protocol in patients between 18 and 65 years of age. This study aimed to describe and analyze neuropsychological functions in patients with drug-resistant epilepsy and to identify factors associated with neuropsychological deficits. We performed a prospective case-control study comparing neuropsychological performance in 50 adult patients and 50 healthy individuals between March 2023 and July 2023. Neuropsychological functions were examined in both patients and controls using a full set of specific tests (general performance level, motor functions, attention, executive functions, verbal and visual memory, language, and visuospatial functions). Potential risk factors for neuropsychological deficit were assessed in the patient group using multivariate analysis. The two groups did not differ in age, sex, dominant hand, or level of education. Compared with the control group, patients with drug-resistant epilepsy showed worse performance on motor functions, visuospatial memory, sustained attention, inhibition, and verbal memory. Neuropsychological deficits can therefore be systematically detected in patients with drug-resistant epilepsy in order to provide neuropsychological therapy and improve quality of life. Analysis of the classical and complex indices of the neuropsychological tasks presented here can help in the investigation of normal and disrupted memory and executive functions in DRE.

Keywords: drug-resistant epilepsy, Hungarian diagnostic protocol, memory, executive functions, cognitive neuropsychology

Procedia PDF Downloads 76
24065 The Twin Terminal of Pedestrian Trajectory Based on City Intelligent Model (CIM) 4.0

Authors: Chen Xi, Lao Xuerui, Li Junjie, Jiang Yike, Wang Hanwei, Zeng Zihao

Abstract:

To further promote the development of smart cities, the microscopic "nerve endings" of the City Intelligent Model (CIM) are extended to become more sensitive. In this paper, we develop a pedestrian trajectory twin terminal based on CIM and convolutional neural network (CNN) technology. The terminal combines 5G networks, architectural and geoinformatics technologies, and deep-learning-based human behaviour recognition models to provide empirical data such as pedestrian flow and human behavioural characteristics. These data ultimately form spatial performance evaluation criteria and a spatial performance warning system, making prediction and decision making accurate and intelligent.
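
As a minimal illustration of the convolutional building block behind such behaviour-recognition models, the sketch below applies a toy 2x2 edge-detection kernel to a synthetic 4x4 frame; the values are invented, and the paper's actual network architecture is not described in the abstract.

```python
# Toy 2D convolution (valid padding, stride 1), the core operation of
# CNN-based recognition models. Input frame and kernel are illustrative.

def conv2d(image, kernel):
    kh, kw = len(kernel), len(kernel[0])
    out_h = len(image) - kh + 1
    out_w = len(image[0]) - kw + 1
    return [[sum(image[i + di][j + dj] * kernel[di][dj]
                 for di in range(kh) for dj in range(kw))
             for j in range(out_w)]
            for i in range(out_h)]

# A vertical-edge kernel responds only at the 0 -> 1 boundary
# in this synthetic 4x4 frame.
frame = [[0, 0, 1, 1]] * 4
edge_kernel = [[-1, 1], [-1, 1]]
feature_map = conv2d(frame, edge_kernel)
```

A real behaviour-recognition network stacks many such filters with nonlinearities and pooling, but each layer reduces to this sliding dot product.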

Keywords: urban planning, urban governance, CIM, artificial intelligence, convolutional neural network

Procedia PDF Downloads 150
24064 Correlation of Serum Apelin Level with Coronary Calcium Score in Patients with Suspected Coronary Artery Disease

Authors: M. Zeitoun, K. Abdallah, M. Rashwan

Abstract:

Introduction: A growing body of evidence indicates that apelin, a relatively recent member of the adipokine family, has a potential anti-atherogenic effect. An association between a low serum apelin state and coronary artery disease (CAD) was previously reported; however, the relationship between apelin and the atherosclerotic burden was unclear. Objectives: Our aim was to explore the correlation of serum apelin level with coronary calcium score (CCS) as a quantitative marker of coronary atherosclerosis. Methods: This observational cross-sectional study enrolled 100 consecutive subjects referred for cardiac multi-detector computed tomography (MDCT) for assessment of CAD (mean age 54 ± 9.7 years, 51 males and 49 females). Clinical parameters, glycemic and lipid profiles, high-sensitivity CRP (hsCRP), homeostasis model assessment of insulin resistance (HOMA-IR), serum creatinine, and complete blood count were assessed. Serum apelin levels were determined using a commercially available Enzyme Immunoassay (EIA) kit. High-resolution non-contrast CT images were acquired with a 64-row MDCT scanner, and CCS was calculated using the Agatston scoring method. Results: Forty-three percent of the studied subjects had positive coronary artery calcification (CAC). The mean CCS was 79 ± 196.5 Agatston units. Subjects with detectable CAC had significantly higher fasting plasma glucose, HbA1c, and WBC counts than subjects without detectable CAC (p < 0.05). Most importantly, subjects with detectable CAC had significantly lower serum apelin levels than subjects without CAC (1.3 ± 0.4 ng/ml vs. 2.8 ± 0.6 ng/ml, p < 0.001). In addition, there was a statistically significant inverse correlation between serum apelin levels and CCS (r = 0.591, p < 0.001); on multivariate analysis, this correlation was found to be independent of traditional cardiovascular risk factors and hsCRP.
Conclusion: To the best of our knowledge, this is the first report of an independent association between apelin and CCS in patients with suspected CAD. Apelin emerges as a possible novel biomarker for CAD, but this result remains to be confirmed prospectively.

Keywords: HbA1c, apelin, adipokines, coronary calcium score (CCS), coronary artery disease (CAD)

Procedia PDF Downloads 342
24063 Understanding Cyber Terrorism from Motivational Perspectives: A Qualitative Data Analysis

Authors: Yunos Zahri, Ariffin Aswami

Abstract:

Cyber terrorism represents the convergence of two worlds: virtual and physical. The virtual world is a place in which computer programs function and data move, whereas the physical world is where people live and function. The merging of these two domains is the interface targeted in incidents of cyber terrorism. To better understand why acts of cyber terrorism are committed, this study presents the context of cyber terrorism from motivational perspectives. The motivational forces behind cyber terrorism can be social, political, ideological, or economic. In this research, data are analyzed using a qualitative method; semi-structured interviews with purposive sampling were used for data collection. With the growing interconnectedness between critical infrastructures and Information and Communication Technology (ICT), the ability to select targets that facilitate maximum disruption can significantly influence terrorists. This work provides a baseline for defining the concept of cyber terrorism from motivational perspectives.

Keywords: cyber terrorism, terrorism, motivation, qualitative analysis

Procedia PDF Downloads 422
24062 Research Analysis of Urban Area Expansion Based on Remote Sensing

Authors: Sheheryar Khan, Weidong Li, Fanqian Meng

Abstract:

The Urban Heat Island (UHI) effect is one of the foremost ecological and socioeconomic problems of urbanization. In this phenomenon, human-made urban areas replace the rural landscape with surfaces of higher thermal conductivity and urban warmth; as a result, the temperature in the city is higher than in the surrounding rural areas. To assess the evidence of this phenomenon in the Zhengzhou city area, temperature variations in the urban area were observed using the following method. Landsat 8 satellite images from 2013 to 2015 were used to calculate the Urban Heat Island (UHI) effect, along with NPP-VIIRS night-time remote sensing data, to better understand the center of the built-up area. To further support the evidence, the correlation between land surface temperature and the normalized difference vegetation index (NDVI) was calculated using the red band (Band 4) and near-infrared band (Band 5) of the Landsat 8 data. The mono-window algorithm was applied to retrieve the land surface temperature (LST) distribution from the Landsat 8 data, using Bands 10 and 11 to convert top-of-atmosphere (TOA) radiance to satellite brightness temperature. Along with the Landsat 8 data, the NPP-VIIRS night-light data were preprocessed to extract the research area. The Landsat 8 and NPP night-light results were then compared to locate the center of the built-up area of Zhengzhou city.
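
The two per-pixel computations mentioned above (NDVI from Bands 4 and 5, and brightness temperature from Band 10 TOA radiance) can be sketched as follows. The thermal constants are the published Landsat 8 Band 10 calibration coefficients; all input values in the example are illustrative, not from the study.

```python
import math

# Per-pixel steps of the workflow described above, assuming reflectance
# and radiance values have already been extracted from the Landsat 8
# bands (band numbers per the OLI/TIRS convention).

def ndvi(red, nir):
    """NDVI from Band 4 (red) and Band 5 (near-infrared) reflectance."""
    if nir + red == 0:
        return 0.0
    return (nir - red) / (nir + red)

# Band 10 TIRS thermal constants from the Landsat 8 metadata file.
K1, K2 = 774.8853, 1321.0789

def brightness_temperature(toa_radiance):
    """Convert Band 10 TOA radiance to at-satellite brightness
    temperature (kelvin), the input to the mono-window LST step."""
    return K2 / math.log(K1 / toa_radiance + 1.0)
```

Vegetated pixels (high NIR, low red) yield NDVI near 1, while built-up surfaces fall toward 0, which is what drives the LST-NDVI correlation the study computes.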

Keywords: built-up area, land surface temperature, mono-window algorithm, NDVI, remote sensing, threshold method, Zhengzhou

Procedia PDF Downloads 139
24061 A Comparative Study of the Athlete Health Records' Minimum Data Set in Selected Countries and Presenting a Model for Iran

Authors: Robab Abdolkhani, Farzin Halabchi, Reza Safdari, Goli Arji

Abstract:

Background and purpose: The quality of a health record depends on the quality of its content and proper documentation. A minimum data set provides a standard method for collecting key data elements, making them easy to understand and enabling comparison. The aim of this study was to determine the minimum data set for Iranian athletes' health records. Methods: This applied, descriptive-comparative study was carried out in 2013. Using internal and external documentation forms, a checklist was created that included the data elements of an athlete health record; it was then subjected to debate via the Delphi method by experts in the fields of sports medicine and health information management. Results: Of the 97 elements discussed, 85 were agreed upon by more than 75 percent of the participants (as main elements) and 12 by 50 to 75 percent of the participants (as proposed elements). Across the 97 elements, there was no significant difference between the responses of the sports pathology and sports medicine specialists and those of the medical records, medical informatics, and information management professionals. Conclusion: A minimum data set for Iranian athletes' health records is presented, with four information categories: demographic information, health history, assessment, and treatment plan. The proposed model is applicable to both paper-based and electronic medical records.
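
The Delphi agreement thresholds described above amount to a simple classification rule, sketched below with hypothetical element names and endorsement fractions (the abstract does not list the actual 97 elements).

```python
# Sketch of the Delphi consensus rule described above: elements endorsed
# by more than 75% of panelists become main elements, those endorsed by
# 50-75% become proposed elements, the rest are dropped.
# Element names and fractions are hypothetical.

def classify_elements(agreement):
    """agreement: dict mapping data element -> fraction of panelists
    endorsing it. Returns (main, proposed) element lists."""
    main = [e for e, a in agreement.items() if a > 0.75]
    proposed = [e for e, a in agreement.items() if 0.50 <= a <= 0.75]
    return main, proposed

votes = {"date of birth": 0.96, "injury history": 0.81,
         "training load": 0.62, "shoe size": 0.31}
main, proposed = classify_elements(votes)
```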

Keywords: documentation, health record, minimum data set, sports medicine

Procedia PDF Downloads 480
24060 Data Collection in Protected Agriculture for Subsequent Big Data Analysis: Methodological Evaluation in Venezuela

Authors: Maria Antonieta Erna Castillo Holly

Abstract:

During the last decade, data analysis, strategic decision making, and the use of artificial intelligence (AI) tools in Latin American agriculture have been a challenge. In some countries, the availability, quality, and reliability of historical data, in addition to the current data-recording methodology in the field, make it difficult to use information systems, perform complete data analysis, and support the right strategic decisions. This is essential in Agriculture 4.0, where the year-round increase in global demand for fresh agricultural products of tropical origin requires a change in the production model and greater agility in responding to consumer market demands for quality, quantity, traceability, and sustainability, all of which require extensive data. Having quality information available and updated in real time on what, how much, how, when, where, at what cost, and in compliance with which production quality standards represents the greatest challenge for sustainable and profitable agriculture in the region. The objective of this work is to present a methodological proposal for the collection of georeferenced data from the protected agriculture sector, specifically in production units (UP) with tall structures (greenhouses), initially for Venezuela, taking the state of Mérida as the geographical framework and horticultural products as target crops. The document presents some background information and explains the methodology and tools used in the three phases of the work: diagnosis, data collection, and analysis. As a result, an evaluation of the process is carried out, relevant data and dashboards are displayed, and the first satellite maps integrated with layers of information in a geographic information system are presented. Finally, some improvement proposals and tentatively recommended applications are added to the process, with the aim of providing better qualified and traceable georeferenced data for subsequent analysis and for more agile and accurate strategic decision making. One of the main findings of this study is the lack of quality data treatment in Latin America, and especially in the Caribbean basin; among the most important issues is how to manage the lack of complete official data. The methodology has been tested with horticultural products, but it can be extended to other tropical crops.
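
A minimal sketch of what one georeferenced production-unit record might look like, using the GeoJSON convention common to geographic information systems; every field name, value, and coordinate below is invented for illustration and is not from the study.

```python
import json

# Hypothetical georeferenced record for one protected-agriculture
# production unit (UP); all field names and values are illustrative.
up_record = {
    "type": "Feature",
    "geometry": {"type": "Point",
                 "coordinates": [-71.15, 8.60]},  # lon, lat (GeoJSON order)
    "properties": {
        "crop": "tomato",
        "structure": "greenhouse",
        "area_m2": 1200,
        "recorded_at": "2020-06-15",
    },
}

geojson = json.dumps(up_record)
```

Serializing records in a standard interchange format like this is what lets the field data land directly in a GIS layer or a big data pipeline.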

Keywords: greenhouses, protected agriculture, data analysis, geographic information systems, Venezuela

Procedia PDF Downloads 131
24059 Reliable Consensus Problem for Multi-Agent Systems with Sampled-Data

Authors: S. H. Lee, M. J. Park, O. M. Kwon

Abstract:

In this paper, reliable consensus of multi-agent systems with sampled data is investigated. By using a suitable Lyapunov-Krasovskii functional and techniques such as the Wirtinger inequality, the Schur complement, and the Kronecker product, consensus criteria for these systems are obtained by solving a set of Linear Matrix Inequalities (LMIs). One numerical example is included to show the effectiveness of the proposed criteria.
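
The LMI conditions themselves require a semidefinite-programming solver, but the sampled-data consensus behaviour they certify can be illustrated with a toy simulation; the graph, gain, and step count below are invented for illustration and are not the paper's design.

```python
# Toy sampled-data consensus simulation (illustrative parameters, not
# the paper's LMI-based design): at each sampling instant every agent
# applies the zero-order-hold update x_i <- x_i + h * sum_j (x_j - x_i)
# over its neighbors j, i.e. x <- (I - h*L) x with graph Laplacian L.

def simulate_consensus(adjacency, x0, h=0.1, steps=200):
    x = list(x0)
    n = len(x)
    for _ in range(n and steps):
        upd = [h * sum(x[j] - x[i] for j in adjacency[i]) for i in range(n)]
        x = [x[i] + upd[i] for i in range(n)]
    return x

# Ring of four agents; the undirected update conserves the state
# average, so all agents converge to mean(x0) = 2.5.
adj = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}
final = simulate_consensus(adj, [1.0, 2.0, 3.0, 4.0])
```

Stability of this toy update needs h below 2 over the largest Laplacian eigenvalue (here 4, so h < 0.5); the paper's LMI criteria play the analogous certifying role for the general sampled-data case.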

Keywords: multi-agent, linear matrix inequalities (LMIs), Kronecker product, sampled-data, Lyapunov method

Procedia PDF Downloads 528
24058 Materialized View Effect on Query Performance

Authors: Yusuf Ziya Ayık, Ferhat Kahveci

Abstract:

Currently, database management systems provide various tools, such as backup and maintenance, along with statistical information on resource usage and security. In terms of query performance, this paper covers query optimization, views, indexed tables, pre-computed materialized views, and query performance analysis, in which alternative query plans can be created and the least costly one selected to optimize a query. Indexes and views can be created on related table columns. The literature review of this study showed that, over time, despite the growing capabilities of database management systems, only database administrators are aware of the need to treat archival and transactional data types differently. These may be constantly changing data used in everyday life, or data from completed questionnaires whose input is finished. For both types of data, the database uses the same capabilities; but, as shown in the findings section, instead of repeating similar heavy calculations that produce the same results with the same query over survey results, materialized view results can be used in a much simpler way. In this study, this performance difference was observed quantitatively by considering the cost of the query.
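
As an illustration of the pre-computation idea, the sketch below simulates a materialized view in SQLite (which has plain views but no materialized ones) using a precomputed summary table; all table and column names are invented. Reads of the summary table skip the aggregation that the plain view re-runs on every access, which is the cost difference the study measures.

```python
import sqlite3

# Simulating a materialized view with a precomputed summary table;
# table and column names are illustrative.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE survey_answers (question_id INT, answer INT)")
con.executemany("INSERT INTO survey_answers VALUES (?, ?)",
                [(q, a) for q in range(3) for a in range(1, 6)])

# A plain view re-runs the aggregation on every read ...
con.execute("""CREATE VIEW answer_stats_v AS
               SELECT question_id, COUNT(*) n, AVG(answer) mean
               FROM survey_answers GROUP BY question_id""")

# ... whereas a 'materialized' summary table pays the cost once.
con.execute("""CREATE TABLE answer_stats_m AS
               SELECT question_id, COUNT(*) n, AVG(answer) mean
               FROM survey_answers GROUP BY question_id""")

live = con.execute(
    "SELECT * FROM answer_stats_v ORDER BY question_id").fetchall()
cached = con.execute(
    "SELECT * FROM answer_stats_m ORDER BY question_id").fetchall()
```

The trade-off, as the abstract notes, is that the summary table goes stale if the base data change, which is why the technique suits completed-questionnaire (archival) data better than transactional data.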

Keywords: cost of query, database management systems, materialized view, query performance

Procedia PDF Downloads 280