Search results for: manual data inquiry
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 25273

24283 Electroencephalography-Based Intention Recognition and Consensus Assessment during Emergency Response

Authors: Siyao Zhu, Yifang Xu

Abstract:

After natural and man-made disasters, robots can bypass the danger, expedite the search, and acquire unprecedented situational awareness to design rescue plans. The hands-free requirement from the first responders excludes the use of tedious manual control and operation. In unknown, unstructured, and obstructed environments, natural-language-based supervision is not amenable for first responders to formulate, and is difficult for robots to understand. Brain-computer interface is a promising option to overcome the limitations. This study aims to test the feasibility of using electroencephalography (EEG) signals to decode human intentions and detect the level of consensus on robot-provided information. EEG signals were classified using machine-learning and deep-learning methods to discriminate search intentions and agreement perceptions. The results show that the average classification accuracy for intention recognition and consensus assessment is 67% and 72%, respectively, proving the potential of incorporating recognizable users’ bioelectrical responses into advanced robot-assisted systems for emergency response.
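As an illustration of the classification step, the sketch below (hypothetical EEG feature vectors and intention labels, not the authors' data or pipeline) shows how epoch-level features could be fed to a standard machine-learning classifier and scored by cross-validation.

```python
# Minimal sketch (not the authors' pipeline): classifying EEG feature vectors
# for intention recognition with a standard machine-learning classifier.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Hypothetical data: 200 trials x 64 band-power features extracted from EEG epochs.
X = rng.normal(size=(200, 64))
# Hypothetical labels: 0 = "search left area", 1 = "search right area" (intention classes).
y = rng.integers(0, 2, size=200)

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
scores = cross_val_score(clf, X, y, cv=5)
print(f"mean cross-validated accuracy: {scores.mean():.2f}")
```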

Keywords: consensus assessment, electroencephalogram, emergency response, human-robot collaboration, intention recognition, search and rescue

Procedia PDF Downloads 86
24282 Classification of Manufacturing Data for Efficient Processing on an Edge-Cloud Network

Authors: Onyedikachi Ulelu, Andrew P. Longstaff, Simon Fletcher, Simon Parkinson

Abstract:

The widespread interest in 'Industry 4.0' or 'digital manufacturing' has led to significant research requiring the acquisition of data from sensors, instruments, and machine signals. In-depth research then identifies methods of analysis of the massive amounts of data generated before and during manufacture to solve a particular problem. The ultimate goal is for industrial Internet of Things (IIoT) data to be processed automatically to assist with either visualisation or autonomous system decision-making. However, the collection and processing of data in an industrial environment come with a cost. Little research has been undertaken on how to specify optimally what data to capture, transmit, process, and store at various levels of an edge-cloud network. The first step in this specification is to categorise IIoT data for efficient and effective use. This paper proposes the required attributes and classification to take manufacturing digital data from various sources to determine the most suitable location for data processing on the edge-cloud network. The proposed classification framework will minimise overhead in terms of network bandwidth/cost and processing time of machine tool data via efficient decision making on which dataset should be processed at the ‘edge’ and what to send to a remote server (cloud). A fast-and-frugal heuristic method is implemented for this decision-making. The framework is tested using case studies from industrial machine tools for machine productivity and maintenance.
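The fast-and-frugal decision step could look like the following sketch, in which the cues (latency criticality, data volume, need for historical context) and thresholds are illustrative assumptions rather than the paper's classification framework.

```python
# Minimal sketch (attributes and thresholds are hypothetical, not the paper's
# framework): a fast-and-frugal tree that checks one cue at a time and exits
# as soon as a cue is decisive.
from dataclasses import dataclass

@dataclass
class Dataset:
    latency_critical: bool   # must the result be available within milliseconds?
    size_mb: float           # volume to transmit if sent to the cloud
    needs_history: bool      # requires long-term archived data to process?

def place(ds: Dataset) -> str:
    """Return 'edge' or 'cloud' using an ordered sequence of simple cues."""
    if ds.latency_critical:          # cue 1: latency dominates everything else
        return "edge"
    if ds.needs_history:             # cue 2: historical context lives in the cloud
        return "cloud"
    if ds.size_mb > 50.0:            # cue 3: large payloads are costly to transmit
        return "edge"
    return "cloud"                   # default: cheap to send, process centrally

print(place(Dataset(latency_critical=False, size_mb=120.0, needs_history=False)))  # edge
print(place(Dataset(latency_critical=False, size_mb=2.0, needs_history=False)))    # cloud
```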

Keywords: data classification, decision making, edge computing, industrial IoT, industry 4.0

Procedia PDF Downloads 173
24281 Denoising Transient Electromagnetic Data

Authors: Lingerew Nebere Kassie, Ping-Yu Chang, Hsin-Hua Huang, Chaw-Son Chen

Abstract:

Transient electromagnetic (TEM) data plays a crucial role in hydrogeological and environmental applications, providing valuable insights into geological structures and resistivity variations. However, the presence of noise often hinders the interpretation and reliability of these data. Our study addresses this issue by utilizing a FASTSNAP system for the TEM survey, which operates at different modes (low, medium, and high) with continuous adjustments to discretization, gain, and current. We employ a denoising approach that processes the raw data obtained from each acquisition mode to improve signal quality and enhance data reliability. We use a signal-averaging technique for each mode, increasing the signal-to-noise ratio. Additionally, we utilize wavelet transform to suppress noise further while preserving the integrity of the underlying signals. This approach significantly improves the data quality, notably suppressing severe noise at late times. The resulting denoised data exhibits a substantially improved signal-to-noise ratio, leading to increased accuracy in parameter estimation. By effectively denoising TEM data, our study contributes to a more reliable interpretation and analysis of underground structures. Moreover, the proposed denoising approach can be seamlessly integrated into existing ground-based TEM data processing workflows, facilitating the extraction of meaningful information from noisy measurements and enhancing the overall quality and reliability of the acquired data.
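A minimal sketch of the two generic steps named above, stacking repeated transients and wavelet soft-thresholding, is given below; the decay curve, noise level, and wavelet settings are illustrative, not the FASTSNAP processing chain.

```python
# Minimal sketch (synthetic decay curve; not the FASTSNAP processing chain):
# stack repeated transients to raise SNR, then soft-threshold wavelet
# coefficients to suppress remaining noise.
import numpy as np
import pywt  # PyWavelets

rng = np.random.default_rng(1)
t = np.linspace(1e-5, 1e-2, 1024)
clean = 1e-3 * t ** -1.5                                # idealised TEM decay
stack = clean + rng.normal(0, 2e3, size=(64, t.size))   # 64 noisy repetitions

averaged = stack.mean(axis=0)                           # step 1: signal averaging

coeffs = pywt.wavedec(averaged, "db4", level=5)
sigma = np.median(np.abs(coeffs[-1])) / 0.6745          # noise estimate
thr = sigma * np.sqrt(2 * np.log(averaged.size))        # universal threshold
coeffs = [coeffs[0]] + [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
denoised = pywt.waverec(coeffs, "db4")[: averaged.size] # step 2: wavelet denoising

print("residual std after denoising:", np.std(denoised - clean))
```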

Keywords: data quality, signal averaging, transient electromagnetic, wavelet transform

Procedia PDF Downloads 80
24280 Attribute Analysis of Quick Response Code Payment Users Using Discriminant Non-negative Matrix Factorization

Authors: Hironori Karachi, Haruka Yamashita

Abstract:

Recently, quick response (QR) code payment systems have become popular. Many companies have introduced new QR code payment services, and these services compete with each other to increase their number of users. To increase the number of users, we should grasp the differences in the demographic information, usage information, and value of users between services. In this study, we analyze real-world data provided by Nomura Research Institute, including the demographic data of users and information on users' usage of two services: LINE Pay and PayPay. Non-negative Matrix Factorization (NMF) is widely used to analyze such data and interpret its features; however, the target data suffers from missing values. We use EM-algorithm NMF (EMNMF) to complete the unknown values and understand the features of the data presented in matrix form. Moreover, to compare the results of the NMF analysis of two matrices, Discriminant NMF (DNMF) shows the difference in user features between the two matrices. In this study, we combine EMNMF and DNMF to analyze the target data. As the interpretation, we show the differences in the features of users between LINE Pay and PayPay.
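A minimal sketch of the EM idea behind EMNMF is shown below: missing entries are repeatedly filled with the current NMF reconstruction and the factorization is refitted. It uses scikit-learn's NMF on synthetic data and is not the authors' EMNMF/DNMF implementation.

```python
# Minimal sketch (illustrative EM-style imputation, not the authors' EMNMF/DNMF
# code): alternately fill missing entries with the current NMF reconstruction
# and refit the factorization on the completed matrix.
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(2)
V = rng.random((100, 20))                 # hypothetical user-by-attribute matrix
mask = rng.random(V.shape) < 0.2          # 20% of entries are missing
V_obs = V.copy()
V_obs[mask] = np.nan

filled = np.where(mask, np.nanmean(V_obs), V_obs)     # initial fill: global mean
for _ in range(20):                                   # EM-style iterations
    model = NMF(n_components=5, init="nndsvda", max_iter=500)
    W = model.fit_transform(filled)                   # "M-step": refit factors
    H = model.components_
    recon = W @ H
    filled[mask] = recon[mask]                        # "E-step": impute missing cells

print("RMSE on held-out entries:", np.sqrt(np.mean((recon[mask] - V[mask]) ** 2)))
```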

Keywords: data science, non-negative matrix factorization, missing data, quality of services

Procedia PDF Downloads 124
24279 Place Attachment and Residential Satisfaction in Old Residential Buildings: A Case of Pune City

Authors: Vaishali Anagal, Vasudha Gokhale, Sharvey Dhongde

Abstract:

Old buildings have manifold significance, including historic, architectural, and cultural aspects. In a cultural city like Pune, India, numerous residential buildings in the core city are between 60 and 100 years old. These represent the city's history and culture. Most of them are still in use as residential buildings, with adaptations in varying degrees. Some of these buildings are enlisted as 'Heritage Buildings' by the local municipal authority. However, there are a number of buildings that have heritage value although they are not enlisted as heritage sites. A lot of these buildings have already been pulled down for several reasons, such as the end of their technical life, inadequacy for users, increasing floor area ratios, inflating land prices, and changing lifestyles. The literature suggests that place attachment and residential satisfaction are positively related. It also indicates that length of residency is positively correlated with place attachment. Residential satisfaction is associated with a number of factors, including the socio-demographic characteristics of users, housing characteristics, neighborhood characteristics, and behavioral characteristics. This research paper poses an inquiry into the dynamics of the correlation between place attachment and residential satisfaction in the case of old residential buildings. The motive of this enquiry is to examine whether place attachment can serve as a strong ground for the restoration of these old buildings and prevent the loss of emblems of the city's cultural heritage. The methodology includes a questionnaire survey of users as well as a qualitative assessment of place attachment and residential satisfaction. About 20 residential buildings in the core city of Pune are selected for this purpose. The results of the survey are analyzed and conclusions are drawn.

Keywords: place attachment, residential satisfaction, old residential buildings, housing characteristics, cultural heritage

Procedia PDF Downloads 235
24278 Developing Guidelines for Public Health Nurse Data Management and Use in Public Health Emergencies

Authors: Margaret S. Wright

Abstract:

Background/Significance: During many recent public health emergencies/disasters, public health nursing data has been missing or delayed, potentially impacting decision-making and response. Data used as evidence for decision-making in response, planning, and mitigation has been erratic and slow, decreasing the ability to respond. Methodology: Applying best practices in data management and data use in public health settings, and guided by the concepts outlined in 'Disaster Standards of Care' models, leads to the development of recommendations for a model of best practices in data management and use in public health disasters/emergencies by public health nurses. As the 'patient' in public health disasters/emergencies is the community (local, regional, or national), guidelines for patient documentation are incorporated into the recommendations. Findings: Using this model, public health nurses could better plan how to prepare for, respond to, and mitigate disasters in their communities, and better participate in decision-making in all three phases, bringing public health nursing data to the discussion as part of the evidence base for decision-making.

Keywords: data management, decision making, disaster planning documentation, public health nursing

Procedia PDF Downloads 212
24277 An Embarrassingly Simple Semi-supervised Approach to Increase Recall in Online Shopping Domain to Match Structured Data with Unstructured Data

Authors: Sachin Nagargoje

Abstract:

Complete labeled data is often difficult to obtain in practical scenarios. Even if one manages to obtain the data, its quality is always in question. In the shopping vertical, offers are the input data, which are provided by advertisers with or without good-quality information. In this paper, the author investigates the possibility of using a very simple semi-supervised learning approach to increase the recall of unhealthy offers (offers with badly written titles or partial product details) in the shopping vertical domain. The author found that the semi-supervised learning method improved recall in the Smart Phone category by 30% in A/B testing on 10% of traffic and increased the year-over-year (YoY) number of impressions per month by 33% in production. This also led to a significant increase in revenue, which cannot be publicly disclosed.
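A minimal sketch of a self-training loop of the kind alluded to is given below; the offer texts, labels, and confidence threshold are hypothetical and do not reflect the production system.

```python
# Minimal sketch (hypothetical features/labels, not the production system):
# self-training on offer text to boost recall of "unhealthy" offers when only
# a small fraction of offers is labelled.
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.semi_supervised import SelfTrainingClassifier

offers = [
    "samsung galaxy s21 128gb black brand new sealed",
    "phone good cheap buy now",                        # badly written title
    "apple iphone 13 pro max 256gb graphite",
    "best deal!!!",                                    # partial product details
] * 50
labels = np.full(len(offers), -1)        # -1 marks unlabelled offers
labels[:8] = [0, 1, 0, 1, 0, 1, 0, 1]    # a small labelled seed (1 = unhealthy)

X = TfidfVectorizer(ngram_range=(1, 2)).fit_transform(offers)
model = SelfTrainingClassifier(LogisticRegression(max_iter=1000), threshold=0.9)
model.fit(X, labels)                     # pseudo-labels confident unlabelled offers

print("predicted unhealthy share:", model.predict(X).mean())
```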

Keywords: semi-supervised learning, clustering, recall, coverage

Procedia PDF Downloads 116
24276 Genodata: The Human Genome Variation Using BigData

Authors: Surabhi Maiti, Prajakta Tamhankar, Prachi Uttam Mehta

Abstract:

Since the accomplishment of the Human Genome Project, there has been an unparalleled escalation in the sequencing of genomic data. This project has been the first major leap in the field of medical research, especially in genomics. The project won accolades by using a concept called big data, which was earlier used extensively to gain value for business. Big data makes use of data sets that are generally files of terabytes, petabytes, or exabytes in size; such data sets were traditionally managed using spreadsheets and RDBMS. The voluminous data made the process tedious and time-consuming, and hence a stronger framework called Hadoop was introduced in the field of genetic sciences to make data processing faster and more efficient. This paper focuses on using SPARK, which is gaining momentum with the advancement of big data technologies. Cloud storage is an effective medium for storing the large data sets generated from genetic research and the resultant sets produced by SPARK analysis.
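A minimal sketch of loading variant records into Spark and aggregating them is shown below; the file name and column layout are hypothetical, not the authors' pipeline.

```python
# Minimal sketch (hypothetical file layout "variants.csv" with columns
# sample_id, chrom, pos, ref, alt; not the authors' pipeline): load genomic
# variant records into Spark and compute per-chromosome variant counts.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (SparkSession.builder
         .appName("genodata-sketch")
         .getOrCreate())

variants = (spark.read
            .option("header", True)
            .csv("variants.csv"))          # could equally be a cloud-storage URI

per_chrom = (variants
             .groupBy("chrom")
             .agg(F.count("*").alias("n_variants"),
                  F.countDistinct("sample_id").alias("n_samples"))
             .orderBy(F.desc("n_variants")))

per_chrom.show()
spark.stop()
```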

Keywords: human genome project, Bigdata, genomic data, SPARK, cloud storage, Hadoop

Procedia PDF Downloads 250
24275 Superordinated Control for Increasing Feed-in Capacity and Improving Power Quality in Low Voltage Distribution Grids

Authors: Markus Meyer, Bastian Maucher, Rolf Witzmann

Abstract:

The ever-increasing amount of distributed generation in low voltage distribution grids (mainly PV and micro-CHP) can lead to reverse load flows from low to medium/high voltage levels at times of high feed-in. Reverse load flow leads to rising voltages that may even exceed the limits specified in the grid codes. Furthermore, the share of electrical loads connected to low voltage distribution grids via switched power supplies continuously increases. In combination with inverter-based feed-in, this results in high harmonic levels reducing overall power quality. Especially high levels of third-order harmonic currents can lead to neutral conductor overload, which is even more critical if lines with reduced neutral conductor section areas are used. This paper illustrates a possible concept for smart grids in order to increase the feed-in capacity, improve power quality, and ensure safe operation of low voltage distribution grids at all times. The key feature of the concept is a hierarchically structured control strategy that is run on a superordinated controller, which is connected to several distributed grid analyzers and inverters via broadband power line (BPL). The strategy is devised to ensure both quick response time as well as the technically and economically reasonable use of the available inverters in the grid (PV inverters, batteries, stepless line voltage regulators). These inverters are provided with standard features for voltage control, e.g., voltage-dependent reactive power control. In addition, they can receive reactive power set points transmitted by the superordinated controller. To further improve power quality, the inverters are capable of active harmonic filtering as well as voltage balancing, whereas the latter is primarily done by the stepless line voltage regulators. By additionally connecting the superordinated controller to the control center of the grid operator, supervisory control and data acquisition capabilities for the low voltage distribution grid are enabled, which allows easy monitoring and manual input. Such a low voltage distribution grid can also be used as a virtual power plant.
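A minimal sketch of a voltage-dependent reactive power characteristic of the kind such inverters provide, with an optional set-point override from a superordinated controller, is given below; the breakpoints and limits are illustrative, not the paper's parameters.

```python
# Minimal sketch (illustrative Q(V) droop curve, not the paper's controller):
# a voltage-dependent reactive power characteristic with an optional set point
# override from a superordinated controller.
from typing import Optional

def q_setpoint(v_pu: float, q_max: float, override: Optional[float] = None) -> float:
    """Reactive power set point (positive = injection) for a given voltage in p.u."""
    if override is not None:          # superordinated controller takes precedence
        return max(-q_max, min(q_max, override))
    if v_pu <= 0.94:                  # undervoltage: inject reactive power
        return q_max
    if v_pu < 0.98:                   # linear droop towards the dead band
        return q_max * (0.98 - v_pu) / 0.04
    if v_pu <= 1.02:                  # dead band around nominal voltage
        return 0.0
    if v_pu < 1.06:                   # linear droop: absorb reactive power
        return -q_max * (v_pu - 1.02) / 0.04
    return -q_max                     # overvoltage: full absorption

for v in (0.93, 0.96, 1.00, 1.04, 1.07):
    print(v, round(q_setpoint(v, q_max=10.0), 2), "kvar")
```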

Keywords: distributed generation, distribution grid, power quality, smart grid, virtual power plant, voltage control

Procedia PDF Downloads 262
24274 Simulation Model for Evaluating the Impact of Adaptive E-Learning in the Agricultural Sector

Authors: Maria Nabakooza

Abstract:

Efficient agricultural production is very significant in attaining food sufficiency and security in the world. Farmers employ many methods while attending to their gardens, from manual to mechanized, and range from subsistence to commercial depending on their motive. This creates a lacuna in the modes of operation in this field, as different farmers take different approaches, and has led to many e-learning courses being introduced to address the gap. Many e-learning systems use advanced network technologies, like web services and grid computing, to promote learning at any time and any place. Many of the existing systems have not considered the applicability of their modules or the tools to be used, nor further assessed whether these are the right tools for the right job. A thorough investigation into the applicability of adaptive e-learning in the agricultural sector has not been undertaken, which has allowed the assumption that e-learning is the right tool for boosting productivity in this sector to go unexamined. This study provides insight and a thorough analysis as to whether adaptive e-learning is the right tool for boosting agricultural productivity. The simulation will adopt a system dynamics modeling approach as a way of examining cause-and-effect relationships. This study will provide teachers with insight into which tools they should adopt in course design, and provide students the opportunity to achieve an orderly learning experience through adaptive e-learning navigation services.

Keywords: agriculture, adaptive, e-learning, technology

Procedia PDF Downloads 246
24273 Ontology for a Voice Transcription of OpenStreetMap Data: The Case of Space Apprehension by Visually Impaired Persons

Authors: Said Boularouk, Didier Josselin, Eitan Altman

Abstract:

In this paper, we present a vocal ontology of OpenStreetMap data for the apprehension of space by visually impaired people. Indeed, the platform, based on produsage, gives data producers the freedom to choose the descriptors of geocoded locations. Unfortunately, this freedom, also called folksonomy, complicates subsequent data searches. We try to solve this issue with a simple but usable method to extract data from OSM databases in order to deliver them to visually impaired people using text-to-speech technology. We focus on how to help people suffering from visual disability to plan their itinerary and to comprehend a map by querying a computer and getting information about the surrounding environment in a mono-modal human-computer dialogue.

Keywords: TTS, ontology, open street map, visually impaired

Procedia PDF Downloads 288
24272 Design and Development of a Platform for Analyzing Spatio-Temporal Data from Wireless Sensor Networks

Authors: Walid Fantazi

Abstract:

The development of sensor technology (such as microelectromechanical systems (MEMS), wireless communications, embedded systems, distributed processing, and wireless sensor applications) has contributed to a broad range of WSN applications that are capable of collecting a large amount of spatiotemporal data in real time. These systems require real-time processing to manage storage and query the data they collect. In order to cover these needs, we propose in this paper a snapshot spatiotemporal data model based on object-oriented concepts. This model allows efficient storage, reduces data redundancy, makes it easier to execute spatiotemporal queries, and saves analysis time. Further, to ensure the robustness of the system as well as the elimination of congestion in main memory, we propose an in-RAM spatiotemporal indexing technique called Captree*. As a result, we offer an RIA (Rich Internet Application)-based SOA application architecture that allows remote monitoring and control.

Keywords: WSN, indexing data, SOA, RIA, geographic information system

Procedia PDF Downloads 247
24271 Prediction of Marine Ecosystem Changes Based on the Integrated Analysis of Multivariate Data Sets

Authors: Prozorkevitch D., Mishurov A., Sokolov K., Karsakov L., Pestrikova L.

Abstract:

The current body of knowledge about the marine environment and the dynamics of marine ecosystems includes a huge amount of heterogeneous data collected over decades. It generally includes a wide range of hydrological, biological, and fishery data. Marine researchers collect these data and analyze how and why the ecosystem changes from past to present. Based on these historical records and the linkages between the processes, it is possible to predict future changes. Multivariate analysis of trends and their interconnections in the marine ecosystem may be used as an instrument for predicting further ecosystem evolution. A wide range of information about the components of the marine ecosystem, spanning more than 50 years, needs to be used to investigate how these data arrays can help to predict the future.

Keywords: barents sea ecosystem, abiotic, biotic, data sets, trends, prediction

Procedia PDF Downloads 110
24270 Optical Fiber Data Throughput in a Quantum Communication System

Authors: Arash Kosari, Ali Araghi

Abstract:

A mathematical model for an optical-fiber communication channel is developed, which results in an expression that calculates the throughput and loss of the corresponding link. The data are assumed to be transmitted using separate photons with different polarizations. The derived model also shows the dependency of data throughput on the length of the channel and the depolarization factor. It is observed that the absorption of photons affects the throughput more strongly than depolarization does. Apart from that, the probability of depolarization and the absorption of radiated photons are obtained.
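A minimal sketch of the length dependence described, with exponential absorption loss and a separate polarization-survival probability, is given below; the constants are illustrative, not the paper's derived model.

```python
# Minimal sketch (illustrative constants, not the paper's derived model): photon
# throughput over a fiber of length L when both absorption and depolarization
# discard photons.
import math

def throughput(length_km: float,
               alpha_db_per_km: float = 0.2,     # typical telecom-fiber attenuation
               depol_rate_per_km: float = 0.01,  # hypothetical depolarization rate
               source_rate_hz: float = 1e6) -> float:
    """Expected rate of usable (non-absorbed, polarization-preserving) photons."""
    p_survive = 10 ** (-alpha_db_per_km * length_km / 10)   # absorption loss
    p_polarized = math.exp(-depol_rate_per_km * length_km)  # polarization preserved
    return source_rate_hz * p_survive * p_polarized

for L in (1, 10, 50, 100):
    print(f"{L:>3} km: {throughput(L):.3e} photons/s")
```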

Keywords: absorption, data throughput, depolarization, optical fiber

Procedia PDF Downloads 282
24269 University Short Courses Web Application Using ASP.Net

Authors: Ahmed Hariri

Abstract:

E-learning has become a necessity in advanced education. It eases communication between students and teachers and speeds up the process with less time and effort. With the progress and enormous development of distance education, institutions must keep up with this age by building websites that allow students and teachers to take full advantage of advanced education. In this regard, we developed the University Short Courses web application, specially designed for the Faculty of Computing and Information Technology, Rabigh, Kingdom of Saudi Arabia. After an elaborate review of the current state-of-the-art methods of teaching and learning, we found that instructors deliver extra short courses and workshops to students to enhance their knowledge, and that this process is completely manual. The prevailing methods of teaching and learning consume a lot of time; therefore, in this context, the University Short Courses web application will help to make the process easy and user-friendly. The site allows students to view and register online for short courses conducted by instructors and to see course starting dates, finishing dates, and locations. It also allows instructors to post course material on the site and see the students enrolled. Finally, students can print a certificate after finishing a course online. ASP.NET, JavaScript, and a SQL Server database will be used to develop the University Short Courses web application.

Keywords: e-learning, short courses, ASP.NET, SQL SERVER

Procedia PDF Downloads 125
24268 Event Driven Dynamic Clustering and Data Aggregation in Wireless Sensor Network

Authors: Ashok V. Sutagundar, Sunilkumar S. Manvi

Abstract:

Energy, delay, and bandwidth are the prime issues of wireless sensor networks (WSN). Energy usage optimization and efficient bandwidth utilization are important issues in WSN. Event-triggered data aggregation facilitates such optimal tasks for the event-affected area in a WSN. Reliable delivery of critical information to the sink node is also a major challenge of WSN. To tackle these issues, we propose an event-driven dynamic clustering and data aggregation scheme for WSN that enhances the lifetime of the network by minimizing redundant data transmission. The proposed scheme operates as follows: (1) Whenever an event is triggered, the event-triggered node selects the cluster head. (2) The cluster head gathers data from the sensor nodes within the cluster. (3) The cluster head identifies and classifies the events from the collected data using a Bayesian classifier. (4) Aggregation of data is done using a statistical method. (5) The cluster head discovers the paths to the sink node using residual energy, path distance, and bandwidth. (6) If the aggregated data is critical, the cluster head sends the aggregated data over multiple paths for reliable data communication. (7) Otherwise, the aggregated data is transmitted towards the sink node over the single path with the most bandwidth and residual energy. The performance of the scheme is validated for various WSN scenarios to evaluate the effectiveness of the proposed approach in terms of aggregation time, cluster formation time, and energy consumed for aggregation.
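Steps (3) and (4) could be sketched as below, with a Gaussian naive Bayes classifier standing in for the Bayesian classifier and simple statistics for the aggregation; the readings and event classes are hypothetical.

```python
# Minimal sketch (hypothetical sensor readings and event classes, not the
# authors' implementation): a cluster head classifies collected readings with
# a Gaussian naive Bayes classifier and aggregates them statistically.
import numpy as np
from sklearn.naive_bayes import GaussianNB

rng = np.random.default_rng(3)

# Training data at the cluster head: [temperature, smoke] -> 0 = normal, 1 = fire event
X_train = np.vstack([rng.normal([25, 0.1], 2, (50, 2)),
                     rng.normal([70, 0.8], 5, (50, 2))])
y_train = np.array([0] * 50 + [1] * 50)
classifier = GaussianNB().fit(X_train, y_train)

# Readings gathered from member nodes after an event trigger.
readings = rng.normal([68, 0.75], 4, (12, 2))
event_class = int(np.round(classifier.predict(readings).mean()))

aggregate = {                         # statistical aggregation sent to the sink
    "event_class": event_class,       # 1 -> critical, route over multiple paths
    "mean": readings.mean(axis=0).round(2).tolist(),
    "max": readings.max(axis=0).round(2).tolist(),
    "n_samples": len(readings),
}
print(aggregate)
```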

Keywords: wireless sensor network, dynamic clustering, data aggregation, wireless communication

Procedia PDF Downloads 442
24267 Offshore Outsourcing: Global Data Privacy Controls and International Compliance Issues

Authors: Michelle J. Miller

Abstract:

In recent years, there has been a rise of two emerging issues that impact the global employment and business market and that the legal community must review more closely: offshore outsourcing and data privacy. These two issues intersect because employment opportunities are shifting due to offshore outsourcing, and in some states, like the United States, anti-outsourcing legislation has been passed or presented to retain jobs within the country. In addition, the legal requirements to retain the privacy of data as a global employer extend to employees and third-party service providers, including services outsourced to offshore locations. For this reason, this paper will review the intersection of these two issues with a specific focus on data privacy.

Keywords: outsourcing, data privacy, international compliance, multinational corporations

Procedia PDF Downloads 403
24266 Weighted Data Replication Strategy for Data Grid Considering Economic Approach

Authors: N. Mansouri, A. Asadi

Abstract:

A data grid is a geographically distributed environment that deals with data-intensive applications in scientific and enterprise computing. Data replication is a common method used to achieve efficient and fault-tolerant data access in grids. In this paper, a dynamic data replication strategy, called Enhanced Latest Access Largest Weight (ELALW), is proposed. This strategy is an enhanced version of the Latest Access Largest Weight strategy. However, replication should be used wisely because the storage capacity of each grid site is limited; thus, it is important to design an effective strategy for the replica replacement task. ELALW replaces replicas based on the number of future requests, the size of the replica, and the number of copies of the file. It also improves access latency by selecting the best replica when various sites hold replicas. The proposed replica selection chooses the best replica location from among the many replicas based on response time, which can be determined by considering the data transfer time, the storage access latency, the replica requests waiting in the storage queue, and the distance between nodes. Simulation results using OptorSim show that our replication strategy achieves better overall performance than other strategies in terms of job execution time, effective network usage, and storage resource usage.
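A minimal sketch of the two decisions described, scoring replicas for replacement and picking the lowest-response-time source, is given below; the scoring weights are hypothetical, not the published ELALW formulas.

```python
# Minimal sketch (hypothetical weights, not the published ELALW formulas): score
# replicas for replacement and pick the source replica with the lowest
# estimated response time.
from dataclasses import dataclass

@dataclass
class Replica:
    file_id: str
    size_mb: float
    predicted_requests: int   # expected future accesses
    copies_in_grid: int       # how many other sites hold this file
    transfer_time_s: float    # network transfer estimate from this site
    storage_latency_s: float
    queue_wait_s: float

def replacement_score(r: Replica) -> float:
    """Lower score = better eviction candidate (rarely needed, large, well replicated)."""
    return r.predicted_requests / (r.size_mb * r.copies_in_grid)

def response_time(r: Replica) -> float:
    """Estimated time to serve a request from this replica location."""
    return r.transfer_time_s + r.storage_latency_s + r.queue_wait_s

candidates = [Replica("f1", 500, 2, 4, 3.0, 0.2, 1.0),
              Replica("f2", 120, 9, 1, 3.0, 0.2, 1.0)]
evict_first = min(candidates, key=replacement_score)    # f1: rarely used, many copies

sources = [Replica("f3", 80, 5, 2, 3.0, 0.2, 1.0),
           Replica("f3", 80, 5, 2, 1.0, 0.5, 0.1)]
best_source = min(sources, key=response_time)
print("evict:", evict_first.file_id, "| serve f3 in", response_time(best_source), "s")
```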

Keywords: data grid, data replication, simulation, replica selection, replica placement

Procedia PDF Downloads 255
24265 Evaluation of Satellite and Radar Rainfall Product over Seyhan Plain

Authors: Kazım Kaba, Erdem Erdi, M. Akif Erdoğan, H. Mustafa Kandırmaz

Abstract:

Rainfall is a crucial data source for very different disciplines such as agriculture, hydrology, and climate. Therefore, the rain rate should be well known, both spatially and temporally, for any area. Traditionally, rainfall has been measured for many years using rain gauges at meteorological ground stations. At the present time, rainfall products are also acquired from radar and satellite images with temporal and spatial continuity. In this study, we investigated the accuracy of these rainfall data against rain-gauge data. For this purpose, we used the Adana-Hatay radar hourly total precipitation product (RN1) and the Meteosat convective rainfall rate (CRR) product over the Seyhan plain. We calculated daily rainfall values from the RN1 and CRR hourly precipitation products. We used the data of rainy days from four stations located within range of the radar from October 2013 to November 2015. In the study, we examined the two rainfall products over the Seyhan plain and observed that the correlation between the rain-gauge data and the two raster rainfall products was low.
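A minimal sketch of the daily aggregation and correlation step is given below on synthetic data; the column names and values are placeholders, not the study's RN1/CRR datasets.

```python
# Minimal sketch (placeholder data and column names, not the study's datasets):
# aggregate hourly radar/satellite rainfall to daily totals and correlate each
# product with the rain-gauge record.
import numpy as np
import pandas as pd

rng = np.random.default_rng(5)
hours = pd.date_range("2014-01-01", periods=24 * 90, freq="h")
gauge = rng.gamma(0.3, 1.0, len(hours))                  # synthetic hourly gauge record
hourly = pd.DataFrame({
    "gauge": gauge,
    "radar_rn1": gauge * 0.8 + rng.normal(0, 0.3, len(hours)),
    "sat_crr": gauge * 0.5 + rng.normal(0, 0.6, len(hours)),
}, index=hours)

daily = hourly.clip(lower=0).resample("D").sum()         # daily rainfall totals
rainy = daily[daily["gauge"] > 1.0]                      # keep rainy days only
print(rainy.corr(method="pearson").round(2))
```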

Keywords: meteosat, radar, rainfall, rain-gauge, Turkey

Procedia PDF Downloads 318
24264 Business Skills Laboratory in Action: Combining a Practice Enterprise Model and an ERP-Simulation to a Comprehensive Business Learning Environment

Authors: Karoliina Nisula, Samuli Pekkola

Abstract:

Business education has been criticized for being too theoretical and distant from business life. Different types of experiential learning environments ranging from manual role-play to computer simulations and enterprise resource planning (ERP) systems have been used to introduce the realistic and practical experience into business learning. Each of these learning environments approaches business learning from a different perspective. The implementations tend to be individual exercises supplementing the traditional courses. We suggest combining them into a business skills laboratory resembling an actual workplace. In this paper, we present a concrete implementation of an ERP-supported business learning environment that is used throughout the first year undergraduate business curriculum. We validate the implementation by evaluating the learning outcomes through the different domains of Bloom’s taxonomy. We use the role-play oriented practice enterprise model as a comparison group. Our findings indicate that using the ERP simulation improves the poor and average students’ lower-level cognitive learning. On the affective domain, the ERP-simulation appears to enhance motivation to learn as well as perceived acquisition of practical hands-on skills.

Keywords: business simulations, experiential learning, ERP systems, learning environments

Procedia PDF Downloads 248
24263 Spatial Data Mining by Decision Trees

Authors: Sihem Oujdi, Hafida Belbachir

Abstract:

Existing methods of data mining cannot be applied directly to spatial data because spatial data require the consideration of spatial specificities, such as spatial relationships. This paper focuses on classification with decision trees, one of the data mining techniques. We propose an extension of the C4.5 algorithm for spatial data, based on two different approaches: join materialization and querying the different tables on the fly. Similar works have been done on these two main approaches; the first, join materialization, favors processing time at the expense of memory space, whereas the second, querying the different tables on the fly, saves memory space at the expense of processing time. The modified C4.5 algorithm requires three input tables: a target table, a neighbor table, and a spatial join index that contains the possible spatial relationships among the objects in the target table and those in the neighbor table. The proposed algorithms are applied to a spatial data pattern in the accidentology domain. A comparative study of our approach with other works on classification by spatial decision trees will be detailed.
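A minimal sketch of the join-materialization approach is given below: the target, neighbor, and spatial-join tables are flattened into one feature table before learning, with scikit-learn's entropy-based decision tree standing in for C4.5; the tables and columns are hypothetical.

```python
# Minimal sketch (hypothetical tables/columns; sklearn's entropy-based tree as a
# stand-in for C4.5): "join materialization" flattens the target table, the
# neighbor table, and the spatial-join index into one table before learning.
import pandas as pd
from sklearn.tree import DecisionTreeClassifier, export_text

target = pd.DataFrame({"road_id": [1, 2, 3, 4],
                       "speed_limit": [50, 90, 50, 110],
                       "severity": ["low", "high", "low", "high"]})
neighbor = pd.DataFrame({"zone_id": [10, 11],
                         "land_use": ["urban", "rural"]})
spatial_join = pd.DataFrame({"road_id": [1, 2, 3, 4],      # materialized relation:
                             "zone_id": [10, 11, 10, 11],  # road "crosses" zone
                             "relation": ["crosses"] * 4})

flat = target.merge(spatial_join, on="road_id").merge(neighbor, on="zone_id")
X = pd.get_dummies(flat[["speed_limit", "land_use", "relation"]])
y = flat["severity"]

tree = DecisionTreeClassifier(criterion="entropy").fit(X, y)
print(export_text(tree, feature_names=list(X.columns)))
```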

Keywords: C4.5 algorithm, decision trees, S-CART, spatial data mining

Procedia PDF Downloads 607
24262 Advancing in Cricket Analytics: Novel Approaches for Pitch and Ball Detection Employing OpenCV and YOLOV8

Authors: Pratham Madnur, Prathamkumar Shetty, Sneha Varur, Gouri Parashetti

Abstract:

In order to overcome conventional obstacles, this research paper investigates novel approaches for cricket pitch and ball detection that make use of cutting-edge technologies. The research integrates OpenCV for pitch inspection and modifies the YOLOv8 model for cricket ball detection in order to overcome the shortcomings of manual pitch assessment and traditional ball detection techniques. To ensure flexibility in a range of pitch environments, the pitch detection method leverages OpenCV's color space transformation, contour extraction, and accurate color range definition features. Regarding ball detection, the YOLOv8 model emphasizes the preservation of small object details to improve accuracy and is specifically trained on the unique properties of cricket balls. The methods are made more reliable by careful preparation of the datasets, which include novel ball and pitch information. These cutting-edge methods not only improve cricket analytics but also set the stage for flexible methods in more general sports technology applications.
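A minimal sketch of the pitch-inspection steps (color-space transformation, color-range thresholding, contour extraction) together with loading a YOLOv8 model is given below; the file names and HSV range are placeholders, not the paper's tuned values.

```python
# Minimal sketch (placeholder file names and HSV range, not the paper's tuned
# values): isolate the pitch by colour thresholding and contour extraction with
# OpenCV, then run a YOLOv8 model on the same frame for ball detection.
import cv2
import numpy as np
from ultralytics import YOLO

frame = cv2.imread("frame.jpg")                       # hypothetical broadcast frame
hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)          # colour-space transformation

lower, upper = np.array([10, 40, 80]), np.array([30, 255, 255])  # tan/brown pitch tones
mask = cv2.inRange(hsv, lower, upper)                 # colour-range thresholding
contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
pitch = max(contours, key=cv2.contourArea)            # largest tan region ~ the pitch
x, y, w, h = cv2.boundingRect(pitch)

model = YOLO("yolov8n.pt")                            # would be fine-tuned on cricket balls
detections = model(frame)[0]
print("pitch bounding box:", (x, y, w, h), "| detections:", len(detections.boxes))
```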

Keywords: OpenCV, YOLOv8, cricket, custom dataset, computer vision, sports

Procedia PDF Downloads 60
24261 Optimization of Oxygen Plant Parameters Simulating with MATLAB

Authors: B. J. Sonani, J. K. Ratnadhariya, Srinivas Palanki

Abstract:

Cryogenic engineering is a fast-growing branch of modern technology. There are various applications of cryogenic engineering, such as liquefaction in gas industries, metal industries, medical science, space technology, and transportation. Low-temperature technology has enabled superconducting materials, which lead to reduced friction and wear in various components of such systems. Liquid oxygen, hydrogen, and helium play a vital role in space applications. The liquefaction process produces very low temperature liquids for various applications in research and modern industry. The air liquefaction system for oxygen plants in gas industries is based on the Claude cycle. The effect of process parameters on the overall system is difficult to analyse by manual calculations, and this provides the motivation to use process simulators for understanding the steady-state and dynamic behaviour of such systems. The parametric study of this system via MATLAB simulations provides useful guidelines for the preliminary design of an air liquefaction system based on the Claude cycle. Every organization is always trying to reduce cost and obtain the optimum performance of the plant in order to stay in the competitive market.

Keywords: cryogenic, liquefaction, low-temperature, oxygen, Claude cycle, optimization, MATLAB

Procedia PDF Downloads 318
24260 The Association of Excessive Work Stress with Job Satisfaction and Turnover Intention in Operating Room Nurses: A Cross-Sectional Study in a Metropolitan Teaching Hospital in Southern Taiwan

Authors: Chia Yu Chen, Shu Fen Wu, Chen-Fuh Lam, I-Ling Tsai, Shu Jiuan Chen, Yen Ling Liu

Abstract:

Aim: It remains undetermined whether increased work stress affects job satisfaction and career loyalty among nursing staff in the operating room. The long-term goal of this study is to lengthen the professional life of operating room nurses by attenuating work stress and enhancing their contentment in work. Method: This was a cross-sectional, descriptive study performed in a metropolitan teaching hospital in southern Taiwan between May 2017 and July 2017. A structured self-administered questionnaire, modified from the Occupational Stress Indicator-2 (OSI-2) and Maslach Burnout Inventory (MBI) manuals, was collected from the operating room nurses. The chi-square test was used to analyze the categorical data, and Pearson correlation was used to analyze the association between numerical datasets (SPSS version 20.0). Results: The response rate was 80% (80/100), and a total of 73 (73%) completed forms were eventually processed for analysis. The average scores for work stress and job satisfaction of the operating room nurses were 145.96±32.91 and 47.38±6.07, respectively. The correlation coefficients of work stress versus job satisfaction and organizational identity were r=-0.338 (p=0.003) and r=-0.354 (p=0.002), respectively. More nurses who worked rotating shifts quit the operating room than those who worked only day shifts (χ²=5.176, p<0.05). Nurses who reported lower job satisfaction showed significantly higher turnover intention (t=3.714, p<0.01). Following multivariate regression analysis, rotating shift work and low job satisfaction were identified as the two independent predictors of intention to quit working in the operating room. Conclusion: Our study clearly demonstrates that increased work stress significantly attenuates job satisfaction and organizational identity. Rotating shift work is associated with higher work stress, lower job satisfaction, and higher turnover intention, which is consistent with previous surveys carried out in the department of medical technology. Therefore, improvement of working quality in the operating room is essential to increase the retention intention of well-trained nursing staff. Further investigation into types of work shifts and other strategies for attenuating stress in the workplace is currently being undertaken in order to improve job satisfaction and decrease turnover intention in the operating room.

Keywords: rotating shift, work stress, job satisfaction, turnover intention

Procedia PDF Downloads 187
24259 Gulfnet: The Advent of Computer Networking in Saudi Arabia and Its Social Impact

Authors: Abdullah Almowanes

Abstract:

The speed of adoption of new information and communication technologies is often seen as an indicator of the growth of knowledge- and technological-innovation-based regional economies. Indeed, technological progress and scientific inquiry in any society have undergone a particularly profound transformation with the introduction of computer networks. In the spring of 1981, the Bitnet network was launched to link thousands of nodes all over the world. In 1985, as one of the first adopters of Bitnet, Saudi Arabia launched a Bitnet-based network named Gulfnet that linked computer centers, universities, and libraries of Saudi Arabia and other Gulf countries through high-speed communication lines. In this paper, the origins and the deployment of Gulfnet are discussed, as well as the social, economic, political, and cultural ramifications of the new information reality created by the network. Despite its significance, the social and cultural aspects of Gulfnet have not previously been investigated to a satisfactory degree in the history of science and technology literature. The presented research is based on extensive archival work aimed at seeking out and analyzing primary evidence from archival sources and records. During its decade-and-a-half-long existence, Gulfnet demonstrated that the scope and functionality of public computer networks in Saudi Arabia had to be fine-tuned for compliance with the Islamic culture and political system of the country. It also helped lay the groundwork for the subsequent introduction of the Internet. Since the 1980s, in just a few decades, the proliferation of computer networks has transformed communications worldwide.

Keywords: Bitnet, computer networks, computing and culture, Gulfnet, Saudi Arabia

Procedia PDF Downloads 239
24258 Data-Driven Dynamic Overbooking Model for Tour Operators

Authors: Kannapha Amaruchkul

Abstract:

We formulate a dynamic overbooking model for a tour operator, in which most reservations contain at least two people. The cancellation rate and the timing of the cancellation may depend on the group size. We propose two overbooking policies, namely economic- and service-based. In an economic-based policy, we want to minimize the expected oversold and underused cost, whereas, in a service-based policy, we ensure that the probability of an oversold situation does not exceed the pre-specified threshold. To illustrate the applicability of our approach, we use tour package data in 2016-2018 from a tour operator in Thailand to build a data-driven robust optimization model, and we tested the proposed overbooking policy in 2019. We also compare the data-driven approach to the conventional approach of fitting data into a probability distribution.
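A minimal sketch of the service-based policy is given below, assuming independent per-booking cancellations, which simplifies the paper's group-size-dependent model: the booking limit is the largest value whose oversell probability stays under the threshold.

```python
# Minimal sketch (independent per-booking cancellations, simplified relative to
# the paper's group-size-dependent model): the service-based policy picks the
# largest booking limit whose oversell probability stays under a threshold.
from scipy.stats import binom

def service_based_limit(capacity: int, show_prob: float, max_risk: float) -> int:
    """Largest number of accepted bookings with P(shows > capacity) <= max_risk."""
    limit = capacity
    while binom.sf(capacity, limit + 1, show_prob) <= max_risk:
        limit += 1
    return limit

# 40-seat tour, each booking shows up with probability 0.9, tolerate 5% oversell risk.
print(service_based_limit(capacity=40, show_prob=0.9, max_risk=0.05))
```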

Keywords: applied stochastic model, data-driven robust optimization, overbooking, revenue management, tour operator

Procedia PDF Downloads 125
24257 Modeling and Statistical Analysis of a Soap Production Mix in Bejoy Manufacturing Industry, Anambra State, Nigeria

Authors: Okolie Chukwulozie Paul, Iwenofu Chinwe Onyedika, Sinebe Jude Ebieladoh, M. C. Nwosu

Abstract:

This research work is based on the statistical analysis of production process data. The essence is to analyze the data statistically and to generate a design model for the production mix of soap products at the Bejoy manufacturing company, Nkpologwu, Aguata Local Government Area, Anambra State, Nigeria. The statistical analysis shows the characteristics and the correlation of the data. The t-test, partial correlation, and bivariate correlation were used to understand what the data portray. The design model developed was used to model the production yield, and the correlation of the variables shows that the R² is 98.7%. The results confirm that the data are fit for further analysis and modeling, as proved by the correlation and the R-squared value.

Keywords: general linear model, correlation, variables, Pearson, significance, t-test, soap, production mix, statistics

Procedia PDF Downloads 436
24256 Helping the Development of Public Policies with Knowledge of Criminal Data

Authors: Diego De Castro Rodrigues, Marcelo B. Nery, Sergio Adorno

Abstract:

The project aims to develop a framework for social data analysis, particularly by mobilizing criminal records and applying descriptive computational techniques, such as associative algorithms and the extraction of decision-tree rules, among others. The methods and instruments discussed in this work will enable the discovery of patterns, providing a guided means to identify similarities between recurring situations in the social sphere using descriptive techniques and data visualization. The study area has been defined as the city of São Paulo, with the structuring of social data as the central idea and a particular focus on the quality of the information. Given this, a set of tools will be validated, including the use of a database and tools for visualizing the results. Among the main deliverables, related to products and the development of articles, are the discoveries made during the research phase. The effectiveness and utility of the results will depend on studies involving real data, validated both by domain experts and by identifying and comparing the patterns found in this study with other phenomena described in the literature. The intention is to contribute to evidence-based understanding and decision-making in the social field.

Keywords: social data analysis, criminal records, computational techniques, data mining, big data

Procedia PDF Downloads 74
24255 An Inquiry of the Impact of Flood Risk on Housing Market with Enhanced Geographically Weighted Regression

Authors: Lin-Han Chiang Hsieh, Hsiao-Yi Lin

Abstract:

This study aims to determine the impact of the disclosure of a flood potential map on housing prices. The disclosure is supposed to mitigate market failure by reducing information asymmetry. On the other hand, opponents argue that the official disclosure of simulated results will only create unnecessary disturbances in the housing market. This study identifies the impact of the disclosure of the flood potential map by comparing the hedonic price of flood potential before and after the disclosure. The flood potential map used in this study was published by the Taipei municipal government in 2015 and is the result of a comprehensive simulation based on geographical, hydrological, and meteorological factors. The residential property sales data from 2013 to 2016 used in this study are collected from the actual sales price registration system of the Department of Land Administration (DLA). The result shows that the impact of flood potential on the residential real estate market is statistically significant both before and after the disclosure, but the trend is clearer after the disclosure, suggesting that the disclosure does have an impact on the market. Also, the result shows that the impact of flood potential differs by the severity and frequency of precipitation. The negative impact of a relatively mild, high-frequency flood potential is stronger than that of a heavy, low-probability flood potential, indicating that home buyers are more concerned with the frequency than with the intensity of flooding. Another contribution of this study is methodological. The classic hedonic price analysis with OLS regression suffers from two spatial problems: the endogeneity problem caused by omitted spatially related variables, and the heterogeneity concern regarding the presumption that regression coefficients are spatially constant. These two problems are seldom considered in a single model. This study deals with the endogeneity and heterogeneity problems together by combining a spatial fixed-effect model and geographically weighted regression (GWR). A series of studies indicates that the hedonic price of certain environmental assets varies spatially when GWR is applied. Since the endogeneity problem is usually not considered in typical GWR models, it is arguable that omitted spatially related variables might bias the results of GWR models. By combining the spatial fixed-effect model and GWR, this study concludes that the effect of the flood potential map is highly sensitive to location, even after controlling for spatial autocorrelation. The main policy implication of this result is that it is improper to determine the potential benefit of a flood prevention policy by simply multiplying the hedonic price of flood risk by the number of houses; the effect of flood prevention might vary dramatically by location.
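A minimal sketch of the geographically weighted regression idea the study builds on is given below: a separate weighted least-squares fit at each location with Gaussian distance-decay weights; the variables, data, and bandwidth are illustrative, not the study's specification.

```python
# Minimal sketch (illustrative variables and bandwidth, not the study's
# specification): geographically weighted regression fits a separate weighted
# least-squares model at each location with distance-decayed weights.
import numpy as np

rng = np.random.default_rng(4)
n = 300
coords = rng.uniform(0, 10, size=(n, 2))                 # sale locations on a km grid
flood_risk = rng.random(n)                               # hypothetical flood-potential score
floor_area = rng.normal(30, 8, n)
# Simulated log prices: the flood discount varies smoothly over space.
log_price = (14 + 0.02 * floor_area
             - (0.1 + 0.05 * coords[:, 0] / 10) * flood_risk
             + rng.normal(0, 0.05, n))

X = np.column_stack([np.ones(n), floor_area, flood_risk])
bandwidth = 2.0                                           # Gaussian kernel bandwidth (km)

def local_coefficients(site: np.ndarray) -> np.ndarray:
    d = np.linalg.norm(coords - site, axis=1)
    w = np.exp(-(d / bandwidth) ** 2)                     # spatial kernel weights
    W = np.diag(w)
    return np.linalg.solve(X.T @ W @ X, X.T @ W @ log_price)

west, east = np.array([1.0, 5.0]), np.array([9.0, 5.0])
print("flood coefficient, west:", round(local_coefficients(west)[2], 3))
print("flood coefficient, east:", round(local_coefficients(east)[2], 3))
```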

Keywords: flood potential, hedonic price analysis, endogeneity, heterogeneity, geographically-weighted regression

Procedia PDF Downloads 285
24254 Optimization of Real Time Measured Data Transmission, Given the Amount of Data Transmitted

Authors: Michal Kopcek, Tomas Skulavik, Michal Kebisek, Gabriela Krizanova

Abstract:

The operation of nuclear power plants involves continuous monitoring of the environment in their area. This monitoring is performed using a complex data acquisition system, which collects status information about the system itself and the values of many important physical variables, e.g., temperature, humidity, and dose rate. This paper describes a proposal for and optimization of the communication that takes place in the teledosimetric system between the central control server, which is responsible for data processing and storage, and the decentralized measuring stations, which measure the physical variables. Analyses of the ongoing communication were performed, and consequently the system architecture and communication were optimized.

Keywords: communication protocol, transmission optimization, data acquisition, system architecture

Procedia PDF Downloads 514