Search results for: manual data inquiry
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 25273

24403 Gender Disparity in Film Industries: A Conceptual Study

Authors: Daniel Edem Adzovie, Jakub Kudlac

Abstract:

The subtle institutionalization of male dominance in the film industry in the 1930s, and its ripple effect of gender imbalance, especially regarding active female participation in film industries across the globe in terms of number and influence, is a worrying trend. The main purpose of the study is to explore the role of gender themes, especially patriarchal themes in films, in influencing the disparity experienced in film industries. In part, we examine the motivating vis-à-vis the demotivating factors that attract or deter females from enrolling in film schools, relative to their male contemporaries. Employing a qualitative inquiry with a specific focus on document analysis as well as experts’ opinions in order to ascertain the antecedents and consequences of patriarchal themes in films on female participation in film industries, we drew extant literature from reputable databases such as EBSCO, Scopus, Web of Science, ERIH Plus, and Google Scholar, as well as notable books on gender and film. Secondly, we conceptualized a research model for a future qualitative research design that would draw on at least three different film industries and be analyzed using thematic analysis. This could help validate the proposed conceptual model of the study. The literature review revealed that culture, to a large extent, influences the patriarchal themes conveyed in films, which inhibits active female participation in film industries. Research implications are discussed.

Keywords: film industry, female, gender, male dominance, patriarchal themes

Procedia PDF Downloads 134
24402 A Development of a Science Instructional Model Based on the STEM Education Approach to Enhance Scientific Mind and Problem Solving Skills for Primary Students

Authors: Prasita Sooksamran, Wareerat Kaewurai

Abstract:

STEM is an integrated teaching approach promoted by the Ministry of Education in Thailand. STEM Education is an integrated approach to teaching Science, Technology, Engineering, and Mathematics. Thai teachers have questioned how to integrate STEM into the classroom. Therefore, the main objective of this study is to develop a science instructional model based on the STEM approach to enhance scientific mind and problem-solving skills for primary students. This study is participatory action research and followed two steps: 1) develop the model, and 2) seek the advice of experts regarding the teaching model. Development of the instructional model began with the collection and synthesis of information from relevant documents, related research, and other sources in order to create a prototype instructional model, followed by an examination of the validity and relevance of the instructional model by a panel of nine experts. The findings were as follows: 1. The developed instructional model comprised principles, an objective, content, operational procedures, and learning evaluation. There were five principles: 1) learning based on the natural curiosity of primary school level children leading to knowledge inquiry, understanding and knowledge construction, 2) learning based on the interrelation between people and environment, 3) learning based on concrete learning experiences, exploration and the seeking of knowledge, 4) learning based on the self-construction of knowledge, creativity and innovation, and 5) relating findings to real life and the solving of real-life problems. The objective of this instructional model is to enhance scientific mind and problem-solving skills. Children are evaluated according to their achievements. Lesson content is based on science as a core subject, integrated with technology and mathematics at the grade 6 level according to The Basic Education Core Curriculum 2008 guidelines. The operational procedures consisted of 6 steps: 1) Curiosity, 2) Collection of data, 3) Collaborative planning, 4) Creativity and Innovation, 5) Criticism, and 6) Communication and Service. The learning evaluation is an authentic assessment based on continuous evaluation of all the material taught. 2. The experts agreed that the Science Instructional Model based on the STEM Education Approach had an excellent level of validity and relevance (mean 4.67, S.D. 0.50).

Keywords: instructional model, STEM education, scientific mind, problem solving

Procedia PDF Downloads 184
24401 Characterizing the Rectification Process for Designing Scoliosis Braces: Towards Digital Brace Design

Authors: Inigo Sanz-Pena, Shanika Arachchi, Dilani Dhammika, Sanjaya Mallikarachchi, Jeewantha S. Bandula, Alison H. McGregor, Nicolas Newell

Abstract:

The use of orthotic braces for adolescent idiopathic scoliosis (AIS) patients is the most common non-surgical treatment to prevent deformity progression. The traditional method to create an orthotic brace involves casting the patient’s torso to obtain a representative geometry, which is then rectified by an orthotist to the desired geometry of the brace. Recent improvements in 3D scanning technologies, rectification software, CNC, and additive manufacturing processes have made it possible to complement, or in some cases replace, manual methods with digital approaches. However, the rectification process remains dependent on the orthotist’s skills. Therefore, the rectification process needs to be carefully characterized to ensure that braces designed through a digital workflow are as efficient as those created using a manual process. The aim of this study is to compare 3D scans of patients with AIS against 3D scans of both pre- and post-rectified casts that have been manually shaped by an orthotist. Six AIS patients were recruited from the Ragama Rehabilitation Clinic, Colombo, Sri Lanka. All patients were between 10 and 15 years old, were skeletally immature (Risser grade 0-3), and had Cobb angles between 20-45°. Seven spherical markers were placed at key anatomical locations on each patient’s torso and on the pre- and post-rectified molds so that distances could be reliably measured. 3D scans were obtained of 1) the patient’s torso and pelvis, 2) the patient’s pre-rectification plaster mold, and 3) the patient’s post-rectification plaster mold using a Structure Sensor Mark II 3D scanner (Occipital Inc., USA). 3D stick body models were created for each scan to represent the distances between anatomical landmarks. The 3D stick models were used to analyze the changes in position and orientation of the anatomical landmarks between scans using the Blender open-source software. 3D surface deviation maps represented volume differences between the scans using the CloudCompare open-source software. The 3D stick body models showed changes in the position and orientation of thorax anatomical landmarks between the patient and the post-rectification scans for all patients. Anatomical landmark position and volume differences were seen between 3D scans of the patients’ torsos and the pre-rectified molds. Between the pre- and post-rectified molds, material removal was consistently seen on the anterior side of the thorax and the lateral areas below the ribcage. Volume differences were seen in areas where the orthotist planned to place pressure pads (usually at the trochanter on the side to which the lumbar curve was tilted (trochanter pad), at the lumbar apical vertebra (lumbar pad), on the rib connected to the apical vertebrae at the mid-axillary line (thoracic pad), and on the ribs corresponding to the upper thoracic vertebra (axillary extension pad)). The rectification process requires the skill and experience of an orthotist; however, this study demonstrates that the brace shape, location, and volume of material removed from the pre-rectification mold can be characterized and quantified. Results from this study can be fed into software that can accelerate the brace design process, taking steps towards an automated digital rectification process.

Keywords: additive manufacturing, orthotics, scoliosis brace design, sculpting software, spinal deformity

Procedia PDF Downloads 141
24400 The Integration and Automation of EDA Tools in an Integrated Circuit Design Environment

Authors: Rohaya Abdul Wahab, Raja Mohd Fuad Tengku Aziz, Nazaliza Othman, Sharifah Saleh, Nabihah Razali, Rozaimah Baharim, M. Hanif M. Nasir

Abstract:

This paper will discuss how EDA tools are integrated and automated in an integrated circuit design environment. Some of the problems faced in our current environment are that users need to manually configure library paths, start-up files, and project directories. Certain manual processes that happen between the users and applications can be automated, but they must be transparent to the users. For example, the users can run the applications directly after login without knowing the locations of library paths and start-up files. The solution to these problems is to automate the processes using standard configuration files, which will benefit the users and EDA support. This paper will discuss how the implementation is done to automate the processes using scripting languages such as Perl, Tcl, Scheme, and Shell Script. These scripting tools are great assets for design engineers to build a robust and powerful design flow, and this technique is widely used to integrate all the tools together.
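As an illustration of the configuration-file approach described above, the following minimal Python sketch shows how a wrapper could resolve library paths, start-up files, and project directories from a standard configuration file before launching a tool. The config location, section keys, and tool name are hypothetical, not taken from the paper (which used Perl, Tcl, Scheme, and shell scripts).

```python
#!/usr/bin/env python3
"""Hypothetical sketch: launch an EDA tool with paths resolved from a
standard per-project configuration file, so users need not set them manually."""
import configparser
import os
import subprocess
import sys

CONFIG = "/eda/config/project.cfg"  # hypothetical standard config location

def launch(tool: str, project: str) -> None:
    cfg = configparser.ConfigParser()
    cfg.read(CONFIG)
    env = os.environ.copy()
    # Library paths and start-up files come from the config, not the user.
    env["EDA_LIB_PATH"] = cfg.get(project, "lib_path")
    env["EDA_STARTUP"] = cfg.get(project, "startup_file")
    os.chdir(cfg.get(project, "project_dir"))
    subprocess.run([tool], env=env, check=True)

if __name__ == "__main__":
    launch(sys.argv[1], sys.argv[2])  # e.g. launch("layout_editor", "chipA")
```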

Keywords: EDA tools, integrated circuits, scripting, integration, automation

Procedia PDF Downloads 317
24399 Big Data Analytics and Data Security in the Cloud via Fully Homomorphic Encryption

Authors: Waziri Victor Onomza, John K. Alhassan, Idris Ismaila, Noel Dogonyaro Moses

Abstract:

This paper describes the problem of building secure computational services for encrypted information in cloud computing without decrypting the encrypted data; it thereby addresses the aspiration for a computational encryption model that could enhance the security of big data with respect to users’ privacy, confidentiality, and availability. The cryptographic model applied for the computational processing of the encrypted data is the Fully Homomorphic Encryption Scheme. We contribute theoretical presentations of high-level computational processes based on number theory and algebra that can easily be integrated and leveraged in cloud computing, with detailed theoretic mathematical concepts for fully homomorphic encryption models. This contribution supports the full implementation of a cryptographic security algorithm for big data analytics.
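The full FHE construction is beyond the scope of an abstract, but the homomorphic property it relies on can be shown with a toy example. The sketch below uses textbook RSA, which is multiplicatively homomorphic; it is not the paper's scheme and is insecure at these parameters, serving only to show computation on ciphertexts without decryption.

```python
# Toy illustration only: textbook RSA is multiplicatively homomorphic,
# i.e. E(a) * E(b) mod n = E(a * b mod n). Tiny insecure parameters.
p, q, e = 61, 53, 17
n = p * q                      # 3233
phi = (p - 1) * (q - 1)        # 3120
d = pow(e, -1, phi)            # private exponent (needs Python 3.8+)

def enc(m): return pow(m, e, n)
def dec(c): return pow(c, d, n)

a, b = 7, 12
c = (enc(a) * enc(b)) % n      # multiply the ciphertexts only
assert dec(c) == (a * b) % n   # decrypts to the product of the plaintexts
print(dec(c))                  # 84
```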

Keywords: big data analytics, security, privacy, bootstrapping, homomorphic, homomorphic encryption scheme

Procedia PDF Downloads 371
24398 Socioeconomic Burden of a Diagnosis of Cervical Cancer in Women in Rural Uganda: Findings from a Phenomenological Study

Authors: Germans Natuhwera, Peter Ellis, Acuda Wilson, Anne Merriman, Martha Rabwoni

Abstract:

Objective: The aim of the study was to examine the socio-economic burden and impact of a diagnosis of cervical cancer (CC) in rural women in the context of Uganda, a low-resource country, using a phenomenological enquiry. Methods: This was a multi-site phenomenological inquiry conducted at three hospice settings: Mobile Hospice Mbarara in southwestern, Little Hospice Hoima in western, and Hospice Africa Uganda Kampala in central Uganda. A purposive sample of women with a histologically confirmed diagnosis of CC was recruited. Data were collected using open-ended audio-recorded interviews conducted in the native languages of participants. Interviews were transcribed verbatim in English, and Braun and Clarke’s (2019) framework of thematic analysis was used. Results: 13 women with a mean age of 49.2 and an age range of 29-71 participated in the study. All participants were of low socioeconomic status. The majority (84.6%) had advanced disease at diagnosis. A fuller reading of transcripts produced four major themes: (1) socioeconomic characteristics of women, (2) impact of CC on women’s relationships, (3) disrupted and impaired activities of daily living (ADLs), and (4) economic disruptions. Conclusions: A diagnosis of CC introduces significant socio-economic disruptions into a woman’s and her family’s life. CC causes disability and impairs the woman’s and her family’s productivity, hence exacerbating levels of poverty in the home. High out-of-pocket expenditure on treatment, investigations, and transport costs further compounds the socio-economic burden. Decentralizing cancer care services to regional centers, scaling up screening services, subsidizing costs of cancer care services or making cervical cancer treatment free of charge, strengthening monitoring mechanisms in public facilities to curb the vice of healthcare workers soliciting bribes from patients, increasing mass awareness campaigns about cancer, training more healthcare professionals in cancer investigation, management, and palliative care, and introducing an introductory course on gynecologic cancers into all health training institutions are recommended.

Keywords: activities of daily living, cervical cancer, out-of-pocket expenditure, phenomenology, socioeconomic

Procedia PDF Downloads 203
24397 Protecting Privacy and Data Security in Online Business

Authors: Bilquis Ferdousi

Abstract:

With the exponential growth of online business, the threat to consumers’ privacy and data security has become a serious challenge. This literature review-based study focuses on developing a better understanding of those threats and of the legislative measures that have been taken to address them. Research shows that people are increasingly involved in online business using different digital devices and platforms, although this practice varies across age groups. The threat to consumers’ privacy and data security is a serious hindrance to developing trust among consumers in online businesses. Some legislative measures have been taken at the federal and state level to protect consumers’ privacy and data security. The study was based on an extensive review of the current literature on protecting consumers’ privacy and data security and on the legislative measures that have been taken.

Keywords: privacy, data security, legislation, online business

Procedia PDF Downloads 96
24396 Flowing Online Vehicle GPS Data Clustering Using a New Parallel K-Means Algorithm

Authors: Orhun Vural, Oguz Bayat, Rustu Akay, Osman N. Ucan

Abstract:

This study presents a new parallel approach for clustering GPS data. Evaluation was made by comparing the execution times of various clustering algorithms on GPS data. This paper proposes a neighborhood-based parallel K-means algorithm to make clustering faster. The proposed parallelization approach assumes that each GPS data point represents a vehicle, and that vehicles close to each other communicate after they are clustered. This parallelization approach has been examined on continuously changing GPS data of different sizes and compared with the serial K-means algorithm and other serial clustering algorithms. The results demonstrated that the proposed parallel K-means algorithm works much faster than the other clustering algorithms.
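The paper does not include code, but the data-parallel structure it describes can be sketched as follows: each worker processes a chunk of GPS points against the current centroids and returns partial sums, which are reduced into new centroids. This Python/multiprocessing sketch is illustrative; the chunking scheme and worker count are assumptions, not the authors' implementation.

```python
# Minimal sketch of one parallel K-means iteration: workers handle chunks
# of GPS points and return partial sums, reduced into new centroids.
from multiprocessing import Pool
import numpy as np

def partial_step(args):
    points, centroids = args
    # assign each point to its nearest centroid
    labels = np.argmin(((points[:, None] - centroids[None]) ** 2).sum(-1), axis=1)
    k = len(centroids)
    sums, counts = np.zeros_like(centroids), np.zeros(k)
    for j in range(k):
        mask = labels == j
        sums[j] = points[mask].sum(axis=0)
        counts[j] = mask.sum()
    return sums, counts

def parallel_kmeans_iter(points, centroids, workers=4):
    chunks = np.array_split(points, workers)
    with Pool(workers) as pool:
        results = pool.map(partial_step, [(c, centroids) for c in chunks])
    sums = sum(r[0] for r in results)
    counts = sum(r[1] for r in results)
    # empty clusters keep a zero centroid in this simplified sketch
    return sums / np.maximum(counts, 1)[:, None]

if __name__ == "__main__":
    pts = np.random.rand(10000, 2)   # stand-in for (lat, lon) GPS points
    cents = pts[np.random.choice(len(pts), 5, replace=False)]
    for _ in range(10):
        cents = parallel_kmeans_iter(pts, cents)
    print(cents)
```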

Keywords: parallel k-means algorithm, parallel clustering, clustering algorithms, clustering on flowing data

Procedia PDF Downloads 214
24395 Weed Classification Using a Two-Dimensional Deep Convolutional Neural Network

Authors: Muhammad Ali Sarwar, Muhammad Farooq, Nayab Hassan, Hammad Hassan

Abstract:

Pakistan is highly recognized for its agriculture and is well known for producing substantial amounts of wheat, cotton, and sugarcane. However, some factors contribute to a decline in crop quality and a reduction in overall output. One of the main factors contributing to this decline is the presence of weeds and their late detection. This detection process is manual and demands a detailed inspection by the farmers themselves, but with timely detection of weeds, farmers can save costs and increase overall production. The focus of this research is to identify and classify the four main types of weeds (Small-Flowered Cranesbill, Chick Weed, Prickly Acacia, and Black-Grass) that are prevalent in our region’s major crops. In this work, we implemented three different deep learning techniques on the same dataset: YOLO-v5, Inception-v3, and a deep CNN, and concluded that the deep convolutional neural network performed best, with an accuracy of 97.45%. Relative to the state of the art, our proposed approach yields 2% better results. We devised the architecture in an efficient way such that it can be used in real time.
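For orientation, a minimal 2D convolutional classifier for the four weed classes might look like the PyTorch sketch below. The layer sizes and input resolution are illustrative assumptions; the paper does not specify its exact architecture.

```python
# Illustrative 2D CNN for four weed classes (PyTorch); sizes are assumptions.
import torch
import torch.nn as nn

class WeedCNN(nn.Module):
    def __init__(self, num_classes=4):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(64, 128, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(128, num_classes)

    def forward(self, x):
        return self.classifier(self.features(x).flatten(1))

model = WeedCNN()
logits = model(torch.randn(8, 3, 128, 128))   # batch of 8 RGB field crops
print(logits.shape)                            # torch.Size([8, 4])
```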

Keywords: deep convolutional networks, YOLO, machine learning, agriculture

Procedia PDF Downloads 101
24394 An Analysis of Privacy and Security for Internet of Things Applications

Authors: Dhananjay Singh, M. Abdullah-Al-Wadud

Abstract:

The Internet of Things is a concept of a large-scale ecosystem of wireless actuators. The actuators are defined as things in the IoT: those which contribute or produce data to the ecosystem. However, ubiquitous data collection, data security, privacy preservation, large-volume data processing, and intelligent analytics are some of the key challenges in IoT technologies. To address the security requirements, challenges, and threats in the IoT, we discuss a message authentication mechanism for IoT applications. Finally, we discuss a data encryption mechanism for message authentication before propagation into IoT networks.
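As a concrete, hedged illustration of message authentication for IoT payloads, the sketch below uses HMAC-SHA256 with a pre-shared per-device key. The abstract does not name a specific algorithm, so HMAC is an assumed stand-in; the key and payload are hypothetical.

```python
# Sketch of symmetric message authentication for IoT payloads (HMAC-SHA256).
import hmac
import hashlib

KEY = b"per-device-secret"          # hypothetical key shared by sensor and gateway

def tag(payload: bytes) -> bytes:
    return hmac.new(KEY, payload, hashlib.sha256).digest()

def verify(payload: bytes, received_tag: bytes) -> bool:
    # constant-time comparison resists timing attacks
    return hmac.compare_digest(tag(payload), received_tag)

msg = b'{"sensor":"temp-07","value":21.4}'
t = tag(msg)
assert verify(msg, t)
assert not verify(msg + b"x", t)    # any tampering invalidates the tag
```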

Keywords: Internet of Things (IoT), message authentication, privacy, security

Procedia PDF Downloads 373
24393 Cognitive Science Based Scheduling in Grid Environment

Authors: N. D. Iswarya, M. A. Maluk Mohamed, N. Vijaya

Abstract:

A grid is an infrastructure that allows the deployment of large distributed data sets from multiple locations to reach a common goal. Scheduling data-intensive applications becomes challenging because the data sets are very large. Only two solutions exist to tackle this challenging issue. First, the computation that requires huge data sets can be transferred to the data site. Second, the required data sets can be transferred to the computation site. In the former scenario, the computation cannot be transferred since the servers are storage/data servers with little or no computational capability. Hence, the second scenario can be considered for further exploration. During scheduling, transferring huge data sets from one site to another requires considerable network bandwidth. In order to mitigate this issue, this work focuses on incorporating cognitive science into scheduling. Cognitive science is the study of the human brain and its related activities. Current research mainly focuses on incorporating cognitive science into various computational modeling techniques. In this work, the problem-solving approach of the human brain is studied and incorporated into data-intensive scheduling in grid environments. Here, a cognitive engine (CE) is designed and deployed at various grid sites. The intelligent agents present in the CE help in analyzing requests and creating the knowledge base. Depending upon the link capacity, a decision is taken whether to transfer the data sets or to partition them. The agents predict the next request so as to serve the requesting site with data sets in advance. This reduces the data availability time and data transfer time. The replica catalog and metadata catalog created by the agents assist in the decision-making process.
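The transfer-or-partition decision described above can be sketched as a simple link-capacity heuristic: estimate the transfer time over the link and split the data set when a whole transfer would miss a deadline. The thresholds and units in this Python sketch are invented for illustration and are not the paper's cognitive-engine logic.

```python
# Hypothetical sketch of a transfer-or-partition decision based on link capacity.
def schedule_transfer(dataset_gb: float, link_mbps: float, deadline_s: float):
    transfer_s = dataset_gb * 8000 / link_mbps   # GB -> megabits / link rate
    if transfer_s <= deadline_s:
        return ("transfer_whole", 1)
    # otherwise partition into roughly deadline-sized chunks for parallel links
    parts = int(-(-transfer_s // deadline_s))    # ceiling division
    return ("partition", parts)

print(schedule_transfer(50, 100, 1800))   # 50 GB over 100 Mbps vs. a 30 min deadline
```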

Keywords: data grid, grid workflow scheduling, cognitive artificial intelligence

Procedia PDF Downloads 388
24392 Heritage and Tourism in the Era of Big Data: Analysis of Chinese Cultural Tourism in Catalonia

Authors: Xinge Liao, Francesc Xavier Roige Ventura, Dolores Sanchez Aguilera

Abstract:

With the development of the Internet, the study of tourism behavior has rapidly expanded from the traditional physical market to the online market. Data on the Internet are characterized by dynamic change, and new data appear all the time. In recent years, a large volume of data has been generated by forums, blogs, and other sources; expanding over time and space, these together constitute large-scale Internet data, known as Big Data. These data of technological origin, derived from the use of devices and the activity of multiple users, are becoming a source of great importance for the study of geography and the behavior of tourists. The study will focus on cultural heritage tourist practices in the context of Big Data. The research will focus on exploring the characteristics and behavior of Chinese tourists in relation to the cultural heritage of Catalonia. Geographical information, destination image, and perceptions in user-generated content will be studied through the analysis of data from Weibo, the largest microblogging social network in China. Through the analysis of the behavior of heritage tourists in the Big Data environment, this study will characterize the practices (activities, motivations, perceptions) of cultural tourists and thereby understand the needs and preferences of tourists, in order to better guide the sustainable development of tourism in heritage sites.

Keywords: Barcelona, Big Data, Catalonia, cultural heritage, Chinese tourism market, tourists’ behavior

Procedia PDF Downloads 130
24391 Towards a Framework for Using Open Data for Accountability: A Case Study of a Program to Reduce Corruption

Authors: Darusalam, Jorish Hulstijn, Marijn Janssen

Abstract:

The media have revealed a variety of corruption cases in regional and local governments all over the world. Many governments have pursued anti-corruption reforms and have created systems of checks and balances. Citizens face three types of corruption: administrative corruption, collusion, and extortion. Accountability is one of the benchmarks for building transparent government. The public sector is required to report the results of the programs that have been implemented so that citizens can judge whether the institution has been working economically, efficiently, and effectively. Open Data offers solutions for the implementation of good governance in organizations that want to be more transparent. In addition, Open Data can create transparency and accountability towards the community. The objective of this paper is to build a framework of open data for accountability to combat corruption. This paper will investigate the relationship between open data and accountability as part of anti-corruption initiatives, and will investigate the impact of open data implementation on public organizations.

Keywords: open data, accountability, anti-corruption, framework

Procedia PDF Downloads 321
24390 Intelligent Semi-Active Suspension Control of an Electric Model Vehicle System

Authors: Shiuh-Jer Huang, Yun-Han Yeh

Abstract:

A four-wheel drive electric vehicle was built with hub DC motors and an FPGA embedded control structure. A 40-step manually adjustable motorcycle shock absorber was refitted with a DC motor driving mechanism to construct a semi-active suspension system. Accelerometer and potentiometer sensors were installed to measure the sprung mass acceleration and the compression or rebound states of the suspension system for control purposes. An intelligent fuzzy logic controller was proposed to search in real time for the appropriate damping ratio based on the vehicle’s running condition. Then, a robust fuzzy sliding mode controller (FSMC) was employed to regulate the target damping ratio of the semi-active suspension system at each wheel axis. Finally, different road surface conditions were chosen to evaluate the control performance of this semi-active suspension and compare it with that of a passive system based on the wheel axis acceleration signal.

Keywords: acceleration, FPGA, Fuzzy sliding mode control, semi-active suspension

Procedia PDF Downloads 405
24389 Simulation of Flood Inundation in Kedukan River Using HEC-RAS and GIS

Authors: Reini S. Ilmiaty, Muhammad B. Al Amin, Sarino, Muzamil Jariski

Abstract:

The Kedukan River is an artificial river which serves as a drainage channel for the Boang watershed in Palembang. The river’s upstream and downstream ends are connected to the Musi River, and it often overflows and floods because of huge runoff discharge and the high tide water level of the Musi River. This study aimed to analyze the flood water surface profile of the Kedukan River, followed by a flood inundation simulation to determine flood-prone areas in the research area. The analysis starts with peak runoff discharge calculations using the rational method, followed by water surface profile analysis using the HEC-RAS program, checked against manual calculations using the standard step method. The analysis was then followed by a flood inundation simulation using the ArcGIS program integrated with HEC-GeoRAS. The flood inundation simulation of the Kedukan River produces inundation characteristic maps with inundation depth, area, and perimeter as the parameters. The inundation maps are very useful in providing an overview of flood-prone areas along the Kedukan River.
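For reference, the rational method used for the peak runoff discharge is Q = 0.278·C·i·A (Q in m³/s, with the runoff coefficient C dimensionless, rainfall intensity i in mm/h, and catchment area A in km²). The worked example below uses illustrative values, not the study's data.

```python
# Worked example of the rational method: Q = 0.278 * C * i * A.
# The coefficient and catchment values below are illustrative, not the study's.
C = 0.70        # runoff coefficient for a dense urban catchment (assumed)
i = 80.0        # design rainfall intensity, mm/h (assumed)
A = 12.5        # catchment area, km^2 (assumed)
Q = 0.278 * C * i * A
print(f"Peak runoff discharge Q = {Q:.1f} m^3/s")   # ~194.6 m^3/s
```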

Keywords: flood modelling, HEC-GeoRAS, HEC-RAS, inundation map

Procedia PDF Downloads 504
24388 Syndromic Surveillance Framework Using Tweets Data Analytics

Authors: David Ming Liu, Benjamin Hirsch, Bashir Aden

Abstract:

Syndromic surveillance aims to detect or predict disease outbreaks through the analysis of medical data sources. Using social media data such as tweets for syndromic surveillance is becoming more and more popular, aided by open platforms for data collection and the advantages of microblogging text and mobile geolocation features. In this paper, a syndromic surveillance framework with a machine learning kernel using tweet data analytics is presented. Influenza is used as the test disease, and the three United Arab Emirates cities of Abu Dhabi, Al Ain, and Dubai as the trial areas. Hospital case data provided by the Health Authority of Abu Dhabi (HAAD) are used for correlation. In our model, a Latent Dirichlet Allocation (LDA) engine is adapted to perform supervised learning classification, and N-fold cross-validation confusion matrices are given as the simulation results, with an overall system recall of 85.595% achieved.
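The LDA stage can be sketched with scikit-learn: tweets are vectorized into term counts and decomposed into latent topics whose per-tweet mixtures can feed a downstream classifier. The tweets and parameters below are illustrative, not the study's corpus or settings.

```python
# Sketch of the LDA stage: topic mixtures as features for a flu classifier.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

tweets = [  # illustrative stand-ins for collected tweets
    "feeling feverish and coughing all night",
    "flu season again, whole office is sick",
    "beautiful morning for a run in Abu Dhabi",
    "sore throat, headache, staying home today",
]
X = CountVectorizer(stop_words="english").fit_transform(tweets)
lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(X)
topic_mix = lda.transform(X)      # per-tweet topic proportions
print(topic_mix.round(2))         # features for the downstream classifier
```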

Keywords: syndromic surveillance, tweets, machine learning, data mining, latent Dirichlet allocation (LDA), influenza

Procedia PDF Downloads 106
24387 Encouraging Skills and Entrepreneurial Spirit to Improve Employability of Young Artists

Authors: Olga Lasaga, Carmen Parra

Abstract:

Within the EU 'New Skills for New Jobs' initiative, the art and music sector is considered one of the most vulnerable. Its graduates are faced with the threat of the dole or of not finding work in the sector in which they trained. In this regard, an increasing number of students are graduating every year from European Conservatories and Fine Arts Centres, while the number of job opportunities in this sector has stagnated or decreased. Moreover, the traditional teaching of these institutes does not favour the acquisition of basic skills, such as team building, entrepreneurship, marketing, website design and the design of events, which are among the most important facets of project management and are precisely those aspects that are often most related to the improvement of employability in the art world. To remedy this situation, the results of the European Erasmus+ OMEGA project (Opening More Employment Gates for Art and Music Students) are presented. The OMEGA project aims to increase the employability of art and music students by equipping them with additional skills needed for the search for work. As a result of this project, a manual has been created, a pilot course has been designed and taught, and a comparative study has been conducted on the state of play of the participating countries.

Keywords: artists, employability, entrepreneurship, musicians, skills

Procedia PDF Downloads 236
24386 The Utilisation of Storytelling as a Therapeutic Intervention by Educational Psychologists to Address Behavioural Challenges Relating to Grief of Adolescent Clients

Authors: Laila Jeebodh Desai

Abstract:

Storytelling as a therapeutic intervention entails the narrating of events by externalising emotions, thoughts and responses to life-changing events such as loss and grief. This creates the opportunity for clients to engage with psychologists by projecting various beliefs and challenges, such as grief, through a range of therapeutic modalities. This study conducts an inquiry into the ways in which storytelling can be utilised by educational psychologists with adolescent clients to address behavioural challenges relating to grief. This qualitative study therefore aims to facilitate an understanding of the use and benefits of storytelling as a therapeutic intervention. This has been achieved by examining interviews with four educational psychologists who have utilised storytelling as a therapeutic intervention with adolescent clients to overcome challenges with grief. The participants (educational psychologists) discussed case studies during interviews, which provided evidence of their practical administration of storytelling as a therapeutic intervention incorporating integrated theoretical approaches through the use of blended therapeutic techniques. Behavioural challenges relating to grief were also predominant in the case study information provided by the participants. The participants further confirmed that the term ‘grief’ included different types of loss that were experienced among adolescent clients. The implications and recommendations of the findings encouraged the utilisation of storytelling as a therapeutic intervention with adolescent clients in addressing behavioural challenges related to grief, based on the outcome of the case studies discussed by the participants.

Keywords: storytelling, therapeutic intervention, adolescents, grief

Procedia PDF Downloads 492
24385 Scalable CI/CD and Scalable Automation: Assisting in Optimizing Productivity and Fostering Delivery Expansion

Authors: Solanki Ravirajsinh, Kudo Kuniaki, Sharma Ankit, Devi Sherine, Kuboshima Misaki, Tachi Shuntaro

Abstract:

In software development life cycles, the absence of scalable CI/CD significantly impacts organizations, leading to increased overall maintenance costs, prolonged release delivery times, heightened manual effort, and difficulties in meeting tight deadlines. Implementing CI/CD with standard serverless technologies using cloud services overcomes all the above-mentioned issues and helps organizations improve efficiency and deliver faster without the need to manage server maintenance and capacity. By integrating scalable CI/CD with scalable automation testing, productivity, quality, and agility are enhanced while reducing the need for repetitive work and manual effort. Implementing scalable CI/CD for development using cloud services like ECS (Elastic Container Service), AWS Fargate, ECR (to store Docker images with all dependencies), serverless computing (serverless virtual machines), cloud logging (for monitoring errors and logs), security groups (for inside/outside access to the application), Docker containerization (Docker-based images and container techniques), Jenkins (CI/CD build management tool), and code management tools (GitHub, Bitbucket, AWS CodeCommit) can efficiently handle the demands of diverse development environments, accommodate dynamic workloads, and increase efficiency for faster delivery with good quality. CI/CD pipelines encourage collaboration among development, operations, and quality assurance teams by providing a centralized platform for automated testing, deployment, and monitoring. Scalable CI/CD streamlines the development process by automatically fetching the latest code from the repository every time the process starts, building the application for the relevant branches, testing the application using a scalable automation testing framework, and deploying the builds. Developers can focus more on writing code and less on managing infrastructure, as it scales based on need. Serverless CI/CD eliminates the need to manage and maintain traditional CI/CD infrastructure, such as servers and build agents, reducing operational overhead and allowing teams to allocate resources more efficiently. Scalable CI/CD adjusts the application’s scale according to usage, thereby alleviating concerns about scalability, maintenance costs, and resource needs. Creating scalable automation testing using cloud services (ECR, ECS Fargate, Docker, EFS, serverless computing) helps organizations run more than 500 test cases in parallel, aiding in the detection of race conditions and performance issues while reducing execution time. Scalable CI/CD offers flexibility, dynamically adjusting to varying workloads and demands and allowing teams to scale resources up or down as needed. It optimizes costs by paying only for the resources that are used, and it increases reliability. Scalable CI/CD pipelines employ automated testing and validation processes to detect and prevent errors early in the development cycle.
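As one hedged illustration of fanning test work out across Fargate tasks, the boto3 sketch below launches one ECS task per test batch. The cluster name, task definition, container name, and subnet are placeholders, not values from the paper.

```python
# Hedged sketch: launching N parallel test-runner tasks on ECS Fargate via boto3.
import boto3

ecs = boto3.client("ecs", region_name="us-east-1")

def run_parallel_tests(batches: int) -> None:
    for batch in range(batches):
        ecs.run_task(
            cluster="ci-cluster",                      # placeholder
            taskDefinition="automation-tests:42",      # placeholder revision
            launchType="FARGATE",
            count=1,
            overrides={"containerOverrides": [{
                "name": "test-runner",                 # placeholder container
                "environment": [{"name": "TEST_BATCH", "value": str(batch)}],
            }]},
            networkConfiguration={"awsvpcConfiguration": {
                "subnets": ["subnet-0123456789abcdef0"],   # placeholder
                "assignPublicIp": "DISABLED",
            }},
        )

run_parallel_tests(batches=20)   # e.g. 500+ cases split across 20 containers
```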

Keywords: achieve parallel execution, cloud services, scalable automation testing, scalable continuous integration and deployment

Procedia PDF Downloads 30
24384 Analysis of Urban Population Using Twitter Distribution Data: Case Study of Makassar City, Indonesia

Authors: Yuyun Wabula, B. J. Dewancker

Abstract:

In the past decade, social networking apps have been growing very rapidly. Geolocation data is one of the important features of social media, attaching the user’s real-world location coordinates. This paper proposes the use of geolocation data from the Twitter social media application to gain knowledge about urban dynamics, especially human mobility behavior. It aims to explore the relation between Twitter geolocation and the presence of people in urban areas. Firstly, the study analyzes the spread of people within particular areas of the city using Twitter social media data. Secondly, we match and categorize the existing places based on visits by the same individuals. Then, we combine the Twitter data from the tracking results with questionnaire data to capture the Twitter user profile. To do that, we used frequency distribution analysis to learn the percentage of visitors. To validate the hypothesis, we compare the results with local population statistics and the land use mapping released by the city planning department of the Makassar local government. The results show that there is a correlation between Twitter geolocation and the questionnaire data. Thus, integrating Twitter data and survey data can reveal the profiles of social media users.

Keywords: geolocation, Twitter, distribution analysis, human mobility

Procedia PDF Downloads 305
24383 Analysis and Rule Extraction of Coronary Artery Disease Data Using Data Mining

Authors: Rezaei Hachesu Peyman, Oliyaee Azadeh, Salahzadeh Zahra, Alizadeh Somayyeh, Safaei Naser

Abstract:

Coronary artery disease (CAD) is one major cause of disability in adults and one main cause of death in developed countries. In this study, data mining techniques including decision trees, artificial neural networks (ANNs), and support vector machines (SVM) are used to analyze CAD data. Data from 4948 patients who had suffered from heart disease were included in the analysis. CAD is the target variable, and 24 input or predictor variables are used for the classification. The performance of these techniques is compared in terms of sensitivity, specificity, and accuracy. The most significant factor influencing CAD is chest pain. Elderly males (age > 53) have a high probability of being diagnosed with CAD. The SVM algorithm is the most useful for evaluating and predicting CAD patients as compared to non-CAD ones. Applying data mining techniques to the analysis of coronary artery disease is a good method for investigating the existing relationships between variables.
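The SVM comparison step might be sketched in scikit-learn as below, with sensitivity and specificity derived from the confusion matrix. The data here are synthetic stand-ins shaped like the study's 4948 patients and 24 predictors, not the clinical dataset itself.

```python
# Sketch of the SVM evaluation with sensitivity, specificity, and accuracy.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.metrics import confusion_matrix, accuracy_score

# synthetic stand-in: 4948 samples, 24 predictors, binary CAD target
X, y = make_classification(n_samples=4948, n_features=24, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

clf = SVC(kernel="rbf").fit(X_tr, y_tr)
y_hat = clf.predict(X_te)

tn, fp, fn, tp = confusion_matrix(y_te, y_hat).ravel()
print("sensitivity:", tp / (tp + fn))
print("specificity:", tn / (tn + fp))
print("accuracy:   ", accuracy_score(y_te, y_hat))
```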

Keywords: classification, coronary artery disease, data mining, knowledge discovery, rule extraction

Procedia PDF Downloads 648
24382 Sensor Data Analysis for a Large Mining Major

Authors: Sudipto Shanker Dasgupta

Abstract:

One of the largest mining companies wanted to look at health analytics for their driverless trucks. These trucks were the key to their supply chain logistics. The automated trucks had multi-level sub-assemblies which would send out sensor information. The use case that was worked on was to capture the sensor signals from the truck subcomponents and analyze the health of the trucks from a repair and replacement perspective. Open source software was used to stream the data into a clustered Hadoop setup in the Amazon Web Services cloud, and Apache Spark SQL was used to analyze the data. All of this was achieved through a 10-node Amazon setup with 32 cores and 64 GB RAM, on which real-time analytics was achieved over 300 million records. To check the scalability of the system, the cluster was increased to a 100-node setup. This talk will highlight how open source software was used to achieve the above use case, along with insights on achieving high data throughput in a cloud setup.
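A minimal PySpark sketch of the Spark SQL analysis step is shown below: sensor records are loaded from the cluster and aggregated per truck subassembly to flag repair candidates. The path, field names, and threshold are placeholders, not the project's actual schema.

```python
# Sketch of the Spark SQL step: aggregate streamed sensor records per
# truck subassembly and flag readings above a repair threshold.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("truck-health").getOrCreate()

readings = spark.read.json("hdfs:///mining/truck_sensors/")  # placeholder path
readings.createOrReplaceTempView("readings")

alerts = spark.sql("""
    SELECT truck_id, subassembly, AVG(vibration) AS avg_vib, COUNT(*) AS n
    FROM readings
    GROUP BY truck_id, subassembly
    HAVING AVG(vibration) > 4.5          -- placeholder repair threshold
""")
alerts.show()
```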

Keywords: streaming analytics, data science, big data, Hadoop, high throughput, sensor data

Procedia PDF Downloads 399
24381 Automated Machine Learning Algorithm Using Recurrent Neural Network to Perform Long-Term Time Series Forecasting

Authors: Ying Su, Morgan C. Wang

Abstract:

Long-term time series forecasting is an important research area for automated machine learning (AutoML). Currently, forecasting based on either machine learning or statistical learning is usually built by experts and requires significant manual effort, from model construction, feature engineering, and hyper-parameter tuning through to the construction of the final time series model. Automation is not possible since there are too many human interventions. To overcome these limitations, this article proposes using the memory state of recurrent neural networks (RNNs) to perform long-term time series prediction. We have shown that this proposed approach is better than the traditional Autoregressive Integrated Moving Average (ARIMA) model. In addition, we found it is better than other network systems, including Fully Connected Neural Networks (FNN), Convolutional Neural Networks (CNN), and Nonpooling Convolutional Neural Networks (NPCNN).
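A minimal sketch of the memory-state idea is below: a PyTorch LSTM encodes the history into its hidden and cell state, then rolls that state forward to produce a multi-step forecast. Sizes and horizon are illustrative; the paper's exact architecture is not specified.

```python
# Sketch: use the LSTM memory state for multi-step (long-term) forecasting.
import torch
import torch.nn as nn

class Forecaster(nn.Module):
    def __init__(self, hidden=64):
        super().__init__()
        self.lstm = nn.LSTM(input_size=1, hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, x, horizon=24):
        _, state = self.lstm(x)            # encode history into (h, c)
        step, preds = x[:, -1:, :], []
        for _ in range(horizon):           # roll the memory state forward
            out, state = self.lstm(step, state)
            step = self.head(out)          # next-step prediction feeds back in
            preds.append(step)
        return torch.cat(preds, dim=1)

model = Forecaster()
history = torch.randn(8, 96, 1)            # 8 series, 96 past observations
print(model(history).shape)                # torch.Size([8, 24, 1])
```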

Keywords: automated machine learning, autoregressive integrated moving average, neural networks, time series analysis

Procedia PDF Downloads 94
24380 Data-Centric Anomaly Detection with Diffusion Models

Authors: Sheldon Liu, Gordon Wang, Lei Liu, Xuefeng Liu

Abstract:

Anomaly detection, also referred to as one-class classification, plays a crucial role in identifying product images that deviate from the expected distribution. This study introduces Data-centric Anomaly Detection with Diffusion Models (DCADDM), presenting a systematic strategy for data collection and further diversifying the data with image generation via diffusion models. The algorithm addresses data collection challenges in real-world scenarios and points toward data augmentation with the integration of generative AI capabilities. The paper explores the generation of normal images using diffusion models. The experiments demonstrate that with 30% of the original number of normal images, modeling in an unsupervised setting with state-of-the-art approaches can achieve equivalent performance. With the addition of images generated via diffusion models (equivalent to 10% of the original dataset size), the proposed algorithm achieves better or equivalent anomaly localization performance.

Keywords: diffusion models, anomaly detection, data-centric, generative AI

Procedia PDF Downloads 77
24379 Semi-Automated Tracking of Vibrissal Movements in Free-Moving Rodents Captured by High-Speed Videos

Authors: Hyun June Kim, Tailong Shi, Seden Akdagli, Sam Most, Yuling Yan

Abstract:

Quantitative analysis of mouse whisker movement can be used to study functional recovery and regeneration of facial nerve after an injury. However, it is challenging to accurately track mouse whisker movements, and most whisker tracking methods require manual intervention, e.g. fixing the head of the mouse during a study. Here we describe a semi-automated image processing method that is applied to high-speed video recordings of free-moving mice to track whisker movements. We first track the head movement of a mouse by delineating the lower head contour frame-by-frame to locate and determine the orientation of its head. Then, a region of interest is identified for each frame, with subsequent application of the Hough transform to track individual whisker movements on each side of the head. Our approach is used to examine the functional recovery of damaged facial nerves in mice over a course of 21 days.
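The per-frame whisker detection step can be sketched with OpenCV as below: the region of interest derived from head tracking is edge-detected, and the probabilistic Hough transform returns candidate line segments. Parameter values and the synthetic frame are illustrative assumptions, not the authors' settings.

```python
# Sketch of per-frame whisker candidate detection via the Hough transform.
import cv2
import numpy as np

def detect_whiskers(frame_gray: np.ndarray, roi: tuple) -> np.ndarray:
    x, y, w, h = roi                      # region of interest from head tracking
    patch = frame_gray[y:y + h, x:x + w]
    edges = cv2.Canny(patch, 50, 150)     # illustrative edge thresholds
    lines = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180, threshold=30,
                            minLineLength=40, maxLineGap=5)
    return lines if lines is not None else np.empty((0, 1, 4), dtype=np.int32)

frame = (np.random.rand(480, 640) * 255).astype(np.uint8)  # stand-in video frame
segments = detect_whiskers(frame, roi=(100, 80, 200, 160))
print(len(segments), "candidate whisker segments")
```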

Keywords: mystacial macrovibrissae, whisker tracking, head tracking, facial nerve recovery

Procedia PDF Downloads 586
24378 Wireless Sensor Networks for Water Quality Monitoring: Prototype Design

Authors: Cesar Eduardo Hernández Curiel, Victor Hugo Benítez Baltazar, Jesús Horacio Pacheco Ramírez

Abstract:

This paper is devoted to presenting the advances in the design of a prototype that is able to supervise the complex behavior of water quality parameters such as pH and temperature via a real-time monitoring system. The current water quality tests performed by government water quality institutions in Mexico are carried out in problematic locations and require manual sampling. The water samples are then taken to the institution’s laboratory for examination. In order to automate this process, a water quality monitoring system based on wireless sensor networks is proposed. The system consists of a sensor node, which contains a pH sensor, a temperature sensor, a microcontroller, and a ZigBee radio, and a base station composed of a ZigBee radio and a PC. The progress in this investigation shows the development of a water quality monitoring system. Due to recent events that affected water quality in Mexico, the main motivation of this study is to address water quality monitoring systems so that, in the near future, a more robust, affordable, and reliable system can be deployed.

Keywords: pH measurement, water quality monitoring, wireless sensor networks, ZigBee

Procedia PDF Downloads 389
24377 Classifying and Analyzing 8-Bit to 8-Bit S-Box Characteristics Using S-Box Evaluation Characteristics

Authors: Muhammad Luqman, Yusuf Kurniawan

Abstract:

S-Boxes are one of the non-linear parts of a cryptographic algorithm; their presence is needed to maintain the non-linearity of the algorithm. Nowadays, modern cryptographic algorithms use an S-Box as part of the algorithmic process. Despite the fact that several cryptographic algorithms today reuse theoretically secure and carefully constructed S-Boxes, there are evaluation characteristics that can measure the security properties of S-Boxes and hence of the corresponding primitives. Analysis of an S-Box is usually done using manual mathematical calculation. Several S-Boxes are presented as a truth table without any underlying mathematical algorithm, and it is rather difficult to determine the strength of a truth-table S-Box without one. A comprehensive analysis should therefore be applied to a truth-table S-Box to determine its characteristics. Several important characteristics should be possessed by an S-Box: nonlinearity, balancedness, algebraic degree, LAT, DAT, differential delta uniformity, correlation immunity, and the global avalanche criterion. A comprehensive tool is then presented to automatically calculate the characteristics of S-Boxes and determine the strength of an S-Box. The comprehensive analysis is done as a deterministic process to produce a sequence of S-Box characteristics and give advice for better S-Box construction.
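Two of the listed characteristics can be computed directly from a truth table, as in the sketch below: balancedness of each output bit and differential (delta) uniformity. The identity S-Box serves only as a trivial test case; a real evaluation tool would cover all the listed criteria.

```python
# Sketch: balancedness and differential uniformity of an 8-bit truth-table S-Box.
from collections import Counter

def balanced(sbox):
    # every output bit should be 0 for exactly half of the 256 inputs
    return all(sum((sbox[x] >> b) & 1 for x in range(256)) == 128
               for b in range(8))

def differential_uniformity(sbox):
    # max count of x with S(x ^ a) ^ S(x) = b, over all a != 0 and all b
    worst = 0
    for a in range(1, 256):
        counts = Counter(sbox[x ^ a] ^ sbox[x] for x in range(256))
        worst = max(worst, max(counts.values()))
    return worst

identity = list(range(256))               # trivial example S-Box
print(balanced(identity))                 # True
print(differential_uniformity(identity))  # 256: maximally weak (linear map)
```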

Keywords: cryptographic properties, Truth Table S-Boxes, S-Boxes characteristic, deterministic process

Procedia PDF Downloads 359
24376 Clinicians' and Nurses' Documentation Practices in Palliative and Hospice Care: A Mixed Methods Study Providing Evidence for Quality Improvement at Mobile Hospice Mbarara, Uganda

Authors: G. Natuhwera, M. Rabwoni, P. Ellis, A. Merriman

Abstract:

Aims: Health workers are likely to document patients’ care inaccurately, especially when using new or revised case tools, and this could negatively impact patient care. This study set out to: (1) assess nurses’ and clinicians’ documentation practices when using a new patients’ continuation case sheet (PCCS) and (2) explore nurses’ and clinicians’ experiences regarding documentation of patients’ information in the new PCCS. The purpose of introducing the PCCS was to improve continuity of care for patients attending clinics at which they were unlikely to see the same clinician or nurse consistently. Methods: This was a mixed methods study. The cross-sectional inquiry retrospectively reviewed 100 case notes of active patients on the hospice and palliative care program. Data were collected using a structured questionnaire with constructs formulated from the new PCCS under study. The qualitative element consisted of face-to-face, audio-recorded, open-ended interviews with a purposive sample of one palliative care clinician and four palliative care nurse specialists. Thematic analysis was used. Results: Missing patients’ biographic information was prevalent at 5-10%. Spiritual and psychosocial issues were not documented in 42.6% of cases, and vital signs in 49.2%. The poorest documentation practices were observed in the past medical history part of the PCCS, at 40-63%. Four themes emerged from interviews with clinicians and nurses: (1) what remains unclear and challenges, (2) comparing the past with the present, (3) experiential thoughts, and (4) transition and adapting to change. Conclusions: The PCCS seems to be a comprehensive and simple tool for documenting patients’ information at subsequent visits. The comprehensiveness and utility of the PCCS does appear to be limited by the failure to train staff in its use prior to its introduction. The authors find the PCCS comprehensive and suitable for capturing patients’ information and recommend that it be adopted and used in other palliative and hospice care settings, if suitable introductory training accompanies its introduction. Otherwise, the reliability and validity of patients’ information collected with the PCCS can be significantly reduced if some sections therein are unclear to the clinicians and nurses. The study identified clinician- and nurse-related pitfalls in the documentation of patients’ care. Clinicians and nurses need to prioritize accurate and complete documentation of patient care in the PCCS for quality care provision. This study should be extended to other sites using similar tools to ensure representative and generalizable findings.

Keywords: documentation, information case sheet, palliative care, quality improvement

Procedia PDF Downloads 138
24375 A Mechanism of Reusable, Portable, and Reliable Script Generator on Android

Authors: Kuei-Chun Liu, Yu-Yu Lai, Ching-Hong Wu

Abstract:

A good automated testing tool should reduce as much as possible the manual work done by testers. Traditional record-replay testing tools provide an automated testing solution by recording mouse coordinates as test scripts, but such scripts break easily if the resolution changes. Therefore, more and more testers design multiple test scripts to automate the testing process for different devices. In order to improve on the traditional record-replay approach and reduce the effort that testers spend on writing test scripts, we propose an approach for generating Android application test scripts based on the accessibility service, without connecting to a computer. This approach simulates user input actions and replays them correctly even under differing conditions, such as an unstable internet connection on the device under test or different resolutions across Android devices. In this paper, we describe how to generate test scripts automatically and make a comparison with existing tools for Android such as Robotium, Appium, UIAutomator, and MonkeyTalk.

Keywords: accessibility service, Appium, automated testing, MonkeyTalk, Robotium, testing, UIAutomator

Procedia PDF Downloads 370
24374 Regulation on the Protection of Personal Data Versus Quality Data Assurance in the Healthcare System: A Case Report

Authors: Elizabeta Krstić Vukelja

Abstract:

The digitization of personal data is a consequence of the development of information and communication technologies, which create a new work environment with many advantages and challenges, but also potential threats to privacy and personal data protection. Regulation (EU) 2016/679 of the European Parliament and of the Council is a law and obligation that should address the issues of personal data protection and information security. The existence of the Regulation leads to the conclusion that national legislation in the field of the virtual environment, the protection of the rights of EU citizens, and the processing of their personal data is insufficiently effective. In the health system, special emphasis is placed on the processing of special categories of personal data, such as health data. The healthcare industry is recognized as a particularly sensitive area in which a large amount of medical data is processed, the digitization of which enables quick access and quick identification of the health insured. The protection of the individual requires quality IT solutions that guarantee the technical protection of special categories of personal data. However, the real problems are technical and human in nature, together with the spatial limitations of the Regulation’s application. Some conclusions will be drawn by analyzing the implementation of the basic principles of the Regulation using the example of the Croatian healthcare system and comparing it with similar activities in other EU member states.

Keywords: regulation, healthcare system, personal data protection, quality data assurance

Procedia PDF Downloads 32