Search results for: applications of big data

28125 i2kit: A Tool for Immutable Infrastructure Deployments

Authors: Pablo Chico De Guzman, Cesar Sanchez

Abstract:

Microservice architectures are increasingly used in distributed cloud applications due to their advantages in software composition, development speed, release cycle frequency and business logic time to market. On the other hand, these architectures also introduce some challenges in the testing and release phases of applications. Container technology solves some of these issues by providing reproducible environments, ease of software distribution and isolation of processes. However, there are other issues that remain unsolved in current container technology when dealing with multiple machines, such as networking for multi-host communication, service discovery, load balancing or data persistence (even though some of these challenges are already solved by traditional cloud vendors in a very mature and widespread manner). Container cluster management tools, such as Kubernetes, Mesos or Docker Swarm, attempt to solve these problems by introducing a new control layer where the unit of deployment is the container (or the pod, a set of strongly related containers that must be deployed on the same machine). These tools are complex to configure and manage, and they do not follow a pure immutable infrastructure approach, since servers are reused between deployments. Indeed, these tools introduce dependencies at execution time for solving networking or service discovery problems. If an error occurs on the control layer, affecting running applications, specific expertise is required to perform ad-hoc troubleshooting. As a consequence, it is not surprising that container cluster support is becoming a source of revenue for consulting services. This paper presents i2kit, a deployment tool based on the immutable infrastructure pattern, where the virtual machine is the unit of deployment. The input for i2kit is a declarative definition of a set of microservices, where each microservice is defined as a pod of containers. Microservices are built into machine images using linuxkit, a tool for creating minimal Linux distributions specialized in running containers. These machine images are then deployed to one or more virtual machines, which are exposed through a cloud vendor load balancer. Finally, the load balancer endpoint is set in other microservices using an environment variable, providing service discovery. The toolkit i2kit reuses the best ideas from container technology to solve problems like reproducible environments, process isolation, and software distribution, and at the same time relies on mature, proven cloud vendor technology for networking, load balancing and persistence. The result is a more robust system with no learning curve for troubleshooting running applications. We have implemented an open source prototype that transforms i2kit definitions into AWS CloudFormation templates, where each microservice AMI (Amazon Machine Image) is created on the fly using linuxkit. Even though container cluster management tools have more flexibility for resource allocation optimization, we argue that adding a new control layer entails more significant disadvantages. Resource allocation is greatly improved by using linuxkit, which introduces a very small footprint (around 35 MB). Also, the system is more secure, since linuxkit installs the minimum set of dependencies required to run containers. The toolkit i2kit is currently under development at the IMDEA Software Institute.
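
As a rough illustration of the deployment flow described above, the sketch below maps a declarative microservice definition to a CloudFormation-style template. The manifest fields, resource names, and AMI placeholder are illustrative assumptions, since the abstract does not specify i2kit's actual manifest format.

```python
# Hypothetical sketch of the i2kit idea: turn a declarative microservice
# definition into a CloudFormation-style template. The manifest schema and
# resource naming are illustrative assumptions, not i2kit's real format.
import json

service = {
    "name": "api",
    "replicas": 2,
    "containers": [{"image": "example/api:1.0", "port": 8080}],
}

def to_cloudformation(svc):
    """Map one microservice (a pod of containers) to VMs plus a load balancer."""
    resources = {}
    for i in range(svc["replicas"]):
        resources[f"{svc['name']}Instance{i}"] = {
            "Type": "AWS::EC2::Instance",
            # In i2kit the AMI would be built on the fly with linuxkit;
            # here it is only a placeholder value.
            "Properties": {"ImageId": "ami-PLACEHOLDER"},
        }
    resources[f"{svc['name']}LoadBalancer"] = {
        "Type": "AWS::ElasticLoadBalancingV2::LoadBalancer",
        "Properties": {"Name": svc["name"]},
    }
    return {"AWSTemplateFormatVersion": "2010-09-09", "Resources": resources}

print(json.dumps(to_cloudformation(service), indent=2))
```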

Keywords: container, deployment, immutable infrastructure, microservice

Procedia PDF Downloads 175
28124 Artificial Intelligence and Governance in Relevance to Satellites in Space

Authors: Anwesha Pathak

Abstract:

AI and governance play significant roles in the context of satellite activities in space. With the increasing number of satellites and space debris, space traffic management (STM) becomes crucial. AI can aid in STM by predicting and preventing potential collisions, optimizing satellite trajectories, and managing orbital slots. Governance frameworks need to address the integration of AI algorithms in STM to ensure safe and sustainable satellite activities. Artificial intelligence (AI) technologies, such as machine learning and computer vision, can be utilized to process vast amounts of data received from satellites. AI algorithms can analyze satellite imagery, detect patterns, and extract valuable information for applications like weather forecasting, urban planning, agriculture, disaster management, and environmental monitoring. AI can assist in automating and optimizing satellite operations. Autonomous decision-making systems can be developed using AI to handle routine tasks like orbit control, collision avoidance, and antenna pointing. These systems can improve efficiency, reduce human error, and enable real-time responsiveness in satellite operations. AI technologies can be leveraged to enhance the security of satellite systems. AI algorithms can analyze satellite telemetry data to detect anomalies, identify potential cyber threats, and mitigate vulnerabilities. Governance frameworks should encompass regulations and standards for securing satellite systems against cyberattacks and ensuring data privacy. AI can optimize resource allocation and utilization in satellite constellations. By analyzing user demands, traffic patterns, and satellite performance data, AI algorithms can dynamically adjust the deployment and routing of satellites to maximize coverage and minimize latency. Governance frameworks need to address fair and efficient resource allocation among satellite operators to avoid monopolistic practices. Satellite activities involve multiple countries and organizations. Governance frameworks should encourage international cooperation, information sharing, and standardization to address common challenges, ensure interoperability, and prevent conflicts. AI can facilitate cross-border collaborations by providing data analytics and decision support tools for shared satellite missions and data sharing initiatives. In sum, AI and governance are critical aspects of satellite activities in space. They enable efficient and secure operations, ensure responsible and ethical use of AI technologies, and promote international cooperation for the benefit of all stakeholders involved in the satellite industry.

Keywords: satellite, space debris, traffic, threats, cyber security

Procedia PDF Downloads 67
28123 Modifications in Design of Lap Joint of Fiber Metal Laminates

Authors: Shaher Bano, Samia Fida, Asif Israr

Abstract:

The continuous development and exploitation of materials and designs have diverted the attention of the world towards the use of robust composite materials known as fiber metal laminates (FMLs) in many high-performance applications. The hybrid structure of fiber metal laminates makes them a material of choice for various applications such as aircraft skin panels, fuselage floorings, door panels and other load-bearing applications. The synergy between the properties of metals and fiber-reinforced laminates is responsible for their high damage tolerance, as the metal element provides better fatigue and impact properties, while high stiffness and better corrosion properties are inherited from the fiber-reinforced matrix systems. They are mostly used as a layered structure in different joint configurations such as lap and butt joints. The FML layers are usually bonded to each other using either mechanical fasteners or adhesive bonds. This research work is focused on the modification of an adhesively bonded joint: a single lap joint of carbon fiber-based CARALL FML has been modified to increase interlaminar shear strength and avoid delamination. For this purpose, different joint modification techniques, such as the introduction of spews and shoulders to modify the bond shape and the use of nanofillers such as carbon nanotubes as reinforcement in the adhesive material, have been utilized to improve the shear strength of the lap joint of the adhesively bonded FML layers. Both the simulation and the experimental results showed that the lap joint with the spews-and-shoulders configuration has better properties due to stress distribution over a larger area at the corner of the joint. The introduction of carbon nanotubes has also shown a positive effect on shear stress and joint strength, as they act as reinforcement in the adhesive bond material.

Keywords: adhesive joint, Carbon Reinforced Aluminium Laminate (CARALL), fiber metal laminates, spews

Procedia PDF Downloads 296
28122 User Experience in Relation to Eye Tracking Behaviour in VR Gallery

Authors: Veslava Osinska, Adam Szalach, Dominik Piotrowski

Abstract:

Contemporary VR technologies allow users to explore virtual 3D spaces where they can work, socialize, learn, and play. Users' interaction with the GUI and the pictures displayed involves perceptual and cognitive processes that can be monitored thanks to neuroadaptive technologies. These modalities provide valuable information about users' intentions, situational interpretations, and emotional states, allowing an application or interface to be adapted accordingly. Virtual galleries outfitted with specialized assets have been designed using the Unity engine for the BITSCOPE project, in the frame of the CHIST-ERA IV program. Users' interaction with gallery objects raises questions about their visual interest in artworks and styles. Moreover, attention, curiosity, and other emotional states can be monitored and analyzed. Natural gaze behavior data and eye position were recorded by the eye-tracking module built into an HTC Vive VR headset. Eye gaze results are grouped according to various user behavior schemes, and the corresponding perceptual-cognitive styles are recognized. In parallel, usability tests and surveys were administered to identify the basic features of a user-centered interface for virtual environments across most of the timeline of the project. A total of sixty participants were selected from different university faculties and secondary schools. Participants' prior knowledge of art was evaluated in a pretest, and in this way their level of art sensitivity was described. Data were collected over two months. Each participant gave written informed consent before participation. In the data analysis, nonlinear algorithms such as multidimensional scaling and the more recent t-distributed Stochastic Neighbor Embedding (t-SNE) were used to reduce the high-dimensional data to a relatively low-dimensional subspace. In this way, digital art objects can be classified by the multimodal temporal characteristics of eye-tracking measures, revealing signatures that describe selected artworks. The current research seeks the optimal position on the aesthetics-utility scale, because the interfaces of most contemporary applications need to be designed in both functional and aesthetic terms. The study also includes an analysis of the visual experience of subsamples of visitors, differentiated, e.g., in terms of frequency of museum visits and cultural interests. Eye-tracking data may also show how to better place artefacts and paintings or increase their visibility where possible.
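
As an illustration of the dimensionality-reduction step mentioned above, the sketch below projects a stand-in matrix of gaze features into two dimensions with both MDS and t-SNE from scikit-learn. The feature matrix shape (60 participants by 20 features) is an assumption made for demonstration, not the study's recorded data.

```python
# Illustrative sketch: project high-dimensional eye-tracking features
# (e.g. fixation counts and durations per artwork) into 2D with MDS and t-SNE.
import numpy as np
from sklearn.manifold import MDS, TSNE

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 20))  # 60 participants x 20 gaze features (assumed shape)

mds_embedding = MDS(n_components=2, random_state=0).fit_transform(X)
tsne_embedding = TSNE(n_components=2, perplexity=15, random_state=0).fit_transform(X)

print(mds_embedding.shape, tsne_embedding.shape)  # (60, 2) (60, 2)
```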

Keywords: eye tracking, VR, UX, visual art, virtual gallery, visual communication

Procedia PDF Downloads 38
28121 Comparative Analysis of Automation Testing Tools

Authors: Amit Bhanushali

Abstract:

In the ever-changing landscape of software development, automated software testing has emerged as a critical component of the Software Development Life Cycle (SDLC). This research undertakes a comparative study of three major automated testing tools (UFT, Selenium, and RPA), evaluating them on usability, maintenance, and effectiveness. Leveraging existing Java-based applications as test cases, the study aims to guide testers in selecting the optimal tool for specific applications. By exploring key features such as source and licensing, testing expenses, object repositories, usability, and language support, the research provides practical insights into UFT, Selenium, and RPA. Acknowledging the pivotal role of these tools in streamlining testing processes amid time constraints and resource limitations, the study assists professionals in making informed choices aligned with their organizational needs.
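
As a minimal illustration of the kind of scripted check these tools automate, the sketch below drives a browser with Selenium's Python bindings. The URL and element locators are placeholders; the study's Java-based test applications are not specified in the abstract.

```python
# A minimal sketch of a Selenium check, one of the three tools compared above.
from selenium import webdriver
from selenium.webdriver.common.by import By

driver = webdriver.Chrome()          # assumes a local ChromeDriver is available
try:
    driver.get("https://example.com/login")   # placeholder application URL
    driver.find_element(By.ID, "username").send_keys("test-user")
    driver.find_element(By.ID, "submit").click()
    assert "Dashboard" in driver.title, "login flow did not reach the dashboard"
finally:
    driver.quit()
```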

Keywords: software testing tools, software development life cycle (SDLC), test automation frameworks, automated software, Java-based, UFT, Selenium, RPA (robotic process automation), source and licensing, object repository

Procedia PDF Downloads 92
28120 A Comparison of Image Data Representations for Local Stereo Matching

Authors: André Smith, Amr Abdel-Dayem

Abstract:

The stereo matching problem, while having been present for several decades, continues to be an active area of research. The goal of this research is to find correspondences between elements found in a set of stereoscopic images. With these pairings, it is possible to infer the distance of objects within a scene, relative to the observer. Advancements in this field have led to experimentation with various techniques, from graph-cut energy minimization to artificial neural networks. At the basis of these techniques is a cost function, which is used to evaluate the likelihood of a particular match between points in each image. While, at its core, the cost is based on comparing image pixel data, there is a general lack of consistency as to which image data representation to use. This paper presents an experimental analysis comparing the effectiveness of the more common image data representations. The goal is to determine how well these representations reduce the cost for the correct correspondence relative to other possible matches.
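
To make the role of the cost function concrete, the sketch below evaluates the same sum-of-absolute-differences (SAD) cost under two data representations (RGB and grayscale) on synthetic patches. This is an illustrative stand-in, not the paper's experimental protocol, which uses rectified stereo pairs.

```python
# Sketch: one matching cost (SAD) evaluated under two image representations.
import numpy as np

def sad_cost(left_patch, right_patch):
    """Sum of absolute differences; lower cost = more likely correspondence."""
    return np.abs(left_patch.astype(float) - right_patch.astype(float)).sum()

rng = np.random.default_rng(1)
left = rng.integers(0, 256, size=(32, 32, 3))         # RGB patch
right = left + rng.integers(-5, 6, size=left.shape)   # noisy matching patch

rgb_cost = sad_cost(left, right)
gray_cost = sad_cost(left.mean(axis=2), right.mean(axis=2))
print(f"RGB SAD: {rgb_cost:.0f}, grayscale SAD: {gray_cost:.0f}")
```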

Keywords: colour data, local stereo matching, stereo correspondence, disparity map

Procedia PDF Downloads 367
28119 Extraction of Urban Building Damage Using Spectral, Height and Corner Information

Authors: X. Wang

Abstract:

Timely and accurate information on urban building damage caused by earthquakes is an important basis for disaster assessment and emergency relief. Very high resolution (VHR) remotely sensed imagery containing abundant fine-scale information offers a large quantity of data for detecting and assessing urban building damage in the aftermath of earthquake disasters. However, the accuracy obtained using spectral features alone is comparatively low, since building damage, intact buildings and pavements are spectrally similar. Therefore, it is of great significance to detect urban building damage effectively using multi-source data. Considering that the height or geometric structure of buildings generally changes dramatically in devastated areas, a novel multi-stage urban building damage detection method, using bi-temporal spectral, height and corner information, was proposed in this study. The pre-event height information was generated using stereo VHR images acquired from two different satellites, while the post-event height information was produced from airborne LiDAR data. The corner information was extracted from pre- and post-event panchromatic images. The proposed method can be summarized as follows. To reduce the classification errors caused by spectral similarity and errors in extracting height information, ground surface, shadows, and vegetation were first extracted using the post-event VHR image and height data and were masked out. Two different types of building damage were then extracted from the remaining areas: the height difference between pre- and post-event data was used for detecting building damage showing significant height change, while the difference in the density of corners between pre- and post-event images was used for extracting building damage showing drastic change in geometric structure. The initial building damage result was generated by combining the above two results. Finally, a post-processing procedure was adopted to refine the obtained initial result. The proposed method was quantitatively evaluated and compared to two existing methods in Port-au-Prince, Haiti, which was heavily hit by an earthquake in January 2010, using a pre-event GeoEye-1 image, a pre-event WorldView-2 image, a post-event QuickBird image and post-event LiDAR data. The results showed that the method proposed in this study significantly outperformed the two comparative methods in terms of urban building damage extraction accuracy. The proposed method provides a fast and reliable way to detect urban building collapse, and it is also applicable to relevant applications.
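
As a simplified illustration of the two damage cues described above, the sketch below flags damage from (i) a large pre/post height difference and (ii) a drop in local corner density. The arrays and thresholds are synthetic stand-ins; the real method works on stereo- and LiDAR-derived heights and corners extracted from panchromatic imagery.

```python
# Toy sketch of the two damage cues: height change and corner-density change.
import numpy as np

pre_height = np.full((100, 100), 10.0)            # intact 10 m buildings
post_height = pre_height.copy()
post_height[40:60, 40:60] = 2.0                   # collapsed block

height_damage = (pre_height - post_height) > 3.0  # threshold is an assumption

pre_corners = np.zeros((100, 100))
pre_corners[40:60:5, 40:60:5] = 1                 # regular corners before the event
post_corners = np.zeros((100, 100))               # corners lost after collapse

def density(corners):
    """Corner counts aggregated over coarse 20x20-pixel cells."""
    return corners.reshape(5, 20, 5, 20).sum(axis=(1, 3))

corner_damage = (density(pre_corners) - density(post_corners)) > 2
print(height_damage.sum(), corner_damage.sum())   # flagged pixels and cells
```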

Keywords: building damage, corner, earthquake, height, very high resolution (VHR)

Procedia PDF Downloads 210
28118 Business-Intelligence Mining of Large Decentralized Multimedia Datasets with a Distributed Multi-Agent System

Authors: Karima Qayumi, Alex Norta

Abstract:

The rapid generation of high-volume and broadly varied data from the application of new technologies poses challenges for the generation of business intelligence. Most organizations and business owners need to extract data from multiple sources and apply analytical methods for the purposes of developing their business. The recently decentralized data management environment therefore relies on a distributed computing paradigm. While data are stored in highly distributed systems, the implementation of distributed data-mining techniques is a challenge. The aim of these techniques is to gather knowledge from every domain and all the datasets stemming from distributed resources. As agent technologies offer significant contributions to managing the complexity of distributed systems, we consider them for next-generation data-mining processes. To demonstrate agent-based business intelligence operations, we use agent-oriented modeling techniques to develop a new artifact for mining massive datasets.

Keywords: agent-oriented modeling (AOM), business intelligence model (BIM), distributed data mining (DDM), multi-agent system (MAS)

Procedia PDF Downloads 426
28117 Intersections and Cultural Landscape Interpretation, in the Case of Ancient Messene in the Peloponnese

Authors: E. Maistrou, P. Themelis, D. Kosmopoulos, K. Boulougoura, A. M. Konidi, K. Moretti

Abstract:

InterArch is an ongoing research project that has been running since September 2020 and aims to propose a digital application for the enhancement of the cultural landscape, emphasizing the contribution of physical space and time to digital data organization. The research case study refers to Ancient Messene in the Peloponnese, one of the most important archaeological sites in Greece. The project integrates an interactive approach to the natural environment, aiming at a manifold sensory experience. It combines the physical space of the archaeological site with the digital space of archaeological and cultural data while, at the same time, it embraces storytelling processes by engaging an interdisciplinary approach that familiarizes the user with multiple semantic interpretations. The research project is co-financed by the European Union and Greek national funds through the Operational Program Competitiveness, Entrepreneurship, and Innovation, under the call RESEARCH - CREATE - INNOVATE (project code: Τ2ΕΔΚ-01659). It involves mutual collaboration between academic and cultural institutions and the contribution of an IT applications development company. New technologies and the integration of digital data enable the implementation of non-linear narratives related to the representational characteristics of the art of collage. Various images (photographs, drawings, etc.) and sounds (narrations, music, soundscapes, audio signs, etc.) could be presented, according to our proposal, through the new semiotics of augmented and virtual reality technologies applied on touch screens and smartphones. Despite the fragmentation of tangible or intangible references, material landscape formations, including archaeological remains, constitute the common ground that can inspire cultural narratives in a process that unfolds personal perceptions and collective imaginaries. It is in this context that the cultural landscape may be considered an indication of spatial and historical continuity. It is in this context, too, that history could emerge, according to our proposal, not solely as a previous inscription but also as an actual happening: a rhythm of occurrences suggesting mnemonic references and, moreover, an evolving history projected onto the contemporary, ongoing cultural landscape.

Keywords: cultural heritage, digital data, landscape, archaeological sites, visitors’ itineraries

Procedia PDF Downloads 78
28116 Virtual Learning during the Period of COVID-19 Pandemic at a Saudi University

Authors: Ahmed Mohammed Omer Alghamdi

Abstract:

Since the COVID-19 pandemic started, a rapid, unexpected transition from face-to-face to virtual classroom (VC) teaching has involved several challenges and obstacles. However, there are also opportunities and thoughts that need to be examined and discussed. In addition, the entire world has witnessed the interruption of the teaching system and, more particularly, of higher education institutes. To maintain learning and teaching practices as usual, countries were forced to transition from traditional to virtual classes using various technology-based devices. In this regard, the Kingdom of Saudi Arabia (KSA) is no exception. Focusing on how the current situation has forced many higher education institutes to change to virtual classes may provide clear insight into the adopted practices and their implications. The main purpose of this study, therefore, was to investigate how both Saudi English as a foreign language (EFL) teachers and students perceived the implementation of virtual classes as a key factor for a useful language teaching and learning process during the COVID-19 pandemic period at a Saudi university. The impetus for the research was, therefore, the need to find ways of identifying the deficiencies in this application and to suggest possible solutions that might rectify those deficiencies. This study seeks to answer the following overarching research question: "How do Saudi EFL instructors and students perceive the use of virtual classes during the COVID-19 pandemic period in their language teaching and learning context?" The following sub-questions are also used to guide the design of the study to answer the main research question: (1) To what extent are virtual classes important during the pandemic from Saudi EFL instructors' and students' perspectives? (2) How effective are virtual classes for fostering English language students' achievement? (3) What are the challenges and obstacles that instructors and students may face during the implementation of virtual teaching? A mixed-methods approach was employed in this study; a questionnaire provided the quantitative data, whereas transcripts of recorded interviews provided the qualitative data. The participants included EFL teachers (N = 4) and male and female EFL students (N = 36). Based on the findings of this study, various aspects of teachers' and students' perspectives were examined to determine the use of virtual classroom applications in terms of fulfilling students' English language learning needs. The major findings of the study revealed that virtual classroom applications during the pandemic encountered three major challenges: a lack of technology and internet connectivity, large numbers of students in a virtual classroom, and a lack of student-teacher interaction during virtual classes. Finally, the findings indicated that although Saudi EFL students and teachers view virtual classrooms in a positive light during the pandemic period, they reported that, in the longer term and for the post-pandemic period, they preferred the traditional face-to-face teaching procedure.

Keywords: virtual classes, English as a foreign language, COVID-19, Internet, pandemic

Procedia PDF Downloads 85
28115 Introduction of Electronic Health Records to Improve Data Quality in Emergency Department Operations

Authors: Anuruddha Jagoda, Samiddhi Samarakoon, Anil Jasinghe

Abstract:

In its simplest form, data quality can be defined as 'fitness for use', and it is a concept with multiple dimensions. Emergency Departments (EDs) require information to treat patients and, on the other hand, are the primary source of information regarding accidents, injuries, emergencies, etc. They are also the starting point of various patient registries, databases and surveillance systems. This interventional study was carried out to improve data quality at the ED of the National Hospital of Sri Lanka (NHSL) by introducing an e-health solution. The NHSL is the premier trauma care centre in Sri Lanka. The study consisted of three components. A research study was conducted to assess the quality of data in relation to five selected dimensions of data quality, namely accuracy, completeness, timeliness, legibility and reliability. The intervention was to develop and deploy an electronic emergency department information system (eEDIS). Post-intervention assessment confirmed that all five dimensions of data quality had improved. The most significant improvements were noticed in the accuracy and timeliness dimensions.

Keywords: electronic health records, electronic emergency department information system, emergency department, data quality

Procedia PDF Downloads 270
28114 Inertial Motion Capture System for Biomechanical Analysis in Rehabilitation and Sports

Authors: Mario Sandro F. Rocha, Carlos S. Ande, Anderson A. Oliveira, Felipe M. Bersotti, Lucas O. Venzel

Abstract:

Inertial motion capture (mocap) systems are among the most suitable tools for quantitative clinical analysis in rehabilitation and sports medicine. The inertial measurement units (IMUs), composed of accelerometers, gyroscopes, and magnetometers, are able to measure spatial orientations and calculate displacements with sufficient precision for applications in biomechanical analysis of movement. Furthermore, this type of system is relatively affordable and has the advantages of portability and independence from external references. In this work, we present the latest version of our inertial motion capture system, based on the foregoing technology, with a Unity interface designed for rehabilitation and sports. In our hardware architecture, only one serial port is required. First, the board client must be connected to the computer by a USB cable. Next, an available serial port is configured and opened to establish the communication between the client and the application, and then the client starts scanning for the active MOCAP_S servers around. The servers play the role of the inertial measurement units that capture the movements of the body and send the data to the client, which in turn creates a packet composed of the server ID, the current timestamp, and the motion capture data defined in the client's pre-configuration of the capture session. In the current version, we can measure the game rotation vector (grv) and linear acceleration (lacc), and we also have a step detector that can be enabled or disabled. The grv data are processed and directly linked to the bones of the 3D model and, along with the lacc data and the step detector, they are also used to perform the calculations of displacements and other variables shown on the graphical user interface. Our user interface was designed to calculate and present variables that are important for rehabilitation and sports, such as cadence, speed, total gait cycle, gait cycle length, obliquity and rotation, and center of gravity displacement. Our goal is to present a low-cost, portable and wearable system with a friendly interface for application in biomechanics and sports, which also delivers high precision and low energy consumption.
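
A hedged sketch of the client-side packet handling described above is given below. The abstract names the packet contents (server ID, timestamp, motion data) but not the wire format, so the field sizes and ordering here are illustrative assumptions.

```python
# Assumed packet layout: server ID, timestamp, grv quaternion, lacc vector.
import struct
import time

PACKET_FMT = "<B d 4f 3f"  # id (uint8), timestamp (double), 4x grv, 3x lacc

def pack_sample(server_id, grv, lacc):
    """Serialize one IMU sample the way a MOCAP_S server might send it."""
    return struct.pack(PACKET_FMT, server_id, time.time(), *grv, *lacc)

def unpack_sample(payload):
    sid, ts, *values = struct.unpack(PACKET_FMT, payload)
    return {"id": sid, "timestamp": ts, "grv": values[:4], "lacc": values[4:]}

sample = pack_sample(3, grv=(0.0, 0.0, 0.0, 1.0), lacc=(0.1, 0.0, 9.8))
print(unpack_sample(sample))
```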

Keywords: biomechanics, inertial sensors, motion capture, rehabilitation

Procedia PDF Downloads 137
28113 Synthesis and Characterization of Green Coke-Derived Activated Carbon by KOH Activation

Authors: Richard, Iyan Subiyanto, Chairul Hudaya

Abstract:

Activated carbon has been playing a significant role in many applications, especially in energy storage devices. However, commercial activated carbons generally require complicated processes and high production costs. Therefore, in this study, an economically affordable activated carbon derived from green coke waste is used as the carbon source. To synthesize the activated carbon, KOH was employed as the activator at C:KOH ratios of 1:2, 1:3, 1:4, and 1:5, with an activation temperature of 700°C. The activated carbon was characterized by scanning electron microscopy (SEM), energy-dispersive X-ray spectroscopy, Raman spectroscopy, and Brunauer-Emmett-Teller (BET) analysis. The optimal activated carbon sample, with a specific surface area of 2,024 m²/g and a high carbon content (>80%), supported by the highly porous carbon morphology observed by SEM, was prepared at a C:KOH ratio of 1:4. The results show that the synthesized activated carbon would be an ideal choice for energy storage device applications. This study is therefore expected to reduce the cost of activated carbon production by expanding the utilization of petroleum waste.

Keywords: activated carbon, energy storage material, green coke, specific surface area

Procedia PDF Downloads 162
28112 Data Presentation of Lane-Changing Events Trajectories Using HighD Dataset

Authors: Basma Khelfa, Antoine Tordeux, Ibrahima Ba

Abstract:

We present a descriptive analysis of lane-changing events on multi-lane roads. The data come from the Highway Drone Dataset (HighD), which contains microscopic vehicle trajectories recorded on highways. This paper describes and analyses the role of the different parameters and their significance. Using the HighD data, we aim to find the most frequent reasons that motivate drivers to change lanes. We used the programming language R for the processing of these data. We analyze the involvement and relationships of the variables describing the ego vehicle and the four vehicles surrounding it, i.e., distance, speed difference, time gap, and acceleration. This was studied according to the class of the vehicle (car or truck) and according to the maneuver it undertook (overtaking or falling back).
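
The study processed these data in R; the pandas sketch below expresses the same derivation of distance, speed difference, and time gap to a preceding vehicle, using HighD-style column names on a toy frame.

```python
# Deriving surrounding-vehicle variables from HighD-style trajectory rows.
import pandas as pd

tracks = pd.DataFrame({
    "frame": [1, 1], "id": [10, 11], "x": [100.0, 130.0],
    "xVelocity": [25.0, 22.0], "precedingId": [11, 0],
})

ego = tracks[tracks["id"] == 10].iloc[0]
lead = tracks[tracks["id"] == ego["precedingId"]].iloc[0]

distance = lead["x"] - ego["x"]                    # gap to the leader, m
speed_diff = ego["xVelocity"] - lead["xVelocity"]  # closing speed, m/s
time_gap = distance / ego["xVelocity"] if ego["xVelocity"] > 0 else float("inf")
print(distance, speed_diff, round(time_gap, 2))
```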

Keywords: autonomous driving, physical traffic model, prediction model, statistical learning process

Procedia PDF Downloads 251
28111 Spectroscopy Study of Jatropha curcas Seed Oil for Pharmaceutical Applications

Authors: Bashar Mudhaffar Abdullah, Hasniza Zaman Huri, Nany Hairunisa

Abstract:

This study was carried out to determine the thermal and spectroscopic properties of Malaysian Jatropha curcas seed oil. The physicochemical properties of the J. curcas seed oil, namely free fatty acid content (FFA %), acid value, saponification value, iodine value, unsaponifiable matter, and viscosity (cP), gave values of 1.89±0.10%, 3.76±0.07, 203.36±0.36 mg/g, 4.90±0.25, 1.76±0.03%, and 32, respectively. Gas chromatography (GC) was used to determine the fatty acid (FA) composition. J. curcas seed oil consists of saturated FAs (19.55%), such as palmitic (13.19%), palmitoleic (0.40%), and stearic (6.36%) acids, and unsaturated FAs (80.42%), such as oleic (43.32%) and linoleic (36.70%) acids. The thermal analysis using differential scanning calorimetry (DSC) showed that crystallized TAG was observed at -6.79°C. The melting curves displayed three major regions for J. curcas seed oil: monounsaturated (lower-temperature peak) at -31.69°C, di-unsaturated (medium-temperature peak) at -20.23°C and tri-unsaturated (higher-temperature peak) at -12.72°C. The results of this study showed that J. curcas seed oil is a plausible source of polyunsaturated fatty acids (PUFAs) to be developed in the future for pharmaceutical applications.

Keywords: Jatropha curcas seed oil, thermal properties, crystallization, melting, spectroscopy

Procedia PDF Downloads 472
28110 Evaluation of Golden Beam Data for the Commissioning of 6 and 18 MV Photons Beams in Varian Linear Accelerator

Authors: Shoukat Ali, Abdul Qadir Jandga, Amjad Hussain

Abstract:

Objective: The main purpose of this study is to compare the percent depth doses (PDDs) and the in-plane and cross-plane profiles of the Varian golden beam data with measured data for 6 and 18 MV photons, for the commissioning of the Eclipse treatment planning system. Introduction: Commissioning of a treatment planning system requires extensive acquisition of beam data for the clinical use of linear accelerators. Accurate dose delivery requires entering the PDDs, profiles, and dose rate tables for open and wedged fields into the treatment planning system, enabling it to calculate monitor units (MUs) and dose distributions. Varian offers a generic set of beam data as reference data; however, it is not recommended for clinical use. In this study, we compared the generic beam data with measured beam data to evaluate whether the generic beam data are reliable enough for clinical purposes. Methods and Material: PDDs and profiles of open and wedged fields for different field sizes and at different depths were measured as per Varian's algorithm commissioning guideline. The measurements were performed with a PTW 3D scanning water phantom with a Semiflex ionization chamber and MEPHYSTO software. The Varian golden beam data available online were compared with the measured data to evaluate whether the golden beam data are accurate enough to be used for the commissioning of the Eclipse treatment planning system. Results: The deviation between the measured and golden beam data was at most 2%. In the PDDs, the deviation increases at deeper depths compared with shallower depths. Similarly, the profiles show the same trend of increasing deviation at larger field sizes and increasing depths. Conclusion: The study shows that the percentage deviation between the measured and golden beam data is within the acceptable tolerance, and the golden beam data can therefore be used for the commissioning process; however, verification of a small subset of acquired data against the golden beam data should be mandatory before clinical use.
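
As a minimal illustration of the comparison metric implied above, the sketch below computes the point-by-point percent deviation between a measured PDD curve and golden beam data. The numbers are illustrative placeholders, not Varian or measured values.

```python
# Percent deviation between measured PDD values and golden beam data.
import numpy as np

depth_cm = np.array([1.5, 5.0, 10.0, 20.0])
measured = np.array([100.0, 86.1, 67.4, 38.9])   # % dose, assumed values
golden = np.array([100.0, 86.5, 67.0, 38.2])     # % dose, assumed values

deviation = 100.0 * (measured - golden) / golden
print(np.round(deviation, 2))        # all within the 2% figure cited above
assert np.all(np.abs(deviation) <= 2.0)
```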

Keywords: percent depth dose, flatness, symmetry, golden beam data

Procedia PDF Downloads 485
28109 Resale Housing Development Board Price Prediction Considering Covid-19 through Sentiment Analysis

Authors: Srinaath Anbu Durai, Wang Zhaoxia

Abstract:

Twitter sentiment has been used as a predictor of price values or trends in both the stock market and the housing market. The pioneering works in this stream of research drew upon works in behavioural economics to show that sentiment or emotions impact economic decisions. The latest works in this stream focus on the algorithm used as opposed to the data used. A literature review of works in this stream through the lens of the data used shows that there is a paucity of work that considers the impact of sentiment caused by an external factor on either the stock or the housing market. This is despite an abundance of works in behavioural economics that show that sentiment or emotions caused by an external factor impact economic decisions. To address this gap, this research studies the impact of Twitter sentiment pertaining to the Covid-19 pandemic on resale Housing Development Board (HDB) apartment prices in Singapore. It leverages SNSCRAPE to collect tweets pertaining to Covid-19 for sentiment analysis, the lexicon-based tools VADER and TextBlob are used for sentiment analysis, Granger causality is used to examine the relationship between Covid-19 cases and the sentiment score, and neural networks are leveraged as prediction models. Twitter sentiment pertaining to Covid-19 as a predictor of HDB prices in Singapore is studied in comparison with the traditional predictors of housing prices, i.e., the structural and neighbourhood characteristics. The results indicate that using Twitter sentiment pertaining to Covid-19 leads to better prediction than using only the traditional predictors, and that it performs better as a predictor than two of the traditional predictors. Hence, Twitter sentiment pertaining to an external factor should be considered as important as traditional predictors. This paper demonstrates the real-world economic applications of sentiment analysis of Twitter data.
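
The sketch below illustrates two of the steps named above: scoring tweet sentiment with the lexicon-based VADER tool and running a Granger causality test with statsmodels. The tweet text and the two time series are stand-ins for the collected data.

```python
# VADER sentiment scoring plus a Granger causality test on stand-in series.
import numpy as np
from vaderSentiment.vaderSentiment import SentimentIntensityAnalyzer
from statsmodels.tsa.stattools import grangercausalitytests

analyzer = SentimentIntensityAnalyzer()
print(analyzer.polarity_scores("Another lockdown announced, feeling anxious"))

rng = np.random.default_rng(0)
cases = rng.poisson(50, size=100).astype(float)              # daily Covid-19 cases (toy)
sentiment = -0.01 * np.roll(cases, 1) + rng.normal(0, 0.1, size=100)

# Does the case count help predict sentiment? maxlag is chosen arbitrarily here.
grangercausalitytests(np.column_stack([sentiment, cases]), maxlag=2)
```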

Keywords: sentiment analysis, Covid-19, housing price prediction, tweets, social media, Singapore HDB, behavioral economics, neural networks

Procedia PDF Downloads 108
28108 Variable-Fidelity Surrogate Modelling with Kriging

Authors: Selvakumar Ulaganathan, Ivo Couckuyt, Francesco Ferranti, Tom Dhaene, Eric Laermans

Abstract:

Variable-fidelity surrogate modelling offers an efficient way to approximate function data available at multiple degrees of accuracy, each with a different computational cost. In this paper, a Kriging-based variable-fidelity surrogate modelling approach is introduced to approximate such deterministic data. Initially, individual Kriging surrogate models, which are enhanced with gradient data of different degrees of accuracy, are constructed. Then these gradient-enhanced Kriging surrogate models are strategically coupled using a recursive CoKriging formulation to provide an accurate surrogate model for the highest-fidelity data. While, intuitively, gradient data are useful for enhancing the accuracy of surrogate models, the primary motivation behind this work is to investigate whether it is also worthwhile incorporating gradient data of varying degrees of accuracy.
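
A compact sketch of the recursive CoKriging idea is given below: a Gaussian process is fitted to cheap low-fidelity samples, and a second one to the scaled residual at the few high-fidelity points. The test functions and the fixed scaling factor are assumptions, and the gradient enhancement central to the paper is omitted, since plain scikit-learn GPs do not ingest gradient data.

```python
# Two-fidelity recursive co-kriging sketch with scikit-learn GPs.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def f_low(x):   # cheap, biased approximation (assumed test function)
    return 0.5 * np.sin(8 * x) + 0.2 * x

def f_high(x):  # expensive "truth" (assumed test function)
    return np.sin(8 * x) + 0.3 * x

x_low = np.linspace(0, 1, 20)[:, None]   # many cheap samples
x_high = np.linspace(0, 1, 5)[:, None]   # few expensive samples

gp_low = GaussianProcessRegressor(RBF(0.1)).fit(x_low, f_low(x_low.ravel()))
rho = 2.0  # fixed fidelity scaling; estimated from data in real co-kriging
residual = f_high(x_high.ravel()) - rho * gp_low.predict(x_high)
gp_diff = GaussianProcessRegressor(RBF(0.1)).fit(x_high, residual)

x_test = np.array([[0.33]])
prediction = rho * gp_low.predict(x_test) + gp_diff.predict(x_test)
print(prediction, f_high(0.33))
```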

Keywords: Kriging, CoKriging, surrogate modelling, variable-fidelity modelling, gradients

Procedia PDF Downloads 553
28107 The Integration and Automation of EDA Tools in an Integrated Circuit Design Environment

Authors: Rohaya Abdul Wahab, Raja Mohd Fuad Tengku Aziz, Nazaliza Othman, Sharifah Saleh, Nabihah Razali, Rozaimah Baharim, M. Hanif M. Nasir

Abstract:

This paper will discuss how EDA tools are integrated and automated in an integrated circuit design environment. Some of the problems faced in our current environment are that users need to manually configure library paths, start-up files and project directories. Certain manual processes that happen between the users and applications can be automated, but they must be transparent to the users. For example, the users can run the applications directly after login without knowing the locations of the library paths and start-up files. The solution to these problems is to automate the processes using standard configuration files, which will benefit the users and EDA support. This paper discusses how the implementation is done to automate the processes using scripting languages such as Perl, Tcl, Scheme and Shell Script. These scripting tools are great assets for design engineers to build a robust and powerful design flow, and this technique is widely used to integrate all the tools together.

Keywords: EDA tools, Integrated Circuits, scripting, integration, automation

Procedia PDF Downloads 321
28106 Hydroxyapatite-Chitosan Composites for Tissue Engineering Applications

Authors: Georgeta Voicu, Cristina Daniela Ghitulica, Andreia Cucuruz, Cristina Busuioc

Abstract:

In the field of tissue engineering, the compositional and microstructural features of the employed materials play an important role, with implications for the mechanical and biological behaviour of medical devices. In this context, the development of apatite - natural biopolymer composites represents the choice of many scientific groups. Thus, hydroxyapatite powders were synthesized by a wet method, namely co-precipitation, starting from high-purity reagents (CaO, MgO, and H3PO4). Moreover, the substitution of calcium with magnesium was investigated in the 5 - 10 wt.% range. Afterward, the phosphate powders were integrated into two morphologically different types of composites with chitosan. First, 3D porous scaffolds were obtained by a freeze-drying procedure. Second, uniform, compact films were achieved by film casting. The influence of the chitosan molecular weight (low, medium and high), as well as of the apatite powder to polymer ratio (1:1 and 1:2), on the morphological properties was analysed in detail. In conclusion, the reported biocomposites, prepared by a straightforward route, are suitable for bone substitution or repair applications.

Keywords: bone reconstruction, chitosan, composite scaffolds, hydroxyapatite

Procedia PDF Downloads 317
28105 Magnesium Alloys Containing Y, Gd and Ca with Enhanced Ignition Temperature and Mechanical Properties for Aviation Applications

Authors: Jiří Kubásek, Peter Minárik, Klára Hosová, Stanislav Šašek, Jozef Veselý, Jitka Stráská, Drahomír Dvorský, Dalibor Vojtěch, Miloš Janeček

Abstract:

Mg-2Y-2Gd-1Ca and Mg-4Y-4Gd-2Ca alloys were processed by extrusion or equal channel angular pressing (ECAP) to analyse the effect of the microstructure on ignition temperature, mechanical properties and corrosion resistance. The alloys are characterized by good mechanical properties and an exceptionally high ignition temperature, which is a critical safety measure. The effect of extrusion and ECAP on the microstructure, mechanical properties and ignition temperature was studied. The obtained results indicated a substantial effect of the processing conditions on the average grain size, the recrystallized fraction and texture formation. Both alloys featured high strength, depending on the composition and processing condition, and a high ignition temperature of ≈1100 °C (Mg-4Y-4Gd-2Ca) and ≈950 °C (Mg-2Y-2Gd-1Ca), which was attributed to the synergic effect of Y, Gd and Ca oxides, with the dominant effect of Y₂O₃. The achieved combination of enhanced mechanical properties and ignition temperature makes these alloys prominent candidates for aircraft applications.

Keywords: magnesium alloys, enhanced ignition temperature, mechanical properties, ECAP

Procedia PDF Downloads 99
28104 Development of a Fire Analysis Drone for Smoke Toxicity Measurement for Fire Prediction and Management

Authors: Gabrielle Peck, Ryan Hayes

Abstract:

This research presents the design and creation of a drone gas analyser, aimed at addressing the need for independent data collection and analysis of gas emissions during large-scale fires, particularly wasteland fires. The analyser drone, comprising a lightweight gas analysis system attached to a remote-controlled drone, enables the real-time assessment of smoke toxicity and the monitoring of gases released into the atmosphere during such incidents. The key components of the analyser unit include two gas line inlets connected to glass wool filters, a pump with regulated flow controlled by a mass flow controller, and electrochemical cells for detecting nitrogen oxides, hydrogen cyanide, and oxygen levels. Additionally, a non-dispersive infrared (NDIR) analyser is employed to monitor carbon monoxide (CO), carbon dioxide (CO₂), and hydrocarbon concentrations. Thermocouples can be attached to the analyser to monitor temperature, and McCaffrey probes combined with pressure transducers can be added to monitor air velocity and wind direction. These additions allow the large fire to be monitored and can be used to predict fire spread. The innovative system not only provides crucial data for assessing smoke toxicity but also contributes to fire prediction and management. The remote-controlled drone's mobility allows for safe and efficient data collection in proximity to the fire source, reducing the need for human exposure to hazardous conditions. The data obtained from the gas analyser unit facilitate informed decision-making by emergency responders, aiding in the protection of both human health and the environment. This abstract highlights the successful development of a drone gas analyser, illustrating its potential for enhancing smoke toxicity analysis and fire prediction capabilities. The integration of this technology into fire management strategies offers a promising solution for addressing the challenges associated with wildfires and other large-scale fire incidents. The project's methodology and results contribute to the growing body of knowledge in the field of environmental monitoring and safety, emphasizing the practical utility of drones for critical applications.

Keywords: fire prediction, drone, smoke toxicity, analyser, fire management

Procedia PDF Downloads 85
28103 Design Considerations on Cathodic Protection for X65 Steel Tank Containing Fresh Water

Authors: A. M. Al-Sabagh, M. A. Deyab, M. N. Kroush

Abstract:

The present study focused on a critical and detailed approach to using an aluminum electrode as an impressed current anode for the cathodic protection of an X65 steel tank containing fresh water. The impressed current design calculation showed a current demand of 0.6 A and a voltage of 0.33 V required to adequately protect the X65 steel tank, which has an internal surface area of 421 m². One transformer rectifier with a current and voltage output of 25 A and 25 V, respectively, was used. The data showed that the potentials ranged from -0.474 to -0.509 V (vs. Cu/CuSO₄) prior to the application of cathodic protection. When the potential was measured 1 h after the application of cathodic protection, the potential values showed a considerable shift to within the protection range (-0.950 V vs. Cu/CuSO₄). The results confirmed that aluminum anodes can be used in freshwater applications with high efficiency (current capacity) and a low consumption rate.
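
For reference, the sketch below reproduces the design relation behind the figures quoted above: total current demand equals the design current density times the protected surface area. The current density is back-calculated from the abstract's own numbers (0.6 A over 421 m²), not taken from an independent design standard.

```python
# Impressed-current demand from surface area and design current density.
area_m2 = 421.0                                    # internal wetted surface
current_density_mA_per_m2 = 0.6 / area_m2 * 1000   # ~1.43 mA/m^2, inferred
total_current_A = current_density_mA_per_m2 / 1000 * area_m2
print(f"{current_density_mA_per_m2:.2f} mA/m^2 -> {total_current_A:.1f} A demand")
```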

Keywords: cathodic protection, aluminum, steel, fresh water

Procedia PDF Downloads 149
28102 Modelling of Recovery and Application of Low-Grade Thermal Resources in the Mining and Mineral Processing Industry

Authors: S. McLean, J. A. Scott

Abstract:

This research focuses on improving sustainable operation through the recovery and reuse of waste heat in process water streams, an area of the mining industry that is often overlooked. The application of this topic carries significant advantages, including economic and environmental benefits. The smelting process in the mining industry presents an opportunity to recover waste heat and apply it to alternative uses, thereby enhancing the overall process. This applied research has been conducted at the Sudbury Integrated Nickel Operations smelter site, in particular on the water cooling towers. The aim was to determine and optimize methods for the appropriate recovery and subsequent upgrading of thermally low-grade heat lost from the water cooling towers, in a manner that makes it useful for repurposing in applications such as an acid plant. This would be valuable to mining companies as an opportunity to reduce the cost of the process, as well as to decrease environmental impact and primary fuel usage. The waste heat from the cooling towers needs to be upgraded before it can be beneficially applied, as lower temperatures result in fewer potential applications. Temperature and flow rate data were collected from the water cooling towers at an acid plant over two years. The research includes process control strategies and the development of a model capable of determining whether the proposed heat recovery technique is economically viable, as well as of assessing the environmental impact of the reduction in net energy consumption by the process. Therefore, comprehensive cost and impact analyses are carried out to determine the best area of application for the recovered waste heat. This method will allow engineers to easily identify the value of the thermal resources available to them and determine whether a full feasibility study should be carried out. The rapid scoping model developed will be applicable to any site that generates large amounts of waste heat. Results show that heat pumps are an economically viable solution for this application, allowing for reduced cost and CO₂ emissions.
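
The sketch below illustrates the kind of rapid scoping estimate such a model performs: the heat recoverable from a cooling-water stream and a rough figure for the electrical input a heat pump would need to upgrade it. The flow, temperature drop, and COP are assumed values, not the Sudbury site's data.

```python
# Back-of-envelope waste-heat scoping: Q = m * cp * dT, then heat pump input.
flow_kg_s = 50.0    # cooling water flow, kg/s (assumed)
cp = 4.186          # specific heat of water, kJ/(kg*K)
delta_T = 8.0       # recoverable temperature drop, K (assumed)
cop = 3.5           # heat pump coefficient of performance (assumed)

q_recovered_kW = flow_kg_s * cp * delta_T
electrical_kW = q_recovered_kW / cop   # rough input to deliver that heat upgraded
print(f"Recoverable heat: {q_recovered_kW:.0f} kW, "
      f"heat pump input: {electrical_kW:.0f} kW")
```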

Keywords: environment, heat recovery, mining engineering, sustainability

Procedia PDF Downloads 108
28101 Scalable Cloud-Based LEO Satellite Constellation Simulator

Authors: Karim Sobh, Khaled El-Ayat, Fady Morcos, Amr El-Kadi

Abstract:

Distributed applications deployed on LEO satellites and ground stations require substantial communication between different members of a constellation to overcome the earth coverage barriers imposed by GEOs. Applications running on LEO constellations suffer from the earth line-of-sight blockage effect and need adequate lab testing before launch into space. We propose a scalable cloud-based network simulation framework to simulate problems created by the earth line-of-sight blockage. The framework utilizes cloud IaaS virtual machines to simulate LEO satellites and ground stations running distributed software. A factorial ANOVA statistical analysis is conducted to measure the simulator's overhead on overall communication performance. The results showed a very low simulator communication overhead. Consequently, the simulation framework is proposed as a candidate for testing LEO constellations with distributed software in the lab before space launch.

Keywords: LEO, cloud computing, constellation, satellite, network simulation, netfilter

Procedia PDF Downloads 381
28100 Analysis of Delivery of Quad Play Services

Authors: Rahul Malhotra, Anurag Sharma

Abstract:

Fiber-based access networks can deliver performance that can support the increasing demand for high-speed connections. One of the new technologies that have emerged in recent years is the Passive Optical Network (PON). This paper demonstrates the simultaneous delivery of triple-play services (data, voice, and video). A comparative investigation of the suitability of various data rates is presented. It is demonstrated that as the data rate increases, the number of users that can be accommodated decreases due to the increase in bit error rate.

Keywords: FTTH, quad play, play service, access networks, data rate

Procedia PDF Downloads 404
28099 Healthcare Providers’ Perception Towards Utilization of Health Information Applications and Its Associated Factors in Healthcare Delivery in Health Facilities in Cape Coast Metropolis, Ghana

Authors: Richard Okyere Boadu, Godwin Adzakpah, Nathan Kumasenu Mensah, Kwame Adu Okyere Boadu, Jonathan Kissi, Christiana Dziyaba, Rosemary Bermaa Abrefa

Abstract:

Information and communication technology (ICT) has significantly advanced global healthcare, with electronic health (e-Health) applications improving health records and delivery. These innovations, including electronic health records, strengthen healthcare systems. This study investigates healthcare professionals' perceptions of health information applications and their associated factors in health facilities in the Cape Coast Metropolis of Ghana. Methods: We used a descriptive cross-sectional study design to collect data from 632 healthcare professionals (HCPs) in three purposively selected health facilities in the Cape Coast municipality of Ghana in July 2022. The Shapiro-Wilk test was used to check the normality of the dependent variables. Descriptive statistics were used to report means with corresponding standard deviations for continuous variables. Proportions were also reported for categorical variables. Bivariate regression analysis was conducted to determine the factors influencing the Benefits of Information Technology (BoIT), Barriers to Information Technology Use (BITU), and Motives of Information Technology Use (MoITU) in healthcare delivery. Stata SE version 15 was used for the analysis. A p-value of less than 0.05 was considered statistically significant. Results: Healthcare professionals (HCPs) generally perceived moderate benefits (mean score (M)=5.67) from information technology (IT) in healthcare. However, they slightly agreed that barriers like insufficient computers (M=5.11), frequent system downtime (M=5.09), low system performance (M=5.04), and inadequate staff training (M=4.88) hindered IT utilization. Respondents slightly agreed that training (M=5.56), technical support (M=5.46), and changes in work procedures (M=5.10) motivated their IT use. Bivariate regression analysis revealed significant influences of education, working experience, healthcare profession, and IT training on attitudes towards IT utilization in healthcare delivery (BoIT, BITU, and MoITU). Additionally, the age of healthcare providers, education, and working experience significantly influenced BITU. Ultimately, age, education, working experience, healthcare profession, and IT training significantly influenced MoITU in healthcare delivery. Conclusions: Healthcare professionals acknowledge moderate benefits of IT in healthcare but encounter barriers like inadequate resources and training. Motives for IT use include staff training and support. Bivariate regression analysis shows that education, working experience, profession, and IT training significantly influence attitudes toward IT adoption. Targeted interventions and policies can enhance IT utilization in the Cape Coast Metropolis, Ghana.

Keywords: health information application, utilization of information application, information technology use, healthcare

Procedia PDF Downloads 60
28098 Classification of Manufacturing Data for Efficient Processing on an Edge-Cloud Network

Authors: Onyedikachi Ulelu, Andrew P. Longstaff, Simon Fletcher, Simon Parkinson

Abstract:

The widespread interest in 'Industry 4.0' or 'digital manufacturing' has led to significant research requiring the acquisition of data from sensors, instruments, and machine signals. In-depth research then identifies methods of analysing the massive amounts of data generated before and during manufacture to solve a particular problem. The ultimate goal is for industrial Internet of Things (IIoT) data to be processed automatically to assist with either visualisation or autonomous system decision-making. However, the collection and processing of data in an industrial environment come at a cost. Little research has been undertaken on how to optimally specify what data to capture, transmit, process, and store at the various levels of an edge-cloud network. The first step in this specification is to categorise IIoT data for efficient and effective use. This paper proposes the attributes and classification required to take manufacturing digital data from various sources and determine the most suitable location for data processing on the edge-cloud network. The proposed classification framework will minimise overhead in terms of network bandwidth/cost and processing time of machine tool data via efficient decision-making on which datasets should be processed at the 'edge' and what should be sent to a remote server (cloud). A fast-and-frugal heuristic method is implemented for this decision-making. The framework is tested using case studies from industrial machine tools for machine productivity and maintenance.
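
A fast-and-frugal heuristic examines one cue at a time and stops at the first decisive one. The sketch below illustrates this style of edge-versus-cloud routing; the cues and thresholds are assumptions standing in for the paper's classification attributes.

```python
# Illustrative fast-and-frugal decision tree for edge-vs-cloud routing.
def route_dataset(latency_critical: bool, size_mb: float, needs_history: bool) -> str:
    """Sequential one-cue-at-a-time decisions, stopping at the first decisive cue."""
    if latency_critical:
        return "edge"    # e.g. machine-protection signals need local processing
    if size_mb > 100:
        return "edge"    # too costly to transmit raw; pre-process locally
    if needs_history:
        return "cloud"   # long-term storage and fleet-wide analytics
    return "edge"

print(route_dataset(latency_critical=False, size_mb=500, needs_history=True))  # edge
```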

Keywords: data classification, decision making, edge computing, industrial IoT, industry 4.0

Procedia PDF Downloads 174
28097 Attribute Analysis of Quick Response Code Payment Users Using Discriminant Non-negative Matrix Factorization

Authors: Hironori Karachi, Haruka Yamashita

Abstract:

Recently, quick response (QR) code payment systems have become popular. Many companies have introduced new QR code payment services, and these services compete with each other to increase their number of users. To increase the number of users, we should grasp the differences in the demographic information, usage information, and value of users between services. In this study, we conduct an analysis of real-world data provided by the Nomura Research Institute, including the demographic data of users and information on users' usage of two services, LINE Pay and PayPay. To analyze such data and interpret its features, Non-negative Matrix Factorization (NMF) is widely used; however, the target data suffer from missing values. We use EM-algorithm NMF (EMNMF) to complete the unknown values and understand the features of the data presented in matrix form. Moreover, to compare the results of the NMF analysis of two matrices, Discriminant NMF (DNMF) shows the differences in user features between the two matrices. In this study, we combine EMNMF and DNMF and analyze the target data. As the interpretation, we show the differences in the features of users between LINE Pay and PayPay.
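
The sketch below illustrates the EMNMF idea of factorising a matrix with missing entries: unknown cells are masked out of the multiplicative updates so that only observed values drive the factors. The matrix is toy data, and DNMF's discriminative term is omitted.

```python
# Masked (missing-data) NMF via multiplicative updates, in the spirit of EMNMF.
import numpy as np

rng = np.random.default_rng(0)
V = rng.random((8, 6))                # toy user-by-feature matrix
M = rng.random((8, 6)) > 0.2          # mask: True where an entry is observed
k = 3                                 # number of latent factors
W, H = rng.random((8, k)) + 0.1, rng.random((k, 6)) + 0.1

for _ in range(200):
    WH = W @ H
    W *= ((M * V) @ H.T) / ((M * WH) @ H.T + 1e-9)   # update W on observed cells
    WH = W @ H
    H *= (W.T @ (M * V)) / (W.T @ (M * WH) + 1e-9)   # update H on observed cells

print(np.abs(W @ H - V)[M].mean())    # reconstruction error on observed cells
```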

Keywords: data science, non-negative matrix factorization, missing data, quality of services

Procedia PDF Downloads 126
28096 Developing Guidelines for Public Health Nurse Data Management and Use in Public Health Emergencies

Authors: Margaret S. Wright

Abstract:

Background/Significance: During many recent public health emergencies/disasters, public health nursing data have been missing or delayed, potentially impacting decision-making and response. Data used as evidence for decision-making in response, planning, and mitigation have been erratic and slow, decreasing the ability to respond. Methodology: Applying best practices in data management and data use in public health settings, and guided by the concepts outlined in 'Disaster Standards of Care' models, recommendations were developed for a model of best practices in data management and use by public health nurses in public health disasters/emergencies. As the 'patient' in public health disasters/emergencies is the community (local, regional or national), guidelines for patient documentation are incorporated in the recommendations. Findings: Using this model, public health nurses could better plan how to prepare for, respond to, and mitigate disasters in their communities, and could better participate in decision-making in all three phases, bringing public health nursing data to the discussion as part of the evidence base for decision-making.

Keywords: data management, decision making, disaster planning, documentation, public health nursing

Procedia PDF Downloads 219