Search results for: Microsoft Azure Cloud
286 Removal of Heavy Metal Using Continuous Mode
Authors: M. Abd elfattah, M. Ossman, Nahla A. Taha
Abstract:
The present work explored the use of Egyptian rice straw, an agricultural waste that contributes to global warming through atmospheric brown clouds, as a potential feedstock for the preparation of activated carbon by physical and chemical activation. The results of this study showed that it is feasible to prepare activated carbons with relatively high surface areas and pore volumes from Egyptian rice straw by direct chemical and physical activation. The activated carbons produced by the two methods (AC1 and AC2) could be used as potential adsorbents for the removal of Fe(III) from aqueous solutions containing heavy metals and from polluted water. The adsorption of Fe(III) depended on the pH of the solution, with optimal removal efficiency at pH 5. Based on the results, the optimum contact time is 60 minutes and the optimum adsorbent dosage is 3 g/L. The adsorption breakthrough curves obtained at different bed depths indicated an increase in breakthrough time with increasing bed depth. A rise in inlet Fe(III) concentration reduces the throughput volume before the packed bed becomes saturated. AC1 showed a higher affinity for Fe(III) compared to raw rice husk.
Keywords: rice straw, activated carbon, Fe(III), fixed bed column, pyrolysis
Procedia PDF Downloads 251
285 Depth Camera Aided Dead-Reckoning Localization of Autonomous Mobile Robots in Unstructured GNSS-Denied Environments
Authors: David L. Olson, Stephen B. H. Bruder, Adam S. Watkins, Cleon E. Davis
Abstract:
In global navigation satellite system (GNSS)-denied settings, such as indoor environments, autonomous mobile robots are often limited to dead-reckoning navigation techniques to determine their position, velocity, and attitude (PVA). Localization is typically accomplished by employing an inertial measurement unit (IMU), which, while precise in nature, accumulates errors rapidly and severely degrades the localization solution. Standard sensor fusion methods, such as Kalman filtering, aim to fuse precise IMU measurements with accurate aiding sensors to establish a solution that is both precise and accurate. In indoor environments, where neither GNSS nor any other a priori information about the environment is available, effective sensor fusion is difficult to achieve, as accurate aiding sensor choices are sparse. However, an opportunity arises by employing a depth camera in the indoor environment. A depth camera can capture point clouds of the surrounding floors and walls. Extracting attitude from these surfaces can serve as an accurate aiding source, which directly combats errors that arise from gyroscope imperfections. This configuration for sensor fusion leads to a dramatic reduction of PVA error compared to traditional aiding sensor configurations. This paper provides the theoretical basis for the depth camera aiding method, initial expectations of the performance benefit via simulation, and a hardware implementation that verifies its efficacy. Hardware implementation is performed on the Quanser Qbot 2™ mobile robot, with a VectorNav VN-200™ IMU and a Microsoft Kinect™ camera.
Keywords: autonomous mobile robotics, dead reckoning, depth camera, inertial navigation, Kalman filtering, localization, sensor fusion
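The abstract's core idea, using camera-derived attitude to bound gyroscope drift, can be illustrated with a minimal one-axis sketch. This is not the authors' Kalman filter; the correction gain, gyro bias, and sample rate below are invented for illustration:

```python
import numpy as np

def dead_reckon(gyro_rates, dt, bias):
    """Pure gyro integration: a constant bias error grows linearly with time."""
    return np.cumsum((gyro_rates + bias) * dt)

def camera_aided(gyro_rates, dt, bias, wall_heading=0.0, gain=0.2):
    """Gyro integration corrected each step toward a heading extracted from
    depth-camera wall normals; the drift stays bounded."""
    est, history = 0.0, []
    for rate in gyro_rates:
        est += (rate + bias) * dt           # propagate with the (biased) gyro
        est += gain * (wall_heading - est)  # nudge toward the camera-derived heading
        history.append(est)
    return np.array(history)

# stationary robot, 0.01 rad/s gyro bias, 100 s at 10 Hz
rates = np.zeros(1000)
drift = dead_reckon(rates, 0.1, 0.01)   # ends near 1 rad of heading error
aided = camera_aided(rates, 0.1, 0.01)  # stays within a few milliradians
```

The steady-state aided error here is bias·dt·(1-gain)/gain, so a stronger correction gain or a better attitude source tightens the bound further.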
Procedia PDF Downloads 207
284 The Relationship of Fast Food Consumption Preference with Macro and Micro Nutrient Adequacy among Students of SMP Negeri 5 Padang
Authors: Widari
Abstract:
This study aims to determine the relationship between fast food consumption preferences and the macro- and micronutrient adequacy of students of SMP Negeri 5 Padang. A cross-sectional design was used, conducted on 100 students of SMP Negeri 5 Padang. The variables studied were fast food preferences, macronutrient adequacy (carbohydrate, protein, fat, fiber), and micronutrient adequacy (sodium, calcium, iron). Physical activity level was treated as a confounding factor because it was considered to substantially affect the students' food consumption. Data were collected using a 2 x 24-hour food recall questionnaire to capture respondents' intake on a school day and on a holiday, then processed using the Nutrisurvey software and Microsoft Excel 2010. The analysis was performed on samples in the low and medium physical activity categories; physical activity itself was not analyzed against the other variables, but was instead restricted in the design in an attempt to eliminate confounding. Univariate and bivariate analyses were performed using SPSS 16.0 for Windows with the Kolmogorov-Smirnov statistical test at a 95% confidence level (α = 0.05). The univariate analysis showed that more than 70% of respondents liked fast food. On average, respondents had inadequate macronutrient intake: fiber (100%), carbohydrates (72%), and protein (56%), whereas 41% of respondents had an excess fat intake. Furthermore, many respondents had micronutrient deficiencies: 98% for sodium, 96% for iron, and 91% for calcium. The bivariate analysis showed no significant association between fast food consumption preferences and macro- and micronutrient adequacy (p > 0.05). This may be because, in fact, not all students who have a preference for fast food actually eat it. For future studies, it is recommended that the sample consist of students who actually like and eat fast food in order to obtain better analysis results.
Keywords: fast food, nutritional adequacy, preferences, students
Procedia PDF Downloads 374
283 A Systematic Review on Prevalence, Serotypes and Antibiotic Resistance of Salmonella in Ethiopia
Authors: Atsebaha Gebrekidan Kahsay, Tsehaye Asmelash, Enquebaher Kassaye
Abstract:
Background: Salmonella remains a global public health problem, with a significant burden in sub-Saharan African countries. S. Typhi and S. Paratyphi are human-restricted causes of typhoid and paratyphoid fever, whereas S. Enteritidis and S. Typhimurium are causative agents of invasive nontyphoidal disease in humans, with animals as their reservoirs. Antibiotic resistance in Salmonella is a further public health threat around the globe. To provide comprehensive information on human and animal salmonellosis, we conducted a systematic review of the prevalence, serotypes, and antibiotic resistance of Salmonella in Ethiopia. Methods: This systematic review used the Google Scholar and PubMed search engines to retrieve articles from Ethiopia published in English in peer-reviewed international journals from 2010 to 2022. We used keywords to identify the intended research articles and applied the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) checklist to enforce the inclusion and exclusion criteria. Frequencies and percentages were analyzed using Microsoft Excel. Results: Two hundred seven published articles were retrieved, and 43 were selected for the systematic review: 28 on humans and 15 on animals. The prevalence of Salmonella in humans and animals was 434 (5.2%) and 641 (10.1%), respectively. Fourteen serotypes were identified from animals, and S. Typhimurium was among the top five. Among the ciprofloxacin-resistant isolates in the human studies, the highest resistance reported was 16.7%, whereas for ceftriaxone, 100% resistance was reported. Conclusions: The prevalence of Salmonella among diarrheic patients and food handlers (5.2%) was lower than the prevalence in food animals (10.1%). We did not find serotype data in the human studies, although fourteen serotypes were reported in the food-animal studies, with S. Typhimurium among the top five. Salmonella species from some human studies were non-susceptible to ceftriaxone. We recommend further study of invasive nontyphoidal Salmonella and predisposing factors among humans and animals in Ethiopia.
Keywords: antibiotic resistance, prevalence, systematic review, serotypes, Salmonella, Ethiopia
Procedia PDF Downloads 84
282 Multi-Modal Visualization of Working Instructions for Assembly Operations
Authors: Josef Wolfartsberger, Michael Heiml, Georg Schwarz, Sabrina Egger
Abstract:
Growing individualization and higher numbers of variants in industrial assembly products raise the complexity of manufacturing processes. Technical assistance systems considering both procedural and human factors allow for an increase in product quality and a decrease in required learning times by supporting workers with precise working instructions. Due to varying needs of workers, the presentation of working instructions leads to several challenges. This paper presents an approach for a multi-modal visualization application to support assembly work of complex parts. Our approach is integrated within an interconnected assistance system network and supports the presentation of cloud-streamed textual instructions, images, videos, 3D animations and audio files along with multi-modal user interaction, customizable UI, multi-platform support (e.g. tablet-PC, TV screen, smartphone or Augmented Reality devices), automated text translation and speech synthesis. The worker benefits from more accessible and up-to-date instructions presented in an easy-to-read way.
Keywords: assembly, assistive technologies, augmented reality, manufacturing, visualization
Procedia PDF Downloads 165
281 A Numerical Description of a Fibre Reinforced Concrete Using a Genetic Algorithm
Authors: Henrik L. Funke, Lars Ulke-Winter, Sandra Gelbrich, Lothar Kroll
Abstract:
This work reports an approach for the automatic adaptation of concrete formulations based on genetic algorithms (GA) to optimize a wide range of different fit functions. To achieve this goal, a method was developed that provides a numerical description of a fibre reinforced concrete (FRC) mixture with regard to the production technology and the property spectrum of the concrete. In a first step, an FRC mixture with seven fixed components was characterized by varying the amounts of the components. For that purpose, ten concrete mixtures were prepared and tested. The testing procedure comprised flow spread, compressive strength, and bending tensile strength. The analysis and approximation of the measured data were carried out with GAs. The aim was to obtain a closed mathematical expression that best describes the given seven-point cloud of FRC data by applying a Gene Expression Programming with Free Coefficients (GEP-FC) strategy. The seven-parameter FRC mixture model generated by this method correlated well with the measured data. The developed procedure can be used to find closed mathematical expressions for concrete mixtures based on measured data.
Keywords: concrete design, fibre reinforced concrete, genetic algorithms, GEP-FC
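The GA-based fitting idea can be sketched in a few lines. This is not GEP-FC itself (evolving expression trees and their free coefficients is considerably more involved); it is a toy GA that evolves the free coefficients of a fixed quadratic model against hypothetical measurements. The data, population size, and mutation rate are all invented:

```python
import numpy as np

rng = np.random.default_rng(0)

# hypothetical "measured" property vs. a mixture parameter (illustration only)
x = np.linspace(0.0, 1.0, 10)
y = 2.0 * x**2 - 1.0 * x + 0.5

def fitness(coeffs):
    """Mean squared error of the candidate model against the measurements."""
    a, b, c = coeffs
    return np.mean((a * x**2 + b * x + c - y) ** 2)

def evolve(pop_size=40, generations=300, sigma=0.05):
    pop = rng.normal(0.0, 1.0, (pop_size, 3))
    for _ in range(generations):
        scores = np.array([fitness(ind) for ind in pop])
        parents = pop[np.argsort(scores)[: pop_size // 2]]          # elitist truncation selection
        children = parents + rng.normal(0.0, sigma, parents.shape)  # Gaussian mutation
        pop = np.vstack([parents, children])
    return min(pop, key=fitness)

best = evolve()  # coefficients approach (2.0, -1.0, 0.5)
```

Because the parents survive each generation, the best fitness is monotonically non-increasing, which is why a short run already recovers the underlying coefficients closely.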
Procedia PDF Downloads 281
280 Evaluation of Thrombolytic Activity of Zingiber cassumunar Roxb. and Thai Herbal Prasaplai Formula
Authors: Warachate Khobjai, Suriyan Sukati, Khemjira Jarmkom, Pattaranut Eakwaropas, Surachai Techaoei
Abstract:
The purpose of this study was to investigate the in vitro thrombolytic activity of Zingiber cassumunar Roxb. and Prasaplai, a Thai herbal formulation containing Z. cassumunar Roxb. The herbs were extracted with boiling water and concentrated by lyophilization. To assess their thrombolytic potential, an in vitro clot lysis method was applied, with streptokinase and sterile distilled water as the positive and negative controls, respectively. Crude aqueous extracts of Z. cassumunar Roxb. and the Prasaplai formula showed significant thrombolytic activity, with clot lysis of 17.90% and 25.21%, respectively, compared to the negative control (5.16%), while the standard streptokinase gave 64.78% clot lysis. These findings suggest that Z. cassumunar Roxb. exhibits moderate thrombolytic activity and could play an important role in the thrombolytic properties of the Prasaplai formula. However, further study is needed to observe in vivo clot-dissolving potential and to isolate the active component(s) of these extracts.
Keywords: thrombolytic activity, clot lysis, Zingiber cassumunar Roxb., Prasaplai formula, aqueous extract
Procedia PDF Downloads 341
279 Genesis of Entrepreneur Business Models in New Ventures
Authors: Arash Najmaei, Jo Rhodes, Peter Lok, Zahra Sadeghinejad
Abstract:
In this article, we explore how a new business model comes into existence in the Australian cloud-computing ecosystem. Findings from a multiple case study methodology reveal that, to develop a business model, new ventures adopt a three-phase approach. In the first phase, labelled business model ideation (BMID), various ideas for a viable business model are generated from both internal and external networks of the entrepreneurial team, and the most viable one is chosen. Strategic consensus and commitment are generated in the second phase, a strategic action phase of business modelling. We labelled this phase business model strategic commitment (BMSC) because, through commitment and the subsequent actions of executives, resources are pooled, coordinated, and allocated to the business model. Three complementary sets of resources shape the business model: managerial (MnRs), marketing (MRs) and technological resources (TRs). The third phase is the market-test phase, in which the business model is reified through the delivery of the intended value to customers and the conversion of revenue into profit. We labelled this phase business model actualization (BMAC). Theoretical and managerial implications of these findings are discussed, and several directions for future research are outlined.
Keywords: entrepreneur business model, high-tech venture, resources, conversion of revenue
Procedia PDF Downloads 447
278 Open Source, Open Hardware Ground Truth for Visual Odometry and Simultaneous Localization and Mapping Applications
Authors: Janusz Bedkowski, Grzegorz Kisala, Michal Wlasiuk, Piotr Pokorski
Abstract:
Ground-truth data is essential for the quantitative evaluation of VO (Visual Odometry) and SLAM (Simultaneous Localization and Mapping) using metrics such as ATE (Absolute Trajectory Error) and RPE (Relative Pose Error). Many open-access data sets provide raw and ground-truth data for benchmark purposes. The issue arises when one would like to validate Visual Odometry and/or SLAM approaches on data captured with the device the algorithm is targeted at, for example a mobile phone, and to disseminate that data to other researchers. For this reason, we propose an open source, open hardware ground-truth system that provides an accurate and precise trajectory together with a 3D point cloud. It is based on the Livox Mid-360 LiDAR with a non-repetitive scanning pattern, an on-board Raspberry Pi 4B computer, a battery, and software for off-line calculations (camera-to-LiDAR calibration, LiDAR odometry, SLAM, georeferencing). We show how this system can be used for the evaluation of various state-of-the-art algorithms (Stella SLAM, ORB SLAM3, DSO) in typical indoor monocular VO/SLAM scenarios.
Keywords: SLAM, ground truth, navigation, LiDAR, visual odometry, mapping
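For readers unfamiliar with the metric, a minimal ATE computation can be sketched as follows. Standard benchmark tools align the estimated trajectory to the ground truth with a full SE(3)/Umeyama fit before computing residuals; this sketch aligns translation only, and the toy trajectory is invented:

```python
import numpy as np

def ate_rmse(gt, est):
    """Root-mean-square translational error after a translation-only alignment.

    gt, est: (N, 3) arrays of time-synchronized positions.
    """
    offset = gt.mean(axis=0) - est.mean(axis=0)  # align centroids
    residuals = gt - (est + offset)
    return float(np.sqrt((np.linalg.norm(residuals, axis=1) ** 2).mean()))

# a toy trajectory: an estimate shifted by a constant offset aligns to ~zero error,
# which is exactly why ATE measures drift and noise rather than absolute placement
t = np.linspace(0.0, 1.0, 50)
gt = np.stack([np.cos(t), np.sin(t), t], axis=1)
shifted = gt + np.array([3.0, -2.0, 1.0])
```

RPE is computed analogously but over relative motions between pose pairs, so it penalizes local drift that a global alignment would hide.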
Procedia PDF Downloads 74
277 A Simple Algorithm for Real-Time 3D Capturing of an Interior Scene Using a Linear Voxel Octree and a Floating Origin Camera
Authors: Vangelis Drosos, Dimitrios Tsoukalos, Dimitrios Tsolis
Abstract:
We present a simple algorithm for capturing a 3D scene (focused on the use of mobile device cameras in the context of augmented/mixed reality) by using a floating origin camera solution and storing the resulting information in a linear voxel octree. Data is derived from point clouds captured by a mobile device camera. For the purposes of this paper, we assume a scene of fixed size (known to us or determined beforehand) and a fixed voxel resolution. The resulting data is stored in a linear voxel octree backed by a hash table. We begin by briefly discussing the logic behind floating origin approaches and the use of linear voxel octrees for efficient storage. Following that, we present the algorithm for translating captured feature points into voxel data in the context of a fixed-origin world and storing them. Finally, we discuss potential applications and areas of future development and improvement to the efficiency of our solution.
Keywords: voxel, octree, computer vision, XR, floating origin
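The storage scheme described (a linear, pointerless voxel octree kept in a hash table) can be sketched as below, keyed by Morton (Z-order) codes. The fixed scene size and resolution are illustrative assumptions, and the paper's own data layout may differ:

```python
def morton_key(ix, iy, iz, bits=10):
    """Interleave the bits of integer voxel coordinates into one Z-order key,
    which implicitly encodes the octree path from root to leaf."""
    key = 0
    for b in range(bits):
        key |= ((ix >> b) & 1) << (3 * b)
        key |= ((iy >> b) & 1) << (3 * b + 1)
        key |= ((iz >> b) & 1) << (3 * b + 2)
    return key

class LinearVoxelOctree:
    """Pointerless octree over a fixed-size scene, backed by a hash table."""

    def __init__(self, scene_size=10.0, resolution=1024):
        self.voxel_size = scene_size / resolution
        self.cells = {}  # Morton key -> number of feature points in that leaf

    def _key(self, x, y, z):
        s = self.voxel_size
        return morton_key(int(x / s), int(y / s), int(z / s))

    def insert(self, x, y, z):
        k = self._key(x, y, z)
        self.cells[k] = self.cells.get(k, 0) + 1

    def occupied(self, x, y, z):
        return self._key(x, y, z) in self.cells

tree = LinearVoxelOctree()
tree.insert(1.23, 4.56, 7.89)  # one captured feature point -> one occupied leaf
```

Because the key encodes the root-to-leaf path, coarser octree levels can be queried by right-shifting the key by three bits per level, without storing any internal nodes.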
Procedia PDF Downloads 133
276 The Effect of Excel on Undergraduate Students’ Understanding of Statistics and the Normal Distribution
Authors: Masomeh Jamshid Nejad
Abstract:
Nowadays, statistical literacy is no longer merely a desirable skill but an essential one, with broad applications across diverse fields, especially in operational decision areas such as business management, finance, and economics. As such, learning and a deep understanding of statistical concepts are essential in the context of business studies. One of the crucial topics in statistical theory and its application is the normal distribution, often called the bell-shaped curve. To interpret data and conduct hypothesis tests, comprehending the properties of the normal distribution (the mean and standard deviation) is essential for business students. This requires undergraduate students in economics and business management to visualize and work with data following a normal distribution. Since technology is now interconnected with education, it is important to teach statistics topics to undergraduate students in the context of tools such as Python, RStudio, and Microsoft Excel. This research endeavours to shed light on the effect of Excel-based instruction on learners’ knowledge of statistics, specifically the central concept of the normal distribution. Two groups of undergraduate students from the Business Management program were compared: one group underwent Excel-based instruction, while the other relied only on traditional teaching methods. We analyzed experimental data and BBA participants’ responses to statistics questions focusing on the normal distribution, including its key attributes, such as the mean and standard deviation. The results of our study indicate that exposing students to Excel-based learning helps learners comprehend statistical concepts more effectively than traditional teaching methods. In addition, students taught with Excel-based instruction showed a stronger ability to visualize and interpret data following a normal distribution.
Keywords: statistics, Excel-based instruction, data visualization, pedagogy
Procedia PDF Downloads 55
275 The Impact of Barefoot versus Shod Running on Lower Limb Gait Cycle Pattern among Recreational Club Runners in Durban, South Africa
Authors: Siyabonga Kunene, Calvin Shipley
Abstract:
Introduction: Despite the health benefits of running, injuries are common, with prevalence ranging between 18.2% and 92.4% worldwide. Differences in gait patterns between barefoot and shod running can reveal traits that may lead to running injuries. The aim was to assess and compare lower limb gait cycle patterns between barefoot and shod running among runners. Methods: An experimental same-subject study design was used. The study population consisted of injury-free male and female adult recreational runners from a running club in Durban. A convenience sampling method was used, and 14 participants were recruited. The study was conducted in the physiotherapy performance laboratory at the University of KwaZulu-Natal. A Woodway Desmo treadmill and the KinePro gait analysis system were used. Descriptive and inferential statistics were analysed using Microsoft Excel and Intercooled Stata. Results: Participants included a greater percentage of females (57.1%, n = 8) than males (42.9%, n = 6). The mean age was 38.57 years. A significant difference (p < 0.0009) between barefoot cadence (177.92 steps/min) and shod cadence (171.94 steps/min) was observed. The right (0.261 s) and left (0.257 s) barefoot stance phases were shorter than the right (0.273 s) and left (0.270 s) shod stance phases. The right barefoot swing phase (0.420 s) was slightly shorter than the right shod swing phase (0.427 s), and the left barefoot swing phase (0.416 s) was quicker than the left shod swing phase (0.432 s). Significant differences between barefoot and shod stance (p < 0.009) and swing (p < 0.040) phase symmetry were found. Conclusion: A considerable difference was found between barefoot and shod running gait cycle patterns among participants. This difference may play a role in the prevention of running-related injuries.
Keywords: barefoot running, shod running, gait cycle pattern, same-subject study design
Procedia PDF Downloads 252
274 A Monocular Measurement for 3D Objects Based on Distance Area Number and New Minimize Projection Error Optimization Algorithms
Authors: Feixiang Zhao, Shuangcheng Jia, Qian Li
Abstract:
High-precision measurement of a target’s position and size is one of the hotspots in the field of vision inspection. This paper proposes a three-dimensional object positioning and measurement method using a monocular camera and GPS, namely the Distance Area Number-New Minimize Projection Error (DAN-NMPE) method. Our algorithm contains two parts: DAN, a picture sequence algorithm, and NMPE, a projection error minimization algorithm, which together greatly improve the measurement accuracy of the target’s position and size. Comprehensive experiments validate the effectiveness of the proposed method on a self-made traffic sign dataset. The results show that, with a laser point cloud as the ground truth, the size and position errors of the traffic signs measured by this method are ±5% and 0.48 ± 0.3 m, respectively. We also compared it with the current mainstream method that uses a monocular camera to locate and measure traffic signs. DAN-NMPE attains significant improvements over existing state-of-the-art methods, improving the measurement accuracy of size and position by 50% and 15.8%, respectively.
Keywords: monocular camera, GPS, positioning, measurement
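The abstract does not give the DAN-NMPE equations, but any monocular size measurement of this kind rests on the pinhole projection relation, which can be sketched as below. The focal length and pixel values are invented for illustration:

```python
def object_width_m(pixel_width, distance_m, focal_px):
    """Pinhole camera model: physical width = distance * pixel width / focal length (px)."""
    return distance_m * pixel_width / focal_px

def reprojection_error_px(pixel_width, distance_m, focal_px, hypothesized_width_m):
    """Difference between the observed pixel extent and the extent predicted from a
    hypothesized physical size -- the kind of residual a projection-error minimizer
    such as NMPE would drive toward zero over a picture sequence."""
    predicted_px = hypothesized_width_m * focal_px / distance_m
    return pixel_width - predicted_px

# a 0.6 m traffic sign seen 10 m away through a 1000 px focal length spans 60 px
print(object_width_m(60, 10.0, 1000))  # 0.6
```

Optimizing the hypothesized size and position jointly over many frames, rather than inverting a single view, is what suppresses the per-frame pixel and GPS noise.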
Procedia PDF Downloads 144
273 Evaluation of Outpatient Management of Proctological Surgery under Saddle Block
Authors: Bouhouf Atef, Beloulou Mohamed Lamine
Abstract:
Introduction: Outpatient surgery is continually developing compared to conventional inpatient surgery; its rate increases every year under global socio-economic pressure. Most hospitals, however, continue to perform proctologic surgery under conventional hospitalization. Purpose: As part of a monocentric prospective descriptive study, we examined the feasibility of proctologic surgery under saddle block on an outpatient basis, with the same safety conditions as in traditional hospitalization. Material and methods: This is a monocentric prospective descriptive study spread over a period of 24 months, from December 2018 to December 2020, including 150 patients meeting the medico-surgical and socio-environmental eligibility criteria for outpatient surgery, operated on for proctological pathologies under saddle block in outpatient mode in the surgery department of the regional military hospital of Constantine, Algeria. The data were collected and analyzed with the biomedical statistics software Epi Info and Microsoft Excel, then compared with other related studies. Results: Over a period of two years, this study involved 150 male patients with a mean age of 32 years (range 20-64). Most patients (95.33%) were ASA class I, and 4.67% were ASA class II. All patients received saddle blocks. The average length of stay was six hours. The quality indicators for outpatient surgery in our study were: 0% deprogramming, 3% conversion to full hospitalization, 0.7% readmission, an average waiting time before access to the operating room of 83 minutes with no discharge delays, a satisfaction rate of 90.8%, and a cost reduction compared to conventional inpatient surgery of between 32.6% and 48.75%. Conclusions: The outpatient management of proctological surgery under saddle block is very beneficial in terms of safety, efficiency, simplicity, and economy. Our results are in line with those of the literature, and this work deserves to be continued with a larger number of patients.
Keywords: outpatient surgery, proctological surgery, saddle block, satisfaction, cost
Procedia PDF Downloads 23
272 The Influence of Students’ Learning Factor and Parents’ Involvement in Their Learning and Suspension: The Application of Big Data Analysis of Internet of Things Technology
Authors: Chih Ming Kung
Abstract:
This empirical study examines students’ enrollment and dropout rates from the perspectives of students’ learning, parents’ involvement, and the learning process. Methods: Data collected from the entry website of an Internet of Things (IoT) platform, parents’ participation, and the installation pattern of an exit poll website were investigated. Results: The study found that, in terms of degree of involvement, the attractiveness of courses, self-performance, and departmental loyalty exert significant influences on four aspects of learning benefits: psychological, physical, social, and educational. Parents’ participation also exerts a significant influence on learning benefits. A suitable cloud-based tool was designed to collect dynamic big data on students’ learning processes. Conclusion: The results of this research can serve as valuable references for the government when making and promoting related policies from a more macro view. They are also expected to help schools in the practical study of enrollment promotion.
Keywords: students’ learning factor, parents’ involvement, involvement, technology
Procedia PDF Downloads 147
271 Optimizing Organizational Performance: The Critical Role of Headcount Budgeting in Strategic Alignment and Financial Stability
Authors: Shobhit Mittal
Abstract:
Headcount budgeting stands as a pivotal element in organizational financial management, extending beyond traditional budgeting to encompass strategic resource allocation for workforce-related expenses. This process is integral to maintaining financial stability and fostering a productive workforce, requiring a comprehensive analysis of factors such as market trends, business growth projections, and evolving workforce skill requirements. It demands a collaborative approach, primarily involving Human Resources (HR) and finance departments, to align workforce planning with an organization's financial capabilities and strategic objectives. The dynamic nature of headcount budgeting necessitates continuous monitoring and adjustment in response to economic fluctuations, business strategy shifts, technological advancements, and market dynamics. Its significance in talent management is also highlighted, aligning financial planning with talent acquisition and retention strategies to ensure a competitive edge in the market. The consequences of incorrect headcount budgeting are explored, showing how it can lead to financial strain, operational inefficiencies, and hindered strategic objectives. Examining case studies like IBM's strategic workforce rebalancing and Microsoft's shift for long-term success, the importance of aligning headcount budgeting with organizational goals is underscored. These examples illustrate that effective headcount budgeting transcends its role as a financial tool, emerging as a strategic element crucial for an organization's success. This necessitates continuous refinement and adaptation to align with evolving business goals and market conditions, highlighting its role as a key driver in organizational success and sustainability.
Keywords: strategic planning, fiscal budget, headcount planning, resource allocation, financial management, decision-making, operational efficiency, risk management, headcount budget
Procedia PDF Downloads 52
270 Statistical Shape Analysis of the Human Upper Airway
Authors: Ramkumar Gunasekaran, John Cater, Vinod Suresh, Haribalan Kumar
Abstract:
The main objective of this project is to develop a statistical shape model, using principal component analysis (PCA), that can be used to analyze the shape of the human airway. The ultimate goal is to identify geometric risk factors for the diagnosis and management of Obstructive Sleep Apnoea (OSA). Anonymized CBCT scans of 25 individuals were obtained from the Otago Radiology Group. The airways were segmented between the hard palate and the aryepiglottic fold using snake active contour segmentation. The point cloud of the segmented images was then fitted with a bi-cubic mesh, and pseudo-landmarks were placed in order to perform PCA on the segmented airway, analyze its shape, and find the relationship between shape and OSA risk factors. From the PCA results, the first four modes of variation were found to be significant. Mode 1 was interpreted as the overall length of the airway, Mode 2 related to the anterior-posterior width of the retroglossal region, Mode 3 to the lateral dimension of the oropharyngeal region, and Mode 4 to the anterior-posterior width of the oropharyngeal region. All of these regions are implicated in the risk factors of OSA.
Keywords: medical imaging, image processing, FEM/BEM, statistical modelling
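The PCA shape-model pipeline described here (landmarks, mean shape, modes of variation) can be sketched as follows; the synthetic landmark data stands in for the 25 segmented airways, and the landmark count is invented:

```python
import numpy as np

def build_shape_model(landmarks):
    """landmarks: (n_subjects, 3 * n_points) flattened pseudo-landmark coordinates.
    Returns the mean shape, the modes of variation, and explained-variance ratios."""
    mean = landmarks.mean(axis=0)
    centered = landmarks - mean
    _, s, modes = np.linalg.svd(centered, full_matrices=False)  # PCA via SVD
    variance = s**2 / (len(landmarks) - 1)
    return mean, modes, variance / variance.sum()

def synthesize(mean, modes, weights):
    """New shape = mean + weighted combination of the leading modes."""
    return mean + np.asarray(weights) @ modes[: len(weights)]

# synthetic 'airways': every subject is the mean shape stretched along one direction,
# so a single mode should explain nearly all of the variance
rng = np.random.default_rng(1)
direction = rng.normal(size=12)
shapes = rng.normal(size=(25, 1)) * direction + 5.0
mean, modes, explained = build_shape_model(shapes)
```

Interpreting a mode (e.g. "overall airway length") is then a matter of synthesizing shapes at, say, plus and minus two standard deviations along that mode and inspecting the geometric change.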
Procedia PDF Downloads 514
269 A Machine Learning Based Method to Detect System Failure in Resource Constrained Environment
Authors: Payel Datta, Abhishek Das, Abhishek Roychoudhury, Dhiman Chattopadhyay, Tanushyam Chattopadhyay
Abstract:
Machine learning (ML) and deep learning (DL) are predominantly used in image/video processing, natural language processing (NLP), and audio and speech recognition, but much less in system performance evaluation. In this paper, the authors describe the architecture of an abstraction layer, constructed using ML/DL, that detects system failure by evaluating the performance metrics of an IoT service deployed in a constrained infrastructure environment. The system has been tested on a manually annotated data set containing different system metrics, such as number of threads, throughput, average response time, CPU usage, memory usage, and network input/output, captured in different hardware environments: edge (an Atom-based gateway) and cloud (AWS EC2). The main challenge in developing such a system is that the classification accuracy must be 100%, since an undetected error degrades the service performance and consequently affects the reliability and high availability that are mandatory for an IoT system. The proposed ML/DL classifiers achieve 100% accuracy on a data set of nearly 4,000 samples captured within the organization.
Keywords: machine learning, system performance, performance metrics, IoT, edge
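As a toy illustration of classifying service health from metrics like these (not the authors' model, whose architecture is not detailed in the abstract), a nearest-centroid rule over synthetic CPU/memory/latency readings:

```python
import numpy as np

class FailureDetector:
    """Nearest-centroid classifier over per-sample service metrics."""

    def fit(self, X, y):
        self.centroids = {label: X[y == label].mean(axis=0) for label in np.unique(y)}
        return self

    def predict(self, X):
        labels = list(self.centroids)
        dists = np.stack([np.linalg.norm(X - self.centroids[l], axis=1) for l in labels])
        return np.array([labels[i] for i in dists.argmin(axis=0)])

# synthetic metrics: [CPU fraction, memory fraction, avg response time (ms)]
healthy = np.array([[0.30, 0.40, 110.0], [0.35, 0.45, 95.0], [0.28, 0.38, 120.0]])
failing = np.array([[0.95, 0.90, 900.0], [0.97, 0.88, 850.0], [0.92, 0.93, 990.0]])
X = np.vstack([healthy, failing])
y = np.array(["ok"] * 3 + ["fail"] * 3)
detector = FailureDetector().fit(X, y)
```

On real deployments the metrics would need scaling (response time dominates the Euclidean distance here), and a model with a tunable decision threshold would be preferred so that missed failures can be traded off against false alarms.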
Procedia PDF Downloads 195
268 Undersea Communications Infrastructure: Risks, Opportunities, and Geopolitical Considerations
Authors: Lori W. Gordon, Karen A. Jones
Abstract:
Today’s high-speed data connectivity depends on a vast global network of infrastructure across space, air, land, and sea, with undersea cable infrastructure (UCI) serving as the primary means for intercontinental and ‘long-haul’ communications. The UCI landscape is changing and includes an increasing variety of state actors, such as the growing economies of Brazil, Russia, India, China, and South Africa. Non-state commercial actors, such as hyper-scale content providers including Google, Facebook, Microsoft, and Amazon, are also seeking to control their data and networks through significant investments in submarine cables. Active investments by both state and non-state actors will invariably influence the growth, geopolitics, and security of this sector. Beyond these hyper-scale content providers, there are new commercial satellite communication providers. These include traditional geosynchronous (GEO) satellites offering broad coverage, high-throughput GEO satellites offering high capacity with spot beam technology, and low earth orbit (LEO) ‘mega constellations’ offering global broadband services. Potential new entrants include High Altitude Platforms (HAPS) offering low latency connectivity and LEO constellations carrying high-speed optical mesh networks, i.e., ‘fiber in the sky.’ This paper focuses on understanding the role of submarine cables within the larger context of the global data commons, spanning space, terrestrial, air, and sea networks, and includes an analysis of national security policy and geopolitical implications. As network operators and commercial and government stakeholders plan for emerging technologies and architectures, hedging the risks to future connectivity will help ensure that our data backbone remains secure for years to come.
Keywords: communications, global, infrastructure, technology
Procedia PDF Downloads 89
267 The Integration of Patient Health Record Generated from Wearable and Internet of Things Devices into Health Information Exchanges
Authors: Dalvin D. Hill, Hector M. Castro Garcia
Abstract:
A growing number of individuals utilize wearable devices on a daily basis. The usage and functionality of these wearable devices vary from user to user. One popular use of such devices is to track health-related activities, which are typically stored in the device’s memory or uploaded to an account in the cloud; based on the current trend, the data accumulated from the wearable device are stored in a standalone location. In many of these cases, these health-related data are not considered in the holistic view of a user’s health lifestyle or record. The health-related data generated from wearable and Internet of Things (IoT) devices can serve as empirical information for a medical provider, as the standalone data can add value to the holistic health record of a patient. This paper proposes a solution to integrate the data gathered from these wearable and IoT devices with a patient’s Personal Health Record (PHR) stored within the confines of a Health Information Exchange (HIE).
Keywords: electronic health record, health information exchanges, internet of things, personal health records, wearable devices, wearables
Procedia PDF Downloads 130
266 New Insights Into Fog Role In Atmospheric Deposition Using Satellite Images
Authors: Suruchi
Abstract:
This study aims to examine the spatial and temporal patterns of fog occurrence across the Czech Republic, utilizing satellite imagery and other data sources. The main objective is to understand the role of fog in atmospheric deposition processes and its potential impact on the environment and ecosystems. Through satellite image analysis, the study will identify and categorize different types of fog, including radiation fog, orographic fog, and mountain fog. Fog detection algorithms and cloud type products will be evaluated to assess the frequency and distribution of fog events throughout the Czech Republic. Furthermore, the regions covered by fog will be classified based on their fog type and associated pollution levels. This will provide insights into the variability of fog characteristics and its implications for atmospheric deposition. Spatial analysis techniques will be used to pinpoint areas prone to frequent fog events and to evaluate their pollution levels. Statistical methods will be employed to analyze patterns in fog occurrence over time and their connection with environmental factors. The ultimate goal of this research is to offer fresh perspectives on fog's role in atmospheric deposition processes, enhancing our understanding of its environmental significance and informing future research and environmental management initiatives.
Keywords: pollution, GIS, fog, satellite, atmospheric deposition
Procedia PDF Downloads 23
265 Preparing Curved Canals Using Mtwo and RaCe Rotary Instruments: A Comparison Study
Authors: Mimoza Canga, Vito Malagnino, Giulia Malagnino, Irene Malagnino
Abstract:
Objective: The objective of this study was to compare the effectiveness of Mtwo and RaCe rotary instruments in cleaning and shaping curved root canals. Material and Method: The present study was conducted on 160 simulated canals in resin blocks, with curvature angles of 15°-30°. These 160 simulated canals were divided into two groups of 80 blocks each, and each group was divided into two subgroups (n=40 canals each). The simulated canals were prepared with Mtwo and RaCe rotary nickel-titanium instruments. The root canals were measured at four different reference points, starting at 13 mm from the orifice. In the first group, the canals were prepared using the Mtwo rotary system (VDW, Munich, Germany). The Mtwo files used were 10/0.04, 15/0.05, 20/0.06, and 25/0.06. These instruments entered the full length of the canal, and each file was rotated in the canal until it reached the apical point. In the second group, the canals were prepared using RaCe instruments (La Chaux-de-Fonds, Switzerland) with the crown-down technique, using a torque-controlled electric motor (VDW, Munich, Germany) at 600 rpm and 2 N·cm, in the sequence #40/0.10, #35/0.08, #30/0.06, #25/0.04, #25/0.02. The data were recorded using SPSS version 23 software (IBM Corp., Armonk, NY, USA), and data analysis was done using the ANOVA test. Results: The Mtwo rotary instruments were able to clean and shape curved canals in a right-to-left motion at different levels, without any deviation and in perfect symmetry (P < 0.001). The data showed that the greater the depth of the root canal, the greater the deviations of the RaCe rotary instruments. These deviations occurred at three levels: S2 (P=0.004), S3 (P=0.007), and S4 (P=0.009). The Mtwo files can go deeper and create a greater angle at level S4 (21°-28°) compared to RaCe instruments (19°-24°).
Conclusion: The present study noted a clinically significant difference between Mtwo rotary instruments and RaCe rotary files used for canal preparation and indicated that Mtwo instruments are a better choice for curved canals.
Keywords: canal curvature, canal preparation, Mtwo, RaCe, resin blocks
Procedia PDF Downloads 122
264 Effectiveness of Gamified Virtual Physiotherapy for Patients with Shoulder Problems
Authors: A. Barratt, M. H. Granat, S. Buttress, B. Roy
Abstract:
Introduction: Physiotherapy is an essential part of the treatment of patients with shoulder problems. Treatment usually centres on addressing specific physiotherapy goals, ultimately resulting in improvement in pain and function. This study investigates whether computerised physiotherapy using gamification principles is as effective as standard physiotherapy. Methods: Physiotherapy exergames were created using a combination of commercially available hardware, the Microsoft Kinect, and bespoke software. The exergames were validated by mapping them to physiotherapy goals, which included strength, range of movement, control, speed, and activation of the kinetic chain. A multicentre, randomised, prospective controlled trial investigated the use of exergames in patients with Shoulder Impingement Syndrome who had undergone Arthroscopic Subacromial Decompression surgery. The intervention group was provided with the automated sensor-based technology, allowing them to perform exergames and track their rehabilitation progress. The control group was treated with standard physiotherapy protocols. Outcomes from different domains were used to compare the groups. An important metric was the assessment of shoulder range of movement pre- and post-operatively. The range-of-movement data included abduction, forward flexion, and external rotation, measured by the software pre-operatively and at 6 and 12 weeks post-operatively. Results: Both groups showed significant improvement from pre-operative to 12 weeks in elevation in the forward flexion and abduction planes. Abduction improved in both the intervention group (p < 0.015) and the control group (p < 0.003), as did forward flexion (intervention group p < 0.0201; control group p < 0.004).
There was, however, no significant difference between the groups at 12 weeks for abduction (p = 0.118), forward flexion (p = 0.190), or external rotation (p = 0.347). Conclusion: Exergames may be used as an alternative to standard physiotherapy regimes; however, further analysis is required, focusing on patient engagement.
Keywords: shoulder, physiotherapy, exergames, gamification
Procedia PDF Downloads 197
263 Multi Data Management Systems in a Cluster Randomized Trial in Poor Resource Setting: The Pneumococcal Vaccine Schedules Trial
Authors: Abdoullah Nyassi, Golam Sarwar, Sarra Baldeh, Mamadou S. K. Jallow, Bai Lamin Dondeh, Isaac Osei, Grant A. Mackenzie
Abstract:
A randomized controlled trial is the "gold standard" for evaluating the efficacy of an intervention. Large-scale, cluster-randomized trials, however, are expensive and difficult to conduct. To guarantee the validity and generalizability of findings, high-quality, dependable, and accurate data management systems are necessary: robust data management is crucial for optimizing and validating the quality, accuracy, and dependability of trial data. Literature on the difficulties of data collection in clinical trials in low-resource settings is scarce, which may raise concerns. Effective data management systems and implementation goals should be part of trial procedures, and publicizing the creative clinical data management techniques used in clinical trials should boost confidence in a study's conclusions and encourage replication. This report details the development and deployment of multiple data management systems and methodologies in the ongoing pneumococcal vaccine schedules trial in rural Gambia. We implemented six different data management, synchronization, and reporting systems using Microsoft Access, REDCap, SQL, Visual Basic, Ruby, and ASP.NET. Additionally, data synchronization tools were developed to integrate data from these systems into the central server for the reporting systems. Clinician, laboratory, and field data validation systems and methodologies are the main topics of this report. Our process development efforts across all domains were driven by the complexity of research data collected in real time, online reporting, data synchronization, and methods for cleaning and verifying data. Consequently, we effectively used multiple data management systems, demonstrating the value of creative approaches in enhancing the consistency, accuracy, and reporting of trial data in a resource-poor setting.
Keywords: data management, data collection, data cleaning, cluster-randomized trial
Procedia PDF Downloads 28
262 IoT Based Approach to Healthcare System for a Quadriplegic Patient Using EEG
Authors: R. Gautam, P. Sastha Kanagasabai, G. N. Rathna
Abstract:
The proposed healthcare system enables quadriplegic patients and people with severe motor disabilities to send commands to electronic devices and monitor their vitals. The growth of the brain-computer interface (BCI) has led to rapid development in 'assistive systems' for the disabled, called 'assistive domotics'. A brain-computer interface is capable of reading the brainwaves of an individual and analysing them to obtain meaningful data. This processed data can be used to help people with speech disorders, and sometimes people with limited locomotion, to communicate. In this project, the Emotiv EPOC headset is used to obtain the electroencephalogram (EEG). The obtained data is processed to communicate pre-defined commands over the internet to the desired mobile phone user. Other vital information such as heartbeat, blood pressure, ECG, and body temperature is monitored and uploaded to the server. Data analytics enables physicians to scan databases for a specific illness. The data is processed on the Intel Edison system-on-chip (SoC), and patient metrics are displayed via the Intel IoT Analytics cloud service.
Keywords: brain computer interface, Intel Edison, Emotiv EPOC, IoT analytics, electroencephalogram
Procedia PDF Downloads 186
261 A Real Time Monitoring System of the Supply Chain Conditions, Products and Means of Transport
Authors: Dimitris E. Kontaxis, George Litainas, Dimitris P. Ptochos
Abstract:
Real-time monitoring of supply chain conditions and procedures is a critical element for the optimal coordination and safety of deliveries, as well as for the minimization of delivery time and cost. Real-time monitoring requires IoT data streams related to the conditions of the products and the means of transport (e.g., location, temperature/humidity conditions, kinematic state, ambient light conditions, etc.). These streams are generated by battery-powered IoT tracking devices, equipped with appropriate sensors, and are transmitted to a cloud-based back-end system. Proper handling and processing of the IoT data streams, using predictive and artificial intelligence algorithms, can provide significant and useful results, which can be exploited by supply chain stakeholders to enhance their financial benefits, as well as the efficiency, security, transparency, coordination, and sustainability of supply chain procedures. The technology, features, and characteristics of a complete, proprietary system, including hardware, firmware, and software tools (developed in the context of a co-funded R&D programme), are addressed and presented in this paper.
Keywords: IoT embedded electronics, real-time monitoring, tracking device, sensor platform
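The kind of IoT data stream the abstract describes (location, temperature/humidity, kinematic state, ambient light) can be sketched as a simple JSON payload. The field names, units, and the `truck-42` device id below are illustrative assumptions, not the system's actual schema.

```python
import json
import time

# Hedged sketch: a minimal tracking-device payload for the condition streams
# described above. All field names and units are hypothetical.
def build_payload(device_id, lat, lon, temp_c, humidity_pct, accel_g, lux):
    return json.dumps({
        "device": device_id,
        "ts": int(time.time()),          # Unix timestamp of the reading
        "location": {"lat": lat, "lon": lon},
        "temperature_c": temp_c,         # cargo temperature
        "humidity_pct": humidity_pct,    # cargo humidity
        "accel_g": accel_g,              # kinematic state (acceleration)
        "ambient_lux": lux,              # ambient light (e.g., door-open events)
    })

# A device would transmit such a message periodically to the cloud back end:
msg = build_payload("truck-42", 37.98, 23.73, 4.5, 62.0, 0.12, 180)
```

A back-end consumer would parse each message and feed the fields into the predictive algorithms mentioned in the abstract.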
Procedia PDF Downloads 178
260 Impact of Green Bonds Issuance on Stock Prices: An Event Study on Respective Indian Companies
Authors: S. L. Tulasi Devi, Shivam Azad
Abstract:
The primary objective of this study is to analyze the impact of green bond issuance on the stock prices of the respective Indian companies. An event study methodology has been employed. For in-depth analysis, this paper used different event windows, including 15-15 days, 10-10 days, 7-7 days, 6-6 days, and 5-5 days; for further clarity, it also used an uneven window of 7-5 days. The study covered all companies that issued green bonds during the period 2017-2022 (Adani Green Energy, State Bank of India, Power Finance Corporation, Jain Irrigation, and Rural Electrification Corporation), except the Indian Renewable Energy Development Agency and the Indian Railway Finance Corporation, which were excluded because of data unavailability. The paper used all three event study methods discussed in the earlier literature: 1) the constant return model, 2) the market-adjusted model, and 3) the capital asset pricing model (CAPM). For a fruitful comparison of results, the study considered the cumulative abnormal return (CAR) and buy-and-hold abnormal return (BHAR) methodologies. A two-tailed t-statistic was used to check statistical significance, and all statistical calculations were performed in Microsoft Excel 2016. The study found that all companies showed positive returns on the event day except the State Bank of India. The results demonstrated that the constant return model outperformed the market-adjusted model and the CAPM. The p-values derived from all the methods showed an almost insignificant impact of green bond issuance on the stock prices of the respective companies. The overall analysis indicates that there is not much improvement in the market efficiency of the Indian stock markets.
Keywords: green bonds, event study methodology, constant return model, market-adjusted model, CAPM
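The constant return model used in the abstract can be sketched in a few lines: the expected return is the mean return over an estimation window, the abnormal return is the actual return minus that mean, and CAR is the sum of abnormal returns over the event window. The daily return figures and window placements below are hypothetical illustrations, not data from the study.

```python
# Hedged sketch of an event-study CAR under the constant (mean) return model.
def constant_return_car(returns, estimation_window, event_window):
    """returns: list of daily returns; windows: (start, end) index pairs, inclusive."""
    est = returns[estimation_window[0]:estimation_window[1] + 1]
    mu = sum(est) / len(est)                 # constant expected return
    evt = returns[event_window[0]:event_window[1] + 1]
    abnormal = [r - mu for r in evt]         # abnormal returns in the event window
    return sum(abnormal)                     # cumulative abnormal return (CAR)

# Illustrative daily returns around a hypothetical issuance event:
rets = [0.01, -0.02, 0.00, 0.015, -0.005, 0.01, 0.0, -0.01, 0.03, 0.005, -0.01]
car = constant_return_car(rets, estimation_window=(0, 5), event_window=(6, 10))
```

The market-adjusted model and CAPM differ only in how the expected return `mu` is computed (market index return, or alpha plus beta times the market return, respectively).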
Procedia PDF Downloads 98
259 Intelligent Technology for Real-Time Monitor and Data Analysis of the Aquaculture Toxic Water Concentration
Authors: Chin-Yuan Hsieh, Wei-Chun Lu, Yu-Hong Zeng
Abstract:
Mass fish die-offs are frequently caused by disease resulting from the deterioration of aquaculture water quality. Toxic ammonia is produced by the animals as a byproduct of protein metabolism. The system is designed with smart sensor technology and uses a mathematical model to monitor the water parameters 24 hours a day and to predict the relationship among twelve water quality parameters. All measured data are stored in a cloud server. In productive ponds, the daytime pH may be high enough to be lethal to the fish, and a sudden change in aquaculture conditions often results in an increase in the pH of the water, a drop in dissolved oxygen content, water quality deterioration, and yield reduction. In real measurements, the system successfully sent a message to the user's smartphone when water quality deteriorated. Comparing measurements with model simulations at a fish aquaculture site, the difference in parameters is less than 2% and the correlation coefficient is at least 98.34%. The solubility of oxygen decreases exponentially with the elevation of water temperature, with a correlation coefficient of 98.98%.
Keywords: aquaculture, sensor, ammonia, dissolved oxygen
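The reported exponential decrease of oxygen solubility with water temperature can be illustrated with a simple decay model and an alerting rule of the kind the system describes. The coefficients `A_MG_PER_L`, `B_PER_DEG`, and the 5 mg/L threshold are illustrative assumptions for the sketch, not values from the study.

```python
import math

# Hedged sketch: dissolved-oxygen (DO) saturation modelled as exponential decay
# with water temperature, DO(T) = a * exp(-b * T). Coefficients are hypothetical.
A_MG_PER_L = 14.6   # assumed DO saturation at 0 °C (mg/L)
B_PER_DEG = 0.02    # assumed decay coefficient (1/°C)

def do_saturation(temp_c):
    """Approximate DO saturation (mg/L) at water temperature temp_c (°C)."""
    return A_MG_PER_L * math.exp(-B_PER_DEG * temp_c)

def alert_needed(temp_c, threshold_mg_l=5.0):
    """Flag when predicted DO falls below an assumed safe threshold for fish."""
    return do_saturation(temp_c) < threshold_mg_l
```

In a monitoring system of this kind, `alert_needed` is the point where a cloud back end would push a message to the user's smartphone.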
Procedia PDF Downloads 284
258 Prevalence of Pretreatment Drug HIV-1 Mutations in Moscow, Russia
Authors: Daria Zabolotnaya, Svetlana Degtyareva, Veronika Kanestri, Danila Konnov
Abstract:
An adequate choice of the initial antiretroviral treatment determines treatment efficacy. In the Russian clinical guidelines, non-nucleoside reverse transcriptase inhibitors (NNRTIs) are still considered an option for first-line treatment, while pretreatment drug resistance (PDR) testing is not routinely performed. We conducted a retrospective cohort study of HIV-positive, treatment-naïve patients of the H-clinic (Moscow, Russia) who underwent PDR testing from July 2017 to November 2021. All information was obtained anonymously from medical records. We analyzed mutations in the reverse transcriptase and protease genes. RT sequences were obtained with the AmpliSens HIV-Resist-Seq kit. Drug resistance was defined using the HIVdb Program v. 8.9-1, and PDR was estimated using the Stanford algorithm. Descriptive statistics were performed in Excel (Microsoft Office, 2019). A total of 261 HIV-1 infected patients were enrolled in the study, including 197 (75.5%) men and 64 (24.5%) women. The mean age was 34.6±8.3 years, and the median CD4 count was 521 cells/µl (IQR 367-687 cells/µl). Data on risk factors for HIV infection were scarce. In total, 75 strains (28.7%) contained mutations in the reverse transcriptase gene; of these, 5 (1.9%) mutations were associated with PDR to nucleoside reverse transcriptase inhibitors (NRTIs) and 30 (11.5%) with PDR to NNRTIs. Forty-three strains (16.5%) had mutations in the protease gene; of these, only 3 (1.1%) mutations were associated with resistance to protease inhibitors. For NNRTIs, the most prevalent PDR mutations were E138A and V106I. Most HIV variants exhibited a single PDR mutation; 2 mutations were found in 3 samples. Most HIV variants with a PDR mutation displayed resistance mutations to a single drug class; 2/37 (5.4%) strains had both NRTI and NNRTI mutations. No strains were identified with PDR mutations to all three drug classes.
Although earlier data demonstrated a lower level of PDR in the HIV treatment-naïve population in Russia, and our cohort may not be fully representative since it was drawn from a private clinic, it reflects a trend of increasing PDR, especially to NNRTIs. We therefore consider it necessary either to perform pretreatment resistance testing or to give priority to other drug classes as first-line treatment.
Keywords: HIV, resistance, mutations, treatment
Procedia PDF Downloads 95
257 Machine Learning Assisted Performance Optimization in Memory Tiering
Authors: Derssie Mebratu
Abstract:
As a large variety of microservices, web services, social graph applications, and media applications are continuously developed, it is vital to design and build reliable, efficient, and fast memory tiering systems. Despite limited design, implementation, and deployment experience in the last few years, several techniques are currently being developed to improve memory tiering systems in the cloud. These techniques include choosing an optimal scanning frequency; improving the tracking of page movement; identifying recently accessed pages; storing pages across the tiers; and classifying pages as hot, warm, and cold, so that hot pages are stored in the first tier, Dynamic Random Access Memory (DRAM), warm pages in the second tier, Compute Express Link (CXL) attached memory, and cold pages in the third tier, Non-Volatile Memory (NVM). Beyond the current proposals and implementations, we also develop a new technique based on a machine learning algorithm that improves throughput by 25% and latency by 95% compared to the baseline.
Keywords: machine learning, Bayesian optimization, memory tiering, CXL, DRAM
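The hot/warm/cold page-placement policy described in the abstract can be sketched as a threshold classifier over per-page access counts. The thresholds and page ids below are illustrative assumptions; a real system would tune such parameters (for example, via the Bayesian optimization the keywords mention) rather than hard-code them.

```python
# Hedged sketch: classify pages by recent access counts and assign them to
# tiers (DRAM for hot, CXL-attached memory for warm, NVM for cold).
HOT_THRESHOLD = 100    # accesses per scan interval (assumed)
WARM_THRESHOLD = 10    # accesses per scan interval (assumed)

def classify_page(access_count):
    if access_count >= HOT_THRESHOLD:
        return "DRAM"    # tier 1: hot pages
    if access_count >= WARM_THRESHOLD:
        return "CXL"     # tier 2: warm pages
    return "NVM"         # tier 3: cold pages

def place_pages(access_counts):
    """Map page id -> tier for a dict of per-page access counts."""
    return {page: classify_page(cnt) for page, cnt in access_counts.items()}

# Hypothetical scan result: one hot, one warm, one cold page.
placement = place_pages({"p1": 250, "p2": 42, "p3": 3})
```

An ML-assisted variant would replace the fixed thresholds with values learned from observed throughput and latency, re-evaluated each scan interval.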
Procedia PDF Downloads 96