Search results for: multivariate time series data
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 37847

32447 Virtual Reality and Avatars in Education

Authors: Michael Brazley

Abstract:

Virtual Reality (VR) and 3D videos are the most current generation of learning technology today. They are now used in professional offices and schools for marketing and education. Technology in the field of design has progressed from two-dimensional drawings to 3D models, using computers and sophisticated software. Virtual Reality is used as a collaborative medium that allows designers and others to meet and communicate inside models or VR platforms using avatars. This research proposes to teach students from different backgrounds how to take a digital model into a 3D video, then into VR, and finally into VR with multiple avatars communicating with each other in real time. The next step is to develop the model so that people in three or more different locations can meet as avatars in real time, in the same model, and talk to each other. This research is longitudinal, studying the use of 3D videos in graduate design and Virtual Reality in XR (Extended Reality) courses. The research methodology combines quantitative and qualitative methods: the qualitative methods begin with the literature review and case studies, while the quantitative methods draw on students' 3D videos, a survey, and Extended Reality (XR) coursework. The end product is a VR platform in which multiple avatars can communicate in real time. This research is important because it allows multiple users to remotely enter a model or VR platform from any location in the world and communicate effectively in real time. It will lead to improved learning and training using Virtual Reality and avatars, and is generalizable because most colleges and universities, and many citizens, own VR equipment and computer labs. The research did produce a VR platform in which multiple avatars can move and speak to each other in real time.
Major implications of the research include, but are not limited to, improved learning, teaching, communication, marketing, designing, and planning. Both hardware and software played a major role in the project's success.

Keywords: virtual reality, avatars, education, XR

Procedia PDF Downloads 80
32446 Lessons Learned from Implementation of Remote Pregnant and Newborn Care Service for Vulnerable Women and Children During COVID-19 and Political Crisis in Myanmar

Authors: Wint Wint Thu, Htet Ko Ko Win, Myat Mon San, Zaw Lin Tun, Nandar Than Aye, Khin Nyein Myat, Hayman Nyo Oo, Nay Aung Lin, Kusum Thapa, Kyaw Htet Aung

Abstract:

Background: In Myanmar, intense political instability began in February 2021, while waves of the COVID-19 pandemic were also threatening the public health system; together these led to a severe health-sector crisis, including difficulties in accessing maternal and newborn health care for vulnerable women and children. The Remote Pregnant and Newborn Care (RPNC) service uses a telehealth approach under the United States Agency for International Development (USAID)-funded Essential Health Project. Implementation: The RPNC service was adapted to the MNCH needs of vulnerable pregnant women and implemented to mitigate the risk of limited access to essential quality MNH care in Yangon, Myanmar. The project trained 13 service providers on a telehealth care package for pregnancy and newborn care developed by Jhpiego, to ensure understanding of evidence-based MNCH care practices. The phone numbers of the pregnant women were gathered through pre-existing, functioning community volunteers, who reach the most vulnerable pregnant women in the project's targeted area. A total of 212 pregnant women were reached by service providers for RPNC during the implementation period. The trained service providers offer quality antenatal and postnatal care, including newborn care, via telephone calls: 24/7 incoming calls and time-allotted outgoing calls to the women during the antenatal and postnatal periods. The required data were collected daily alongside the calls, and the quality of the medical services was assured by tracking the calls while ensuring data privacy and patient confidentiality.
Lessons learned: The key lessons are 1) cost-effectiveness: the RPNC service can reduce pregnant women's out-of-pocket expenditure, as a telehealth call costs 1.6 United States dollars (USD) while a single in-person visit at a private provider costs 8 to 10 USD including transportation; 2) network of care: telehealth calls cannot replace in-person antenatal and postnatal care services, and integrating telehealth calls with in-person care by local healthcare providers, with the support of the community, is crucial for accessibility to essential MNH services by poor and vulnerable women; and 3) sharing information on health access points: most of the women appear to face financial barriers in accessing private health facilities while the public health system has collapsed, and telehealth care can provide information on low-cost facilities and connect women to relevant health facilities. These lessons are important for future efforts to implement remote pregnancy and newborn care in Myanmar, especially during the political crisis and the COVID-19 pandemic.

Keywords: telehealth, accessibility, maternal care, newborn care

Procedia PDF Downloads 77
32445 An Empirical Investigation of Big Data Analytics: The Financial Performance of Users versus Vendors

Authors: Evisa Mitrou, Nicholas Tsitsianis, Supriya Shinde

Abstract:

In the age of digitisation and globalisation, businesses have shifted online and are investing in big data analytics (BDA) to respond to changing market conditions and sustain their performance. Our study shifts the focus from the adoption of BDA to its impact on financial performance. We explore the financial performance of both BDA-vendors (business-to-business) and BDA-clients (business-to-customer). We distinguish among five BDA technologies (big-data-as-a-service (BDaaS) and descriptive, diagnostic, predictive, and prescriptive analytics) and discuss them individually. Further, we use four perspectives (internal business process, learning and growth, customer, and finance) and discuss how each of the five BDA technologies affects the performance measures of these four perspectives. We also present an analysis of employee engagement, average turnover, average net income, and average net assets for BDA-clients and BDA-vendors. Our study also explores the effect of the COVID-19 pandemic on business continuity for both BDA-vendors and BDA-clients.

Keywords: BDA-clients, BDA-vendors, big data analytics, financial performance

Procedia PDF Downloads 107
32444 Implementation of Lean Tools (Value Stream Mapping and ECRS) in an Oil Refinery

Authors: Ronita Singh, Yaman Pattanaik, Soham Lalwala

Abstract:

In today's highly competitive business environment, every organization is striving towards lean manufacturing systems to achieve lower production lead times, lower costs, less inventory, and overall improvement in supply-chain efficiency. Along these lines, this paper presents a practical application of the Value Stream Mapping (VSM) tool and the ECRS (Eliminate, Combine, Reduce, and Simplify) technique in the receipt section of the material management center of an oil refinery. A value stream is the assortment of all actions (value-added as well as non-value-added) required to bring a product through the essential flows, starting with raw material and ending with the customer. To draw the current-state value stream map, all relevant data on the receipt cycle were collected and analyzed. The current-state map was then analyzed to determine the type and quantum of waste at every stage, which helped ascertain how far the warehouse was from the concept of lean manufacturing. The current VSM showed that two processes, preparation of the GRN (Goods Receipt Number) and preparation of the UD (Usage Decision), are bottleneck operations with higher cycle times. Root-cause analysis of the various types of waste helped in designing a strategy for step-wise implementation of lean tools. The future state thus created a lean flow of materials at the warehouse center, reducing the lead time of the receipt cycle from 11 days to 7 days and increasing overall efficiency by 27.27%.

Keywords: current VSM, ECRS, future VSM, receipt cycle, supply chain, VSM

Procedia PDF Downloads 284
32443 Scheduling Nodes Activity and Data Communication for Target Tracking in Wireless Sensor Networks

Authors: AmirHossein Mohajerzadeh, Mohammad Alishahi, Saeed Aslishahi, Mohsen Zabihi

Abstract:

In this paper, we consider sensor nodes capable of measuring bearings (the relative angle to the target). We use geometric methods to select a set of observer nodes responsible for collecting data from the target. Given the characteristics of target tracking applications, significant numbers of sensor nodes are usually inactive. Therefore, to minimize total network energy consumption, a set of sensor nodes, called sentinels, is periodically selected for monitoring, controlling the environment, and transmitting data through the network; the other nodes remain inactive. Furthermore, the proposed algorithm provides joint scheduling and routing to transmit data between network nodes and the fusion center (FC), which not only provides an efficient way to estimate the target position but also enables efficient target tracking. Performance evaluation confirms the superiority of the proposed algorithm.

Keywords: coverage, routing, scheduling, target tracking, wireless sensor networks

Procedia PDF Downloads 362
32442 A Low-Cost Foot Plantar Shoe for Gait Analysis

Authors: Zulkifli Ahmad, Mohd Razlan Azizan, Nasrul Hadi Johari

Abstract:

This paper presents the development and testing of a wearable sensor system for gait analysis measurement. For validation, plantar surface measurement by force plate was prepared. In general gait analysis, force-plate studies concern barefoot whole steps and do not allow analysis of repeated steps in normal walking and running. The measurements usually performed do not represent the whole of daily plantar pressures in the shoe insole and obtain only the ground reaction force. Force-plate measurement is usually limited to a few steps, is done indoors, and does not easily capture coupled information from both feet during walking. Nowadays, pressure can be measured over a large number of steps, and in each part of the insole, by placing sensors within the insole. This method makes it possible to determine the plantar pressures of a shoe-wearing subject while standing, walking, or running. Inserting pressure sensors in the insole provides location-specific information, so the placement of the sensors determines which critical regions under the insole are captured. In this wearable shoe sensor project, the device consists of left and right shoe insoles, each with ten force-sensitive resistors (FSRs). An Arduino Mega was used as a microcontroller to read the analog inputs from the FSRs. The analog inputs were transmitted via Bluetooth, providing the force data in real time on a smartphone. Blueterm, an Android application, was used as the interface to read the FSR values from the shoe-wearing subject. The subjects were two healthy men of different ages and weights, tested while standing, walking (1.5 m/s), jogging (5 m/s), and running (9 m/s) on a treadmill. The data were saved on the Android device for analysis and comparison.

Keywords: gait analysis, plantar pressure, force plate, wearable sensor

Procedia PDF Downloads 427
32441 Elements of Usability and Sociability in Activity Management System for e-Masjid

Authors: Hidayah bt Rahmalan, Marhazli Kipli, Muhammad Suffian Sikandar Ghani, Maisarah Abu, Muhammad Faisal Ashaari, Norlizam Md Sukiban

Abstract:

This study presents an example of an activity management system for e-Masjid that implements elements of usability and sociability. It is expected to resolve a shortcoming of most e-Masjid systems, which offer many activities to their community: the data involved in handling many activities or events with many participants are difficult to manage. This paper therefore presents the usability and sociability elements of an activity management system that not only eases the job for the user but is also practical when the community joins future events. For the time being, the system has been applied only at Sayyidina Abu Bakar Mosque in UTeM, Malacca.

Keywords: e-masjid, usability, sociability, activity management system

Procedia PDF Downloads 347
32440 Urban Big Data: An Experimental Approach to Building-Value Estimation Using Web-Based Data

Authors: Sun-Young Jang, Sung-Ah Kim, Dongyoun Shin

Abstract:

Current real-estate value estimation is difficult for laymen and usually performed by specialists. This paper presents an automated estimation process, based on big data and machine-learning technology, that calculates the influence of building conditions on real-estate price. The present study analyzed actual building sales data for Nonhyeon-dong, Gangnam-gu, Seoul, Korea, measuring the major influencing factors among the various building conditions. From that analysis, a prediction model was established and applied using RapidMiner Studio, a graphical user interface (GUI)-based tool for deriving machine-learning prototypes. The prediction model is formulated by reference to previous examples; when new examples are applied, it analyzes and predicts accordingly. The analysis process discerns the crucial factors affecting price increases by calculating weighted values. The model was verified, and its accuracy determined, by comparing its predicted values with actual price increases.
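The weighted-value idea above can be sketched in plain Python. This is an illustrative least-squares fit only, not the authors' RapidMiner workflow: the feature (floor area), the prices, and the prediction target are all invented stand-ins for the Nonhyeon-dong sales data.

```python
# Hypothetical sketch: fit y = a*x + b by ordinary least squares, where the
# slope a plays the role of a weighted influence of one building condition
# (here, floor area) on sale price. All numbers below are made up.

def fit_linear(xs, ys):
    """Closed-form ordinary least squares for a single feature."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var = sum((x - mx) ** 2 for x in xs)
    a = cov / var          # influence (weight) of the feature on price
    b = my - a * mx        # intercept
    return a, b

# Invented data: floor area (m^2) vs. sale price (million KRW).
area = [60, 85, 100, 120, 140]
price = [450, 640, 750, 910, 1050]

a, b = fit_linear(area, price)
predicted = a * 110 + b    # predicted price of a 110 m^2 unit
```

In the paper's setting the same principle extends to many building conditions at once, with the model verified against actual price increases.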

Keywords: apartment complex, big data, life-cycle building value analysis, machine learning

Procedia PDF Downloads 357
32439 Estimation of Consolidating Settlement Based on a Time-Dependent Skin Friction Model Considering Column Surface Roughness

Authors: Jiang Zhenbo, Ishikura Ryohei, Yasufuku Noriyuki

Abstract:

Improvement of soft clay deposits by combining surface stabilization with floating-type cement-treated columns is one of the most popular techniques worldwide. On the basis of a one-dimensional consolidation model, a time-dependent skin friction model for column-soil interaction is proposed. The nonlinear relationship between column shaft shear stress and the effective vertical pressure of the surrounding soil can be described by this model. The influence of column-soil surface roughness is represented by a roughness coefficient R, which plays an important role in the design of column length. Based on the homogenization method, part of the floating-type improved ground is treated as an unimproved portion, whose length αH₁ is defined as a time-dependent equivalent skin friction length. The compression settlement of this unimproved portion can be predicted using only the soft clay parameters. Apart from calculating the settlement of this composite ground, the load transfer mechanism is discussed using model tests. The proposed model is validated by comparison with calculations and laboratory results of model and ring shear tests, which indicate the suitability and accuracy of the solutions in this paper.

Keywords: floating type improved foundation, time-dependent skin friction, roughness, consolidation

Procedia PDF Downloads 454
32438 Blockchain Technology Security Evaluation: Voting System Based on Blockchain

Authors: Omid Amini

Abstract:

Nowadays, technology plays the most important role in human life because people use it to share data and to communicate with each other, but the challenge is the security of these data. As more people turn to technology, more data are generated and more hackers try to steal or infiltrate them. In addition, data under the control of a central authority carry the risk of information being lost or altered, which can create widespread anxiety in different communities. In this paper, we investigate blockchain technology, which can guarantee information security and eliminate the challenge of central-authority access to information. People currently suffer from the existing voting system: its lack of transparency is a big problem for society and government in most countries. Blockchain technology can be the best alternative to previous voting methods because it removes this most important challenge for voting. According to the results, this research can be a good starting point for becoming acquainted with this new technology, especially its security aspects, and with how to use a blockchain-based voting system. The research concludes that the use of blockchain technology can solve the major security problem and lead to a secure and transparent election.

Keywords: blockchain, technology, security, information, voting system, transparency

Procedia PDF Downloads 108
32437 Integration of UPQC Based on Fuzzy Controller for Power Quality Enhancement in Distributed Network

Authors: M. Habab, C. Benachaiba, B. Mazari, H. Madi, C. Benoudjafer

Abstract:

The use of Distributed Generation (DG) has been increasing in recent years to fill the gap between energy supply and demand. This paper presents a grid-connected wind energy system with a UPQC based on a fuzzy controller to compensate for voltage and current disturbances. The proposed system can improve power quality at the point of installation on power distribution systems. Simulation results show the capability of the DG-UPQC intelligent system to compensate for voltage sags and current harmonics at the Point of Common Coupling (PCC).

Keywords: shunt active filter, series active filter, UPQC, power quality, voltage sags, distributed generation, wind turbine

Procedia PDF Downloads 392
32436 PaSA: A Dataset for Patent Sentiment Analysis to Highlight Patent Paragraphs

Authors: Renukswamy Chikkamath, Vishvapalsinhji Ramsinh Parmar, Christoph Hewel, Markus Endres

Abstract:

Given a patent document, identifying distinct semantic annotations is an interesting research problem. Text annotation helps patent practitioners such as examiners and patent attorneys quickly identify the key arguments of any invention, providing timely marking of the patent text. In manual patent analysis, it is common practice to mark paragraphs to recognize semantic information and improve readability; this semantic annotation process is laborious and time-consuming. To alleviate the problem, we propose a dataset for training machine learning algorithms to automate the highlighting process. The contributions of this work are: i) a multi-class dataset of 150k samples, developed by traversing USPTO patents over a decade; ii) statistics and distributions of the data, articulated through exploratory data analysis; iii) baseline machine learning models that use the dataset to address the patent paragraph highlighting task; and iv) a future path for extending this work with deep learning and domain-specific pre-trained language models to develop a highlighting tool. This work assists patent practitioners in highlighting semantic information automatically and aids in creating sustainable and efficient patent analysis using machine learning.
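The baseline classification task described above can be sketched with a minimal multinomial Naive Bayes over bag-of-words features. This is not the paper's model or data: the labels, sample texts, and query below are invented to show only the shape of a paragraph-highlighting classifier.

```python
# Hypothetical sketch: a tiny multinomial Naive Bayes classifier assigning a
# semantic label to a patent paragraph. Labels and texts are invented.
import math
from collections import Counter, defaultdict

def tokenize(text):
    return text.lower().split()

def train(samples):
    """samples: list of (text, label). Returns class priors and per-class word counts."""
    priors = Counter(label for _, label in samples)
    counts = defaultdict(Counter)
    for text, label in samples:
        counts[label].update(tokenize(text))
    return priors, counts

def predict(priors, counts, text, alpha=1.0):
    """Pick the label maximizing log P(label) + sum log P(word|label), Laplace-smoothed."""
    vocab = {w for c in counts.values() for w in c}
    total = sum(priors.values())
    best, best_lp = None, float("-inf")
    for label in priors:
        lp = math.log(priors[label] / total)
        denom = sum(counts[label].values()) + alpha * len(vocab)
        for w in tokenize(text):
            lp += math.log((counts[label][w] + alpha) / denom)
        if lp > best_lp:
            best, best_lp = label, lp
    return best

# Invented training paragraphs with two of the many possible semantic classes.
data = [
    ("the invention solves the problem of battery drain", "technical_problem"),
    ("a novel method reduces battery drain", "technical_problem"),
    ("claim 1 recites a housing and a processor", "claim_language"),
    ("the claims recite a sensor coupled to a housing", "claim_language"),
]
priors, counts = train(data)
label = predict(priors, counts, "this invention solves battery drain")
```

A real baseline on the 150k-sample dataset would use richer features and stronger models, but the train/predict split above is the core of any such highlighting pipeline.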

Keywords: machine learning, patents, patent sentiment analysis, patent information retrieval

Procedia PDF Downloads 70
32435 Phenotype Prediction of DNA Sequence Data: A Machine and Statistical Learning Approach

Authors: Mpho Mokoatle, Darlington Mapiye, James Mashiyane, Stephanie Muller, Gciniwe Dlamini

Abstract:

Great advances in high-throughput sequencing technologies have made huge amounts of sequencing data available in public and private repositories, enabling a holistic understanding of complex biological phenomena. Sequence data are used for a wide range of applications such as gene annotation, expression studies, personalized treatment, and precision medicine. However, this rapid growth in sequence data poses a great challenge, calling for novel data processing and analytic methods as well as large computing resources. In this work, a machine and statistical learning approach for DNA sequence classification based on a k-mer representation of sequence data is proposed. The approach is tested using whole genome sequences of Mycobacterium tuberculosis (MTB) isolates to (i) reduce the size of genomic sequence data, (ii) identify an optimum k-mer size and use it to build classification models, (iii) predict the phenotype from the whole genome sequence of a given bacterial isolate, and (iv) demonstrate the computing challenges associated with analyzing whole genome sequence data to produce interpretable and explainable insights. The classification models were trained on 104 whole genome sequences of MTB isolates. Cluster analysis showed that k-mers may be used to discriminate phenotypes, and the discrimination becomes more concise as the k-mer size increases. The best performing classification model had a k-mer size of 10 (the longest k-mer), with an accuracy, recall, precision, specificity, and Matthews correlation coefficient of 72.0%, 80.5%, 80.5%, 63.6%, and 0.4, respectively. This study provides a comprehensive approach for resampling whole genome sequencing data, objectively selecting a k-mer size, and performing classification for phenotype prediction.
The analysis also highlights the importance of increasing the k-mer size to produce more biologically explainable results, bringing to the fore the interplay among accuracy, computing resources, and explainability of classification results. Moreover, the analysis provides a new way to elucidate genetic information from genomic data and to identify phenotype relationships, which is important especially in explaining complex biological mechanisms.
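The k-mer representation at the heart of the approach can be sketched in a few lines. This is a generic illustration, not the authors' pipeline: the sequences below are short invented stand-ins for MTB whole genomes, and overlapping k-mers are assumed.

```python
# Sketch of k-mer feature extraction: count all overlapping substrings of
# length k, then turn each sequence into a fixed-order count vector over a
# shared k-mer vocabulary, ready for a downstream classifier.
from collections import Counter

def kmer_counts(seq, k):
    """Count all overlapping k-mers in a DNA sequence."""
    return Counter(seq[i:i + k] for i in range(len(seq) - k + 1))

def kmer_vector(seq, k, vocabulary):
    """Fixed-order feature vector over a shared k-mer vocabulary."""
    counts = kmer_counts(seq, k)
    return [counts[kmer] for kmer in vocabulary]

# Invented toy sequences standing in for whole genome data.
seq_a = "ATGCGATATGCG"
seq_b = "TTTTAACGTTTT"
k = 3
vocab = sorted(set(kmer_counts(seq_a, k)) | set(kmer_counts(seq_b, k)))
vec_a = kmer_vector(seq_a, k, vocab)
vec_b = kmer_vector(seq_b, k, vocab)
```

For real genomes the vocabulary grows quickly with k (up to 4^k possible k-mers), which is exactly the accuracy-versus-computing-resources trade-off the abstract highlights.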

Keywords: AWD-LSTM, bootstrapping, k-mers, next generation sequencing

Procedia PDF Downloads 147
32434 Phenotype Prediction of DNA Sequence Data: A Machine and Statistical Learning Approach

Authors: Darlington Mapiye, Mpho Mokoatle, James Mashiyane, Stephanie Muller, Gciniwe Dlamini

Abstract:

Great advances in high-throughput sequencing technologies have made huge amounts of sequencing data available in public and private repositories, enabling a holistic understanding of complex biological phenomena. Sequence data are used for a wide range of applications such as gene annotation, expression studies, personalized treatment, and precision medicine. However, this rapid growth in sequence data poses a great challenge, calling for novel data processing and analytic methods as well as large computing resources. In this work, a machine and statistical learning approach for DNA sequence classification based on a k-mer representation of sequence data is proposed. The approach is tested using whole genome sequences of Mycobacterium tuberculosis (MTB) isolates to (i) reduce the size of genomic sequence data, (ii) identify an optimum k-mer size and use it to build classification models, (iii) predict the phenotype from the whole genome sequence of a given bacterial isolate, and (iv) demonstrate the computing challenges associated with analyzing whole genome sequence data to produce interpretable and explainable insights. The classification models were trained on 104 whole genome sequences of MTB isolates. Cluster analysis showed that k-mers may be used to discriminate phenotypes, and the discrimination becomes more concise as the k-mer size increases. The best performing classification model had a k-mer size of 10 (the longest k-mer), with an accuracy, recall, precision, specificity, and Matthews correlation coefficient of 72.0%, 80.5%, 80.5%, 63.6%, and 0.4, respectively. This study provides a comprehensive approach for resampling whole genome sequencing data, objectively selecting a k-mer size, and performing classification for phenotype prediction.
The analysis also highlights the importance of increasing the k-mer size to produce more biologically explainable results, bringing to the fore the interplay among accuracy, computing resources, and explainability of classification results. Moreover, the analysis provides a new way to elucidate genetic information from genomic data and to identify phenotype relationships, which is important especially in explaining complex biological mechanisms.

Keywords: AWD-LSTM, bootstrapping, k-mers, next generation sequencing

Procedia PDF Downloads 135
32433 Analysis of Ionosphere Anomaly Before Great Earthquake in Java in 2009 Using GPS TEC Data

Authors: Aldilla Damayanti Purnama Ratri, Hendri Subakti, Buldan Muslim

Abstract:

Ionospheric anomalies arising from earthquake activity are a phenomenon now being studied in seismo-ionospheric coupling. Generally, variations in the ionosphere caused by earthquake activity are weaker than the disturbances generated by other sources, such as geomagnetic storms. However, disturbances from geomagnetic storms show a more global behavior, while seismo-ionospheric anomalies occur only locally, in an area largely determined by the magnitude of the earthquake. This shows that the earthquake signature is unique, and because of this uniqueness much research has been done on it, in the hope of providing clues for early warning before an earthquake. One of the approaches being developed is seismo-ionospheric coupling, which relates the states of the lithosphere, atmosphere, and ionosphere before and during an earthquake. This paper chooses the vertical total electron content (VTEC) of the ionosphere as its parameter. Total Electron Content (TEC) is defined as the number of electrons in a vertical column (cylinder) with a cross-section of 1 m² along the GPS signal trajectory in the ionosphere, at a height of around 350 km. Statistical analysis of data obtained from the LAPAN agency to identify abnormal signals showed an anomaly in the ionosphere, characterized by a decrease of 1 TECU in the ionospheric electron content before the earthquake occurred. The VTEC decrease was not associated with a magnetic storm and is therefore indicated as an earthquake precursor. This is supported by the Dst index, which showed no magnetic disturbance.
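The statistical screening described above can be sketched as a running-statistics outlier test. This is an illustrative assumption about the method, not the authors' exact procedure: the VTEC series (in TECU), the window length, and the threshold multiple below are all invented, with a roughly 1-TECU dip standing in for a pre-earthquake anomaly.

```python
# Hypothetical sketch: flag days whose VTEC departs from the mean of the
# preceding `window` days by more than k standard deviations.
import statistics

def tec_anomalies(vtec, window=5, k=1.5):
    """Return indices of VTEC values lying outside mean +/- k*stdev
    of the preceding `window` samples."""
    flagged = []
    for i in range(window, len(vtec)):
        past = vtec[i - window:i]
        mu = statistics.mean(past)
        sigma = statistics.stdev(past)
        if abs(vtec[i] - mu) > k * sigma:
            flagged.append(i)
    return flagged

# Invented daily VTEC (TECU) with a ~1-TECU dip at index 7.
vtec = [25.1, 25.3, 24.9, 25.2, 25.0, 25.1, 25.2, 24.0, 25.1]
anomalous_days = tec_anomalies(vtec)
```

A real analysis would additionally cross-check each flagged day against the Dst index, as the abstract does, to rule out geomagnetic-storm origins.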

Keywords: earthquake, DST Index, ionosphere, seismoionospheric coupling, VTEC

Procedia PDF Downloads 569
32432 The Effect of Non-Surgical Periodontal Therapy on Metabolic Control in Children

Authors: Areej Al-Khabbaz, Swapna Goerge, Majedah Abdul-Rasoul

Abstract:

Introduction: The most prevalent periodontal disease among children is gingivitis, which usually becomes more severe in adolescence. A number of intervention studies suggest that resolution of periodontal inflammation can improve metabolic control in patients diagnosed with diabetes mellitus. Aim: To assess the effect of non-surgical periodontal therapy on the glycemic control of children diagnosed with diabetes mellitus. Method: Twenty-eight children with an established diagnosis of diabetes mellitus for at least 1 year were recruited. Informed consent and child assent forms were obtained from children and parents prior to enrolment. Dental examination of the participants was performed in the same week, directly following their annual medical assessment. All patients had a glycosylated hemoglobin (HbA1c%) test one week prior to their annual medical and dental visit and again 3 months after non-surgical periodontal therapy. All patients received a comprehensive periodontal examination, which included clinical attachment loss, bleeding on probing, plaque score, plaque index, and gingival index. All patients were referred for non-surgical periodontal therapy, which included oral hygiene instruction and motivation followed by supra-gingival and sub-gingival scaling using ultrasonic and hand instruments. Statistical Analysis: Data were entered and analyzed using the Statistical Package for Social Science software (SPSS, Chicago, USA), version 18. Statistical analysis of the clinical findings was performed to detect differences between the two groups in terms of periodontal findings and HbA1c%. Binary logistic regression analysis was performed to examine which factors remained significant in multivariate analysis after adjusting for confounding.
The regression model used the dependent variable 'improved glycemic control'; the independent variables entered in the model were plaque index, gingival index, bleeding percentage, and plaque score. Statistical significance was set at p < 0.05. Result: A total of 28 children participated, with a mean age of 13.3±1.92 years. The participants were divided into two groups: a compliant group (received dental scaling) and a non-compliant group (received oral hygiene instructions only). No statistical difference was found between the compliant and non-compliant groups in age, gender distribution, oral hygiene practice, or level of diabetes control. There was a significant difference between the groups in the improvement of HbA1c before and after periodontal therapy. Mean gingival index was the only variable significantly associated with improved glycemic control. In conclusion, this study demonstrated that non-surgical mechanical periodontal therapy can improve HbA1c% control. The results confirm that children with diabetes mellitus who are compliant with dental care and receive routine professional scaling may have better metabolic control than diabetic children who are erratic with dental care.

Keywords: children, diabetes, metabolic control, periodontal therapy

Procedia PDF Downloads 139
32431 Time Delayed Susceptible-Vaccinated-Infected-Recovered-Susceptible Epidemic Model along with Nonlinear Incidence and Nonlinear Treatment

Authors: Kanica Goel, Nilam

Abstract:

Infectious diseases are a leading cause of death worldwide and hence a great challenge for every nation; it is therefore essential to prevent and reduce their spread among humans. Mathematical models help us better understand the transmission dynamics and spread of infections. For this purpose, the present article proposes a nonlinear time-delayed SVIRS (Susceptible-Vaccinated-Infected-Recovered-Susceptible) mathematical model with a nonlinear incidence rate and a nonlinear treatment rate. Analytical study shows that the model exhibits two types of equilibrium points: the disease-free equilibrium and the endemic equilibrium. For the long-term behavior of the model, stability is discussed with the help of the basic reproduction number R₀: the disease-free equilibrium is locally asymptotically stable if R₀ is less than one and unstable if R₀ is greater than one, for time lag τ≥0. Furthermore, when R₀ equals one, using center manifold theory and the Castillo-Chavez and Song theorem, we show that the model undergoes a transcritical bifurcation. Moreover, numerical simulations are carried out using MATLAB 2012b to illustrate the theoretical results.
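A delayed SVIRS model of this kind can be integrated numerically with a simple history buffer for the lagged infective term. The sketch below is not the authors' MATLAB code or their exact equations: the saturated incidence form βSI(t−τ)/(1+aI(t−τ)), the linear treatment/recovery term, and all parameter values are illustrative assumptions chosen so that the compartments sum to a constant population.

```python
# Hypothetical sketch: explicit Euler integration of a time-delayed SVIRS-type
# model. The stored I-history supplies the delayed infectives I(t - tau).
def svirs(beta=0.5, a=0.1, phi=0.1, gamma=0.2, delta=0.05, tau=1.0,
          dt=0.01, t_end=50.0):
    steps = int(t_end / dt)
    lag = int(tau / dt)
    S, V, I, R = [0.9], [0.0], [0.1], [0.0]   # initial fractions, summing to 1
    for n in range(steps):
        I_lag = I[n - lag] if n >= lag else I[0]       # I(t - tau), constant history
        new_inf = beta * S[n] * I_lag / (1 + a * I_lag)  # saturated incidence
        dS = -new_inf - phi * S[n] + delta * R[n]      # infection, vaccination, waning
        dV = phi * S[n]                                # vaccinated pool
        dI = new_inf - gamma * I[n]                    # recovery/treatment term
        dR = gamma * I[n] - delta * R[n]
        S.append(S[n] + dt * dS)
        V.append(V[n] + dt * dV)
        I.append(I[n] + dt * dI)
        R.append(R[n] + dt * dR)
    return S, V, I, R

S, V, I, R = svirs()
```

Because the four right-hand sides sum to zero, the total population is conserved at every Euler step, which is a cheap sanity check on any such implementation.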

Keywords: nonlinear incidence rate, nonlinear treatment rate, stability, time delayed SVIRS epidemic model

Procedia PDF Downloads 135
32430 A Molding Surface Auto-inspection System

Authors: Ssu-Han Chen, Der-Baau Perng

Abstract:

The molding process in IC manufacturing secures chips against harm from heat, moisture, or other external forces. While a chip is being molded, defects such as cracks, dilapidation, or voids may be embedded in the molding surface. The molding surfaces this study addresses differ from typical inspected surfaces in that texture similar to defects is present everywhere. Manual inspection usually passes over low-contrast cracks or voids; hence, an automatic optical inspection system for the molding surface is necessary. The proposed system consists of a CCD camera, a coaxial light, a back light, and a motion control unit. Based on the statistical texture properties of the molding surface, a series of digital image processing and classification procedures is carried out. After training the parameters associated with the above algorithms, experimental results suggest that the accuracy rate is up to 93.75%, contributing to the inspection quality of IC molding surfaces.
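As a toy illustration of the statistical-texture idea (not the authors' actual pipeline, which also uses the discrete Fourier transform and a trained classifier), one can flag pixels that deviate strongly from the surface's global texture statistics; the tiny image and the threshold are invented for illustration:

```python
# Sketch: flag pixels whose deviation from the global texture statistics
# exceeds k standard deviations. Image values are illustrative.
def texture_stats(img):
    flat = [p for row in img for p in row]
    n = len(flat)
    mean = sum(flat) / n
    var = sum((p - mean) ** 2 for p in flat) / n
    return mean, var

def defect_mask(img, k=3.0):
    mean, var = texture_stats(img)
    std = var ** 0.5
    return [[abs(p - mean) > k * std for p in row] for row in img]

surface = [
    [120, 118, 122, 119],
    [121, 119, 120, 118],
    [120,  40, 121, 120],   # low-contrast void at row 2, column 1
    [119, 121, 118, 122],
]
mask = defect_mask(surface, k=2.5)
```

In practice the statistics would be computed per local window and combined with frequency-domain features, but the thresholding principle is the same.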

Keywords: molding surface, machine vision, statistical texture, discrete Fourier transformation

Procedia PDF Downloads 413
32429 Management of Myofascial Temporomandibular Disorder in Secondary Care: A Quality Improvement Project

Authors: Rishana Bilimoria, Selina Tang, Sajni Shah, Marianne Henien, Christopher Sproat

Abstract:

Temporomandibular disorders (TMD) may affect up to a third of the general population, and there is evidence demonstrating that the majority of myofascial TMD cases improve after education and conservative measures. In 2015 our department implemented a modified care pathway for myofascial TMD patients in an attempt to improve the patient journey. This involved the use of an interactive group therapy approach to deliver education, reinforce conservative measures and promote self-management. Patient-reported experience measures from the new group clinic revealed 71% patient satisfaction. This service is efficient in improving aspects of health status while reducing health-care costs and redistributing clinical time. Since its establishment, 52 hours of clinical time, resources and funding have been redirected effectively. This Quality Improvement Project was initiated because it was felt that this new service was being underutilised by our surgical teams. The ‘Plan-Do-Study-Act’ (PDSA) framework was employed to analyse utilisation of the service. The ‘Plan’ stage involved outlining our aims: to raise awareness amongst clinicians of the unified care pathway and to increase referral to this clinic. The ‘Do’ stage involved collecting data from a sample of 96 patients over a four-month period to ascertain the proportion of myofascial TMD patients who were correctly referred to the designated clinic. ‘Suitable’ patients who were not referred were identified. The ‘Study’ phase involved analysis of results, which revealed that 77% of suitable patients were not referred to the designated clinic. They were instead reviewed in other clinics, which are often overbooked, or managed by junior staff members. This correlated with our original prediction. Barriers to referral included lack of awareness of the clinic, individual consultant treatment preferences and patient reluctance to be referred to a ‘group’ clinic.
The ‘Act’ stage involved presenting our findings to the team at a clinical governance meeting. This included demonstration of the clinical effectiveness of the care pathway and explanation of the referral route and criteria. In light of the evaluation results, it was decided to keep the group clinic and maximise utilisation. The second cycle of data collection following these changes revealed that, of 66 myofascial TMD patients over a four-month period, only 9% of suitable patients were not seen via the designated pathway; therefore, this QIP was successful in meeting the set objectives. Overall, employing the PDSA cycle in this QIP resulted in appropriate utilisation of the modified care pathway for patients with myofascial TMD in Guy’s Oral Surgery Department. In turn, this led to high patient satisfaction with the service and effectively redirected 52 hours of clinical time. It permitted adoption of a collaborative working style with oral surgery colleagues to investigate problems, identify solutions, and collectively raise standards of clinical care to ensure a unified care pathway in the secondary care management of myofascial TMD patients.

Keywords: myofascial, quality Improvement, PDSA, TMD

Procedia PDF Downloads 124
32428 Design and Implementation of Flexible Metadata Editing System for Digital Contents

Authors: K. W. Nam, B. J. Kim, S. J. Lee

Abstract:

Along with the development of network infrastructures such as high-speed Internet and mobile environments, the explosion of multimedia data is expanding the range of multimedia services beyond voice and data services. Amid this trend, research is actively being conducted on the creation, management, and transmission of metadata on digital content to provide different services to users. This paper proposes a system for the insertion, storage, and retrieval of metadata about digital content. A metadata server based on Binary XML was implemented for efficient storage space and retrieval speed, and the transport data size required for metadata retrieval was reduced. With the proposed system, metadata can be attached to moving objects in a video, and unnecessary duplication can be minimized by improving the storage structure of the metadata. The proposed system can assemble metadata into one relevant topic, even if it is expressed in different media or in different forms. It is expected that the proposed system will handle complex networked types of data.
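A minimal sketch of the insert-and-retrieve idea, using plain XML from the Python standard library (the actual system uses Binary XML for compactness; all element and attribute names here are hypothetical):

```python
# Sketch: per-object metadata for a video, grouped and queried by topic.
# Element/attribute names are hypothetical, not the paper's schema.
import xml.etree.ElementTree as ET

root = ET.Element("content", id="video-001")
obj = ET.SubElement(root, "object", id="obj-1", topic="vehicle")
# a bounding region for the moving object at a given frame
ET.SubElement(obj, "region", frame="120", x="34", y="50", w="80", h="40")

def objects_by_topic(tree_root, topic):
    """Return ids of all annotated objects tagged with the given topic."""
    return [o.get("id") for o in tree_root.iter("object") if o.get("topic") == topic]

found = objects_by_topic(root, "vehicle")
```

A Binary XML encoding would serialize the same tree more compactly, which is what reduces the transport data size mentioned above.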

Keywords: video, multimedia, metadata, editing tool, XML

Procedia PDF Downloads 153
32427 Effect of Iron Contents on Rheological Properties of Syndiotactic Polypropylene/iron Composites

Authors: Naveed Ahmad, Farooq Ahmad, Abdul Aal

Abstract:

The effect of iron content on the rheological behavior of sPP/iron composites in the melt phase was investigated using a series of syndiotactic polypropylene/iron (sPP/iron) composite samples. Small-amplitude oscillatory shear studies were conducted using the Advanced Rheometric Expansion System (ARES). It was found that the plateau modulus rose with iron loading, while both the entanglement molecular weight and the packing length decreased with increasing iron loading. This finding demonstrates how the iron content in polymer/iron composites affects chain parameters and dimensions, which in turn affect the overall chain dynamics.
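The reported link between the plateau modulus and the entanglement molecular weight can be illustrated with the common rubber-elasticity relation G_N⁰ = ρRT/Mₑ (prefactor conventions vary in the literature); the numbers below are illustrative, not the paper's measurements:

```python
# Sketch: estimate the entanglement molecular weight Me from the plateau
# modulus via G_N0 = rho*R*T/Me (one common convention; prefactors vary).
# All numeric inputs are illustrative, not measured values from the paper.
R = 8.314  # gas constant, J/(mol*K)

def entanglement_mw(plateau_modulus_pa, density_kg_m3, temp_k):
    # Returns Me in kg/mol; a rising plateau modulus implies a smaller Me.
    return density_kg_m3 * R * temp_k / plateau_modulus_pa

me_low  = entanglement_mw(4.0e5, 850.0, 463.0)  # lower iron loading (illustrative)
me_high = entanglement_mw(6.0e5, 900.0, 463.0)  # higher loading, larger G_N0
```

This reproduces the qualitative trend in the abstract: as iron loading raises the plateau modulus, the inferred entanglement molecular weight falls.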

Keywords: plateau modulus, packing length, polymer/iron composites, rheology, entanglement molecular weight

Procedia PDF Downloads 138
32426 Development of Real Time System for Human Detection and Localization from Unmanned Aerial Vehicle Using Optical and Thermal Sensor and Visualization on Geographic Information Systems Platform

Authors: Nemi Bhattarai

Abstract:

In recent years, there has been a rapid increase in the use of Unmanned Aerial Vehicles (UAVs) in search and rescue (SAR) operations, disaster management, and many other areas where information about the location of human beings is important. This research primarily focuses on the use of optical and thermal cameras on a UAV platform for real-time detection, localization, and visualization of human beings on GIS. This research will be beneficial in disaster management for searching for lost humans in wilderness or difficult terrain, detecting abnormal human behavior in border or high-security areas, studying the distribution of people at night, estimating crowd density, managing people flow during evacuation, planning provisions in areas with high human density, and more.
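The 'haar-like' keyword below points to the classical detector family; its basic building block, a Haar-like feature evaluated in constant time from an integral image, can be sketched as follows (the tiny test image is illustrative):

```python
# Sketch: evaluating a two-rectangle Haar-like feature via an integral
# image, the constant-time primitive behind Haar-cascade detectors.
def integral_image(img):
    h, w = len(img), len(img[0])
    ii = [[0] * (w + 1) for _ in range(h + 1)]
    for y in range(h):
        for x in range(w):
            ii[y + 1][x + 1] = img[y][x] + ii[y][x + 1] + ii[y + 1][x] - ii[y][x]
    return ii

def rect_sum(ii, x, y, w, h):
    # Sum of pixels in the w*h rectangle with top-left corner (x, y).
    return ii[y + h][x + w] - ii[y][x + w] - ii[y + h][x] + ii[y][x]

def haar_two_rect(ii, x, y, w, h):
    # Bright-left / dark-right two-rectangle feature response.
    half = w // 2
    return rect_sum(ii, x, y, half, h) - rect_sum(ii, x + half, y, half, h)

img = [[255, 255, 0, 0],
       [255, 255, 0, 0]]   # illustrative 2x4 patch: bright left, dark right
ii = integral_image(img)
val = haar_two_rect(ii, 0, 0, 4, 2)   # strong positive response expected
```

A full detector would scan many such features over sliding windows at multiple scales; for thermal imagery, analogous intensity-contrast features apply.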

Keywords: UAV, human detection, real-time, localization, visualization, haar-like, GIS, thermal sensor

Procedia PDF Downloads 444
32425 Measurement and Modelling of HIV Epidemic among High Risk Groups and Migrants in Two Districts of Maharashtra, India: An Application of Forecasting Software-Spectrum

Authors: Sukhvinder Kaur, Ashok Agarwal

Abstract:

Background: For the first time, in 2009, India was able to generate estimates of HIV incidence (the number of new HIV infections per year). Analysis of epidemic projections revealed that the number of new annual HIV infections in India had declined by more than 50% during the last decade (GOI Ministry of Health and Family Welfare, 2010). The National AIDS Control Organisation (NACO) then planned to scale up its efforts in generating projections through epidemiological analysis and modelling, drawing on recently available sources of evidence such as HIV Sentinel Surveillance (HSS), India Census data and other critical data sets. Recently, NACO generated the current round of HIV estimates (2012) using the globally recommended tool Spectrum, producing estimates for adult HIV prevalence, annual new infections, the number of people living with HIV, AIDS-related deaths and treatment needs. The state-level prevalence and incidence projections produced were used to project the consequences of the epidemic in Spectrum. Building on the state-level estimates generated by NACO, the USAID-funded PIPPSE project, under the leadership of NACO, undertook estimations and projections at the district level using the same Spectrum software. In 2011, adult HIV prevalence in Maharashtra, one of the high-prevalence states, was 0.42%, ahead of the national average of 0.27%. Considering the heterogeneity of the HIV epidemic between districts, two districts of Maharashtra, Thane and Mumbai, were selected to estimate and project the number of people living with HIV/AIDS (PLHIV), HIV prevalence among adults and annual new HIV infections till 2017.
Methodology: Inputs to Spectrum included demographic data from the Census of India since 1980 and the Sample Registration System; programmatic data on ‘Alive and on ART (adults and children)’, ‘Mother-Baby pairs under PPTCT’ and ‘High Risk Group (HRG) size mapping estimates’; and surveillance data from various rounds of HSS, National Family Health Survey-III, Integrated Biological and Behavioural Assessment and Behavioural Sentinel Surveillance. Major Findings: Assuming current programmatic interventions in these districts, an estimated decrease of 12 percentage points in Thane and 31 percentage points in Mumbai in new infections among HRGs and migrants is projected from 2011 to 2017. Conclusions: The project also validated the decrease in new HIV infections among one of the high-risk groups, FSWs, using programme cohort data from 2012 to 2016. Although there is a decrease in HIV prevalence and new infections in Thane and Mumbai, a further decrease is possible if appropriate programme responses, strategies and interventions are devised for specific target groups based on this evidence. Moreover, the evidence needs to be validated by other estimation/modelling techniques, and evidence can be generated for other districts of the state, where HIV prevalence is high and reliable data sources are available, to understand the epidemic within the local context.

Keywords: HIV sentinel surveillance, high risk groups, projections, new infections

Procedia PDF Downloads 198
32424 The Trend of Injuries in Building Fire in Tehran from 2002 to 2012

Authors: Mohammadreza Ashouri, Majid Bayatian

Abstract:

Analysis of fire data is a way to support any plan to improve the level of safety in cities. Such an analysis can reveal signs of change over a given period and can be used as a measure of safety. Information on about 66,341 fires (from 2002 to 2012) released by the Tehran Safety Services and Fire-Fighting Organization, together with data on the population and the number of households provided by Tehran Municipality and the Statistical Yearbook of Iran, was extracted. Using these data, trends in fires and the rates of injury and mortality were determined and analyzed. The injury and mortality rates of fires per one million population of Tehran were 59.58 and 86.12, respectively. During the study period, the number of fires and the number of fire stations increased by 104.38% and 102.63%, respectively. Most fires (9.21%) happened in the 4th District of Tehran. The results showed that fire data recording has not been systematically planned for fire prevention, although one of the ways to reduce injuries caused by fires is to develop a systematic plan of necessary actions for emergency situations. To establish a reliable basis for fire prevention, the stages and definitions of working processes and the cause-and-effect chains should be considered. Therefore, a comprehensive statistical system should be developed for reported and recorded fire data.
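The rate and percentage-change figures above are straightforward to reproduce. As a sketch (the example counts are invented values consistent with the reported 104.38% growth, not the organization's raw data):

```python
# Sketch: the two quantities used in the trend analysis above.
# The example counts are illustrative, not the organization's raw data.
def percent_change(old, new):
    """Percentage growth between the start and end of the period."""
    return (new - old) / old * 100.0

def rate_per_million(events, population):
    """Events per one million population."""
    return events / population * 1_000_000

fires_growth = percent_change(3_400, 6_949)   # illustrative counts, ~104.38%
injury_rate = rate_per_million(500, 8_400_000)  # illustrative annual figures
```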

Keywords: fire statistics, fire analysis, accident prevention, Tehran

Procedia PDF Downloads 165
32423 Unsupervised Feature Learning by Pre-Route Simulation of Auto-Encoder Behavior Model

Authors: Youngjae Jin, Daeshik Kim

Abstract:

This paper describes cycle-accurate simulation results for the weight values learned by an auto-encoder behavior model in pre-route simulation. Given these results, we visualized the first-layer representations with natural images. Many deep learning threads have focused on learning high-level abstractions of unlabeled raw data by unsupervised feature learning. However, when handling such huge amounts of data, the computational complexity and run time of these learning methods have limited advanced research, largely because the algorithms were executed on single-core CPUs. For this reason, parallel hardware such as FPGAs was seen as a possible way to overcome these limitations. We adopted and simulated a ready-made auto-encoder to design a behavior model in Verilog HDL before designing the hardware. With pre-route simulation of the auto-encoder behavior model, we obtained cycle-accurate results for the parameters of each hidden layer using ModelSim. Cycle-accurate results are a very important factor in designing parallel digital hardware. Finally, this paper demonstrates correct operation of behavior-model-based pre-route simulation. Moreover, we visualized the learned latent representations of the first hidden layer with the Kyoto natural image dataset.
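As a miniature, software-only analogue of the kind of weight updates such a behavior model simulates, here is a tiny tied-weight linear auto-encoder trained by plain gradient descent; everything in it (network size, data, learning rate) is illustrative, not the paper's network:

```python
# Sketch: one-hidden-unit, tied-weight linear auto-encoder trained by SGD
# on points lying on a line. All values are illustrative.
import random

random.seed(0)
data = [[1.0, 1.0], [2.0, 2.0], [3.0, 3.0]]          # inputs on the line y = x
w = [random.uniform(-0.1, 0.1) for _ in range(2)]    # tied encoder/decoder weights

def reconstruct(x, w):
    h = sum(wi * xi for wi, xi in zip(w, x))         # scalar hidden code
    return [wi * h for wi in w]                      # tied-weight decoder

lr = 0.01
for _ in range(500):
    for x in data:
        h = sum(wi * xi for wi, xi in zip(w, x))
        xr = [wi * h for wi in w]
        err = [r - t for r, t in zip(xr, x)]
        # gradient of 0.5*||xr - x||^2 w.r.t. the tied weights:
        # dL/dw_k = err_k * h + x_k * (err . w)
        g = sum(e * wi for e, wi in zip(err, w))
        for k in range(2):
            w[k] -= lr * (err[k] * h + x[k] * g)
```

After training, the weights align with the data's principal direction and reconstruction on the line is nearly exact; a hardware behavior model would produce the same sequence of updates, but observed cycle by cycle.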

Keywords: auto-encoder, behavior model simulation, digital hardware design, pre-route simulation, unsupervised feature learning

Procedia PDF Downloads 425
32422 Cluster Analysis and Benchmarking for Performance Optimization of a Pyrochlore Processing Unit

Authors: Ana C. R. P. Ferreira, Adriano H. P. Pereira

Abstract:

Given the frequent variation of mineral properties throughout the Araxá pyrochlore deposit, even when good homogenization work has been carried out before feeding the processing plants, operation with highly variable quality and performance is expected. These results could be improved and standardized if the blend composition parameters that most influence the processing route were determined and the types of raw material grouped by them, ultimately providing a reference set of operational settings for each group. Associating the physical and chemical parameters of a unit operation with a benchmark, or even an optimal reference, of metallurgical recovery and product quality reduces production costs, optimizes use of the mineral resource, and guarantees greater stability in the subsequent processes of the production chain that use the mineral of interest. Conducting a comprehensive exploratory data analysis to identify which characteristics of the ore are most relevant to the process route, combined with machine learning algorithms for grouping the raw material (ore) and associating these groups with reference variables in the process benchmark, is a reasonable approach to the standardization and improvement of mineral processing units. Clustering methods based on decision trees and K-Means were employed, together with algorithms grounded in benchmarking theory, with criteria defined by the process team in order to reference the best adjustments for processing the ore piles of each cluster. A clean user interface was created to present the outputs of the algorithm. The results were measured through the average time to adjust and stabilize the process after a new pile of homogenized ore enters the plant, as well as the average time needed to achieve the best processing result. Direct gains in the metallurgical recovery of the process were also measured.
The results were promising, with a reduction in the adjustment and stabilization time when starting to process a new ore pile, as well as attainment of the benchmark. Also noteworthy are the gains in metallurgical recovery, which reflect a significant saving in ore consumption and a consequent reduction in production costs, hence a more rational use of the tailings dams and life optimization of the mineral deposit.
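The grouping step can be sketched with a bare-bones K-Means over hypothetical blend-composition features (the feature names and values are invented for illustration, not the plant's data):

```python
# Sketch: group ore piles by blend composition with a minimal K-Means.
# Feature names/values are hypothetical, not the plant's data.
def kmeans(points, k, iters=50):
    centers = points[:k]                     # simple deterministic init
    clusters = [[] for _ in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            # assign each pile to its nearest center (squared distance)
            j = min(range(k),
                    key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centers[c])))
            clusters[j].append(p)
        # recompute centers as cluster means (keep old center if empty)
        centers = [[sum(col) / len(cl) for col in zip(*cl)] if cl else centers[j]
                   for j, cl in enumerate(clusters)]
    return centers, clusters

# features per homogenized pile: [Nb2O5 grade %, BaO %] (hypothetical)
piles = [[1.1, 12.0], [1.0, 11.5], [2.6, 3.1],
         [2.5, 2.9], [1.2, 11.8], [2.7, 3.0]]
centers, clusters = kmeans(piles, k=2)
```

Each resulting cluster would then be linked to its benchmark operational settings, so that a new pile is routed to the settings of its nearest group.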

Keywords: mineral clustering, machine learning, process optimization, pyrochlore processing

Procedia PDF Downloads 131
32421 Design and Implementation a Virtualization Platform for Providing Smart Tourism Services

Authors: Nam Don Kim, Jungho Moon, Tae Yun Chung

Abstract:

This paper proposes an Internet of Things (IoT) based virtualization platform for providing smart tourism services. The virtualization platform provides a consistent access interface to various types of data by naming IoT devices and legacy information systems as pathnames in a virtual file system. In other words, the IoT virtualization platform functions as middleware that uses metadata to describe the underlying collected data. The proposed platform makes it easy to provide customized tourism information using tourist locations collected by IoT devices, and additionally enables the creation of new interactive smart tourism services focused on those locations. The proposed platform is efficient in that the provided tourism services are isolated from changes in the raw data, so services can be modified or expanded without changing the underlying data structure.
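The pathname-based access idea can be sketched as follows; the paths, device names and payloads are hypothetical, and a real implementation would resolve them to live device reads or legacy-system queries:

```python
# Sketch: naming IoT devices and legacy systems as pathnames behind one
# read interface. Paths and payloads are hypothetical.
class VirtualFS:
    def __init__(self):
        self._handlers = {}

    def mount(self, path, read_fn):
        """Register a data source under a virtual pathname."""
        self._handlers[path] = read_fn

    def read(self, path):
        """Uniform read: the caller never knows which backend answers."""
        return self._handlers[path]()

vfs = VirtualFS()
vfs.mount("/iot/gps/tourist-17", lambda: {"lat": 37.56, "lon": 126.97})
vfs.mount("/legacy/poi-db/attractions", lambda: ["palace", "market"])
loc = vfs.read("/iot/gps/tourist-17")
```

Because services address only pathnames, swapping a sensor or migrating a legacy database changes the mounted handler, not the consuming service.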

Keywords: internet of things (IoT), IoT platform, service platform, virtual file system (VFS)

Procedia PDF Downloads 480
32420 Effect of Noise at Different Frequencies on Heart Rate Variability - Experimental Study Protocol

Authors: A. Bortkiewcz, A. Dudarewicz, P. Małecki, M. Kłaczyński, T. Wszołek, Małgorzata Pawlaczyk-Łuszczyńska

Abstract:

Low-frequency noise (LFN) has been recognized as a special environmental pollutant. It is usually considered a broadband noise dominated by low frequencies from 10 Hz to 250 Hz. A growing body of data shows that LFN differs in nature from other environmental noises that are at comparable levels but not dominated by low-frequency components. The primary and most frequent adverse effect of LFN exposure is annoyance. Moreover, some recent investigations showed that LFN at relatively low A-weighted sound pressure levels (40−45 dB), as occurs in office-like areas, can adversely affect mental performance, especially in highly sensitive subjects. It is well documented that high-frequency noise disturbs various types of human functions; however, there is very little data on the impact of LFN on well-being and health, including the cardiovascular system. Heart rate variability (HRV) is a sensitive marker of autonomic regulation of the circulatory system. Walker and co-workers found that LFN has a significantly more negative impact on cardiovascular response than exposure to high-frequency noise and that changes in HRV parameters resulting from LFN exposure tend to persist over time. The negative reactions of the cardiovascular system in response to LFN generated by wind turbines (20-200 Hz) were confirmed by Chiu. The scientific aim of the study is to assess the relationship between the spectral-temporal characteristics of LFN and the activity of the autonomic nervous system, considering the subjective assessment of annoyance, sensitivity to this type of noise, and cognitive and general health status. The study will be conducted on 20 male students in a special, acoustically prepared, constantly supervised room.
Each person will be tested 4 times (4 sessions), under conditions of non-exposure (sham) and exposure to wind turbine noise recorded at a distance of 250 meters from the turbine, with different frequency ranges: the acoustic band 20 Hz-20 kHz, the infrasound band 5-20 Hz, and the acoustic band plus the infrasound band. The order of the sessions will be randomized. Each session will last 1 h. There will be a 2-3 day break between sessions to exclude the possibility of an earlier session influencing the results of the next one. Before the first exposure, a questionnaire will be administered on noise sensitivity, general health status (using the GHQ questionnaire), hearing status and sociodemographic data. Before each of the 4 exposures, subjects will complete a brief questionnaire on their mood and sleep quality the night before the test. After the test, the subjects will be asked about any discomfort and subjective symptoms during the exposure. Before the test begins, Holter ECG monitoring equipment will be installed. HRV will be analyzed from the ECG recordings, including time- and frequency-domain parameters. The tests will always be performed in the morning (9-12) to avoid the influence of diurnal rhythm on HRV results. Students will perform psychological tests (Vienna Test System) 15 minutes before the end of the session.
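The planned time-domain HRV analysis typically includes parameters such as SDNN and RMSSD, which can be sketched from a list of RR intervals (the interval values below are illustrative, not study data):

```python
# Sketch: standard time-domain HRV parameters from RR intervals in
# milliseconds. The interval values are illustrative, not study data.
def sdnn(rr):
    """Standard deviation of all RR (NN) intervals."""
    mean = sum(rr) / len(rr)
    return (sum((x - mean) ** 2 for x in rr) / len(rr)) ** 0.5

def rmssd(rr):
    """Root mean square of successive RR differences (vagal marker)."""
    diffs = [b - a for a, b in zip(rr, rr[1:])]
    return (sum(d * d for d in diffs) / len(diffs)) ** 0.5

rr = [800, 810, 790, 805, 795, 815]   # illustrative RR intervals, ms
```

Frequency-domain parameters (LF, HF power and their ratio) would additionally require spectral estimation of the resampled RR series.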

Keywords: neurovegetative control, heart rate variability (HRV), cognitive processes, low frequency noise

Procedia PDF Downloads 59
32419 Levels of Loneliness and Quality of Life Among Retirees in Kuwait: Implication to Practice

Authors: Hamad Alhamad

Abstract:

Introduction: The number of retirees in Kuwait is rising quickly, and this is causing more concern about their well-being. Although loneliness and quality of life are significant indices of retiree well-being, little research has been done on the topic among retirees in Kuwait. The aim of this study is to explore the levels of loneliness and quality of life among retirees in Kuwait. Methods: This is a cross-sectional descriptive study targeting retirees who live in Kuwait. The UCLA Loneliness Scale (version 3) and the 36-Item Short Form Survey (SF-36) were utilized. Data were analyzed using SPSS. Ethical approval was obtained from Kuwait University and the Ministry of Health (286). Results: Total respondents in this research were 202 (N=202). The results indicate that 77.7% (N=157) experience a moderate level of loneliness, 19.8% (N=40) experience a high level of loneliness, and only 2.5% (N=5) experience a low level of loneliness. On the SF-36 health-related questionnaire, participants scored low means across the eight domains: physical functioning, general health, role limitations due to physical and emotional health, energy, social functioning, pain, and emotional well-being. The average of the means was 49.8, indicating that participants have a moderately low quality of life. A significant relationship (p = 0.004) was found between a sociodemographic characteristic and the level of loneliness: retirees who were married indicated higher levels of loneliness compared to single, divorced, and widowed retirees. Conclusion: The study revealed that retirees in Kuwait feel moderate loneliness and have a low quality of life. The study indicates that retirees' emotional well-being should be given greater consideration, that the negative effects on their quality of life should be addressed, and that the factors leading to feelings of loneliness should be explored.

Keywords: older adults, social isolation, work, retirement

Procedia PDF Downloads 58
32418 Structural Damage Detection via Incomplete Model Data Using Output Data Only

Authors: Ahmed Noor Al-qayyim, Barlas Özden Çağlayan

Abstract:

Structural failure is mainly caused by damage that often occurs in structures. Many researchers focus on developing efficient tools to detect damage in structures at an early stage. In past decades, a subject that has received considerable attention in the literature is damage detection based on variations in the dynamic characteristics or response of structures. This study presents a new damage identification technique that detects the damage location for an incompletely modeled structural system using output data only. The method identifies damage from free vibration test data using the “Two Points Condensation (TPC)” technique, which creates a set of matrices by reducing the structural system to two-degree-of-freedom systems. The current stiffness matrices are obtained by optimization of the equation of motion using the measured test data and are compared with the original (undamaged) stiffness matrices; large percentage changes in the matrices’ coefficients indicate the location of the damage. The TPC technique is applied to experimental data from a simply supported steel beam model structure after inducing a thickness change in one element. Two cases are considered, and the method detects the damage and determines its location accurately in both. In addition, the results illustrate that these changes in the stiffness matrix can be a useful tool for continuous monitoring of structural safety using ambient vibration data. Furthermore, its efficiency shows that this technique can also be used for large structures.
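The comparison step, flagging large percentage changes between baseline and identified stiffness coefficients, can be sketched as follows (the 2×2 condensed matrices and the threshold are illustrative, not the experiment's identified values):

```python
# Sketch: flag damage when identified stiffness coefficients deviate from
# the undamaged baseline by more than a threshold percentage.
# Matrix values and threshold are illustrative.
def percent_change_matrix(k_ref, k_cur):
    return [[abs(c - r) / abs(r) * 100.0 for r, c in zip(row_r, row_c)]
            for row_r, row_c in zip(k_ref, k_cur)]

def damaged(k_ref, k_cur, threshold=10.0):
    changes = percent_change_matrix(k_ref, k_cur)
    return any(v > threshold for row in changes for v in row)

k_undamaged = [[2.0e6, -1.0e6], [-1.0e6, 2.0e6]]   # baseline condensed matrix
k_current   = [[1.6e6, -1.0e6], [-1.0e6, 2.0e6]]   # ~20% drop in k11
flag = damaged(k_undamaged, k_current)
```

In the TPC setting, repeating this comparison over the set of condensed two-degree-of-freedom matrices localizes which element's stiffness has changed.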

Keywords: damage detection, optimization, signals processing, structural health monitoring, two points–condensation

Procedia PDF Downloads 344