Search results for: multivariate time series data
34346 Low Back Pain-Related Absenteeism among Healthcare Workers in Kibuli Muslim Hospital, Kampala Uganda
Authors: Aremu Abdulmujeeb Babatunde
Abstract:
Background: Low back pain is not only considered the most common cause of functional disability worldwide, but is also estimated to affect up to 90% of the global population. This study aimed at determining the prevalence, consequences, and socio-demographic factors associated with low back pain. Methods: A cross-sectional survey was employed; a total of 150 self-structured questionnaires were distributed among healthcare workers and used to determine the prevalence of low back pain and work-related absenteeism. Data were entered using Epi Info software and analyzed using SPSS. Results: An overall response rate of 84% (n = 140) was achieved. The study established that the majority (37%) of the respondents were in the age bracket of 20-39 years, 57% were female (n = 59), and 64% were married. The point prevalence was 84%, and 31% of the respondents took leave from work as a result of low back pain. There was a high prevalence of sick leave among nursing staff (45.2%). A chi-square test showed a statistically significant association between the respondents' occupations and the daily time spent at work (p = 0.011 and 0.042, respectively). Socio-demographic factors such as age, marital status, and gender were not statistically significant at p < 0.05. Conclusions: The medical and socio-professional consequences of low back pain among healthcare workers were a result of their occupational designations and the daily time spent carrying out those occupations. Keywords: low back pain, healthcare workers, prevalence, sick leave
Procedia PDF Downloads 307
34345 Bilateral Telecontrol of AutoMerlin Mobile Robot Using Time Domain Passivity Control
Authors: Aamir Shahzad, Hubert Roth
Abstract:
This paper presents the bilateral telecontrol of the AutoMerlin mobile robot in the presence of communication delay. Passivity observers have been designed to monitor the net energy at both ports of a two-port network; if either or both ports become active, making the net energy negative, the passivity controllers dissipate the proper amount of energy to render the overall system passive despite the time delay. The environment force is modeled and sent back to the human operator so that s/he can feel it and has additional information about the environment in the vicinity of the mobile robot. Experimental results are presented to show the performance and stability of the bilateral controller. The results show that whenever the passivity observers detect active behavior, the passivity controllers come into action to neutralize it and keep the overall system passive. Keywords: bilateral control, human operator, haptic device, communication network, time domain passivity control, passivity observer, passivity controller, time delay, mobile robot, environment force
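The energy bookkeeping described in the abstract can be sketched in a few lines. This is a minimal illustration of a time-domain passivity observer and controller for one port of the two-port network; the variable names and the variable-damping scheme are assumptions for illustration, not the authors' implementation.

```python
# Minimal sketch of a time-domain passivity observer/controller (PO/PC)
# for one port of a bilateral teleoperation channel. The damping scheme
# below is an illustrative assumption, not the paper's exact controller.

def passivity_observer(forces, velocities, dt):
    """Accumulate net energy at a network port; negative => port is active."""
    energy = 0.0
    for f, v in zip(forces, velocities):
        energy += f * v * dt  # instantaneous power integrated over time
    return energy

def passivity_controller(force, velocity, energy, eps=1e-9):
    """If the observed energy is negative, add just enough variable
    damping to dissipate the excess and restore passivity."""
    if energy >= 0.0 or abs(velocity) < eps:
        return force  # port is passive; no correction needed
    alpha = -energy / (velocity * velocity)  # variable damping gain
    return force + alpha * velocity
```

In a real loop, the observer runs once per sample at each port, and the controller only intervenes on samples where the accumulated energy goes negative.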
Procedia PDF Downloads 393
34344 Homosexuality and Culture: A Case Study Depicting the Struggles of a Married Lady
Authors: Athulya Jayakumar, M. Manjula
Abstract:
Though there has been a shift in the understanding of homosexuality from being a sin, crime, or pathology in the medical and legal perspectives, the acceptance of homosexuality still remains very scanty in the Indian subcontinent. The present case study concerns a 24-year-old female who has completed a diploma in polytechnic engineering and resides in the state of Kerala. She initially presented with her husband, with complaints of lack of sexual desire and non-cooperation on the part of the index client. After the initial few sessions, the client revealed, in an individual session, her homosexual orientation, which was unknown to her family. She has had multiple short-term relationships with females and never had any heterosexual orientation or interest. During her adolescence, she wondered whether she could change herself into a male; however, she currently accepts her gender. She never wanted a heterosexual marriage but had to succumb to the pressure of her mother, as a result of a series of unexpected incidents at home, and had to agree to the marriage, also with a hope that she might become bisexual. The client was able to bond with the husband emotionally, but the multiple attempts at sexual intercourse, at the insistence of the husband, had always been non-pleasurable and induced a sense of disgust. Currently, for several months, there has not been any sexual activity. Also, she actively avoids any chance of warm communication with him so that she can avoid the chance of him approaching her in a sexual manner. The case study is an attempt to highlight the culture and the struggles of a homosexual individual who comes to therapy wanting to be a 'normal wife' despite having knowledge of her legal rights and the legal scenario. There is a scarcity of Indian literature that has systematically investigated issues related to homosexuality.
Data on prevalence, emotional problems faced, and clinical services available are sparse, though such data are crucial for increasing the understanding of sexual behaviour, orientation, and the difficulties faced in India. Keywords: case study, culture, cognitive behavior therapy, female homosexuality
Procedia PDF Downloads 345
34343 Information Retrieval from Internet Using Hand Gestures
Authors: Aniket S. Joshi, Aditya R. Mane, Arjun Tukaram
Abstract:
In the 21st century, the era of the e-world, people continuously receive daily information such as weather conditions, news, stock exchange updates, new projects, cricket scores, sports, and other such content. In busy situations, they want this information with minimal use of the keyboard and minimal time. Today, in order to get such information, users have to repeat the same mouse and keyboard actions, which costs time and causes inconvenience. In India, owing to rural backgrounds, many people are also not very familiar with the use of computers and the internet. Moreover, in small clinics, small offices, hotels, and airports, there should be a system that retrieves daily information with minimal keyboard and mouse actions. We plan to design an application-based project that can easily retrieve information with minimal keyboard and mouse actions and make the task more convenient and easier. This is possible with an image processing application that takes real-time hand gestures, matches them against the system's stored gestures, and retrieves the information. Once a function is selected with a hand gesture, the system reports the corresponding information to the user. In this project, we use real-time hand gesture movements to select the required option, which is presented on the screen in the form of RSS feeds; the gesture selects the required option and the corresponding information is displayed. Real-time hand gestures make the application handier and easier to use. Keywords: hand detection, hand tracking, hand gesture recognition, HSV color model, blob detection
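The HSV-model hand-detection step mentioned in the keywords can be sketched as a simple per-pixel threshold followed by a blob measure. This is a hedged, NumPy-only illustration: the threshold values are assumptions that must be tuned per camera and lighting, and a real system would use proper connected-component analysis.

```python
import numpy as np

# Hedged sketch of the HSV-threshold step used for hand (blob) detection.
# Threshold values are illustrative assumptions, not calibrated skin tones.

def hsv_mask(hsv_image, lower=(0, 40, 60), upper=(25, 255, 255)):
    """Return a boolean mask of pixels whose (H, S, V) lie in [lower, upper]."""
    lo = np.array(lower)
    hi = np.array(upper)
    return np.all((hsv_image >= lo) & (hsv_image <= hi), axis=-1)

def largest_blob_area(mask):
    """Crude blob measure: count of mask pixels (a stand-in for
    connected-component analysis in a full implementation)."""
    return int(mask.sum())
```

In the full pipeline, the mask would feed a blob/contour detector whose centroid trajectory is then matched to the stored gesture templates.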
Procedia PDF Downloads 290
34342 Real-Time Loop-Mediated Isothermal Amplification Assay for Rapid Detection of Human Papillomavirus 16 in Oral Squamous Cell Carcinoma
Authors: Suharni Mohamad, Nurul Izzati Hamzan, Norhayu Abdul Rahman, Siti Suraiya Md Noor
Abstract:
Human papillomavirus (HPV) is an important risk factor for the development of oral cancer. HPV16 is the most common type found in HPV-positive squamous cell carcinoma. In the present study, we established a real-time loop-mediated isothermal amplification (real-time LAMP) assay for the detection of HPV16. A set of six primers was specially designed to recognize eight distinct sequences of HPV16-E6. Detection and quantification were achieved by real-time monitoring using a real-time turbidimeter, based on the threshold time required for turbidity in the LAMP reaction. LAMP reagents (MgSO4, dNTP, and Bst polymerase concentrations) and various incubation times and temperatures were optimized. The sensitivity was determined using 10-fold serial dilutions of the HPV16 standard strain. The specificity was evaluated using other HPV genotypes. The optimized method, with the specifically designed primers, achieved real-time detection in approximately 30 min at 65°C. The limit of detection of HPV16 using the LAMP assay was 10 pg/ml, detectable within 30 min. The LAMP assay was 10 times more sensitive than conventional PCR in detecting HPV16. No cross-reactivity with other HPV genotypes was observed. This quantitative real-time LAMP assay may improve the diagnostic potential for the detection and quantification of HPV16 in clinical samples and epidemiological studies owing to its rapidity, simplicity, and high sensitivity and specificity. The assay will be further evaluated with HPV DNA from the saliva of patients with oral squamous cell carcinoma. Acknowledgement: This study was financially supported by the ScienceFund Grant, Ministry of Science, Technology and Innovation (305/PPSG/6113219). Keywords: Oral Squamous Cell Carcinoma (OSCC), Human Papillomavirus 16 (HPV16), Loop-Mediated Isothermal Amplification (LAMP), rapid detection
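The 10-fold serial dilution series used to establish the limit of detection is simple arithmetic and can be sketched as follows; the starting concentration is an assumed example value, not a figure from the study.

```python
# Illustrative sketch of the 10-fold serial dilution series used to
# determine a limit of detection (LOD). The starting concentration below
# is an assumption for illustration.

def serial_dilutions(start_conc, factor=10.0, steps=6):
    """Return the concentration after each successive `factor`-fold dilution."""
    return [start_conc / factor**i for i in range(steps)]
```

The LOD is then read off as the lowest dilution in the series that still amplifies within the incubation time.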
Procedia PDF Downloads 406
34341 A Geoprocessing Tool for Early Civil Work Notification to Optimize Fiber Optic Cable Installation Cost
Authors: Hussain Adnan Alsalman, Khalid Alhajri, Humoud Alrashidi, Abdulkareem Almakrami, Badie Alguwaisem, Said Alshahrani, Abdullah Alrowaished
Abstract:
Most of the cost of installing a new fiber optic cable (FOC) is attributed to civil work (trenching) cost. In many cases, information technology (IT) departments receive project proposals in their eReview system, but not all projects are visible to everyone. Additionally, if there is no IT scope in the proposed project, it is not likely to be visible to IT, and it is sometimes too late to add IT scope after project budgets have been finalized. Finally, the eReview system is a repository of PDF files for each project, which commits the reviewer to manual work and limits automation potential. This paper details a solution that addresses the late notification problem of the eReview system by integrating IT sites GIS data (site locations) with land use permit (LUP) data (civil work activity); an LUP request is the first step before securing the required land usage authorizations, meaning no detailed designs exist for any relevant project before an approved LUP request. To address the manual nature of the eReview system, both the LUP system data and the IT data were brought into ArcGIS Desktop, which enables the creation of a geoprocessing tool, with either Python or Model Builder, to automate finding and evaluating potentially usable LUP requests that reduce trenching between two sites in need of a new FOC. To achieve this, a weekly dump was taken from the LUP system production data and loaded manually into ArcMap Desktop. A custom tool was then developed in Model Builder, consuming a two-column table containing all the pairs of sites in need of new fiber connectivity. The tool iterates over all rows of this table, taking one site pair at a time and finding potential LUPs between them that satisfy the provided search radius. If a group of LUPs is found, an iterator goes through each LUP to find the required civil work between the two sites, the LUP polyline feature, and the distance along the line, which is counted as cost avoidance if an IT scope is added.
Finally, the tool exports an Excel file named after the site pair, containing as many rows as the number of LUPs that met the search radius, with trenching and pulling information and cost. As a result, multiple projects have been identified: historical, missed-opportunity, and proposed projects. For the proposed project, the savings were about 75% ($750,000) relative to installing a new fiber along the Euclidean distance between the Abqaiq GOSP2 and GOSP3 DCOs. In conclusion, the current tool setup identifies opportunities to bundle civil work on a single project at a time between two sites. More work is needed to allow the bundling of multiple projects between two sites to achieve even more cost avoidance, in both capital cost and carbon footprint. Keywords: GIS, fiber optic cable installation optimization, eliminate redundant civil work, reduce carbon footprint for fiber optic cable installation
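The core of the iteration described above, finding LUPs within a search radius of the corridor between a site pair, can be sketched without any GIS library. This is a hedged 2-D simplification with illustrative names; the actual tool operates on ArcGIS polyline features, not points.

```python
import math

# Hedged sketch of the geoprocessing loop: for each pair of sites needing
# new fiber, keep the land-use-permit (LUP) locations that fall within a
# search radius of the straight line between the sites. Geometry is
# simplified to 2-D points; all names are illustrative assumptions.

def dist_point_to_segment(p, a, b):
    """Shortest distance from point p to the segment a-b."""
    ax, ay = a
    bx, by = b
    px, py = p
    dx, dy = bx - ax, by - ay
    if dx == 0 and dy == 0:
        return math.hypot(px - ax, py - ay)
    # Clamp the projection parameter to stay on the segment.
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)))
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

def lups_within_radius(site_a, site_b, lup_points, radius):
    """Return the LUPs lying within `radius` of the A-B corridor."""
    return [lup for lup in lup_points
            if dist_point_to_segment(lup, site_a, site_b) <= radius]
```

In the real tool, the same filter would be expressed as a buffer/intersect geoprocessing step over polyline features rather than a point-distance test.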
Procedia PDF Downloads 219
34340 A Real Time Ultra-Wideband Location System for Smart Healthcare
Authors: Mingyang Sun, Guozheng Yan, Dasheng Liu, Lei Yang
Abstract:
Driven by the demand for intelligent monitoring in rehabilitation centers and hospitals, a high-accuracy real-time location system based on UWB (ultra-wideband) technology is proposed. The system measures the precise location of a specific person, traces his movement, and visualizes his trajectory on the screen for doctors or administrators. Therefore, doctors can view the position of the patient at any time and locate them immediately when something urgent happens. In our design process, different algorithms were discussed and their errors analyzed. In addition, we discuss a simple way of correcting the antenna delay error, which turned out to be effective. By choosing the best algorithm and correcting errors with the corresponding methods, the system attained good accuracy. Experiments indicated that the ranging error of the system is lower than 7 cm, the locating error is lower than 20 cm, and the refresh rate exceeds 5 times per second. In future work, by embedding the system in wearable IoT (Internet of Things) devices, it could provide not only physical parameters but also the activity status of the patient, which would greatly help doctors in performing healthcare. Keywords: intelligent monitoring, ultra-wideband technology, real-time location, IoT devices, smart healthcare
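A typical way to turn UWB range measurements into a position is linearized trilateration. The sketch below, with an assumed three-anchor 2-D layout, shows one such algorithm; the abstract compares several candidates, and this is only an illustrative example, not necessarily the one the authors chose.

```python
# Hedged sketch of 2-D trilateration from UWB ranges: subtracting the first
# range equation from the other two yields a linear system in (x, y).
# Anchor layout and names are illustrative assumptions.

def trilaterate(anchors, ranges):
    """Solve for (x, y) from three anchor positions and measured distances."""
    (x1, y1), (x2, y2), (x3, y3) = anchors
    r1, r2, r3 = ranges
    # Linear system A [x, y]^T = b, from (x-xi)^2 + (y-yi)^2 = ri^2.
    a11, a12 = 2 * (x2 - x1), 2 * (y2 - y1)
    a21, a22 = 2 * (x3 - x1), 2 * (y3 - y1)
    b1 = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
    b2 = r1**2 - r3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a11 * a22 - a12 * a21  # non-zero when anchors are not collinear
    return ((b1 * a22 - b2 * a12) / det, (a11 * b2 - a21 * b1) / det)
```

With more than three anchors, the same linearization extends to an over-determined least-squares solve, which is what makes a sub-20 cm locating error plausible from sub-7 cm ranging errors.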
Procedia PDF Downloads 140
34339 Cost-Effective Mechatronic Gaming Device for Post-Stroke Hand Rehabilitation
Authors: A. Raj Kumar, S. Bilaloglu
Abstract:
Stroke is a leading cause of adult disability worldwide. We depend on our hands for our activities of daily living (ADL). Although many patients regain the ability to walk, they continue to experience long-term hand motor impairments. As the number of individuals with young stroke is increasing, there is a critical need for effective approaches to the rehabilitation of hand function post-stroke. Motor relearning for dexterity requires task-specific kinesthetic, tactile, and visual feedback. However, when a stroke results in both sensory and motor impairment, it becomes difficult to ascertain when and what type of sensory substitution can facilitate motor relearning. In an ideal situation, real-time task-specific data on the ability to learn, and data-driven feedback to assist such learning, would greatly assist rehabilitation for dexterity. We have found that kinesthetic and tactile information from the unaffected hand can help patients relearn the use of optimal fingertip forces during a grasp-and-lift task. Measurement of fingertip grip force (GF), load force (LF), their corresponding rates (GFR and LFR), and other metrics can be used to gauge the impairment level and the progress during learning. Currently, ATI mini force-torque sensors are used in research settings to measure and compute the LF, GF, and their rates while grasping objects of different weights and textures. Use of the ATI sensor is cost-prohibitive for deployment in clinical or at-home rehabilitation. A cost-effective mechatronic device was developed to quantify GF, LF, and their rates for stroke rehabilitation purposes, using off-the-shelf components such as load cells, flexi-force sensors, and an Arduino UNO microcontroller. A salient feature of the device is its integration with an interactive gaming environment to render a highly engaging user experience.
This paper elaborates on the integration of kinesthetic and tactile sensing through the computation of LF, GF, and their corresponding rates in real time, information processing, and interactive interfacing through augmented reality for visual feedback. Keywords: feedback, gaming, kinesthetic, rehabilitation, tactile
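The force rates (GFR, LFR) computed from the sampled sensor stream are simple finite differences. The sketch below shows the idea under an assumed fixed sampling interval; the name and interval are illustrative, not the device's firmware.

```python
# Hedged sketch of computing the rate of a force signal (e.g. GFR from GF
# samples) as read from an Arduino-fed sensor stream. The fixed sampling
# interval dt is an illustrative assumption.

def force_rate(samples, dt):
    """Finite-difference rate of a force signal sampled every dt seconds."""
    return [(b - a) / dt for a, b in zip(samples, samples[1:])]
```

In practice, the raw differences would be low-pass filtered before being fed to the game, since differentiation amplifies sensor noise.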
Procedia PDF Downloads 240
34338 Evidence Theory Based Emergency Multi-Attribute Group Decision-Making: Application in Facility Location Problem
Authors: Bidzina Matsaberidze
Abstract:
It is known that, in emergency situations, multi-attribute group decision-making (MAGDM) models are characterized by insufficient objective data and a lack of time to respond to the task. Evidence theory is an effective tool for describing such incomplete information in decision-making models when the expert and his knowledge are involved in the estimation of the MAGDM parameters. We consider an emergency decision-making model where expert assessments on humanitarian aid from distribution centers (HADC) are represented as q-rung orthopair fuzzy numbers, and the data structure is described within the theory of bodies of evidence. Based on focal probability construction and the experts' evaluations, an objective function, a distribution centers' selection ranking index, is constructed. Our approach to solving the constructed bicriteria partitioning problem consists of two phases. In the first phase, based on the covering matrix, we generate a matrix whose columns allow us to find all possible partitionings of the HADCs with the service centers; some constraints are also taken into consideration while generating this matrix. In the second phase, based on the matrix and using our exact algorithm, we find the partitionings (allocations of the HADCs to the centers) that correspond to the Pareto-optimal solutions. For an illustration of the obtained results, a numerical example is given for the facility location-selection problem. Keywords: emergency MAGDM, q-rung orthopair fuzzy sets, evidence theory, HADC, facility location problem, multi-objective combinatorial optimization problem, Pareto-optimal solutions
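The Pareto-optimality filter of the second phase can be illustrated generically. This sketch assumes each candidate partitioning has been scored on the two minimized criteria and simply keeps the non-dominated ones; the encoding of candidates as score pairs is an assumption, and the paper's exact algorithm avoids full enumeration.

```python
# Hedged sketch of keeping the Pareto-optimal candidates under two
# minimized criteria (e.g. cost and ranking index). Candidate encoding
# as (criterion1, criterion2) pairs is an illustrative assumption.

def pareto_front(candidates):
    """Keep candidates not dominated by any other candidate
    (dominance: both criteria <=, with the pair not identical)."""
    def dominates(a, b):
        return a[0] <= b[0] and a[1] <= b[1] and a != b
    return [c for c in candidates
            if not any(dominates(o, c) for o in candidates)]
```

The exact algorithm in the paper reaches the same front without scoring every possible partitioning, which matters because the number of partitionings grows combinatorially.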
Procedia PDF Downloads 92
34337 Analyzing Data Protection in the Era of Big Data under the Framework of Virtual Property Layer Theory
Authors: Xiaochen Mu
Abstract:
Data rights confirmation, as a key legal issue in the development of the digital economy, is undergoing a transition from a traditional rights paradigm to a more complex private-economic paradigm. In this process, data rights confirmation has evolved from a simple claim of rights to a complex structure encompassing multiple dimensions of personality rights and property rights. Current data rights confirmation practices are primarily reflected in two models: holistic rights confirmation and process rights confirmation. The holistic rights confirmation model continues the traditional "one object, one right" theory, while the process rights confirmation model, through contractual relationships in the data processing process, recognizes rights that are more adaptable to the needs of data circulation and value release. In the design of the data property rights system, there is a hierarchical characteristic aimed at decoupling from raw data to data applications through horizontal stratification and vertical staging. This design not only respects the ownership rights of data originators but also, based on the usufructuary rights of enterprises, constructs a corresponding rights system for different stages of data processing activities. The subjects of data property rights include both data originators, such as users, and data producers, such as enterprises, who enjoy different rights at different stages of data processing. The intellectual property rights system, with the mission of incentivizing innovation and promoting the advancement of science, culture, and the arts, provides a complete set of mechanisms for protecting innovative results. 
However, unlike traditional private property rights, the granting of intellectual property rights is not an end in itself; the purpose of the intellectual property system is to balance the exclusive rights of the rights holders with the prosperity and long-term development of society's public learning and the entire field of science, culture, and the arts. Therefore, the intellectual property granting mechanism provides both protection and limitations for the rights holder. This aligns well with the dual attributes of data. In terms of achieving the protection of data property rights, the granting of intellectual property rights is an important institutional choice that can enhance the effectiveness of the data property exchange mechanism. Although this is not the only path, the granting of data property rights within the framework of the intellectual property rights system helps to establish fundamental legal relationships and rights confirmation mechanisms and is more compatible with the classification and grading system of data. The modernity of the intellectual property rights system allows it to adapt to the needs of big data technology development through special clauses or industry guidelines, thus promoting the comprehensive advancement of data intellectual property legislation. This paper analyzes data protection under the virtual property layer theory and a two-fold virtual property rights system. Based on the 'bundle of rights' theory, this paper establishes a specific three-level structure of data rights. This paper analyzes the cases Google v. Vidal-Hall, Halliday v. Creation Consumer Finance, Douglas v. Hello! Ltd, Campbell v. MGN, and Imerman v. Tchenguiz. This paper concludes that recognizing property rights over personal data and protecting data under the framework of intellectual property will be beneficial for establishing the tort of misuse of personal information. Keywords: data protection, property rights, intellectual property, big data
Procedia PDF Downloads 40
34336 Validating Condition-Based Maintenance Algorithms through Simulation
Authors: Marcel Chevalier, Léo Dupont, Sylvain Marié, Frédérique Roffet, Elena Stolyarova, William Templier, Costin Vasile
Abstract:
Industrial end-users are currently facing an increasing need to reduce the risk of unexpected failures and to optimize their maintenance. This calls for both short-term analysis and long-term ageing anticipation. At Schneider Electric, we tackle those two issues using both machine learning and first-principles models. Machine learning models are incrementally trained from normal data to predict expected values and detect statistically significant short-term deviations. Ageing models are constructed by breaking down physical systems into sub-assemblies, then determining the relevant degradation modes and associating each one with the right kinetic law. Validating such anomaly detection and maintenance models is challenging, both because actual incident and ageing data are rare and distorted by human interventions, and because incremental learning depends on human feedback. To overcome these difficulties, we propose to simulate the physics, the systems, and the humans (including asset maintenance operations) in order to validate the overall approaches in accelerated time and possibly choose between algorithmic alternatives. Keywords: degradation models, ageing, anomaly detection, soft sensor, incremental learning
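The short-term deviation check described above can be sketched with a model that is incrementally trained on normal data, here a running mean and variance maintained with Welford's algorithm, flagging observations that deviate by more than a z-score threshold. The threshold and class names are illustrative assumptions, not Schneider Electric's models.

```python
# Hedged sketch of incremental training on normal data plus detection of
# statistically significant short-term deviations. Welford's online
# mean/variance stands in for the actual machine learning models.

class DriftDetector:
    def __init__(self, z_threshold=3.0):
        self.n, self.mean, self.m2 = 0, 0.0, 0.0
        self.z_threshold = z_threshold

    def update(self, x):
        """Incorporate one normal observation (Welford's online update)."""
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (x - self.mean)

    def is_anomalous(self, x):
        """True if x deviates more than z_threshold std-devs from normal."""
        if self.n < 2:
            return False
        std = (self.m2 / (self.n - 1)) ** 0.5
        return std > 0 and abs(x - self.mean) > self.z_threshold * std
```

A simulated asset, as proposed in the abstract, makes it cheap to verify that such a detector fires on injected faults and stays quiet on normal operation, in accelerated time.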
Procedia PDF Downloads 126
34335 To Investigate a Discharge Planning Connect with Long Term Care 2.0 Program in a Medical Center in Taiwan
Authors: Chan Hui-Ya, Ding Shin-Tan
Abstract:
Background and Aim: Discharge planning is considered helpful for reducing the hospital length of stay and the readmission rate, thereby increasing satisfaction with healthcare for patients and professionals. In order to decrease the waiting time for long-term care and boost the care quality of patients after discharge from the hospital, the Ministry of Health and Welfare in Taiwan initiated the program 'discharge planning connects with long-term care 2.0 services' in 2017. The purpose of this study is to investigate the outcome of the pilot of this program in a medical center. Methods: By purposive sampling, the study chose five wards in a medical center as pilot units. The researchers compared the beds of service, the number of cases transferred to the long-term care center, and the monthly transfer rates between the pilot units and the other units, and analyzed the basic data, the long-term care service needs, and the approved service items of cases transferred to the long-term care center in the pilot units. Results: From June to September 2017, a total of 92 referrals were made, and 51 patients were enrolled into the pilot program. There was a significant difference in transfer rate between the pilot units and the other units (χ² = 702.6683, p < 0.001). Only 20 cases (39.2% success rate) were approved for some of the long-term care service items in the pilot units. The most commonly approved item was respite care service (n = 13; 65%), although it ranked third on the needs ranking of the service lists during the service-linking process. Among the patients who cancelled their requests, 38.71% of the reasons were related to services that could not match the patients' needs and expectations. Conclusion: The results indicate a need to modify the long-term care services to fit the needs of the cases.
The researchers suggest estimating potential cases by screening data from hospital informatics systems and hiring more case managers according to the service time of potential cases. Meanwhile, strategies such as shortening the assessment scale and authorizing hospital case managers to approve some items of long-term care should be considered. Keywords: discharge planning, long-term care, case manager, patient care
Procedia PDF Downloads 286
34334 Effect of Quenching Medium on the Hardness of Dual Phase Steel Heat Treated at a High Temperature
Authors: Tebogo Mabotsa, Tamba Jamiru, David Ibrahim
Abstract:
Dual phase (DP) steel consists essentially of fine-grained equiaxed ferrite and a dispersion of martensite. Martensite is the primary precipitate in DP steels; it provides the main resistance to dislocation motion within the material. The objective of this paper is to present a relation between the intercritical annealing holding time and the hardness of a dual phase steel. The initial heat treatment involved heating the specimens to 1000°C and holding them at that temperature for 30 minutes. After the initial heat treatment, the samples were heated to 770°C and held at constant temperature for varying amounts of time: 30, 60, and 90 minutes, respectively. After heating and holding the samples in the austenite-ferrite phase field, the samples were quenched in water, brine, and oil for each holding time. The experimental results showed that an equation for predicting the hardness of a dual phase steel as a function of the intercritical holding time is possible. The relation between intercritical annealing holding time and the hardness of a dual phase steel heat treated at high temperatures is parabolic in nature. Theoretically, the model is dependent on the cooling rate, because the model differs for each quenching medium; therefore, a universal hardness equation can be derived in which the cooling rate is a variable factor. Keywords: quenching medium, annealing temperature, dual phase steel, martensite
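With three holding times per quenching medium, the parabolic relation H(t) = a·t² + b·t + c can be fitted exactly per medium. The sketch below does this with Cramer's rule; the data points are illustrative assumptions, not the paper's measurements.

```python
# Hedged sketch of fitting the parabolic hardness-vs-holding-time relation
# H(t) = a*t^2 + b*t + c for one quenching medium, from three (t, H) pairs.
# The example data in the test are illustrative assumptions.

def fit_parabola(t, h):
    """Fit h = a*t^2 + b*t + c through three (t, h) points via Cramer's rule."""
    (t1, t2, t3), (h1, h2, h3) = t, h

    def det3(m):
        return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
                - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
                + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

    A = [[t1 * t1, t1, 1], [t2 * t2, t2, 1], [t3 * t3, t3, 1]]
    d = det3(A)  # non-zero for three distinct holding times
    coeffs = []
    for j in range(3):
        m = [row[:] for row in A]
        for i, hv in enumerate((h1, h2, h3)):
            m[i][j] = hv  # replace column j with the hardness vector
        coeffs.append(det3(m) / d)
    return tuple(coeffs)  # (a, b, c)
```

Fitting one (a, b, c) triple per quenching medium, and then treating the cooling rate as a variable, is the route toward the universal hardness equation mentioned above.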
Procedia PDF Downloads 82
34333 Computational Fluid Dynamics Simulation of Reservoir for Dwell Time Prediction
Authors: Nitin Dewangan, Nitin Kattula, Megha Anawat
Abstract:
The hydraulic reservoir is a key component in mobile construction vehicles; most off-road earth-moving construction machinery requires large side-mounted hydraulic reservoirs. Reservoir construction is often highly non-uniform, as designers use such shapes to utilize the space available under the vehicle. There is no way to determine the space utilization of the reservoir by oil, or the validity of the design, except through virtual simulation. Computational fluid dynamics (CFD) helps to predict the reservoir space utilization through vortex mapping, path-line plots, and dwell time prediction, to make sure the design is valid and efficient for the vehicle. The dwell time acceptance criterion for an effective reservoir design is 15 seconds. This paper describes a hydraulic reservoir simulation carried out with the CFD tool AcuSolve using an automated mesh strategy. Free-surface flow and a moving reference mesh are used to define the oil flow level inside the reservoir. The first baseline design was not able to meet the acceptance criterion, i.e., its dwell time was below 15 seconds, because the oil entry and exit ports were very close together. CFD was used to redefine the port locations so that the oil dwell time in the reservoir increases, and the analysis also proposed a baffle design for effective space utilization. The final design proposed through CFD analysis was used for physical validation on the machine. Keywords: reservoir, turbulence model, transient model, level set, free-surface flow, moving frame of reference
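A useful sanity check before running CFD is the mean residence time, volume over flow rate, against the 15-second criterion. The values below are illustrative assumptions; only CFD resolves the geometry-dependent dwell time distribution, short-circuiting, and vortices that this bulk average hides.

```python
# Hedged back-of-envelope check behind the 15-second acceptance criterion:
# mean dwell (residence) time = usable oil volume / volumetric flow rate.
# Example values are assumptions; CFD gives the real, local dwell times.

def mean_dwell_time(oil_volume_l, flow_lpm):
    """Mean residence time in seconds, from volume (litres) and flow (L/min)."""
    return oil_volume_l / flow_lpm * 60.0

def meets_criterion(dwell_s, threshold_s=15.0):
    """Acceptance test: dwell time must be at least the threshold."""
    return dwell_s >= threshold_s
```

A design whose bulk average already fails this check cannot pass; one that passes may still fail in CFD if closely spaced ports short-circuit the flow, which is exactly what happened to the baseline design.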
Procedia PDF Downloads 152
34332 The Influence of Housing Choice Vouchers on the Private Rental Market
Authors: Randy D. Colon
Abstract:
Through a freedom of information request, data pertaining to Housing Choice Voucher (HCV) households were obtained from the Chicago Housing Authority, including rent price and number of bedrooms per HCV household, community area, and zip code from 2013 to the first quarter of 2018. Similar data pertaining to the private rental market will be obtained through public records available from the United States Department of Housing and Urban Development. The datasets will be analyzed through statistical and mapping software to investigate the potential link between HCV households and distorted rent prices. Quantitative data will be supplemented by qualitative data to investigate the lived experience of Chicago residents. Qualitative data will be collected at community meetings in Chicago's Englewood neighborhood through participation in neighborhood meetings and informal interviews with residents and community leaders. The qualitative data will be used to gain insight into the lived experience of community leaders and residents of the Englewood neighborhood in relation to housing, the rental market, and the HCV program. While there is an abundance of quantitative data on this subject, qualitative data are necessary to capture the lived experience of local residents affected by a changing rental market. This topic reflects concerns voiced by members of the Englewood community, and this study aims to keep the community relevant in its findings. Keywords: Chicago, housing, housing choice voucher program, housing subsidies, rental market
Procedia PDF Downloads 118
34331 The Dynamic Metadata Schema in Neutron and Photon Communities: A Case Study of X-Ray Photon Correlation Spectroscopy
Authors: Amir Tosson, Mohammad Reza, Christian Gutt
Abstract:
Metadata stands at the forefront of advancing data management practices within research communities, with particular significance in the realms of neutron and photon scattering. This paper introduces a new approach, the dynamic metadata schema, within the context of X-ray Photon Correlation Spectroscopy (XPCS). XPCS, a potent technique for unravelling nanoscale dynamic processes, serves as an illustrative use case to demonstrate how dynamic metadata can revolutionize data acquisition, sharing, and analysis workflows. The paper explores the challenges encountered by the neutron and photon communities in navigating intricate data landscapes and highlights the power of dynamic metadata in addressing these hurdles. Our proposed approach empowers researchers to tailor metadata definitions to the evolving demands of experiments, thereby facilitating streamlined data integration, traceability, and collaborative exploration. Through tangible examples from the XPCS domain, we show how embracing dynamic metadata standards enhances data reproducibility, interoperability, and the diffusion of knowledge. Ultimately, the paper underscores the transformative potential of dynamic metadata, heralding a paradigm shift in data management within the neutron and photon research communities. Keywords: metadata, FAIR, data analysis, XPCS, IoT
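The core idea, metadata definitions that can be extended to match an evolving experiment, can be illustrated with a minimal schema mechanism. The field names below are illustrative assumptions, not a community standard or the paper's actual schema.

```python
# Hedged sketch of a "dynamic" metadata schema: a base schema that an
# experiment (here, an assumed XPCS setup) extends at run time with its
# own required fields. Field names are illustrative assumptions.

BASE_SCHEMA = {"sample_id": str, "timestamp": str, "instrument": str}

def extend_schema(base, extra_fields):
    """Return a new schema with experiment-specific fields added."""
    schema = dict(base)
    schema.update(extra_fields)
    return schema

def validate(record, schema):
    """Check that every schema field is present with the declared type."""
    return all(k in record and isinstance(record[k], t)
               for k, t in schema.items())
```

Because validation is driven by the schema object rather than hard-coded, downstream tools can ingest records from different experiments while still enforcing each experiment's own required fields, which is what enables the traceability and interoperability claimed above.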
Procedia PDF Downloads 62
34330 Evaluation of Fluidized Bed Bioreactor Process for Mmabatho Waste Water Treatment Plant
Authors: Shohreh Azizi, Wag Nel
Abstract:
The rapid population growth in South Africa has increased the requirement for wastewater treatment facilities. The aim of this study is to assess the potential use of a fluidized bed bioreactor for the Mmabatho sewage treatment plant. Samples were collected daily from the inlet and outlet of the reactor to analyse pH, chemical oxygen demand (COD), biochemical oxygen demand (BOD) and total suspended solids (TSS) as per standard method APHA 2005. The studies were undertaken on a continuous laboratory scale, and analytical data were collected before and after treatment. Reductions of 87.22% COD and 89.80% BOD were achieved. The fluidized bed bioreactor accomplishes BOD/COD removal as well as nutrient removal. Efforts were also made to study the impact on the biological system if the domestic wastewater becomes contaminated with industrial effluent, and the results show that the biological system can tolerate high total dissolved solids up to 6000 mg/L as well as high heavy metal concentrations up to 4 mg/L. The data obtained through the experimental research demonstrate that the FBBR may be used (<3 h total hydraulic retention time) for secondary treatment in the Mmabatho wastewater treatment plant.Keywords: fluidized bed bioreactor, wastewater treatment plant, biological system, high TDS, heavy metal
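The reported reductions are simple inlet-to-outlet removal efficiencies. The sketch below shows the calculation; the inlet and outlet concentrations are illustrative values chosen only to reproduce the reported percentages, since the abstract does not give the measured concentrations.

```python
def removal_efficiency(inlet, outlet):
    """Percent reduction between reactor inlet and outlet concentrations (mg/L)."""
    return 100.0 * (inlet - outlet) / inlet

# Illustrative inlet/outlet values chosen to reproduce the reported reductions;
# the actual measured concentrations are not given in the abstract.
cod = removal_efficiency(450.0, 57.5)   # ~87.2 % COD reduction
bod = removal_efficiency(250.0, 25.5)   # ~89.8 % BOD reduction
print(round(cod, 1), round(bod, 1))
```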
Procedia PDF Downloads 167
34329 Minimizing Unscheduled Maintenance from an Aircraft and Rolling Stock Maintenance Perspective: Preventive Maintenance Model
Authors: Adel A. Ghobbar, Varun Raman
Abstract:
Corrective maintenance of components and systems is a problem plaguing almost every industry in the world today. The train operator and the maintenance, repair and overhaul subsidiary of the Dutch railway company are also facing this problem. A considerable portion of the maintenance activities carried out by the company are unscheduled. This, in turn, severely stresses and stretches the workforce and resources available. One possible solution is to have a robust preventive maintenance plan. The other possible solution is to plan maintenance based on real-time data obtained from sensor-based ‘Health and Usage Monitoring Systems.’ The former has been investigated in this paper. The preventive maintenance model developed for the train operator will subsequently be extended to tackle the unscheduled maintenance problem also affecting the aerospace industry. The extension of the model to the aerospace sector will be dealt with in the second part of the research, and it would, in turn, validate the soundness of the model developed. Thus, there are distinct areas that will be addressed in this paper, including the mathematical modelling of preventive maintenance and optimization based on cost and system availability. The results of this research will help an organization choose the right maintenance strategy, allowing it to save considerable sums of money as opposed to overspending under the guise of maintaining high asset availability. The concept of delay time modelling was used to address the practical problem of unscheduled maintenance in this paper. Delay time modelling can be used to help with support planning for a given asset. The model was run using MATLAB, and the results show that the ideal inspection interval computed using the model was 29 days from a minimal-cost perspective and 14 days from a minimum-downtime perspective. 
A risk matrix was constructed to represent the risk in terms of the probability of a fault leading to breakdown maintenance and its consequences in terms of maintenance cost. Thus, the choice of an optimal inspection interval of 29 days resulted in a cost of approximately 50 Euros, and the corresponding value of b(T) was 0.011. These values ensure that the risk associated with component X being maintained at an inspection interval of 29 days is more than acceptable. Thus, a switch in maintenance frequency from 90 days to 29 days would be optimal from the point of view of cost, downtime and risk.Keywords: delay time modelling, unscheduled maintenance, reliability, maintainability, availability
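The delay-time optimization described above can be sketched numerically: scan candidate inspection intervals T and pick the one minimizing the expected cost per day, where b(T) is the probability that a defect arising within the interval leads to breakdown before the next inspection. The model form and all parameter values below are illustrative assumptions, not the ones used in the study.

```python
import math

# Hedged sketch of a delay-time inspection model: defects arise at rate lam
# per day, the delay time (defect -> failure) is exponential with mean mu
# days, and the asset is inspected every T days. Parameter values are
# illustrative assumptions only.
def breakdown_prob(T, mu):
    """b(T): probability a defect arising in (0, T] fails before inspection,
    for defect arrival uniform on (0, T] and exponential delay time."""
    return 1.0 - (mu / T) * (1.0 - math.exp(-T / mu))

def cost_per_day(T, lam, mu, c_insp, c_fail, c_repair):
    """Expected cost per day: inspection cost plus breakdown/repair costs."""
    b = breakdown_prob(T, mu)
    n_defects = lam * T
    return (c_insp + n_defects * (b * c_fail + (1 - b) * c_repair)) / T

lam, mu = 0.05, 40.0
c_insp, c_fail, c_repair = 30.0, 500.0, 60.0
best_T = min(range(1, 121),
             key=lambda T: cost_per_day(T, lam, mu, c_insp, c_fail, c_repair))
print(best_T, round(cost_per_day(best_T, lam, mu, c_insp, c_fail, c_repair), 2))
```

With real failure data the same scan yields separate optima for cost and for downtime, which is how intervals such as the study's 29 and 14 days arise.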
Procedia PDF Downloads 132
34328 Web and Android-Based Applications as a Breakthrough in Preventing Non-System Fault Disturbances Due to Work Errors in the Transmission Unit
Authors: Dhany Irvandy, Ary Gemayel, Mohammad Azhar, Leidenti Dwijayanti, Iif Hafifah
Abstract:
Work safety is among the most important things in work execution. Unsafe conditions and actions are priorities in accident prevention in the world of work, especially in the operation and maintenance of electric power transmission. Considering the scope of work, operational work in transmission carries a very high safety risk. Various efforts have been made to avoid work accidents. However, accidents or disturbances caused by non-conformities in work implementation still often occur. These can be caused by unsafe conditions or actions. Along with the development of technology, website-based and mobile applications have been widely used as a medium to monitor work in real time and by more people. This paper explains the use of web and Android-based applications to monitor work and work processes in the field to prevent work accidents or non-system fault disturbances caused by non-conformity of work implementation with predetermined work instructions. Because every job is monitored in real time, recorded, and documented systematically, this application can reduce the occurrence of possible unsafe actions carried out by job executors that can cause disruptions or work accidents.Keywords: work safety, unsafe action, application, non-system fault, real-time
Procedia PDF Downloads 44
34327 Exploring SSD Suitable Allocation Schemes Incompliance with Workload Patterns
Authors: Jae Young Park, Hwansu Jung, Jong Tae Kim
Abstract:
Whether the data has been well parallelized is an important factor in solid-state drive (SSD) performance. SSD parallelization is affected by the allocation scheme, which is directly connected to SSD performance. Representative allocation schemes include dynamic allocation and static allocation. Dynamic allocation is more adaptive in exploiting write-operation parallelism, while static allocation is better for read-operation parallelism. Therefore, it is hard to select the appropriate allocation scheme when the workload mixes read and write operations. We simulated conditions on a few mixed data patterns and analyzed the results to help make the right choice for better performance. The results show that if the data arrival interval is long enough for prior operations to finish, and in continuous read-intensive data environments, static allocation is more suitable. Dynamic allocation performs best on write performance and random data patterns.Keywords: dynamic allocation, NAND flash based SSD, SSD parallelism, static allocation
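The contrast between the two schemes can be reduced to two routing rules: static allocation pins a logical page to a channel (predictable placement, good read parallelism), while dynamic allocation routes each write to the currently least-busy channel (good write parallelism). The following is a toy Python sketch under assumed channel counts and loads, not the simulator used in the study.

```python
# Toy sketch contrasting the two allocation schemes the abstract compares.
# Channel count and per-channel loads are illustrative assumptions.

N_CHANNELS = 4

def static_channel(lpn):
    """Static: the channel is a fixed function of the logical page number."""
    return lpn % N_CHANNELS

def dynamic_channel(busy):
    """Dynamic: route the write to the currently least-busy channel."""
    return min(range(N_CHANNELS), key=lambda c: busy[c])

busy = [3, 0, 5, 1]          # outstanding writes per channel
print(static_channel(10))    # 2: always channel lpn % 4, regardless of load
print(dynamic_channel(busy)) # 1: the least-loaded channel right now
```

The static rule lets a later read locate the page's channel deterministically, while the dynamic rule balances the write load; this is exactly the trade-off that makes mixed workloads hard to serve with one scheme.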
Procedia PDF Downloads 340
34326 Screening for Women with Chorioamnionitis: An Integrative Literature Review
Authors: Allison Herlene Du Plessis, Dalena (R.M.) Van Rooyen, Wilma Ten Ham-Baloyi, Sihaam Jardien-Baboo
Abstract:
Introduction: Women die in pregnancy and childbirth for five main reasons: severe bleeding, infections, unsafe abortions, hypertensive disorders (pre-eclampsia and eclampsia), and medical complications including cardiac disease, diabetes, or HIV/AIDS complicated by pregnancy. In 2015, the WHO classified sepsis as the third highest cause of maternal mortality in the world. Chorioamnionitis is a clinical syndrome of intrauterine infection during any stage of the pregnancy; it refers to bacteria ascending from the vaginal canal into the uterus, causing infection. While the incidence rates for chorioamnionitis are not well documented, complications related to chorioamnionitis are well documented, and midwives still struggle to identify this condition in time due to its complex nature. Few diagnostic methods are available in public health services, due to escalating laboratory costs. Often the affordable biomarkers, such as C-reactive protein (CRP), full blood count (FBC) and white blood cell count (WBC), have low significance in diagnosing chorioamnionitis. A lack of screening impacts on effective and timeous management of chorioamnionitis, and early identification and management of risks could help to prevent neonatal complications and reduce the subsequent series of morbidities and healthcare costs of infants affected by perinatal infections. Objective: This integrative literature review provides an overview of current best research evidence on the screening of women at risk for chorioamnionitis. Design: An integrative literature review was conducted using a systematic electronic literature search through EBSCOhost, Cochrane Online, Wiley Online, PubMed, Scopus and Google. Guidelines, research studies, and reports in English related to chorioamnionitis from 2008 up until 2020 were included in the study. Findings: After critical appraisal, 31 articles were included. 
More than two-thirds (67%) of the included literature ranked at the three highest levels of evidence (Levels I, II and III). Data extracted regarding screening for chorioamnionitis were synthesized into four themes, namely: screening by clinical signs and symptoms, screening by causative factors of chorioamnionitis, screening of obstetric history, and essential biomarkers to diagnose chorioamnionitis. Key conclusions: There are factors that can be used by midwives to identify women at risk for chorioamnionitis. However, there is a paucity of established sociological, epidemiological and behavioral factors to screen this population. Several biomarkers are available to diagnose chorioamnionitis. Increased interleukin-6 in amniotic fluid is the better indicator and strongest predictor of histological chorioamnionitis, whereas the available rapid matrix-metalloproteinase-8 test requires further testing. Maternal white blood cell count (WBC) has shown poor selectivity and sensitivity, and C-reactive protein (CRP) thresholds varied among studies and are not ideal for conclusive diagnosis of subclinical chorioamnionitis. Implications for practice: Screening of women at risk for chorioamnionitis by health care providers caring for pregnant women, including midwives, is important for diagnosis and management before complications arise, particularly in resource-constrained settings.Keywords: chorioamnionitis, guidelines, best evidence, screening, diagnosis, pregnant women
Procedia PDF Downloads 123
34325 An Analytical Formulation of Pure Shear Boundary Condition for Assessing the Response of Some Typical Sites in Mumbai
Authors: Raj Banerjee, Aniruddha Sengupta
Abstract:
An earthquake event, associated with a typical fault rupture, initiates at the source, propagates through a rock or soil medium and finally daylights at a surface which might be a populous city. The detrimental effects of an earthquake are often quantified in terms of the responses of superstructures resting on the soil. Hence, there is a need for the estimation of the amplification of bedrock motions due to the influence of local site conditions. In the present study, field borehole log data of the Mangalwadi and Walkeswar sites in Mumbai city are considered. The data consist of the variation of SPT N-value with the depth of soil. A correlation between shear wave velocity (Vₛ) and SPT N-value for various soil profiles of Mumbai city has been developed using various existing correlations, which is used further for site response analysis. A MATLAB program is developed for studying the ground response by performing two-dimensional linear and equivalent-linear analyses for some of the typical Mumbai soil sites using the pure shear (Multi Point Constraint) boundary condition. The model is validated in the linear elastic and equivalent-linear domains using the popular commercial program DEEPSOIL. Three actual earthquake motions are selected based on their frequency contents and durations and scaled to a PGA of 0.16g for the present ground response analyses. The results are presented in terms of the peak acceleration time history with depth, the peak shear strain time history with depth, the Fourier amplitude versus frequency, the response spectrum at the surface, etc. The peak ground acceleration amplification factors are found to be about 2.374, 3.239 and 2.4245 for the Mangalwadi site and 3.42, 3.39, 3.83 for the Walkeswar site using the 1979 Imperial Valley Earthquake, 1989 Loma Gilroy Earthquake and 1987 Whittier Narrows Earthquake, respectively. 
In the absence of any site-specific response spectrum for the chosen sites in Mumbai, the generated spectrum at the surface may be utilized for the design of any superstructure at these locations.Keywords: deepsoil, ground response analysis, multi point constraint, response spectrum
Procedia PDF Downloads 180
34324 Flip-Chip Bonding for Monolithic of Matrix-Addressable GaN-Based Micro-Light-Emitting Diodes Array
Authors: Chien-Ju Chen, Chia-Jui Yu, Jyun-Hao Liao, Chia-Ching Wu, Meng-Chyi Wu
Abstract:
A 64 × 64 GaN-based micro-light-emitting diode array (μLEDA) with a 20 μm pixel size and a 40 μm pitch, fabricated by flip-chip bonding (FCB), is demonstrated in this study. In addition, an underfill (UF) technology is applied in the process to improve device uniformity. With this configuration, good characteristics are obtained: the operating voltage and series resistance of a pixel in the 450 nm flip-chip μLEDA are 2.89 V and 1077 Ω (4.3 mΩ-cm²) at 25 A/cm², respectively. The μLEDA can sustain a higher current density than a conventional LED, and the output power of the device is 9.5 μW at 100 μA and 0.42 mW at 20 mA.Keywords: GaN, micro-light-emitting diode array (μLEDA), flip-chip bonding, underfilling
Procedia PDF Downloads 423
34323 Synthesis and Anticholinesterase Activity of Carvacrol Derivatives
Authors: Fatih Sonmez
Abstract:
Alzheimer’s disease (AD) is a progressive neurodegenerative disease and the most common form of dementia affecting elderly people. Acetylcholinesterase is a hydrolase involved in the termination of impulse transmission at cholinergic synapses by rapid hydrolysis of the neurotransmitter acetylcholine (ACh) in the central and peripheral nervous systems. Carvacrol (5-iso-propyl-2-methyl-phenol) is a main bioactive monoterpene isolated from many medicinal herbs, such as Thymus vulgaris, Monarda punctata and Origanum vulgare spp. Carvacrol has been widely used as an active anti-inflammatory ingredient, which can inhibit isoproterenol-induced inflammation in myocardial infarcted rats. In this paper, a series of 12 carvacrol-substituted carbamate derivatives (2a-l) was synthesized and their inhibitory activities against AChE and BuChE were evaluated. Among them, 2d exhibited the strongest inhibition of AChE, with an IC50 value of 2.22 µM, 130-fold more potent than carvacrol (IC50 = 288.26 µM).Keywords: acetylcholinesterase, butyrylcholinesterase, carbamate, carvacrol
Procedia PDF Downloads 353
34322 Statistical Model of Water Quality in Estero El Macho, Machala-El Oro
Authors: Rafael Zhindon Almeida
Abstract:
Surface water quality is an important concern for the evaluation and prediction of water quality conditions. The objective of this study is to develop a statistical model that can accurately predict the water quality of the El Macho estuary in the city of Machala, El Oro province. The methodology employed in this study is of a basic type that involves a thorough search for theoretical foundations to improve the understanding of statistical modeling for water quality analysis. The research design is correlational, using a multivariate statistical model involving multiple linear regression and principal component analysis. The results indicate that water quality parameters such as fecal coliforms, biochemical oxygen demand, chemical oxygen demand, iron and dissolved oxygen exceed the allowable limits. The water of the El Macho estuary is determined to be below the required water quality criteria. The multiple linear regression model, based on chemical oxygen demand and total dissolved solids, explains 99.9% of the variance of the dependent variable. In addition, principal component analysis shows that the model has an explanatory power of 86.242%. The study successfully developed a statistical model to evaluate the water quality of the El Macho estuary. The estuary did not meet the water quality criteria, with several parameters exceeding the allowable limits. The multiple linear regression model and principal component analysis provide valuable information on the relationship between the various water quality parameters. The findings of the study emphasize the need for immediate action to improve the water quality of the El Macho estuary to ensure the preservation and protection of this valuable natural resource.Keywords: statistical modeling, water quality, multiple linear regression, principal components, statistical models
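As a sketch of the regression step described above, the following fits y = b0 + b1·COD + b2·TDS by solving the normal equations directly. The data are illustrative numbers constructed to lie exactly on a plane, not the El Macho measurements; a real fit would also report R² and the principal-component loadings.

```python
# Minimal sketch of the multiple linear regression step: predict a water
# quality response from chemical oxygen demand (COD) and total dissolved
# solids (TDS). The data below are illustrative, not the study's measurements.

def fit_mlr(x1, x2, y):
    """Least-squares fit of y = b0 + b1*x1 + b2*x2 via the normal equations."""
    n = len(y)
    # Build X^T X and X^T y for the design matrix with columns [1, x1, x2].
    cols = [[1.0] * n, x1, x2]
    A = [[sum(a * b for a, b in zip(ci, cj)) for cj in cols] for ci in cols]
    v = [sum(c * yi for c, yi in zip(ci, y)) for ci in cols]
    # Gauss-Jordan elimination on the 3x3 system.
    for i in range(3):
        p = A[i][i]
        A[i] = [a / p for a in A[i]]
        v[i] /= p
        for j in range(3):
            if j != i:
                f = A[j][i]
                A[j] = [a - f * b for a, b in zip(A[j], A[i])]
                v[j] -= f * v[i]
    return v  # [b0, b1, b2]

cod = [120.0, 150.0, 180.0, 210.0, 240.0]
tds = [900.0, 950.0, 1100.0, 1200.0, 1300.0]
wqi = [0.5 + 0.02 * c + 0.001 * t for c, t in zip(cod, tds)]  # exactly linear
b0, b1, b2 = fit_mlr(cod, tds, wqi)
print(round(b0, 3), round(b1, 4), round(b2, 4))
```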
Procedia PDF Downloads 98
34321 Analyzing the Factors that Cause Parallel Performance Degradation in Parallel Graph-Based Computations Using Graph500
Authors: Mustafa Elfituri, Jonathan Cook
Abstract:
Recently, graph-based computations have become more important in large-scale scientific computing, as they provide a methodology for modeling many types of relations between independent objects. They are being actively used in fields as varied as biology, social networks, cybersecurity, and computer networks. At the same time, graph problems have properties such as irregularity and poor locality that make their performance different from that of regular applications. Therefore, parallelizing graph algorithms is a hard and challenging task. Initial evidence is that standard computer architectures do not perform very well on graph algorithms, and little is known about exactly what causes this. The Graph500 benchmark is a representative application for parallel graph-based computations, which have highly irregular data access and are driven more by traversing connected data than by computation. In this paper, we present results from analyzing the performance of various example implementations of Graph500, including a shared memory (OpenMP) version, a distributed (MPI) version, and a hybrid version. We measured and analyzed all the factors that affect its performance in order to identify possible changes that would improve it. Results are discussed in relation to what factors contribute to performance degradation.Keywords: graph computation, graph500 benchmark, parallel architectures, parallel programming, workload characterization
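The traversal-driven kernel at the heart of Graph500 is breadth-first search, which produces a parent tree that the benchmark then validates. A minimal serial Python sketch is shown below for illustration; the benchmark's OpenMP/MPI variants parallelize this by partitioning the frontier or the vertex set, and the irregular, data-dependent memory accesses in the inner loop are exactly what causes the poor locality discussed above.

```python
from collections import deque

# Serial sketch of the BFS kernel at the heart of Graph500: work is dominated
# by following edges, so memory access is irregular and data-driven rather
# than compute-bound. The tiny graph below is illustrative only.
def bfs_parent_tree(adj, root):
    """Return the BFS parent map, as Graph500 validation requires."""
    parent = {root: root}
    frontier = deque([root])
    while frontier:
        u = frontier.popleft()
        for v in adj[u]:          # irregular, data-dependent accesses
            if v not in parent:
                parent[v] = u
                frontier.append(v)
    return parent

adj = {0: [1, 2], 1: [0, 3], 2: [0, 3], 3: [1, 2]}
print(bfs_parent_tree(adj, 0))  # {0: 0, 1: 0, 2: 0, 3: 1}
```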
Procedia PDF Downloads 147
34320 Social Data Aggregator and Locator of Knowledge (STALK)
Authors: Rashmi Raghunandan, Sanjana Shankar, Rakshitha K. Bhat
Abstract:
Social media contributes a vast amount of data and information about individuals to the internet. This project will greatly reduce the need for manual analysis of large and diverse social media profiles by filtering out and combining the useful information from various profiles and eliminating irrelevant data. It differs from existing social media aggregators in that it does not provide a consolidated view of various profiles; instead, it provides consolidated INFORMATION derived from the subject’s posts and other activities. It also allows analysis over multiple profiles and analytics based on several profiles. We strive to provide a query system that gives a natural language answer to questions when a user does not wish to go through the entire profile. The information provided can be filtered according to the different use cases it is used for.Keywords: social network, analysis, Facebook, LinkedIn, git, big data
Procedia PDF Downloads 444
34319 A Xenon Mass Gauging through Heat Transfer Modeling for Electric Propulsion Thrusters
Authors: A. Soria-Salinas, M.-P. Zorzano, J. Martín-Torres, J. Sánchez-García-Casarrubios, J.-L. Pérez-Díaz, A. Vakkada-Ramachandran
Abstract:
The current state-of-the-art methods of mass gauging of Electric Propulsion (EP) propellants in microgravity conditions rely on external measurements that are taken at the surface of the tank. The tanks are operated under a constant thermal duty cycle to store the propellant within a pre-defined temperature and pressure range. We demonstrate using computational fluid dynamics (CFD) simulations that the heat transfer within the pressurized propellant generates temperature and density anisotropies. This challenges the standard mass gauging methods that rely on the use of time-changing skin temperatures and pressures. We observe that the domes of the tanks are prone to be overheated, and that a long time after the heaters of the thermal cycle are switched off, the system reaches a quasi-equilibrium state with a more uniform density. We propose a new gauging method, which we call the Improved PVT method, based on universal physics and thermodynamics principles, existing TRL-9 technology and telemetry data. This method only uses as inputs the temperature and pressure readings of sensors externally attached to the tank. These sensors can operate during the nominal thermal duty cycle. The Improved PVT method shows little sensitivity to the pressure sensor drifts, which are critical towards the end of life of the missions, as well as little sensitivity to systematic temperature errors. The retrieval method has been validated experimentally with CO2 in gas and fluid state in a chamber that operates up to 82 bar within a nominal thermal cycle of 38 °C to 42 °C. The mass gauging error is shown to be lower than 1% of the mass at the beginning of life, assuming an initial tank load at 100 bar. In particular, for a pressure of about 70 bar, just below the critical pressure of CO2, the error of the mass gauging in the gas phase goes down to 0.1%, and for 77 bar, just above the critical point, the error of the mass gauging of the liquid phase is 0.6% of the initial tank load. 
This gauging method improves by a factor of 8 the accuracy of the standard PVT retrievals using look-up tables with tabulated data from the National Institute of Standards and Technology.Keywords: electric propulsion, mass gauging, propellant, PVT, xenon
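The underlying PVT principle can be sketched with the real-gas law m = PVM/(ZRT). The tank volume, fill pressure and compressibility factor below are illustrative assumptions, and this sketch does not reproduce the paper's actual improvement, which lies in how the temperature and pressure telemetry are sampled over the thermal duty cycle.

```python
# Sketch of the PVT mass-gauging principle: with tank volume V and external
# pressure/temperature telemetry, the stored mass follows from the real-gas
# law m = P*V*M / (Z*R*T). The compressibility factor Z and the tank
# parameters below are illustrative assumptions, not values from the paper.

R = 8.314462618          # universal gas constant, J/(mol K)
M_XENON = 0.131293       # molar mass of xenon, kg/mol

def pvt_mass(p_pa, v_m3, t_k, z):
    """Propellant mass from pressure, volume, temperature and compressibility."""
    return p_pa * v_m3 * M_XENON / (z * R * t_k)

# Assumed 100-bar xenon load in a 50 L tank at 40 degC, with an assumed Z.
m = pvt_mass(100e5, 0.05, 313.15, z=0.63)
print(round(m, 2), "kg")
```

The sensitivity of the retrieval to sensor errors follows directly from this formula: a pressure drift enters the mass estimate linearly, while a temperature error enters inversely, which is why the paper's handling of the duty-cycle telemetry matters.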
Procedia PDF Downloads 345
34318 A Randomised Controlled Trial on the Nurse-Led Smartphone-Based Self-Management Programme for Type 2 Diabetes Patients with Poor Glycemic Control
Authors: Wenru Wang
Abstract:
Over the past decades, Asia has emerged as the ‘diabetes epicentre’ of the world due to rapid economic development, urbanization and nutrition transition. There is an urgent need to develop more effective and cost-effective care management strategies in response to this rising diabetes epidemic. This study aims to develop a nurse-led smartphone-based self-management programme and compare it with an existing nurse-led diabetes service on health-related outcomes among type 2 diabetes patients with poor glycemic control in Singapore. We propose a randomized controlled trial with a pre- and repeated post-test control group design. A total of 128 type 2 diabetes patients with poor glycemic control will be recruited from the diabetes clinic of an acute public hospital in Singapore through convenience sampling. Study participants will be randomly allocated to either the experimental group or the control group. Outcome measures will include the 10-item General Self-Efficacy Scale, the 11-item Revised Summary of Diabetes Self-care Activities, and the 19-item Diabetes-Dependent Quality of Life scale. Data will be collected at three time points: baseline, and three and six months from baseline. It is expected that this programme will be an alternative offered to diabetes patients to master their self-care management skills, in addition to the existing diabetes service provided in diabetes clinics in Singapore hospitals. Also, the self-supporting and less resource-intensive nature of this programme, through the use of a smartphone app as the mode of intervention delivery, will greatly reduce nurses’ direct contact time with patients and allow more time to be allocated to those who require more attention. The study has been registered with clinicaltrials.gov under trial registration number NCT03088475.Keywords: type 2 diabetes, poor glycaemic control, nurse-led, smartphone-based, self-management, health-related outcomes
Procedia PDF Downloads 200
34317 Evaluation of Duncan-Chang Deformation Parameters of Granular Fill Materials Using Non-Invasive Seismic Wave Methods
Authors: Ehsan Pegah, Huabei Liu
Abstract:
Characterizing the deformation properties of fill materials over a wide stress range has always been an important issue in geotechnical engineering. The hyperbolic Duncan-Chang model is a very popular stress-strain model that captures the nonlinear deformation of granular geomaterials in a very tractable manner. It consists of a particular set of model parameters, which are generally measured from an extensive series of laboratory triaxial tests. This practice is both time-consuming and costly, especially in large projects. In addition, undesired effects caused by soil disturbance during the sampling procedure may also introduce a large degree of uncertainty into the results. Accordingly, non-invasive geophysical seismic approaches may be utilized as appropriate alternative surveys for measuring the model parameters based on seismic wave velocities. To this end, conventional seismic refraction profiles were carried out at test sites with granular fill materials to collect the seismic wave information. The acquired shot gathers were processed, from which the P- and S-wave velocities were derived. The P-wave velocities were extracted using the Seismic Refraction Tomography (SRT) technique, while the S-wave velocities were obtained by the Multichannel Analysis of Surface Waves (MASW) method. The velocity values were then used with equations resulting from the rigorous theories of elasticity and soil mechanics to evaluate the Duncan-Chang model parameters. The derived parameters were finally compared with those from laboratory tests to validate the reliability of the results. The findings of this study may confidently serve as useful references for the determination of the nonlinear deformation parameters of granular fill geomaterials. 
These seismic methods are environmentally friendly and quite economical, and can yield accurate results under actual in-situ conditions.Keywords: Duncan-Chang deformation parameters, granular fill materials, seismic wave velocity, multichannel analysis of surface waves, seismic refraction tomography
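The step from measured wave speeds to small-strain elastic parameters uses the standard elasticity relations: shear modulus G = ρVₛ², Poisson's ratio from the Vp/Vs ratio, and Young's modulus E = 2G(1 + ν). The sketch below uses illustrative fill-material values, not data from the study, and gives only the small-strain moduli from which the Duncan-Chang parameters would subsequently be evaluated.

```python
# Elasticity relations that turn measured wave speeds into small-strain
# deformation parameters. Density and velocities below are illustrative
# fill-material values, not measurements from the study.

def elastic_moduli(rho, vp, vs):
    """Return (G, nu, E) from density (kg/m^3) and P/S wave speeds (m/s)."""
    g = rho * vs**2                           # shear modulus, Pa
    r2 = (vp / vs) ** 2
    nu = (r2 - 2.0) / (2.0 * (r2 - 1.0))      # Poisson's ratio
    e = 2.0 * g * (1.0 + nu)                  # Young's modulus, Pa
    return g, nu, e

g, nu, e = elastic_moduli(rho=1900.0, vp=450.0, vs=220.0)
print(round(g / 1e6, 1), round(nu, 3), round(e / 1e6, 1))  # 92.0 0.343 247.0
```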
Procedia PDF Downloads 182