Search results for: monitoring interface
3175 Usability Evaluation of a Self-Report Mobile App for COVID-19 Symptoms: Supporting Health Monitoring in the Work Context
Authors: Kevin Montanez, Patricia Garcia
Abstract:
The confinement and restrictions adopted to avoid an exponential spread of COVID-19 have negatively impacted the Peruvian economy. In this context, industries offering essential products could continue operating, but they had to follow safety protocols and implement strategies to ensure employee health. In view of increasing internet access and mobile phone ownership, “Alerta Temprana”, a mobile app, was developed for self-reporting COVID-19 symptoms in the work context. In this study, the usability of the mobile app “Alerta Temprana” was evaluated from the perspective of health monitors and workers. In addition to reporting the metrics related to the usability of the application, the utility of the system was also evaluated from the monitors' perspective. In this descriptive study, the participants used the mobile app for two months. Afterwards, the System Usability Scale (SUS) questionnaire was answered by the workers and monitors. A usefulness questionnaire with open questions was also used for the monitors. The data related to the use of the application were collected during one month. Furthermore, descriptive statistics and bivariate analysis were used. The workers rated the application as good (70.39). In the case of the monitors, usability was excellent (83.0). The most important feature for the monitors was the emails generated by the application. The average interaction per user was 30 seconds, and a total of 6,172 self-reports were sent. Finally, a statistically significant association was found between the acceptability scale and the work area. The results of this study suggest that Alerta Temprana has the potential to be used for surveillance and health monitoring in any face-to-face work context. Participants reported a high degree of ease of use.
However, since SUS cannot diagnose specific usability issues from the workers' perspective, we suggest using an additional standard usability questionnaire to improve "Alerta Temprana" for future use.
Keywords: public health informatics, mobile app, usability, self-report
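The SUS scores reported above (70.39 for workers, 83.0 for monitors) follow the standard scoring rule for the ten-item questionnaire: odd items contribute (response − 1), even items contribute (5 − response), and the sum is multiplied by 2.5 to give a 0–100 score. A minimal Python sketch, with hypothetical responses rather than data from the study:

```python
def sus_score(responses):
    """Compute the System Usability Scale score (0-100) from ten 1-5 Likert responses."""
    if len(responses) != 10:
        raise ValueError("SUS requires exactly 10 item responses")
    total = 0
    for i, r in enumerate(responses):
        # Odd-numbered items (1st, 3rd, ...) contribute r - 1;
        # even-numbered items contribute 5 - r.
        total += (r - 1) if i % 2 == 0 else (5 - r)
    return total * 2.5

# Hypothetical respondent: agrees with every positive item, disagrees with every negative one
print(sus_score([5, 1, 5, 1, 5, 1, 5, 1, 5, 1]))  # → 100.0
```

A uniformly neutral respondent (all 3s) lands exactly at 50, which is why scores around 70 are conventionally read as "good".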
Procedia PDF Downloads 117
3174 Simscape Library for Large-Signal Physical Network Modeling of Inertial Microelectromechanical Devices
Authors: S. Srinivasan, E. Cretu
Abstract:
The information flow (e.g., block diagram or signal flow graph) paradigm for the design and simulation of microelectromechanical systems (MEMS)-based systems makes it easy to model MEMS devices using causal transfer functions and to interface them with electronic subsystems for fast system-level exploration of design alternatives and optimization. Nevertheless, the physical bi-directional coupling between different energy domains is not easily captured in causal signal flow modeling. Moreover, models of fundamental components acting as building blocks (e.g., gap-varying MEMS capacitor structures) depend not only on the component but also on the specific excitation mode (e.g., voltage or charge actuation). In contrast, the energy flow modeling paradigm, in terms of generalized across-through variables, offers an acausal perspective, clearly separating the physical model from the boundary conditions. This promotes reusability and the use of primitive physical models for assembling MEMS devices from primitive structures, based on the interconnection topology in generalized circuits. The physical modeling capabilities of Simscape have been used in the present work to develop a MEMS library containing parameterized fundamental building blocks (area- and gap-varying MEMS capacitors, nonlinear springs, displacement stoppers, etc.) for the design, simulation, and optimization of MEMS inertial sensors. The models capture both the nonlinear electromechanical interactions and the geometrical nonlinearities, and can be used for both small- and large-signal analyses, including the numerical computation of pull-in voltages (stability loss). The Simscape behavioral modeling language was used for the implementation of reduced-order macro models, which present the advantage of a seamless interface with Simulink blocks for creating hybrid information/energy flow system models.
Test bench simulations of the library models compare favorably both with analytical results and with more in-depth finite element simulations performed in ANSYS. Separate MEMS-electronics integration tests were done on closed-loop MEMS accelerometers, where Simscape was used for modeling the MEMS device and Simulink for the electronic subsystem.
Keywords: across-through variables, electromechanical coupling, energy flow, information flow, Matlab/Simulink, MEMS, nonlinear, pull-in instability, reduced order macro models, Simscape
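For the simplest lumped case covered by such building blocks (a gap-varying parallel-plate capacitor restrained by a linear spring), the pull-in voltage that the library computes numerically also has a well-known closed form, which makes a useful sanity check. A Python sketch with illustrative parameter values, not taken from the paper:

```python
import math

EPS0 = 8.854e-12  # vacuum permittivity, F/m

def pull_in_voltage(k, g0, area):
    """Closed-form pull-in voltage of a voltage-driven parallel-plate actuator
    with linear spring constant k (N/m), initial gap g0 (m), and plate area (m^2)."""
    return math.sqrt(8.0 * k * g0**3 / (27.0 * EPS0 * area))

# Illustrative values (not from the paper): k = 1 N/m, 2 um gap, 100x100 um plate
k, g0, area = 1.0, 2e-6, (100e-6)**2
vpi = pull_in_voltage(k, g0, area)

# Sanity check: at pull-in the plate sits at one third of the gap and the
# electrostatic force exactly balances the spring force.
x = g0 / 3.0
f_spring = k * x
f_elec = EPS0 * area * vpi**2 / (2.0 * (g0 - x)**2)
print(f"V_pi = {vpi:.2f} V, force balance error = {abs(f_spring - f_elec):.2e} N")
```

The library's value of an acausal network model is precisely that it recovers this stability loss for arbitrary topologies where no closed form exists.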
Procedia PDF Downloads 133
3173 Development the Potential of Parking Tax and Parking Retribution Revenues: Case Study in Bekasi City
Authors: Ivan Yudianto
Abstract:
The research objectives are to analyze the factors that impede parking tax and parking retribution collection by the Bekasi City Government, to analyze the factors that can increase local own-source revenue from parking tax and parking retribution, to analyze the monitoring of parking retribution collection by the Bekasi City Government, and to analyze the Bekasi City Government's strategies through the preparation of a roadmap and action plan to increase parking tax and parking retribution revenues. The approach used in this research is qualitative. A qualitative approach is used because the problem is not yet clearly defined, the object to be studied is holistic, complex, and dynamic, and the relationships among its symptoms are interactive. The methods of data collection and analysis were in-depth interviews, participant observation, documentary materials, literature review, and triangulation, as well as newer methods such as visual materials and internet browsing. The results showed several obstacles: parking taxpayers do not disclose their actual parking revenue; parking taxpayers pay the parking tax late or not at all; many parking locations are controlled by illegal organizations; there is a shortage of human resources in charge of collecting and supervising parking tax and parking retribution in the Bekasi City Government; and surveillance of parking tax and parking retribution is not scheduled on a regular basis.
Several strategic priorities to develop the potential of the parking tax and parking retribution in the Bekasi City Government are: increased control and monitoring of parking taxpayers, forming a team of auditors to audit parking taxpayers, pursuing persuasive and educative law enforcement to reduce taxpayer noncompliance, imposing strict sanctions on disobedient parking taxpayers, revising the mayoral regulations on parking locations in Bekasi City, rationalizing the revenue target for parking retribution, taking over roadside parking locations controlled by individuals or specific groups, and drafting regional regulations on parking subscriptions.
Keywords: local own revenue, parking retribution, parking tax, parking taxpayer
Procedia PDF Downloads 324
3172 Understanding the Qualitative Nature of Product Reviews by Integrating Text Processing Algorithm and Usability Feature Extraction
Authors: Cherry Yieng Siang Ling, Joong Hee Lee, Myung Hwan Yun
Abstract:
Usability has become a basic requirement from the consumer's perspective: a product that fails this requirement ends up not being used. Identifying usability issues by analyzing the quantitative and qualitative data collected from usability testing and evaluation activities aids the process of product design, yet the lack of studies on analysis methodologies for qualitative text data in the usability field inhibits the potential of these data for more useful applications. Meanwhile, analyzing qualitative text data has become possible with the rapid development of data analysis fields such as natural language processing, for understanding human language with computers, and machine learning, which provides predictive models and clustering tools. Therefore, this research aims to study the applicability of text processing algorithms to the analysis of qualitative text data collected from usability activities. This research utilized datasets collected from an LG neckband headset usability experiment, consisting of headset survey text data, subject data, and product physical data. The analysis procedure, integrated with a text processing algorithm, includes projecting comments into a vector space, labeling them with the subject and product physical feature data, and clustering to validate the result of comment vector clustering. The result shows 'volume and music control button' as the usability feature that matches best with the cluster of comment vectors, where the centroid comments of one cluster emphasized button positions, while the centroid comments of the other cluster emphasized button interface issues. When the volume and music control buttons were designed separately, the participants experienced less confusion, and thus the comments mentioned only the buttons' positions.
When the volume and music control buttons were designed as a single button, the participants experienced interface issues with the buttons, such as the operating methods of functions and confusion between the functions' buttons. The relevance of the cluster centroid comments to the extracted feature demonstrates the capability of text processing algorithms in analyzing qualitative text data from usability testing and evaluations.
Keywords: usability, qualitative data, text-processing algorithm, natural language processing
Procedia PDF Downloads 283
3171 Signal Processing of the Blood Pressure and Characterization
Authors: Hadj Abd El Kader Benghenia, Fethi Bereksi Reguig
Abstract:
In clinical medicine, blood pressure and hemodynamic monitoring provide rich pathophysiological information about the cardiovascular system, described through factors such as blood volume, arterial compliance, and peripheral resistance. In this work, we are interested in analyzing these signals and propose a detection algorithm to delineate the different sequences, especially systolic blood pressure (SBP), diastolic blood pressure (DBP), and the dicrotic wave, in order to analyze them and extract cardiovascular parameters.
Keywords: blood pressure, SBP, DBP, detection algorithm
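A minimal sketch of the kind of delineation the abstract describes: treating systolic pressures as local maxima and diastolic pressures as local minima of the sampled waveform. The toy waveform is illustrative, and the authors' actual detection algorithm is not specified here:

```python
def detect_sbp_dbp(pressure):
    """Locate systolic peaks (local maxima) and diastolic troughs (local minima)
    in a sampled arterial pressure waveform (list of mmHg values)."""
    sbp, dbp = [], []
    for i in range(1, len(pressure) - 1):
        if pressure[i - 1] < pressure[i] > pressure[i + 1]:
            sbp.append(i)   # systolic peak candidate
        elif pressure[i - 1] > pressure[i] < pressure[i + 1]:
            dbp.append(i)   # diastolic trough candidate
    return sbp, dbp

# Toy waveform: two beats rising to ~120 mmHg and falling to ~80 mmHg
wave = [80, 100, 120, 105, 90, 80, 95, 120, 110, 85]
peaks, troughs = detect_sbp_dbp(wave)
print([wave[i] for i in peaks], [wave[i] for i in troughs])  # [120, 120] [80]
```

A real detector would add amplitude and refractory-period thresholds to reject the dicrotic notch and noise, and would treat the dicrotic wave as a secondary extremum between systole and the following diastole.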
Procedia PDF Downloads 437
3170 Serological IgG Testing to Diagnose Alimentary Induced Diseases and Monitoring Efficacy of an Individual Defined Diet in Dogs
Authors: Anne-Margré C. Vink
Abstract:
Background: Food-related allergies and intolerances occur frequently in dogs. Diagnosis, and monitoring of elimination efficiency according to the 'gold standard', are time-consuming, expensive, and require an expert clinical setting. In order to facilitate rapid and robust quantitative testing of intolerance, and to determine the individual offending foods, a serological test is indicated. Method: As we had previously developed the Medisynx IgG Human Screening Test ELISA, and the dog's immune system is highly similar to the human one, we were able to develop the Medisynx IgG Dog Screening Test ELISA as well. In this study, 47 dogs suffering from canine atopic dermatitis (CAD) and several secondarily induced reactions were included in the serological Medisynx IgG Dog Screening Test ELISA (within < 0.02% SD). Results were expressed as titers relative to the standard OD readings to diagnose alimentary induced diseases and to monitor the efficacy of an individual elimination diet in dogs. Split-sample analysis was performed by independently sending two 3 ml serum samples under two unique codes. Results: The veterinarian monitored these dogs, checking each dog's results at least at 3, 7, 21, 49, and 70 days and after periods of 6 and 12 months on an individual negative diet, with a positive challenge (retrospectively) at 6 months. Data for each dog were recorded in a screening form, and a complete recovery of all clinical manifestations was observed at or before 70 days (between 50 and 70 days) in the majority of dogs (44 out of 47 dogs = 93.6%). Conclusion: The challenge results showed 100% specificity as well as a 100% positive predictive value. On the other hand, sensitivity was 95.7% and the negative predictive value was 95.7%.
In conclusion, an individual diet based on IgG ELISA in dogs provides a significant improvement of atopic dermatitis and pruritus, including all other non-specific allergic skin reactions such as erythema, itching, and biting and gnawing at the toes, as well as of several secondary manifestations like chronic diarrhoea, chronic constipation, otitis media, obesity, laziness or inactive behaviour, pain and muscular stiffness causing movement disorders, excessive lacrimation, hyperactive or nervous behaviour, inability to stay alone at home, anxiety, biting and aggressive behaviour, and disobedience. Furthermore, we conclude that a relatively more severe systemic candidiasis, as shown by a relatively higher titer (class 3 and 4 IgG reactions to Candida albicans), influences the duration of recovery from clinical manifestations in affected dogs. These findings are consistent with our preliminary human clinical studies.
Keywords: allergy, canine atopic dermatitis, CAD, food allergens, IgG-ELISA, food-incompatibility
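The sensitivity, specificity, PPV, and NPV figures quoted in the conclusion follow from the standard confusion-matrix definitions. A sketch with counts chosen to reproduce the reported percentages (illustrative, not the study's raw data):

```python
def diagnostic_metrics(tp, fp, tn, fn):
    """Sensitivity, specificity, PPV, and NPV from challenge-test counts."""
    return {
        "sensitivity": tp / (tp + fn),   # true positives among all diseased
        "specificity": tn / (tn + fp),   # true negatives among all healthy
        "ppv": tp / (tp + fp),           # positive predictive value
        "npv": tn / (tn + fn),           # negative predictive value
    }

# Hypothetical counts consistent with the reported figures (100% specificity
# and PPV, 95.7% sensitivity and NPV): 22 true positives, 1 false negative,
# 22 true negatives, 0 false positives.
m = diagnostic_metrics(tp=22, fp=0, tn=22, fn=1)
print({k: round(v, 3) for k, v in m.items()})
```

Zero false positives force both specificity and PPV to 100%, while a single false negative in about 23 diseased cases yields the reported 95.7% sensitivity and NPV.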
Procedia PDF Downloads 320
3169 The Misuse of Free Cash and Earnings Management: An Analysis of the Extent to Which Board Tenure Mitigates Earnings Management
Authors: Michael McCann
Abstract:
Managerial theories propose that, in joint stock companies, executives may be tempted to waste excess free cash on unprofitable projects to keep control of resources. In order to conceal their projects' poor performance, they may seek to engage in earnings management. On the one hand, managers may manipulate earnings upwards in order to post 'good' performances and safeguard their position. On the other, since managers' pursuit of unrewarding investments is likely to lead to low long-term profitability, managers will use negative accruals to reduce the current year's earnings, smoothing earnings over time in order to conceal the negative effects. Agency models argue that boards of directors are delegated by shareholders to ensure that companies are governed properly. Part of that responsibility is ensuring the reliability of financial information. Analyses of the impact of board characteristics, particularly board independence, on the misuse of free cash flow and earnings management find conflicting evidence. However, existing characterizations of board independence do not account for such directors gaining firm-specific knowledge over time, which influences their monitoring ability. Further, there is little analysis of the influence of the relative experience of independent directors and executives on decisions surrounding the use of free cash. This paper contributes to the literature on the heterogeneous characteristics of boards by investigating the influence of independent director tenure on earnings management, as well as the relative tenures of independent directors and chief executives. A balanced panel dataset comprising 51 companies across 11 annual periods from 2005 to 2015 is used for the analysis. In each annual period, firms were classified as conducting earnings management if they had discretionary accruals in the bottom quartile (downwards) or top quartile (upwards) of the distributed values for the sample.
Logistic regressions were conducted to determine the marginal impact of independent board tenure and a number of control variables on the probability of conducting earnings management. The findings indicate that both absolute and relative measures of board independence and experience do not have a significant impact on the likelihood of earnings management. It is the level of free cash flow which is the major influence on the probability of earnings management: higher free cash flow increases the probability of earnings management significantly. The research also investigates whether board monitoring of earnings management is contingent on the level of free cash flow. However, the results suggest that board monitoring is not amplified when free cash flow is higher. This suggests that the extent of earnings management in companies is determined by a range of company-, industry-, and situation-specific factors.
Keywords: corporate governance, boards of directors, agency theory, earnings management
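The quartile-based classification of firm-years described above can be sketched as follows. The accrual values are hypothetical, and the exact quantile method is an assumption, since the paper does not specify one:

```python
import statistics

def flag_earnings_management(accruals):
    """Flag firm-years whose discretionary accruals fall in the bottom quartile
    (downward management) or top quartile (upward management) of the sample."""
    q1, _, q3 = statistics.quantiles(accruals, n=4)  # default 'exclusive' method
    return [
        "down" if a <= q1 else "up" if a >= q3 else "none"
        for a in accruals
    ]

# Hypothetical discretionary accruals (scaled by lagged assets) for one period
sample = [-0.09, -0.02, 0.00, 0.01, 0.03, 0.04, 0.08, 0.12]
print(flag_earnings_management(sample))
```

The resulting "down"/"up"/"none" labels are the binary outcomes that the logistic regressions then relate to board tenure and free cash flow.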
Procedia PDF Downloads 233
3168 Development of an Interagency Crime Management System for Nigeria’s Law Enforcement Agencies
Authors: Muhammad Abba Jallo, Fred Fudah Moveh
Abstract:
This study addresses the challenges faced by Nigerian law enforcement agencies due to the lack of an integrated crime management system. While various agencies use ICT-based systems, the absence of interoperability creates barriers to effective collaboration and information sharing. The research proposes the development of an Interagency Crime Management System (ICMS), which integrates the Crime Management Systems (CMS) of different agencies through an Application Programming Interface (API). The system is designed to allow all law enforcement agencies to input data using a standardized format, improving crime tracking, reporting, and management across Nigeria. This paper details the design and implementation process, highlighting the benefits of enhanced collaboration for crime management.
Keywords: crime management, Nigeria, law enforcement, ICT
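A standardized input format of the kind the ICMS proposes could look like the following sketch. The field names and agency codes are hypothetical illustrations, not taken from the paper:

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class CrimeRecord:
    """Hypothetical standardized record exchanged between agency CMSs via the API."""
    record_id: str
    agency: str        # reporting agency code, e.g. "NPF" (hypothetical)
    offence: str
    location: str
    reported_at: str   # ISO-8601 timestamp
    status: str = "open"

record = CrimeRecord("NG-2024-000123", "NPF", "burglary",
                     "Jos, Plateau State", "2024-05-01T09:30:00Z")
payload = json.dumps(asdict(record))   # body of a POST to the shared API
print(payload)
```

Fixing one schema like this is what lets each agency keep its own CMS while the API layer handles cross-agency tracking and reporting.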
Procedia PDF Downloads 18
3167 User Experience in Relation to Eye Tracking Behaviour in VR Gallery
Authors: Veslava Osinska, Adam Szalach, Dominik Piotrowski
Abstract:
Contemporary VR technologies allow users to explore virtual 3D spaces where they can work, socialize, learn, and play. Users' interaction with the GUI and the displayed pictures involves perceptual and also cognitive processes, which can be monitored through neuroadaptive technologies. These modalities provide valuable information about the users' intentions, situational interpretations, and emotional states, allowing an application or interface to be adapted accordingly. Virtual galleries outfitted with specialized assets were designed using the Unity engine within the BITSCOPE project, in the frame of the CHIST-ERA IV program. Users' interaction with gallery objects raises questions about their visual interest in artworks and styles. Moreover, attention, curiosity, and other emotional states can be monitored and analyzed. Natural gaze behavior data and eye positions were recorded by the built-in eye-tracking module of the HTC Vive VR headset. Eye gaze results are grouped according to various user behavior schemes, and the corresponding perceptual-cognitive styles are recognized. In parallel, usability tests and surveys were administered to identify the basic features of a user-centered interface for virtual environments across most of the timeline of the project. A total of sixty participants were selected from distinct faculties of the university and from secondary schools. Users' prior knowledge about art was evaluated during a pretest, and in this way their level of art sensitivity was described. Data were collected over two months. Each participant gave written informed consent before participation. In the data analysis, nonlinear algorithms such as multidimensional scaling and the novel technique t-distributed Stochastic Neighbor Embedding (t-SNE) were used to reduce the high-dimensional data to a relatively low-dimensional subspace. In this way, digital art objects can be classified by multimodal time characteristics of eye-tracking measures, revealing signatures that describe selected artworks.
The current research seeks the optimal position on the aesthetics-utility scale, since the interfaces of most contemporary applications need to be designed in both functional and aesthetic ways. The study also concerns an analysis of the visual experience of subsamples of visitors, differentiated, e.g., in terms of frequency of museum visits and cultural interests. Eye tracking data may also show how to better allocate artefacts and paintings or increase their visibility when possible.
Keywords: eye tracking, VR, UX, visual art, virtual gallery, visual communication
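Before dimensionality reduction with MDS or t-SNE, eye-tracking measures are typically assembled into a per-viewer feature matrix and compared pairwise; the distance (or similarity) matrix is what those algorithms then embed in 2D. A sketch with hypothetical gaze features (the actual features used in the project are not specified here):

```python
import math

def euclidean(u, v):
    """Euclidean distance between two equal-length feature vectors."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(u, v)))

# Hypothetical per-viewer gaze features for one artwork:
# [mean fixation duration (ms), fixation count, total dwell time (s)]
viewers = {
    "v1": [220.0, 35, 12.4],
    "v2": [240.0, 33, 13.1],
    "v3": [410.0, 12, 6.2],
}
names = sorted(viewers)
dist = {(a, b): euclidean(viewers[a], viewers[b]) for a in names for b in names}
# This distance matrix is the input that MDS (and, via similarities, t-SNE)
# reduces to a low-dimensional map of viewing-style signatures.
print(round(dist[("v1", "v2")], 2), round(dist[("v1", "v3")], 2))
```

Viewers v1 and v2 (many short fixations, long dwell) end up close together, while v3 (few long fixations) is distant, which is the kind of perceptual-cognitive grouping the study reports.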
Procedia PDF Downloads 42
3166 Scalable UI Test Automation for Large-scale Web Applications
Authors: Kuniaki Kudo, Raviraj Solanki, Kaushal Patel, Yash Virani
Abstract:
This research concerns optimizing UI test automation for large-scale web applications. The test target is the HHAexchange homecare management web application, which seamlessly connects providers, state Medicaid programs, managed care organizations (MCOs), and caregivers through one platform with large-scale functionality. This study focuses on user interface automation testing for this web application. The quality assurance team must execute many manual user interface test cases during the development process to confirm there are no regression bugs. The team automated 346 test cases, and the UI automation test execution time was over 17 hours. The business requirement was to reduce the execution time in order to release high-quality products quickly, so the quality assurance automation team modernized the test automation framework to optimize the execution time. The base of the web UI automation test environment is Selenium, and the test code is written in Python. Adopting a compiled language for test code leads to an inefficient flow when introducing scalability into a traditional test automation environment, so a scripting language was adopted to introduce scalability efficiently. The scalability implementation mainly relies on AWS serverless technology, the Elastic Container Service. Scalability here means the ability to automatically set up computers for test automation and to increase or decrease the number of computers running those tests. This scalable mechanism lets test cases run in parallel, dramatically decreasing test execution time. Moreover, introducing scalable test automation does more than reduce execution time: some challenging bugs, such as race conditions, may be detected because test cases can be executed at the same time.
If API and unit tests are implemented, test strategies can be adopted more efficiently for this scalability testing. However, in web applications, as a practical matter, API and unit testing cannot cover 100% of functional testing since they do not reach the front-end code. This study applied a scalable UI automation testing strategy to the large-scale homecare management system and confirmed both the optimization of test case execution time and the detection of a challenging bug. The study first describes the detailed architecture of the scalable test automation environment, then describes the actual reduction in execution time and an example of challenging issue detection.
Keywords: aws, elastic container service, scalability, serverless, ui automation test
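The parallelization idea can be sketched independently of AWS: split the suite into shards and run them concurrently. Here a thread pool stands in for launching ECS tasks, and the shard contents and per-shard timing are simulated, not the paper's actual setup:

```python
import time
from concurrent.futures import ThreadPoolExecutor

def run_shard(shard):
    """Stand-in for dispatching one batch of UI test cases to a container
    (in the real setup this would launch an ECS task running the suite)."""
    time.sleep(0.1)  # pretend each shard takes equally long
    return f"shard-{shard}: passed"

shards = list(range(8))

start = time.time()
with ThreadPoolExecutor(max_workers=8) as pool:
    results = list(pool.map(run_shard, shards))
parallel_t = time.time() - start

print(results[0], f"... {len(results)} shards in {parallel_t:.2f}s")
# Run serially the same work would take ~0.8 s; with 8 workers it takes ~0.1 s,
# the same ratio by which sharding shrinks a 17-hour suite.
```

In the ECS version, each worker slot is an on-demand container, so the worker count can scale with the shard count instead of being bounded by one machine.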
Procedia PDF Downloads 105
3165 Mitigating Food Insecurity and Malnutrition by Promoting Carbon Farming via a Solar-Powered Enzymatic Composting Bioreactor with Arduino-Based Sensors
Authors: Molin A., De Ramos J. M., Cadion L. G., Pico R. L.
Abstract:
Malnutrition and food insecurity represent significant global challenges affecting millions of individuals, particularly in low-income and developing regions. The researchers created a solar-powered enzymatic composting bioreactor with an Arduino-based monitoring system for pH, humidity, and temperature. It manages mixed municipal solid waste, incorporating industrial enzymes and whey additives for accelerated composting and a minimized carbon footprint. Within 15 days, the bioreactor yielded 54.54% compost compared to 44.85% from traditional methods, increasing yield by nearly 10%. Tests showed that the bioreactor compost had 4.84% NPK, passing metal analysis standards, while the traditional pit compost had 3.86% NPK; both are suitable for agriculture. Statistical analyses, including ANOVA and Tukey's HSD test, revealed significant differences in agricultural yield across different compost types based on leaf length, width, and number of leaves. The study compared the effects of different composts on the growth of Brassica rapa subsp. chinensis (Pechay) and Brassica juncea (Mustasa). For Pechay, significant effects of compost type on leaf length (F(5,84) = 62.33, η² = 0.79) and leaf width (F(5,84) = 12.35, η² = 0.42) were found. For Mustasa, significant effects of compost type on leaf length (F(4,70) = 20.61, η² = 0.54), leaf width (F(4,70) = 19.24, η² = 0.52), and number of leaves (F(4,70) = 13.17, η² = 0.43) were observed. This study explores the effectiveness of the enzymatic composting bioreactor and its viability in promoting carbon farming as a solution to food insecurity and malnutrition.
Keywords: malnutrition, food insecurity, enzymatic composting bioreactor, arduino-based monitoring system, enzymes, carbon farming, whey additive, NPK level
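The one-way ANOVA F statistics reported above (e.g., F(5,84) = 62.33) come from the ratio of between-group to within-group variance, which can be sketched from first principles. The leaf-length values below are hypothetical, not the study's measurements:

```python
def one_way_anova_f(groups):
    """F statistic for a one-way ANOVA over lists of measurements per compost type."""
    k = len(groups)                          # number of groups
    n = sum(len(g) for g in groups)          # total observations
    grand = sum(sum(g) for g in groups) / n
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    # F = (SS_between / (k - 1)) / (SS_within / (n - k))
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Hypothetical leaf lengths (cm) under three compost types (not the study data)
bioreactor = [14.1, 13.8, 14.6, 14.9]
pit        = [12.0, 11.7, 12.4, 12.1]
control    = [ 9.8, 10.2,  9.5, 10.0]
f = one_way_anova_f([bioreactor, pit, control])
print(f"F(2, 9) = {f:.1f}")  # a large F means the group means differ
```

A significant F only says that some group means differ; Tukey's HSD, as used in the study, is then needed to identify which pairs of compost types differ.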
Procedia PDF Downloads 56
3164 Ground Track Assessment Using Electrical Resistivity Tomography Application
Authors: Noryani Natasha Yahaya, Anas Ibrahim, Juraidah Ahmad, Azura Ahmad, Mohd Ikmal Fazlan Rosli, Zailan Ramli, Muhd Sidek Muhd Norhasri
Abstract:
The subgrade formation is an important element of the railway structure, as it underpins overall track stability. Conventional track maintenance involves many substructure component replacements, as well as regular track re-ballasting, which partially contributes to the embankment's long-term settlement problem. For long-term subgrade stability analysis, geophysical methods are commonly used to diagnose hidden sources/mechanisms of track deterioration that normal visual inspection is unable to detect. Electrical resistivity tomography (ERT) is one of the applicable geophysical tools for railway subgrade inspection and track monitoring, owing to its flexibility and the reliability of its analysis. ERT was conducted at KM 23.0 of the Pinang Tunggal track to investigate the railway subgrade through the characterization/mapping of the track formation profile, which was generated directly using 2D analysis in the Res2dinv software. The profiles allow examination of the presence and spatial extent of a significant subgrade layer and screening for any poor soil boundary contact. Based on the findings, there is an intermixing interlayer between the sub-ballast and the sand. Although the embankment track considered here is at no immediate risk of settlement or any failure, regular monitoring of the track's location will allow early corrective maintenance if necessary. The developed track formation data clearly show the similarity of the side view with the assessed track. The data visualization in the 2D section of the track embankment agreed well with the initial assumption based on the general side view of the main structural elements.
Keywords: ground track, assessment, resistivity, geophysical method, railway
Procedia PDF Downloads 154
3163 Thermal Modelling and Experimental Comparison for a Moving Pantograph Strip
Authors: Nicolas Delcey, Philippe Baucour, Didier Chamagne, Geneviève Wimmer, Auditeau Gérard, Bausseron Thomas, Bouger Odile, Blanvillain Gérard
Abstract:
This paper proposes a thermal study of the catenary/pantograph interface for a train in motion. A 2.5D complex model of the pantograph strip has been defined and created by coupling a 1D and a 2D model. Experimental and simulation results are presented, and their comparison allows validation of the 2.5D model. Some physical phenomena are described and illustrated with the help of the model, such as the thermal effect of the stagger motion, particular heat sources, and the effect of the material characteristics. Finally, it is possible to predict the critical thermal configuration during a train trip.
Keywords: electro-thermal studies, mathematical optimizations, multi-physical approach, numerical model, pantograph strip wear
Procedia PDF Downloads 323
3162 Saving Energy through Scalable Architecture
Authors: John Lamb, Robert Epstein, Vasundhara L. Bhupathi, Sanjeev Kumar Marimekala
Abstract:
In this paper, we focus on the importance of scalable architecture for data centers and buildings in general to help an enterprise achieve environmental sustainability. A scalable architecture helps in many ways: it adapts to business and user requirements and promotes high-availability and disaster recovery solutions that are cost-effective and low-maintenance. Scalable architecture also plays a vital role in the three core areas of sustainability, economy, environment, and social, also known as the three pillars of a sustainability model. If the architecture is scalable, it has many advantages. For example, scalable architecture helps businesses and industries adapt to changing technology, drive innovation, promote platform independence, and build resilience against natural disasters. Most importantly, having a scalable architecture helps industries bring in cost-effective measures for energy consumption, reduce wastage, increase productivity, and enable a robust environment. It also helps in the reduction of carbon emissions through advanced monitoring and metering capabilities. Scalable architectures help reduce waste by optimizing designs to use materials efficiently, minimize resources, and decrease carbon footprints by using low-impact, environmentally friendly materials. In this paper, we also emphasize the importance of a cultural shift towards the reuse and recycling of natural resources for a balanced ecosystem and the maintenance of a circular economy. Also, since all of us are involved in the use of computers, much of the scalable architecture we have studied is related to data centers.
Keywords: scalable architectures, sustainability, application design, disruptive technology, machine learning and natural language processing, AI, social media platform, cloud computing, advanced networking and storage devices, advanced monitoring and metering infrastructure, climate change
Procedia PDF Downloads 103
3161 Comparison of Two Home Sleep Monitors Designed for Self-Use
Authors: Emily Wood, James K. Westphal, Itamar Lerner
Abstract:
Background: Polysomnography (PSG) recordings are regularly used in research and clinical settings to study sleep and sleep-related disorders. Typical PSG studies are conducted in professional laboratories and performed by qualified researchers. However, the number of sleep labs worldwide is disproportionate to the increasing number of individuals with sleep disorders like sleep apnea and insomnia. Consequently, there is a growing need for cheaper yet reliable means to measure sleep, preferably autonomously by subjects in their own homes. Over the last decade, a variety of devices for self-monitoring of sleep became available on the market; however, very few have been directly validated against PSG to demonstrate their ability to perform reliable automatic sleep scoring. Two popular mobile EEG-based systems that have published validation results, the DREEM 3 headband and the Z-Machine, have never been directly compared to each other by independent researchers. The current study aimed to compare the performance of the DREEM 3 and the Z-Machine to help investigators and clinicians decide which of these devices may be more suitable for their studies. Methods: 26 participants completed the study for credit or monetary compensation. Exclusion criteria included any history of sleep, neurological, or psychiatric disorders. Eligible participants arrived at the lab in the afternoon and received the two devices. They then spent two consecutive nights monitoring their sleep at home. Participants were also asked to keep a sleep log, indicating the time they fell asleep, woke up, and the number of awakenings occurring during the night. Data from both devices, including detailed sleep hypnograms in 30-second epochs (differentiating wake, combined N1/N2, N3, and rapid eye movement sleep), were extracted and aligned upon retrieval. For analysis, the number of awakenings each night was defined as four or more consecutive wake epochs between sleep onset and termination.
Total sleep time (TST) and the number of awakenings were compared to subjects’ sleep logs to measure consistency with the subjective reports. In addition, the sleep scores from each device were compared epoch-by-epoch to calculate the agreement between the two devices using Cohen’s Kappa. All analysis was performed using Matlab 2021b and SPSS 27. Results/Conclusion: Subjects consistently reported longer times spent asleep than the time reported by each device (M = 448 minutes for sleep logs compared to M = 406 and M = 345 minutes for the DREEM and Z-Machine, respectively; both p-values < 0.05). Linear correlations between the sleep log and each device were higher for the DREEM than the Z-Machine for both TST and the number of awakenings; likewise, the mean absolute bias between the sleep logs and each device was higher for the Z-Machine for both TST (p < 0.001) and awakenings (p < 0.04). There was some indication that these effects were stronger for the second night compared to the first night. Epoch-by-epoch comparisons showed that the main discrepancies between the devices were for detecting N2 and REM sleep, while N3 had a high agreement. Overall, the DREEM headband seems superior for reliably scoring sleep at home. Keywords: DREEM, EEG, sleep monitoring, Z-machine
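The epoch-by-epoch agreement described above can be illustrated with a short sketch (not the authors' MATLAB/SPSS code): Cohen's Kappa compares observed agreement between two aligned hypnograms against the agreement expected by chance from each scorer's marginal stage distribution. The two hypnograms below are hypothetical.

```python
from collections import Counter

def cohens_kappa(labels_a, labels_b):
    """Cohen's kappa for two aligned sequences of sleep-stage labels."""
    assert len(labels_a) == len(labels_b)
    n = len(labels_a)
    # Observed proportion of epochs where both scorings agree
    observed = sum(a == b for a, b in zip(labels_a, labels_b)) / n
    count_a = Counter(labels_a)
    count_b = Counter(labels_b)
    # Chance agreement from each scorer's marginal stage distribution
    expected = sum(count_a[s] * count_b[s] for s in count_a) / n ** 2
    return (observed - expected) / (1 - expected)

# Two hypothetical hypnograms in 30-second epochs (W, N1/N2, N3, REM)
dreem = ["W", "N2", "N2", "N3", "N3", "REM", "REM", "W"]
zmach = ["W", "N2", "N2", "N3", "N2", "REM", "W",   "W"]
kappa = cohens_kappa(dreem, zmach)
```

With real data, the same function would be applied to the full aligned night of 30-second epochs from both devices.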
Procedia PDF Downloads 106
3160 Secondary Charged Fragments Tracking for On-Line Beam Range Monitoring in Particle Therapy
Authors: G. Traini, G. Battistoni, F. Collamati, E. De Lucia, R. Faccini, C. Mancini-Terracciano, M. Marafini, I. Mattei, S. Muraro, A. Sarti, A. Sciubba, E. Solfaroli Camillocci, M. Toppi, S. M. Valle, C. Voena, V. Patera
Abstract:
In Particle Therapy (PT) treatments, a large amount of secondary particles, whose emission point is correlated to the dose released in the crossed tissues, is produced. The measurement of the secondary charged fragment component could represent a valid technique to monitor the beam range during PT treatments, which is still missing in clinical practice. Sub-millimetre precision on the beam range measurement is required to significantly optimise the technique and to improve the treatment quality. In this contribution, a detector named Dose Profiler (DP) is presented. It is specifically designed to monitor the beam range on-line by exploiting the secondary charged particles produced in PT carbon-ion treatments. In particular, the DP is designed to track the secondary fragments emitted at large angles with respect to the beam direction (mainly protons), with the aim of reconstructing the spatial coordinates of the fragment emission point by extrapolating the measured track toward the beam axis. The DP is currently under development within the INSIDE collaboration (Innovative Solutions for In-beam Dosimetry in hadrontherapy). The tracker is made of six layers (20 × 20 cm²) of BCF-12 square scintillating fibres (500 μm) coupled to Silicon Photo-Multipliers, followed by two plastic scintillator layers of 6 mm thickness. A system of front-end boards based on FPGAs arranged around the detector provides the data acquisition. The detector characterization with cosmic rays is currently under way, and a data taking campaign with protons will take place in May 2017. The DP design and the performance measured with MIPs and proton beams will be reviewed. Keywords: fragmentation, monitoring, particle therapy, tracking
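The emission-point reconstruction the DP performs — fitting a straight fragment track through hits in the fibre layers and extrapolating it back toward the beam axis — can be sketched in a simplified 2D form (an illustration, not the collaboration's reconstruction code; the layer positions and hits below are hypothetical):

```python
def fit_line(zs, xs):
    """Least-squares straight-line fit x = a*z + b through tracker hits."""
    n = len(zs)
    zm = sum(zs) / n
    xm = sum(xs) / n
    a = sum((z - zm) * (x - xm) for z, x in zip(zs, xs)) \
        / sum((z - zm) ** 2 for z in zs)
    b = xm - a * zm
    return a, b

def emission_point(zs, xs, z_beam_axis=0.0):
    """Extrapolate the fitted track back to the beam-axis plane."""
    a, b = fit_line(zs, xs)
    return a * z_beam_axis + b

# Hypothetical hits (cm): six fibre layers at z = 10..35 along a track
layers = [10, 15, 20, 25, 30, 35]
hits = [0.5 * z + 2 for z in layers]
x0 = emission_point(layers, hits)  # reconstructed x at the beam axis
```

A full reconstruction would do this in 3D and intersect the track with the measured beam direction rather than a fixed plane.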
Procedia PDF Downloads 232
3159 Study of a Fabry-Perot Resonator
Authors: F. Hadjaj, A. Belghachi, A. Halmaoui, M. Belhadj, H. Mazouz
Abstract:
A laser is essentially an optical oscillator consisting of a resonant cavity, an amplifying medium and a pumping source. In semiconductor diode lasers, the cavity is created by the boundary between the cleaved face of the semiconductor crystal and air, which has reflective properties as a result of the differing refractive indices of the two media. For a GaAs-air interface a reflectance of 0.3 is typical, and the length of the semiconductor junction therefore forms the resonant cavity. To prevent light from being emitted in unwanted directions from the junction, the sides perpendicular to the required direction are roughened. The objective of this work is to simulate the Fabry-Perot optical resonator and explore its main characteristics, such as free spectral range (FSR), finesse, linewidth and transmission, that describe the performance of the resonator. Keywords: Fabry-Perot resonator, laser diode, reflectance, semiconductor
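The figures of merit named in the abstract follow from textbook relations — FSR = c/(2nL) and, for a lossless cavity, finesse F = π√R/(1−R), with linewidth = FSR/F — which a minimal sketch can compute. The cavity index and length below are assumed illustration values, not taken from the paper:

```python
import math

C = 299_792_458.0  # speed of light, m/s

def fsr(n, L):
    """Free spectral range (Hz) of a cavity of length L (m) and index n."""
    return C / (2 * n * L)

def finesse(R):
    """Finesse of a lossless Fabry-Perot with mirror reflectance R."""
    return math.pi * math.sqrt(R) / (1 - R)

def linewidth(n, L, R):
    """FWHM linewidth (Hz) = FSR / finesse."""
    return fsr(n, L) / finesse(R)

# Assumed GaAs diode cavity: n ~ 3.6, L = 300 um, facet reflectance R = 0.3
delta_nu = fsr(3.6, 300e-6)
F = finesse(0.3)
```

The low finesse (about 2.5 for R = 0.3) illustrates why uncoated cleaved-facet diode cavities have broad resonances compared with high-reflectance mirror cavities.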
Procedia PDF Downloads 350
3158 Gas-Solid Nitrocarburizing of Steels: Kinetic Modelling and Experimental Validation
Authors: L. Torchane
Abstract:
This study is devoted to defining the optimal conditions for the nitriding of pure iron at atmospheric pressure by using NH3-Ar-C3H8 gas mixtures. After studying the mechanisms of phase formation and mass transfer at the gas-solid interface, a mathematical model is developed in order to predict the nitrogen transfer rate in the solid, the ε-carbonitride layer growth rate, and the nitrogen and carbon concentration profiles. In order to validate the model and to show its possibilities, it is compared with thermogravimetric experiments, analyses and metallurgical observations (X-ray diffraction, optical microscopy and electron microprobe analysis). The results obtained demonstrate the sound correlation between the experimental results and the theoretical predictions. Keywords: gaseous nitrocarburizing, kinetic model, diffusion, layer growth kinetics
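For orientation, diffusion-controlled treatments of this kind are classically described by Fick's second law: with a constant surface concentration on a semi-infinite solid, the concentration profile is an error function, and layer growth follows a parabolic law. The sketch below is a generic illustration with hypothetical parameter values, not the paper's fitted model:

```python
import math

def nitrogen_profile(x, t, D, c_surface, c_core):
    """Concentration at depth x (m) after time t (s): Fick's second law
    with a constant surface concentration in a semi-infinite solid."""
    return c_surface - (c_surface - c_core) * math.erf(x / (2 * math.sqrt(D * t)))

def layer_thickness(t, k):
    """Parabolic layer-growth law e = k * sqrt(t)."""
    return k * math.sqrt(t)

# Hypothetical values: D = 1e-12 m^2/s, 2 h treatment, 8 wt% surface, 0 core
c = nitrogen_profile(x=5e-6, t=7200, D=1e-12, c_surface=8.0, c_core=0.0)
```

The actual model in the paper additionally couples the gas-phase mass transfer and carbon diffusion, which this single-species sketch omits.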
Procedia PDF Downloads 533
3157 Optimising Participation in Physical Activity Research for Adults with Intellectual Disabilities
Authors: Yetunde M. Dairo, Johnny Collett, Helen Dawes
Abstract:
Background and Aim: Engagement with physical activity (PA) research is poor among adults with intellectual disabilities (ID), particularly in those from residential homes. This study explored why, by asking managers of residential homes, adults with ID and their carers. Methods: Participants: A convenience sample of 23 individuals from two UK local authorities, including a group of ID residential home managers, adults with ID and their support staff. Procedures: A) Residential home managers (n=6) were asked questions about their willingness to allow their residents to participate in PA research; B) Eleven adults with ID and their support workers (n=6) were asked questions about their willingness to accept 7-day accelerometer monitoring and/or the International Physical Activity Questionnaire-short version (IPAQ-s) as PA measures. The IPAQ-s was administered by the researcher, and participants were each provided with samples of accelerometers to try on. Results: A) Five out of six managers said that the burden of wearing the accelerometer for seven days would be too high for the people they support, the majority of whom might be unable to express their wishes. They also said they would be unwilling to act as proxy respondents for the same reason. Additionally, they cited time pressure, understaffing, and reluctance to spend time on the research paperwork as further reasons for non-participation. B) All 11 individuals with ID completed the IPAQ-s, while only three accepted the accelerometer, and for one of these wearing it was deemed inappropriate. Reasons for rejecting accelerometers included statements from participants of: ‘too expensive’, ‘too heavy’, ‘uncomfortable’, and two people said they would not want to wear it for more than one day. All adults with ID (11) and their support workers (6) provided information about their physical activity levels through the IPAQ-s. Conclusions: Care home managers are a barrier to research participation.
However, adults with ID readily accepted the IPAQ-s as a PA measure, but were less willing to undergo 7-day accelerometer monitoring. In order to improve participation in this population, the choice of PA measure is important. Moreover, there is a need for studies exploring how best to engage ID residential home managers in PA research. Keywords: intellectual disability, physical activity measurement, research engagement, research participation
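For context, the IPAQ-s is conventionally scored in MET-minutes per week, with MET values of 3.3 (walking), 4.0 (moderate) and 8.0 (vigorous) per the IPAQ scoring protocol; the sketch below illustrates that scoring with a hypothetical participant (this is an assumption about standard scoring, not a description of this study's analysis):

```python
# MET values from the IPAQ short-form scoring protocol (assumed standard)
MET = {"walking": 3.3, "moderate": 4.0, "vigorous": 8.0}

def met_minutes(activity, minutes_per_day, days_per_week):
    """MET-minutes/week contributed by one activity domain of the IPAQ-s."""
    return MET[activity] * minutes_per_day * days_per_week

def total_met_minutes(entries):
    """Total MET-min/week over (activity, min/day, days/week) entries."""
    return sum(met_minutes(a, m, d) for a, m, d in entries)

# Hypothetical participant: walks 30 min on 5 days, moderate 20 min on 3 days
week = [("walking", 30, 5), ("moderate", 20, 3)]
total = total_met_minutes(week)
```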
Procedia PDF Downloads 306
3156 Effects of Tenofovir Disoproxil Fumarate on the Renal Sufficiency of HIV Positive Patients
Authors: Londeka Ntuli, Frasia Oosthuizen
Abstract:
Background: Tenofovir disoproxil fumarate (TDF) is a nephrotoxic drug and has been proven to contribute to renal insufficiency, necessitating intensive monitoring and management of adverse effects arising from prolonged exposure to the drug. TDF is one of the preferred first-line drugs used in combination therapy in most regions. An estimated 300,000 patients are initiated on the Efavirenz/TDF/Emtricitabine first-line regimen annually in South Africa. It is against this background that this study aims to investigate the effects of TDF on the renal sufficiency of HIV positive patients. Methodology: A retrospective quantitative study was conducted, analysing clinical charts of HIV positive patients older than 18 years of age and on a TDF-containing regimen for more than 1 year. Data were obtained from the analysis of patient files and transcribed into a Microsoft® Excel® spreadsheet. Extracted data were coded, categorised and analysed using STATA®. Results: Renal function started decreasing after 3 months of treatment (with 93.5% of patients having a normal eGFR), and kept decreasing as time progressed, with only 39.6% having normal renal function at year 4. Additional risk factors for renal insufficiency included age below 25, female gender, and additional medication. Conclusion: It is clear from this study that the use of TDF necessitates intensive monitoring and management of adverse effects arising from prolonged exposure to the drug. The findings from this study generated pertinent information on the safety profile of the drug TDF in a resource-limited setting of a public health institution.
Appropriate management is of tremendous importance in the South African context, where the majority of HIV positive individuals are on a TDF-containing regimen; thus it is beneficial to ascertain the possible level of toxicities these patients may be experiencing. Keywords: renal insufficiency, tenofovir, HIV, risk factors
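The eGFR values used to track renal function are typically estimated from serum creatinine. As a hedged illustration, the sketch below implements one common estimator, the CKD-EPI (2009) equation (race term omitted); the patient values are hypothetical, and this is not necessarily the equation used in the study:

```python
def egfr_ckd_epi(scr_mg_dl, age, female):
    """CKD-EPI (2009) eGFR estimate in mL/min/1.73 m^2 (race term omitted).
    Coefficients per the published equation: kappa 0.7/0.9, alpha
    -0.329/-0.411 for female/male."""
    kappa = 0.7 if female else 0.9
    alpha = -0.329 if female else -0.411
    ratio = scr_mg_dl / kappa
    egfr = (141
            * min(ratio, 1.0) ** alpha
            * max(ratio, 1.0) ** -1.209
            * 0.993 ** age)
    if female:
        egfr *= 1.018
    return egfr

# Hypothetical patient: male, 40 years, serum creatinine 1.0 mg/dL
e = egfr_ckd_epi(1.0, 40, female=False)
```

Monitoring programs would recompute this at each visit and flag sustained declines below the normal threshold.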
Procedia PDF Downloads 121
3155 Development of a Fuzzy Logic Based Model for Monitoring Child Pornography
Authors: Mariam Ismail, Kazeem Rufai, Jeremiah Balogun
Abstract:
A study was conducted to apply fuzzy logic to the development of a monitoring model for child pornography based on associated risk factors, which can be used by forensic experts or integrated into forensic systems for the early detection of child pornographic activities. A number of methods were adopted in the study: an extensive review of related works was done in order to identify the factors that are associated with child pornography, following which they were validated by an expert sex psychologist and guidance counselor, and relevant data was collected. Fuzzy membership functions were used to fuzzify the associated variables identified, alongside the risk of the occurrence of child pornography, based on the inference rules provided by the experts consulted, and the fuzzy logic expert system was simulated using the Fuzzy Logic Toolbox available in MATLAB Release 2016. The study identified 4 categories of risk factors required for assessing the risk of a suspect committing child pornography offenses. Triangular membership functions with 2 and 3 labels were used to formulate the risk factors, according to the number of labels assigned to each. Five fuzzy logic models were formulated, such that the first 4 were used to assess the impact of each category on child pornography, while the last one takes the 4 outputs of those models as the inputs required for assessing the overall risk of child pornography. The following conclusion was made: there are factors related to personal traits, social traits, history of child pornography crimes, and self-regulatory deficiency traits of suspects required for the assessment of the risk of child pornography crimes committed by a suspect.
Using the values of the identified risk factors selected for this study, the risk of child pornography can be easily assessed in order to determine the likelihood of a suspect perpetrating the crime. Keywords: fuzzy, membership functions, pornography, risk factors
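The triangular membership functions mentioned above can be sketched directly; the label breakpoints and the 0–10 risk scale below are hypothetical illustrations, not the study's fitted values:

```python
def triangular(x, a, b, c):
    """Triangular membership function with feet at a and c, peak at b."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

# Hypothetical 3-label fuzzy variable on a 0..10 risk-factor scale
def low(x):
    return triangular(x, -1, 0, 5)

def medium(x):
    return triangular(x, 0, 5, 10)

def high(x):
    return triangular(x, 5, 10, 11)

# Fuzzification of a crisp input of 6.0
memberships = {"low": low(6.0), "medium": medium(6.0), "high": high(6.0)}
```

A 2-label variable would be built the same way with two overlapping triangles; the rule base then combines such memberships across the four factor categories.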
Procedia PDF Downloads 127
3154 Coastal Modelling Studies for Jumeirah First Beach Stabilization
Authors: Zongyan Yang, Gagan K. Jena, Sankar B. Karanam, Noora M. A. Hokal
Abstract:
Jumeirah First beach, a segment of coastline of length 1.5 km, is one of the popular public beaches in Dubai, UAE. The stability of the beach has been affected by several coastal development projects, including The World, Island 2 and La Mer. A comprehensive stabilization scheme comprising two composite groynes (of lengths 90 m and 125 m), modification of the northern breakwater of Jumeirah Fishing Harbour, and beach re-nourishment was implemented by Dubai Municipality in 2012. However, the performance of the implemented stabilization scheme has been compromised by the La Mer project (built in 2016), which modified the wave climate at the Jumeirah First beach. The objective of the coastal modelling studies is to establish the design basis for further beach stabilization scheme(s). Comprehensive coastal modelling studies were conducted to establish the nearshore wave climate, equilibrium beach orientations and stable beach plan forms. Based on the outcomes of the modelling studies, a recommendation was made to extend the composite groynes to stabilize the Jumeirah First beach. Wave transformation was performed following an interpolation approach with wave transformation matrices derived from simulations of the possible range of wave conditions in the region. The Dubai coastal wave model is developed with MIKE21 SW. The offshore wave conditions were determined from PERGOS wave data at 4 offshore locations with consideration of the spatial variation. The lateral boundary conditions corresponding to the offshore conditions, at the Dubai/Abu Dhabi and Dubai/Sharjah borders, were derived with the LitDrift 1D wave transformation module. The Dubai coastal wave model was calibrated with wave records at monitoring stations operated by Dubai Municipality. The wave transformation matrix approach was validated with nearshore wave measurements at a Dubai Municipality monitoring station in the vicinity of the Jumeirah First beach.
A typical one-year wave time series was transformed to 7 locations in front of the beach to account for the variation of wave conditions, which are affected by adjacent and offshore developments. Equilibrium beach orientations were estimated with LitDrift by finding the beach orientations with null annual littoral transport at the 7 selected locations. The littoral transport calculation results were compared with beach erosion/accretion quantities estimated from the beach monitoring program (twice a year, including bathymetric and topographical surveys). An innovative integral method was developed to outline the stable beach plan forms from the estimated equilibrium beach orientations, with a predetermined minimum beach width. The optimal lengths for the composite groyne extensions were recommended based on the stable beach plan forms. Keywords: composite groyne, equilibrium beach orientation, stable beach plan form, wave transformation matrix
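The idea of finding the orientation with null annual littoral transport can be illustrated with a CERC-type transport law, in which each wave record contributes transport proportional to H^2.5·sin(2α), with α the angle between the wave direction and the shore normal. This is an illustrative sketch (not the LitDrift implementation), and the wave climate below is hypothetical:

```python
import math

def net_transport(shore_normal_deg, waves):
    """Net annual littoral transport (relative units) for a given shore-normal
    direction, using a CERC-type H^2.5 * sin(2*alpha) law."""
    return sum(h ** 2.5 * math.sin(2 * math.radians(d - shore_normal_deg))
               for h, d in waves)

def equilibrium_orientation(waves):
    """Shore-normal direction with zero net annual transport.
    net_transport(t) = S*cos(2t) - C*sin(2t), so the stable zero is
    t = 0.5 * atan2(S, C)."""
    S = sum(h ** 2.5 * math.sin(2 * math.radians(d)) for h, d in waves)
    C = sum(h ** 2.5 * math.cos(2 * math.radians(d)) for h, d in waves)
    return math.degrees(0.5 * math.atan2(S, C))

# Hypothetical annual wave climate: (height m, direction deg from north)
climate = [(1.0, 10.0), (1.0, -10.0), (0.5, 30.0)]
eq = equilibrium_orientation(climate)
```

In practice each of the 7 locations gets its own transformed wave climate, yielding a local equilibrium orientation from which the stable plan form is integrated.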
Procedia PDF Downloads 262
3153 Improving Fingerprinting-Based Localization System Using Generative AI
Authors: Getaneh Berie Tarekegn
Abstract:
A precise localization system is crucial for many artificial intelligence Internet of Things (AI-IoT) applications in the era of smart cities. These applications include traffic monitoring, emergency alarming, environmental monitoring, location-based advertising, intelligent transportation, and smart health care. The most common method for providing continuous positioning services in outdoor environments is a global navigation satellite system (GNSS). Due to non-line-of-sight, multipath, and weather conditions, GNSS systems do not perform well in dense urban, urban, and suburban areas. This paper proposes a generative AI-based positioning scheme for large-scale wireless settings using fingerprinting techniques. We present a semi-supervised deep convolutional generative adversarial network (S-DCGAN)-based radio map construction method for real-time device localization. It also employs a reliable signal fingerprint feature extraction method with t-distributed stochastic neighbor embedding (t-SNE), which extracts dominant features while eliminating noise from hybrid WLAN and long-term evolution (LTE) fingerprints. The proposed scheme reduced the workload of site surveying required to build the fingerprint database by up to 78.5% and significantly improved positioning accuracy. The results show that the average positioning error of GAILoc is less than 0.39 m, and more than 90% of the errors are less than 0.82 m. According to numerical results, SRCLoc improves positioning performance and reduces radio map construction costs significantly compared to traditional methods. Keywords: location-aware services, feature extraction technique, generative adversarial network, long short-term memory, support vector machine
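Accuracy figures like "average error below 0.39 m" and "90% of errors below 0.82 m" come from per-fix Euclidean errors and their percentiles; a minimal sketch of that evaluation, with hypothetical position fixes:

```python
import math

def positioning_errors(estimates, truths):
    """Euclidean error (m) for each (x, y) position estimate."""
    return [math.dist(e, t) for e, t in zip(estimates, truths)]

def percentile(values, p):
    """p-th percentile by the nearest-rank method."""
    ordered = sorted(values)
    k = max(0, math.ceil(p / 100 * len(ordered)) - 1)
    return ordered[k]

# Hypothetical fingerprinting fixes vs. ground truth (metres)
est = [(0.0, 0.3), (1.2, 1.0), (2.0, 2.5), (3.1, 3.0), (4.0, 4.9)]
true = [(0.0, 0.0), (1.0, 1.0), (2.0, 2.0), (3.0, 3.0), (4.0, 4.0)]
errs = positioning_errors(est, true)
mean_err = sum(errs) / len(errs)
p90 = percentile(errs, 90)
```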
Procedia PDF Downloads 57
3152 SLIITBOT: Design of a Socially Assistive Robot for SLIIT
Authors: Chandimal Jayawardena, Ridmal Mendis, Manoji Tennakoon, Theekshana Wijayathilaka, Randima Marasinghe
Abstract:
This paper defines the research area and implementation of the socially assistive robot SLIITBOT. It describes the overall process implemented within the robot’s system and its limitations, along with a literature survey. The project develops a socially assistive robot called SLIITBOT that interacts with people within the university through voice outputs and a graphical user interface, and benefits them with updates and tasks. The robot will be able to detect a person when he/she enters the room, navigate towards the position where the person is standing, welcome and greet the particular person with a simple conversation using its voice, introduce its services through its voice, and provide the person with services through electronic input via an app while guiding the person with voice outputs. Keywords: application, detection, dialogue, navigation
Procedia PDF Downloads 167
3151 Soil Bioremediation Monitoring Systems Powered by Microbial Fuel Cells
Authors: András Fülöp, Lejla Heilmann, Zsolt Szabó, Ákos Koós
Abstract:
Microbial fuel cells (MFCs) present a sustainable biotechnological solution to future energy demands. The aim of this study was to construct soil-based, single-cell, membrane-less MFC systems, operated without treatment, to continuously power on-site monitoring and control systems during soil bioremediation processes. Our Pseudomonas aeruginosa 541 isolate is an ideal choice for MFCs because it is able to produce pyocyanin, which behaves as an electron-shuttle molecule and also has a significant antimicrobial effect. We tested several materials and structural configurations to obtain long-term high power output. Comparing different configurations, proton-exchange-membrane-less MFC tubes of 0.6 m length and 0.05 m diameter offered the best long-term performance. Long-term electricity production was tested from starch, yeast extract (YE), and carboxymethyl cellulose (CMC), with humic acid (HA) as a mediator. In all cases, a 3 kΩ external load was used. The two best-performing systems were the Pseudomonas aeruginosa 541 MFCs with 1% carboxymethyl cellulose, and the MFCs with 1% yeast extract in the anode area and 35% hydrogel in the cathode chamber. The first had 3.3 ± 0.033 mW/m² and the second 4.1 ± 0.065 mW/m² power density values. These systems operated for 230 days without any treatment. The addition of 0.2% HA and 1% YE, relative to the volume of the anode area, resulted in 1.4 ± 0.035 mW/m² power density. The mixture of 1% starch with 0.2% HA gave 1.82 ± 0.031 mW/m². Using CMC as a slow-release carbon source promotes long-term bacterial survival, thus sustaining the long-term power output. The application of hydrogels in the cathode chamber significantly increased the performance of the MFC units due to their good water retention capacity. Keywords: microbial fuel cell, bioremediation, Pseudomonas aeruginosa, biotechnological solution
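The reported power densities follow from the voltage across the external load, P = V²/R, divided by the electrode area. A sketch with assumed values (the 3 kΩ load matches the abstract; the cell voltage and anode area below are hypothetical):

```python
def power_density_mw_m2(voltage_v, load_ohm, anode_area_m2):
    """Areal power density (mW/m^2) of an MFC across its external load."""
    power_w = voltage_v ** 2 / load_ohm       # P = V^2 / R
    return power_w / anode_area_m2 * 1000.0   # W/m^2 -> mW/m^2

# Hypothetical cell: 0.35 V across the 3 kOhm external load, 0.01 m^2 anode
pd = power_density_mw_m2(0.35, 3000.0, 0.01)
```

With these assumed numbers the result lands near 4.1 mW/m², the same order as the best system reported above.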
Procedia PDF Downloads 289
3150 The Solution of Nonlinear Partial Differential Equation for The Phenomenon of Instability in Homogeneous Porous Media by Homotopy Analysis Method
Authors: Kajal K. Patel, M. N. Mehta, T. R. Singh
Abstract:
When water is injected into an oil-bearing formation during the secondary oil recovery process, instability occurs near the common interface due to the viscosity difference between the injected water and the native oil. The governing equation gives rise to a non-linear partial differential equation, and its solution has been obtained by the Homotopy analysis method with an appropriate guess value of the solution together with some conditions and standard relations. The solution gives the average cross-sectional area occupied by the schematic fingers during the occurrence of the instability phenomenon. The numerical and graphical presentation has been developed using Maple software. Keywords: capillary pressure, homotopy analysis method, instability phenomenon, viscosity
Procedia PDF Downloads 494
3149 Improving Alkaline Water Electrolysis by Using an Asymmetrical Electrode Cell Design
Authors: Gabriel Wosiak, Felipe Staciaki, Eryka Nobrega, Ernesto Pereira
Abstract:
Hydrogen is an energy carrier with potential applications in various industries. Alkaline electrolysis is a commonly used method for hydrogen production; however, its energy cost remains relatively high compared to other methods. This is due in part to interfacial pH changes that occur during the electrolysis process. Interfacial pH changes refer to the changes in pH that occur at the interface between an electrode and the electrolyte solution. These changes are caused by the electrochemical reactions at both electrodes, which consume or produce hydroxide ions (OH-) from the electrolyte solution. This results in an important change in the local pH at the electrode surface, which can have several impacts on the energy consumption and durability of electrolysers. One impact of interfacial pH changes is an increase in the overpotential required for hydrogen production. Overpotential is the difference between the theoretical potential required for a reaction to occur and the actual potential that is applied to the electrodes. In the case of water electrolysis, the overpotential is caused by a number of factors, including the mass transport of reactants and products to and from the electrodes, the kinetics of the electrochemical reactions, and the interfacial pH. A decrease in the interfacial pH at the anode surface in alkaline conditions can lead to an increase in the overpotential, because the lower local pH makes it more difficult for the hydroxide ions to be oxidized. As a result, more energy is required for the process to occur. In addition to increasing the overpotential, interfacial pH changes can also lead to the degradation of the electrodes. This is because the lower pH can make the electrode more susceptible to corrosion. As a result, the electrodes may need to be replaced more frequently, which can increase the overall cost of water electrolysis.
The method presented in the paper addresses the issue of interfacial pH changes by introducing electrode asymmetry into the cell design. This design helps to mitigate the pH gradient at the anode/electrolyte interface, which reduces the overpotential and improves the energy efficiency of the electrolyser. The method was tested using a multivariate approach under both laboratory and industrial current density conditions, and the results were validated with numerical simulations. The results demonstrated a clear improvement (11.6%) in energy efficiency, providing an important contribution to the field of sustainable energy production. The findings of the paper have important implications for the development of cost-effective and sustainable hydrogen production methods. By mitigating interfacial pH changes, it is possible to improve the energy efficiency of alkaline electrolysis and make it a more competitive option for hydrogen production. Keywords: electrolyser, interfacial pH, numerical simulation, optimization, asymmetric cell
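The energetic cost of an interfacial pH shift can be estimated from the Nernst equation, which moves an electrode's equilibrium potential by roughly 59 mV per pH unit at 25 °C (RT·ln10/F). A sketch of that estimate (the 2-unit local pH change is an assumed illustration, not a measured value from the paper):

```python
import math

R = 8.314462618   # gas constant, J/(mol*K)
F = 96485.33212   # Faraday constant, C/mol

def nernst_shift_mv(delta_ph, temp_c=25.0):
    """Nernstian shift (mV) of an electrode's equilibrium potential
    for a change in local interfacial pH: RT*ln(10)/F per pH unit."""
    T = temp_c + 273.15
    return R * T * math.log(10) / F * delta_ph * 1000.0

# A hypothetical 2-unit local pH drop at the anode raises its
# required potential by about 118 mV at 25 degrees C
extra_mv = nernst_shift_mv(2.0)
```

Against a typical cell voltage of around 2 V, a shift of this size illustrates why mitigating the interfacial pH gradient can yield efficiency gains of several percent.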
Procedia PDF Downloads 68
3148 An Informative Marketing Platform: Methodology and Architecture
Authors: Martina Marinelli, Samanta Vellante, Francesco Pilotti, Daniele Di Valerio, Gaetanino Paolone
Abstract:
Any development in web marketing technology requires changes in information engineering to identify instruments and techniques suitable for the production of software applications for informative marketing. Moreover, for large web solutions, designing an interface that enables human interactions is a complex process that must bridge between informative marketing requirements and the developed solution. A user-friendly interface in web marketing applications is crucial for a successful business. The paper introduces mkInfo - a software platform that implements informative marketing. Informative marketing is a new interpretation of marketing which places the information at the center of every marketing action. The creative team includes software engineering researchers who have recently authored an article on automatic code generation. The authors have created the mkInfo software platform to generate informative marketing web applications. For each web application, it is possible to automatically implement an opt in page, a landing page, a sales page, and a thank you page: one only needs to insert the content. mkInfo implements an autoresponder to send mail according to a predetermined schedule. The mkInfo platform also includes e-commerce for a product or service. The stakeholder can access any opt-in page and get basic information about a product or service. If he wants to know more, he will need to provide an e-mail address to access a landing page that will generate an e-mail sequence. It will provide him with complete information about the product or the service. From this point on, the stakeholder becomes a user and is now able to purchase the product or related services through the mkInfo platform. This paper suggests a possible definition for Informative Marketing, illustrates its basic principles, and finally details the mkInfo platform that implements it. This paper also offers some Informative Marketing models, which are implemented in the mkInfo platform. 
Informative marketing can be applied to products or services. It is necessary to realize a web application for each product or service. The mkInfo platform enables the product or service producer to send information concerning a specific product or service to all stakeholders. In conclusion, the technical contributions of this paper are: a different interpretation of marketing based on information; a modular architecture for web applications, particularly for one with standard features such as information storage, exchange, and delivery; multiple models to implement informative marketing; and a software platform enabling the implementation of such models in a web application. Future research aims to enable stakeholders to provide information about a product or a service, so that the information gathered includes both the producer’s and the stakeholders' points of view. The purpose is to create an all-inclusive management system of the knowledge regarding a specific product or service: a system that includes everything about the product or service and is able to address even unexpected questions. Keywords: informative marketing, opt in page, software platform, web application
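The autoresponder's "predetermined schedule" could be modelled as a simple day-offset e-mail sequence triggered by the opt-in; a hypothetical sketch (mkInfo's actual data model is not described at this level of detail, so the sequence and field names below are assumptions):

```python
from datetime import datetime, timedelta

# Hypothetical sequence: days after opt-in -> message template name
SEQUENCE = [(0, "welcome"), (2, "product_details"), (5, "offer")]

def schedule_emails(opt_in_time, sequence=SEQUENCE):
    """Concrete send times for one stakeholder's autoresponder sequence."""
    return [(opt_in_time + timedelta(days=days), template)
            for days, template in sequence]

# A stakeholder who opted in on the landing page at 09:00 on Jan 1
sends = schedule_emails(datetime(2024, 1, 1, 9, 0))
```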
Procedia PDF Downloads 126
3147 Evaluation of the Cytotoxicity and Genotoxicity of Chemical Material in Filters PM2.5 of the Monitoring Stations of the Network of Air Quality in the Valle De Aburrá, Colombia
Authors: Alejandra Betancur Sánchez, Carmen Elena Zapata Sánchez, Juan Bautista López Ortiz
Abstract:
Adverse effects and increased air pollution have raised concerns about regulatory policies and have fostered the development of new air quality standards; this is due to the complexity of the composition and the poorly understood reactions in the atmospheric environment. Toxic compounds act as environmental agents having various effects, from irritation to death of cells and tissues. A toxic agent is one that produces an adverse response in a biological system. There is a particular class that produces some kind of alteration in the genetic material or associated components, so they are recognized as genotoxic agents. Within cells, they interact directly or indirectly with DNA, causing mutations, or interfere with some enzymatic repair processes or with the genesis or polymerization of proteinaceous material involved in chromosome segregation. An air pollutant may cause or contribute to increased mortality or serious illness and even pose a potential danger to human health. The aim of this study was to evaluate the effect on viability and the genotoxic potential of PM2.5 particulate matter, obtained from filters collected at three monitoring stations of the Aburrá Valley air quality network, on the CHO-K1 and Jurkat cell lines and peripheral blood T lymphocytes. The MTT reduction, trypan blue, neutral red uptake (NRU), comet, sister chromatid exchange (SCE) and chromosomal aberration assays showed a reduction in cell viability in the CHO-K1 and Jurkat cell lines and damage to the DNA of the CHO-K1 cell line; however, no significant effects were observed in the number of SCEs or chromosomal aberrations. The results suggest that PM2.5 material has genotoxic potential and can induce cancer development, as has been suggested in other studies. Keywords: PM2.5, cell line Jurkat, cell line CHO-K1, cytotoxicity, genotoxicity
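Colorimetric assays such as MTT reduction quantify cytotoxicity as viability relative to untreated controls after blank subtraction; a minimal sketch of that calculation (the absorbance readings below are hypothetical):

```python
def viability_percent(abs_treated, abs_control, abs_blank=0.0):
    """Cell viability (%) from MTT absorbance readings, relative to the
    untreated control, after subtracting the blank well."""
    return (abs_treated - abs_blank) / (abs_control - abs_blank) * 100.0

# Hypothetical 570 nm readings for PM2.5-exposed vs. control wells
v = viability_percent(abs_treated=0.42, abs_control=0.80, abs_blank=0.05)
```

The same relative-to-control logic underlies trypan blue counts and NRU readings, with cell counts or neutral red absorbance in place of formazan absorbance.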
Procedia PDF Downloads 263
3146 Investigation on Pull-Out-Behavior and Interface Critical Parameters of Polymeric Fibers Embedded in Concrete and Their Correlation with Particular Fiber Characteristics
Authors: Michael Sigruener, Dirk Muscat, Nicole Struebbe
Abstract:
Fiber reinforcement is a state-of-the-art approach to enhancing the mechanical properties of plastics. In concrete and civil engineering, steel reinforcements are commonly used. Steel reinforcements show disadvantages in their chemical resistance and weight, whereas the major problems of polymer fibers lie in fiber-matrix adhesion and mechanical properties. In spite of these facts, longevity, easy handling, and chemical resistance motivate researchers to develop a polymeric material for fiber-reinforced concrete. Adhesion and interfacial mechanisms in fiber-polymer composites have already been studied thoroughly. For polymer fibers used as concrete reinforcement, the bonding behavior still requires deeper investigation. Therefore, several differing polymers (e.g., polypropylene (PP), polyamide 6 (PA6) and polyetheretherketone (PEEK)) were spun into fibers via single-screw extrusion and monoaxial stretching. Fibers were then embedded in a concrete matrix, and Single-Fiber Pull-Out Tests (SFPT) were conducted to investigate the bonding characteristics and microstructural interface of the composite. Differences in maximum pull-out force, displacement, and the slope of the linear part of the force-displacement curve, which depict the adhesion strength and the ductility of the interfacial bond, were studied. In SFPT, fiber debonding is an inhomogeneous process in which interfacial bonding and friction mechanisms add up to a resulting value. Therefore, correlations between polymer properties and pull-out mechanisms have to be emphasized. To investigate these correlations, all fibers were subjected to a series of analyses, such as differential scanning calorimetry (DSC), contact angle measurement, surface roughness and hardness analysis, tensile testing, and scanning electron microscopy (SEM).
Of each polymer, smooth and abraded fibers were tested, first to simulate the abrasion and damage caused by a concrete mixing process and secondly to estimate the influence of mechanical anchoring of rough surfaces. In general, abraded fibers showed a significant increase in maximum pull-out force due to better mechanical anchoring; friction processes therefore play a major role in increasing the maximum pull-out force. Polymer hardness affects the tribological behavior, and polymers with high hardness retain lower surface roughness, as verified by SEM and surface roughness measurements. This results in a decreased maximum pull-out force for hard polymers. Polymers with high surface energy show better interfacial bonding strength in general, which coincides with the conducted SFPT investigation: polymers such as PEEK or PA6 show higher bonding strength for both smooth and roughened fibers, revealed through high pull-out force and concrete particles bonded to the fiber surface, as pictured via SEM analysis. Surface energy divides into a dispersive and a polar part, and the slope of the SFPT curve correlates with the polar part: only polar polymers increase their slope, through better wetting, when rough surfaces offer a larger bonding area. Hence, the maximum force and the bonding strength of an embedded fiber are a function of polarity, hardness, and consequently surface roughness. Other properties, such as crystallinity or tensile strength, do not affect the bonding behavior. Through the conducted analysis, it is now feasible to understand and resolve the different effects in pull-out behavior step by step, based on the polymer properties themselves. This investigation developed a roadmap for engineering highly adhering polymeric materials for the fiber reinforcement of concrete. Keywords: fiber-matrix interface, polymeric fibers, fiber reinforced concrete, single fiber pull-out test
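A common way to reduce SFPT results to a single comparable number is the apparent interfacial shear strength, τ = F_max / (π·d·l_embedded), averaging the peak force over the embedded fiber surface. A sketch with hypothetical test values (not the study's data):

```python
import math

def interfacial_shear_strength(f_max_n, diameter_m, embedded_length_m):
    """Apparent interfacial shear strength (Pa) from a single-fiber
    pull-out test: tau = F_max / (pi * d * l_embedded)."""
    return f_max_n / (math.pi * diameter_m * embedded_length_m)

# Hypothetical SFPT: 8 N peak force, 0.5 mm fiber, 20 mm embedded length
tau_mpa = interfacial_shear_strength(8.0, 0.5e-3, 20e-3) / 1e6
```

Because the averaged τ folds together chemical adhesion and friction, it is usually read alongside the force-displacement slope discussed above rather than in isolation.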
Procedia PDF Downloads 111