Search results for: third party monitoring software
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 8058

6288 Developing Stability Monitoring Parameters for NIPRIMAL®: A Monoherbal Formulation for the Treatment of Uncomplicated Malaria

Authors: Ekere E. Kokonne, Isimi C. Yetunde, Okoh E. Judith, Okafor E. Ijeoma, Ajeh J. Isaac, Olobayo O. Kunle, Emeje O. Martins

Abstract:

NIPRIMAL® is a monoherbal formulation of Nauclea latifolia used in the treatment of malaria. The stability of extracts made from plant material is essential to ensure the quality, safety and efficacy of the finished product. This study assessed the stability of the formulation under three different storage conditions: normal room temperature, infrared exposure and refrigeration. Differential Scanning Calorimetry (DSC) and Thin Layer Chromatography (TLC) were used to monitor the formulations. The DSC analysis was carried out from 0 °C to 350 °C under the three storage conditions. The results obtained indicate that NIPRIMAL® was stable under all the storage conditions investigated. TLC after 6 months showed no significant difference between retention factor (RF) values for the various storage conditions. The reference sample had four spots with RF values of 0.47, 0.68, 0.76 and 0.82, and these spots were retained in the test formulations, with corresponding RF values after 6 months of 0.56, 0.73, 0.80 and 0.92 at room temperature and 0.47, 0.68, 0.76 and 0.82 under refrigeration. On the other hand, the RF values (0.55, 0.74, 0.77, 0.93) obtained under infrared after 1 month varied slightly from the reference. The sample exposed to infrared also had a lower heat capacity compared to that stored at room temperature or under refrigeration. A combination of TLC and DSC measurements has thus been applied for assessing the stability of NIPRIMAL®. Both methods were found to be rapid, sensitive and reliable in determining its stability. It is concluded that NIPRIMAL® can be stored under any of the tested conditions without degradation. This study is a major contribution towards developing appropriate stability monitoring parameters for herbal products.

Keywords: differential scanning calorimetry, formulation, NIPRIMAL®, stability, thin layer chromatography

Procedia PDF Downloads 257
6287 Data and Spatial Analysis for Economy and Education of 28 E.U. Member-States for 2014

Authors: Alexiou Dimitra, Fragkaki Maria

Abstract:

The objective of the paper is the study of geographic, economic and educational variables and their contribution to determining the position of each member state among the EU-28 countries, based on the values of seven variables as given by Eurostat. The data analysis methods of Multiple Factorial Correspondence Analysis (MFCA), Principal Component Analysis and Factor Analysis have been used. The cross-tabulation tables of data consist of the values of seven variables for the 28 countries for 2014. The data are processed using the CHIC Analysis V 1.1 software package. The results of this program using MFCA and Ascending Hierarchical Classification are given in numerical and graphical form. For comparison, the Factor procedure of the IBM SPSS 20 statistical package has been applied to the same data. The numerical and graphical results, presented as tables and graphs, demonstrate the agreement between the two methods. The most important result is the study of the relation between the 28 countries and the position of each country in groups or clouds, which are formed according to the values of the corresponding variables.
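
To illustrate the kind of dimensionality reduction described above (this is not the authors' CHIC Analysis or SPSS workflow), the following Python sketch applies Principal Component Analysis to a small country-by-indicator table; the country subset and indicator values are hypothetical.

```python
# Illustrative PCA on a countries x indicators table (hypothetical values),
# mirroring the kind of reduction performed with CHIC Analysis / SPSS.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

countries = ["Austria", "Belgium", "Bulgaria", "Croatia"]   # subset for illustration
X = np.array([
    [38.0,  9.8, 5.6, 40.0],   # hypothetical indicators, e.g. GDP/capita, unemployment %, ...
    [36.5,  8.5, 6.1, 44.0],
    [ 5.9, 11.4, 4.1, 27.0],
    [10.2, 17.3, 4.6, 32.0],
])

X_std = StandardScaler().fit_transform(X)     # put variables on comparable scales
pca = PCA(n_components=2)
scores = pca.fit_transform(X_std)             # country coordinates on the first two axes

for name, (pc1, pc2) in zip(countries, scores):
    print(f"{name:10s}  PC1={pc1:+.2f}  PC2={pc2:+.2f}")
print("explained variance ratio:", pca.explained_variance_ratio_)
```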

Keywords: Multiple Factorial Correspondence Analysis, Principal Component Analysis, Factor Analysis, E.U.-28 countries, Statistical package IBM SPSS 20, CHIC Analysis V 1.1 Software, Eurostat.eu Statistics

Procedia PDF Downloads 511
6286 Comparative Study of Accuracy of Land Cover/Land Use Mapping Using Medium Resolution Satellite Imagery: A Case Study

Authors: M. C. Paliwal, A. K. Jain, S. K. Katiyar

Abstract:

Accuracy assessment is very important for the classification of satellite imagery. In order to determine the accuracy of a classified image, assumed-true data are usually derived from ground truth data collected with the Global Positioning System. The data obtained from the satellite imagery and the ground truth data are then compared to find out the accuracy of the classification, and error matrices are prepared. Overall and individual accuracies are calculated using different methods. The study illustrates advanced classification and accuracy assessment of land use/land cover mapping using satellite imagery. IRS-1C LISS-IV data were used for the classification. The satellite image was classified using the software into fourteen classes, including water bodies, agricultural fields, forest land, urban settlement, barren land and unclassified area. Classification of the satellite imagery and calculation of accuracy were done using ERDAS Imagine software to find out the best method. This study is based on data collected for the Bhopal city boundaries of Madhya Pradesh State of India.
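
The accuracy assessment step described above can be illustrated with a short sketch. The error matrix below is hypothetical (it is not the paper's Bhopal data) and the class list is abbreviated; the overall, producer's and user's accuracies follow the standard definitions.

```python
# Hypothetical error (confusion) matrix: classified pixels vs. GPS ground truth.
import numpy as np

classes = ["water", "agriculture", "forest", "urban", "barren"]
# rows = ground truth, columns = classified map
error_matrix = np.array([
    [52,  1,  0,  0,  2],
    [ 3, 61,  4,  1,  5],
    [ 0,  6, 48,  0,  1],
    [ 1,  2,  0, 39,  3],
    [ 2,  4,  1,  2, 44],
])

total = error_matrix.sum()
overall_accuracy = np.trace(error_matrix) / total
producers = np.diag(error_matrix) / error_matrix.sum(axis=1)  # omission errors per class
users     = np.diag(error_matrix) / error_matrix.sum(axis=0)  # commission errors per class

print(f"overall accuracy: {overall_accuracy:.2%}")
for c, p, u in zip(classes, producers, users):
    print(f"{c:12s} producer's={p:.2%}  user's={u:.2%}")
```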

Keywords: resolution, accuracy assessment, land use mapping, satellite imagery, ground truth data, error matrices

Procedia PDF Downloads 508
6285 Field Evaluation of Concrete Using Hawaiian Aggregates for Alkali Silica Reaction

Authors: Ian N. Robertson

Abstract:

Alkali Silica Reaction (ASR) occurs in concrete when the alkali hydroxides (NaOH and KOH) from the cement react with unstable silica, SiO2, in some types of aggregate. The gel that forms during this reaction expands when it absorbs water, potentially leading to cracking and overall expansion of the concrete. ASR has resulted in accelerated deterioration of concrete highways, dams and other structures that are exposed to moisture during their service life. Concrete aggregates available in Hawaii have not demonstrated a history of ASR; however, accelerated laboratory tests using ASTM C1260 indicated a potential for ASR with some aggregates. Certain clients are now requiring the import of aggregates from the US mainland at great expense. In order to assess the accuracy of the laboratory test results, a long-term field study of the potential for ASR in concretes made with Hawaiian aggregates was initiated in 2011 with funding from the US Federal Highway Administration and the Hawaii Department of Transportation. Thirty concrete specimens were constructed of various concrete mixtures using aggregates from all Hawaiian aggregate sources, together with some US mainland aggregates known to exhibit ASR expansion. The specimens are located in an open field site in Manoa valley on the Hawaiian island of Oahu, exposed to relatively high humidity and frequent rainfall. A weather station at the site records the ambient conditions on a continual basis. After two years of monitoring, only one of the Hawaiian aggregates showed any sign of expansion. Ten additional specimens were fabricated with this aggregate to confirm the earlier observations. Admixtures known to mitigate ASR, such as fly ash and lithium, were included in some specimens to evaluate their effect on the concrete expansion. This paper describes the field evaluation program and presents the results for all forty specimens after four years of monitoring.

Keywords: aggregate, alkali silica reaction, concrete durability, field exposure

Procedia PDF Downloads 247
6284 System and Method for Providing Web-Based Remote Application Service

Authors: Shuen-Tai Wang, Yu-Ching Lin, Hsi-Ya Chang

Abstract:

With the development of virtualization technologies, a new type of service, the cloud computing service, has emerged. Cloud users usually encounter the problem of how to use the virtualized platform easily over the web without requiring the plug-in or installation of special software. The objective of this paper is to develop a system and a method enabling process interfacing within an automation scenario for accessing a remote application through the web browser. To meet this challenge, we have devised a web-based interface that allows the system to shift GUI applications from the traditional local environment to the cloud platform, where they are stored on a remote virtual machine. We designed the sketch of the web interface following the cloud virtualization concept, seeking to enable communication and collaboration among users. We describe the design requirements of the remote application technology and present implementation details of the web application and its associated components. We conclude that this effort has the potential to provide an elastic and resilient environment for several application services. Users are no longer burdened with system maintenance, and the overall cost of software licenses and hardware is reduced. Moreover, this remote application service represents the next step towards the mobile workplace, and it lets users run the remote application from virtually anywhere.

Keywords: virtualization technology, virtualized platform, web interface, remote application

Procedia PDF Downloads 288
6283 Decision Analysis Module for Excel

Authors: Radomir Perzina, Jaroslav Ramik

Abstract:

The Analytic Hierarchy Process is a frequently used approach for solving decision-making problems. There exists a wide range of software programs utilizing that approach. Their main disadvantages are that they are relatively expensive and do not show intermediate calculations. This work introduces a Microsoft Excel add-in called DAME – Decision Analysis Module for Excel. Compared to other computer programs, DAME is free, can work with scenarios or multiple decision makers, and displays intermediate calculations. Users can structure their decision models into three levels – scenarios/users, criteria and variants. Items on all levels can be evaluated either by weights or by pair-wise comparisons. Three different methods are provided for the evaluation of the weights of the criteria, the variants and the scenarios – Saaty’s Method, the Geometric Mean Method and Fuller’s Triangle Method. Multiplicative and additive syntheses are supported. The proposed software package is demonstrated on a couple of illustrative examples of real-life decision problems.
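
As an illustration of one of the weighting methods named above, the following sketch applies the Geometric Mean Method to a pair-wise comparison matrix and then performs a simple additive synthesis; the criteria, matrix entries and variant scores are invented for illustration and do not come from the DAME add-in.

```python
# Geometric mean method for deriving priority weights from a pair-wise
# comparison matrix (Saaty 1-9 scale); values are illustrative only.
import numpy as np

# criteria compared pair-wise: price, quality, delivery time (hypothetical)
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

row_gm = A.prod(axis=1) ** (1.0 / A.shape[1])   # geometric mean of each row
weights = row_gm / row_gm.sum()                 # normalised priority vector
print("criteria weights:", np.round(weights, 3))

# additive synthesis across criteria for two variants scored 0..1 (hypothetical)
variant_scores = np.array([[0.7, 0.5, 0.9],
                           [0.4, 0.8, 0.6]])
print("variant totals:", variant_scores @ weights)
```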

Keywords: analytic hierarchy process, multi-criteria decision making, pair-wise comparisons, Microsoft Excel, scenarios

Procedia PDF Downloads 452
6282 An Effective Route to Control of the Safety of Accessing and Storing Data in the Cloud-Based Data Base

Authors: Omid Khodabakhshi, Amir Rozdel

Abstract:

Cloud computing security research faces a number of challenges because data centers hold large amounts of private information and are constantly exposed to the risk of information disclosure through hacker attacks or internal adversaries. Accordingly, the security of virtual machines in the cloud computing infrastructure layer is very important. So far, there are many software solutions for improving security in virtual machines, but using software alone is not enough to solve security problems. The purpose of this article is to examine the challenges and security requirements for accessing and storing data in an insecure cloud environment. In other words, in this article a structure is proposed for the implementation of highly isolated security-sensitive code using secure computing hardware in virtual environments. It also allows remote validation of code together with its inputs and outputs. We provide these security features even in situations where the BIOS, the operating system, and even the hypervisor are infected. To achieve these goals, we use the hardware support provided by the new Intel and AMD processors, as well as the TPM security chip. In conclusion, the use of these technologies ultimately creates a dynamic root of trust and reduces the trusted computing base (TCB) to the security-sensitive code.

Keywords: code, cloud computing, security, virtual machines

Procedia PDF Downloads 191
6281 Postgraduate Supervision Relationship: Practices, Challenges, and Strategies of Stakeholders in the Côte d’Ivoire University System

Authors: Akuélé Radha Kondo, Kathrin Heitz-Tokpa, Bassirou Bonfoh, Francis Akindes

Abstract:

Postgraduate supervision contributes significantly to a student’s academic career, a supervisor’s promotion, and a university’s reputation. Despite this, the time to graduation in the Côte d’Ivoire university system exceeds the normal duration of two years for a master's and three years for a PhD. The paper analyses supervision practices with regard to the challenges faced and the strategies mobilised by students, supervisors, and administration staff to manage the various relationships. Using a qualitative research design, this study was conducted at three public universities in Côte d’Ivoire. Data were generated from thirty-two postgraduate students, seventeen supervisors, and four administration staff through semi-structured interviews. Data were analysed using content analysis and presented thematically. Findings revealed two types of supervision relationship practices: delegated supervision and co-supervision. Students pointed out that in delegated supervision feedback from their supervisors is often delayed. However, they acknowledged receiving input and scientific guidance. All students believed that their role is to be proactive rather than waiting to receive everything from the supervisor, and that they need to be more autonomous and hardworking. They developed strategies related to these qualities. Supervisors were considered to guide, give advice, control, motivate, provide critical feedback, and validate the work. The administration was rather absent in monitoring supervision delays. Major challenges were related to the supervision relationships and access to research funds. The study showed that more engagement of the main supervisor, administrative monitoring, and secured funding would reduce the time to completion and increase the completion rate.

Keywords: Côte d’Ivoire, postgraduate supervision, practices, strategies

Procedia PDF Downloads 97
6280 The Event of Extreme Precipitation Occurred in the Metropolitan Mesoregion of the Capital of Para

Authors: Natasha Correa Vitória Bandeira, Lais Cordeiro Soares, Claudineia Brazil, Luciane Teresa Salvi

Abstract:

The intense rain event that occurred between February 16 and 18, 2018, in the city of Barcarena in Pará, located in the North region of Brazil, demonstrates the importance of analyzing this type of event. The metropolitan mesoregion of Belém was severely affected by rains far above the averages normally expected for that time of year; this phenomenon affected, in addition to the capital, the municipalities of Barcarena, Murucupi and Muruçambá. It resulted in a great flood in the rivers of the region, whose basins received precipitation of great intensity, causing concern for the local population, since companies that accumulate ore tailings are located in this region and, in this specific case, the dam of one of these companies leached ore into the water bodies of the Murucupi River Basin. This article aims to characterize this phenomenon through a spatial analysis of the distribution of rainfall, using data from atmospheric soundings, satellite images, radar images and data from the GPCP (Global Precipitation Climatology Project), in addition to rainfall stations located in the study region. The results of the work demonstrated a dissociation between the data measured at the meteorological stations and the other forms of analysis of this extreme event. Monitoring carried out solely on the basis of data from pluviometric stations is not sufficient for monitoring and/or diagnosing extreme weather events, and investment by the competent bodies is important to install a network of pluviometric stations large enough to meet the demand in a given region.

Keywords: extreme precipitation, great flood, GPCP, ore dam

Procedia PDF Downloads 108
6279 The Application of a Neural Network in the Reworking of Accu-Chek to Wrist Bands to Monitor Blood Glucose in the Human Body

Authors: J. K. Adedeji, O. H. Olowomofe, C. O. Alo, S. T. Ijatuyi

Abstract:

The issue of high blood sugar levels, the effects of which might end up as diabetes mellitus, is now becoming a rampant metabolic disorder in our community. In recent times, a lack of awareness among most people makes this disease a silent killer. The situation calls for urgency, hence the need to design a device that serves as a monitoring tool, such as a wrist watch, to give an alert of the danger ahead of time to those living with high blood glucose, as well as to introduce a mechanism for checks and balances. The neural network architecture assumed an 8-15-10 configuration with eight neurons at the input stage, including a bias, 15 neurons in the hidden layer at the processing stage, and 10 neurons at the output stage indicating likely symptom cases. The inputs are formed using the exclusive OR (XOR), with the expectation of getting an XOR output as the threshold value for diabetic symptom cases. The neural algorithm is coded in the Java language and run for 1000 epochs to bring the errors to the barest minimum. The internal circuitry of the device comprises the compatible hardware requirement that matches the nature of each of the input neurons. Light emitting diodes (LEDs) of red, green, and yellow colors are used as the output of the neural network to show pattern recognition for severe cases, pre-hypertensive cases and normal cases without traces of diabetes mellitus. The research concluded that the neural network is a more efficient Accu-Chek design tool for the proper monitoring of high glucose levels than the conventional methods of carrying out blood tests.
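
A minimal sketch of an 8-15-10 feed-forward network trained with back-propagation for 1000 epochs is given below. The authors' implementation was written in Java; this Python version, with randomly generated binary input patterns and target classes, is only meant to illustrate the architecture and training loop.

```python
# Illustrative 8-15-10 feed-forward network trained with plain back-propagation
# for 1000 epochs; inputs and target classes are hypothetical placeholders.
import numpy as np

rng = np.random.default_rng(0)
n_in, n_hid, n_out = 8, 15, 10          # 8 inputs (incl. bias), 15 hidden, 10 outputs

X = rng.integers(0, 2, size=(40, n_in)).astype(float)   # hypothetical binary input patterns
targets = rng.integers(0, n_out, size=40)                # hypothetical symptom classes
Y = np.eye(n_out)[targets]                               # one-hot target vectors

W1 = rng.normal(0, 0.5, (n_in, n_hid))
W2 = rng.normal(0, 0.5, (n_hid, n_out))
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

lr = 0.1
for epoch in range(1000):                                # 1000 epochs as in the paper
    H = sigmoid(X @ W1)                                  # hidden layer activations
    O = sigmoid(H @ W2)                                  # output layer activations
    err_out = (O - Y) * O * (1 - O)                      # output delta (sigmoid derivative)
    err_hid = (err_out @ W2.T) * H * (1 - H)             # hidden delta
    W2 -= lr * H.T @ err_out
    W1 -= lr * X.T @ err_hid

print("final mean squared error:", np.mean((O - Y) ** 2))
```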

Keywords: Accu-Chek, diabetes, neural network, pattern recognition

Procedia PDF Downloads 147
6278 Design and Implementation of Remote Application Virtualization in Cloud Environments

Authors: Shuen-Tai Wang, Ying-Chuan Chen, Hsi-Ya Chang

Abstract:

Cloud computing is a paradigm of computing that shifts the way computing has been done in the past. Users can use cloud resources such as application software or storage space from the cloud without needing to own them. This paper is focused on solutions that are anticipated to introduce the IaaS idea to build cloud-based services and enable individual remote users' applications in cloud environments, which appear as if they are running on the end user's local computer. The available features of the application delivery solution have been developed based on our previous research on virtualization technology to offer applications independent of location, so that users can work online or offline, anywhere, with an appropriate device and at any time. This proposed effort has the potential to provide an efficient, resilient and elastic environment for cloud services. Users no longer need to burden system managers, and the overall cost of hardware and software licenses is drastically reduced. Moreover, this flexible remote application virtualization service represents the next significant step towards the mobile workplace, and it lets users access their applications remotely through cloud services from anywhere. This is also made possible by the low administrative costs as well as relatively inexpensive end-user terminals and reduced energy expenses.

Keywords: cloud computing, IaaS, virtualization, application delivery

Procedia PDF Downloads 281
6277 Assessing the Effect of Underground Tunnel Diameter on Structure-Foundation-Soil Performance under the Kobe Earthquake

Authors: Masoud Mahdavi

Abstract:

Today, developed and industrial cities have all kinds of sewage and water transfer canals, subway tunnels, infrastructure facilities, etc., which have caused underground cavities to be created under buildings. The presence of these cavities causes changes in structural behavior that must be fully evaluated. In the present study, using the Abaqus finite element software, the effect of cavities of 0.5 and 1.5 meters in diameter (with a circular cross-section) at a depth of 2.5 meters below the ground surface on the performance of the foundation and the soil has been evaluated. For this purpose, the Kobe earthquake record was applied to the models for 10 seconds. Pore water pressure and self-weight were also considered in the models to obtain complete results. The results showed that, by creating and increasing the diameter of circular cavities in the soil, three indicators, namely 1) von Mises stress, 2) displacement and 3) plastic strain, followed oscillating, ascending and ascending trends, respectively, which shows the relationship between the diameter of underground cavities and the structural indicators of the structure-foundation-soil system.

Keywords: underground excavations, foundation, structural substrates, Abaqus software, Kobe earthquake, time history analysis

Procedia PDF Downloads 121
6276 Specification of Requirements to Ensure Proper Implementation of Security Policies in Cloud-Based Multi-Tenant Systems

Authors: Rebecca Zahra, Joseph G. Vella, Ernest Cachia

Abstract:

The notion of cloud computing is rapidly gaining ground in the IT industry and is appealing mostly because it makes computing more adaptable and expedient whilst diminishing the total cost of ownership. This paper focuses on the software as a service (SaaS) architecture of cloud computing, which is used for the outsourcing of databases with their associated business processes. One approach for offering SaaS is basing the system’s architecture on multi-tenancy. Multi-tenancy allows multiple tenants (users) to make use of the same single application instance. Their requests and configurations might then differ according to specific requirements met through tenant customisation of the software. Despite the known advantages, companies still feel uneasy about opting for multi-tenancy, with data security being a principal concern. The fact that multiple tenants, possibly competitors, would have their data located on the same server process and share the same database tables heightens the fear of unauthorised access. Security is a vital aspect which needs to be considered by application developers, database administrators, data owners and end users. This is further complicated in cloud-based multi-tenant systems, where boundaries must be established between tenants and additional access control models must be in place to prevent unauthorised cross-tenant access to data, as illustrated in the sketch below. Moreover, when altering the database state, the transactions need to adhere strictly to the tenant’s known business processes. This paper stresses that security in cloud databases should not be considered as an isolated issue. Rather, it should be included in the initial phases of the database design and monitored continuously throughout the whole development process. This paper aims to identify a number of the most common security risks and threats specifically in the area of multi-tenant cloud systems. Issues and bottlenecks relating to security risks in cloud databases are surveyed. Some techniques which might be utilised to overcome them are then listed and evaluated. After a description and evaluation of the main security threats, this paper produces a list of software requirements to ensure that proper security policies are implemented by a software development team when designing and implementing a multi-tenant SaaS. This would then assist cloud service providers to define, implement, and manage security policies as per tenant customisation requirements whilst assuring the security of the customers’ data.
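
One concrete example of the kind of access-control requirement discussed above is that every query on shared tables must be scoped to the calling tenant inside the data-access layer. The sketch below is a hypothetical illustration (table, column and tenant names are invented), not a requirement taken from the paper.

```python
# Hypothetical tenant-scoped data access on a shared table: the tenant filter is
# applied inside the data-access layer, so cross-tenant reads cannot happen.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE invoices (tenant_id TEXT, invoice_no TEXT, amount REAL)")
conn.executemany("INSERT INTO invoices VALUES (?, ?, ?)",
                 [("tenantA", "A-001", 120.0), ("tenantB", "B-001", 75.0)])

def fetch_invoices(conn, tenant_id):
    # The filter is never left to the caller; a tenant only ever sees its own rows.
    cur = conn.execute(
        "SELECT invoice_no, amount FROM invoices WHERE tenant_id = ?", (tenant_id,))
    return cur.fetchall()

print(fetch_invoices(conn, "tenantA"))   # [('A-001', 120.0)] only
```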

Keywords: cloud computing, data management, multi-tenancy, requirements, security

Procedia PDF Downloads 156
6275 Water Monitoring Sentinel Cloud Platform: Water Monitoring Platform Based on Satellite Imagery and Modeling Data

Authors: Alberto Azevedo, Ricardo Martins, André B. Fortunato, Anabela Oliveira

Abstract:

Water is under severe threat today because of the rising population, increased agricultural and industrial needs, and the intensifying effects of climate change. Due to sea-level rise, erosion, and demographic pressure, the coastal regions are of significant concern to the scientific community. The Water Monitoring Sentinel Cloud platform (WORSICA) service is focused on providing new tools for monitoring water in coastal and inland areas, taking advantage of remote sensing, in situ and tidal modeling data. WORSICA is a service that can be used to determine the coastline, coastal inundation areas, and the limits of inland water bodies using remote sensing (satellite and Unmanned Aerial Vehicles - UAVs) and in situ data (from field surveys). It applies to various purposes, from determining flooded areas (from rainfall, storms, hurricanes, or tsunamis) to detecting large water leaks in major water distribution networks. This service was built on components developed in national and European projects, integrated to provide a one-stop-shop service for remote sensing information, integrating data from Copernicus satellites and drones/unmanned aerial vehicles, validated by existing online in situ data. Since WORSICA is operational on the European Open Science Cloud (EOSC) computational infrastructures, the service can be accessed via a web browser and is freely available to all European public research groups without additional costs. In addition, the private sector will be able to use the service, but some usage costs may apply, depending on the type of computational resources needed by each application/user. Although the service has three main sub-services, i) coastline detection, ii) inland water detection, and iii) water leak detection in irrigation networks, the present study shows an application of the service to the Óbidos lagoon in Portugal, where the user can monitor the evolution of the lagoon inlet and estimate the topography of the intertidal areas without any additional costs. The service has several distinct methodologies implemented, based on the computation of water indexes (e.g., NDWI, MNDWI, AWEI, and AWEIsh) retrieved from satellite image processing. In conjunction with the tidal data obtained from the FES model, the system can estimate a coastline with the corresponding level, or even the topography of the intertidal areas, based on the Flood2Topo methodology. The outcomes of the WORSICA service can be helpful in several intervention areas, such as i) emergency response, by providing fast access to inundated areas to support rescue operations; ii) support of management decisions on the operation of hydraulic infrastructures to minimize damage downstream; iii) climate change mitigation, by minimizing water losses and reducing water mains operation costs; and iv) early detection of water leakages in difficult-to-access irrigation networks, promoting their fast repair.
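
The water-index computation mentioned above can be sketched as follows. The band names assume Sentinel-2 conventions (green = B03, NIR = B08, SWIR = B11), and the reflectance values and threshold are illustrative only; this is not the WORSICA implementation.

```python
# NDWI and MNDWI computed per pixel from green/NIR/SWIR reflectances, then
# thresholded to a simple water mask (threshold tuned per scene in practice).
import numpy as np

green = np.array([[0.08, 0.10], [0.07, 0.30]])   # hypothetical surface reflectances
nir   = np.array([[0.30, 0.25], [0.28, 0.05]])
swir  = np.array([[0.22, 0.20], [0.25, 0.03]])

ndwi  = (green - nir)  / (green + nir)            # McFeeters NDWI
mndwi = (green - swir) / (green + swir)           # modified NDWI (SWIR-based)

water_mask = ndwi > 0.0                           # illustrative threshold
print("NDWI:\n", np.round(ndwi, 2))
print("water pixels:\n", water_mask)
```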

Keywords: remote sensing, coastline detection, water detection, satellite data, sentinel, Copernicus, EOSC

Procedia PDF Downloads 126
6274 Characterization and Monitoring of the Yarn Faults Using Diametric Fault System

Authors: S. M. Ishtiaque, V. K. Yadav, S. D. Joshi, J. K. Chatterjee

Abstract:

The DIAMETRIC FAULTS system has been developed to capture bi-directional images of yarn continuously and sequentially and to provide a detailed classification of faults. A novel mathematical framework developed on the acquired bi-directional images forms the basis of fault classification into four broad categories, namely Thick1, Thick2, Thin and Normal Yarn. A discretised version of the Radon transformation has been used to convert the bi-directional images into one-dimensional signals. The images were divided into training and test sample sets. A Karhunen–Loève Transformation (KLT) basis is computed for the signals from the images in the training set for each fault class, taking the six highest-energy eigenvectors. The fault class of a test image is identified by taking the Euclidean distance of its signal from its projection on the KLT basis of each fault class in the training set. The Euclidean distance, applied using various techniques, is used for classifying an unknown fault class. An accuracy of about 90% is achieved in detecting the correct fault class using these techniques. The four broad fault classes were further sub-classified into four sub-groups based on user-set boundary limits for fault length and fault volume. The fault cross-sectional area and the fault length define the total volume of the fault. A distinct distribution of faults is found in terms of their volume and physical dimensions, which can be used for monitoring the yarn faults. It has been shown from the configuration-based characterization and classification that spun yarn faults arising out of mass variation exhibit distinct characteristics in terms of their contours, sizes and shapes, apart from their frequency of occurrence.
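
The classification idea, projecting a test signal onto each class's KLT basis and choosing the class with the smallest Euclidean distance to the projection, can be sketched as follows; the signal length and training data are random placeholders, not yarn images.

```python
# Per-class KLT (PCA) basis from training signals (six highest-energy eigenvectors);
# a test signal is assigned to the class whose basis reconstructs it best.
import numpy as np

rng = np.random.default_rng(1)
classes = ["Thick1", "Thick2", "Thin", "Normal"]
sig_len, n_train, n_eig = 64, 30, 6

def klt_basis(signals, k):
    centred = signals - signals.mean(axis=0)
    _, _, Vt = np.linalg.svd(centred, full_matrices=False)   # highest-energy directions first
    return signals.mean(axis=0), Vt[:k]

train = {c: rng.normal(loc=i, size=(n_train, sig_len)) for i, c in enumerate(classes)}
bases = {c: klt_basis(sigs, n_eig) for c, sigs in train.items()}

def classify(signal):
    best, best_dist = None, np.inf
    for c, (mean, basis) in bases.items():
        proj = mean + (signal - mean) @ basis.T @ basis      # projection on the class basis
        dist = np.linalg.norm(signal - proj)                 # Euclidean distance to projection
        if dist < best_dist:
            best, best_dist = c, dist
    return best

test_signal = rng.normal(loc=2, size=sig_len)                # generated to resemble "Thin"
print("predicted fault class:", classify(test_signal))
```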

Keywords: Euclidean distance, fault classification, KLT, Radon Transform

Procedia PDF Downloads 265
6273 Assertion-Driven Test Repair Based on Priority Criteria

Authors: Ruilian Zhao, Shukai Zhang, Yan Wang, Weiwei Wang

Abstract:

Repairing broken test cases is an expensive and challenging task in evolving software systems. Although an automated repair technique with intent preservation has been proposed, it does not take into account the association between test repairs and assertions, leading to a large number of irrelevant candidates and decreasing the repair capability. This paper proposes an assertion-driven test repair approach. Furthermore, an intent-oriented priority criterion is introduced to guide the repair candidate generation, making the repairs closer to the intent of the test. In more detail, repair targets are determined through post-dominance relations between assertions and the methods that directly cause compilation errors. Then, test repairs are generated from the target in a bottom-up way, guided by the intent-oriented priority criteria. Finally, the generated repair candidates are prioritized to match the original test intent. The approach is implemented and evaluated on a benchmark of 4 open-source programs and 91 broken test cases. The results show that the approach can fix 89% (81/91) of the broken test cases, which is more effective than the existing intent-preserving test repair approach, and that our intent-oriented priority criteria work well.

Keywords: test repair, test intent, software test, test case evolution

Procedia PDF Downloads 129
6272 Design of an Innovative Geothermal Heat Pump with a PCM Thermal Storage

Authors: Emanuele Bonamente, Andrea Aquino

Abstract:

This study presents an innovative design for geothermal heat pumps with the goal of maximizing the system efficiency (COP - Coefficient of Performance), reducing the soil use (e.g. length/depth of geothermal boreholes) and initial investment costs. Based on experimental data obtained from a two-year monitoring of a working prototype implemented for a commercial building in the city of Perugia, Italy, an upgrade of the system is proposed and the performance is evaluated via CFD simulations. The prototype was designed to include a thermal heat storage (i.e. water), positioned between the boreholes and the heat pump, acting as a flywheel. Results from the monitoring campaign show that the system is still capable of providing the required heating and cooling energy with a reduced geothermal installation (approx. 30% of the standard length). In this paper, an optimization of the system is proposed, re-designing the heat storage to include phase change materials (PCMs). Two stacks of PCMs, characterized by melting temperatures equal to those needed to maximize the system COP for heating and cooling, are disposed within the storage. During the working cycle, the latent heat of the PCMs is used to heat (cool) the water used by the heat pump while the boreholes independently cool (heat) the storage. The new storage is approximately 10 times smaller and can be easily placed close to the heat pump in the technical room. First, a validation of the CFD simulation of the storage is performed against experimental data. The simulation is then used to test possible alternatives of the original design and it is finally exploited to evaluate the PCM-storage performance for two different configurations (i.e. single- and double-loop systems).

Keywords: geothermal heat pump, phase change materials (PCM), energy storage, renewable energies

Procedia PDF Downloads 314
6271 Quantifying Fatigue during Periods of Intensified Competition in Professional Ice Hockey Players: Magnitude of Fatigue in Selected Markers

Authors: Eoin Kirwan, Christopher Nulty, Declan Browne

Abstract:

The professional ice hockey season consists of approximately 60 regular-season games, with periods of fixture congestion occurring several times in the average season. These periods of congestion provide limited time for recovery, exposing the athletes to the risk of competing whilst not fully recovered. Although a body of research is growing with respect to monitoring fatigue, particularly during periods of congested fixtures in team sports such as rugby and soccer, it has received little to no attention thus far in ice hockey athletes. Consequently, there is limited knowledge on monitoring tools that might effectively detect a fatigue response and on the magnitude of fatigue that can accumulate when recovery is limited by competitive fixtures. The benefit of quantifying and establishing fatigue status is the ability to optimise training and provide pertinent information on player health, injury risk, availability and readiness. Some commonly used methods to assess the fatigue and recovery status of athletes include perceived fatigue and wellbeing questionnaires, tests of muscular force and ratings of perceived exertion (RPE). These measures are widely used in popular team sports such as soccer and rugby and show promise as assessments of fatigue and recovery status for ice hockey athletes. As part of a larger study, this study explored the magnitude of changes in adductor muscle strength after game play and throughout a period of fixture congestion and examined the relationship of internal game load and perceived wellbeing with adductor muscle strength. Methods: 8 professional ice hockey players from a British Elite League club volunteered to participate (age = 29.3 ± 2.49 years, height = 186.15 ± 6.75 cm, body mass = 90.85 ± 8.64 kg). Prior to and after competitive games, each player performed trials of the adductor squeeze test at 0˚ hip flexion, with the lead investigator using hand-held dynamometry. Rating of perceived exertion was recorded for each game, and individual session RPE was calculated from total ice-time data. After each game, players completed a 5-point questionnaire to assess perceived wellbeing. Data were collected from six competitive games, one practice session, and 36 hours after the final game, over a 10-day period. Results: Pending final data collection in February. Conclusions: Pending final data collection in February.
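
The internal-load calculation referred to above (session RPE from post-game RPE and individual ice time) can be sketched as follows; player names and values are invented for illustration.

```python
# Hypothetical session RPE load: post-game RPE (CR-10) x individual ice time (min).
ice_time_min = {"Player 1": 22.5, "Player 2": 14.0, "Player 3": 18.3}
rpe          = {"Player 1": 7,    "Player 2": 5,    "Player 3": 6}

for player, minutes in ice_time_min.items():
    session_load = rpe[player] * minutes          # arbitrary units (AU)
    print(f"{player}: session RPE load = {session_load:.0f} AU")
```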

Keywords: congested fixtures, fatigue monitoring, ice hockey, readiness

Procedia PDF Downloads 142
6270 R Statistical Software Applied in Reliability Analysis: Case Study of Diesel Generator Fans

Authors: Jelena Vucicevic

Abstract:

Reliability analysis represents a very important task in different areas of work. In any industry, it is crucial for maintenance, efficiency, safety and monetary costs. There are established ways to calculate reliability, unreliability, failure density and failure rate. This paper introduces another way of calculating reliability by using the R statistical software. R is a free software environment for statistical computing and graphics. It compiles and runs on a wide variety of UNIX platforms, Windows and macOS. The R programming environment is a widely used open-source system for statistical analysis and statistical programming. It includes thousands of functions for the implementation of both standard and new statistical methods, and it does not limit the user to operations related only to these functions. The program has many benefits over similar programs: it is free and, as an open-source system, constantly updated; it has a built-in help system; and the R language is easy to extend with user-written functions. The significance of the work is the calculation of time to failure, or reliability, in a new way, using statistics. Another advantage of this calculation is that there is no need for technical details, and it can be applied to any part for which we need to know the time to failure in order to plan appropriate maintenance, but also to maximize usage and minimize costs. In this case, the calculations have been made for diesel generator fans, but the same principle can be applied to any other part. The data for this paper came from a field engineering study of the time to failure of diesel generator fans. The ultimate goal was to decide whether or not to replace the working fans with higher-quality fans to prevent future failures. Seventy generators were studied. For each one, the number of hours of running time from its first being put into service until fan failure or until the end of the study (whichever came first) was recorded. The dataset consists of two variables: hours and status. Hours shows the running time of each fan, and status shows the event: 1 - failed, 0 - censored. Censored data represent cases that could not be tracked to the end, so the fan may have failed or survived. Obtaining the result using R was easy and quick, and the program takes censored data into consideration and includes them in the results. This is not so easy in a hand calculation. For the purpose of the paper, the results from the R program have been compared to hand calculations in two different cases: censored data taken as failures and censored data taken as successes. The results of all three approaches differ significantly. If the user decides to use R for further calculations, it will give more precise results when working with censored data than the hand calculation.
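
The core idea, estimating time-to-failure behaviour while honouring censored observations, can be sketched as follows. The paper uses R; the sketch below is written in Python for illustration, with invented hours/status values, and contrasts a Kaplan-Meier estimate with a naive mean that ignores censoring.

```python
# Kaplan-Meier survival estimate that honours censored fans vs. a naive mean.
import numpy as np

hours  = np.array([450., 460., 1150., 1200., 1560., 1600., 1660., 1850., 2030., 2070.])
status = np.array([1,    0,    0,     1,     0,     1,     0,     1,     0,     1   ])  # 1=failed, 0=censored

def kaplan_meier(hours, status):
    order = np.argsort(hours)
    t, d = hours[order], status[order]
    at_risk, surv, curve = len(t), 1.0, []
    for time, failed in zip(t, d):
        if failed:
            surv *= (at_risk - 1) / at_risk      # survival steps down only at failures
        curve.append((time, surv))
        at_risk -= 1                             # censored fans leave the risk set silently
    return curve

print("naive mean running time (ignores censoring):", hours.mean())
for time, s in kaplan_meier(hours, status):
    print(f"t = {time:6.0f} h   S(t) = {s:.3f}")
```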

Keywords: censored data, R statistical software, reliability analysis, time to failure

Procedia PDF Downloads 401
6269 Design of a Hand-Held, Clamp-on, Leakage Current Sensor for High Voltage Direct Current Insulators

Authors: Morné Roman, Robert van Zyl, Nishanth Parus, Nishal Mahatho

Abstract:

Leakage current monitoring for high voltage transmission line insulators is of interest as a performance indicator. Presently, to the best of our knowledge, there is no commercially available, clamp-on type, non-intrusive device for measuring leakage current on energised high voltage direct current (HVDC) transmission line insulators. The South African power utility, Eskom, is investigating the development of such a hand-held sensor for two important applications; first, for continuous real-time condition monitoring of HVDC line insulators and, second, for use by live line workers to determine if it is safe to work on energised insulators. In this paper, a DC leakage current sensor based on magnetic field sensing techniques is developed. The magnetic field sensor used in the prototype can also detect alternating current up to 5 MHz. The DC leakage current prototype detects the magnetic field associated with the current flowing on the surface of the insulator. Preliminary HVDC leakage current measurements are performed on glass insulators. The results show that the prototype can accurately measure leakage current in the specified current range of 1-200 mA. The influence of external fields from the HVDC line itself on the leakage current measurements is mitigated through a differential magnetometer sensing technique. Thus, the developed sensor can perform measurements on in-service HVDC insulators. The research contributes to the body of knowledge by providing a sensor to measure leakage current on energised HVDC insulators non-intrusively. This sensor can also be used by live line workers to inform them whether or not it is safe to perform maintenance on energized insulators.

Keywords: direct current, insulator, leakage current, live line, magnetic field, sensor, transmission lines

Procedia PDF Downloads 173
6268 Dimensionality Reduction in Modal Analysis for Structural Health Monitoring

Authors: Elia Favarelli, Enrico Testi, Andrea Giorgetti

Abstract:

Autonomous structural health monitoring (SHM) of many structures and bridges has become a topic of paramount importance for maintenance purposes and safety reasons. This paper proposes a set of machine learning (ML) tools to perform automatic feature selection and detection of anomalies in a bridge from vibrational data, and compares different feature extraction schemes to increase the accuracy and reduce the amount of data collected. As a case study, the Z-24 bridge is considered because of its extensive database of accelerometric data in both standard and damaged conditions. The proposed framework starts from the first four fundamental frequencies extracted through operational modal analysis (OMA) and clustering, followed by density-based time-domain filtering (tracking). The extracted fundamental frequencies are then fed to a dimensionality reduction block implemented through two different approaches: feature selection (an intelligent multiplexer) that tries to estimate the most reliable frequencies based on the evaluation of some statistical features (i.e., mean value, variance, kurtosis), and feature extraction (an auto-associative neural network, ANN) that combines the fundamental frequencies to extract new damage-sensitive features in a low-dimensional feature space. Finally, one-class classifier (OCC) algorithms perform anomaly detection, trained with standard-condition points and tested with both normal and anomalous ones. In particular, a new anomaly detection strategy is proposed, namely one-class classifier neural network two (OCCNN2), which exploits the classification capability of standard classifiers in an anomaly detection problem, finding the standard class (the boundary of the feature space in normal operating conditions) through a two-step approach: coarse and fine boundary estimation. The coarse estimation uses classic OCC techniques, while the fine estimation is performed through a feedforward neural network (NN) that exploits the boundaries estimated in the coarse step. The detection algorithms are then compared with known methods based on principal component analysis (PCA), kernel principal component analysis (KPCA), and the auto-associative neural network (ANN). In many cases, the proposed solution increases the performance with respect to the standard OCC algorithms in terms of F1 score and accuracy. In particular, by evaluating the correct features, the anomaly can be detected with an accuracy and an F1 score greater than 96% with the proposed method.
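
The anomaly-detection stage can be illustrated with a classic OCC baseline (not the proposed OCCNN2 two-step method): a one-class classifier trained only on normal-condition fundamental frequencies and tested on shifted ones. The frequency values below are synthetic placeholders, not Z-24 data.

```python
# One-class SVM trained on four fundamental frequencies from normal conditions;
# frequency drops (simulated damage) are flagged as anomalies (-1).
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(42)
f_normal = np.array([3.9, 5.0, 9.8, 10.3])                      # nominal modes (Hz), hypothetical
X_train = f_normal + rng.normal(0, 0.03, size=(200, 4))         # normal-condition points
X_test  = np.vstack([f_normal + rng.normal(0, 0.03, size=(5, 4)),
                     f_normal * 0.95 + rng.normal(0, 0.03, size=(5, 4))])  # 5 normal + 5 "damaged"

occ = OneClassSVM(kernel="rbf", gamma="scale", nu=0.05).fit(X_train)
pred = occ.predict(X_test)                                      # +1 = normal, -1 = anomaly
print("predictions:", pred)
```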

Keywords: anomaly detection, frequencies selection, modal analysis, neural network, sensor network, structural health monitoring, vibration measurement

Procedia PDF Downloads 123
6267 Optimization of Fin Type and Fin per Inch on Heat Transfer and Pressure Drop of an Air Cooler

Authors: A. Falavand Jozaei, A. Ghafouri

Abstract:

Operation enhancement in an air cooler (heat exchanger) depends on the rate of heat transfer and the pressure drop. In this paper, for a given heat duty, the effects of FPI (fins per inch) and fin type (circular and hexagonal fins) on these two parameters are studied for an air cooler at the Arvand petrochemical plant in Iran. A program written in EES (Engineering Equation Solver), together with the Aspen B-JAC and HTFS+ software packages, is used to solve the governing equations. First, the simulated results obtained from this program are compared to the experimental data for two cases of FPI. The effects of FPI from 3 to 15 on the heat transfer (Q) to pressure drop ratio (Q/Δp ratio) are then examined. This ratio is one of the main parameters in the design, rating and simulation of heat exchangers. The results show that the heat transfer (Q) and the pressure drop increase steadily with increasing FPI, while the Q/Δp ratio increases up to FPI = 12 (by about 47% for circular fins and about 69% for hexagonal fins) and then decreases gradually towards FPI = 15 (by about 5% for circular fins and about 8% for hexagonal fins), so the Q/Δp ratio is maximum at FPI = 12. An FPI value between 8 and 12 is therefore obtained as the optimum for the heat transfer to pressure drop ratio. In addition, comparing circular and hexagonal fins, the Q/Δp ratio of hexagonal fins is higher than that of circular fins for FPI between 8 and 12 (the optimum range).

Keywords: air cooler, circular and hexagonal fins, fin per inch, heat transfer and pressure drop

Procedia PDF Downloads 454
6266 Risk Based Building Information Modeling (BIM) for Urban Infrastructure Transportation Project

Authors: Debasis Sarkar

Abstract:

Building Information Modeling (BIM) is a holistic documentation process for operational visualization, design coordination, estimation and project scheduling. BIM software defines objects parametrically, and it is a tool for virtual reality. The primary advantage of implementing BIM is the visual coordination of the building structure and systems such as Mechanical, Electrical and Plumbing (MEP), and it also identifies possible conflicts between the building systems. This paper is an attempt to develop a risk-based BIM model which would highlight the primary advantages of the application of BIM to urban infrastructure transportation projects. It has been observed that about 40% of Architecture, Engineering and Construction (AEC) companies use BIM, but primarily for their outsourced projects. Also, 65% of the respondents agree that BIM would be used quite strongly for future construction projects in India. The 3D models developed with the Revit 2015 software would reduce coordination problems amongst the architects, structural engineers, contractors and building service (MEP) providers. Integration of risk management with BIM would provide enhanced coordination, collaboration and a high probability of successful completion of a complex infrastructure transportation project within the stipulated time and cost frame.

Keywords: building information modeling (BIM), infrastructure transportation, project risk management, underground metro rail

Procedia PDF Downloads 310
6265 Evaluation of the Ability of COVID-19 Infected Sera to Induce Netosis Using an Ex-Vivo NETosis Monitoring Tool

Authors: Constant Gillot, Pauline Michaux, Julien Favresse, Jean-Michel Dogné, Jonathan Douxfils

Abstract:

Introduction: NETosis has emerged as a crucial yet paradoxical factor in severe COVID-19 cases. While neutrophil extracellular traps (NETs) help contain and eliminate viral particles, excessive NET formation can lead to hyperinflammation, exacerbating tissue damage and acute respiratory distress syndrome (ARDS). Aims: This study evaluates the relationship between COVID-19-infected sera and NETosis using an ex-vivo model. Methods: Sera from 8 post-admission COVID-19 patients, collected after corticoid therapy, were used to induce NETosis in neutrophils from a healthy donor. NET formation was tracked using fluorescent markers for DNA and neutrophil elastase (NE) every 2 minutes for 8 hours. The results were expressed as a percentage of DNA/NE released over time. Key metrics, including T50 (time to 50% release) and AUC (area under the curve, representing total NETosis potential), were calculated. A 27-cytokine screening kit was used to assess the cytokine composition of the sera. Results: COVID-19 sera induced NETosis in a manner dependent on their cytokine profile. The AUC of NE and DNA release decreased with time following corticoid therapy, showing a significant reduction in 6 of the 8 patients (p<0.05). T50 also decreased in parallel with the AUC for both markers. Cytokine concentrations decreased with time after therapy administration. There is a correlation between the concentrations of 14 cytokines and NE release. Conclusion: This ex-vivo model successfully demonstrated the induction of NETosis by COVID-19 sera using two markers. A clear decrease in NETosis potential was observed over time with glucocorticoid therapy. This model can be a valuable tool for monitoring NETosis and investigating potential NETosis inducers and inhibitors.
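
The two metrics described above can be derived from a percent-release time course sampled every 2 minutes over 8 hours, as sketched below; the sigmoid curve stands in for real DNA/NE release data.

```python
# AUC via the trapezoid rule and T50 as the first time at which 50% of the
# maximal release is reached, from a synthetic %-release time course.
import numpy as np

t = np.arange(0, 8 * 60 + 1, 2)                       # minutes, one reading every 2 min
release = 100 / (1 + np.exp(-(t - 180) / 40))         # hypothetical % release curve

auc = np.sum((release[1:] + release[:-1]) / 2 * np.diff(t))   # total NETosis potential (%·min)
t50 = t[np.argmax(release >= 0.5 * release.max())]            # first crossing of 50% of max

print(f"AUC = {auc:.0f} %·min,  T50 = {t50} min")
```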

Keywords: NETosis, COVID-19, cytokine storm, biomarkers

Procedia PDF Downloads 19
6264 Online Monitoring of Airborne Bioaerosols Released from a Composting, Green Waste Site

Authors: John Sodeau, David O'Connor, Shane Daly, Stig Hellebust

Abstract:

This study is the first to employ the online WIBS (Wideband Integrated Bioaerosol Sensor) technique for the monitoring of bioaerosol emissions and non-fluorescing “dust” released from a composting/green waste site. The purpose of the research was to provide a “proof of principle” for using WIBS to monitor such a location continually over days and nights in order to construct comparative “bioaerosol site profiles”. Current impaction/culturing methods take many days to achieve results that the WIBS technique makes available in seconds. The real-time data obtained were then used to assess variations in the bioaerosol counts as a function of size, “shape”, site location, working activity levels, time of day, relative humidity, wind speeds and wind directions. Three short campaigns were undertaken: one classified as a “light” workload period, another as a “heavy” workload period and finally a weekend when the site was closed. One main bioaerosol size regime was found to predominate: 0.5 micron to 3 micron, with morphologies ranging from elongated to ellipsoidal/spherical. The real-time number-concentration data were consistent with an Andersen sampling protocol that was employed at the site. The number concentrations of fluorescent particles as a proportion of total particles counted amounted, on average, to ~1% for the “light” workday period, ~7% for the “heavy” workday period and ~18% for the weekend. The bioaerosol release profiles at the weekend were considerably different from those monitored during the working weekdays.

Keywords: bioaerosols, composting, fluorescence, particle counting in real-time

Procedia PDF Downloads 355
6263 TeleEmergency Medicine: Transforming Acute Care through Virtual Technology

Authors: Ashley L. Freeman, Jessica D. Watkins

Abstract:

TeleEmergency Medicine (TeleEM) is an innovative approach leveraging virtual technology to deliver specialized emergency medical care across diverse healthcare settings, including internal acute care and critical access hospitals, remote patient monitoring, and nurse triage escalation, in addition to external emergency departments, skilled nursing facilities, and community health centers. TeleEM represents a significant advancement in the delivery of emergency medical care, giving healthcare professionals the capability to deliver expertise that closely mirrors in-person emergency medicine while transcending geographical boundaries. Through qualitative research, the extension of timely, high-quality care has proven to address the critical needs of patients in remote and underserved areas. TeleEM’s service design allows for the expansion of existing services and the establishment of new ones in diverse geographic locations. This ensures that healthcare institutions can readily scale and adapt services to evolving community requirements by leveraging on-demand (non-scheduled) telemedicine visits through the deployment of multiple video solutions. In terms of financial management, TeleEM currently employs billing suppression and subscription models to enhance accessibility for a wide range of healthcare facilities. Plans are in motion to transition to a billing system routing charges through a third-party vendor, further enhancing financial management flexibility. To address state licensure concerns, a patient location verification process has been integrated under the guidance of legal counsel and compliance authorities. The TeleEM workflow is designed to terminate if the patient is not physically located within licensed regions at the time of the virtual connection, alleviating legal uncertainties. A distinctive and pivotal feature of TeleEM is the introduction of the TeleEmergency Medicine Care Team Assistant (TeleCTA) role. TeleCTAs collaborate closely with TeleEM physicians, leading to enhanced service activation, streamlined coordination, and workflow and data efficiencies. In the last year, more than 800 TeleEM sessions have been conducted, of which 680 were initiated by internal acute care and critical access hospitals, as evidenced by quantitative research. Without this service, many of these cases would have necessitated patient transfers. Barriers to success were examined through thorough medical record review and data analysis, which identified inaccuracies in documentation leading to activation delays, limitations in billing capabilities, and data distortion, as well as the intricacies of managing varying workflows and device setups. TeleEM represents a transformative advancement in emergency medical care that nurtures collaboration and innovation. Not only has it advanced the delivery of emergency medicine through virtual technology, focus group participation with key stakeholders, rigorous attention to legal and financial considerations, and the implementation of robust documentation tools and the TeleCTA role, but it has also set the stage for overcoming geographic limitations. TeleEM assumes a notable position in the field of telemedicine by enhancing patient outcomes and expanding access to emergency medical care while mitigating licensure risks and ensuring compliant billing.

Keywords: emergency medicine, TeleEM, rural healthcare, telemedicine

Procedia PDF Downloads 82
6262 Recession Rate of Gangotri and Its Tributary Glacier, Garhwal Himalaya, India through Kinematic GPS Survey and Satellite Data

Authors: Harish Bisht, Bahadur Singh Kotlia, Kireet Kumar

Abstract:

In order to reconstruct past retreat rates, the total area loss, volume change and shift in snout position were measured through multi-temporal satellite data from 1989 to 2016 and a kinematic GPS survey from 2015 to 2016. The results obtained from the satellite data indicate that in the last 27 years, the Chaturangi glacier snout has retreated 1172.57 ± 38.3 m (an average of 45.07 ± 4.31 m/year), with a total area and volume loss of 0.626 ± 0.001 sq. km and 0.139 km³, respectively. Field measurements through a differential global positioning system survey revealed that the annual retreat rate was 22.84 ± 0.05 m/year. The large variation between the results derived from the two methods is probably due to the large difference in their accuracy. Snout monitoring of the Gangotri glacier during the ablation season (May to September) in the years 2005 and 2015 reveals that the retreat rate has declined in comparison with that reported in earlier studies. The GPS dataset shows that the average recession rate is 10.26 ± 0.05 m/year. In order to determine the possible causes of the decreased retreat rate, a relationship between debris thickness and melt rate was also established by using ablation stakes. The present study concludes that the remote sensing method is suitable for large-area and long-term studies, while kinematic GPS is more appropriate for annual monitoring of the retreat rate of a glacier snout. The present study also emphasizes the mapping of all the tributary glaciers in order to assess the overall changes in the main glacier system and its health.

Keywords: Chaturangi glacier, Gangotri glacier, glacier snout, kinematic global positioning system, retreat rate

Procedia PDF Downloads 145
6261 Using Electrical Impedance Tomography to Control a Robot

Authors: Shayan Rezvanigilkolaei, Shayesteh Vefaghnematollahi

Abstract:

Electrical impedance tomography (EIT) is a non-invasive imaging technique suitable for medical applications. This paper describes an electrical impedance tomography device with the ability to navigate a robotic arm to manipulate a target object. The design of the device includes various hardware and software sections to perform medical imaging and control the robotic arm. In its hardware section, an image is formed by 16 electrodes located around a container. This image is used to navigate a 3-DOF robotic arm to reach the exact location of the target object. The data set used to form the impedance image is obtained by repeated current injections and voltage measurements between all electrode pairs. After the necessary calculations to obtain the impedance are performed, the information is transmitted to the computer. These data are fed into MATLAB, which is interfaced with EIDORS (Electrical Impedance Tomography and Diffuse Optical Tomography Reconstruction Software) to reconstruct the image based on the acquired data. In the next step, the coordinates of the center of the target object are calculated using the Image Processing Toolbox (IPT) of MATLAB. Finally, these coordinates are used to calculate the angles of each joint of the robotic arm. The robotic arm then moves to the desired tissue on the user's command.
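
The last two steps, locating the target's centre in the reconstructed image and converting it into joint angles, can be sketched as follows. The image, pixel-to-cm scaling and link lengths are invented, and the arm is reduced to a planar two-link problem with the third joint left free; this is not the authors' MATLAB/EIDORS code.

```python
# Centroid of the reconstructed target blob, then a simple two-link planar
# inverse kinematics for the first two joints of a 3-DOF arm.
import numpy as np

image = np.zeros((32, 32))
image[18:22, 10:14] = 1.0                      # stand-in for the reconstructed target blob

rows, cols = np.nonzero(image > 0.5)
cy, cx = rows.mean(), cols.mean()              # centroid in pixel coordinates
x, y = cx * 0.5, cy * 0.5                      # pixel -> cm scaling (hypothetical)

L1, L2 = 10.0, 8.0                             # link lengths in cm (hypothetical)
cos_q2 = np.clip((x**2 + y**2 - L1**2 - L2**2) / (2 * L1 * L2), -1.0, 1.0)
q2 = np.arccos(cos_q2)                                                     # elbow
q1 = np.arctan2(y, x) - np.arctan2(L2 * np.sin(q2), L1 + L2 * np.cos(q2))  # shoulder
q3 = 0.0                                       # wrist orientation left free in this sketch

print(f"centroid (px): ({cx:.1f}, {cy:.1f})  ->  target (cm): ({x:.2f}, {y:.2f})")
print(f"joint angles [rad]: q1={q1:.2f}, q2={q2:.2f}, q3={q3:.2f}")
```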

Keywords: electrical impedance tomography, EIT, surgeon robot, image processing of electrical impedance tomography

Procedia PDF Downloads 272
6260 Phylogenetic Studies of Six Egyptian Sheep Breeds Using Cytochrome B

Authors: Othman Elmahdy Othman, Agnés Germot, Daniel Petit, Muhammad Khodary, Abderrahman Maftah

Abstract:

Recently, the control (D-loop) and cytochrome b (Cyt b) regions of mtDNA have received more attention due to their role in genetic diversity and phylogenetic studies of different livestock, which provide important knowledge for genetic resource conservation. Studies based on sequencing of sheep mitochondrial DNA have shown that there are five maternal lineages in the world for domestic sheep breeds: A, B, C, D and E. Using cytochrome b sequencing, we aimed to clarify the genetic affinities and phylogeny of six Egyptian sheep breeds. Blood samples were collected from 111 animals belonging to six Egyptian sheep breeds: Barki, Rahmani, Ossimi, Saidi, Sohagi and Fallahi. The total DNA was extracted, and specific primers were used for conventional PCR amplification of the cytochrome b region of mtDNA. The PCR-amplified products were purified and sequenced. The alignment of sequences was done using the BioEdit software, and DnaSP 5.00 software was used to identify the sequence variation and polymorphic sites in the aligned sequences. The results showed the presence of 39 polymorphic sites, leading to the formation of 29 haplotypes. The haplotype diversity in the six tested breeds ranged from 0.643 in the Rahmani breed to 0.871 in the Barki breed. The lowest genetic distance was observed between Rahmani and Saidi (D: 1.436 and Dxy: 0.00127), while the highest distance was observed between Ossimi and Sohagi (D: 6.050 and Dxy: 0.00534). A neighbour-joining phylogenetic tree was constructed using MEGA 5.0 software. The sequences of the 111 analyzed samples were aligned with reference sequences of the different haplogroups A, B, C, D and E. The phylogeny result showed the presence of four haplogroups, HapA, HapB, HapC and HapE, in the examined samples, whereas haplogroup D was not found. The results showed that 88 out of the 111 tested animals cluster with haplogroup B (79.28%), whereas 12 animals cluster with haplogroup A (10.81%), 10 animals cluster with haplogroup C (9.01%) and one animal belongs to haplogroup E (0.90%).

Keywords: phylogeny, genetic biodiversity, MtDNA, cytochrome B, Egyptian sheep

Procedia PDF Downloads 347
6259 The Design of Intelligent Passenger Organization System for Metro Stations Based on Anylogic

Authors: Cheng Zeng, Xia Luo

Abstract:

Passenger organization has always been an essential part of China's metro operation and management. Facing massive passenger flows, stations need to improve their degree of intelligence and automation through an appropriate integrated system. Based on the existing integrated supervisory control system (ISCS) and simulation software (Anylogic), this paper designs an intelligent passenger organization system (IPOS) for metro stations. Its primary functions include passenger information acquisition, data processing and computing, visualization management, decision recommendations, and decision response based on interlocking equipment. For this purpose, the logical structure and the intelligent algorithms employed are specifically devised. In addition, the structure diagram of the information acquisition and application module, the application of Anylogic, and the functional process of the case library are all given in this research. Based on the secondary development of Anylogic and existing technologies such as video recognition, the IPOS is expected to improve the response speed and handling capacity of metro stations in the face of sudden surges in passenger flow.

Keywords: anylogic software, decision-making support system, intellectualization, ISCS, passenger organization

Procedia PDF Downloads 176