Search results for: time delay neural network
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 22273


18553 Effect of T6 and Re-Aging Heat Treatment on Mechanical Properties of 7055 Aluminum Alloy

Authors: M. Esmailian, M. Shakouri, A. Mottahedi, S. G. Shabestari

Abstract:

Heat-treatable aluminium alloys such as 7075 and 7055 are widely used in the aircraft industry because of their high strength and low density. The T6 heat treatment is recommended for obtaining the best mechanical properties, but this temper leaves the alloy susceptible to corrosion-induced damage and Stress Corrosion Cracking (SCC). To improve this property, the over-aging treatment (T7) can be applied to the alloy, but it decreases the mechanical properties by up to 30 percent. Hence, to increase the mechanical properties without any remarkable decrease in SCC resistance, the Retrogression and Re-Aging (RRA) heat treatment is used; this treatment is performed in a relatively short time. In this paper, the RRA heat treatment was applied to 7055 aluminum alloy, and the effect of retrogression time on the mechanical properties of 7055 was investigated. The results show that 40 minutes is a suitable retrogression time for 7055 aluminum alloy, with the ultimate strength increasing up to 625 MPa.

Keywords: 7055 aluminum alloy, mechanical properties, SCC resistance, heat treatment

Procedia PDF Downloads 432
18552 Effects of Preconditioning Caused by Ischemic-Reperfusion along with Sodium Bicarbonate Supplementation on Submaximal Dynamic Force Production

Authors: Sara Nasiri Semnani, Alireza Ramzani

Abstract:

Background and Aims: Sodium bicarbonate is a supplement used to reduce fatigue and increase power output in short-term training. On the other hand, Ischemic Reperfusion Preconditioning (IRPC) is an appropriate stimulus to increase the submaximal contractile response. Materials and Methods: Nine female student-athletes completed three conditions in a double-blind randomized crossover design: sodium bicarbonate + IRPC, sodium bicarbonate, and placebo + IRPC. Participants performed single-arm dumbbell forward raises with a 2 kg weight for as many repetitions as possible. Results: Plasma lactate concentration and performance records in the sodium bicarbonate + IRPC and sodium bicarbonate conditions were significantly different from the placebo + IRPC condition (p = 0.001 and p = 0.02, respectively). Conclusion: According to the research findings, bicarbonate supplementation under the IRPC training condition increased force and delayed fatigue in submaximal dynamic contraction.

Keywords: ischemic reperfusion, preconditioning, sodium bicarbonate, submaximal dynamic force

Procedia PDF Downloads 303
18551 Pump-as-Turbine: Testing and Characterization as an Energy Recovery Device, for Use within the Water Distribution Network

Authors: T. Lydon, A. McNabola, P. Coughlan

Abstract:

Energy consumption in the water distribution network (WDN) is a well-established problem, with the industry contributing heavily to carbon emissions at 0.9 kg of CO2 emitted per m3 of water supplied. It is estimated that 85% of the energy wasted in the WDN can be recovered by installing turbines. The existing potential in networks lies at small-capacity sites (5-10 kW) that are numerous and dispersed across networks. However, traditional turbine technology cannot be scaled down to this size in an economically viable fashion, so alternative approaches are needed. This research aims to enable energy recovery within the WDN by exploring the potential of pumps-as-turbines (PATs). PATs are estimated to be ten times cheaper than traditional micro-hydro turbines, presenting the potential to contribute to an economically viable solution. However, a number of technical constraints currently prohibit their widespread use, including the inability of a PAT to control pressure, difficulty in selecting PATs due to a lack of performance data, and a lack of understanding of how PATs can cater for fluctuations as extreme as +/- 50% of the average daily flow, which are characteristic of the WDN. A PAT prototype is undergoing testing in order to identify the capabilities of the technology. Preliminary testing measured the efficiency and power potential of the PAT for varying flow and pressure conditions, in order to develop characteristic and efficiency curves for the PAT and a baseline understanding of the technology's capabilities; the results are presented here: • The limitations of existing selection methods, which convert the best efficiency point (BEP) from pump operation to BEP in turbine operation, were highlighted by the failure of such methods to reflect the conditions of maximum efficiency of the PAT. A generalised selection method for the WDN may need to be informed by an understanding of the impact of flow variations and pressure control on system power potential, capital cost, maintenance costs, and payback period. • A clear relationship between flow and the efficiency of the PAT has been established. The rate of efficiency reduction for flows +/- 50% of BEP is significant and more extreme for deviations in flow above the BEP than below, but not dissimilar to the efficiency response of other turbines. • A PAT alone is not sufficient to regulate pressure, yet the relationship of pressure across the PAT is foundational in exploring ways in which PAT energy recovery systems can maintain the required pressure level within the WDN. The efficiencies of PAT energy recovery systems operating under pressure regulation, which have been conceptualised in the current literature, still need to be established. These initial results guide the focus of forthcoming testing and exploration of PAT technology towards how PATs can form part of an efficient energy recovery system.
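
To put the small-site figures above in perspective, a back-of-the-envelope hydraulic power calculation is sketched below in Python. The flow, head, and efficiency values are illustrative assumptions, not measurements from the prototype described in this abstract.

```python
# Rough hydraulic power recoverable by a pump-as-turbine (PAT) at a small WDN site.
# Illustrative values only: flow, head and efficiency are assumptions, not data
# from the prototype described in the abstract.

RHO = 1000.0   # water density, kg/m^3
G = 9.81       # gravitational acceleration, m/s^2

def pat_power_kw(flow_m3_s: float, head_m: float, efficiency: float) -> float:
    """Electrical power recovered: P = rho * g * Q * H * eta, returned in kW."""
    return RHO * G * flow_m3_s * head_m * efficiency / 1000.0

if __name__ == "__main__":
    # Example: 25 L/s of excess flow across 30 m of excess head at 65% overall efficiency
    q, h, eta = 0.025, 30.0, 0.65
    print(f"Recoverable power: {pat_power_kw(q, h, eta):.1f} kW")  # ~4.8 kW, within the 5-10 kW range
```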

Keywords: energy recovery, pump-as-turbine, water distribution network

Procedia PDF Downloads 260
18550 An Ethnographic Study on Peer Support Workers in a Peer-Driven Non-Governmental Organization: The Colorado Mental Wellness Network

Authors: Shawna M. Margesson

Abstract:

This research study seeks to explore the lived experience of peer support workers (PSWs) in a peer-led non-governmental organization in Denver, Colorado, USA. The Colorado Mental Wellness Network offers supportive wellness recovery services such as wellness recovery action plans (WRAP), advocacy trainings for anti-stigma campaigns, and PSWs who work with and for consumers in the community. This study suggests that a peer-run environment is a unique community setting for PSWs to work in, given that all employees are living in mental wellness recovery. Little has been documented about PSWs' personal accounts of working within a recovery-oriented organization or their first-person accounts of working with consumers. The importance of this study is to provide an ethnographic account of both subjects: the lived experiences of PSWs in both organizational and consumer-driven recovery. This study seeks to add to the literature and the social work profession the personal accounts of PSWs as they provide services to others like themselves. It also provides an additional lens through which to view the peer-driven movement in mental health and wellness recovery.

Keywords: peer to peer movement, mental health, ethnography, peer support workers

Procedia PDF Downloads 164
18549 Algorithm for Automatic Real-Time Electrooculographic Artifact Correction

Authors: Norman Sinnigen, Igor Izyurov, Marina Krylova, Hamidreza Jamalabadi, Sarah Alizadeh, Martin Walter

Abstract:

Background: EEG is a non-invasive brain activity recording technique with a high temporal resolution that allows the use of real-time applications, such as neurofeedback. However, EEG data are susceptible to electrooculographic (EOG) and electromyography (EMG) artifacts (i.e., jaw clenching, teeth squeezing and forehead movements). Due to their non-stationary nature, these artifacts greatly obscure the information and power spectrum of EEG signals. Many EEG artifact correction methods are too time-consuming when applied to low-density EEG and have been focusing on offline processing or handling one single type of EEG artifact. A software-only real-time method for correcting multiple types of EEG artifacts of high-density EEG remains a significant challenge. Methods: We demonstrate an improved approach for automatic real-time EEG artifact correction of EOG and EMG artifacts. The method was tested on three healthy subjects using 64 EEG channels (Brain Products GmbH) and a sampling rate of 1,000 Hz. Captured EEG signals were imported in MATLAB with the lab streaming layer interface allowing buffering of EEG data. EMG artifacts were detected by channel variance and adaptive thresholding and corrected by using channel interpolation. Real-time independent component analysis (ICA) was applied for correcting EOG artifacts. Results: Our results demonstrate that the algorithm effectively reduces EMG artifacts, such as jaw clenching, teeth squeezing and forehead movements, and EOG artifacts (horizontal and vertical eye movements) of high-density EEG while preserving brain neuronal activity information. The average computation time of EOG and EMG artifact correction for 80 s (80,000 data points) 64-channel data is 300 – 700 ms depending on the convergence of ICA and the type and intensity of the artifact. Conclusion: An automatic EEG artifact correction algorithm based on channel variance, adaptive thresholding, and ICA improves high-density EEG recordings contaminated with EOG and EMG artifacts in real-time.
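
A minimal sketch of the two correction stages described above (variance-based EMG detection repaired by neighbour interpolation, followed by ICA-based removal of ocular components) is given below. The thresholds, neighbour map, kurtosis criterion, and buffer size are illustrative assumptions rather than the authors' implementation, and scikit-learn's FastICA stands in for whatever ICA variant was used.

```python
# Sketch of the two-stage correction: (1) EMG detection via channel variance and an
# adaptive (median + k*MAD) threshold, repaired by neighbour interpolation,
# (2) EOG removal via ICA. Thresholds, the neighbour map and the buffer size are
# illustrative assumptions, not the authors' parameters.
import numpy as np
from sklearn.decomposition import FastICA

def correct_emg(buffer: np.ndarray, neighbours: dict, k: float = 3.0) -> np.ndarray:
    """buffer: (n_channels, n_samples). Channels whose variance exceeds
    median + k*MAD are replaced by the mean of their clean neighbours."""
    var = buffer.var(axis=1)
    thresh = np.median(var) + k * np.median(np.abs(var - np.median(var)))
    cleaned = buffer.copy()
    for ch in np.where(var > thresh)[0]:
        good = [n for n in neighbours.get(ch, []) if var[n] <= thresh]
        if good:
            cleaned[ch] = buffer[good].mean(axis=0)
    return cleaned

def correct_eog(buffer: np.ndarray, n_components: int = 20, z_kurtosis: float = 5.0) -> np.ndarray:
    """Zero out independent components with outlying kurtosis (typical of eye movements)."""
    ica = FastICA(n_components=n_components, random_state=0, max_iter=500)
    sources = ica.fit_transform(buffer.T)            # (n_samples, n_components)
    z = (sources - sources.mean(0)) / sources.std(0)
    kurt = (z ** 4).mean(axis=0) - 3.0
    bad = np.abs(kurt - kurt.mean()) > z_kurtosis * kurt.std()
    sources[:, bad] = 0.0                            # remove ocular components
    return ica.inverse_transform(sources).T

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    eeg = rng.standard_normal((64, 80_000))          # 80 s at 1 kHz, 64 channels (synthetic)
    neighbours = {ch: [max(ch - 1, 0), min(ch + 1, 63)] for ch in range(64)}
    cleaned = correct_eog(correct_emg(eeg, neighbours))
    print(cleaned.shape)
```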

Keywords: EEG, muscle artifacts, ocular artifacts, real-time artifact correction, real-time ICA

Procedia PDF Downloads 180
18548 Batch-Oriented Setting Time Optimisation in an Aerodynamic Feeding System

Authors: Jan Busch, Maurice Schmidt, Peter Nyhuis

Abstract:

The change of conditions for production companies in high-wage countries is characterized by the globalization of competition and the transition from a supplier's to a buyer's market. Companies need to face the challenge of reacting flexibly to these changes. Due to the significant and increasing degree of automation, assembly has become the most expensive production process. Regarding the reduction of production cost, assembly consequently offers considerable rationalization potential. Therefore, an aerodynamic feeding system has been developed at the Institute of Production Systems and Logistics (IFA), Leibniz Universitaet Hannover. In former research activities, this system was enabled to adjust itself using a genetic algorithm. The longer the genetic algorithm is executed, the better the feeding quality. In this paper, the relation between the system's setting time and the feeding quality is observed, and a function which enables the user to achieve the minimum total feeding time is presented.
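
The trade-off described above can be illustrated with a toy model: if a longer setting (self-adjustment) time yields a higher feeding quality, the total time for a batch is the setting time plus the rework-inflated feeding time, and its minimum shifts with batch size. The quality curve and rates below are assumptions for illustration, not the measured behaviour of the IFA system.

```python
# Toy model of the setting-time / feeding-quality trade-off: total time for a batch
# = setting time + time to feed the batch, where poorly oriented parts must be
# re-fed. The quality curve and feed rate are assumptions for illustration only.
import numpy as np

FEED_RATE = 2.0            # parts per second delivered by the feeder (assumed)

def feeding_quality(t_set: float) -> float:
    """Assumed saturating quality curve: share of correctly oriented parts."""
    return 0.6 + 0.38 * (1.0 - np.exp(-t_set / 120.0))

def total_time(t_set: float, batch_size: int) -> float:
    q = feeding_quality(t_set)
    expected_feeds = batch_size / q        # each part is re-fed until oriented correctly
    return t_set + expected_feeds / FEED_RATE

if __name__ == "__main__":
    for batch in (200, 2000, 20000):
        t_grid = np.linspace(0, 1200, 2401)          # candidate setting times, s
        best = t_grid[np.argmin([total_time(t, batch) for t in t_grid])]
        print(f"batch {batch:>6}: optimal setting time = {best:6.1f} s")
```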

Keywords: aerodynamic feeding system, batch size, optimisation, setting time

Procedia PDF Downloads 257
18547 Variation of Litter Chemistry under Intensified Drought: Consequences on Flammability

Authors: E. Ormeno, C. Gutigny, J. Ruffault, J. Madrigal, M. Guijarro, C. Lecareux, C. Ballini

Abstract:

Mediterranean plant species feature numerous metabolic and morpho-physiological responses crucial to survival under both typical Mediterranean drought conditions and the aggravated drought expected under future climate change. Whether these adaptive responses will, in turn, increase ecosystem perturbation in terms of fire hazard is an issue that needs to be addressed. The aim of this study was to test whether recurrent and aggravated drought in the Mediterranean area favors the accumulation of waxes in leaf litter, with an eventual increase in litter flammability. The study was conducted in 2017 in a garrigue in Southern France dominated by Quercus coccifera, where two drought treatments were used: a treatment with recurrent aggravated drought consisting of ten rain exclusion structures which have withdrawn part of the annual precipitation since January 2012, and a natural drought treatment where Q. coccifera stands are free of such structures and thus grow under natural precipitation. Waxes were extracted with an organic solvent and analyzed by GC-MS, and litter flammability was assessed through measurements of the ignition delay, flame residence time and flame intensity (flame height) using an epiradiator, as well as the heat of combustion using an oxygen bomb calorimeter. Results show that after 5 years of rain restriction, the wax content in the cuticle of leaf litter increases significantly compared to shrubs growing under natural precipitation, in accordance with the theoretical expectation that cuticle waxes in green leaves increase in order to limit water loss through evapotranspiration. Wax concentrations were also linearly and positively correlated with litter flammability, a correlation that relies on the high flammability of the long-chain alkanes (C25-C31) found in leaf litter waxes. This innovative investigation shows that climate change is likely to increase ecosystem fire hazard through the accumulation of highly flammable waxes in litter. It also adds valuable information about the types of metabolites that are associated with increasing litter flammability, since so far, within the leaf metabolic profile, only terpene-like compounds had been related to plant flammability.

Keywords: cuticular waxes, drought, flammability, litter

Procedia PDF Downloads 171
18546 A Study on How to Link BIM Services to Cloud Computing Architecture

Authors: Kim Young-Jin, Kim Byung-Kon

Abstract:

Although more efforts to expand the application of BIM (Building Information Modeling) technologies have been pursued in recent years than ever, there have been various challenges in doing so, including a lack or absence of relevant institutions, the high cost of building BIM-related infrastructure, incompatible processes, etc. This, in turn, has led to a longer delay in the expansion of their application than expected at an early stage. In particular, attempts to save the costs of building BIM-related infrastructure and to provide various BIM services compatible with domestic processes include studies linking BIM and cloud computing technologies. In this study, the authors attempted to develop a cloud BIM service operation model by analyzing the level of BIM application in the construction sector and deriving relevant service areas, and to find how to link BIM services to the cloud operation model, for example through archiving BIM data and creating a revenue structure so that the BIM services may grow spontaneously, while considering the demand for cloud resources.

Keywords: construction IT, BIM (building information modeling), cloud computing, BIM service based cloud computing

Procedia PDF Downloads 487
18545 A Study on the Relationship Between Adult Videogaming and Wellbeing, Health, and Labor Supply

Authors: William Marquis, Fang Dong

Abstract:

There has been a growing concern in recent years over the economic and social effects of adult video gaming. It has been estimated that the number of people who played video games during the COVID-19 pandemic is close to three billion, and there is evidence that this form of entertainment is here to stay. Many people are concerned that this growing use of time could crowd out time that could be spent on alternative forms of entertainment with family, friends, sports, and other social activities that build community. For example, recent studies of children suggest that playing videogames crowds out time that could be spent on homework, watching TV, or in other social activities. Similar studies of adults have shown that video gaming is negatively associated with earnings, time spent at work, and socializing with others. The primary objective of this paper is to examine how time adults spend on video gaming could displace time they could spend working and on activities that enhance their health and well-being. We use data from the American Time Use Survey (ATUS), maintained by the Bureau of Labor Statistics, to analyze the effects of time-use decisions on three measures of well-being. We pool the ATUS Well-being Module for multiple years, 2010, 2012, 2013, and 2021, along with the ATUS Activity and Who files for these years. This pooled data set provides three broad measures of well-being, e.g., health, life satisfaction, and emotional well-being. Seven variants of each are used as a dependent variable in different multivariate regressions. We add to the existing literature in the following ways. First, we investigate whether the time adults spend in video gaming crowds out time spent working or in social activities that promote health and life satisfaction. Second, we investigate the relationship between adult gaming and their emotional well-being, also known as negative or positive affect, a factor that is related to depression, health, and labor market productivity. The results of this study suggest that the time adult gamers spend on video gaming has no effect on their supply of labor, a negligible effect on their time spent socializing and studying, and mixed effects on their emotional well-being, such as increasing feelings of pain and reducing feelings of happiness and stress.

Keywords: online gaming, health, social capital, emotional wellbeing

Procedia PDF Downloads 45
18544 Water Droplet Impact on Vibrating Rigid Superhydrophobic Surfaces

Authors: Jingcheng Ma, Patricia B. Weisensee, Young H. Shin, Yujin Chang, Junjiao Tian, William P. King, Nenad Miljkovic

Abstract:

Water droplet impact on surfaces is a ubiquitous phenomenon in both nature and industry. The transfer of mass, momentum and energy can be influenced by the time of contact between droplet and surface. In order to reduce the contact time, we study the influence of substrate motion prior to impact on the dynamics of droplet recoil. Using optical high-speed imaging, we investigated the impact dynamics of macroscopic water droplets (~2 mm) on rigid nanostructured superhydrophobic surfaces vibrating at 60 – 300 Hz and amplitudes of 0 – 3 mm. In addition, we studied the influence of the phase of the substrate at the moment of impact on total contact time. We demonstrate that substrate vibration can alter droplet dynamics and decrease total contact time by as much as 50% compared to impact on stationary rigid superhydrophobic surfaces. Impact analysis revealed that the vibration frequency mainly affected the maximum contact time, while the amplitude of vibration had little direct effect on the contact time. Through mathematical modeling, we show that the oscillation amplitude influences the probability density function of droplet impact at a given phase, and thus indirectly influences the average contact time. We also observed more vigorous droplet splashing and breakup during impact at larger amplitudes. Through semi-empirical mathematical modeling, we describe the relationship between contact time and vibration frequency, phase, and amplitude of the substrate. We also show that the maximum acceleration during the impact process is better suited as a threshold parameter for the onset of splashing than a Weber-number criterion. This study not only provides new insights into droplet impact physics on vibrating surfaces, but also develops guidelines for the rational design of surfaces to achieve controllable droplet wetting in applications utilizing vibration.
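
The indirect amplitude effect described above can be illustrated with a small Monte-Carlo sketch: droplets released at random times fall onto a sinusoidally vibrating substrate, the oscillation phase at impact is recorded, and an assumed phase-dependent contact-time law is averaged over that phase distribution. The release conditions and the contact-time modulation are illustrative assumptions, not the authors' semi-empirical model.

```python
# Monte-Carlo sketch: larger substrate amplitude reshapes the distribution of the
# oscillation phase at the moment of impact, and thereby the average contact time.
# Droplet speed, release height and the phase-dependent contact-time law are assumed.
import numpy as np

def impact_phase(release_time: float, f: float, amp: float,
                 v: float = 1.0, h0: float = 5e-3, dt: float = 5e-6) -> float:
    """March a droplet falling at speed v until it meets the vibrating substrate;
    return the substrate oscillation phase at that instant."""
    t, z = release_time, h0
    while z > amp * np.sin(2 * np.pi * f * t):
        z -= v * dt
        t += dt
    return (2 * np.pi * f * t) % (2 * np.pi)

def mean_contact_time(f: float, amp: float, n: int = 500, seed: int = 0) -> float:
    rng = np.random.default_rng(seed)
    phases = np.array([impact_phase(rng.uniform(0.0, 1.0 / f), f, amp) for _ in range(n)])
    tau0 = 12e-3                                  # contact time on a static surface, s (assumed)
    tau = tau0 * (1.0 - 0.25 * np.cos(phases))    # assumed phase-dependent modulation
    return float(tau.mean())

if __name__ == "__main__":
    for amp in (0.0, 1e-3, 3e-3):
        print(f"amplitude {amp * 1e3:.0f} mm -> mean contact time "
              f"{mean_contact_time(120.0, amp) * 1e3:.2f} ms")
```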

Keywords: contact time, impact dynamics, oscillation, pear-shaped droplet

Procedia PDF Downloads 454
18543 The Use of Network Tool for Brain Signal Data Analysis: A Case Study with Blind and Sighted Individuals

Authors: Cleiton Pons Ferreira, Diana Francisca Adamatti

Abstract:

Advancements in computer technology have made it possible to obtain information for research in biology and neuroscience. In order to interpret the data from such studies, networks have long been used to represent important biological processes, and the use of these tools has shifted from purely illustrative and didactic to more analytic, even including interaction analysis and hypothesis formulation. Many studies have involved this application, but not directly for the interpretation of data obtained from brain functions, calling for new perspectives of development in neuroinformatics using existing models and tools already disseminated by bioinformatics. This study includes an analysis of neurological data through electroencephalogram (EEG) signals, using Cytoscape, an open-source software tool for visualizing complex networks in biological databases. The data were obtained from a comparative case study developed in research from the University of Rio Grande (FURG), using the EEG signals from a Brain-Computer Interface (BCI) with 32 electrodes recorded from a blind and a sighted individual during the execution of an activity that stimulated spatial ability. This study intends to present results that lead to better ways to use and adapt techniques that support the processing of brain signal data, in order to elevate understanding and learning in neuroscience.
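
A minimal sketch of this workflow is shown below: channels become nodes, strong pairwise signal correlations become weighted edges, and the resulting graph is exported as GraphML, a format Cytoscape can import. The synthetic data and the correlation threshold are illustrative assumptions.

```python
# Turn multi-channel EEG into a network explorable in Cytoscape: nodes are channels,
# edges are strong pairwise correlations, exported as GraphML. Synthetic data and
# the 0.6 threshold are illustrative assumptions, not the FURG study's recordings.
import numpy as np
import networkx as nx

def eeg_to_network(data: np.ndarray, channel_names, threshold: float = 0.6) -> nx.Graph:
    """data: (n_channels, n_samples). Edge weight = |Pearson correlation|."""
    corr = np.corrcoef(data)
    g = nx.Graph()
    g.add_nodes_from(channel_names)
    n = len(channel_names)
    for i in range(n):
        for j in range(i + 1, n):
            w = abs(corr[i, j])
            if w >= threshold:
                g.add_edge(channel_names[i], channel_names[j], weight=float(w))
    return g

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    channels = [f"ch{i:02d}" for i in range(32)]
    eeg = rng.standard_normal((32, 10_000))
    eeg[1] = 0.8 * eeg[0] + 0.2 * eeg[1]          # inject one correlated pair
    g = eeg_to_network(eeg, channels)
    print("density:", round(nx.density(g), 3), "| degree centrality of ch00:", nx.degree_centrality(g)["ch00"])
    nx.write_graphml(g, "eeg_network.graphml")    # file can be opened in Cytoscape
```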

Keywords: neuroinformatics, bioinformatics, network tools, brain mapping

Procedia PDF Downloads 182
18542 Machine Learning Based Smart Beehive Monitoring System Without Internet

Authors: Esra Ece Var

Abstract:

Beekeeping plays an essential role in terms of both agricultural yields and the agricultural economy: bees produce honey, wax, royal jelly, apitoxin, pollen, and propolis. Nowadays, these natural products are increasingly suitable and preferred for nutrition, food supplements, medicine, and industry. However, to produce organic honey, the majority of apiaries are located in remote or distant rural areas where utilities such as electricity and an Internet network are not available. Additionally, due to colony failures, world honey production decreases year by year despite the increase in the number of beehives. The objective of this paper is to develop a smart beehive monitoring system for apiaries, including those that do not have access to an Internet network. In this context, the temperature and humidity inside the beehive and the ambient temperature were measured with RFID sensors. The control center, where all sensor data are sent and stored, has a GSM module used to warn the beekeeper via SMS when an anomaly is detected. Simultaneously, using the collected data, an unsupervised machine learning algorithm is used for detecting anomalies and calibrating the warning system. The results show that the smart beehive monitoring system can detect fatal anomalies up to 4 weeks prior to colony loss.
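
The unsupervised anomaly-detection step might look like the sketch below, in which hive temperature, hive humidity, and ambient temperature readings are scored and a flagged reading triggers the SMS warning. The abstract does not name the algorithm, so Isolation Forest is a stand-in assumption, and all sensor values are simulated.

```python
# Sketch of the unsupervised anomaly detection: hive temperature/humidity and ambient
# temperature readings are scored by an Isolation Forest; an anomalous reading would
# trigger the GSM/SMS warning. The algorithm choice and all values are assumptions.
import numpy as np
from sklearn.ensemble import IsolationForest

def train_detector(history: np.ndarray) -> IsolationForest:
    """history: (n_readings, 3) = [hive_temp, hive_humidity, ambient_temp]."""
    return IsolationForest(contamination=0.01, random_state=0).fit(history)

def check_reading(model: IsolationForest, reading: np.ndarray) -> bool:
    """Return True if the reading is anomalous (would trigger the SMS warning)."""
    return model.predict(reading.reshape(1, -1))[0] == -1

if __name__ == "__main__":
    rng = np.random.default_rng(7)
    # Healthy colony: brood nest held near 35 C, ~60% humidity, variable ambient temperature
    history = np.column_stack([
        rng.normal(35.0, 0.5, 5000),
        rng.normal(60.0, 5.0, 5000),
        rng.normal(18.0, 8.0, 5000),
    ])
    model = train_detector(history)
    normal = np.array([34.8, 62.0, 15.0])
    failing = np.array([22.0, 85.0, 15.0])   # hive temperature collapsing toward ambient
    print("normal reading anomalous?", check_reading(model, normal))
    print("failing reading anomalous?", check_reading(model, failing))
```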

Keywords: beekeeping, smart systems, machine learning, anomaly detection, apiculture

Procedia PDF Downloads 239
18541 Automatic Queuing Model Applications

Authors: Fahad Suleiman

Abstract:

Queuing, in a medical system, is the process of moving patients in a specific sequence to a specific service according to the patients' nature of illness. The term scheduling stands for the process of computing a schedule; this may be done by a queuing-based scheduler. This paper focuses on the medical consultancy system, the different queuing algorithms that are used in healthcare systems to serve patients, and the average waiting time. The aim of this paper is to build an automatic queuing system for organizing the medical queue that can analyse the queue status and decide which patient to serve. The new queuing architecture model can switch between different scheduling algorithms according to the testing results and the average waiting time. The main innovation of this work is that the average waiting time is taken into account in the processing, together with the process of switching to the scheduling algorithm that gives the best average waiting time.
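
The switching idea can be sketched as follows: the current queue is evaluated under several candidate scheduling policies, and the system switches to the one with the lowest average waiting time. The patient data are simulated, and the three policies are common textbook examples rather than necessarily the authors' exact algorithms.

```python
# Sketch of the core idea: evaluate several scheduling policies on the current queue
# and switch to the one with the lowest average waiting time. Patients and service
# times are simulated; the policies are illustrative stand-ins.
from dataclasses import dataclass
import random

@dataclass
class Patient:
    arrival: float        # minutes since clinic opened
    service: float        # estimated consultation length, minutes
    severity: int         # 1 = most urgent

def average_wait(queue, key) -> float:
    """Average waiting time if patients are served by a single server in 'key' order."""
    clock, total = 0.0, 0.0
    for p in sorted(queue, key=key):
        start = max(clock, p.arrival)
        total += start - p.arrival
        clock = start + p.service
    return total / len(queue)

POLICIES = {
    "FCFS": lambda p: p.arrival,
    "Shortest-job-first": lambda p: p.service,
    "Severity-first": lambda p: (p.severity, p.arrival),
}

if __name__ == "__main__":
    random.seed(3)
    queue = [Patient(random.uniform(0, 60), random.uniform(5, 30), random.randint(1, 3))
             for _ in range(12)]
    waits = {name: average_wait(queue, key) for name, key in POLICIES.items()}
    for name, w in waits.items():
        print(f"{name:>20}: {w:5.1f} min")
    print("switch to:", min(waits, key=waits.get))
```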

Keywords: queuing systems, queuing system models, scheduling algorithms, patients

Procedia PDF Downloads 354
18540 SNR Classification Using Multiple CNNs

Authors: Thinh Ngo, Paul Rad, Brian Kelley

Abstract:

Noise estimation is essential in today's wireless systems for power control, adaptive modulation, interference suppression and quality of service. Deep learning (DL) has already been applied in the physical layer for modulation and signal classification. Unacceptably low accuracy of less than 50% is found to undermine the traditional application of DL classification to SNR prediction. In this paper, we use a divide-and-conquer algorithm and a classifier fusion method to simplify SNR classification and thereby enhance DL learning and prediction. Specifically, multiple CNNs are used for classification rather than a single CNN. Each CNN performs a binary classification of a single SNR with two labels: less than, or greater than or equal to, its threshold. Together, multiple CNNs are combined to effectively classify over a range of SNR values from −20 ≤ SNR ≤ 32 dB. We use pre-trained CNNs to predict SNR over a wide range of joint channel parameters including multiple Doppler shifts (0, 60, 120 Hz), power-delay profiles, and signal-modulation types (QPSK, 16-QAM, 64-QAM). The approach achieves individual SNR prediction accuracy of 92%, composite accuracy of 70% and prediction convergence one order of magnitude faster than that of traditional estimation.
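
The divide-and-conquer and fusion structure can be sketched without any deep-learning dependency, as below: one binary classifier per SNR threshold decides "below" versus "at or above", and the decisions are fused into an SNR class by counting the exceeded thresholds. The stand-in threshold classifiers and the 4 dB step are illustrative assumptions; a real system would feed IQ samples to trained CNNs.

```python
# Sketch of the divide-and-conquer / fusion idea: one binary classifier per SNR
# threshold, whose decisions are fused into an SNR class. Simple noisy threshold
# tests stand in for the trained CNNs.
import numpy as np

THRESHOLDS_DB = np.arange(-20, 33, 4)      # one binary classifier per threshold (assumed 4 dB step)

def binary_decisions(true_snr_db: float) -> np.ndarray:
    """Stand-in for the bank of CNNs: each 'CNN' decides SNR >= its threshold,
    based on an imperfect per-classifier view of the signal."""
    noisy_estimate = true_snr_db + np.random.normal(0, 1.5)
    return (noisy_estimate >= THRESHOLDS_DB).astype(int)

def fuse(decisions: np.ndarray) -> float:
    """Fusion rule: the predicted class is set by how many thresholds were exceeded."""
    k = int(decisions.sum())
    return float(THRESHOLDS_DB[k - 1]) if k > 0 else float(THRESHOLDS_DB[0] - 4)

if __name__ == "__main__":
    np.random.seed(0)
    true_snrs = np.random.choice(THRESHOLDS_DB, size=10_000)
    preds = np.array([fuse(binary_decisions(s)) for s in true_snrs])
    print("exact-class accuracy:", round(float((preds == true_snrs).mean()), 3))
```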

Keywords: classification, CNN, deep learning, prediction, SNR

Procedia PDF Downloads 134
18539 Students’ Level of Knowledge Construction and Pattern of Social Interaction in an Online Forum

Authors: K. Durairaj, I. N. Umar

Abstract:

The asynchronous discussion forum is one of the most widely used activities in learning management system environments. Online forums allow participants to interact and construct knowledge, and can be used to complement face-to-face sessions in blended learning courses. However, the extent to which students perceive the benefits or advantages of forums remains to be seen. Through content and social network analyses, instructors are able to gauge the students' engagement and level of knowledge construction. Thus, this study aims to analyze the students' level of knowledge construction and their participation level in an online discussion. It also investigates the relationship between the level of knowledge construction and their social interaction patterns. The sample involves 23 students undertaking a master's course at one public university in Malaysia. The asynchronous discussion forum was conducted for three weeks as part of the course requirement. The findings indicate that the level of knowledge construction is quite low. Also, a density value of 0.11 indicates that the overall communication among the participants in the forum is low. This study reveals strong and significant correlations between SNA measures (in-degree centrality, out-degree centrality) and the level of knowledge construction. Thus, allocating these active students to different groups helps interactive discussion take place. Finally, based upon the findings, some recommendations to increase students' level of knowledge construction and for further research are proposed.
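
The social-network-analysis measures mentioned above (density, in-degree and out-degree centrality) can be computed from a who-replied-to-whom graph as sketched below; the reply list is hypothetical, standing in for the 23 students' actual forum posts.

```python
# SNA measures for an asynchronous forum: replies form a directed graph, from which
# density and in-/out-degree centrality are computed. The reply list is invented
# for illustration only.
import networkx as nx

replies = [                      # (author_of_reply, author_replied_to) - hypothetical data
    ("S01", "S02"), ("S01", "S03"), ("S02", "S01"), ("S03", "S04"),
    ("S04", "S01"), ("S05", "S01"), ("S02", "S04"), ("S01", "S05"),
]

g = nx.DiGraph()
g.add_nodes_from(f"S{i:02d}" for i in range(1, 24))   # 23 students, some never interact
g.add_edges_from(replies)

print("network density:", round(nx.density(g), 3))
in_c = nx.in_degree_centrality(g)
out_c = nx.out_degree_centrality(g)
print("most replied-to student:", max(in_c, key=in_c.get),
      "| most active replier:", max(out_c, key=out_c.get))
```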

Keywords: asynchronous discussion forums, content analysis, knowledge construction, social network analysis

Procedia PDF Downloads 373
18538 Gender and Advertisements: A Content Analysis of Pakistani Prime Time Advertisements

Authors: Aaminah Hassan

Abstract:

Advertisements carry great potential to influence our lives because they are crafted to meet particular ends. Stereotypical representation in advertisements is capable of forming unconscious attitudes among people towards any gender and their abilities. This study focuses on gender representation in Pakistani prime time advertisements. For this purpose, 13 advertisements were selected from three different categories (foods and beverages, cosmetics, and cell phones and cellular networks) from the prime time slots of one of the leading Pakistani entertainment channels, 'Urdu 1'. Both quantitative and qualitative analyses were carried out for a range of variables such as gender, age, roles, activities, setting, appearance and voice-overs. The results revealed that gender representation in these advertisements is stereotypical. Moreover, in a few instances, the portrayal of women is not only culturally inappropriate but also demeaning to the image of women; their bodily charm is used to promote products. Comparing different entertainment channels for their prime time advertisements and broadening the scope of this research would yield greater implications for researchers who want to carry out similar research. It is hoped that the current study will help promote media literacy among viewers and media authorities in Pakistan.

Keywords: advertisements, content analysis, gender, prime time

Procedia PDF Downloads 214
18537 Optimization of Operational Water Quality Parameters in a Drinking Water Distribution System Using Response Surface Methodology

Authors: Sina Moradi, Christopher W. K. Chow, John Van Leeuwen, David Cook, Mary Drikas, Patrick Hayde, Rose Amal

Abstract:

Chloramine is commonly used as a disinfectant in drinking water distribution systems (DWDSs), particularly in Australia and the USA. Maintaining a chloramine residual throughout the DWDS is important in ensuring microbiologically safe water is supplied at the customer’s tap. In order to simulate how chloramine behaves when it moves through the distribution system, a water quality network model (WQNM) can be applied. In this work, the WQNM was based on mono-chloramine decomposition reactions, which enabled prediction of mono-chloramine residual at different locations through a DWDS in Australia, using the Bentley commercial hydraulic package (Water GEMS). The accuracy of WQNM predictions is influenced by a number of water quality parameters. Optimization of these parameters in order to obtain the closest results in comparison with actual measured data in a real DWDS would result in both cost reduction as well as reduction in consumption of valuable resources such as energy and materials. In this work, the optimum operating conditions of water quality parameters (i.e. temperature, pH, and initial mono-chloramine concentration) to maximize the accuracy of mono-chloramine residual predictions for two water supply scenarios in an entire network were determined using response surface methodology (RSM). To obtain feasible and economical water quality parameters for highest model predictability, Design Expert 8.0 software (Stat-Ease, Inc.) was applied to conduct the optimization of three independent water quality parameters. High and low levels of the water quality parameters were considered, inevitably, as explicit constraints, in order to avoid extrapolation. The independent variables were pH, temperature and initial mono-chloramine concentration. The lower and upper limits of each variable for two water supply scenarios were defined and the experimental levels for each variable were selected based on the actual conditions in studied DWDS. It was found that at pH of 7.75, temperature of 34.16 ºC, and initial mono-chloramine concentration of 3.89 (mg/L) during peak water supply patterns, root mean square error (RMSE) of WQNM for the whole network would be minimized to 0.189, and the optimum conditions for averaged water supply occurred at pH of 7.71, temperature of 18.12 ºC, and initial mono-chloramine concentration of 4.60 (mg/L). The proposed methodology to predict mono-chloramine residual can have a great potential for water treatment plant operators in accurately estimating the mono-chloramine residual through a water distribution network. Additional studies from other water distribution systems are warranted to confirm the applicability of the proposed methodology for other water samples.
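
The response-surface step can be sketched as below: a quadratic surface is fitted to RMSE values observed at designed combinations of pH, temperature, and initial mono-chloramine dose, and the fitted surface is then minimised within the design bounds. The synthetic RMSE values, bounds, and design points are illustrative assumptions, not the Design Expert study's data.

```python
# RSM sketch: fit a quadratic response surface to RMSE values at designed combinations
# of pH, temperature and initial mono-chloramine dose, then minimise it within the
# design bounds. The synthetic RMSE function and bounds are assumptions.
import numpy as np
from scipy.optimize import minimize

def quad_features(x):
    ph, t, c = x.T
    return np.column_stack([np.ones_like(ph), ph, t, c, ph*t, ph*c, t*c, ph**2, t**2, c**2])

# --- pretend "experiments": RMSE of the network model at design points ---------------
rng = np.random.default_rng(42)
bounds = [(7.0, 8.5), (10.0, 35.0), (2.0, 5.0)]                  # pH, temp (C), NH2Cl (mg/L)
design = np.array([[p, t, c] for p in (7.0, 7.75, 8.5)
                              for t in (10, 22.5, 35)
                              for c in (2, 3.5, 5)])
rmse = (0.19 + 0.04 * (design[:, 0] - 7.75) ** 2 + 0.0004 * (design[:, 1] - 30) ** 2
        + 0.01 * (design[:, 2] - 4.0) ** 2 + rng.normal(0, 0.002, len(design)))

# --- fit the quadratic response surface and minimise it ------------------------------
coef, *_ = np.linalg.lstsq(quad_features(design), rmse, rcond=None)
surface = lambda x: float(quad_features(np.atleast_2d(x)) @ coef)
res = minimize(surface, x0=[7.75, 22.5, 3.5], bounds=bounds)
print("optimal pH, temperature, NH2Cl:", np.round(res.x, 2), "| predicted RMSE:", round(float(res.fun), 3))
```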

Keywords: chloramine decay, modelling, response surface methodology, water quality parameters

Procedia PDF Downloads 225
18536 GPRS Based Automatic Metering System

Authors: Constant Akama, Frank Kulor, Frederick Agyemang

Abstract:

All over the world, due to increasing population, electric power distribution companies are looking for more efficient ways of reading electricity meters. In Ghana, the prepaid metering system was introduced in 2007 to replace the manual system of reading, which was fraught with inefficiencies. However, the prepaid system in Ghana is not capable of integration with online systems such as e-commerce platforms and remote monitoring systems. In this paper, we present a design framework for an automatic metering system that can be integrated with e-commerce platforms and remote monitoring systems. The meter was designed using the ADE7755 energy-metering IC, which reads the energy consumption; the reading is processed by a microcontroller connected to a SIM900 General Packet Radio Service module containing a GSM chip provisioned with an Access Point Name. The system also has a billing server and a management server located at the premises of the utility company, which communicate with the meter over a Virtual Private Network and GPRS. With this system, customers can buy credit online and the credit is transferred securely to the meter. Also, when a fault is reported, the utility company can log into the meter remotely through the management server to troubleshoot the problem.

Keywords: access point name, general packet radio service, GSM, virtual private network

Procedia PDF Downloads 299
18535 BIM Modeling of Site and Existing Buildings: Case Study of ESTP Paris Campus

Authors: Rita Sassine, Yassine Hassani, Mohamad Al Omari, Stéphanie Guibert

Abstract:

Building Information Modelling (BIM) is the process of creating, managing, and centralizing information during the building lifecycle. BIM can be used throughout a construction project, from the initiation phase to the planning and execution phases and on to the maintenance and lifecycle management phase. For existing buildings, BIM can be used for specific applications such as lifecycle management. However, most existing buildings do not have a BIM model. Creating a compatible BIM for existing buildings is very challenging: it requires special equipment for data capturing and effort to convert these data into a BIM model. The main difficulties for such projects are to define the data needed, the level of development (LOD), and the methodology to be adopted. In addition to managing information for an existing building, studying the impact of the built environment is a challenging topic. Therefore, integrating the existing terrain that surrounds buildings into the digital model is essential to be able to run several simulations, such as flood simulation, energy simulation, etc. Making a replica of the physical model and updating its information in real time to create its Digital Twin (DT) is very important. The Digital Terrain Model (DTM) represents the ground surface of the terrain by a set of discrete points with unique height values over 2D points, based on a reference surface (e.g., mean sea level, geoid, and ellipsoid). In addition, information related to the type of pavement materials, the types and heights of vegetation, and damaged surfaces can be integrated. Our aim in this study is to define the methodology to be used in order to provide a 3D BIM model for the site and the existing buildings, based on the case study of the "Ecole Spéciale des Travaux Publiques (ESTP Paris)" school of engineering campus. The property is located on a hilly site of 5 hectares and is composed of more than 20 buildings with a total area of 32,000 square meters and heights between 50 and 68 meters. In this work, the campus precise levelling grid according to the NGF-IGN69 altimetric system and the grid control points are computed according to the (Réseau Géodésique Français) RGF93 – Lambert 93 French system with different methods: (i) land topographic surveying using a robotic total station, (ii) a GNSS (Global Navigation Satellite System) levelling grid with NRTK (Network Real Time Kinematic) mode, and (iii) point clouds generated by laser scanning. These technologies allow the computation of multiple building parameters such as boundary limits, the number of floors, the georeferencing of the floors, the georeferencing of the 4 base corners of each building, etc. Once the input data are identified, the digital model of each building is created. The DTM is also modeled. The process of altimetric determination is complex and requires effort in order to collect and analyze multiple data formats. Since many technologies can be used to produce digital models, different file formats such as DraWinG (DWG), LASer (LAS), Comma-Separated Values (CSV), Industry Foundation Classes (IFC) and ReViT (RVT) will be generated. Checking the interoperability between BIM models is very important. In this work, all models are linked together and shared on the 3DEXPERIENCE collaborative platform.

Keywords: building information modeling, digital terrain model, existing buildings, interoperability

Procedia PDF Downloads 112
18534 A Machine Learning Based Method to Detect System Failure in Resource Constrained Environment

Authors: Payel Datta, Abhishek Das, Abhishek Roychoudhury, Dhiman Chattopadhyay, Tanushyam Chattopadhyay

Abstract:

Machine learning (ML) and deep learning (DL) are predominantly used in image/video processing, natural language processing (NLP), and audio and speech recognition, but not so much in system performance evaluation. In this paper, the authors describe the architecture of an abstraction layer constructed using ML/DL to detect system failure. The proposed system detects system failure by evaluating the performance metrics of an IoT service deployment under a constrained infrastructure environment. The system has been tested on a manually annotated data set containing different system metrics, such as the number of threads, throughput, average response time, CPU usage, memory usage, and network input/output, captured in different hardware environments such as edge (Atom-based gateway) and cloud (AWS EC2). The main challenge in developing such a system is that the classification accuracy should be 100%, as an error in the system degrades the service performance and consequently affects the reliability and high availability that are mandatory for an IoT system. The proposed ML/DL classifiers work with 100% accuracy on the data set of nearly 4,000 samples captured within the organization.
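
A sketch of the failure-detection layer is given below: a classifier trained on annotated system metrics (threads, throughput, response time, CPU, memory, network I/O) predicts whether a deployment is in a failure condition. A random forest on synthetic, well-separated data stands in for the authors' unnamed ML/DL models, which is why it also reaches near-perfect accuracy here.

```python
# Failure-detection sketch: a classifier over annotated system metrics predicts
# failure vs healthy. The random forest and the synthetic data are stand-ins for
# the authors' models and their ~4,000 annotated samples.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
n = 4000
healthy = np.column_stack([
    rng.normal(40, 10, n // 2),    # threads
    rng.normal(900, 100, n // 2),  # throughput, req/s
    rng.normal(120, 30, n // 2),   # average response time, ms
    rng.normal(45, 10, n // 2),    # CPU %
    rng.normal(55, 10, n // 2),    # memory %
    rng.normal(20, 5, n // 2),     # network I/O, MB/s
])
failing = healthy + rng.normal([60, -500, 600, 45, 35, -10], 10, healthy.shape)
X = np.vstack([healthy, failing])
y = np.array([0] * (n // 2) + [1] * (n // 2))   # 1 = failure condition

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0, stratify=y)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print("test accuracy:", accuracy_score(y_te, clf.predict(X_te)))   # near 1.0 on separable data
```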

Keywords: machine learning, system performance, performance metrics, IoT, edge

Procedia PDF Downloads 195
18533 An Analysis of Space Syntax Used to Explore the Development of Hangzhou City's Centrality

Authors: Liu Junzhu

Abstract:

In contemporary China, cities are expanding at an amazing speed, and because of the interference of unexpected events, spatial structure can change in a short time, which affects the vitality of new urban districts; unfortunately, this phenomenon is very common. On the one hand, it fails to achieve the goals of city planning; on the other hand, it is unfavourable to the sustainable development of the city. Bill Hillier's theory of Space Syntax shows the organization pattern of each space; it explains the characteristics of urban spatial patterns and their transformation regularities from the point of view of self-organization in the system, and it also gives confirmatory and predictive approaches to buildings and cities. This paper used an axial model to summarize Hangzhou City's spatial structure and, through computer analysis based on Space Syntax, enhanced the comprehensive understanding of its macroscopic space and environment, spatial structure, development trends, etc. From that, it helps us to know the operating laws of the urban system and to understand Hangzhou City's spatial pattern and its indirect social effects more clearly. Thus, policy and planning can comply with the tendency of city development in progress and plan our cities' future sustainably.

Keywords: sustainable urban design, space syntax, spatial network, segment angular analysis, social inclusion

Procedia PDF Downloads 462
18532 A Study of Lean Principles Implementation in the Libyan Healthcare and Industry Sectors

Authors: Nasser M. Amaitik, Ngwan F. Elsagzli

Abstract:

The Lean technique is very important in the service and industrial fields. It is defined as an effective tool to eliminate waste. In lean, waste is defined as anything which does not add value to the end product. There are wastes that can be avoided, but some are unavoidable for many reasons. The present study aims to apply the principles of lean in two different sectors: healthcare and industry. Two case studies were selected for the experimental work. The first case was Al-Jalaa Hospital, while the second case study was the Technical Company of Aluminum Sections in Benghazi, Libya. In both case studies, the Value Stream Map (VSM) of the current state was constructed. The proposed plans were implemented by merging or eliminating procedures or processes. The results obtained from both case studies showed improvement in capacity, idle time and utilized time.

Keywords: healthcare service delivery, idle time, lean principles, utilized time, value stream mapping, wastes

Procedia PDF Downloads 287
18531 Management of Empty Containers by Consignees in the Hinterland

Authors: Benjamin Legros, Jan Fransoo, Oualid Jouini

Abstract:

This study aims to evaluate street-turn strategies for empty container repositioning in the hinterland. Containers arrive over time at the (importer) consignee, while the demand for containers arises from the (exporter) shipper. A match can be operated between an empty container from the consignee and the load from the shipper. Therefore, we model the system as a double-ended queue with non-zero matching time and a limited number of resources in order to optimize the repositioning decisions. We determine the performance measures when the consignee operates using a fixed withholding threshold policy. We show that the matching time mainly plays a role in the matching proportion, while under a certain duration, it only marginally impacts the consignee's inventory policy and cost per container. Also, the withholding level is mainly determined by the shipper's production rate.
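
A simulation sketch of this policy is given below: empty containers arrive at the consignee, export loads arrive from the shipper, each street-turn match takes a non-zero time, and the consignee withholds at most a fixed threshold of empties, returning the surplus to the port. Arrival rates, the matching time, and the threshold values are illustrative assumptions, not the paper's Markov decision process parameters.

```python
# Double-ended queue sketch with a fixed withholding threshold: containers vs loads,
# non-zero matching time, surplus empties returned to the port. All rates and costs
# are illustrative assumptions.
import random

def simulate(threshold, lam_containers=1.0, lam_loads=0.8, match_time=0.2,
             horizon=10_000.0, seed=0):
    random.seed(seed)
    t, stock, matched, returned = 0.0, 0, 0, 0
    while t < horizon:
        t += random.expovariate(lam_containers + lam_loads)
        if random.random() < lam_containers / (lam_containers + lam_loads):
            if stock < threshold:
                stock += 1                 # withhold the empty container
            else:
                returned += 1              # send it straight back to the port
        else:
            if stock > 0:
                stock -= 1                 # street-turn: empty meets an export load
                matched += 1
                t += match_time            # non-zero matching time occupies the yard
    return matched, returned

if __name__ == "__main__":
    for k in (0, 2, 5, 10):
        m, r = simulate(k)
        print(f"threshold {k:>2}: matched {m:>5}, returned empty {r:>5}, "
              f"match share {m / (m + r):.2f}")
```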

Keywords: container, double-ended queue, inventory, Markov decision process, non-zero matching time, street-turn

Procedia PDF Downloads 143
18530 Representing Data without Losing Compression Properties in Time Series: A Review

Authors: Nabilah Filzah Mohd Radzuan, Zalinda Othman, Azuraliza Abu Bakar, Abdul Razak Hamdan

Abstract:

Uncertain data is believed to be an important issue in building a prediction model. The main objective of time series uncertainty analysis is to formulate uncertain data in order to gain knowledge and fit a low-dimensional model prior to a prediction task. This paper discusses the performance of a number of techniques for dealing with uncertain data, specifically those which address the uncertain data condition by minimizing the loss of compression properties.

Keywords: compression properties, uncertainty, uncertain time series, mining technique, weather prediction

Procedia PDF Downloads 428
18529 Modified Estimating Equations for Deriving the Causal Effect on Survival Time with Time-Varying Covariates

Authors: Yemane Hailu Fissuh, Zhongzhan Zhang

Abstract:

A systematic observation from a defined time of origin up to a certain failure or censoring event is known as survival data. Survival analysis is a major area of interest in biostatistics and biomedical research. At the heart of most scientific and medical research inquiries lies the question of causality. Thus, the main concern of this study is to investigate the causal effect of treatment on survival time conditional on possibly time-varying covariates. The theory of causality often differs from the simple association between the response variable and predictors. A causal estimation is a scientific concept for comparing a pragmatic effect between two or more experimental arms. To evaluate the average treatment effect on the survival outcome, the estimating equation was adjusted for time-varying covariates under semi-parametric transformation models. The proposed model intuitively obtains consistent estimators for the unknown parameters and the unspecified monotone transformation functions. In this article, the proposed method estimates an unbiased average causal effect of treatment on the survival time of interest. The modified estimating equations of semiparametric transformation models have the advantage of including the time-varying effect in the model. Finally, the finite-sample performance characteristics of the estimators are demonstrated through simulation and the Stanford heart transplant real data. To this end, the average effect of a treatment on survival time is estimated after adjusting for biases raised due to the high correlation of the left-truncation and the possibly time-varying covariates. The bias in covariates is corrected by estimating a density function for the left-truncation. Besides, to relax the independence assumption between failure time and truncation time, the model incorporates the left-truncation variable as a covariate. Moreover, the expectation-maximization (EM) algorithm iteratively obtains the unknown parameters and the unspecified monotone transformation functions. To summarize the idea, the ratio of the cumulative hazard functions between the treated and untreated experimental groups gives a sense of the average causal effect for the entire population.

Keywords: a modified estimation equation, causal effect, semiparametric transformation models, survival analysis, time-varying covariate

Procedia PDF Downloads 175
18528 The Effect of Research Unit Clique-Diversity and Power Structure on Performance and Originality

Authors: Yue Yang, Qiang Wu, Xingyu Gao

Abstract:

"Organized research units" have always been an important part of academia. According to the type of organization, there are public research units, university research units, and corporate research units. Existing research has explored the research unit in some depth from several perspectives. However, there is a research gap on the closer interaction between the three from a network perspective and the impact of this interaction on their performance as well as originality. Cliques are a special kind of structure under the concept of cohesive subgroups in the field of social networks, representing particularly tightly knit teams in a network. This study develops the concepts of the diversity of clique types and the diversity of clique geography based on cliques, starting from the diversity of collaborative activities characterized by them. Taking research units as subjects and assigning values to their power in cliques based on occupational age, we explore the impact of clique diversity and clique power on their performance as well as originality and the moderating role of clique relationship strength and structural holes in them. By collecting 9094 articles published in the field of quantum communication at WoSCC over the 15 years 2007-2021, we processed them to construct annual collaborative networks between a total of 533 research units and measured the network characteristic variables using Ucinet. It was found that the type and geographic diversity of cliques promoted the performance and originality of the research units, and the strength of clique relationships positively moderated the positive effect of the diversity of clique types on performance and negatively affected the promotional relationship between the geographic diversity of cliques and performance. It also negatively affected the positive effects of clique-type diversity and clique-geography diversity on originality. Structural holes positively moderated the facilitating effect of both types of factional diversity on performance and originality. Clique power promoted the performance of the research unit, but unfavorably affected its performance on novelty. Faction relationship strength facilitated the relationship between faction rights and performance and showed negative insignificance for clique power and originality. Structural holes positively moderated the effect of clique power on performance and originality.

Keywords: research unit, social networks, clique structure, clique power, diversity

Procedia PDF Downloads 59
18527 Use of Social Media in Political Communications: Example of Facebook

Authors: Havva Nur Tarakci, Bahar Urhan Torun

Abstract:

The transformation brought to every area of life by technology, especially Internet technology, changes the structure of political communications too. The Internet, which is at the forefront of new communication technologies, affects political communications in a way that no traditional communication tool ever has: it enables interaction and a channel between receiver and sender, and it has become one of the most effective and preferred tools among political communication applications. This state of affairs, a result of technological convergence, makes the Internet an indispensable place for political communication campaigns. Political communications, meaning every kind of communication strategy that political parties, the 'actors of political communications', use with the aim of conveying their opinions and party programmes to their present and potential voters as a target group, is a type of communication that frequently uses social media tools at the present day. The electorate, consisting of different structures, is informed, directed, and managed by social media tools. Political parties easily reach their electorate by these tools without any limitation of time or place, and are also able to gather the opinions and reactions of their electorate through the interaction that is a feature of social media. In this context, Facebook, the social media platform that political parties use the most, is a communication network that has been part of our daily life since 2004. As one of the most popular social networks today, it is among the most-visited websites on a global scale. In this way, the research is based on the question, "How do political parties use Facebook in the campaigns they conduct during election periods to inform their voters?", and it aims at clarifying the Facebook usage practices of political parties. In line with this objective, the official Facebook accounts of four political parties (JDP–AKParti, PDP–BDP, RPP-CHP, NMP-MHP), which reach their voters by social media besides other communication tools, are examined, and a frame for the politics of Turkey is formed. The period of examination is restricted to two weeks in total, one week before and one week after the mayoral elections, when it is assumed that the political parties use their Facebook accounts in full swing. As a research method, content analysis is preferred, and the texts and visual elements obtained are interpreted based on this analysis.

Keywords: Facebook, political communications, social media, electorate

Procedia PDF Downloads 383
18526 Systematic Study of the Mutually Inclusive Influence of Temperature and Substitution on the Coordination Geometry of Co(II) in a Series of Coordination Polymers and Their Properties

Authors: Manasi Roy, Raju Mondal

Abstract:

During the last two decades, the synthesis and design of MOFs, or novel coordination polymers (CPs), have flourished as an emerging area of research due to their role as functional materials. Accordingly, ten new cobalt-based MOFs have been synthesized using a simple bispyrazole ligand, 4,4′-methylene-bispyrazole (H2MBP), and isophthalic acid (H2IPA) and its four 5-substituted derivatives R-H2IPA (R = COOH, OH, tBu, NH2). The major aim of this study was to validate the mutual influence of temperature and substitution on the final structural self-assembly. Five different isophthalic acid derivatives were used to study the influence of substituents, while each reaction was carried out at two different temperatures to assess the temperature effect. A clear correlation was observed between the reaction temperature and the coordination number of the cobalt atoms, which consequently changes the self-assembly pattern. Another finding is that the periodic change in coordination number brought about systematic changes in the structural network via secondary building unit selectivity. With the presence of a tunable cavity inside the network and unsaturated metal centers, the MOFs show highly encouraging photocatalytic degradation of a toxic dye, with a potential application in wastewater purification. Another fascinating aspect of this work is the construction of magnetic coordination polymers, with the occurrence of a not-so-common MCE behavior in a cobalt-based MOF.

Keywords: MOFs, temperature effect, MCE, dye degradation

Procedia PDF Downloads 136
18525 Behaviour of an RC Circuit near Extreme Point

Authors: Tribhuvan N. Soorya

Abstract:

Charging and discharging of a capacitor through a resistor can be shown as an exponential curve. Theoretically, it takes infinite time to fully charge or discharge a capacitor. The flow of charge is due to electrons, which carry a finite and fixed value of charge. If we carefully examine the charging and discharging process after several time constants, the points on the q vs. t graph become discrete and the curve becomes discontinuous. Moreover, for all practical purposes, a capacitor with charge (q0-e) can be taken as fully charged, as this introduces an error of less than one part per million. Similarly, in the discharge of a capacitor, the capacitor holding only the last electron (charge e) can be taken as fully discharged. With this, we can estimate a finite value of the time for fully charging and discharging a capacitor.
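
The finite charging time argued for above follows directly from the exponential law: during charging the undelivered charge is q0 exp(-t/RC), which drops below one electron charge at t = RC ln(q0/e). A short worked example with assumed component values is sketched below.

```python
# Worked example of the "last electron" argument: the charge still to be delivered
# during charging is q0 * exp(-t/RC); it falls below one electron charge at
# t = RC * ln(q0 / e), a finite full-charging time. Component values are assumed.
import math

E_CHARGE = 1.602e-19          # elementary charge, C

def full_charge_time(R: float, C: float, V: float) -> float:
    """Time at which the remaining (undelivered) charge equals one electron."""
    q0 = C * V                                # final charge on the capacitor
    return R * C * math.log(q0 / E_CHARGE)

if __name__ == "__main__":
    R, C, V = 10e3, 100e-6, 5.0               # 10 kOhm, 100 uF, 5 V (assumed values)
    tau = R * C
    t_full = full_charge_time(R, C, V)
    print(f"time constant: {tau:.3f} s, 'full charge' after {t_full:.2f} s = {t_full / tau:.1f} tau")
```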

Keywords: charging, discharging, RC Circuit, capacitor

Procedia PDF Downloads 443
18524 Qualitative Detection of HCV and GBV-C Co-infection in Cirrhotic Patients Using a SYBR Green Multiplex Real Time RT-PCR Technique

Authors: Shahzamani Kiana, Esmaeil Lashgarian Hamed, Merat Shahin

Abstract:

HCV and GBV-C belong to the Flaviviridae family of viruses, and GBV-C is genetically the closest virus to HCV. Cumulative research is in progress all over the world to clarify the clinical aspects of GBV-C. The possibility of interaction between HCV and GBV-C, and also its consequences for other liver diseases, are the most important clinical aspects that encourage researchers to develop a technique for the simultaneous detection of these viruses. In this study, a SYBR Green multiplex real-time RT-PCR technique, as a new economical and sensitive method, was optimized for the simultaneous detection of HCV/GBV-C in HCV-positive plasma samples. After the design and selection of two pairs of specific primers for HCV and GBV-C, SYBR Green real-time RT-PCR technique optimization was performed separately for each virus. Establishment of the multiplex PCR was the next step. Finally, our technique was performed on positive and negative plasma samples. 89 cirrhotic HCV-positive plasma samples (29 of genotype 3a and 27 of genotype 1a) were collected from patients before receiving treatment. 14% of genotype 3a and 17.1% of genotype 1a showed HCV/GBV-C co-infection. As a result, 13.48% of the 89 samples had HCV/GBV-C co-infection, which is compatible with other results from all over the world. The data showed no apparent influence of HGV co-infection on either the clinical or the virological aspects of HCV infection. Furthermore, with the application of the multiplex real-time RT-PCR technique, more time and cost could be saved in clinical research settings.

Keywords: HCV, GBV-C, cirrhotic patients, multiplex real-time RT-PCR

Procedia PDF Downloads 295