Search results for: fuzzy data
5007 A Procedure to Assess Streamflow Rating Curves and Streamflow Sequences
Authors: Elena Carcano, Mirzi Betasolo
Abstract:
This study aims to provide sub-hourly streamflow predictions and associated rating curves for small catchments with an intermittent, torrential flow regime characterized by flash floods occurring especially during April and November. The methodology entails two lumped conceptual hydrological models that work in series. The total model is based on eleven parameters and shows good flexibility in handling different input sets. The runoff coefficient, treated as an additional parameter, has contributed to improving the model's performance, while sensitivity analysis has highlighted how slight changes in the model's input can lead to changes in its output. The adopted procedure is stable and provides very practical engineering information while requiring only parsimonious input data and a small number of parameters. Based on the obtained results, the authors encourage testing this combined procedure on different hydrological scenarios in order to provide information for poorly monitored catchments and sites with outdated records.
Keywords: Streamflow rating curve, chronological data, streamflow sequences, conceptual models.
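As an illustrative aside (not the authors' eleven-parameter model chain), the sketch below shows one common way a runoff coefficient enters a lumped rainfall-runoff calculation, the rational method; all values are hypothetical.

```python
# Hypothetical illustration: the rational method Q = C * i * A, a simple lumped
# use of a runoff coefficient. NOT the paper's two conceptual models in series.
def rational_method_peak_flow(c_runoff, intensity_mm_per_h, area_km2):
    """Peak discharge in m^3/s from runoff coefficient C, rainfall intensity i, area A."""
    intensity_m_per_s = intensity_mm_per_h / 1000.0 / 3600.0  # mm/h -> m/s
    area_m2 = area_km2 * 1.0e6                                # km^2 -> m^2
    return c_runoff * intensity_m_per_s * area_m2

print(rational_method_peak_flow(0.4, 20.0, 5.0))  # ~11.1 m^3/s
```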
5006 The Role and Importance of Genome Sequencing in Prediction of Cancer Risk
Authors: M. Sadeghi, H. Pezeshk, R. Tusserkani, A. Sharifi Zarchi, A. Malekpour, M. Foroughmand, S. Goliaei, M. Totonchi, N. Ansari–Pour
Abstract:
The role and relative importance of intrinsic and extrinsic factors in the development of complex diseases such as cancer remains a controversial issue. Determining the amount of variation explained by these factors requires experimental data and statistical models. These models are nevertheless based on the occurrence and accumulation of random mutational events during stem cell division, thus rendering cancer development a stochastic outcome. We demonstrate not only that individual genome sequencing is uninformative in determining cancer risk, but also that assigning a unique genome sequence to any given individual (healthy or affected) is not meaningful. Current whole-genome sequencing approaches are therefore unlikely to realize the promise of personalized medicine. In conclusion, since the genome sequence differs from cell to cell and changes over time, determining the risk of complex diseases based on genome sequence is somewhat unrealistic, and the resulting data are likely to be inherently uninformative.
Keywords: Cancer risk, extrinsic factors, genome sequencing, intrinsic factors.
5005 The Implementation of Self-Determination Theory on the Opportunities and Challenges for Blended e-Learning in Motivating Egyptian Logistic Learners
Authors: Aisha Tarek Noour, Nick Hubbard
Abstract:
Learner motivation is considered an important component of the Blended e-Learning (BL) method. BL is an effective learning method in multiple domains and opens several opportunities for its participants to engage in the learning environment. This research explores learners' perspectives on BL according to Self-Determination Theory (SDT), identifying the opportunities and challenges of using BL in Logistics Education (LE) in Egyptian Higher Education (HE). SDT is approached through the relationships among Intrinsic Motivation (IM), Extrinsic Motivation (EM), and Amotivation (AM). A self-administered, face-to-face questionnaire was used to collect data from learners across three colleges of International Transport and Logistics (CILTs) at the Arab Academy for Science, Technology and Maritime Transport (AAST&MT) in Egypt. Six hundred and sixteen undergraduates responded to the survey, drawn from the three branches in Greater Cairo, Alexandria, and Port Said. Data were analyzed using SPSS 22 and AMOS 18.
Keywords: Intrinsic Motivation, Extrinsic Motivation, Amotivation, Blended e-Learning, Self Determination Theory.
5004 Correlations between Cleaning Frequency of Reservoir and Water Tower and Parameters of Water Quality
Authors: Chen Bi-Hsiang, Yang Hung-Wen, Lou Jie-Chung, Han Jia-Yun
Abstract:
This study sampled and analyzed the water quality of water reservoirs and water towers installed in two kinds of residential buildings and in school facilities. Water quality data were collected and correlated with the sanitization frequency of the reservoirs, obtained by questioning building managers about the inspection charts kept for the reservoir equipment. Statistical software (SPSS) was applied to the two data groups (cleaning frequency and water quality) for regression analysis to determine the optimal sanitization frequency. The correlation coefficient (R) in this paper represents the degree of correlation, with values of R ranging from +1 to -1. After investigating three categories of drinking water users, this study found that the sanitization frequency of the reservoirs significantly influenced drinking water quality: a higher sanitization frequency (more than four times per year) implied higher drinking water quality. The results indicate that water reservoirs and water towers should be sanitized at least twice annually to ensure safe drinking water.
Keywords: Cleaning frequency of sanitization, parameters of water quality, regression analysis, water reservoir & water tower.
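A minimal sketch of the correlation-then-regression step described above, with hypothetical survey numbers standing in for the collected data:

```python
# Hedged sketch (hypothetical numbers): correlating cleaning frequency with a
# water-quality parameter, as in the paper's regression analysis. R lies in [-1, +1].
from scipy import stats

cleanings_per_year = [1, 2, 2, 3, 4, 4, 5, 6]                    # hypothetical
turbidity_ntu      = [4.1, 3.5, 3.8, 2.9, 2.2, 2.5, 1.9, 1.6]    # hypothetical

r, p_value = stats.pearsonr(cleanings_per_year, turbidity_ntu)
fit = stats.linregress(cleanings_per_year, turbidity_ntu)
print(f"R = {r:.3f}, p = {p_value:.4f}, slope = {fit.slope:.3f}")
```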
5003 Mathematical Modeling of Non-Isothermal Multi-Component Fluid Flow in Pipes Applying to Rapid Gas Decompression in Rich and Base Gases
Authors: Evgeniy Burlutskiy
Abstract:
The paper presents a one-dimensional transient mathematical model of compressible, non-isothermal, multi-component fluid mixture flow in a pipe. The set of mass, momentum, and enthalpy conservation equations for the gas phase is solved in the model. Thermo-physical properties of the multi-component gas mixture are calculated by solving an Equation of State (EOS) model; the Soave-Redlich-Kwong (SRK-EOS) model is chosen. Gas mixture viscosity is calculated on the basis of the Lee-Gonzalez-Eakin (LGE) correlation. A numerical analysis of the rapid gas decompression process in rich and base natural gases is made on the basis of the proposed mathematical model. The model is successfully validated against the experimental data [1], shows very good agreement with those data over a wide range of pressure values, and predicts the decompression in rich and base gas mixtures much better than the analytical and mathematical models available in the open literature.
Keywords: Mathematical model, multi-component gas mixture flow, rapid gas decompression.
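For reference, a minimal sketch of the pure-component SRK equation of state the paper builds on; the paper itself applies it to mixtures, which additionally requires mixing rules omitted here.

```python
# Pure-component Soave-Redlich-Kwong EOS: P = RT/(V - b) - a*alpha / (V(V + b)).
import numpy as np

R = 8.314  # J/(mol*K)

def srk_pressure(T, V_molar, Tc, Pc, omega):
    """SRK pressure. T in K, V_molar in m^3/mol, Tc in K, Pc in Pa, omega = acentric factor."""
    a = 0.42748 * R**2 * Tc**2 / Pc
    b = 0.08664 * R * Tc / Pc
    m = 0.480 + 1.574 * omega - 0.176 * omega**2
    alpha = (1.0 + m * (1.0 - np.sqrt(T / Tc)))**2
    return R * T / (V_molar - b) - a * alpha / (V_molar * (V_molar + b))

# Methane (Tc = 190.6 K, Pc = 4.599 MPa, omega = 0.011) at 300 K, 1 L/mol:
print(srk_pressure(300.0, 1.0e-3, 190.6, 4.599e6, 0.011) / 1e5, "bar")
```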
5002 Fault Detection of Pipeline in Water Distribution Network System
Authors: Shin Je Lee, Go Bong Choi, Jeong Cheol Seo, Jong Min Lee, Gibaek Lee
Abstract:
Water pipe networks are installed underground, and once installed, it is difficult to recognize the state of the pipes when a leak or burst happens; consequently, remedial action is often delayed after a fault occurs. A systematic fault management system for water pipe networks is therefore required to prevent accidents and minimize losses. In this work, we develop an online fault detection system for a water pipe network using pipe data such as flow rate and pressure. A transient model describing water flow in pipelines is presented and simulated using MATLAB. Fault situations such as leaks or bursts can also be simulated, and the flow rate and pressure data at the time of the fault are collected. Faults are detected using the fast Fourier transform and the discrete wavelet transform, and the two methods are compared to find which shows the better fault detection performance.
Keywords: Fault detection, water pipeline model, fast Fourier transform, discrete wavelet transform.
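A hedged sketch of the two feature extractors being compared, applied to a synthetic pressure signal with an abrupt drop standing in for a leak (the paper's signals come from its MATLAB transient model, not this toy):

```python
import numpy as np
import pywt  # PyWavelets

fs = 100.0                                    # sampling rate, Hz (assumed)
t = np.arange(0, 10, 1 / fs)
pressure = 2.0 + 0.01 * np.random.randn(t.size)
pressure[600:] -= 0.3                         # synthetic pressure drop (leak at t = 6 s)

# FFT view: magnitude spectrum of the deviation from the mean
spectrum = np.abs(np.fft.rfft(pressure - pressure.mean()))
freqs = np.fft.rfftfreq(pressure.size, d=1 / fs)
print("dominant frequency:", freqs[np.argmax(spectrum)], "Hz")

# DWT view: detail coefficients localize the abrupt change in time,
# which a global FFT cannot do.
coeffs = pywt.wavedec(pressure, 'db4', level=4)
print("detail-band energies:", [float(np.sum(c**2)) for c in coeffs[1:]])
```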
5001 Design and Analysis of an 8T Read Decoupled Dual Port SRAM Cell for Low Power High Speed Applications
Authors: Ankit Mitra
Abstract:
Speed, power consumption, and area are some of the most important concerns in modern memory design. As we move towards deep sub-micron technologies, the problems of leakage current, noise, and cell stability due to physical parameter variation become more pronounced. In this paper we design an 8T read decoupled dual port SRAM cell with dual threshold voltage and characterize it in terms of read and write delay, read and write noise margins, data retention voltage, and leakage current. Read decoupling improves the read noise margin, and static power dissipation is reduced by using dual-Vt transistors. The results are compared with existing 6T, 8T, and 9T SRAM cells, showing the superiority of the proposed design. The cell is designed and simulated in TSPICE using a 90 nm CMOS process.
Keywords: CMOS, Dual-Port, Data Retention Voltage, 8T SRAM, Leakage Current, Noise Margin, Loop-cutting, Single-ended.
5000 The Design of the HL7 RIM-based Sharing Components for Clinical Information Systems
Authors: Wei-Yi Yang, Li-Hui Lee, Hsiao-Li Gien, Hsing-Yi Chu, Yi-Ting Chou, Der-Ming Liou
Abstract:
The Health Level Seven (HL7) Reference Information Model (RIM) consists of six backbone classes with different specialized attributes. Furthermore, to enforce semantic expression, specific mandatory vocabulary domains have been defined to represent the content values of some attributes. Because of the variety of hospital workflows, a great deal of time and labor is duplicated in developing and modifying Clinical Information Systems (CIS). This study attempts to design and develop sharing RIM-based components of the CIS for different business processes, so that the CIS contains data of a consistent format and type. Programmers can perform transactions with the RIM-based clinical repository through the sharing RIM-based components, and when developing new CIS functions, the sharing components can also be adopted in the system. These components not only satisfy physicians' needs in using a CIS but also reduce the time needed to develop new components of a system. All in all, this study offers a new viewpoint: integrating data and functions with the business processes is an easy and flexible approach to building a new CIS.
Keywords: HL7, Reference Information Model (RIM), web service, process management.
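The RIM's six backbone classes are Act, Entity, Role, Participation, ActRelationship, and RoleLink. As a hedged illustration of how sharing components might wrap some of them, the sketch below uses the backbone class names but illustrative attributes only, not the full HL7 attribute sets:

```python
# Minimal RIM-backbone-shaped records; attribute choices are illustrative.
from dataclasses import dataclass, field

@dataclass
class Entity:            # a person, organization, place, or thing
    class_code: str      # e.g. "PSN" for person
    name: str

@dataclass
class Role:              # an Entity playing a role, e.g. patient
    class_code: str
    player: Entity

@dataclass
class Participation:     # links a Role to an Act (author, subject, ...)
    type_code: str
    role: Role

@dataclass
class Act:               # a clinical action: observation, procedure, ...
    class_code: str      # drawn from a mandatory vocabulary domain
    mood_code: str       # e.g. "EVN" (event) or "RQO" (request)
    participations: list = field(default_factory=list)

patient = Role("PAT", Entity("PSN", "J. Doe"))
obs = Act("OBS", "EVN")
obs.participations.append(Participation("SBJ", patient))
```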
4999 Dimensionality Reduction in Modal Analysis for Structural Health Monitoring
Authors: Elia Favarelli, Enrico Testi, Andrea Giorgetti
Abstract:
Autonomous structural health monitoring (SHM) of structures and bridges has become a topic of paramount importance for maintenance purposes and safety reasons. This paper proposes a set of machine learning (ML) tools to perform automatic feature selection and anomaly detection in a bridge from vibrational data, and compares different feature extraction schemes to increase accuracy and reduce the amount of data collected. As a case study, the Z-24 bridge is considered because of its extensive database of accelerometric data in both standard and damaged conditions. The proposed framework starts from the first four fundamental frequencies extracted through operational modal analysis (OMA) and clustering, followed by time-domain filtering (tracking). The extracted fundamental frequencies are then fed to a dimensionality reduction block implemented through two different approaches: feature selection (an intelligent multiplexer) that tries to estimate the most reliable frequencies based on the evaluation of some statistical features (i.e., entropy, variance, kurtosis), and feature extraction (an auto-associative neural network, ANN) that combines the fundamental frequencies to extract new damage-sensitive features in a low-dimensional feature space. Finally, one-class classification (OCC) algorithms perform anomaly detection, trained with standard-condition points and tested with both normal and anomalous ones. In particular, principal component analysis (PCA), kernel principal component analysis (KPCA), and the auto-associative neural network (ANN) are presented and their performances compared. It is also shown that, by evaluating the correct features, the anomaly can be detected with both accuracy and F1 score greater than 95%.
Keywords: Anomaly detection, dimensionality reduction, frequencies selection, modal analysis, neural network, structural health monitoring, vibration measurement.
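A hedged sketch of the PCA-based one-class classifier idea: fit PCA on standard-condition frequencies, then flag test points whose reconstruction error exceeds a threshold. The frequency values below are hypothetical stand-ins, not the Z-24 data.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
train    = rng.normal([3.9, 5.0, 9.8, 10.3], 0.02, size=(500, 4))   # "healthy" Hz
test_ok  = rng.normal([3.9, 5.0, 9.8, 10.3], 0.02, size=(100, 4))
test_dmg = rng.normal([3.7, 4.8, 9.5, 10.0], 0.02, size=(100, 4))   # shifted freqs

pca = PCA(n_components=2).fit(train)

def recon_error(X):
    """Distance between a point and its projection onto the PCA subspace."""
    return np.linalg.norm(X - pca.inverse_transform(pca.transform(X)), axis=1)

threshold = np.percentile(recon_error(train), 99)
print("false alarms:", np.mean(recon_error(test_ok) > threshold))
print("detections  :", np.mean(recon_error(test_dmg) > threshold))
```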
4998 A Settlement Strategy for Health Facilities in Emerging Countries: A Case Study in Brazil
Authors: Domenico Chizzoniti, Monica Moscatelli, Letizia Cattani, Piero Favino, Luca Preis
Abstract:
A settlement strategy anticipates and responds to the needs of existing and future communities through the provision of primary health care facilities in marginalized areas. Access to a health care network is important for improving healthcare coverage, which is often lacking in developing countries. The study shows that a good health-system strategy for rural contexts brings advantages to an existing settlement: improving transport, communication, water, and social facilities. The objective of this paper is to define a possible methodology for implementing primary health care facilities in disadvantaged areas of emerging countries. In this research, we analyze the case study of Lauro de Freitas, a municipality in the Brazilian state of Bahia and part of the Metropolitan Region of Salvador, with an area of 57.662 km² and 194,641 inhabitants. The health localization system in Lauro de Freitas is an integrated process that involves not only geographical aspects but also a set of factors: population density, epidemiological data, allocation of services, road networks, and more. Data were also collected using semi-structured interviews and questionnaires administered to the local population. The synthesized data suggest that, moving away from the coast where population and services are most concentrated, a network of primary health care facilities can improve the living conditions of small, dispersed communities. Based on the health service needs of populations, we have developed a methodological approach that is particularly useful in rural and remote contexts in emerging countries.
Keywords: Primary health care, developing countries, policy health planning, settlement strategy.
4997 Generic Model for Timetabling Problems by Integer Linear Programming Approach
Authors: N. A. H. Aizam, V. Uvaraja
Abstract:
The schedule showing the times at which certain tasks are to be performed is known as a timetable, and timetabling is widely used in many sectors such as transportation, education, and production. Difficulties arise in ensuring that all tasks happen at the time and place allocated, and researchers have therefore developed various programming models to solve scheduling problems from several fields. However, few studies have developed a general integer programming model covering many timetabling problems. This study describes the creation of a general model that solves different types of timetabling problems by considering their basic constraints. Initially, the common basic constraints from five different fields were selected and analyzed. A general basic integer programming model was created and then verified using a medium-sized, randomly generated data set closely resembling realistic data. The mathematical software AIMMS, with CPLEX as the solver, was used to solve the model. The resulting model is significant for solving many timetabling problems easily, since it can be adapted to all types of scheduling problems that share the same basic constraints.
Keywords: AIMMS mathematical software, integer linear programming, scheduling problems, timetabling.
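A hedged toy version of the basic constraints such a model shares, expressed in PuLP rather than AIMMS/CPLEX (task and slot names are hypothetical): each task is assigned exactly one timeslot, and clashing tasks may not share one.

```python
import pulp

tasks = ["T1", "T2", "T3"]
slots = ["S1", "S2"]
conflicts = [("T1", "T2")]                        # basic clash constraint

prob = pulp.LpProblem("timetable", pulp.LpMinimize)
x = pulp.LpVariable.dicts("x", (tasks, slots), cat="Binary")

prob += pulp.lpSum(x[t][s] for t in tasks for s in slots)   # trivial objective
for t in tasks:                                   # each task scheduled exactly once
    prob += pulp.lpSum(x[t][s] for s in slots) == 1
for a, b in conflicts:                            # clashing tasks get different slots
    for s in slots:
        prob += x[a][s] + x[b][s] <= 1

prob.solve()
print({t: next(s for s in slots if x[t][s].value() == 1) for t in tasks})
```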
4996 An Examination of the Factors Affecting the Adoption of Cloud Enterprise Resource Planning Systems in Egyptian Companies
Authors: Mayar A. Omar, Ismail Gomaa, Heba Badawy, Hosam Moubarak
Abstract:
Enterprise resource planning (ERP) is an integrated system that helps companies manage their resources. There are two types of ERP systems: traditional ERP systems and cloud ERP systems, the latter introduced after the development of cloud computing technology. This research aims to identify the factors that affect the adoption of cloud ERP in Egyptian companies, to provide guidance to Egyptian companies in the cloud ERP adoption decision, and to add to the small number of cloud ERP studies conducted in the Middle East and in developing countries. The many factors influencing the adoption of cloud ERP in Egyptian organizations are discussed and explained in the research, and examined by combining the Diffusion of Innovation (DOI) theory and the technology-organization-environment (TOE) framework. Data were collected through a survey developed from constructs in existing studies of cloud computing and cloud ERP technologies, modified to fit this research. The data were analyzed with Structural Equation Modeling (SEM) using Smart PLS software for the empirical analysis of the research model.
Keywords: cloud computing, cloud ERP systems, DOI, Egypt, SEM, TOE
4995 A Secure Auditing Framework for Load Balancing in Cloud Environment
Authors: R. Geetha, T. Padmavathy
Abstract:
A security audit is an important consideration for cloud service customers. It is basically a certification process that audits the controls delivering the security requirements. Security audits are conducted by trained and qualified staff belonging to an independent auditing organization, and must be carried out against a standard of security controls. Proper checks must be made that the cloud user has adequate reporting and logging facilities within the customer's system, ensuring an appropriate business and operational flow of data through the cloud service. We propose a cloud-based secure auditing framework, which enables trusted authorities to securely store their secret data with semi-trusted cloud service providers and selectively share that data with a wide range of data recipients, reducing the key-management complexity for data owners and recipients. Unlike previous cloud-based data frameworks, data owners upload their secret data to the cloud using both static and dynamic auditing schemes. A further refinement is that if a data recipient needs to download an individual record, the recipient sends the request to the authority, and the data owner retains access control. When the owner must share the original record with the data recipient, the owner approves the recipient's request; once the request is approved, the recipient downloads the original record, and the record transfer time and download time, with their dates, are monitored by the auditor. In addition to the deduplication concept, reduced cloud memory usage through dynamic document distribution is proposed.
Keywords: Cloud computing, cloud storage auditing, data integrity, key exposure.
4994 Modeling and Simulation of a Hybrid System Solar Panel and Wind Turbine in the Quingeo Heritage Center in Ecuador
Authors: Juan Portoviejo Brito, Daniel Icaza Alvarez, Christian Castro Samaniego
Abstract:
In this article, we present the modeling, simulation, and energy conversion analysis of a solar-wind system for the Quingeo Heritage Center in Ecuador. A numerical model was constructed based on 19 equations, coded in MATLAB R2017a, and the results were compared with experimental data from the site. The model was built to serve as a computational tool for optimizing resources and designing hybrid systems in the Parish of Quingeo and its surroundings. The model produced patterns quite similar to the data and curves obtained experimentally in the field and detailed in the manuscript. It is important to note that this analysis has been carried out so that, in the near future, one or two of these power generation systems can be deployed at scale, subject to the budget assigned by the Parish GAD of Quingeo or by other national or international organizations, with the purpose of preserving this colonial quarter, unique in Ecuador.
Keywords: Hybrid system, wind turbine, modeling, simulation, Smart Grid, Quingeo Azuay Ecuador.
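For orientation, a hedged sketch of the two textbook conversion equations such a hybrid model builds on; the paper's full 19-equation model includes much more (losses, control, storage), and the numbers below are illustrative only.

```python
import math

def wind_power_w(rho, rotor_diameter_m, v_wind_ms, cp):
    """P = 0.5 * rho * A * Cp * v^3 (air density, swept area, power coefficient)."""
    area = math.pi * (rotor_diameter_m / 2.0)**2
    return 0.5 * rho * area * cp * v_wind_ms**3

def pv_power_w(panel_area_m2, irradiance_w_m2, efficiency):
    """P = eta * A * G for a photovoltaic panel."""
    return efficiency * panel_area_m2 * irradiance_w_m2

print(wind_power_w(1.225, 3.0, 8.0, 0.40))   # small turbine, ~887 W
print(pv_power_w(1.6, 800.0, 0.17))          # one panel, ~218 W
```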
4993 Negative Selection as a Means of Discovering Unknown Temporal Patterns
Authors: Wanli Ma, Dat Tran, Dharmendra Sharma
Abstract:
The temporal nature of negative selection is an under-exploited area. In a negative selection system, newly generated antibodies go through a maturing phase, and the survivors of that phase then wait to be activated by incoming antigens after a certain number of matches. Those without enough matches age and die, while those with enough matches (i.e., those that are activated) become active detectors. A currently active detector may also age and die if it cannot find any match within a pre-defined (lengthy) period of time. Therefore, what matters in a negative selection system is the dynamics of the involved parties in the current time window, not the whole time duration, which may be up to eternity. This property has the potential to define the uniqueness of negative selection in comparison with other approaches. On the other hand, a negative selection system is trained only with "normal" data samples; it has to learn and discover unknown "abnormal" data patterns on the fly by itself. Consequently, it is more appropriate to utilize negative selection as a system for pattern discovery and recognition rather than just pattern recognition. In this paper, we study the potential of using negative selection to discover unknown temporal patterns.
Keywords: Artificial Immune Systems, Computational Intelligence, Negative Selection, Pattern Discovery.
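A minimal negative-selection sketch over binary strings: candidate detectors matching any "self" sample die in the maturing phase, and survivors flag non-self patterns. The Hamming-distance matching rule and all sizes are assumptions for illustration (the temporal aging dynamics the paper emphasizes are omitted).

```python
import numpy as np

rng = np.random.default_rng(1)
L, RADIUS = 16, 3                                  # string length, match radius

def matches(detector, sample):
    return np.count_nonzero(detector != sample) <= RADIUS

self_set = rng.integers(0, 2, size=(50, L))        # trained with "normal" data only

detectors = []
while len(detectors) < 20:                          # maturing phase
    cand = rng.integers(0, 2, size=L)
    if not any(matches(cand, s) for s in self_set):
        detectors.append(cand)                      # survivor becomes a detector

unknown = rng.integers(0, 2, size=L)
print("anomalous" if any(matches(d, unknown) for d in detectors) else "normal")
```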
4992 Coastal Ecological Sensitivity and Risk Assessment: A Case Study of Sea Level Change in Apodi River (Atlantic Ocean), Northeast Brazil
Authors: Mukesh Singh Boori, Venerando Eustáquio Amaro, Helenice Vital
Abstract:
The present study was carried out to calculate the coastal vulnerability index (CVI), identify areas of high and low sensitivity, and map the area of inundation due to future sea-level rise (SLR). Both conventional and remotely sensed data were used and analyzed through a modeling technique. Of the total study area, 8.26% falls in the very high risk category, 14.21% high, 9.36% medium, 22.46% low, and 7.35% very low, based on the coastal components. The inundation analysis indicates that 225.2 km² and 397 km² of land will be submerged by flooding at the 1 m and 10 m inundation levels, respectively, with the residential, industrial, and recreational sectors expected to be the most severely affected. As this coast is planned for future coastal development, measures such as regulation of industrialization and building, urban growth planning, agriculture management, development of integrated coastal zone management, strict enforcement of the Coastal Regulation Zone (CRZ) Act, monitoring of impacts, and further research are recommended for the study area.
Keywords: Coastal planning, land use, satellite data, vulnerability.
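The abstract does not state which CVI formulation was used; as a hedged reference point, the sketch below shows the common USGS formulation, the square root of the product of six ranked variables divided by six, with illustrative ranks only.

```python
import math

def cvi(geomorphology, shoreline_change, slope, slr_rate, wave_height, tide_range):
    """USGS-style CVI; each argument is a rank from 1 (very low) to 5 (very high)."""
    product = (geomorphology * shoreline_change * slope *
               slr_rate * wave_height * tide_range)
    return math.sqrt(product / 6.0)

# Hypothetical shoreline segment, all ranks illustrative:
print(cvi(4, 3, 5, 2, 3, 2))  # ~= 11.0
```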
4991 Implementation and Demonstration of Software-Defined Traffic Grooming
Authors: Lei Guo, Xu Zhang, Weigang Hou
Abstract:
Since the traditional network is closed and has no architecture for creating applications, it has been unable to evolve with changing demands amid rapid innovation in services. Additionally, because no whole-network profile is available, quality of service cannot be well guaranteed in the traditional network. The Software Defined Network (SDN) utilizes global resources to support on-demand applications and services via open, standardized, and programmable interfaces. In this paper, we implement a traffic grooming application in a real SDN environment and analyze it. In our SDN: 1) we use the OpenFlow protocol to control the entire network through software applications running on the network operating system; 2) several virtual switches are combined into the data forwarding plane through Open vSwitch; 3) an OpenFlow controller, NOX, serves as a logically centralized control plane that dynamically configures the data forwarding plane; and 4) traffic grooming based on SDN is demonstrated by dynamically modifying the idle time of flow entries. The experimental results demonstrate that SDN-based traffic grooming effectively reduces the end-to-end delay, with an improvement ratio reaching 99%.
Keywords: NOX, OpenFlow, software defined network, traffic grooming.
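A plain-Python sketch of the mechanism in point 4, with no NOX/OpenFlow API assumed: flow entries carry an idle timeout, and entries whose idle time is exceeded are evicted, freeing capacity for aggregation. All names and timeouts are hypothetical.

```python
import time

class FlowEntry:
    def __init__(self, match, idle_timeout_s):
        self.match = match
        self.idle_timeout_s = idle_timeout_s       # the knob the paper tunes
        self.last_hit = time.monotonic()

    def hit(self):                                 # called when a packet matches
        self.last_hit = time.monotonic()

    def expired(self, now):
        return now - self.last_hit > self.idle_timeout_s

table = {"10.0.0.1->10.0.0.2": FlowEntry("10.0.0.1->10.0.0.2", idle_timeout_s=2.0)}

def groom(table):
    """Evict idle entries so short-lived flows stop holding forwarding resources."""
    now = time.monotonic()
    return {k: e for k, e in table.items() if not e.expired(now)}

table = groom(table)
```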
4990 Analysis of Target Location Estimation in High Performance Radar System
Authors: Jin-Hyeok Kim, Won-Chul Choi, Seung-Ri Jin, Dong-Jo Park
Abstract:
In this paper, an analysis of a target location estimation system using the best linear unbiased estimator (BLUE) for high performance radar systems is presented. In synthetic environments, we are concerned with three key elements of radar system modeling that make radar systems operate accurately in strategic situations on virtual terrain. Radar cross section (RCS) modeling is used to determine the actual amount of electromagnetic energy reflected from a tactical object. The pattern propagation factor (PPF) is an attenuation coefficient of the radar equation that accounts for reflection from the surface of the earth, diffraction, refraction, and scattering by the atmospheric environment. Clutter refers to the unwanted echoes picked up by electronic systems. To fuse the outputs of radar detection in the synthetic environment, BLUE is used and compared with the mean values of the individual simulation results. Simulation results demonstrate the performance of the radar system.
Keywords: Best linear unbiased estimator (BLUE), data fusion, radar system modeling, target location estimation.
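A hedged sketch of BLUE fusion for the scalar case: for independent unbiased measurements, the minimum-variance linear combination weights each measurement by the inverse of its variance. Values below are hypothetical.

```python
import numpy as np

z = np.array([102.3, 98.7, 100.9])        # hypothetical range estimates (m)
var = np.array([4.0, 9.0, 2.25])          # corresponding measurement variances

w = (1.0 / var) / np.sum(1.0 / var)       # inverse-variance weights, sum to 1
x_hat = np.dot(w, z)                      # BLUE fused estimate
var_hat = 1.0 / np.sum(1.0 / var)         # fused variance, below every input's
print(x_hat, var_hat)
```

This is why the paper compares BLUE against the plain mean: the mean is the special case of equal weights, which is only optimal when all variances are equal.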
4989 Flexible Cities: A Multisided Spatial Application of Tracking Livability of Urban Environment
Authors: Maria Christofi, George Plastiras, Rafaella Elia, Vaggelis Tsiourtis, Theocharis Theocharides, Miltiadis Katsaros
Abstract:
The rapidly expanding urban areas of the world pose the challenge of how to make the transition to "the next urbanization", which will be defined by new analytical tools and new sources of data. This paper concerns the production of a spatial application, ‘FUMapp’, in which space and its use are captured both literally, in meters, and abstractly, at a sensed level. While existing spatial applications typically focus on illustrations of urban infrastructure, the suggested application goes beyond them: it investigates how our perception of the environment adapts to alterations of the built environment, through the construction of a dataset of biophysical measurements (eye tracking, heart rate) and physical metrics (spatial characteristics, size of stimuli, rhythm of mobility). It explores the intersections between architecture, cognition, and computing where future design can be improved, and identifies the flexibility and livability of the ‘available space’ along specific examined urban paths.
Keywords: Biophysical data, flexibility of urban, livability, next urbanization, spatial application.
4988 The Security Trade-Offs in Resource Constrained Nodes for IoT Application
Authors: Sultan Alharby, Nick Harris, Alex Weddell, Jeff Reeve
Abstract:
The concept of the Internet of Things (IoT) has received much attention over the last five years, and it is predicted that the IoT will influence every aspect of our lifestyles in the near future. Wireless sensor networks (WSNs) are one of the key enablers of IoT operation, allowing data to be collected from the surrounding environment. However, due to limited resources, the nature of deployment, and unattended operation, a WSN is vulnerable to various types of attack. Security is paramount for reliable and safe communication between IoT embedded devices, but it comes at a cost to resources. Nodes are usually equipped with small batteries, which makes energy conservation crucial to IoT devices; nevertheless, the security cost in terms of energy consumption has not been studied sufficiently. Previous research has used the security specification of IEEE 802.15.4 for IoT applications, but the energy cost of each security level and the impact on quality of service (QoS) parameters remain unknown. This research focuses on the cost of security at the IoT media access control (MAC) layer. It begins by studying the energy consumption of the IEEE 802.15.4 security levels, followed by an evaluation of the impact of security on data latency and throughput; it then presents the impact of transmission power on security overhead, and finally shows the effects of security on memory footprint. The results show that the security overhead in terms of energy consumption, with a payload of 24 bytes, ranges from 31.5% over non-secure packets at the minimum security level to 60.4% at the top security level of the 802.15.4 specification. They also show that the security cost has less impact at longer packet lengths and more at smaller packet sizes, and they depict a significant impact on data latency and throughput: overall, maximum-length authentication decreases throughput by almost 53%, and encryption and authentication together by almost 62%.
Keywords: Internet of Things, IEEE 802.15.4, security cost evaluation, wireless sensor network, energy consumption.
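For clarity on the overhead metric the results report, a tiny hedged example; the energy figures here are hypothetical, chosen only to reproduce the reported minimum-level percentage.

```python
# Security overhead as the extra energy of a secured transmission relative to a
# non-secure one. Inputs are hypothetical, not the paper's measurements.
def security_overhead_pct(energy_secure_uj, energy_plain_uj):
    return 100.0 * (energy_secure_uj - energy_plain_uj) / energy_plain_uj

print(security_overhead_pct(131.5, 100.0))  # 31.5% (cf. the reported minimum level)
```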
4987 Parametric Modeling Approach for Call Holding Times for IP based Public Safety Networks via EM Algorithm
Authors: Badarch Tuyatsetseg
Abstract:
This paper presents parametric probability density models for call holding times (CHTs) in an emergency call center, based on actual data collected over a week in the public Emergency Information Network (EIN) in Mongolia. When the chosen candidate distributions from the Gamma family are fitted to the call holding time data, the whole area of the empirical CHT histogram is underestimated, owing to spikes of higher probability and long tails of lower probability in the histogram. We therefore provide a parametric model based on a mixture of lognormal distributions, with explicit analytical expressions, for modeling the CHTs of public safety networks (PSNs). Finally, we show that the CHTs for PSNs are fitted reasonably by a mixture of lognormal distributions via the expectation-maximization (EM) algorithm. This result is significant in that it provides a useful mathematical tool in the explicit form of a mixture of lognormal distributions.
Keywords: A mixture of lognormal distributions, modeling call holding times, public safety network.
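A hedged sketch of the fitting step: a Gaussian mixture fitted by EM to log(CHT) is equivalent to a lognormal mixture on the CHT values themselves. The synthetic holding times below stand in for the EIN data.

```python
import numpy as np
from sklearn.mixture import GaussianMixture  # fitted internally by EM

rng = np.random.default_rng(2)
cht = np.concatenate([rng.lognormal(3.0, 0.4, 700),    # hypothetical call
                      rng.lognormal(4.5, 0.6, 300)])   # holding times (s)

gm = GaussianMixture(n_components=2).fit(np.log(cht).reshape(-1, 1))
print("weights  :", gm.weights_)
print("log-means:", gm.means_.ravel(), "log-sds:", np.sqrt(gm.covariances_).ravel())
```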
4986 Human Walking Vertical Force and Vertical Vibration of Pedestrian Bridge Induced by Its Higher Components
Authors: M. Yoneda
Abstract:
The purpose of this study is to identify the human walking vertical force by using the FFT power spectral density of experimental acceleration data of the human body. An experiment on human walking was carried out on a stationary floor, paying particular attention to the higher components of the dynamic vertical walking force. Based on measured acceleration data of the human lumbar region, not only in-phase components at frequencies of 2fw and 3fw, but also opposite-phase components at frequencies of 0.5fw, 1.5fw, and 2.5fw, where fw is the walking rate, are observed. The vertical vibration of a pedestrian bridge induced by these higher components of the human walking vertical force is also discussed in this paper. A full-scale measurement of an existing pedestrian bridge with a center span length of 33 m was carried out, focusing on the resonance phenomenon due to the higher components of the human walking vertical force. The dynamic response characteristics excited by these vertical higher components of human walking are revealed from the dynamic design viewpoint of pedestrian bridges.
Keywords: Simplified method, Human walking vertical force, Higher component, Pedestrian bridge vibration.
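A hedged sketch of a walking-force time series containing exactly the components the paper identifies: harmonics at 2fw and 3fw and sub-harmonic components at 0.5fw, 1.5fw, and 2.5fw. The amplitude coefficients are illustrative, not the measured values.

```python
import numpy as np

G, fw = 700.0, 2.0                       # body weight (N), walking rate (Hz)
t = np.arange(0, 5, 0.005)
coeffs = {1.0: 0.40, 2.0: 0.10, 3.0: 0.06,          # in-phase components
          0.5: 0.03, 1.5: 0.02, 2.5: 0.01}          # opposite-phase components

force = G * (1.0 + sum(a * np.sin(2 * np.pi * k * fw * t)
                       for k, a in coeffs.items()))
```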
4985 Improving 99mTc-tetrofosmin Myocardial Perfusion Images by Time Subtraction Technique
Authors: Yasuyuki Takahashi, Hayato Ishimura, Masao Miyagawa, Teruhito Mochizuki
Abstract:
Quantitative measurement of myocardial perfusion is possible with single photon emission computed tomography (SPECT) using a semiconductor detector. However, accumulation of 99mTc-tetrofosmin in the liver may make it difficult to assess perfusion accurately in the inferior myocardium. Our idea is to reduce the high liver accumulation by using dynamic SPECT imaging and a technique called time subtraction. We evaluated the performance of a new SPECT system with a cadmium-zinc-telluride solid-state semiconductor detector (Discovery NM 530c; GE Healthcare). Our system acquired list-mode raw data over 10 minutes for a typical patient, and from these data ten SPECT images were reconstructed, one for every minute of acquired data. Reconstruction with the semiconductor detector was based on an implementation of a 3-D iterative Bayesian reconstruction algorithm. We studied 20 patients with coronary artery disease (mean age 75.4 ± 12.1 years; range 42-86; 16 males and 4 females); in each subject, 259 MBq of 99mTc-tetrofosmin was injected intravenously. We performed both a phantom and a clinical study using dynamic SPECT. An approximation to a liver-only image is obtained by reconstructing an image from the early projections, during which the liver accumulation dominates (the 0.5-2.5 minute SPECT image minus the 5-10 minute SPECT image). The extracted liver-only image is then subtracted from a later SPECT image that shows both the liver and the myocardial uptake (the 5-10 minute SPECT image minus the liver-only image). The time subtraction of the liver was possible in both the phantom and the clinical study, and visualization of the inferior myocardium was improved. In past reports, uptake in the inferior myocardium overlapped by the liver was not diagnosable; using our time subtraction method, the image quality of the 99mTc-tetrofosmin myocardial SPECT image is considerably improved.
Keywords: 99mTc-tetrofosmin, dynamic SPECT, time subtraction, semiconductor detector.
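A hedged numpy sketch of the time-subtraction recipe exactly as stated in the abstract; random arrays stand in for the reconstructed frames, and clipping negatives is an assumption of this sketch, not a stated step.

```python
import numpy as np

early = np.random.rand(64, 64)   # stands in for the 0.5-2.5 min SPECT image
late  = np.random.rand(64, 64)   # stands in for the 5-10 min SPECT image

liver_only = np.clip(early - late, 0, None)       # early frames: liver dominates
corrected  = np.clip(late - liver_only, 0, None)  # late image minus liver estimate
```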
4984 Reduction of Plutonium Production in Heavy Water Research Reactor: A Feasibility Study through Neutronic Analysis Using MCNPX2.6 and CINDER90 Codes
Authors: H. Shamoradifar, B. Teimuri, P. Parvaresh, S. Mohammadi
Abstract:
One of the main characteristics of heavy water moderated reactors is their high production of plutonium. This article demonstrates the possibility of reducing plutonium and other actinides in a heavy water research reactor. Among the many ways of reducing plutonium production in a heavy water reactor, this research focuses on changing the fuel from natural uranium to a thorium-uranium mixed fuel. The main fissile nucleus in thorium-uranium fuels is U-233, which is produced after neutron absorption by Th-232, so thorium-uranium fuels have some known advantages over uranium fuels. Accordingly, four thorium-uranium fuels with different composition ratios were chosen for our simulations: a) 10% UO2-90% ThO2 (enrichment 20%); b) 15% UO2-85% ThO2 (enrichment 10%); c) 30% UO2-70% ThO2 (enrichment 5%); d) 35% UO2-65% ThO2 (enrichment 3.7%). Natural uranium oxide (UO2) is considered the reference fuel; in other words, all calculated data are compared with the corresponding data for uranium fuel. Neutronic parameters were calculated and used as the comparison parameters. All calculations were performed by Monte Carlo (MCNPX2.6) steady-state reaction-rate calculation linked to a deterministic depletion calculation (CINDER90). The computational data obtained showed that thorium-uranium fuels with the four different fissile composition ratios can satisfy the safety and operating requirements of a heavy water research reactor. Furthermore, thorium-uranium fuels have very good proliferation resistance and consume less fissile material than uranium fuels for the same reactor operation time. Using mixed thorium-uranium fuels reduced the long-lived, highly radiotoxic α-emitting wastes and the radiotoxicity level of the spent fuel.
Keywords: Burn-up, heavy water reactor, minor actinides, Monte Carlo, proliferation resistance.
4983 A Damage Level Assessment Model for Extra High Voltage Transmission Towers
Authors: Huan-Chieh Chiu, Hung-Shuo Wu, Chien-Hao Wang, Yu-Cheng Yang, Ching-Ya Tseng, Joe-Air Jiang
Abstract:
Power failure resulting from tower collapse due to violent seismic events can bring enormous, inestimable losses. The Chi-Chi earthquake, for example, struck Taiwan violently on September 21, 1999, causing huge damage to the power system: nearly 10% of extra high voltage (EHV) transmission towers were damaged. Seismic hazards to EHV transmission towers should therefore be monitored and evaluated. The ultimate goal of this study is to establish a damage level assessment model for EHV transmission towers. Earthquake data provided by the Taiwan Central Weather Bureau serve as a reference and lay the foundation for the subsequent earthquake simulations and analyses. Once an earthquake occurs, parameters related to the damage level at each point of an EHV tower are simulated and analyzed using data from monitoring stations. Through the Fourier transform, the seismic wave is decomposed into its frequency components, and the data are shown as a response spectrum; with this method, the seismic frequency that damages EHV towers the most is clearly identified. An estimation model is then built to determine the damage level caused by a future seismic event. Finally, instead of relying on visual observation by inspectors, the proposed model can provide a power company with damage information for a transmission tower; using the model, the manpower required for visual observation can be reduced and the accuracy of the damage level estimation substantially improved. Such a model is greatly useful for health and construction monitoring because of the advantages of long-term evaluation of structural characteristics and long-term damage detection.
Keywords: Smart grid, EHV transmission tower, response spectrum, damage level monitoring.
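As a hedged illustration of the response-spectrum step (the paper does not give its implementation), the sketch below computes an elastic pseudo-acceleration spectrum from a ground-motion record by Newmark average-acceleration integration of a unit-mass SDOF oscillator per period; the input record here is synthetic noise.

```python
import numpy as np

def response_spectrum(ag, dt, periods, zeta=0.05):
    """Peak pseudo-acceleration Sa = wn^2 * max|u| for each natural period (m = 1)."""
    gamma, beta = 0.5, 0.25                       # average-acceleration Newmark
    sa = []
    for T in periods:
        wn = 2.0 * np.pi / T
        k, c = wn**2, 2.0 * zeta * wn
        khat = k + gamma * c / (beta * dt) + 1.0 / (beta * dt**2)
        u, v = 0.0, 0.0
        a = -ag[0]                                # initial acceleration, u = v = 0
        umax = 0.0
        for p in -ag[1:]:                         # effective load: -m * ag, m = 1
            phat = (p + (1.0/(beta*dt**2) + gamma*c/(beta*dt)) * u
                      + (1.0/(beta*dt) + (gamma/beta - 1.0) * c) * v
                      + ((1.0/(2*beta) - 1.0) + dt*(gamma/(2*beta) - 1.0)*c) * a)
            u_new = phat / khat
            v_new = ((gamma/(beta*dt)) * (u_new - u) + (1 - gamma/beta) * v
                     + dt * (1 - gamma/(2*beta)) * a)
            a = (u_new - u)/(beta*dt**2) - v/(beta*dt) - (1.0/(2*beta) - 1.0)*a
            u, v = u_new, v_new
            umax = max(umax, abs(u))
        sa.append(wn**2 * umax)
    return np.array(sa)

ag = 0.1 * np.random.randn(2000)                  # hypothetical accelerogram, dt = 0.01 s
print(response_spectrum(ag, 0.01, periods=[0.2, 0.5, 1.0, 2.0]))
```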
4982 Diagnosis of Multivariate Process via Nonlinear Kernel Method Combined with Qualitative Representation of Fault Patterns
Authors: Hyun-Woo Cho
Abstract:
Fault detection and diagnosis of complicated production processes is one of the essential tasks needed to run a process safely with good final product quality, since unexpected events occurring in the process may have a serious impact on it. In this work, a triangular representation of process measurement data obtained on an on-line basis is evaluated using a simulated process, and the effect of using linear and nonlinear reduced spaces is also tested. Their diagnosis performance was demonstrated using multivariate fault data. The results show that the nonlinear-technique-based diagnosis method produced more reliable results and outperforms the linear method, and that the use of an appropriate reduced space yielded better diagnosis performance. The presented diagnosis framework differs from existing ones in that it attempts to extract the fault pattern in the reduced space, not in the original process variable space; the use of a reduced model space helps to mitigate the sensitivity of the fault pattern to noise.
Keywords: Real-time fault diagnosis, triangular representation of patterns in reduced spaces, nonlinear kernel technique, multivariate statistical modeling.
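A hedged sketch of the nonlinear reduced-space step using kernel PCA, the family of techniques the framework relies on; the triangular pattern representation itself is not reproduced here, and all data are synthetic.

```python
import numpy as np
from sklearn.decomposition import KernelPCA

rng = np.random.default_rng(3)
X_normal = rng.normal(0.0, 1.0, size=(200, 10))    # hypothetical process data
X_fault = X_normal[:20] + 3.0                       # shifted "fault" samples

kpca = KernelPCA(n_components=2, kernel="rbf", gamma=0.1).fit(X_normal)
scores_normal = kpca.transform(X_normal)            # the reduced space in which
scores_fault = kpca.transform(X_fault)              # fault patterns are examined
print(scores_normal.mean(axis=0), scores_fault.mean(axis=0))
```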
4981 Internal Force State Recognition of Jiujiang Bridge Based on Cable Force-displacement Relationship
Authors: Weifeng Wang, Guoqing Huang, Xianwei Zeng
Abstract:
The nearly 21-year-old Jiujiang Bridge, which suffers from an uneven line shape, persistent large downward deflection of the main beam, and cracking of the box girder, needs reinforcement and cable adjustment. It has undergone cable adjustment twice, with incomplete data, so identifying the initial internal force state of the Jiujiang Bridge is the key to the cable adjustment project. Based on parameter identification from static load test data, this paper suggests determining the initial internal force state of the cable-stayed bridge with a cable force-displacement parameter identification method: by measuring the displacement and the change in cable forces twice, the parameters concerned can be identified by optimization. This method is applied to the cable adjustment, replacement, and reinforcement project for the Jiujiang Bridge, serving as guidance for that work.
Keywords: Cable-stayed bridge, cable force-displacement, parameter identification, internal force state.
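A hedged sketch of the identification idea: find parameters that reproduce the measured cable-force changes via least-squares optimization. The linear influence-matrix model and all numbers below are stand-ins for illustration, not the bridge's actual structural model.

```python
import numpy as np
from scipy.optimize import least_squares

measured = np.array([1.2, 0.8, 1.5, 0.9])        # hypothetical force changes (MN)
influence = np.array([[0.9, 0.2], [0.3, 0.7],    # hypothetical influence matrix:
                      [1.0, 0.5], [0.4, 0.6]])   # response per unit parameter

def residuals(theta):
    return influence @ theta - measured          # model prediction minus data

sol = least_squares(residuals, x0=np.zeros(2))
print("identified parameters:", sol.x)
```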
4980 The Mechanistic Deconvolutive Image Sensor Model for an Arbitrary Pan–Tilt Plane of View
Authors: S. H. Lim, T. Furukawa
Abstract:
This paper presents a generalized form of the mechanistic deconvolution technique (GMD) for modeling image sensors applicable to various pan–tilt planes of view. The mechanistic deconvolution technique (UMD) is modified with the given angles of a pan–tilt plane of view to formulate constraint parameters, characterize distortion effects, and thereby determine the corrected image data; as a result, no experimental setup or calibration is required. Due to the mechanistic nature of the sensor model, the sensor image plane need not be orthogonal to its z-axis, and the dependency on image data is reduced. An experiment was constructed to evaluate the accuracy of a model created by GMD and its insensitivity to changes in sensor properties and in pan and tilt angles, compared with a pre-calibrated model and a model created by UMD using two sensors with different specifications. The GMD model achieved similar accuracy with one-seventh the number of iterations, and attained a mean error lower by a factor of 2.4, compared with the pre-calibrated and UMD models respectively. The model has also shown itself to be robust and, in comparison to the pre-calibrated and UMD models, improved the accuracy significantly.
Keywords: Image sensor modeling, mechanistic deconvolution, calibration, lens distortion.
4979 The Role of Gender and Age on Students' Perceptions towards Online Education Case Study: Sakarya University, Vocational High School
Authors: Fahme Dabaj, Havva Başak
Abstract:
The aim of this study is to determine and analyze the role of gender and age in students' perceptions of the distance online program offered by the Vocational High School at Sakarya University. The research is based on a questionnaire as the data collection method, and the study proceeded by finding relationships between the variables in the data collection instrument. The findings revealed that although the students registered for the online program voluntarily, they preferred traditional face-to-face education, owing to the difficulty of nonverbal communication online, their lack of competence with the required technology, and their stronger belief in traditional face-to-face learning than in online education. Regarding gender, the results showed that female students have a better perception of online education than male students. Regarding age, the results showed that the older the students, the stronger their preference for attending face-to-face classes.
Keywords: Distance education, online education, internet education, student perceptions.
4978 Using Statistical Significance and Prediction to Test Long/Short Term Public Services and Patients Cohorts: A Case Study in Scotland
Authors: Sotirios Raptis
Abstract:
Health and social care (HSc) services planning and scheduling face unprecedented challenges due to pandemic pressure, and also suffer from unplanned spending worsened by the global financial crisis. Data-driven approaches can help to improve policies and to plan and design service provision schedules, using algorithms that assist healthcare managers in facing unexpected demands with fewer resources. This paper discusses packing services together using statistical significance tests and machine learning (ML) to evaluate the similarity and coupling of demands. This is achieved by predicting the range of a demand (its class) using ML methods such as Classification and Regression Trees (CART), Random Forests (RF), and Logistic Regression (LGR), and by applying the Chi-squared and Student's t significance tests to data spanning 39 years of services delivered in Scotland. The demands are associated using probabilities and form parts of statistical hypotheses; these hypotheses, as their null part, assume that the target demand is statistically dependent on the demands of other services, and this linking is checked against the data. In addition, ML methods are used to linearly predict the target demands from the statistically identified associations and to extend the linear dependence of the target demand to independent demands, thus forming groups of services. The statistical tests confirmed the ML coupling, made the predictions statistically meaningful, and proved that a target service can be matched reliably to other services, while ML showed that such marked relationships can also be linear. Zero padding was used for missing yearly records and better illustrated these relationships, both over limited periods and over the entire span, offering long-term data visualizations; limited-year periods showed how well patient numbers can be related over short periods of time, or how they change over time, as opposed to behaviours across longer spans. The prediction performance of the associations was measured using metrics such as the Receiver Operating Characteristic (ROC), the Area Under the Curve (AUC), and Accuracy (ACC), as well as the Chi-squared and Student's t tests. Co-plots and comparison tables for the RF, CART, and LGR methods, p-values from the tests, and Information Exchange (IE/MIE) measures are provided, showing the relative performance of the ML methods and the statistical tests, as well as the behaviour at different learning ratios. The impact of initial groupings by k-nearest neighbours classification (k-NN), Cross-Correlation (CC), and C-Means (CM) was also studied over limited years and over the entire span. It was found that CART generally lagged behind RF and LGR, but in some interesting cases LGR reached an AUC = 0, falling below CART, while the ACC was as high as 0.912, showing that ML methods can be confused by zero padding, by irregularities in the data, or by outliers. On average, three linear predictors were sufficient; LGR was found to compete well with RF, and CART followed with the same performance at higher learning ratios. Services were packed only when the significance level (p-value) of their association coefficient was more than 0.05. Social-factor relationships were observed between home care services and the treatment of old people, low birth weights, alcoholism, drug abuse, and emergency admissions.
The work found that different HSc services can be packed well into plans of limited duration across various service sectors and learning configurations, as confirmed by the statistical hypotheses.
Keywords: Class, cohorts, data frames, grouping, prediction, probabilities, services.
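A hedged sketch of the paper's two-step pattern, a significance test of association between demand series followed by ML prediction of a demand class scored with AUC; all data below are synthetic placeholders, not the Scottish records.

```python
import numpy as np
from scipy.stats import chi2_contingency
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(4)
other_demands = rng.normal(size=(300, 3))                 # predictor services
target_class = (other_demands[:, 0] + 0.5 * rng.normal(size=300) > 0).astype(int)

# Step 1: significance test of association on binned demands
table = np.histogram2d(other_demands[:, 0], target_class, bins=(4, 2))[0]
chi2, p, dof, _ = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, p = {p:.4f}")

# Step 2: predict the demand class and score with AUC
for model in (RandomForestClassifier(random_state=0), LogisticRegression()):
    model.fit(other_demands[:150], target_class[:150])
    proba = model.predict_proba(other_demands[150:])[:, 1]
    print(type(model).__name__, "AUC =",
          round(roc_auc_score(target_class[150:], proba), 3))
```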