Search results for: Neutronic Calculation
487 The Dynamics of a 3D Vibrating and Rotating Disc Gyroscope
Authors: Getachew T. Sedebo, Stephan V. Joubert, Michael Y. Shatalov
Abstract:
The conventional configuration of the vibratory disc gyroscope is based on in-plane non-axisymmetric vibrations of the disc with a prescribed circumferential wave number. Due to Bryan's effect, the vibrating pattern of the disc becomes sensitive to the axial component of inertial rotation of the disc. Rotation of the vibrating pattern relative to the disc is proportional to the inertial angular rate and is measured by sensors. In the present paper, the authors investigate the possibility of making a 3D sensor on the basis of both in-plane and bending vibrations of the disc resonator. We derive equations of motion for the disc vibratory gyroscope, where both in-plane and bending vibrations are considered. The Hamiltonian variational principle is used in setting up the equations of motion and the corresponding boundary conditions. The theory of thin shells together with linear elasticity principles is used in formulating the problem, and the disc is assumed to be isotropic and to obey Hooke's law. The governing equation for a specific mode is converted to an ODE to determine the eigenfunction. The resulting ODE has an exact solution as a linear combination of Bessel and Neumann functions. We demonstrate how to obtain an explicit solution and hence the eigenvalues and corresponding eigenfunctions for an annular disc with a fixed inner boundary and a free outer boundary. Finally, the characteristic equations are obtained and the corresponding eigenvalues are calculated. The eigenvalues are used for the calculation of the tuning conditions of the 3D disc vibratory gyroscope.
Keywords: Bryan’s effect, bending vibrations, disc gyroscope, eigenfunctions, eigenvalues, tuning conditions
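As an illustration of the eigenvalue step described above, the sketch below finds the roots of a Bessel/Neumann characteristic equation for an annular region fixed at the inner radius and free at the outer one. It uses simplified membrane-type boundary conditions and assumed radii and wave number, not the paper's full shell formulation.

```python
# Sketch: eigenvalues of an annular region fixed at r = a and free at r = b, using
# a simplified (membrane-type) characteristic equation
#   J_n(k a) * Y_n'(k b) - Y_n(k a) * J_n'(k b) = 0
# as a stand-in for the full shell problem described in the abstract.
import numpy as np
from scipy.special import jv, yv, jvp, yvp
from scipy.optimize import brentq

a, b, n = 0.05, 0.20, 2          # inner/outer radius (m) and wave number (assumed values)

def char_eq(k):
    return jv(n, k * a) * yvp(n, k * b) - yv(n, k * a) * jvp(n, k * b)

# Bracket sign changes of the characteristic function and refine each root.
ks = np.linspace(1.0, 200.0, 4000)
vals = char_eq(ks)
roots = [brentq(char_eq, ks[i], ks[i + 1])
         for i in range(len(ks) - 1) if vals[i] * vals[i + 1] < 0]
print("First eigenvalues k:", np.round(roots[:4], 3))
```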
Procedia PDF Downloads 322
486 Identifying Knowledge Gaps in Incorporating Toxicity of Particulate Matter Constituents for Developing Regulatory Limits on Particulate Matter
Authors: Ananya Das, Arun Kumar, Gazala Habib, Vivekanandan Perumal
Abstract:
Regulatory bodies have proposed limits on Particulate Matter (PM) concentration in air; however, these limits do not explicitly incorporate the toxic effects of the constituents of PM. This study aimed to provide a structured approach to incorporate the toxic effects of components in developing regulatory limits on PM. A four-step human health risk assessment framework was used, consisting of: (1) hazard identification (parameters: PM and its constituents and their associated toxic effects on health), (2) exposure assessment (parameters: concentrations of PM and constituents, information on size and shape of PM; fate and transport of PM and constituents in the respiratory system), (3) dose-response assessment (parameters: reference dose or target toxicity dose of PM and its constituents), and (4) risk estimation (metric: hazard quotient and/or lifetime incremental risk of cancer, as applicable). The parameters required at each step were then obtained from the literature. Using this information, an attempt has been made to determine limits on PM using component-specific information. An example calculation was conducted for exposures to PM2.5 and its metal constituents in the Indian ambient environment to determine the limit on PM values. The identified data gaps were: (1) concentrations of PM and its constituents and their relationship with sampling regions, and (2) the relationship of the toxicity of PM with its components.
Keywords: air, component-specific toxicity, human health risks, particulate matter
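A minimal sketch of the risk estimation step (step 4), under the assumption that the hazard quotient is the ratio of the exposure concentration to a reference concentration and that the cancer risk is the exposure multiplied by an inhalation unit risk; all numerical values are placeholders, not data from the study.

```python
# Sketch of step (4), risk estimation, for a single PM constituent.
# All input values below are hypothetical placeholders.

def hazard_quotient(exposure_conc_ug_m3: float, rfc_ug_m3: float) -> float:
    """Non-cancer hazard quotient: exposure concentration / reference concentration."""
    return exposure_conc_ug_m3 / rfc_ug_m3

def incremental_cancer_risk(exposure_conc_ug_m3: float, unit_risk_per_ug_m3: float) -> float:
    """Lifetime incremental cancer risk: exposure concentration x inhalation unit risk."""
    return exposure_conc_ug_m3 * unit_risk_per_ug_m3

# Hypothetical metal constituent of PM2.5
hq = hazard_quotient(exposure_conc_ug_m3=0.012, rfc_ug_m3=0.1)
risk = incremental_cancer_risk(exposure_conc_ug_m3=0.012, unit_risk_per_ug_m3=1.8e-3)
print(f"HQ = {hq:.3f} (concern if > 1), ILCR = {risk:.2e} (often compared to 1e-6)")
```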
Procedia PDF Downloads 311
485 Machine Learning Based Approach for Measuring Promotion Effectiveness in Multiple Parallel Promotions’ Scenarios
Authors: Revoti Prasad Bora, Nikita Katyal
Abstract:
Promotion is a key element in the retail business. Thus, analysis of promotions to quantify their effectiveness in terms of Revenue and/or Margin is an essential activity in the retail industry. However, measuring the sales/revenue uplift is based on estimations, as the actual sales/revenue without the promotion is not observed. Further, the presence of Halo and Cannibalization in a multiple parallel promotions' scenario complicates the problem. Calculating the Baseline from inter-brand/competitor items, or estimating the impact of Halo and Cannibalization on Revenue by treating the Baseline as the items' unit sales in neighboring non-promotional weeks taken individually, may not capture the overall Revenue uplift in the case of multiple parallel promotions. Hence, this paper proposes a machine learning based method for calculating the Revenue uplift by considering the Halo and Cannibalization impact on the Baseline and the Revenue. In the first section of the proposed methodology, the Baseline of an item is calculated by incorporating the impact of the promotions on its related items. In the later section, the Revenue of an item is calculated by considering both Halo and Cannibalization impacts. Hence, this methodology enables correct calculation of the overall Revenue uplift due to a given promotion.
Keywords: Halo, Cannibalization, promotion, Baseline, temporary price reduction, retail, elasticity, cross price elasticity, machine learning, random forest, linear regression
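A minimal sketch of the idea, assuming a random-forest Baseline trained on non-promotional weeks with the promotion flag of a related item as a feature (to carry Halo/Cannibalization information), and the uplift taken as observed Revenue minus the predicted Baseline; the data and feature names are synthetic illustrations, not the paper's actual pipeline.

```python
# Sketch: machine-learning Baseline for one item, using promotion flags of related
# (halo / cannibalization) items as features, then Revenue uplift = actual - baseline.
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
weeks = pd.DataFrame({
    "item_promo":    rng.integers(0, 2, 120),   # promotion flag of the item itself
    "related_promo": rng.integers(0, 2, 120),   # promotion flag of a related item
    "week_of_year":  np.arange(120) % 52,
})
weeks["revenue"] = 1000 + 30 * np.sin(weeks["week_of_year"] / 8.0) \
    + 250 * weeks["item_promo"] - 80 * weeks["related_promo"] + rng.normal(0, 20, 120)

# Train the baseline model only on non-promotional weeks of the item.
train = weeks[weeks["item_promo"] == 0]
features = ["related_promo", "week_of_year"]
model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(train[features], train["revenue"])

# Uplift in promotional weeks = observed revenue - predicted (promo-free) baseline.
promo = weeks[weeks["item_promo"] == 1]
uplift = promo["revenue"] - model.predict(promo[features])
print(f"Estimated total revenue uplift: {uplift.sum():.0f}")
```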
Procedia PDF Downloads 177
484 The Analysis of Education Sector and Poverty Alleviation with Benefit Incidence Analysis Approach Budget Allocation Policy in East Java
Authors: Wildan Syafitri
Abstract:
The main purpose of development is to realize public welfare. It is indicated by increasing public prosperity, which is related to the consumption level as a consequence of rising public income. One of the government's efforts to increase public welfare is to create development equity in order to alleviate poverty. The problem of poverty is not merely about the number and percentage of poor people; it also includes the gap and severity of poverty. The analysis method used is Benefit Incidence Analysis (BIA), a method used to disclose the impact of government policy or individual access based on the income distribution in society. Further, the findings of the study reveal that the largest group of poor people in the villages are those who are unemployed and have family members who are still in Junior High School. The income distribution calculation shows a fairly good budget allocation, with a ratio of 0.31. In addition, the findings of this study also disclose that the Indonesian Government's policy to subsidize education costs for Elementary and Junior High School students has reached the right target. This is indicated by more benefits being received by Elementary and Junior High School students who are poor and very poor than by other income groups.
Keywords: benefit incidence analysis, budget allocation, poverty, education
Procedia PDF Downloads 393
483 A Case Study for User Rating Prediction on Automobile Recommendation System Using MapReduce
Authors: Jiao Sun, Li Pan, Shijun Liu
Abstract:
Recommender systems have been widely used in contemporary industry, and plenty of work has been done in this field to help users identify items of interest. The Collaborative Filtering (CF, for short) algorithm is an important technology in recommender systems. However, less work has been done on automobile recommendation systems, despite the sharp increase in the number of automobiles. What's more, computational speed is a major weakness of collaborative filtering technology. Therefore, using the MapReduce framework to optimize the CF algorithm is a vital solution to this performance problem. In this paper, we present a recommendation of users' comments on industrial automobiles with various properties, based on real-world industrial datasets of user-automobile comment data, and provide recommendations for automobile providers to help them predict users' comments on automobiles with newly introduced properties. First, we address the sparseness of the matrix using a prior construction of the score matrix. Second, we solve the data normalization problem by removing dimensional effects from the raw data of automobiles, where the different dimensions of automobile properties bring great error to the calculation of CF. Finally, we use the MapReduce framework to optimize the CF algorithm, and the computational speed is substantially improved. UV decomposition, used in this paper, is a commonly used matrix factorization technique in CF algorithms; it does not require calculating the interpolation weights of neighbors, which is more convenient in industry.
Keywords: collaborative filtering, recommendation, data normalization, mapreduce
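A sequential NumPy sketch of UV decomposition over the observed entries of a sparse rating matrix; the paper distributes this computation with MapReduce, and the data here are synthetic.

```python
# Sketch of UV decomposition on the observed entries of a sparse user-automobile
# rating matrix (sequential version; the paper parallelizes this with MapReduce).
import numpy as np

rng = np.random.default_rng(1)
n_users, n_items, k = 50, 40, 8
R = np.full((n_users, n_items), np.nan)
# Sprinkle some hypothetical observed ratings (1-5).
for _ in range(600):
    R[rng.integers(n_users), rng.integers(n_items)] = rng.integers(1, 6)

observed = np.argwhere(~np.isnan(R))
U = 0.1 * rng.standard_normal((n_users, k))
V = 0.1 * rng.standard_normal((n_items, k))
lr, reg = 0.01, 0.05

for epoch in range(30):
    for u, i in observed:
        err = R[u, i] - U[u] @ V[i]
        U[u] += lr * (err * V[i] - reg * U[u])   # gradient step on the user factor
        V[i] += lr * (err * U[u] - reg * V[i])   # gradient step on the item factor

rmse = np.sqrt(np.mean([(R[u, i] - U[u] @ V[i]) ** 2 for u, i in observed]))
print(f"Training RMSE after factorization: {rmse:.3f}")
# The predicted rating for any (user, automobile) pair is then U[u] @ V[i].
```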
Procedia PDF Downloads 217
482 Challenges in the Material and Action-Resistance Factor Design for Embedded Retaining Wall Limit State Analysis
Authors: Kreso Ivandic, Filip Dodigovic, Damir Stuhec
Abstract:
The paper deals with the proposed 'Material' and 'Action-resistance factor' design methods for the design of embedded retaining walls. A parametric analysis was performed to evaluate the differences between the output values of the two methods and against the classic approach computation. There is a challenge with the criteria for choosing the proposed calculation design methods in Eurocode 7 with respect to current technical regulations and regular engineering practice. The basic criterion for applying a particular design method is to ensure, at minimum, an equal degree of reliability in relation to current practice. The procedure of combining the relevant partial coefficients according to the design methods was carried out. The use of the mentioned partial coefficients should result in the same level of safety, regardless of load combinations, material characteristics and problem geometry. The proposed approach of partial coefficients related to the material and/or action-resistance is aimed at building a bridge between the calculations used so far and pure probability analysis. The measure used to compare the results was to determine an equivalent safety factor for each analysis. The results show a visibly wide span of equivalent values of the classic safety factors.
Keywords: action-resistance factor design, classic approach, embedded retaining wall, Eurocode 7, limit states, material factor design
Procedia PDF Downloads 231
481 A Comparative Analysis of Solid Waste Treatment Technologies on Cost and Environmental Basis
Authors: Nesli Aydin
Abstract:
Waste management decision making in developing countries has moved towards being more pragmatic, transparent, sustainable and comprehensive. Turkey is required to make its waste-related legislation compatible with European legislation, as it is a candidate country of the European Union. Improper practices in Turkey, such as open burning and open dumping, must be abandoned urgently, and robust waste management systems have to be structured. The determination of an optimum waste management system in any region requires a comprehensive analysis in which many criteria are taken into account by stakeholders. In conducting this sort of analysis, there are two main criteria evaluated by waste management analysts: economic viability and environmental friendliness. From an analytical point of view, a central characteristic of sustainable development is economic-ecological integration. It is predicted that building a robust waste management system will need significant effort and cooperation between the stakeholders in developing countries such as Turkey. In this regard, this study aims to provide data regarding the cost and environmental burdens of waste treatment technologies such as an incinerator, an autoclave (with different capacities), a hydroclave and a microwave, coupled with updated information on calculation methods and a framework for comparing any proposed scenario performances on a cost and environmental basis.
Keywords: decision making, economic viability, environmental friendliness, waste management systems
Procedia PDF Downloads 305
480 A Non-Destructive Estimation Method for Internal Time in Perilla Leaf Using Hyperspectral Data
Authors: Shogo Nagano, Yusuke Tanigaki, Hirokazu Fukuda
Abstract:
Vegetables harvested early in the morning or late in the afternoon are valued in plant production, so the time of harvest is important. The biological functions known as circadian clocks have a significant effect on this harvest timing. The purpose of this study was to non-destructively estimate the circadian clock and thus construct a method for determining a suitable harvest time. We took eight samples of green busil (Perilla frutescens var. crispa) every 4 hours, six times over 1 day, and analyzed all samples at the same time. A hyperspectral camera was used to collect spectrum intensities at 141 different wavelengths (350–1050 nm). Calculation of the correlations between the spectrum intensity at each wavelength and the harvest time suggested the suitability of the hyperspectral camera for non-destructive estimation. However, even the most highly correlated wavelength had only a weak correlation, so we used machine learning to raise the accuracy of the estimation and constructed a machine learning model to estimate the internal time of the circadian clock. Artificial neural networks (ANN) were used for machine learning because they are an effective analysis method for large amounts of data. Using the estimation model resulted in an error between estimated and real times of 3 min. The estimations were made in less than 2 hours. Thus, we successfully demonstrated this method of non-destructively estimating internal time.
Keywords: artificial neural network (ANN), circadian clock, green busil, hyperspectral camera, non-destructive evaluation
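A minimal sketch of the ANN regression step, assuming the 24-hour internal time is encoded as (sin, cos) so the estimate wraps correctly around midnight; the spectra below are synthetic placeholders rather than the study's measurements.

```python
# Sketch: ANN regression from hyperspectral intensities (141 bands) to internal
# circadian time, encoding the 24 h phase as (sin, cos) to handle wrap-around.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(2)
n_samples, n_bands = 48, 141
t_hours = rng.uniform(0, 24, n_samples)                       # true internal time
phase = 2 * np.pi * t_hours / 24.0
X = rng.normal(0, 0.05, (n_samples, n_bands)) \
    + np.outer(np.sin(phase), np.linspace(0.1, 1.0, n_bands)) # synthetic diel signal

y = np.column_stack([np.sin(phase), np.cos(phase)])
model = MLPRegressor(hidden_layer_sizes=(64, 32), max_iter=5000, random_state=0)
model.fit(X, y)

pred = model.predict(X)
t_pred = (np.arctan2(pred[:, 0], pred[:, 1]) % (2 * np.pi)) * 24.0 / (2 * np.pi)
err = np.minimum(np.abs(t_pred - t_hours), 24 - np.abs(t_pred - t_hours))
print(f"Mean absolute circular error: {err.mean() * 60:.1f} min")
```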
Procedia PDF Downloads 299
479 Correlations between Wear Rate and Energy Dissipation Mechanisms in a Ti6Al4V–WC/Co Sliding Pair
Authors: J. S. Rudas, J. M. Gutiérrez Cabeza, A. Corz Rodríguez, L. M. Gómez, A. O. Toro
Abstract:
The prediction of the wear rate of rubbing pairs has attracted the interest of many researchers for years. It has recently been proposed that the sliding wear rate can be inferred from the calculation of the energy rate dissipated by the tribological pair. In this paper, some of the dissipative mechanisms present in a pin-on-disc configuration are discussed, and both analytical and numerical calculations are carried out. Three dissipative mechanisms were studied: first, the energy release due to temperature gradients within the solid; second, the heat flow from the solid to the environment; and third, the energy loss due to abrasive damage of the surface. The Finite Element Method was used to calculate the dynamics of heat transfer within the solid, with the aid of commercial software. Validation of the FEM model was assisted by virtual and laboratory experimentation using different operating points (sliding velocity and contact geometry). The materials for the experiments were Ti6Al4V alloy and tungsten carbide (WC-Co). The results showed that the sliding wear rate has a linear relationship with the energy dissipation flow. It was also found that energy loss due to micro-cutting is relevant for the system. This mechanism changes if the sliding velocity and pin geometry are modified, though the degradation coefficient continues to present a linear behavior. We found that the least relevant dissipation mechanism for all the cases studied is the energy release by temperature gradients in the solid.
Keywords: degradation, dissipative mechanism, dry sliding, entropy, friction, wear
Procedia PDF Downloads 502
478 Estimation of Normalized Glandular Doses Using a Three-Layer Mammographic Phantom
Authors: Kuan-Jen Lai, Fang-Yi Lin, Shang-Rong Huang, Yun-Zheng Zeng, Po-Chieh Hsu, Jay Wu
Abstract:
The normalized glandular dose (DgN) estimates the energy deposition of mammography in clinical practice. Monte Carlo simulations frequently use a uniformly mixed phantom for calculating the conversion factor. However, breast tissues are not uniformly distributed, leading to errors in conversion factor estimation. This study constructed a three-layer phantom to estimate the normalized glandular dose more accurately. In this study, the MCNP code (Monte Carlo N-Particle code) was used to create the geometric structure. We simulated three types of target/filter combinations (Mo/Mo, Mo/Rh, Rh/Rh), six voltages (25 ~ 35 kVp), six HVL parameters and nine breast phantom thicknesses (2 ~ 10 cm) for the three-layer mammographic phantom. The conversion factor for 25%, 50% and 75% glandularity was calculated. The error of the conversion factors compared with the results of the American College of Radiology (ACR) was within 6%. For Rh/Rh, the difference was within 9%. The difference between the 50% average glandularity and the uniform phantom was 7.1% ~ -6.7% for the Mo/Mo combination, a voltage of 27 kVp, a half value layer of 0.34 mmAl, and a breast thickness of 4 cm. According to the simulation results, the regression analysis found that the three-layer mammographic phantom at 0% ~ 100% glandularity can be used to accurately calculate the conversion factors. The difference in glandular tissue distribution leads to errors in conversion factor calculation. The three-layer mammographic phantom can provide accurate estimates of glandular dose in clinical practice.
Keywords: Monte Carlo simulation, mammography, normalized glandular dose, glandularity
Procedia PDF Downloads 189
477 Research on the Optimization of the Facility Layout of Efficient Cafeterias for Troops
Authors: Qing Zhang, Jiachen Nie, Yujia Wen, Guanyuan Kou, Peng Yu, Kun Xia, Qin Yang, Li Ding
Abstract:
BACKGROUND: A facility layout problem (FLP) is an NP-complete (non-deterministic polynomial) problem, for which it is hard to obtain an exact optimal solution. FLP has been widely studied in various limited spaces and workflows. For example, cafeterias for troops with many types of equipment suffer from chaotic processes during dining. OBJECTIVE: This article tried to optimize the layout of a troops' cafeteria and to improve the overall efficiency of the dining process. METHODS: First, the original cafeteria layout design scheme was analyzed from an ergonomic perspective and two new design schemes were generated. Next, three facility layout models were designed, and further simulation was applied to compare the total time and density of troops between the schemes. Last, an experiment on the dining process with video observation and analysis verified the simulation results. RESULTS: In the simulation, the dining time under the second new layout is shortened by 2.25% and 1.89% (p<0.0001, p=0.0001) compared with the other two layouts, while troop-flow density and interference are both greatly reduced in the two new layouts. In the experiment, process completion time and the number of interferences were reduced as well, which verified the corresponding simulation results. CONCLUSIONS: Our two new layout schemes are shown to be optimal by a series of simulations and field experiments. In future research, similar approaches could be applied when taking layout-design algorithm calculation into consideration.
Keywords: layout optimization, dining efficiency, troops’ cafeteria, anylogic simulation, field experiment
Procedia PDF Downloads 143
476 The Effect of Body Positioning on Upper-Limb Arterial Occlusion Pressure and the Reliability of the Method during Blood Flow Restriction Training
Authors: Stefanos Karanasios, Charkleia Koutri, Maria Moutzouri, Sofia A. Xergia, Vasiliki Sakellari, George Gioftsos
Abstract:
The precise calculation of the arterial occlusion pressure (AOP) is a critical step for accurately prescribing individualized pressures during blood flow restriction training (BFRT). AOP is usually measured in a supine position before training; however, previous reports suggested a significant influence of body position on lower limb AOP. The aim of the study was to investigate the effect of three different body positions on upper limb AOP and the reliability of the method, for its standardization in clinical practice. Forty-two healthy participants (mean age: 28.1, SD: ±7.7) underwent measurements of upper limb AOP in supine, seated, and standing positions by three blinded raters. A cuff with a manual pump and a pocket Doppler ultrasound were used. A significantly higher upper limb AOP was found in the seated compared with the supine position (p < 0.031) and in the supine compared with the standing position (p < 0.031) by all raters. An excellent intraclass correlation coefficient (0.858–0.984, p < 0.001) was found in all positions. Upper limb AOP is strongly dependent on body position changes. The appropriate measurement position should be selected to accurately calculate AOP before BFRT. The excellent inter-rater reliability and repeatability of the method suggest reliable and consistent results across repeated measurements.
Keywords: Kaatsu training, blood flow restriction training, arterial occlusion, reliability
Procedia PDF Downloads 212
475 PathoPy2.0: Application of Fractal Geometry for Early Detection and Histopathological Analysis of Lung Cancer
Authors: Rhea Kapoor
Abstract:
Fractal dimension provides a way to characterize non-geometric shapes like those found in nature. The purpose of this research is to estimate the Minkowski fractal dimension of human lung images for early detection of lung cancer. Lung cancer is the leading cause of death among all types of cancer, and an early histopathological analysis will help reduce deaths primarily due to late diagnosis. A Python application program, PathoPy2.0, was developed for analyzing medical images in pixelated format and estimating the Minkowski fractal dimension using a new box-counting algorithm that allows windowing of images for more accurate calculation in the suspected areas of cancerous growth. Benchmark geometric fractals were used to validate the accuracy of the program, and changes in the fractal dimension of lung images were used to indicate the presence of issues in the lung. The accuracy of the program for the benchmark examples was between 93% and 99% of the known values of the fractal dimensions. Fractal dimension values were then calculated for lung images, from the National Cancer Institute, taken over time to correctly detect the presence of cancerous growth. For example, the fractal dimension of a given lung increased from 1.19 to 1.27 due to cancerous growth, which represents a significant change for a quantity that lies between 1 and 2 for 2-D images. Based on the results obtained for many lung test cases, it was concluded that the fractal dimension of human lungs can be used to diagnose lung cancer early. The ideas behind PathoPy2.0 can also be applied to study patterns in the electrical activity of the human brain and DNA matching.
Keywords: fractals, histopathological analysis, image processing, lung cancer, Minkowski dimension
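A minimal box-counting sketch of the Minkowski dimension estimate for a binary image, without the windowing refinement described above; the benchmark shape is illustrative.

```python
# Minimal box-counting (Minkowski-Bouligand) dimension for a binary image.
import numpy as np

def box_count(img: np.ndarray, box_size: int) -> int:
    """Count boxes of side `box_size` that contain at least one foreground pixel."""
    h, w = img.shape
    count = 0
    for i in range(0, h, box_size):
        for j in range(0, w, box_size):
            if img[i:i + box_size, j:j + box_size].any():
                count += 1
    return count

def minkowski_dimension(img: np.ndarray, sizes=(2, 4, 8, 16, 32)) -> float:
    counts = [box_count(img, s) for s in sizes]
    # Slope of log(count) vs log(1/size) gives the fractal dimension estimate.
    slope, _ = np.polyfit(np.log(1.0 / np.asarray(sizes)), np.log(counts), 1)
    return slope

# Benchmark check on a simple shape: a thin ring should give a value near 1.
y, x = np.ogrid[-128:128, -128:128]
r = np.sqrt(x**2 + y**2)
ring = np.abs(r - 100) < 1.5
print(f"Estimated dimension of a thin ring: {minkowski_dimension(ring):.2f}")
```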
Procedia PDF Downloads 178
474 Performing Diagnosis in Building with Partially Valid Heterogeneous Tests
Authors: Houda Najeh, Mahendra Pratap Singh, Stéphane Ploix, Antoine Caucheteux, Karim Chabir, Mohamed Naceur Abdelkrim
Abstract:
Building systems are highly vulnerable to different kinds of faults and human misbehavior. Energy efficiency and user comfort are directly affected by abnormalities in building operation. The available fault diagnosis tools and methodologies rely particularly on rules or pure model-based approaches. It is assumed that a model- or rule-based test could be applied to any situation without taking into account actual testing contexts. Contextual tests with a validity domain could greatly reduce the design effort for detection tests. The main objective of this paper is to consider fault validity when validating the test model, taking into account non-modeled events such as occupancy, weather conditions, door and window openings, and the integration of the expert's knowledge of the state of the system. The concept of heterogeneous tests is combined with test validity to generate fault diagnoses. A combination of rule-, range- and model-based tests, known as heterogeneous tests, is proposed to reduce the modeling complexity. Calculation of logical diagnoses coming from artificial intelligence provides a global explanation consistent with the test results. An application example shows the efficiency of the proposed technique: an office setting at Grenoble Institute of Technology.
Keywords: heterogeneous tests, validity, building system, sensor grids, sensor fault, diagnosis, fault detection and isolation
Procedia PDF Downloads 293
473 An Investigation of Rainfall Changes in Kangan City During the Years 1964 to 2003
Authors: Borzou Faramarzi, Farideh Azimi, Azam Gohardoust, Abbas Ghasemi Ghasemvand, Maryam Mirzaei, Mandana Amani
Abstract:
In this study, attempts were made to examine and analyze the trend of rainfall changes in Kangan City, Booshehr Province, during the time span 1964 to 2003, using seven rainfall threshold indices based on the 50 climate extremes indices approved by WMO–CCL/CLIVAR. These indices include days with heavy precipitation, days with rainfall, frequency of rainfall threshold values, intensity of rainfall threshold values, percentage of rainfall threshold values, successive days of rainfall, and successive days with no precipitation. The results indicate that the climatic conditions of Kangan City have become drier than before. The indices of days with heavy precipitation and days with rainfall do not show a definite trend in Kangan City. The frequency, intensity, and percentage of rainfall threshold values at the station under investigation do not indicate a definite trend either. The analysis of the time series of rainfall extreme indices revealed that, in general, Kangan City is influenced by the general factors of global warming. Calculation of values for the next 10 years based on ARIMA models demonstrates a continuation of the warming trend in Kangan City. On the whole, rainfall conditions in Kangan City have experienced more dry periods compared to the past, a trend which is also projected for the next 10 years.
Keywords: climatic indices, climate change, extreme temperature and precipitation, time series
Procedia PDF Downloads 272
472 The Identification of Combined Genomic Expressions as a Diagnostic Factor for Oral Squamous Cell Carcinoma
Authors: Ki-Yeo Kim
Abstract:
Trends in genetics are shifting towards identifying the differential co-expression of correlated genes rather than significant individual genes. Moreover, it is known that a combined biomarker pattern improves the discrimination of a specific cancer. The identification of a combined biomarker is also necessary for the early detection of invasive oral squamous cell carcinoma (OSCC). To identify a combined biomarker that could improve the discrimination of OSCC, we explored the appropriate number of genes in a combined gene set in order to attain the highest level of accuracy. After detecting a significant gene set including the pre-defined number of genes, a combined expression was identified using the weights of the genes in the gene set. We used Principal Component Analysis (PCA) for the weight calculation. In this process, we used three public microarray datasets. One dataset was used for identifying the combined biomarker, and the other two datasets were used for validation. The discrimination accuracy was measured by the out-of-bag (OOB) error. There was no relation between the significance and the discrimination accuracy of each individual gene. The identified gene set included both significant and insignificant genes. One of the most significant gene sets in the classification of normal tissue and OSCC included MMP1, SOCS3 and ACOX1. Furthermore, in the case of oral dysplasia and OSCC discrimination, two combined biomarkers were identified. The combined genomic expression achieved better performance in the discrimination of different conditions than a single significant gene. Therefore, it can be expected that accurate diagnosis of cancer could be possible with a combined biomarker.
Keywords: oral squamous cell carcinoma, combined biomarker, microarray dataset, correlated genes
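A minimal sketch of forming a combined expression from a small gene set (using the genes named above) as a PCA-weighted score; the expression matrix is synthetic, and the sign of the principal component is arbitrary.

```python
# Sketch: combined expression of a gene set (e.g. MMP1, SOCS3, ACOX1) as the
# first principal component score of the standardized expression matrix.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(3)
genes = ["MMP1", "SOCS3", "ACOX1"]
n_samples = 60
labels = rng.integers(0, 2, n_samples)                 # 0 = normal, 1 = OSCC (hypothetical)
# Synthetic expression: correlated genes, shifted in the OSCC group.
base = rng.normal(0, 1, n_samples)
X = np.column_stack([base + rng.normal(0, 0.5, n_samples) + 1.2 * labels
                     for _ in genes])

X_std = StandardScaler().fit_transform(X)
pca = PCA(n_components=1).fit(X_std)
weights = pca.components_[0]           # per-gene weights of the combined biomarker
combined = X_std @ weights             # combined expression per sample

print(dict(zip(genes, np.round(weights, 3))))
print("Mean combined score (normal vs OSCC):",
      round(combined[labels == 0].mean(), 2), round(combined[labels == 1].mean(), 2))
```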
Procedia PDF Downloads 423
471 Estimation of Time Loss and Costs of Traffic Congestion: The Contingent Valuation Method
Authors: Amira Mabrouk, Chokri Abdennadher
Abstract:
The reduction of road congestion, which is inherent to the use of vehicles, is an obvious priority for public authorities. Therefore, assessing an individual's willingness to pay in order to save trip time is akin to estimating the change in price resulting from setting up a new transport policy to increase network fluidity and improve the level of social welfare. This study takes an innovative perspective. In fact, it initiates an economic calculation with the objective of estimating the monetized value of time for the trips made in Sfax. This research is founded on a double-objective approach. The aims of this study are to i) give an estimation of the monetized value of time, an hour dedicated to trips, ii) determine whether or not the consumer considers the environmental variables to be significant, and iii) analyze the impact of applying public management of congestion via the taxation of city tolls on urban dwellers. This article is built upon a rich field survey conducted in the city of Sfax. Using the contingent valuation method, we analyze the “declared time preferences” of 450 drivers during rush hours. Based on careful consideration of the biases attributed to the applied method, we bring to light the delicacy of this approach with regard to the revelation mode and the interrogative techniques, following the NOAA panel recommendations, with the exception of the valuation point, and other similar studies on the estimation of transportation externalities.
Keywords: willingness to pay, contingent valuation, time value, city toll
Procedia PDF Downloads 434
470 Development of Precise Ephemeris Generation Module for Thaichote Satellite Operations
Authors: Manop Aorpimai, Ponthep Navakitkanok
Abstract:
In this paper, the development of the ephemeris generation module used for the Thaichote satellite operations is presented. It is a vital part of the flight dynamics system, which comprises the orbit determination, orbit propagation, event prediction and station-keeping maneuver modules. In the generation of the spacecraft ephemeris data, the estimated orbital state vector from the orbit determination module is used as an initial condition. The equations of motion are then integrated forward in time to predict the satellite states. The higher geopotential harmonics, as well as other disturbing forces, are taken into account to resemble the environment in low-Earth orbit. Using a highly accurate numerical integrator based on the Bulirsch-Stoer algorithm, the ephemeris data can be generated for long-term predictions with a relatively small computational burden and short calculation time. Some events occurring during the prediction course that are related to the mission operations, such as the satellite's rise/set viewed from the ground station, Earth and Moon eclipses, the drift in ground track, as well as the drift in the local solar time of the orbital plane, are all detected and reported. When combined with other modules to form a flight dynamics system, this application is intended to be applied to the Thaichote satellite and Thailand's successive Earth-observation missions.
Keywords: flight dynamics system, orbit propagation, satellite ephemeris, Thailand’s Earth Observation Satellite
Procedia PDF Downloads 377
469 Technology in the Calculation of People Health Level: Design of a Computational Tool
Authors: Sara Herrero Jaén, José María Santamaría García, María Lourdes Jiménez Rodríguez, Jorge Luis Gómez González, Adriana Cercas Duque, Alexandra González Aguna
Abstract:
Background: The concept of health has evolved throughout history. The health level is determined by the individual's own perception. It is a dynamic process over time, so variations can be seen from one moment to the next. In this way, knowing the health of the patients you care for will facilitate decision making in the treatment of care. Objective: To design a technological tool that calculates people's health level sequentially over time. Material and Methods: Deductive methodology through text analysis, extraction and logical knowledge formalization, and education with an expert group. Study period: September 2015 to the present. Results: A computational tool for the use of health personnel has been designed. It has 11 variables. Each variable can be given a value from 1 to 5, with 1 being the minimum value and 5 being the maximum value. By adding the results of the 11 variables, we obtain a magnitude at a certain time: the health level of the person. The health calculator allows people's health level to be represented at a given time, establishing temporal cuts that are useful to determine the evolution of the individual over time. Conclusion: Information and Communication Technologies (ICT) allow training and help in various disciplinary areas. It is important to highlight their relevance in the field of health. Based on the health formalization, care acts can be directed towards some of the propositional elements of the concept above. The care acts will modify the person's health level. The health calculator allows the prioritization and prediction of different health care strategies in hospital units.
Keywords: calculator, care, eHealth, health
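A minimal sketch of the calculator's arithmetic, assuming the health level is simply the sum of the 11 variables, each scored from 1 to 5 (range 11-55); the variable names are hypothetical, as the abstract does not list them.

```python
# Minimal sketch: 11 variables, each scored 1-5, summed into a single health-level
# magnitude at a given time point; successive records form the temporal cuts.
from datetime import datetime

def health_level(scores: dict[str, int]) -> int:
    if len(scores) != 11:
        raise ValueError("Exactly 11 variables are expected.")
    if any(not 1 <= v <= 5 for v in scores.values()):
        raise ValueError("Each variable must be scored between 1 and 5.")
    return sum(scores.values())

assessment = {f"variable_{i}": 3 for i in range(1, 12)}   # hypothetical neutral scores
record = {"timestamp": datetime.now().isoformat(timespec="minutes"),
          "health_level": health_level(assessment)}       # magnitude between 11 and 55
print(record)
```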
Procedia PDF Downloads 264
468 Theoretical Study of Structural, Magnetic, and Magneto-Optical Properties of Ultrathin Films of Fe/Cu (001)
Authors: Mebarek Boukelkoul, Abdelhalim Haroun
Abstract:
By means of first-principles calculations, we have investigated the structural, magnetic and magneto-optical properties of ultrathin Fen/Cu(001) films with n = 1, 2, 3. We adopted a relativistic approach using DFT with the local spin density approximation (LSDA). The electronic structure calculation is performed within the framework of the Spin-Polarized Relativistic (SPR) Linear Muffin-Tin Orbital (LMTO) method with the Atomic Sphere Approximation (ASA). In the variational procedure, the crystal wave function is expressed as a linear combination of the Bloch sums of the so-called relativistic muffin-tin orbitals centered on the atomic sites. The crystalline structure is calculated after an atomic relaxation process using the optimization of the total energy with respect to the atomic interplane distance. A body-centered tetragonal (BCT) pseudomorphic crystalline structure with a tetragonality ratio c/a larger than unity is found. The magnetic behaviour is characterized by an enhanced magnetic moment and a ferromagnetic interplane coupling. The polar magneto-optical Kerr effect spectra are given over a photon energy range extending to 15 eV, and the microscopic origin of the most interesting features is interpreted in terms of interband transitions. Unlike thin layers, the anisotropy in the ultrathin films is characterized by a magnetization perpendicular to the film plane.
Keywords: ultrathin films, magnetism, magneto-optics, pseudomorphic structure
Procedia PDF Downloads 335
467 Stochastic Edge Based Anomaly Detection for Supervisory Control and Data Acquisitions Systems: Considering the Zambian Power Grid
Authors: Lukumba Phiri, Simon Tembo, Kumbuso Joshua Nyoni
Abstract:
In Zambia, recent initiatives by various power operators, such as ZESCO and CEC, and consumers such as the mines to upgrade power systems into smart grids target an even tighter integration with information technologies to enable the integration of renewable energy sources, local and bulk generation, and demand response. Thus, for the reliable operation of smart grids, their information infrastructure must be secure and reliable in the face of both failures and cyberattacks. Due to the nature of the systems, ICS/SCADA cybersecurity and governance face additional challenges compared to corporate networks, and critical systems may be left exposed. Control frameworks exist internationally, such as the NIST framework; however, they are generic and do not meet the domain-specific needs of SCADA systems. Zambia is also lagging in cybersecurity awareness and adoption; therefore, there is concern about securing the ICS controlling key infrastructure critical to the Zambian economy, as there are few known facts about the true posture. In this paper, we introduce a stochastic Edge-based Anomaly Detection for SCADA systems (SEADS) framework for threat modeling and risk assessment. SEADS enables the calculation of steady-state probabilities that are further applied to establish metrics like system availability, maintainability, and reliability.
Keywords: anomaly, availability, detection, edge, maintainability, reliability, stochastic
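A toy sketch of the steady-state calculation, assuming a small discrete-time Markov model of an edge node with Up/Degraded/Down states and a hypothetical transition matrix; availability is read off as the probability of the Up state.

```python
# Sketch: steady-state probabilities of a small discrete-time Markov model of a
# SCADA edge node (Up / Degraded-under-attack / Down), then availability = P[Up].
import numpy as np

P = np.array([[0.95, 0.04, 0.01],    # Up -> Up / Degraded / Down
              [0.60, 0.30, 0.10],    # Degraded -> ...
              [0.50, 0.00, 0.50]])   # Down -> ...  (hypothetical transition matrix)

# Solve pi = pi P together with sum(pi) = 1 as a least-squares problem.
n = P.shape[0]
A = np.vstack([P.T - np.eye(n), np.ones(n)])
b = np.append(np.zeros(n), 1.0)
pi, *_ = np.linalg.lstsq(A, b, rcond=None)

print("Steady-state probabilities:", np.round(pi, 4))
print(f"Availability (P[Up]): {pi[0]:.4f}")
```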
Procedia PDF Downloads 110
466 An Activity Based Trajectory Search Approach
Authors: Mohamed Mahmoud Hasan, Hoda M. O. Mokhtar
Abstract:
With the enormous increase in mobile application use and the spread of positioning and location-aware technologies that we are seeing today, new procedures and methodologies for location-based strategies are required. Location recommendation is one of the most demanded location-aware applications, particularly with the wide accessibility of location-aware social network applications, including Facebook check-ins, Foursquare, and others. In this paper, we aim to present a new methodology for location recommendation. The proposed approach combines customary spatial attributes with other essential components, including shortest distance and user interests. We also present another idea, namely the "activity trajectory", which represents a trajectory that fulfills the set of activities that the user is interested in doing. The proposed approach uses the related distance value to select the trajectory(ies) with the minimum cost value (distance) and the spatial area to prune unneeded directions. The proposed algorithm utilizes the idea of movement direction to recommend the most similar N trajectory(ies) that match the user's required activity pattern with the least traveling distance. To improve the performance of the proposed approach, parallel processing is applied through the employment of a MapReduce-based approach. Experiments based on real datasets were set up and run to assess the proposed approach. The presented tests indicate how the proposed approach beats other strategies, giving better precision and run time.
Keywords: location based recommendation, map-reduce, recommendation system, trajectory search
Procedia PDF Downloads 223
465 A Practical and Theoretical Study on the Electromotor Bearing Defect Detection in a Wet Mill Using the Vibration Analysis Method and Defect Length Calculation in the Bearing
Authors: Mostafa Firoozabadi, Alireza Foroughi Nematollahi
Abstract:
Wet mills are among the most important equipment in the mining industries; any defect occurring in them can stop the production line and cause irrecoverable damage to the system. Electromotors are significant parts of a mill, and their monitoring is a necessary process to prevent unwanted defects. The purpose of this study is to investigate electromotor bearing defects, theoretically and practically, using the vibration analysis method. When a defect occurs in a bearing, it can be transferred to the other parts of the equipment, such as the inner ring, outer ring, balls, and bearing cage. The source of electromotor defects can be electrical or mechanical. Sometimes, the electrical and mechanical defect frequencies are modulated, and bearing defect detection becomes difficult. In this paper, to detect the electromotor bearing defects, the electrical and mechanical defect frequencies are extracted first. Then, by calculating the bearing defect frequencies and analyzing the spectrum and time signals, the bearing defects are detected. In addition, the obtained frequency determines the level of the bearing at which the defect has occurred, and by comparing this level to the standards, the remaining lifetime of the bearing is determined. Finally, the defect length is calculated by theoretical equations to demonstrate that there is no need to replace the bearing. The results of the proposed method, which has been implemented on the wet mills of the Golgohar mining and industrial company in Iran, show that this method is capable of detecting electromotor bearing defects accurately and on time.
Keywords: bearing defect length, defect frequency, electromotor defects, vibration analysis
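The kinematic bearing defect frequencies used in the spectrum analysis step can be computed from the shaft speed and bearing geometry with the standard formulas below; the numerical values are hypothetical examples, not the Golgohar mill data.

```python
# Standard kinematic bearing defect frequencies used when inspecting the spectrum.
import math

def bearing_defect_frequencies(shaft_hz, n_balls, ball_d, pitch_d, contact_deg=0.0):
    """Return FTF, BPFO, BPFI and BSF in Hz for a rolling-element bearing."""
    r = (ball_d / pitch_d) * math.cos(math.radians(contact_deg))
    ftf  = shaft_hz / 2 * (1 - r)                            # cage (fundamental train) frequency
    bpfo = n_balls * shaft_hz / 2 * (1 - r)                  # outer-race defect frequency
    bpfi = n_balls * shaft_hz / 2 * (1 + r)                  # inner-race defect frequency
    bsf  = pitch_d * shaft_hz / (2 * ball_d) * (1 - r ** 2)  # ball spin frequency
    return {"FTF": ftf, "BPFO": bpfo, "BPFI": bpfi, "BSF": bsf}

freqs = bearing_defect_frequencies(shaft_hz=990 / 60,   # e.g. a 990 rpm electromotor
                                   n_balls=9, ball_d=12.7, pitch_d=71.5, contact_deg=0)
for name, f in freqs.items():
    print(f"{name}: {f:.1f} Hz")   # compare against peaks in the vibration spectrum
```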
Procedia PDF Downloads 502
464 Assessing the Walkability and Urban Design Qualities of Campus Streets
Authors: Zhehao Zhang
Abstract:
Walking has become an indispensable and sustainable way of travel for college students in their daily lives, and campus streets are an important carrier for students to walk and take part in a variety of activities. Improving the walkability of campus streets plays an important role in optimizing the quality of the campus spatial environment, promoting the campus walking system and inducing multiple walking behaviors. The purpose of this paper is to explore the effect of campus layout, facility distribution, and site selection on the walkability of campus streets, to assess the street design qualities in terms of the elements of imageability, enclosure, complexity, transparency, and human scale, and to further examine the relationship between street-level urban design perceptual qualities and walkability and their effect on walking behavior on campus. Taking Tianjin University as the research object, this paper uses an optimized walk score method based on walking frequency, variety, and distance to evaluate the walkability of streets from a macro perspective, measures the urban design qualities through the calculation of street physical environment characteristics, and uses behavior annotation and street image data to establish a spatio-temporal behavior database to analyze walking activity from a microscopic view. In addition, based on the conclusions, improvement and design strategies are presented from the aspects of the built walking environment, street vitality, and walking behavior.
Keywords: walkability, streetscapes, pedestrian activity, walk score
Procedia PDF Downloads 144
463 Re-Inhabiting the Roof: Han Slawick Covered Roof Terrace, Amsterdam
Authors: Simone Medio
Abstract:
If we observe many modern cities from above, we are typically confronted with a sea of asphalt-clad flat rooftops. In contrast to the modernist expectation of a populated flat roof, flat rooftops in modern multi-storey buildings are rarely used. On the contrary, they typify a desolate and abandoned landscape that encourages the allocation of mechanical systems. Flat roof technology continues to be seen as a given in most multi-storey building designs, with greening as its prevalent environmental justification. This paper seeks a change in the approach to flat roofing. It makes a case for the opportunity at hand for architectonically resolute, sheltered, livable spaces that make better use of the environment at rooftop level. The researcher is looking for the triggers that allow this change to happen in the design process of case study buildings. The paper begins by exploring Han Slawick's covered roof terrace in Amsterdam as a simple and essential example of transforming the flat roof into a usable, inhabitable space. It investigates the design challenges and the logistic, financial and legislative hurdles faced by the architect, and the outcomes in terms of building performance and occupant use and satisfaction. The researcher uses a grounded research methodology with direct interviews with the architect in charge of the building and the building user. Energy simulation tools and the calculation of running costs are also used as further means of validating change.
Keywords: environmental design, flat rooftop persistence, roof re-habitation, tectonics
Procedia PDF Downloads 273
462 Performance Comparison of Resource Allocation without Feedback in Wireless Body Area Networks by Various Pseudo Orthogonal Sequences
Authors: Ojin Kwon, Yong-Jin Yoon, Liu Xin, Zhang Hongbao
Abstract:
A Wireless Body Area Network (WBAN) is a short-range wireless communication network around the human body for various applications such as wearable devices, entertainment, the military, and especially medical devices. WBANs attract attention for continuous health monitoring systems, including diagnostic procedures, early detection of abnormal conditions, and prevention of emergency situations. Compared to a cellular network, it is more difficult for a WBAN system to control inter- and intra-cell interference due to the limited power, limited computational capability, mobility of the patient, and non-cooperation among WBANs. In this paper, we compare the performance of resource allocation schemes based on several Pseudo Orthogonal Codewords (POCs) to mitigate inter-WBAN interference. Previously, POCs have been widely exploited for protocol sequences and optical orthogonal codes. Each POC has different auto- and cross-correlation properties and spectral efficiency according to its construction. To identify different WBANs, several different pseudo orthogonal patterns based on POCs are exploited for the resource allocation of WBANs. By simulating these pseudo orthogonal resource allocations of WBANs in MATLAB, we obtain the performance of WBANs for different POCs and can analyze and evaluate the suitability of POCs for resource allocation in the WBAN system.
Keywords: wireless body area network, body sensor network, resource allocation without feedback, interference mitigation, pseudo orthogonal pattern
Procedia PDF Downloads 353
461 Accuracy of Autonomy Navigation of Unmanned Aircraft Systems through Imagery
Authors: Sidney A. Lima, Hermann J. H. Kux, Elcio H. Shiguemori
Abstract:
Unmanned Aircraft Systems (UAS) usually navigate using the Global Navigation Satellite System (GNSS) associated with an Inertial Navigation System (INS). However, GNSS accuracy can be degraded at any time, or the GNSS signal can even be lost. In addition, there is the possibility of malicious interference, known as jamming. Therefore, an image navigation system can solve the autonomy problem, because if the GNSS is disabled or degraded, the image navigation system would continue to provide coordinate information to the INS, allowing the autonomy of the system. This work aims to evaluate the accuracy of positioning through photogrammetry concepts. The methodology uses orthophotos and Digital Surface Models (DSM) as a reference to represent the object space, and photographs obtained during the flight to represent the image space. For the calculation of the coordinates of the perspective center and the camera attitudes, it is necessary to know the coordinates of homologous points in the object space (orthophoto coordinates and DSM altitude) and in the image space (column and line of the photograph). Thus, if it is possible to automatically identify the homologous points in real time, the coordinates and attitudes can be calculated with their respective accuracies. With the methodology applied in this work, it is possible to verify maximum errors on the order of 0.5 m in positioning and 0.6º in the attitude of the camera, so navigation through imagery can reach accuracies equal to or better than GNSS receivers without differential correction. Therefore, navigating through imagery is a good alternative to enable autonomous navigation.
Keywords: autonomy, navigation, security, photogrammetry, remote sensing, spatial resection, UAS
Procedia PDF Downloads 190
460 Evaluation of Turbulence Prediction over Washington, D.C.: Comparison of DCNet Observations and North American Mesoscale Model Outputs
Authors: Nebila Lichiheb, LaToya Myles, William Pendergrass, Bruce Hicks, Dawson Cagle
Abstract:
Atmospheric transport of hazardous materials in urban areas is increasingly under investigation due to the potential impact on human health and the environment. In response to health and safety concerns, several dispersion models have been developed to analyze and predict the dispersion of hazardous contaminants. The models of interest usually rely on meteorological information obtained from the meteorological models of NOAA's National Weather Service (NWS). However, due to the complexity of the urban environment, NWS forecasts provide an inadequate basis for dispersion computation in urban areas. A dense meteorological network in Washington, DC, called DCNet, has been operated by NOAA since 2003 to support the development of urban monitoring methodologies and provide the driving meteorological observations for atmospheric transport and dispersion models. This study focuses on the comparison of wind observations from the DCNet station on the U.S. Department of Commerce Herbert C. Hoover Building against the North American Mesoscale (NAM) model outputs for the period 2017-2019. The goal is to develop a simple methodology for modifying NAM outputs so that the dispersion requirements of the city and its urban area can be satisfied. This methodology will allow us to quantify the prediction errors of the NAM model and propose adjustments to the key variables controlling dispersion model calculations.
Keywords: meteorological data, Washington D.C., DCNet data, NAM model
Procedia PDF Downloads 233
459 Feedback Matrix Approach for Relativistic Runaway Electron Avalanches Dynamics in Complex Electric Field Structures
Authors: Egor Stadnichuk
Abstract:
Relativistic runaway electron avalanches (RREA) are a widely accepted source of thunderstorm gamma-radiation. In regions with a very strong electric field, RREA can multiply via relativistic feedback. The relativistic feedback is caused both by positron production and by the reversal of runaway electron bremsstrahlung gamma-rays. In complex multilayer thunderstorm electric field structures, an additional reactor feedback mechanism appears due to gamma-ray exchange between separate strong electric field regions with different electric field directions. Studying this reactor mechanism in conjunction with the relativistic feedback using Monte Carlo simulations or by direct solution of the kinetic Boltzmann equation requires a significant amount of computational time. In this work, a theoretical approach to study feedback mechanisms in RREA physics is developed. It is based on the construction of a matrix of feedback operators. With the feedback matrix, the problem of the dynamics of avalanches in complex electric structures is reduced to the problem of finding eigenvectors and eigenvalues. A method for calculating the matrix elements is proposed. The proposed concept was used to study the dynamics of RREAs in multilayer thunderclouds.
Keywords: terrestrial Gamma-ray flashes, thunderstorm ground enhancement, relativistic runaway electron avalanches, gamma-rays, high-energy atmospheric physics, TGF, TGE, thunderstorm, relativistic feedback, reactor feedback, reactor model
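A toy illustration of the feedback-matrix idea: with assumed feedback coefficients between two field regions, the dominant eigenvalue plays the role of a multiplication factor, and the corresponding eigenvector gives the asymptotic distribution of avalanches between the regions.

```python
# Toy illustration: entry F[i, j] is an assumed number of secondary avalanches seeded
# in region i per avalanche in region j (relativistic feedback on the diagonal,
# gamma-ray "reactor" exchange off the diagonal). Values are hypothetical.
import numpy as np

F = np.array([[0.3, 0.6],
              [0.8, 0.2]])          # hypothetical feedback coefficients for two cells

eigvals, eigvecs = np.linalg.eig(F)
k = eigvals[np.argmax(np.abs(eigvals))].real   # dominant multiplication factor

print("Eigenvalues:", np.round(eigvals.real, 3))
print(f"Dominant factor k = {k:.3f} ->",
      "self-sustaining multiplication" if k >= 1 else "decaying avalanche population")
# The eigenvector associated with k gives the asymptotic distribution of avalanches
# between the electric-field regions.
```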
Procedia PDF Downloads 172
458 Resource Orchestration Based on Two-Sided Scheduling in Computing Network Control Systems
Authors: Li Guo, Jianhong Wang, Dian Huang, Shengzhong Feng
Abstract:
Computing networks, as a new network architecture, have shown great promise in boosting the utilization of different resources, such as computing, caching, and communications. To maximise the efficiency of resource orchestration in computing network control systems (CNCSs), this work proposes a dynamic orchestration strategy for different resources based on task requirements from computing power requestors (CPRs). Specifically, computing power providers (CPPs) in CNCSs can share information with each other, especially their current idle resources, through communication channels on the basis of blockchain technology. This dynamic process is modeled as a cooperative game in which CPPs have the same target of maximising long-term rewards by improving the resource utilization ratio. Meanwhile, the task requirements from CPRs, including size, deadline, and calculation, are simultaneously considered in this paper. According to the task requirements, the proposed orchestration strategy can schedule the best-fitting resource in CNCSs, achieving the maximum long-term rewards for CPPs and the best quality of experience (QoE) for CPRs at the same time. Based on the EdgeCloudSim simulation platform, the efficiency of the proposed strategy is demonstrated from the perspectives of both CPRs and CPPs. In addition, experimental results show that the proposed strategy outperforms the comparison methods in all cases.
Keywords: computing network control systems, resource orchestration, dynamic scheduling, blockchain, cooperative game
Procedia PDF Downloads 114