Search results for: measurement uncertainty
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3500

2750 Seismic Fragility Curves Methodologies for Bridges: A Review

Authors: Amirmozafar Benshams, Khatere Kashmari, Farzad Hatami, Mesbah Saybani

Abstract:

As part of the transportation network, bridges are among the most vulnerable structures. In order to investigate the vulnerability and evaluate the seismic performance of bridges, identifying the damage states associated with a bridge is important. Fragility curves provide important data about the damage states and performance of bridges under earthquakes. Developing vulnerability information in the form of fragility curves is a widely practiced approach when the information must account for the multitude of uncertain sources involved. This paper presents fragility curve methodologies for bridges and reviews the practice and applications relating to the seismic fragility assessment of bridges.
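A common formulation in the reviewed methodologies, though not spelled out in this abstract, models a fragility curve as a lognormal CDF of the ground-motion intensity measure. The sketch below assumes illustrative median and dispersion values:

```python
from math import log, sqrt, erf

def fragility(im, theta, beta):
    """P(damage state reached | intensity measure im), lognormal form.

    theta: median intensity causing the damage state (assumed value)
    beta:  lognormal dispersion (assumed value)
    """
    z = log(im / theta) / beta
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

# At the median intensity, the exceedance probability is 50%
print(round(fragility(0.4, 0.4, 0.6), 2))  # -> 0.5
```

Plotting this function over a range of intensity measures yields the familiar S-shaped fragility curve for one damage state.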

Keywords: fragility curve, bridge, uncertainty, NLTHA, IDA

Procedia PDF Downloads 269
2749 Considering Uncertainties of Input Parameters on Energy, Environmental Impacts and Life Cycle Costing by Monte Carlo Simulation in the Decision Making Process

Authors: Johannes Gantner, Michael Held, Matthias Fischer

Abstract:

The refurbishment of the building stock in terms of energy supply and efficiency is one of the major challenges of the German turnaround in energy policy. As the building sector accounts for 40% of Germany’s total energy demand, additional insulation is key for energy-efficient refurbished buildings. Nevertheless, beyond the energetic benefits, the environmental and economic performance of insulation materials is often questioned. The methods of Life Cycle Assessment (LCA) and Life Cycle Costing (LCC) can form a standardized basis for addressing these doubts and are becoming increasingly important for material producers due to efforts such as the Product Environmental Footprint (PEF) or Environmental Product Declarations (EPD). Due to the increasing use of LCA and LCC information for decision support, the robustness and resilience of the results become crucial, especially for the support of decision and policy makers. LCA and LCC results are based on respective models which depend on technical parameters like efficiencies, material and energy demand, product output, etc. Nevertheless, the influence of parameter uncertainties on lifecycle results is usually not considered, or studied only superficially; yet the effect of parameter uncertainties cannot be neglected. Based on the example of an exterior wall, the overall lifecycle results vary by a factor of more than three. As a result, the simple best-case/worst-case analyses used in practice are not sufficient. These analyses allow a first rough view of the results but do not take effects such as error propagation into account. Thereby, LCA practitioners cannot provide further guidance for decision makers. Probabilistic analyses enable LCA practitioners to gain a deeper understanding of the LCA and LCC results and provide better decision support.
Within this study, the environmental and economic impacts of an exterior wall system over its whole lifecycle are illustrated, and the effect of different uncertainty analyses on the interpretation in terms of resilience and robustness is shown. Here, the approaches of error propagation and Monte Carlo simulation are applied and combined with statistical methods in order to allow for a deeper understanding and interpretation. All in all, this study emphasizes the need for a deeper and more detailed probabilistic evaluation based on statistical methods. Only in this way can misleading interpretations be avoided and the results be used for resilient and robust decisions.
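As a rough sketch of the Monte Carlo approach described above, the toy model below propagates assumed input distributions (all values are illustrative, not the study's data) through a simplified lifecycle-cost function and reports a 90% interval alongside the mean, which is exactly what a best-case/worst-case analysis cannot provide:

```python
import random
import statistics

random.seed(1)

def lifecycle_impact(conductivity, thickness, energy_price):
    # Toy model: heat loss ~ conductivity/thickness drives lifecycle cost
    heat_loss = conductivity / thickness          # simplified U-value proxy
    return heat_loss * energy_price * 1000        # arbitrary scaling

# Uncertain inputs drawn from assumed distributions
samples = [
    lifecycle_impact(
        random.gauss(0.035, 0.004),   # thermal conductivity, W/(m K)
        random.uniform(0.10, 0.20),   # insulation thickness, m
        random.gauss(0.30, 0.05),     # energy price, EUR/kWh
    )
    for _ in range(10_000)
]

mean = statistics.mean(samples)
q = statistics.quantiles(samples, n=20)   # 19 cut points: 5%, 10%, ..., 95%
p5, p95 = q[0], q[-1]
print(f"mean={mean:.1f}, 90% interval=({p5:.1f}, {p95:.1f})")
```

The width of the resulting interval, rather than two extreme point values, is what supports the robustness statements made in the study.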

Keywords: uncertainty, life cycle assessment, life cycle costing, Monte Carlo simulation

Procedia PDF Downloads 274
2748 Cobb Angle Measurement from Coronal X-Rays Using Artificial Neural Networks

Authors: Andrew N. Saylor, James R. Peters

Abstract:

Scoliosis is a complex 3D deformity of the thoracic and lumbar spines, clinically diagnosed by measurement of a Cobb angle of 10 degrees or more on a coronal X-ray. The Cobb angle is the angle made by the lines drawn along the proximal and distal endplates of the respective proximal and distal vertebrae comprising the curve. Traditionally, Cobb angles are measured manually using either a marker, straight edge, and protractor or image measurement software. The task of measuring the Cobb angle can also be represented by a function taking the spine geometry rendered using X-ray imaging as input and returning the approximate angle. Although the form of such a function may be unknown, it can be approximated using artificial neural networks (ANNs). The performance of ANNs is affected by many factors, including the choice of activation function and network architecture; however, the effects of these parameters on the accuracy of scoliotic deformity measurements are poorly understood. Therefore, the objective of this study was to systematically investigate the effect of ANN architecture and activation function on Cobb angle measurement from the coronal X-rays of scoliotic subjects. The data set for this study consisted of 609 coronal chest X-rays of scoliotic subjects divided into 481 training images and 128 test images. These data, which included labeled Cobb angle measurements, were obtained from the SpineWeb online database. In order to normalize the input data, each image was resized using bi-linear interpolation to a size of 500 × 187 pixels, and the pixel intensities were scaled to be between 0 and 1. A fully connected (dense) ANN with a fixed cost function (mean squared error), batch size (10), and learning rate (0.01) was developed using Python Version 3.7.3 and TensorFlow 1.13.1. 
The activation functions (sigmoid, hyperbolic tangent [tanh], or rectified linear units [ReLU]), number of hidden layers (1, 3, 5, or 10), and number of neurons per layer (10, 100, or 1000) were varied systematically to generate a total of 36 network conditions. Stochastic gradient descent with early stopping was used to train each network. Three trials were run per condition, and the final mean squared errors and mean absolute errors were averaged to quantify the network response for each condition. The network that performed best used ReLU neurons, three hidden layers, and 100 neurons per layer. The average mean squared error of this network was 222.28 ± 30 degrees², and the average mean absolute error was 11.96 ± 0.64 degrees. It is also notable that while most of the networks performed similarly, the networks using ReLU neurons, 10 hidden layers, and 1000 neurons per layer, and those using tanh neurons, one hidden layer, and 10 neurons per layer, performed markedly worse, with average mean squared errors greater than 400 degrees² and average mean absolute errors greater than 16 degrees. From the results of this study, it can be seen that the choice of ANN architecture and activation function has a clear impact on Cobb angle inference from coronal X-rays of scoliotic subjects.
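For reference, the manual definition that the networks approximate, the angle between the proximal and distal endplate lines, can be computed directly when the endplate endpoints are known. The coordinates below are hypothetical:

```python
from math import atan2, degrees

def cobb_angle(proximal, distal):
    """Angle between two endplate lines, each given as ((x1, y1), (x2, y2))."""
    def slope_angle(line):
        (x1, y1), (x2, y2) = line
        return atan2(y2 - y1, x2 - x1)
    diff = abs(degrees(slope_angle(proximal) - slope_angle(distal))) % 180.0
    return min(diff, 180.0 - diff)  # lines have no direction; take acute form

# Proximal endplate tilted +10 deg, distal tilted -15 deg -> 25 deg curve
print(round(cobb_angle(((0, 0), (10, 1.763)), ((0, 0), (10, -2.679))), 1))  # -> 25.0
```

The ANN in the study learns to map raw pixel intensities directly to this quantity, skipping the explicit endplate-line construction.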

Keywords: scoliosis, artificial neural networks, cobb angle, medical imaging

Procedia PDF Downloads 114
2747 The New Economy: A Pedagogy for Vocational and Technical Education Programmes in Nigeria

Authors: Sunny Nwakanma

Abstract:

The emergence of the new economy has created a new world order for skill acquisition, economic activities and employment. It has dramatically changed the way we live, learn, work and even think about work. It has also created new opportunities as well as challenges and uncertainty. This paper will not only demystify the new economy and present its instrumentality in the acceleration of skill acquisition in technical education, but will also highlight the industrial and occupational changes brought about by the synergy between the information and communication technology revolution and the global economic system. It advocates, among other things, the use of information and communication technology mediated instruction in technical education, as it provides the flexibility to meet diverse learners’ needs anytime and anywhere and facilitates skill acquisition.

Keywords: new economy, technical education, skill acquisition, information and communication technology

Procedia PDF Downloads 111
2746 Predicting the Exposure Level of Airborne Contaminants in Occupational Settings via the Well-Mixed Room Model

Authors: Alireza Fallahfard, Ludwig Vinches, Stephane Halle

Abstract:

In the workplace, the exposure level of airborne contaminants should be evaluated due to health and safety issues. It can be done by numerical models or experimental measurements, but the numerical approach can be useful when it is challenging to perform experiments. One of the simplest models is the well-mixed room (WMR) model, which has shown its usefulness in predicting inhalation exposure in many situations. However, since the WMR is limited to gases and vapors, it cannot be used to predict exposure to aerosols. The main objective is to modify the WMR model to expand its application to exposure scenarios involving aerosols. To reach this objective, the standard WMR model has been modified to consider the deposition of particles by gravitational settling and by Brownian and turbulent deposition. Three deposition models were implemented in the model. The time-dependent concentrations of airborne particles predicted by the model were compared to experimental results conducted in a 0.512 m³ chamber. Polystyrene particles of 1, 2, and 3 µm in aerodynamic diameter were generated with a nebulizer under two air changes per hour (ACH) conditions. The well-mixed condition and chamber ACH were determined by the tracer gas decay method. The mean friction velocity on the chamber surfaces, one of the input variables for the deposition models, was determined by computational fluid dynamics (CFD) simulation. For the experimental procedure, the particles were generated until reaching the steady-state condition (emission period). Then generation stopped, and concentration measurements continued until reaching the background concentration (decay period). The results of the tracer gas decay tests revealed that the ACHs of the chamber were 1.4 and 3.0, and that the well-mixed condition was achieved. The CFD results showed that the average mean friction velocities and their standard deviations for the lowest and highest ACH were (8.87 ± 0.36) × 10⁻² m/s and (8.88 ± 0.38) × 10⁻² m/s, respectively.
The numerical results indicated that the difference between the deposition rates predicted by the three deposition models was less than 2%. The experimental and numerical aerosol concentrations were compared for the emission period and the decay period. In both periods, the prediction accuracy of the modified model improved in comparison with the classic WMR model. However, there is still a difference between the actual and predicted values. In the emission period, the modified WMR results closely follow the experimental data; however, the model significantly overestimates the experimental results during the decay period. This finding is mainly due to an underestimation of the deposition rate in the model and to uncertainty related to the measurement devices and the particle size distribution. Comparing the experimental and numerical deposition rates revealed that the actual particle deposition rate is significant, but the rate given by the deposition mechanisms considered in the model was ten times lower than the experimental value. Thus, particle deposition is significant, will affect the airborne concentration in occupational settings, and should be considered in airborne exposure prediction models. The role of other removal mechanisms should be investigated.
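The modified model's core balance, a well-mixed mass balance with an added first-order deposition loss, can be sketched as follows. The emission rate, deposition rate, and times below are illustrative values, not the study's:

```python
import math

def wmr_concentration(t, S, V, ach, k_dep, c0=0.0):
    """Well-mixed room concentration at time t (h) with deposition loss.

    Solves dC/dt = S/V - (ach + k_dep) * C analytically.
    S: emission rate (particles/h), V: room volume (m^3),
    ach: air changes per hour (1/h), k_dep: deposition loss rate (1/h)
    """
    loss = ach + k_dep
    c_ss = S / (V * loss)                      # steady-state concentration
    return c_ss + (c0 - c_ss) * math.exp(-loss * t)

# Emission period: approach to steady state in the 0.512 m^3 chamber at 1.4 ACH
c_end = wmr_concentration(2.0, S=1e6, V=0.512, ach=1.4, k_dep=0.3)
# Decay period: generation stops (S = 0), concentration decays from c_end
c_decay = wmr_concentration(1.0, S=0.0, V=0.512, ach=1.4, k_dep=0.3, c0=c_end)
print(round(c_decay / c_end, 3))  # exp(-(1.4 + 0.3) * 1) -> 0.183
```

Setting `k_dep = 0` recovers the classic gas-phase WMR model; the study's finding is essentially that the real `k_dep` is much larger than the three implemented deposition mechanisms predict.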

Keywords: aerosol, CFD, exposure assessment, occupational settings, well-mixed room model, zonal model

Procedia PDF Downloads 89
2745 A Case Study on the Condition Monitoring of a Critical Machine in a Tyre Manufacturing Plant

Authors: Ramachandra C. G., Amarnath. M., Prashanth Pai M., Nagesh S. N.

Abstract:

A machine's performance level drops over time due to the wear and tear of its components. The early detection of an emergent fault is therefore vital to obtaining uninterrupted production in a plant. Maintenance is an activity that helps to keep the machine's performance at an anticipated level, thereby ensuring the availability of the machine to perform its intended function. At present, a number of modern maintenance techniques are available, such as preventive maintenance, predictive maintenance, condition-based maintenance, total productive maintenance, etc. Condition-based maintenance, or condition monitoring, is one such modern maintenance technique in which the machine's condition or health is checked by the measurement of certain parameters such as sound level, temperature, velocity, displacement, vibration, etc. It can recognize most of the factors restraining the usefulness and efficacy of the total manufacturing unit. This research work was conducted on a batch-off mill in a tire production unit located in the southern Karnataka region. The health of the mill is assessed using the amplitude of vibration as the measured parameter. Most commonly, the vibration level is assessed at various points on the machine bearings. The normal or standard level is fixed using reference materials such as manuals or catalogs supplied by the manufacturers and by referring to vibration standards. The Rio-Vibro meter is placed at different locations on the batch-off mill to record the vibration data. The data collected are analyzed to identify the malfunctioning components in the batch-off mill, and corrective measures are suggested.
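As a minimal illustration of the assessment step, an overall vibration level can be computed as the RMS of velocity readings at a bearing point and compared against a limit taken from a standard. The readings and the 4.5 mm/s limit below are hypothetical:

```python
import math

def rms(samples):
    return math.sqrt(sum(x * x for x in samples) / len(samples))

def assess(samples, alarm_mm_s):
    """Compare the overall vibration level at a bearing point to an alarm limit."""
    level = rms(samples)
    return level, ("alarm" if level > alarm_mm_s else "acceptable")

# Synthetic velocity readings (mm/s) from one measurement point
readings = [2.1, 2.4, 1.9, 2.2, 2.0, 2.3]
level, status = assess(readings, alarm_mm_s=4.5)  # limit per e.g. an ISO 10816 zone
print(f"{level:.2f} mm/s -> {status}")
```

In practice the alarm limit comes from the manufacturer's manual or the applicable vibration standard, as described above.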

Keywords: availability, displacement, vibration, rio-vibro, condition monitoring

Procedia PDF Downloads 64
2744 A Proposal of Advanced Key Performance Indicators for Assessing Six Performances of Construction Projects

Authors: Wi Sung Yoo, Seung Woo Lee, Youn Kyoung Hur, Sung Hwan Kim

Abstract:

Large-scale construction projects are continuously increasing, and the need for tools to monitor and evaluate project success is emphasized. At the construction industry level, there are limitations in deriving performance evaluation factors that reflect the diversity of construction sites, and few systems can objectively evaluate and manage performance. Additionally, there are difficulties in integrating the structured and unstructured data generated at construction sites and deriving improvements. In this study, we propose Key Performance Indicators (KPIs) to enable a performance evaluation that reflects the increased diversity of construction sites and the unstructured data generated, and we present a model for measuring performance with the derived indicators. The comprehensive performance of a unit construction site is assessed based on 6 areas (Time, Cost, Quality, Safety, Environment, Productivity) and 26 indicators. We collect performance indicator information from 30 construction sites that meet legal standards and have been successfully completed, and we apply data augmentation and optimization techniques to establish measurement standards for each indicator. In other words, the KPIs for construction site performance evaluation presented in this study provide standards for evaluating performance in six areas using institutional requirement data and document data. This can be expanded to establish a performance evaluation system that considers the scale and type of construction project. The indicators are also expected to serve as a comprehensive measure for the construction industry and as basic data for tracking competitiveness at the national level and establishing policies.
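A composite score over the six areas could be formed as a weighted sum once each indicator is normalized. The weights and scores below are purely illustrative, since the study derives its measurement standards from site data:

```python
# Illustrative weights and normalized area scores (0-100); the actual
# standards in the study come from 30 sites plus data augmentation.
weights = {"time": 0.20, "cost": 0.20, "quality": 0.20,
           "safety": 0.15, "environment": 0.10, "productivity": 0.15}
site_scores = {"time": 80, "cost": 75, "quality": 90,
               "safety": 96, "environment": 70, "productivity": 60}

assert abs(sum(weights.values()) - 1.0) < 1e-9  # weights must sum to 1

composite = sum(weights[a] * site_scores[a] for a in weights)
print(round(composite, 1))  # -> 79.4
```

Such a single number makes sites comparable while the per-area scores preserve where a project is weak.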

Keywords: key performance indicator, performance measurement, structured and unstructured data, data augmentation

Procedia PDF Downloads 14
2743 Leadership in the Emergence Paradigm: A Literature Review on the Medusa Principles

Authors: Everard van Kemenade

Abstract:

Many quality improvement activities are planned. Leaders are strongly involved in missions, visions and strategic planning. They use, consciously or unconsciously, the PDCA cycle, also known as the Deming cycle. After the planning, the plans are carried out, and the results or effects are measured. If the results show that the goals in the plan have not been achieved, adjustments are made in the next plan or in the execution of the processes. Then, the cycle is run through again. Traditionally, the PDCA cycle is advocated as a means to an end. However, PDCA is especially fit for planned, ordered, certain contexts. It fits with the empirical and referential quality paradigm. For uncertain, unordered, unplanned processes, something else might be needed instead of Plan-Do-Check-Act. Due to the complexity of our society, the influence of context, and the uncertainty in our world nowadays, not every activity can be planned anymore. At the same time, organisations need to be more innovative than ever. That provides leaders with ‘wicked tendencies’ and raises the question of how one can innovate without being able to plan. Complexity science studies the interactions of a diverse group of agents that bring about change in times of uncertainty, e.g. when radical innovation is co-created; this process is called emergence. This research study explores the role of leadership in the emergence paradigm. The aim of the article is to study the way that leadership can support the emergence of innovation in a complex context. First, clarity is given on the concepts used in the research question: complexity, emergence, innovation and leadership. Thereafter, a literature search is conducted to answer the research question. The topics ‘emergent leadership’ and ‘complexity leadership’ were chosen for an exploratory search in Google and Google Scholar using the berry-picking method.
The exclusion criterion is emergence in disciplines other than organizational development, or in the mere meaning of ‘arising’. The literature search conducted gave 45 hits. Twenty-seven articles were excluded after reading the title and abstract because they did not research the topic of emergent leadership and complexity. After reading the remaining articles in full, one more was excluded because it used emergent in the limited meaning of ‘arising’, and eight more were excluded because the topic did not match the research question of this article. That brings the total of the search to 17 articles. The useful conclusions from the articles were merged and grouped together under overarching topics, using thematic analysis. The findings are that six topics prevail when looking at possibilities for leadership to facilitate innovation: enabling, sharing values, dreaming, interacting, context sensitivity and adaptivity. Together, in Dutch, they form the acronym Medusa.

Keywords: complexity science, emergence, leadership in the emergence paradigm, innovation, the Medusa principles

Procedia PDF Downloads 8
2742 W-WING: Aeroelastic Demonstrator for Experimental Investigation into Whirl Flutter

Authors: Jiri Cecrdle

Abstract:

This paper describes the concept of the W-WING whirl flutter aeroelastic demonstrator. Whirl flutter is the specific case of flutter that accounts for the additional dynamic and aerodynamic influences of the engine's rotating parts. The instability is driven by motion-induced unsteady aerodynamic propeller forces and moments acting in the propeller plane. Whirl flutter instability is a serious problem that may cause the unstable vibration of a propeller mounting, leading to the failure of an engine installation or an entire wing. The complicated physical principle of whirl flutter requires experimental validation of the analytically gained results. The W-WING aeroelastic demonstrator has been designed and developed at the Czech Aerospace Research Centre (VZLU), Prague, Czechia. The demonstrator represents the wing and engine of a twin turboprop commuter aircraft. Contrary to most past demonstrators, it includes a powered motor and thrusting propeller. It allows changes of the main structural parameters influencing the whirl flutter stability characteristics. Propeller blades are adjustable at standstill. The demonstrator is instrumented with strain gauges, accelerometers, a revolution-counting impulse sensor, an airflow velocity sensor, and a thrust measurement unit. Measurement is supported by an in-house program providing data storage and real-time depiction in the time domain, as well as pre-processing into the form of power spectral densities. The engine is linked with a servo-drive unit, which enables maintaining the propeller revolutions (constant or at a controlled ramp rate) and monitoring of the immediate revolutions and power. Furthermore, the program manages the aerodynamic excitation of the demonstrator by aileron flapping (constant, sweep, impulse). Finally, it provides a safety guard to prevent any structural failure of the demonstrator hardware.
In addition, the LMS TestLab system is used for the measurement of the structural response and for data assessment by means of FFT- and OMA-based methods. The demonstrator is intended for experimental investigations in the VZLU 3 m diameter low-speed wind tunnel. The measurement variant of the model is defined by the structural parameters: pitch and yaw attachment stiffness, pitch and yaw hinge stations, balance weight station, propeller type (duralumin or steel blades), and, finally, the angle of attack of the propeller blade at the 75% section. The excitation is provided either by airflow turbulence or by aerodynamic excitation from aileron flapping using a harmonic frequency sweep. The experimental results are planned to be utilized for the validation of analytical methods and software tools in the frame of the development of a new complex multi-blade twin-rotor propulsion system for a new-generation regional aircraft. Experimental campaigns will include measurements of aerodynamic derivatives and measurements of stability boundaries for various configurations of the demonstrator.
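The pre-processing of sensor signals into power spectral densities mentioned above can be illustrated with a basic periodogram (a plain FFT estimate without windowing or averaging; the 24 Hz test signal is hypothetical, not a measured whirl mode):

```python
import numpy as np

def psd(signal, fs):
    """One-sided power spectral density via the periodogram (no windowing)."""
    n = len(signal)
    spec = np.fft.rfft(signal)
    p = (np.abs(spec) ** 2) / (fs * n)
    p[1:-1] *= 2.0                      # fold in the negative frequencies
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    return freqs, p

fs = 1000.0                             # Hz, assumed sensor sampling rate
t = np.arange(0, 1.0, 1.0 / fs)
signal = np.sin(2 * np.pi * 24.0 * t)   # synthetic 24 Hz structural response
freqs, p = psd(signal, fs)
print(freqs[np.argmax(p)])              # -> 24.0
```

A sharpening peak in such a spectrum as the airspeed or propeller revolutions increase is the kind of symptom the stability-boundary measurements track.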

Keywords: aeroelasticity, flutter, whirl flutter, W-WING demonstrator

Procedia PDF Downloads 79
2741 Evaluating and Prioritizing the Effective Management Factors of Human Resources Empowerment and Efficiency in Manufacturing Companies: A Case Study of Fars’ Livestock and Poultry Manufacturing Companies

Authors: Mohsen Yaghmoor, Sima Radmanesh

Abstract:

Rapid environmental changes threaten the survival of many organizations. The empowerment and productivity of human resources should be considered the most important issues in increasing performance and ensuring the survival of an organization. In this research, the management factors affecting the empowerment and productivity of human resources were identified and reviewed. Two questions followed: "What are the factors affecting the productivity and empowerment of human resources?" and "What is their priority order, based on effective human resource management, in the Fars Poultry Complex?" A questionnaire was designed to prioritize the identified factors and measure their effectiveness. Six factors were specified as having the most effect on the organization: individual characteristics, teaching, motivation, partnership management, authority submission, and job development. After collecting the responses, a Cronbach's alpha coefficient of r = 0.792 was obtained, from which we can say the questionnaire has sufficient reliability. The analysis of the six specified factors by the Friedman test categorized their effects on the organization as, in order: individual characteristics, job development (enrichment), authority submission, partnership management, teaching, and motivation. Finally, approaches to increasing the empowerment and productivity of manpower are indicated.
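The ranking step can be sketched with a plain-Python Friedman statistic over per-respondent ratings of the six factors. The toy ratings below are invented for illustration, not the study's survey data:

```python
def friedman_statistic(scores):
    """Friedman chi-square for k factors rated by n respondents.

    scores: one list of k ratings per respondent.
    A higher rank sum means a stronger perceived effect.
    """
    n, k = len(scores), len(scores[0])
    rank_sums = [0.0] * k
    for row in scores:
        # Rank within each respondent (1 = lowest), averaging ties
        order = sorted(range(k), key=lambda j: row[j])
        ranks = [0.0] * k
        i = 0
        while i < k:
            j = i
            while j + 1 < k and row[order[j + 1]] == row[order[i]]:
                j += 1
            avg = (i + j) / 2.0 + 1.0
            for m in range(i, j + 1):
                ranks[order[m]] = avg
            i = j + 1
        for j in range(k):
            rank_sums[j] += ranks[j]
    chi2 = 12.0 / (n * k * (k + 1)) * sum(r * r for r in rank_sums) - 3.0 * n * (k + 1)
    return chi2, rank_sums

# Toy ratings of six factors by four respondents (1-5 scale)
data = [[5, 3, 2, 4, 1, 2],
        [4, 3, 1, 5, 2, 2],
        [5, 2, 2, 4, 1, 3],
        [4, 3, 1, 5, 1, 2]]
chi2, rank_sums = friedman_statistic(data)
print(rank_sums)
```

Sorting the factors by their rank sums gives exactly the kind of priority order the study reports.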

Keywords: productivity, empowerment, enrichment, authority submission, partnership management, teaching, motivation

Procedia PDF Downloads 230
2740 Robust Electrical Segmentation for Zone Coherency Delimitation Base on Multiplex Graph Community Detection

Authors: Noureddine Henka, Sami Tazi, Mohamad Assaad

Abstract:

The electrical grid is a highly intricate system designed to transfer electricity from production areas to consumption areas. The Transmission System Operator (TSO) is responsible for ensuring the efficient distribution of electricity and maintaining the grid's safety and quality. However, due to the increasing integration of intermittent renewable energy sources, there is a growing level of uncertainty, which requires a faster responsive approach. A potential solution involves the use of electrical segmentation, which involves creating coherence zones where electrical disturbances mainly remain within the zone. Indeed, by means of coherent electrical zones, it becomes possible to focus solely on the sub-zone, reducing the range of possibilities and aiding in managing uncertainty. It allows faster execution of operational processes and easier learning for supervised machine learning algorithms. Electrical segmentation can be applied to various applications, such as electrical control, minimizing electrical loss, and ensuring voltage stability. Since the electrical grid can be modeled as a graph, where the vertices represent electrical buses and the edges represent electrical lines, identifying coherent electrical zones can be seen as a clustering task on graphs, generally called community detection. Nevertheless, a critical criterion for the zones is their ability to remain resilient to the electrical evolution of the grid over time. This evolution is due to the constant changes in electricity generation and consumption, which are reflected in graph structure variations as well as line flow changes. One approach to creating a resilient segmentation is to design robust zones under various circumstances. This issue can be represented through a multiplex graph, where each layer represents a specific situation that may arise on the grid. Consequently, resilient segmentation can be achieved by conducting community detection on this multiplex graph. 
The multiplex graph is composed of multiple graphs, and all the layers share the same set of vertices. Our proposal involves a model that utilizes a unified representation to compute a flattening of all layers. This unified representation can be penalized to obtain K connected components representing the robust electrical segmentation clusters. We compare our robust segmentation to a segmentation based on a single reference situation. The robust segmentation proves its relevance by producing clusters with high intra-cluster electrical perturbation and low variance of electrical perturbation. Through our experiments, we show when robust electrical segmentation is beneficial and in which contexts.
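A much-simplified version of the flattening idea can be sketched as follows: keep only the edges that persist across enough layers, then read candidate zones off the flattened graph as connected components. The paper's penalized unified model is more involved; the six-bus grid below is invented for illustration:

```python
from collections import defaultdict

def robust_zones(n, layers, min_layers):
    """Flatten a multiplex graph: keep edges present in >= min_layers layers,
    then return connected components as candidate robust zones."""
    count = defaultdict(int)
    for edges in layers:
        for u, v in edges:
            count[frozenset((u, v))] += 1
    adj = defaultdict(set)
    for e, c in count.items():
        if c >= min_layers:
            u, v = tuple(e)
            adj[u].add(v)
            adj[v].add(u)
    seen, zones = set(), []
    for s in range(n):
        if s in seen:
            continue
        stack, comp = [s], set()
        while stack:
            x = stack.pop()
            if x in comp:
                continue
            comp.add(x)
            stack.extend(adj[x] - comp)
        seen |= comp
        zones.append(sorted(comp))
    return zones

# Three grid situations over 6 buses; the 2-3 tie line exists in only one layer
layers = [
    [(0, 1), (1, 2), (3, 4), (4, 5), (2, 3)],
    [(0, 1), (1, 2), (3, 4), (4, 5)],
    [(0, 1), (1, 2), (3, 4), (4, 5)],
]
print(robust_zones(6, layers, min_layers=3))  # -> [[0, 1, 2], [3, 4, 5]]
```

Lowering `min_layers` to 1 merges everything into one zone, which is exactly the non-robust behavior a single-situation segmentation exhibits.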

Keywords: community detection, electrical segmentation, multiplex graph, power grid

Procedia PDF Downloads 59
2739 Rescheduling of Manufacturing Flow Shop under Different Types of Disruption

Authors: M. Ndeley

Abstract:

Nowadays, almost all manufacturing facilities need to use production planning and scheduling systems to increase productivity and to reduce production costs. Real-life production operations are subject to a large number of unexpected disruptions that may invalidate the original schedules. In these cases, rescheduling is essential to minimize the impact on the performance of the system. In this work, we consider flow shop layouts, which have seldom been studied in the rescheduling literature. We generate and employ three types of disruption that interrupt the original schedules simultaneously. We develop rescheduling algorithms to accomplish the twofold objective of establishing a standard framework on the one hand, and proposing rescheduling methods that seek a good trade-off between schedule quality and stability on the other.
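One standard repair policy from the rescheduling literature, right-shift rescheduling after a machine breakdown (one of the disruption types such a framework would handle), can be sketched for a single machine as follows. The jobs and times are hypothetical:

```python
def right_shift(schedule, down_start, down_end):
    """Right-shift rescheduling on one machine after a breakdown.

    schedule: list of (job, start, end) sorted by start time.
    Operations overlapping or following the downtime are pushed later,
    preserving the original job order (which maximizes schedule stability).
    """
    out, cursor = [], down_end
    for job, start, end in schedule:
        if end <= down_start:
            out.append((job, start, end))      # finished before the breakdown
        else:
            dur = end - start
            start = max(start, cursor)
            out.append((job, start, start + dur))
            cursor = start + dur
    return out

original = [("J1", 0, 3), ("J2", 3, 5), ("J3", 5, 9)]
print(right_shift(original, down_start=4, down_end=6))  # J2, J3 pushed after downtime
```

Right-shifting preserves the sequence (high stability) at the cost of makespan; re-sequencing the remaining jobs instead trades stability for schedule quality, which is precisely the trade-off the abstract describes.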

Keywords: flow shop scheduling, uncertainty, rescheduling, stability

Procedia PDF Downloads 430
2738 Frequency Interpretation of a Wave Function, and a Vertical Waveform Treated as A 'Quantum Leap'

Authors: Anthony Coogan

Abstract:

Born’s probability interpretation of wave functions would have led to nearly identical results had he chosen a frequency interpretation instead. Logically, Born may have assumed that only one electron was under consideration, making it nonsensical to propose a frequency wave. The author’s suggestion: the actual experimental results were not of a single electron; rather, they were groups of reflected x-ray photons. The vertical waveform used by Schrödinger in his particle-in-a-box theory makes sense if it was intended to represent a quantum leap. The author extended the single vertical panel to form a bar chart: separate panels would represent different energy levels. The proposed bar chart would be populated by reflected photons. Expansion of basic ideas: part of Schrödinger’s particle-in-a-box theory may be valid despite negative criticism. The waveform used in the diagram is vertical, which may seem absurd because real waves decay at a measurable rate rather than instantaneously. However, there may be one notable exception. Supposedly, the Uncertainty Principle was derived following from the theory; may a quantum leap not be represented as an instantaneous waveform? The great Schrödinger must have had some reason to suggest a vertical waveform if the prevalent belief was that they did not exist. Complex waveforms representing a particle are usually assumed to be continuous. The actual observations made were x-ray photons, some of which had struck an electron, been reflected, and then moved toward a detector. From Born’s perspective, doing similar work in the years in question, 1926-7, he would also have considered a single electron, leading him to choose a probability distribution. Probability distributions appear very similar to frequency distributions, but the former are considered to represent the likelihood of future events.
Born’s interpretation of the results of quantum experiments led (or perhaps misled) many researchers into claiming that humans can influence events just by looking at them, e.g. collapsing complex wave functions by 'looking at the electron to see which slit it emerged from', while in reality light reflected from the electron moved in the observer’s direction after the electron had moved away. Astronomers may say that they 'look out into the universe' but are actually using logic opposed to the views of Newton, Hooke, and many observers such as Romer, in that light carries information from a source or reflector to an observer, rather than the reverse. Conclusion: due to the controversial nature of these ideas, especially their implications for the nature of complex numbers used in applications in science and engineering, some time may pass before any consensus is reached.

Keywords: complex wave functions not necessary, frequency distributions instead of wave functions, information carried by light, sketch graph of uncertainty principle

Procedia PDF Downloads 188
2737 Offline High Voltage Diagnostic Test Findings on 15MVA Generator of Basochhu Hydropower Plant

Authors: Suprit Pradhan, Tshering Yangzom

Abstract:

Even with the availability of modern online insulation diagnostic technologies like partial discharge monitoring, measurements like dissipation factor (tanδ), DC high voltage insulation current, polarization index (PI) and insulation resistance are still widely used as diagnostic tools to assess the condition of stator insulation in hydropower plants. To evaluate the condition of the stator winding insulation in one of the generators that has been in operation since 1999, diagnostic tests were performed on the stator bars of a 15 MVA generator of Basochhu Hydropower Plant. This paper presents a diagnostic study of the data gathered from measurements performed in 2015 and 2016 as part of regular maintenance, since no proper ageing data had been maintained since commissioning. Measurement results of dissipation factor, DC high potential tests and polarization index are discussed with regard to their effectiveness in assessing the ageing condition of the stator insulation. After a brief review of the theoretical background, the strengths of each diagnostic method in detecting symptoms of insulation deterioration are identified. The results observed at Basochhu Hydropower Plant lead to the conclusion that polarization index and DC high voltage insulation current measurements are best suited for the detection of humidity and contamination problems, while dissipation factor measurement is a robust indicator of long-term ageing caused by oxidative degradation.
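The polarization index discussed above is simply the ratio of the 10-minute to the 1-minute insulation resistance reading. The resistance values below are invented, and the interpretation bands are indicative only (IEEE 43 commonly recommends a minimum PI of 2.0, but exact limits depend on the machine and insulation system):

```python
def polarization_index(ir_1min_mohm, ir_10min_mohm):
    """PI = 10-minute / 1-minute insulation resistance ratio."""
    return ir_10min_mohm / ir_1min_mohm

def interpret_pi(pi):
    # Indicative bands in the spirit of IEEE 43; not prescriptive limits
    if pi < 1.0:
        return "dangerous"
    if pi < 2.0:
        return "questionable"
    return "good"

pi = polarization_index(ir_1min_mohm=450.0, ir_10min_mohm=1350.0)
print(pi, interpret_pi(pi))  # -> 3.0 good
```

Being a ratio of two readings on the same winding, PI is largely insensitive to temperature, which is part of why it remains a useful indicator of humidity and contamination problems.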

Keywords: dissipation factor (tanδ), polarization index (PI), DC high voltage insulation current, insulation resistance (IR), tan delta tip-up, dielectric absorption ratio

Procedia PDF Downloads 293
2736 Quantification and Detection of Non-Sewer Water Infiltration and Inflow in Urban Sewer Systems

Authors: M. Beheshti, S. Saegrov, T. M. Muthanna

Abstract:

Separated sewer systems are designed to transfer wastewater from households and industrial sites to wastewater treatment plants. Unwanted water in sewer systems is a well-known problem: storm-water inflow can amount to around 50% of the foul sewer flow, and groundwater infiltration can exceed 50% of the total wastewater volume in deteriorated networks. Infiltration and inflow (I/I) of non-sewer water is unfavorable in separated sewer systems; it can overload the system and reduce the efficiency of wastewater treatment plants. Moreover, I/I has negative economic, environmental, and social impacts on urban areas. Therefore, for sustainable management of urban sewer systems, I/I of unwanted water should be considered carefully, and maintenance and rehabilitation plans should be implemented for these water infrastructure assets. This study presents a methodology to identify and quantify the level of I/I into the sewer system. The amount of I/I is evaluated by accurate flow measurement in separated sewer systems for specified isolated catchments in the city of Trondheim (Norway). Advanced information about the characteristics of I/I is gained by CCTV inspection of sewer pipelines with high I/I contribution. Enhanced knowledge about the detection and localization of non-sewer water in foul sewers during wet and dry weather conditions makes it possible to identify and prioritize sewer system problems and to take decisions for long-term rehabilitation and renewal planning. Furthermore, preventive measures and optimization of sewer system functionality and efficiency can be carried out through maintenance, so that existing pipelines are operated and rehabilitated in a more practical, cost-effective, and environmentally friendly way.
This study is conducted on specified catchments with different properties in the city of Trondheim. Risvollan catchment, which has a good database and a measuring station that records hydrological parameters throughout the year, is one of them. For assessing infiltration in a separated sewer system, flow rate measurements can be used to obtain a general view of the network condition from the infiltration point of view. This study discusses commonly used and advanced methods of localizing and quantifying I/I in sewer systems. A combination of these methods gives sewer operators the possibility to compare different techniques and to obtain reliable and accurate I/I data, which is vital for long-term rehabilitation plans.

Keywords: flow rate measurement, infiltration and inflow (I/I), non-sewer water, separated sewer systems, sustainable management

Procedia PDF Downloads 311
2735 Fatigue Crack Growth Rate Measurement by Means of Classic Method and Acoustic Emission

Authors: V. Mentl, V. Koula, P. Mazal, J. Volák

Abstract:

Nowadays, acoustic emission is a widely recognized method of material damage investigation, mainly for the observation and evaluation of crack initiation and growth. This is highly important in structures such as pressure vessels and large steam turbine rotors, applied in both classic and nuclear power plants. Nevertheless, acoustic emission signals must be correlated with the real crack progress so that cracks and their growth can be evaluated by this non-destructive technique alone in real situations, and so that reliable results are reached when the safety and reliability of a structure are assessed and its remaining lifetime is evaluated. The main aim of this study was to propose a methodology for evaluating the early manifestations of fatigue cracks and their growth, and thus to quantify material damage by acoustic emission parameters. Specimens made of several steels used in the power industry were subjected to fatigue loading in the low- and high-cycle regimes. This study presents results of crack growth rate measurement obtained by the classic compliance-change method and by acoustic emission signal analysis. The experiments were realized in cooperation between laboratories of Brno University of Technology and the University of West Bohemia in Pilsen within the project of the Czech Ministry of Industry and Commerce "A diagnostic complex for the detection of pressure media and material defects in pressure components of nuclear and classic power plants" and the project "New Technologies for Mechanical Engineering".

Keywords: fatigue, crack growth rate, acoustic emission, material damage

Procedia PDF Downloads 358
2734 Measuring Satisfaction with Life Construct Among Public and Private University Students During COVID-19 Pandemic in Sabah, Malaysia

Authors: Mohd Dahlan Abdul Malek, Muhamad Idris, Adi Fahrudin, Ida Shafinaz Mohamed Kamil, Husmiati Yusuf, Edeymend Reny Japil, Wan Anor Wan Sulaiman, Lailawati Madlan, Alfred Chan, Nurfarhana Adillah Aftar, Mahirah Masdin

Abstract:

This research intended to develop a valid and reliable instrument, the Satisfaction with Life Scale (SWLS), to measure the satisfaction with life (SWL) construct among public and private university students in Sabah, Malaysia, through an exploratory factor analysis (EFA) procedure. The pilot study obtained a sample of 108 students from public and private education institutions in Sabah, Malaysia, through an online survey using a self-administered questionnaire. The researchers performed the EFA procedure on the SWL construct using IBM SPSS 25. Bartlett's test of sphericity is highly significant (Sig. = .000), and the sampling adequacy by the Kaiser-Meyer-Olkin measure (KMO = 0.839) is excellent. Using Principal Component Analysis (PCA) with varimax rotation as the extraction method, one component of the SWL construct is extracted, with an eigenvalue of 3.101; the variance explained by this component is 62.030%. The SWL construct has a Cronbach's alpha of .817. The scale development and validation confirmed that the instrument is consistent and stable across both private and public college and university student samples. It adds a remarkable contribution to the measurement of the SWLS, mainly in the context of higher education students. The EFA outcomes formed a configuration that extracts one SWL component, which can be measured by the original five items established in this research. This research reveals that the SWL construct is applicable to this study population.
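The extraction step reported above (eigen-decomposition of the inter-item correlation matrix, plus an internal-consistency check) can be sketched outside SPSS with NumPy. The five-item responses below are fabricated for illustration and are not the Sabah sample:

```python
import numpy as np

# Hypothetical responses of six students to the five SWLS items (7-point scale);
# illustrative only, not the actual survey data.
X = np.array([
    [5, 6, 5, 6, 5],
    [3, 3, 4, 3, 2],
    [6, 7, 6, 6, 7],
    [2, 2, 3, 2, 1],
    [4, 5, 4, 5, 4],
    [7, 6, 7, 7, 6],
], dtype=float)

# PCA on the inter-item correlation matrix, as in the EFA extraction step.
R = np.corrcoef(X, rowvar=False)                  # 5x5 correlation matrix
eigvals = np.sort(np.linalg.eigvalsh(R))[::-1]    # eigenvalues, descending
variance_explained = eigvals[0] / eigvals.sum()   # one-component solution

# Cronbach's alpha for internal consistency.
k = X.shape[1]
alpha = k / (k - 1) * (1 - X.var(axis=0, ddof=1).sum() / X.sum(axis=1).var(ddof=1))

print(f"first eigenvalue {eigvals[0]:.2f}, "
      f"variance explained {variance_explained:.1%}, alpha {alpha:.2f}")
```

A one-component solution is indicated when only the first eigenvalue exceeds 1 (Kaiser's criterion); rotation is irrelevant in that single-factor case.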

Keywords: satisfaction, university students, measurement, scale development

Procedia PDF Downloads 70
2733 Quality of Life and Self-Assessed Health of Methadone-Maintained Opiate Addicts

Authors: Brajevic-gizdic Igna, Vuletic Gorka

Abstract:

Introduction: Research on opiate addiction increasingly indicates the importance of substitution therapy in opiate addicts. Opiate addiction is a chronic relapsing disease whose diagnostic criteria include craving, defined as a strong desire with an excessive need to take a substance, which has been considered a predictor of relapse. The study aimed to measure the intensity of craving using a visual analog scale (VAS) in opioid addicts on opioid substitution therapy (OST). Method: The total sample comprised 30 participants in outpatient treatment. Two groups of opiate addicts were considered: methadone-maintained and buprenorphine-maintained addicts. The participants completed the survey questionnaire during outpatient treatment. Results: The results indicated high levels of craving in patients during OST, which is considered an important destabilization factor in abstinence; the methadone/buprenorphine dose should therefore be reconsidered. Conclusion: These findings provide an objective basis for methadone/buprenorphine dosage and therapy decisions. Underdosing of OST can put patients at high risk of relapse, resulting in high levels of craving. Thus, when determining the therapeutic dose of OST, it is crucial to consider patients' craving; this would achieve stabilization more quickly and avoid relapse. At present, subjective physician assessment and the patient's statement are the main criteria for determining OST dosage. Future studies should use larger sample sizes and focus on the importance of craving intensity measurement during OST to objectify methadone/buprenorphine dosage.

Keywords: abstinence, addicts, methadone, OST, quality of life

Procedia PDF Downloads 83
2732 Weibull Cumulative Distribution Function Analysis with Life Expectancy Endurance Test Result of Power Window Switch

Authors: Miky Lee, K. Kim, D. Lim, D. Cho

Abstract:

This paper presents the planning, rationale for test specification derivation, sampling requirements, test facilities, and result analysis used to conduct lifetime expectancy endurance tests on power window switches (PWS), considering thermally induced mechanical stress under diurnal cyclic temperatures during normal operation (power cycling). The detailed process of analysis and the test results on the selected PWS set are discussed in this paper. A statistical approach to lifetime expectancy was added to the measurement standards dealing with PWS lifetime determination through endurance tests, and the choice of approach, within the framework of the task, is explained. The present task was dedicated to voltage drop measurement to derive lifetime expectancy, whereas others mostly consider contact or surface resistance; the measurements to perform and the main instruments used are fully described accordingly. The failure data from the tests were analyzed to conclude lifetime expectancy through a statistical method using the Weibull cumulative distribution function. The first goal of this task is to develop a realistic worst-case lifetime endurance test specification, because the existing switch test standards cannot induce the degradation mechanisms that make the switches less reliable. The second goal is to assess the quantitative reliability status of currently manufactured PWS based on the test specification newly developed through this project. The last and most important goal is to satisfy customers' requirements regarding product reliability.
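The Weibull analysis step named above can be sketched with the common median-rank linearization of the Weibull CDF, F(t) = 1 − exp(−(t/η)^β). The cycles-to-failure values below are fabricated placeholders, not the project's test data:

```python
import numpy as np

# Hypothetical cycles-to-failure for eight switches (illustrative values only).
failures = np.sort(np.array([41_000, 55_000, 63_000, 72_000,
                             80_000, 91_000, 104_000, 120_000], dtype=float))
n = len(failures)

# Median-rank estimate of the empirical CDF (Bernard's approximation).
i = np.arange(1, n + 1)
F = (i - 0.3) / (n + 0.4)

# Linearize F(t) = 1 - exp(-(t/eta)^beta):
#   ln(-ln(1 - F)) = beta * ln(t) - beta * ln(eta)
x = np.log(failures)
y = np.log(-np.log(1.0 - F))
beta, intercept = np.polyfit(x, y, 1)      # slope is the shape parameter
eta = np.exp(-intercept / beta)            # characteristic life (63.2% failed)

b10 = eta * (-np.log(1 - 0.10)) ** (1 / beta)   # B10 life: 10% failure point
print(f"shape beta = {beta:.2f}, scale eta = {eta:.0f} cycles, B10 = {b10:.0f}")
```

Maximum-likelihood estimation is an alternative to this rank-regression fit; both yield the shape and scale parameters from which percentile lives such as B10 follow.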

Keywords: power window switch, endurance test, Weibull function, reliability, degradation mechanism

Procedia PDF Downloads 220
2731 Credit Risk Evaluation of Dairy Farming Using Fuzzy Logic

Authors: R. H. Fattepur, Sameer R. Fattepur, D. K. Sreekantha

Abstract:

Dairy farming is one of the key industries in India, which is the leading producer and also the largest consumer of milk and milk-based products in the world. In this paper, we have attempted to replace the human expert and to develop an artificial expert system prototype to increase the speed and accuracy of decision-making in dairy farming credit risk evaluation. Fuzzy logic is used for dealing with uncertain, vague, and acquired knowledge, and a fuzzy rule base is used for representing this knowledge to build an effective expert system.
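A fuzzy rule base of the kind described can be sketched in a few lines. The inputs (milk-yield score, repayment-history score), membership shapes, and rule outputs below are invented for illustration and do not reproduce the authors' knowledge base:

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def credit_risk(yield_score, history_score):
    """Mamdani-style risk score in [0, 1] from two 0-10 inputs (hypothetical)."""
    # Fuzzify the inputs.
    yield_low, yield_high = tri(yield_score, -1, 0, 5), tri(yield_score, 5, 10, 11)
    hist_poor, hist_good = tri(history_score, -1, 0, 5), tri(history_score, 5, 10, 11)

    # Rule base (min for AND, max for OR):
    #   R1: IF yield is high AND history is good THEN risk is low
    #   R2: IF yield is low  OR  history is poor THEN risk is high
    risk_low = min(yield_high, hist_good)
    risk_high = max(yield_low, hist_poor)

    # Defuzzify by a weighted average of rule consequents (low=0.2, high=0.8).
    if risk_low + risk_high == 0:
        return 0.5
    return (risk_low * 0.2 + risk_high * 0.8) / (risk_low + risk_high)

print(credit_risk(9.0, 9.0))   # strong farm profile -> low risk
print(credit_risk(1.0, 1.0))   # weak farm profile -> high risk
```

A production system would add more linguistic variables (herd size, collateral, cash flow) and more rules, but the fuzzify-infer-defuzzify pipeline stays the same.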

Keywords: expert system, fuzzy logic, knowledge base, dairy farming, credit risk

Procedia PDF Downloads 344
2730 Analysis of Vibration of Thin-Walled Parts During Milling Made of EN AW-7075 Alloy

Authors: Jakub Czyżycki, Paweł Twardowski

Abstract:

Thin-walled components made of aluminum alloys are increasingly found in many fields of industry, and they dominate the aerospace industry. The machining of thin-walled structures encounters many difficulties related to the high flexibility of the workpiece, which causes vibrations, including the most unfavorable ones, called chatter. The effect of these phenomena is difficulty in obtaining the required geometric dimensions and surface quality. The purpose of this study is to analyze vibrations arising during the machining of thin-walled workpieces made of aluminum alloy EN AW-7075. Samples representing actual thin-walled workpieces were examined over a range of dimensions characteristic of thin-walled components. The tests were carried out under high-speed machining (HSM) conditions (cutting speed vc = 1400 m/min) using a monolithic solid carbide end mill. Vibration was measured using a single-component piezoelectric accelerometer 4508C from Brüel & Kjær, mounted directly on the sample before machining; the measurement was made in the normal feed direction AfN. In addition, the natural frequency of the tested thin-walled components was investigated using a laser vibrometer for a broader analysis of the tested samples. The effect of vibrations on machining accuracy is presented in the form of surface images taken with an optical measuring device from Alicona. The vibrations produced during the tests were classified and analyzed in both the time and frequency domains. A significant influence of the thickness of the thin-walled component on the course of vibrations during machining was observed.
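The frequency-domain part of such an analysis typically means taking the FFT of the accelerometer record and reading off the dominant spectral peak. A minimal sketch, using a synthetic signal (a 1.2 kHz mode plus noise) as a stand-in for a real AfN-direction measurement; the sampling rate and mode frequency are assumptions, not values from the study:

```python
import numpy as np

fs = 10_000                       # sampling rate [Hz] (illustrative)
t = np.arange(0, 0.1, 1 / fs)     # 0.1 s record

# Synthetic accelerometer signal: one structural mode at 1200 Hz plus noise.
rng = np.random.default_rng(0)
signal = np.sin(2 * np.pi * 1200 * t) + 0.2 * rng.standard_normal(t.size)

# One-sided amplitude spectrum and the dominant frequency (skip the DC bin).
spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(t.size, 1 / fs)
dominant = freqs[np.argmax(spectrum[1:]) + 1]
print(f"dominant frequency: {dominant:.0f} Hz")
```

Comparing the dominant machining-vibration frequency with the natural frequencies found by the laser vibrometer is what distinguishes forced vibration from chatter near a structural mode.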

Keywords: high-speed machining, thin-walled elements, thin-walled components, milling, vibrations

Procedia PDF Downloads 31
2729 First Formaldehyde Retrieval Using the Raw Data Obtained from Pandora in Seoul: Investigation of the Temporal Characteristics and Comparison with Ozone Monitoring Instrument Measurement

Authors: H. Lee, J. Park

Abstract:

In this study, for the first time, we retrieved the formaldehyde (HCHO) vertical column density (HCHOVCD) using Pandora instruments in Seoul, a megacity in northeast Asia, for the period between 2012 and 2014, and investigated its temporal characteristics. The HCHO slant column density (HCHOSCD) was obtained using the Differential Optical Absorption Spectroscopy (DOAS) method and converted to HCHOVCD using the geometric air mass factor (AMFG), as Pandora is a direct-sun measurement. HCHOVCD is low at 12:00 local time (LT) and high in the morning (10:00 LT) and late afternoon (16:00 LT), except in winter. The maximum (minimum) values of Pandora HCHOVCD are 2.68×10¹⁶ (1.63×10¹⁶), 3.19×10¹⁶ (2.23×10¹⁶), 2.00×10¹⁶ (1.26×10¹⁶), and 1.63×10¹⁶ (0.82×10¹⁶) molecules cm⁻² in spring, summer, autumn, and winter, respectively. In terms of seasonal variation, HCHOVCD was high in summer and low in winter, which implies that photo-oxidation plays an important role in HCHO production in Seoul. In comparison with Ozone Monitoring Instrument (OMI) measurements, the HCHOVCDs from the OMI are lower than those from Pandora; the correlation coefficient (R) between monthly HCHOVCD values from Pandora and OMI is 0.61, with a slope of 0.35. Furthermore, to understand the HCHO mixing ratio within the planetary boundary layer (PBL) in Seoul, we converted Pandora HCHOVCDs to HCHO mixing ratios in the PBL using several meteorological inputs from the Atmospheric InfraRed Sounder (AIRS). Seasonal HCHO mixing ratios in the PBL converted from Pandora (OMI) HCHOVCDs are estimated to be 6.57 (5.17), 7.08 (6.68), 7.60 (4.70), and 5.00 (4.76) ppbv in spring, summer, autumn, and winter, respectively.
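For a direct-sun instrument the geometric air mass factor is simply the secant of the solar zenith angle, so the SCD-to-VCD conversion is a one-liner. A minimal sketch; the column value and zenith angle below are illustrative, not retrieved Seoul data:

```python
import math

def geometric_amf(sza_deg):
    """Geometric air mass factor for direct-sun viewing: AMF_G = sec(SZA)."""
    return 1.0 / math.cos(math.radians(sza_deg))

def scd_to_vcd(scd, sza_deg):
    """Convert a slant column density to a vertical column density."""
    return scd / geometric_amf(sza_deg)

# Illustrative: an HCHO SCD of 4.0e16 molecules cm^-2 at a 60 deg solar zenith
# angle gives AMF_G = 2 and hence a VCD of 2.0e16 molecules cm^-2.
vcd = scd_to_vcd(4.0e16, 60.0)
print(f"{vcd:.2e}")
```

The plain secant form neglects Earth curvature and refraction, which matters only at very large zenith angles; scattered-light geometries would instead require radiative-transfer-based AMFs.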

Keywords: formaldehyde, OMI, Pandora, remote sensing

Procedia PDF Downloads 139
2728 A Two-Week and Six-Month Stability of Cancer Health Literacy Classification Using the CHLT-6

Authors: Levent Dumenci, Laura A. Siminoff

Abstract:

Health literacy has been shown to predict a variety of health outcomes. Reliable identification of persons with limited cancer health literacy (LCHL) has proved questionable with existing instruments, which use an arbitrary cut point along a continuum. The CHLT-6, however, uses a latent mixture modeling approach to identify persons with LCHL. The purpose of this study was to estimate the two-week and six-month stability of identifying persons with LCHL using the CHLT-6, with a discrete latent variable approach as the underlying measurement structure. Using a test-retest design, the CHLT-6 was administered to cancer patients at two-week (N=98) and six-month (N=51) intervals. The two-week and six-month latent test-retest agreements were 89% and 88%, respectively. The chance-corrected latent agreements estimated from Dumenci’s latent kappa were 0.62 (95% CI: 0.41–0.82) and 0.47 (95% CI: 0.14–0.80) for the two-week and six-month intervals, respectively. High levels of latent test-retest agreement between the limited and adequate categories of the cancer health literacy construct, coupled with moderate to good levels of chance-corrected latent agreement, indicate that the CHLT-6 classification of limited versus adequate cancer health literacy is relatively stable over time. In conclusion, the measurement structure underlying the instrument allows for estimating classification errors, circumventing limitations of the arbitrary cut-point approaches adopted by other instruments. The CHLT-6 can be used to identify persons with LCHL in oncology clinics and in intervention studies to accurately estimate treatment effectiveness.
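The idea of chance-corrected agreement can be illustrated with classical Cohen's kappa on a 2x2 test-retest table; note this is only the simpler manifest-level analogue of Dumenci's latent kappa, which additionally corrects for measurement error through a latent-class model. The table entries below are fabricated for illustration, not the study's data:

```python
def cohen_kappa(table):
    """Chance-corrected agreement for a 2x2 test-retest classification table.
    po = observed agreement, pe = agreement expected by chance from margins."""
    n = sum(sum(row) for row in table)
    po = sum(table[i][i] for i in range(2)) / n
    pe = sum((sum(table[i]) / n) * (sum(row[i] for row in table) / n)
             for i in range(2))
    return (po - pe) / (1 - pe)

# Hypothetical counts: limited/adequate at time 1 (rows) vs time 2 (columns).
table = [[20, 5],
         [6, 67]]
print(round(cohen_kappa(table), 2))
```

Latent-variable agreement indices replace the observed table with the cross-classification of latent class memberships, which is why they can exceed manifest kappa when item-level error is substantial.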

Keywords: limited cancer health literacy, the CHLT-6, discrete latent variable modeling, latent agreement

Procedia PDF Downloads 163
2727 Measuring Systems Interoperability: A Focal Point for Standardized Assessment of Regional Disaster Resilience

Authors: Joel Thomas, Alexa Squirini

Abstract:

The key argument of this research is that every element of systems interoperability is an enabler of regional disaster resilience, and arguably should become a focal point for standardized measurement of communities’ ability to work together. Few resilience research efforts have focused on the development and application of solutions that measurably improve communities’ ability to work together at a regional level, yet a majority of the most devastating and disruptive disasters are those that have had a regional impact. The key findings of the research include a unique theoretical, mathematical, and operational approach to tangibly and defensibly measure and assess the systems interoperability required to support crisis information management activities performed by governments, the private sector, and humanitarian organizations. One of the most effective ways for communities to measurably improve regional disaster resilience is through deliberately executed disaster preparedness activities. Developing interoperable crisis information management capabilities is a cross-cutting preparedness activity that greatly affects a community’s readiness and ability to work together in times of crisis. Thus, improving communities’ human and technical posture to work together in advance of a crisis, with the ultimate goal of enabling information sharing to support coordination and the careful management of available resources, is a primary means by which communities may improve regional disaster resilience. This model describes how systems interoperability can be qualitatively and quantitatively assessed when characterized as five forms of capital: governance; standard operating procedures; technology; training and exercises; and usage. 
The unique measurement framework presented defines the relationships between systems interoperability, information sharing and safeguarding, operational coordination, community preparedness and regional disaster resilience, and offers a means by which to implement real-world solutions and measure progress over the course of a multi-year program. The model is being developed and piloted in partnership with the U.S. Department of Homeland Security (DHS) Science and Technology Directorate (S&T) and the North Atlantic Treaty Organization (NATO) Advanced Regional Civil Emergency Coordination Pilot (ARCECP) with twenty-three organizations in Bosnia and Herzegovina, Croatia, Macedonia, and Montenegro. The intended effect of the model implementation is to enable communities to answer two key questions: 'Have we measurably improved crisis information management capabilities as a result of this effort?' and, 'As a result, are we more resilient?'

Keywords: disaster, interoperability, measurement, resilience

Procedia PDF Downloads 123
2726 Radiation Protection Assessment of the Emission of a d-t Neutron Generator: Simulations with MCNP Code and Experimental Measurements in Different Operating Conditions

Authors: G. M. Contessa, L. Lepore, G. Gandolfo, C. Poggi, N. Cherubini, R. Remetti, S. Sandri

Abstract:

Practical guidelines are provided in this work for the safe use of a portable d-t Thermo Scientific MP-320 neutron generator producing pulsed 14.1 MeV neutron beams. The neutron generator’s emission was tested experimentally and reproduced with the MCNPX Monte Carlo code. The simulations were particularly accurate: even the generator’s internal components were reproduced on the basis of ad hoc collected X-ray radiographic images. Measurement campaigns were conducted under different standard experimental conditions using an LB 6411 neutron detector properly calibrated at three different energies, and simulated and experimental data were compared. In order to estimate the dose to the operator as a function of the operating conditions and the energy spectrum, the most appropriate value of the conversion factor between neutron fluence and ambient dose equivalent was identified, taking into account both direct and scattered components. The results of the simulations show that, in real situations where there is no information about the neutron spectrum at the point where the dose has to be evaluated, it is possible - and in any case conservative - to convert the measured count rate by means of the conversion factor corresponding to 14 MeV energy. This outcome has general value when using this type of generator, enabling a more accurate design of experimental activities in different setups. The increasingly widespread use of this type of device for industrial and medical applications makes the results of this work of interest in different situations, especially as a support for the definition of appropriate radiation protection procedures and, in general, for risk analysis.
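The fluence-to-dose conversion described above amounts to multiplying a fluence rate by a single energy-dependent coefficient. A minimal sketch; the ~520 pSv·cm² coefficient used here is an illustrative round value near the tabulated 14 MeV figure, and actual work should take the coefficient from the ICRP/ICRU tables as the authors did:

```python
# Fluence-to-H*(10) conversion coefficient near 14 MeV (illustrative value;
# consult the tabulated ICRP/ICRU coefficients for real assessments).
H_COEFF_14MEV = 520e-12  # Sv cm^2 per neutron

def ambient_dose_rate(fluence_rate_cm2_s):
    """Ambient dose equivalent rate H*(10) in uSv/h from a neutron fluence
    rate given in n cm^-2 s^-1, using the conservative 14 MeV coefficient."""
    return fluence_rate_cm2_s * H_COEFF_14MEV * 3600 * 1e6

# Example: 100 n cm^-2 s^-1 at the operator position.
print(f"{ambient_dose_rate(100.0):.1f} uSv/h")
```

Using the 14 MeV coefficient on an unknown (partly scattered, hence softer) spectrum overestimates the dose, which is why the paper calls the choice conservative.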

Keywords: instrumentation and monitoring, management of radiological safety, measurement of individual dose, radiation protection of workers

Procedia PDF Downloads 117
2725 Reliability Analysis of Partial Safety Factor Design Method for Slopes in Granular Soils

Authors: K. E. Daryani, H. Mohamad

Abstract:

Uncertainties in geo-structural analysis and design have a significant impact on the safety of slopes. Traditionally, uncertainties in geotechnical design are addressed by incorporating a conservative factor of safety in the analytical model. In this paper, a risk-based approach is adopted to assess the influence of geotechnical variable uncertainties on the stability of infinite slopes in cohesionless soils, using the “partial factor of safety on shear strength” approach as stated in Eurocode 7. Analyses conducted using Monte Carlo simulation show that the same partial factor can correspond to very different levels of risk, depending on the degree of uncertainty in the mean values of the soil friction angle and void ratio.
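The Monte Carlo idea can be sketched for the simplest case, a dry infinite slope in sand, where the factor of safety is FS = tan(φ)/tan(β) and failure means FS < 1. The friction-angle distribution below (normal, with an assumed coefficient of variation) is an illustrative modeling choice, not the paper's calibrated input:

```python
import math
import random

random.seed(42)

def prob_failure(phi_mean_deg, phi_cov, beta_deg, n=100_000):
    """Monte Carlo probability of failure for a dry infinite slope in sand.
    FS = tan(phi) / tan(beta); failure when FS < 1. phi is sampled as normal
    with the given mean and coefficient of variation (illustrative choice;
    a lognormal model is also common for friction angles)."""
    tan_beta = math.tan(math.radians(beta_deg))
    failures = 0
    for _ in range(n):
        phi = random.gauss(phi_mean_deg, phi_cov * phi_mean_deg)
        if math.tan(math.radians(phi)) / tan_beta < 1.0:
            failures += 1
    return failures / n

# Same 30 deg slope and 35 deg mean friction angle, two uncertainty levels:
pf_low = prob_failure(35.0, 0.05, 30.0)    # low parameter uncertainty
pf_high = prob_failure(35.0, 0.15, 30.0)   # high parameter uncertainty
print(pf_low, pf_high)
```

Both runs share the same deterministic factor of safety (about 1.21), yet the failure probabilities differ by roughly two orders of magnitude, which is exactly the paper's point about a fixed partial factor implying very different risks.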

Keywords: safety, probability of failure, reliability, infinite slopes, sand

Procedia PDF Downloads 559
2724 Manipulator Development for Telediagnostics

Authors: Adam Kurnicki, Bartłomiej Stanczyk, Bartosz Kania

Abstract:

This paper presents the development of a lightweight manipulator with series elastic actuation for medical telediagnostics (ultrasound examination). The general structure of the implemented impedance control algorithm is shown, and it is described how force measurements can be performed based mainly on the elasticity of the manipulator links.

Keywords: telediagnostics, elastic manipulator, impedance control, force measurement

Procedia PDF Downloads 455
2723 Statistical Assessment of Models for Determination of Soil–Water Characteristic Curves of Sand Soils

Authors: S. J. Matlan, M. Mukhlisin, M. R. Taha

Abstract:

Characterization of the engineering behavior of unsaturated soil depends on the soil-water characteristic curve (SWCC), a graphical representation of the relationship between water content or degree of saturation and soil suction. A reasonable description of the SWCC is thus important for the accurate prediction of unsaturated soil parameters. The measurement procedures for determining the SWCC, however, are difficult, expensive, and time-consuming. During the past few decades, researchers have therefore focused on developing empirical equations for predicting the SWCC, and a large number of empirical models have been suggested. One of the most crucial questions is how precisely existing equations can represent the SWCC. As different models have different ranges of capability, it is essential to evaluate the precision of the SWCC models for each particular soil type, and better estimation is expected to be achieved via a thorough statistical analysis of the curve's distribution within a particular soil class. With this in view, a statistical analysis was conducted in order to evaluate the reliability of SWCC prediction models against laboratory measurement. Optimization techniques were used to obtain the best fit of the model parameters in four forms of SWCC equation, using laboratory data for relatively coarse-textured (i.e., sandy) soil. The four most prominent SWCC models were evaluated and computed for each sample. The results show that the Brooks and Corey model is the most consistent in describing the SWCC for the sand soil type; its predictions were also compatible with the samples evaluated in this study, over a range from low to high soil water content.
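The Brooks and Corey fit can be sketched via its usual log-linearization of Se = (ψ_b/ψ)^λ for suctions above the air-entry value ψ_b. The suction-saturation pairs below are fabricated to resemble a sand, not the study's laboratory data:

```python
import numpy as np

# Illustrative suction (kPa) vs effective saturation data for a sand.
psi = np.array([2.0, 4.0, 8.0, 16.0, 32.0, 64.0])
Se = np.array([0.95, 0.62, 0.40, 0.26, 0.17, 0.11])

# Brooks-Corey: Se = (psi_b / psi)^lambda for psi > psi_b, Se = 1 otherwise.
# Linearize: ln(Se) = lambda * ln(psi_b) - lambda * ln(psi)
slope, intercept = np.polyfit(np.log(psi), np.log(Se), 1)
lam = -slope                        # pore-size distribution index
psi_b = np.exp(intercept / lam)     # air-entry suction [kPa]

def brooks_corey(p):
    """Fitted effective saturation at suction p (kPa)."""
    return np.minimum(1.0, (psi_b / p) ** lam)

print(f"lambda = {lam:.2f}, air-entry suction = {psi_b:.2f} kPa")
```

Nonlinear least squares on the untransformed curve (as the optimization step in the study presumably used) gives slightly different parameters, since log-transforming reweights the residuals toward the dry end.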

Keywords: soil-water characteristic curve (SWCC), statistical analysis, unsaturated soil, geotechnical engineering

Procedia PDF Downloads 325
2722 Development of a Smart System for Measuring Strain Levels of Natural Gas and Petroleum Pipelines on Earthquake Fault Lines in Turkiye

Authors: Ahmet Yetik, Seyit Ali Kara, Cevat Özarpa

Abstract:

Load changes occur on natural gas and oil pipelines due to natural disasters. The source of this load change is the displacement of the soil around the pipes in situations that may cause erosion, such as earthquakes, landslides, and floods. Exposure of natural gas and oil pipes to variable loads causes deformation, cracks, and breaks in these pipes, which in turn cause damage to people and the environment through, for example, explosions. Examinations made after natural disasters readily show which of the pipes in the monitored regions suffered more damage, and it has been determined that earthquakes in Turkey have caused permanent damage to pipelines. This project was designed and realized because cracks and gas leaks had been found in the insulation gaskets placed in the pipelines, especially at the junction points. In this study, a new SCADA (Supervisory Control and Data Acquisition) application has been developed to monitor load changes caused by natural disasters. The newly developed SCADA application monitors the changes along the x, y, and z axes of the stresses occurring in the pipes with the help of strain gauge sensors placed on them. For the developed SCADA system, test setups in accordance with the standards were created during the fieldwork and integrated into the system, and the system was followed up. Thanks to the SCADA system developed with this field application, the load changes that occur on natural gas and oil pipes are monitored instantly, accumulations that may load the pipes and their surroundings are addressed immediately, and new risks that may arise are prevented. The system has contributed to energy supply security, asset management, holistic pipeline management, and sustainability.
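At the sensor end, such a system converts each strain-gauge resistance change into strain and then into a stress estimate. A minimal uniaxial sketch under illustrative assumptions (a typical foil gauge factor and pipeline-steel modulus), not the project's actual calibration:

```python
# Sketch of converting a strain-gauge reading to pipe-wall stress.
# Both constants are illustrative assumptions, not project calibration data.
GAUGE_FACTOR = 2.0      # typical metallic foil gauge
E_STEEL = 210e9         # Young's modulus of pipeline steel [Pa]

def strain_from_ratio(dR_over_R):
    """Engineering strain from the gauge's relative resistance change."""
    return dR_over_R / GAUGE_FACTOR

def axial_stress(dR_over_R):
    """Uniaxial stress estimate [MPa] via Hooke's law, sigma = E * epsilon."""
    return E_STEEL * strain_from_ratio(dR_over_R) / 1e6

# A 0.1% resistance change corresponds to 500 microstrain of the pipe wall.
print(f"{axial_stress(0.001):.1f} MPa")
```

A SCADA front end would apply this conversion per axis (x, y, z rosette channels), compare the result against alarm thresholds, and log the time series for trend analysis.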

Keywords: earthquake, natural gas pipes, oil pipes, strain measurement, stress measurement, landslide

Procedia PDF Downloads 59
2721 A Methodology for the Identification of Technological Gaps and the Measurement of the Level of Technological Acceptance in the Rural Sector in Colombia

Authors: Anyi Katherine Garzon Robles, Luis Carlos Gomez Florez

Abstract:

Since the advent of the Internet, the use of Information Technologies (IT) has increased exponentially. The field of informatics and telecommunications has opened countless possibilities for the development of different socio-economic activities, promoting a change of social paradigm and the emergence of the so-called information and knowledge society. For more than a decade, the Colombian government has been working on the incorporation of IT into the public sector through an e-government strategy. However, to our knowledge, many technological gaps in the country have not yet been identified, especially in rural areas far from large cities, where factors such as low investment and the expansion of the armed conflict have led to economic and technological stagnation. This paper presents the results obtained from the execution of a research project that followed a qualitative approach and a participatory action research methodological design. This design consists of nine fundamental stages divided into four work cycles, for which different strategies for data collection and analysis were established. From this, a methodology was obtained for the identification of technological gaps and the measurement of the level of technological acceptance in the rural sector, based on the Technology Acceptance Model (TAM), as an activity preceding the development of IT solutions framed in the e-government strategy in Colombia. The result of this research work represents a contribution from academia to the improvement of the country's technological development and a guide for the proper planning of IT solutions aimed at promoting a close relationship between government and citizens.

Keywords: E-government, knowledge society, level of technological acceptance, technological gaps, technology acceptance model

Procedia PDF Downloads 222