Search results for: coordinate measuring machines (CMM)
2025 Power Quality Audit Using Fluke Analyzer
Authors: N. Ravikumar, S. Krishnan, B. Yokeshkumar
Abstract:
Nowadays, power quality issues are increasing due to non-linear loads such as refrigerators, air conditioners, washing machines, and induction motors. These issues affect the output voltage, output current, and output power, and hence the overall performance of the generator. This paper explains how to test a generator using the Fluke 435 Series II power quality analyser, which measures voltage, current, power, energy, total harmonic distortion (THD), current harmonics, voltage harmonics, power factor, and frequency. The analyser has several advantages: (i) it records output in both analog and digital format; (ii) it records a reading every 0.25 s; (iii) it measures all electrical parameters at the same time.
Keywords: THD, harmonics, power quality, TNEB, Fluke 435
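For reference, the THD figure reported by analysers of this class is conventionally the RMS sum of the harmonic components divided by the RMS of the fundamental. A minimal sketch of that textbook computation (an illustration, not the Fluke's internal algorithm):

```python
import numpy as np

def thd(signal, fs, f0, n_harmonics=40):
    """Total harmonic distortion: RMS of harmonics 2..n over the fundamental."""
    spectrum = np.abs(np.fft.rfft(signal)) / len(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    def amp(f):  # amplitude of the bin nearest frequency f
        return spectrum[np.argmin(np.abs(freqs - f))]
    fundamental = amp(f0)
    harmonics = np.sqrt(sum(amp(k * f0) ** 2 for k in range(2, n_harmonics + 1)))
    return harmonics / fundamental

# Example: a 50 Hz sine carrying a 10% third harmonic -> THD close to 0.10
fs = 10_000
t = np.arange(0, 1, 1 / fs)
v = np.sin(2 * np.pi * 50 * t) + 0.1 * np.sin(2 * np.pi * 150 * t)
print(thd(v, fs, 50))
```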
Procedia PDF Downloads 177
2024 Bacterial Exposure and Microbial Activity in Dental Clinics during Cleaning Procedures
Authors: Atin Adhikari, Sushma Kurella, Pratik Banerjee, Nabanita Mukherjee, Yamini M. Chandana Gollapudi, Bushra Shah
Abstract:
Different sharp instruments, drilling machines, and high speed rotary instruments are routinely used in dental clinics during dental cleaning. These cleaning procedures therefore release many oral microorganisms, including bacteria, into clinic air and may cause significant occupational bioaerosol exposure risks for dentists, dental hygienists, patients, and dental clinic employees. Two major goals of this study were to quantify volumetric airborne concentrations of bacteria and to assess overall microbial activity in this type of occupational environment. The study was conducted in several dental clinics of southern Georgia, and 15 dental cleaning procedures were targeted for sampling of airborne bacteria and testing of overall microbial activity in settled dust on clinic floors. For air sampling, a Biostage viable cascade impactor was utilized, which comprises an inlet cone, a precision-drilled 400-hole impactor stage, and a base that holds an agar plate (Tryptic soy agar). A high-flow Quick-Take-30 pump connected to this impactor pulls airborne microorganisms at a 28.3 L/min flow rate through the holes (jets), where they are collected on the agar surface for approx. five minutes. After sampling, agar plates containing the samples were placed in an ice chest with blue ice, and the plates were incubated at 30±2°C for 24 to 72 h. Colonies were counted and converted to airborne concentrations (CFU/m³) followed by positive hole corrections. The most abundant bacterial colonies (selected by visual screening) were identified by PCR amplicon sequencing of 16S rRNA genes. For understanding overall microbial activity on clinic floors and estimating the general cleanliness of the clinic surfaces during or after dental cleaning procedures, ATP levels were determined in swabbed dust samples collected from 10 cm² floor surfaces. The concentration of ATP may indicate both the cell viability and the metabolic status of settled microorganisms in this situation. An ATP measuring kit was used, which utilized the standard luciferin-luciferase fluorescence reaction and a luminometer, which quantified ATP levels as relative light units (RLU). Three air and dust samples were collected during each cleaning procedure (at the beginning, during cleaning, and immediately after the procedure was completed; n = 45). Concentrations at the beginning, during, and after dental cleaning procedures were 671±525, 917±1203, and 899±823 CFU/m³, respectively, for airborne bacteria and 91±101, 243±129, and 139±77 RLU/sample, respectively, for ATP levels. The concentrations of bacteria were significantly higher than in typical indoor residential environments. Although an increasing trend for airborne bacteria was observed during cleaning, the data collected at the three different time points were not significantly different (ANOVA: p = 0.38), probably due to the high standard deviations of the data. The ATP levels, however, demonstrated a significant difference (ANOVA: p < 0.05) in this scenario, indicating a significant change in microbial activity on floor surfaces during dental cleaning. The most common bacterial genera identified were, in order of frequency of occurrence: Neisseria sp., Streptococcus sp., Chryseobacterium sp., Paenisporosarcina sp., and Vibrio sp. The study concluded that bacterial exposure in dental clinics could be a notable occupational biohazard, and appropriate respiratory protections for the employees are urgently needed.
Keywords: bioaerosols, hospital hygiene, indoor air quality, occupational biohazards
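The positive-hole correction mentioned above is typically Macher's adjustment for multiple impaction through a single jet of the 400-hole stage; a minimal sketch of the conversion from raw colony counts to CFU/m³, assuming the stated 28.3 L/min flow for five minutes:

```python
def positive_hole_correction(raw_count, holes=400):
    """Macher's correction: expected viable particles given a raw colony count,
    P = N * (1/N + 1/(N-1) + ... + 1/(N-r+1)) for N holes and r colonies."""
    return holes * sum(1.0 / (holes - i) for i in range(raw_count))

def cfu_per_m3(raw_count, flow_l_min=28.3, minutes=5.0, holes=400):
    corrected = positive_hole_correction(raw_count, holes)
    sampled_m3 = flow_l_min * minutes / 1000.0  # litres -> cubic metres
    return corrected / sampled_m3

print(cfu_per_m3(95))  # 95 colonies -> roughly 765 CFU/m3
```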
Procedia PDF Downloads 311
2023 The Effect of Impinging WC-12Co Particles Temperature on Thickness of HVOF Thermally Sprayed Coatings
Authors: M. Jalali Azizpour
Abstract:
In this paper, the effect of WC-12Co particle temperature on coating thickness in the HVOF thermal spraying process has been studied. The statistical results show that spray distance and oxygen-to-fuel ratio are the most influential factors on particle characteristics and on the thickness of HVOF thermally sprayed coatings. A SprayWatch diagnostic system, scanning electron microscopy (SEM), X-ray diffraction, and a thickness measuring system were used for this purpose.
Keywords: HVOF, temperature, thickness, velocity, WC-12Co
Procedia PDF Downloads 241
2022 A Modeling Approach for Blockchain-Oriented Information Systems Design
Abstract:
Blockchain is regarded as a most promising technology with the potential to trigger a technological revolution. However, beyond the bitcoin industry, we have not yet seen large-scale applications of blockchain in the domains expected to be impacted, such as supply chains, financial networks, and intelligent manufacturing. The reasons lie not only in the difficulties of blockchain implementation, but are also rooted in the challenges of blockchain-oriented information systems design. First, blockchain members are self-interested actors belonging to organizations with different existing information systems; since they expect different information inputs and outputs from the blockchain application, a common language protocol is needed to facilitate communication between blockchain members. Second, considering the decentralization of blockchain organization, there is no central authority to organize and coordinate the business processes, so information systems built on blockchain should support more adaptive business processes. This paper aims to address these difficulties by providing a modeling approach for blockchain-oriented information systems design. We investigate the information structure of distributed-ledger data with conceptual modeling techniques and ontology theories, and build an effective ontology mapping method for inter-organizational information flows and blockchain information records. Further, we study distributed-ledger-ontology based business process modeling to support adaptive enterprises on blockchain.
Keywords: blockchain, ontology, information systems modeling, business process
Procedia PDF Downloads 449
2021 Analysis of Translational Ship Oscillations in a Realistic Environment
Authors: Chen Zhang, Bernhard Schwarz-Röhr, Alexander Härting
Abstract:
To acquire accurate ship motions at the center of gravity, a single low-cost inertial sensor is utilized on board to measure the ship's oscillating motions. As observations, the three-axis accelerations and three-axis rotational rates provided by the sensor are used. The mathematical model for processing the observation data includes determination of the distance vector between the sensor and the center of gravity in the x, y, and z directions. After setting up the transfer matrix from the sensor's own coordinate system to the ship's body frame, an extended Kalman filter is applied to deal with the nonlinearities between the ship motion in the body frame and the observation information in the sensor's frame. As a side effect, the method eliminates sensor noise and other unwanted errors. The results include not only roll and pitch, but also linear motions, in particular heave and surge at the center of gravity. For testing, we resort to measurements recorded on a small vessel in a well-defined sea state. With response amplitude operators computed numerically by commercial software (Seaway), motion characteristics are estimated. These agree well with the measurements after processing with the suggested method.
Keywords: extended Kalman filter, nonlinear estimation, sea trial, ship motion estimation
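The ship-specific process and measurement models are not reproduced in the abstract; below is a generic sketch of one extended Kalman filter predict/update cycle of the kind described, with the nonlinear models f, h and their Jacobians left as caller-supplied placeholders:

```python
import numpy as np

def ekf_step(x, P, z, f, F_jac, h, H_jac, Q, R):
    """One predict/update cycle of an extended Kalman filter.

    x, P : state estimate and its covariance
    z    : measurement (e.g. sensor-frame accelerations and rotation rates)
    f, h : nonlinear process and measurement models (callables)
    F_jac, H_jac : their Jacobians evaluated at the current estimate
    """
    # Predict
    x_pred = f(x)
    F = F_jac(x)
    P_pred = F @ P @ F.T + Q
    # Update
    H = H_jac(x_pred)
    y = z - h(x_pred)                 # innovation
    S = H @ P_pred @ H.T + R          # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)  # Kalman gain
    x_new = x_pred + K @ y
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new

# Toy usage with (near-)linear models; in the paper's setting f and h would
# encode the sensor-to-body-frame transfer and the lever arm to the centre
# of gravity.
f = lambda x: x                       # random-walk state
F_jac = lambda x: np.eye(2)
h = lambda x: x                       # direct observation
H_jac = lambda x: np.eye(2)
x, P = np.zeros(2), np.eye(2)
x, P = ekf_step(x, P, np.array([0.3, -0.1]), f, F_jac, h, H_jac,
                Q=0.01 * np.eye(2), R=0.1 * np.eye(2))
print(x)
```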
Procedia PDF Downloads 522
2020 Seersucker Fabrics Development Using Single Warp Beam
Authors: Khubab Shaker, Yasir Nawab, Muhammad Usman Javed, Muhammad Umair, Muhammad Maqsood
Abstract:
Seersucker is a thin, puckered fabric, commonly striped or chequered, used to make spring clothing; it is woven in such a way that some threads bunch together, giving the fabric a wrinkled appearance in places. Because two warp beams are normally required, such fabrics cannot be woven on conventional weaving machines. The objective of this study was to weave a seersucker fabric on conventional looms using a single warp beam. This objective was achieved using two types of yarn forming stripes in the weft: one a 100% cotton yarn, and the other a core-spun elastane yarn with a cotton sheath (95.7% cotton, 4.3% elastane). The stress-strain behaviour of the produced fabric samples was tested and explained.
Keywords: seersucker fabrics, elastane yarns, single warp beam, weaving
Procedia PDF Downloads 525
2019 OILU Tag: A Projective Invariant Fiducial System
Authors: Youssef Chahir, Messaoud Mostefai, Salah Khodja
Abstract:
This paper presents the development of a 2D visual marker derived from recent patented work in the field of numbering systems. The proposed fiducial uses a group of projective invariant straight-line patterns that are easily detectable and remotely recognizable. Based on an efficient data coding scheme, the developed marker enables the production of a large panel of unique, real-time identifiers with highly distinguishable patterns. The proposed marker incorporates decimal and binary information simultaneously, making it readable by both humans and machines. This important feature opens up new opportunities for the development of efficient visual human-machine communication and monitoring protocols. Extensive experimental tests validate the robustness of the marker against acquisition and geometric distortions.
Keywords: visual markers, projective invariants, distance map, level sets
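The classical projective invariant behind straight-line fiducials is the cross-ratio of four collinear points, which any perspective camera mapping preserves; a short illustration (not the patented OILU coding scheme itself):

```python
import numpy as np

def cross_ratio(a, b, c, d):
    """Cross-ratio of four collinear points given by 1D coordinates."""
    return ((a - c) * (b - d)) / ((a - d) * (b - c))

pts = np.array([0.0, 1.0, 3.0, 7.0])
print(cross_ratio(*pts))            # 9/7 = 1.2857...

# Apply an arbitrary 1D projective map x -> (2x + 1) / (0.3x + 4)
proj = (2 * pts + 1) / (0.3 * pts + 4)
print(cross_ratio(*proj))           # same value: the cross-ratio is invariant
```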
Procedia PDF Downloads 163
2018 LaPEA: Language for Preprocessing of Edge Applications in Smart Factory
Authors: Masaki Sakai, Tsuyoshi Nakajima, Kazuya Takahashi
Abstract:
To improve the productivity of a factory, it is common to create an inference model by collecting and analyzing operational data off-line, and then to develop an edge application (EAP) that evaluates product quality or diagnoses machine faults in real time. To accelerate this development cycle, an edge application framework for the smart factory is proposed, which enables EAPs to be created and modified based on prepared inference models. In this framework, the preprocessing component is the key part that makes it work. This paper proposes a language for the preprocessing of edge applications, called LaPEA, which can flexibly process multiple streams of machine sensor data into explanatory variables for an inference model, and proves that it meets the requirements for preprocessing.
Keywords: edge application framework, edgecross, preprocessing language, smart factory
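LaPEA's concrete syntax is not given in the abstract; purely as an illustration of the kind of preprocessing it targets, the sketch below (all names and feature choices hypothetical) turns a raw sensor stream into windowed explanatory variables:

```python
import numpy as np

def preprocess(stream, window=100, step=50):
    """Slide a window over a raw sensor stream and emit feature vectors
    (mean, standard deviation, peak-to-peak) usable as explanatory
    variables for an inference model. All choices here are illustrative."""
    features = []
    for start in range(0, len(stream) - window + 1, step):
        w = stream[start:start + window]
        features.append([w.mean(), w.std(), w.max() - w.min()])
    return np.array(features)

X = preprocess(np.random.default_rng(0).normal(size=1000))
print(X.shape)  # (19, 3): 19 windows, 3 explanatory variables each
```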
Procedia PDF Downloads 146
2017 Using the Transient Plane Source Method for Measuring Thermal Parameters of Electroceramics
Authors: Peter Krupa, Svetozár Malinarič
Abstract:
The transient plane source (TPS) method was used to measure the thermal diffusivity and thermal conductivity of compact isostatic electro-ceramics at room temperature. The samples were fired at temperatures from 100 °C up to 1320 °C in steps of 50 °C. Bulk density and specific heat capacity were also measured, with their corresponding standard uncertainties. The results were compared with further thermal analysis (dilatometry and thermogravimetry), and structural processes during firing are discussed.
Keywords: TPS method, thermal conductivity, thermal diffusivity, thermal analysis, electro-ceramics, firing
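The four quantities measured here are linked by the standard identity relating diffusivity to conductivity through volumetric heat capacity, presumably used as a cross-check between the measurements:

```latex
\alpha = \frac{\lambda}{\rho \, c_p}
```

where α is the thermal diffusivity, λ the thermal conductivity, ρ the bulk density, and c_p the specific heat capacity.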
Procedia PDF Downloads 489
2016 Analysis of Urban Population Using Twitter Distribution Data: Case Study of Makassar City, Indonesia
Authors: Yuyun Wabula, B. J. Dewancker
Abstract:
In the past decade, social networking apps have grown very rapidly. Geolocation is one of the important features of social media that attaches the user's real-world location coordinates. This paper proposes the use of geolocation data from the Twitter social media application to gain knowledge about urban dynamics, especially human mobility behavior, and aims to explore the relation between Twitter geolocation and the presence of people in urban areas. First, the study analyzes the spread of people within particular areas of the city using Twitter social media data. Second, we match and categorize the existing places based on the individuals who visit them. We then combine the Twitter data from the tracking results with questionnaire data to capture Twitter user profiles. To do so, we use distribution frequency analysis to learn the visitor percentages. To validate the hypothesis, we compare the results with local population statistics and the land use map released by the city planning department of the Makassar local government. The results show a correlation between the Twitter geolocation and questionnaire data; thus, integrating Twitter data and survey data can reveal the profile of social media users.
Keywords: geolocation, Twitter, distribution analysis, human mobility
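A minimal sketch of the distribution frequency analysis step, with hypothetical place names standing in for the mapped Makassar locations:

```python
from collections import Counter

# Hypothetical geotagged tweets: (user_id, place) pairs after mapping
# raw coordinates onto named areas of the city.
tweets = [("u1", "Losari Beach"), ("u2", "Losari Beach"), ("u1", "Mall GTC"),
          ("u3", "Karebosi Park"), ("u2", "Losari Beach"), ("u3", "Mall GTC")]

visits = Counter(place for _, place in tweets)
total = sum(visits.values())
for place, n in visits.most_common():
    print(f"{place}: {100 * n / total:.1f}% of geotagged activity")
```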
Procedia PDF Downloads 314
2015 A Laundry Algorithm for Colored Textiles
Authors: H. E. Budak, B. Arslan-Ilkiz, N. Cakmakci, I. Gocek, U. K. Sahin, H. Acikgoz-Tufan, M. H. Arslan
Abstract:
The aim of this study is to design a novel laundry algorithm for colored textiles that have a significant decoloring problem. For the experimental work, a bleached knitted single jersey fabric made of 100% cotton and dyed with reactive dyestuff was utilized, since, according to a survey conducted, textiles made of cotton are the most demanded textile products among consumers, and reactive dyestuffs are the ones most commonly used in the textile industry for dyeing cotton products. The fabric used in this study was therefore selected and purchased in accordance with the survey results. Samples cut from this fabric were dyed with different dyeing parameters using Remazol Brilliant Red 3BS dyestuff in a Gyrowash machine under laboratory conditions. From the alternative reactive-dyed cotton fabric samples, those with a high tendency to color loss were identified and examined. Accordingly, the parameters of the dyeing processes used for these samples were evaluated, and the dyeing process chosen to induce a high tendency to color loss in the cotton fabrics was determined, in order to clearly reveal the level of improvement in color loss achieved in this study. Afterwards, all of the untreated fabric samples were dyed with the selected dyeing process. Once dyeing was completed, an experimental design for the laundering process was created with the Minitab® program, considering temperature, time, and mechanical action as parameters. All washing experiments were performed in a domestic washing machine: 16 washing experiments in total, covering 8 different experimental conditions with 2 repeats for each condition. After each washing experiment, water samples from the main wash of the laundering process were measured with a UV spectrophotometer, and the values obtained were compared with the calibration curve of the materials used in the dyeing process. The results of the washing experiments were statistically analyzed with the Minitab® program. Based on the results, the most suitable washing algorithm in terms of temperature, time, and mechanical action for minimizing fabric color loss in domestic washing machines was chosen. The laundry algorithm proposed in this study can minimize the color loss of colored textiles in washing machines by eliminating the negative effects of the laundering parameters on color, without compromising proper basic cleaning action. Since fabric color loss is minimized with this washing algorithm, dyestuff residuals will be lower in the grey water released from the laundering process. In addition, with this laundry algorithm it is possible to wash other types of textile products with a proper cleaning effect and minimized color loss.
Keywords: color loss, laundry algorithm, textiles, domestic washing process
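The 8 conditions with 2 repeats over temperature, time, and mechanical action read as a full 2³ factorial design; a sketch of how such a design is built and its main effects estimated, with synthetic responses standing in for the UV spectrophotometer readings:

```python
import itertools
import numpy as np

# 2^3 full factorial: temperature, time, mechanical action at low/high (-1/+1),
# two replicates per run -> 16 washes, mirroring the experiment described above.
levels = [-1, 1]
design = list(itertools.product(levels, repeat=3))

rng = np.random.default_rng(1)

def color_loss(temp, time, action):
    """Hypothetical response: dye concentration in the wash water."""
    return 1.0 + 0.4 * temp + 0.2 * time + 0.3 * action + rng.normal(0, 0.05)

runs = [(x, color_loss(*x)) for x in design for _ in range(2)]

# Main effect of a factor: mean response at +1 minus mean response at -1.
for i, name in enumerate(["temperature", "time", "mechanical action"]):
    hi = np.mean([y for x, y in runs if x[i] == 1])
    lo = np.mean([y for x, y in runs if x[i] == -1])
    print(f"{name}: main effect = {hi - lo:.3f}")
```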
Procedia PDF Downloads 357
2014 PerformAnts: Making the Organization of Concerts Easier
Authors: Ioannis Andrianakis, Panagiotis Panagiotopoulos, Kyriakos Chatzidimitriou, Dimitrios Tampakis, Manolis Falelakis
Abstract:
Live music, whether performed in organized venues, restaurants, hotels, or other spots, creates value chains that support local economies and tourism development. In this paper, we describe PerformAnts, a platform that increases the mobility of musicians and their accessibility to remotely located venues by rationalizing the cost of live acts. By analyzing event histories and taking potential availability into account, the platform provides bespoke recommendations to both bands and venues, while also facilitating the organization of tours and helping rationalize transportation expenses through an innovative mechanism called "chain booking". Moreover, the platform provides an environment where complicated tasks such as technical and financial negotiations, concert promotion, or copyright handling are easily managed by users following best practices. The proposed solution provides important benefits to the whole spectrum of small and medium-size concert organizers, as the complexity and cost of production are rationalized. The environment is also very beneficial for local talent, highly mobile musicians, venues located away from large urban areas or in tourist destinations, and managers, who will be in a position to coordinate a larger number of musicians without extra effort.
Keywords: machine learning, music industry, creative industries, web applications
Procedia PDF Downloads 96
2013 An Experimental Modeling of Steel Surfaces Wear in Injection of Plastic Materials with SGF
Authors: L. Capitanu, V. Floresci, L. L. Badita
Abstract:
Starting from the idea that the melted composite reaches its greatest pressure and velocity in the die nozzle, an experimental nozzle was built with wear samples whose sizes and weights can be measured with good precision. For greater measurement accuracy, we used an extremely accurate radiometric measuring method. Different nitriding steels and nitriding treatments were studied, as well as some special and alloyed steels. In addition, preliminary attempts were made to describe and verify the corrosive action of thermoplastics on metals.
Keywords: plastics, composites with short glass fibres, moulding, wear, experimental modelling, glass fibres content influence
Procedia PDF Downloads 266
2012 Econometric Analysis of West African Countries’ Container Terminal Throughput and Gross Domestic Products
Authors: Kehinde Peter Oyeduntan, Kayode Oshinubi
Abstract:
West African ports have experienced large inflows and outflows of containerized cargo in recent decades, and this has created a quest among the countries to attain hub-port status for the sub-region. This study analyzed the relationship between the container throughput and Gross Domestic Product (GDP) of nine West African countries, using Simple Linear Regression (SLR), a Polynomial Regression Model (PRM), and Support Vector Machines (SVM) on a 20-year time series. The results showed a high correlation between GDP and container throughput. The models also predicted container throughput in West Africa for the next 20 years. The findings and recommendations presented in this research will guide policy makers and help improve the management of container ports and terminals in West Africa, thereby boosting the economy.
Keywords: container, ports, terminals, throughput
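A minimal sketch of fitting and comparing the three model families named above, with synthetic data standing in for the GDP/throughput series:

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures
from sklearn.svm import SVR

# Hypothetical series: GDP (billion USD) vs container throughput (thousand TEU)
gdp = np.linspace(20, 450, 20).reshape(-1, 1)
teu = 50 + 4.5 * gdp.ravel() + 30 * np.sin(gdp.ravel() / 60)

models = {
    "SLR": LinearRegression(),
    "PRM": make_pipeline(PolynomialFeatures(degree=2), LinearRegression()),
    "SVM": SVR(kernel="rbf", C=1000, gamma=0.001),
}
for name, model in models.items():
    model.fit(gdp, teu)
    print(name, "R^2 =", round(model.score(gdp, teu), 3))
```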
Procedia PDF Downloads 214
2011 Daily Stand-up Meetings - Relationships with Psychological Safety and Well-being in Teams
Authors: Sarah Rietze, Hannes Zacher
Abstract:
Daily stand-up meetings are the most commonly used method in agile teams. In daily stand-ups, team members gather to coordinate and align their efforts, typically for a predefined period of no more than 15 minutes. The primary purpose is to ask and answer the following three questions: What was accomplished yesterday? What will be done today? What obstacles are impeding my progress? Daily stand-ups aim to enhance communication, mutual understanding, and support within the team, as well as promote collective learning from mistakes through daily synchronization and transparency. The use of daily stand-ups is intended to positively influence psychological safety within teams, which is the belief that it is safe to show oneself and take personal risks. Two studies will be presented, which explore the relationships between daily stand-ups, psychological safety, and psychological well-being. In a first study, based on survey results (n = 318), we demonstrated that daily stand-ups have a positive indirect effect on job satisfaction and a negative indirect effect on turnover intention through their impact on psychological safety. In a second study, we investigate, using an experimental design, how the use of daily stand-ups in teams enhances psychological safety and well-being compared to a control group that does not use daily stand-ups. Psychological safety is considered one of the most crucial cultural factors for a sustainable, agile organization. Agile approaches, such as daily stand-ups, are a critical part of the evolving work environment and offer a proactive means to shape and foster psychological safety within teams.
Keywords: occupational wellbeing, agile work practices, psychological safety, daily stand-ups
Procedia PDF Downloads 65
2010 Analytical Solution of the Boundary Value Problem of Delaminated Doubly-Curved Composite Shells
Authors: András Szekrényes
Abstract:
Delamination is one of the major failure modes in laminated composite structures. Delamination tips are mostly captured by spatial numerical models in order to predict crack growth. This paper presents mechanical models of delaminated composite shells based on shallow shell theories. The mechanical fields are based on a third-order displacement field in terms of the through-thickness coordinate of the laminated shell. The undelaminated and delaminated parts are captured by separate models, and the continuity and boundary conditions are formulated in a general way, providing a large boundary value problem. The system of differential equations is solved by the state space method for an elliptic delaminated shell having simply supported edges. The comparison of the proposed model with a numerical model indicates that the primary indicator of the model is the deflection; the secondary indicator is the widthwise distribution of the energy release rate. The model is promising and suitable for accurately determining the J-integral distribution along the delamination front. Based on the proposed model, it is also possible to develop finite elements able to replace the computationally expensive spatial models of delaminated structures.
Keywords: J-integral, Lévy method, third-order shell theory, state space solution
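In the state space (Lévy-type) solution referred to here, the governing equations are reduced, for simply supported edges, to a first-order system in one coordinate; schematically (the shell-specific system matrix is not reproduced in the abstract):

```latex
\frac{\mathrm{d}\mathbf{Z}(y)}{\mathrm{d}y} = \mathbf{A}\,\mathbf{Z}(y)
\quad\Longrightarrow\quad
\mathbf{Z}(y) = e^{\mathbf{A}y}\,\mathbf{Z}(0),
```

where Z collects the displacement amplitudes and their derivatives, and the boundary conditions at the two remaining edges fix Z(0).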
Procedia PDF Downloads 131
2009 A Dynamic Symplectic Manifold Analysis for Wave Propagation in Porous Media
Authors: K. I. M. Guerra, L. A. P. Silva, J. C. Leal
Abstract:
This study aims to understand, more broadly and clearly, the behavior of a porous medium through which a pressure wave travels, translated into relative displacements inside the material, using mathematical tools derived from topology and symplectic geometry. The paper starts with a given partial differential equation, based on the continuity and conservation theorems, describing the traveling wave through the porous body. A solution for this equation is proposed after all boundary and initial conditions are fixed, and it is accepted that the solution lies in a manifold U of purely spatial dimensions that is embedded in the real n-dimensional manifold with spatial and kinetic dimensions. It is shown that the manifold U, of lower dimension than the space IRna in which it is embedded, inherits properties of the vector spaces existing inside the topology it lies on. Then, a second manifold (U*), embedded in another space called IRnb of stress dimensions, is proposed, together with a non-degenerate function that maps it into the manifold U. This relation is proved to be a transformation between two corresponding admissible solutions of the differential equation in distinct dimensions and with distinct properties, leading to a more visual and intuitive understanding of the whole dynamic process of a stress wave through a porous medium, and also highlighting the dimensional invariance of Terzaghi's theory for any coordinate system.
Keywords: poromechanics, soil dynamics, symplectic geometry, wave propagation
Procedia PDF Downloads 295
2008 A Novel Computer-Generated Hologram (CGH) Achieved Scheme Generated from Point Cloud by Using a Lens Array
Authors: Wei-Na Li, Mei-Lan Piao, Nam Kim
Abstract:
We propose a novel computer-generated hologram (CGH) scheme, wherein the CGH is generated from a point cloud obtained through a mapping relationship from a series of elemental images captured from a real three-dimensional (3D) object using a lens array. The scheme is composed of three procedures: mapping from elemental images to a point cloud, hologram generation, and hologram display. A mapping method is devised to obtain virtual volume data (a point cloud) from a series of elemental images. This mapping method consists of two steps. First, the coordinate (x, y) pairs and their numbers of appearances are calculated from the series of sub-images generated from the elemental images. Second, a series of corresponding coordinates (x, y, z) is calculated from the elemental images. A hologram is then generated from the volume data calculated in the previous two steps. Finally, a spatial light modulator (SLM) and a green laser beam are utilized to display this hologram and reconstruct the original 3D object. In this paper, in order to achieve a more autostereoscopic display of a real 3D object, we successfully obtained the actual depth data of every discrete point of the real 3D object, and overcame the inherent drawbacks of a depth camera by obtaining the point cloud from the elemental images.
Keywords: elemental image, point cloud, computer-generated hologram (CGH), autostereoscopic display
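The hologram-generation step for a point cloud is conventionally a superposition of spherical wavefronts, one per point; a standard form (the paper's exact implementation may differ) is:

```latex
H(u, v) \;=\; \sum_{j=1}^{N} \frac{A_j}{r_j}\, \exp\!\bigl(i k r_j\bigr),
\qquad
r_j = \sqrt{(u - x_j)^2 + (v - y_j)^2 + z_j^2},
```

where (x_j, y_j, z_j) and A_j are the position and amplitude of the j-th point of the cloud and k = 2π/λ is the wavenumber of the (green) laser.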
Procedia PDF Downloads 584
2007 Robust Quantum Image Encryption Algorithm Leveraging 3D-BNM Chaotic Maps and Controlled Qubit-Level Operations
Authors: Vivek Verma, Sanjeev Kumar
Abstract:
This study presents a novel quantum image encryption algorithm, using a 3D chaotic map and controlled qubit-level scrambling operations. The newly proposed 3D-BNM chaotic map effectively reduces the degradation of chaotic dynamics resulting from the finite word length effect. It facilitates the generation of highly unpredictable random sequences and enhances chaotic performance. The system’s efficacy is additionally enhanced by the inclusion of a SHA-256 hash function. Initially, classical plain images are converted into their quantum equivalents using the Novel Enhanced Quantum Representation (NEQR) model. The Generalized Quantum Arnold Transformation (GQAT) is then applied to disrupt the coordinate information of the quantum image. Subsequently, to diffuse the pixel values of the scrambled image, XOR operations are performed using pseudorandom sequences generated by the 3D-BNM chaotic map. Furthermore, to enhance the randomness and reduce the correlation among the pixels in the resulting cipher image, a controlled qubit-level scrambling operation is employed. The encryption process utilizes fundamental quantum gates such as C-NOT and CCNOT. Both theoretical and numerical simulations validate the effectiveness of the proposed algorithm against various statistical and differential attacks. Moreover, the proposed encryption algorithm operates with low computational complexity.
Keywords: 3D chaotic map, SHA-256, quantum image encryption, qubit-level scrambling, NEQR
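The 3D-BNM map itself is new to this paper, so the sketch below substitutes an ordinary logistic map for the keystream; it illustrates only the SHA-256 keying and XOR diffusion stages, not the NEQR/GQAT quantum steps:

```python
import hashlib
import numpy as np

def keystream(key: bytes, n: int) -> np.ndarray:
    """Seed a logistic map (a stand-in for the 3D-BNM map) from SHA-256(key)."""
    digest = hashlib.sha256(key).digest()
    x = int.from_bytes(digest[:8], "big") / 2**64 * 0.998 + 0.001  # x0 in (0, 1)
    out = np.empty(n, dtype=np.uint8)
    for i in range(n):
        x = 3.99 * x * (1.0 - x)
        out[i] = int(x * 256) % 256
    return out

def xor_diffuse(image: np.ndarray, key: bytes) -> np.ndarray:
    ks = keystream(key, image.size).reshape(image.shape)
    return image ^ ks  # involutive: applying it again decrypts

img = np.arange(16, dtype=np.uint8).reshape(4, 4)
enc = xor_diffuse(img, b"secret")
assert np.array_equal(xor_diffuse(enc, b"secret"), img)
```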
Procedia PDF Downloads 10
2006 Influence of Insulation System Methods on Dissipation Factor and Voltage Endurance
Authors: Farzad Yavari, Hamid Chegini, Saeed Lotfi
Abstract:
This paper reviews a comparison of Resin Rich (RR) and Vacuum Pressure Impregnation (VPI) insulation system qualities for the stator bars of rotating electrical machines. Voltage endurance and tangent delta are two diagnostic tests used to determine the quality of insulation systems. The paper describes the trend of the dissipation factor during the voltage endurance test for different stator bar samples made with the RR and VPI insulation system methods. Some samples were made with the same strands and insulation thickness but with different main wall materials, to demonstrate the influence of the insulation system method on stator bar quality. In addition, some of the samples were subjected to voltage at the temperature of their insulation class, and the changes in their dissipation factor were measured and studied.
Keywords: VPI, resin rich, insulation, stator bar, dissipation factor, voltage endurance
Procedia PDF Downloads 197
2005 Measuring Green Growth Indicators: Implication for Policy
Authors: Hanee Ryu
Abstract:
The administration of former Korean president Lee Myung-bak presented "green growth" as a catchphrase from 2008, declaring "low-carbon, green growth" the nation's vision for the next decade in line with the United Nations Framework Convention on Climate Change. The government designed an omnidirectional policy for low-carbon, green growth, concentrating the efforts of all departments. Structural change was expected because this slogan was the identity of the government and was strongly driven across the whole administration. Now that the administration has ended, the purpose of this paper is to quantify the policy effect and compare Korea's values with those of other OECD countries. Major target values under the direct policy objectives were suggested, but these alone cannot capture the entire landscape the policy changed. This paper therefore estimates the policy impacts by comparing ex-ante values with ex-post values, and compares each indicator of Korea's low-carbon, green growth with the values of other OECD countries. To measure the policy effect, indicators developed by international organizations are considered. The Environmental Sustainability Index (ESI) and the Environmental Performance Index (EPI) have been developed by Yale University's Center for Environmental Law and Policy and Columbia University's Center for International Earth Science Information Network, in collaboration with the World Economic Forum and the Joint Research Centre of the European Commission. They have been widely used to assess the level of natural resource endowments, pollution, environmental management efforts, and society's capacity to improve its environmental performance over time. Recently, the OECD published Green Growth Indicators for monitoring progress towards green growth based on internationally comparable data, building a conceptual framework and selecting indicators according to well-specified criteria: economic activities, the natural asset base, the environmental dimension of quality of life, and economic opportunities and policy responses. These indicators consider the socio-economic context and reflect the characteristics of growth. Some of them are used here to measure the level of change the green growth policies have induced. As a result, CO2 productivity and energy productivity show declining trends, meaning that the policy-intended shift in industrial structure towards the carbon emission target had only a weak short-term effect. The increase in green technology patents might result from investment made in the previous period. The increase in official development aid (ODA), which can be initiated immediately by political decision with no time lag, appears only in 2008-2009, meaning that international collaboration and investment in developing countries via ODA was not sustained beyond the initial stage of the administration. The green growth framework led the public to expect structural change, but its effects are sporadic; an organization is needed to manage it from a long-range perspective. Energy, climate change, and green growth are not issues that can be handled within a single administration, and a policy mechanism that turns the cost problem into value creation should be developed consistently.
Keywords: comparing ex-ante with ex-post indicators, green growth indicator, implication for green growth policy, measuring policy effect
Procedia PDF Downloads 448
2004 Evolving Credit Scoring Models using Genetic Programming and Language Integrated Query Expression Trees
Authors: Alexandru-Ion Marinescu
Abstract:
There exists a plethora of methods in the scientific literature which tackle the well-established task of credit score evaluation. In its most abstract form, a credit scoring algorithm takes as input several credit applicant properties, such as age, marital status, employment status, loan duration, etc., and must output a binary response variable (i.e. “GOOD” or “BAD”) stating whether the client is susceptible to payment return delays. Data imbalance is a common occurrence among financial institution databases, with the majority classified as “GOOD” clients (clients that respect the loan return calendar) alongside a small percentage of “BAD” clients. But it is the “BAD” clients we are interested in, since accurately predicting their behavior is crucial in preventing unwanted loss for loan providers. We add to this whole context the constraint that the algorithm must yield an actual, tractable mathematical formula, which is friendlier towards financial analysts. To this end, we have turned to genetic algorithms and genetic programming, aiming to evolve actual mathematical expressions using specially tailored mutation and crossover operators. As far as data representation is concerned, we employ a very flexible mechanism – LINQ expression trees, readily available in the C# programming language, enabling us to construct executable pieces of code at runtime. As the title implies, they model trees, with intermediate nodes being operators (addition, subtraction, multiplication, division) or mathematical functions (sin, cos, abs, round, etc.) and leaf nodes storing either constants or variables. There is a one-to-one correspondence between the client properties and the formula variables. The mutation and crossover operators work on a flattened version of the tree, obtained via a pre-order traversal. A consequence of our chosen technique is that we can identify and discard client properties which do not take part in the final score evaluation, effectively acting as a dimensionality reduction scheme. We compare ourselves with state-of-the-art approaches, such as support vector machines, Bayesian networks, and extreme learning machines, to name a few. The data sets we benchmark against amount to a total of 8, of which we mention the well-known Australian credit and German credit data sets, and the performance indicators are the following: percentage correctly classified, area under curve, partial Gini index, H-measure, Brier score and Kolmogorov-Smirnov statistic, respectively. Finally, we obtain encouraging results, which, although placing us in the lower half of the hierarchy, drive us to further refine the algorithm.
Keywords: expression trees, financial credit scoring, genetic algorithm, genetic programming, symbolic evolution
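LINQ expression trees are specific to C#; a language-neutral sketch of the same idea, an evaluable expression tree with subtree mutation, in Python:

```python
import random

OPS = {"+": lambda a, b: a + b, "-": lambda a, b: a - b,
       "*": lambda a, b: a * b, "/": lambda a, b: a / b if b else 1.0}

def random_tree(depth, n_vars):
    """Build a random expression tree over n_vars input variables."""
    if depth <= 0 or random.random() < 0.3:
        return ("var", random.randrange(n_vars)) if random.random() < 0.5 \
               else ("const", random.uniform(-1, 1))
    op = random.choice(list(OPS))
    return (op, random_tree(depth - 1, n_vars), random_tree(depth - 1, n_vars))

def evaluate(node, x):
    kind = node[0]
    if kind == "var":
        return x[node[1]]
    if kind == "const":
        return node[1]
    return OPS[kind](evaluate(node[1], x), evaluate(node[2], x))

def mutate(node, depth, n_vars, p=0.1):
    """With probability p, replace a subtree; otherwise recurse into children."""
    if random.random() < p:
        return random_tree(depth, n_vars)
    if node[0] in OPS:
        return (node[0], mutate(node[1], depth - 1, n_vars, p),
                mutate(node[2], depth - 1, n_vars, p))
    return node

random.seed(3)
tree = random_tree(depth=3, n_vars=4)
applicant = [0.42, 1.0, 0.0, 7.5]   # e.g. scaled age, marital status, ...
print(evaluate(tree, applicant))
print(evaluate(mutate(tree, 3, 4), applicant))
```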
Procedia PDF Downloads 117
2003 Effect of the Soil-Foundation Interface Condition in the Determination of the Resistance Domain of Rigid Shallow Foundations
Authors: Nivine Abbas, Sergio Lagomarsino, Serena Cattari
Abstract:
The resistance domain of a generally loaded rigid shallow foundation is normally represented as an interaction diagram limited by a failure surface in the three-dimensional (3D) load space (N, V, M), where N is the vertical centric load component, V is the horizontal load component, and M is the bending moment component. Usually, this resistance domain is constructed neglecting the foundation sliding mechanism that takes place at the level of the soil-foundation interface once the applied horizontal load exceeds the interface frictional resistance of the foundation. This issue is reflected in the literature by the fact that the failure limit in the 2D load space (N, V) is constructed as a parabola whose initial slope, at the origin of the coordinate system, depends in some works only on the soil friction angle and in other works takes an empirical value. However, for a given foundation geometry lying on a given soil type, the initial slope of the failure limit must change, for instance, when the roughness of the foundation surface at its interface with the soil varies. The present study discusses the effect of the soil-foundation interface condition on the construction of the resistance domain, and proposes a correction to be applied to the failure limit in order to overcome this effect.
Keywords: soil-foundation interface, sliding mechanism, soil shearing, resistance domain, rigid shallow foundation
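The correction under discussion amounts to bounding the envelope by the Coulomb friction condition at the interface, so that the initial slope is governed by the interface friction angle δ rather than the soil friction angle φ (a standard form; the paper's exact expression is not given in the abstract):

```latex
|V| \;\le\; N \tan\delta
\qquad\Longrightarrow\qquad
\left.\frac{\partial V}{\partial N}\right|_{N \to 0} = \tan\delta ,
```

with δ ≤ φ depending on the roughness of the foundation surface at its interface with the soil.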
Procedia PDF Downloads 460
2002 Hybrid GNN Based Machine Learning Forecasting Model For Industrial IoT Applications
Authors: Atish Bagchi, Siva Chandrasekaran
Abstract:
Background: According to World Bank national accounts data, the estimated global manufacturing value-added output in 2020 was 13.74 trillion USD. These manufacturing processes are monitored, modelled, and controlled by advanced, real-time, computer-based systems, e.g., Industrial IoT, PLC, SCADA, etc. These systems measure and manipulate a set of physical variables, e.g., temperature, pressure, etc. Despite the use of IoT, SCADA, etc., in manufacturing, studies suggest that unplanned downtime leads to economic losses of approximately 864 billion USD each year. Therefore, real-time, accurate detection, classification, and prediction of machine behaviour are needed to minimise financial losses. Although vast literature exists on time-series data processing using machine learning, the challenges faced by the industries that lead to unplanned downtimes are as follows. First, current algorithms do not efficiently handle the high-volume streaming data from industrial IoT sensors and have been tested on static and simulated datasets. Second, while existing algorithms can detect significant 'point' outliers, most do not handle contextual outliers (e.g., values within the normal range but occurring at an unexpected time of day) or subtle changes in machine behaviour. Third, machines are revamped periodically as part of planned maintenance programmes, which changes the assumptions on which the original AI models were created and trained. Aim: This research study aims to deliver a Graph Neural Network (GNN) based hybrid forecasting model that interfaces with the real-time machine control system and can detect and predict machine behaviour and behavioural changes (anomalies) in real-time. This research will help manufacturing industries and utilities, e.g., water, electricity, etc., reduce unplanned downtimes and consequential financial losses. Method: The data stored within a process control system, e.g., Industrial IoT or a Data Historian, is generally sampled during data acquisition from the sensor (source) and when persisting in the Data Historian, to optimise storage and query performance. The sampling may inadvertently discard values that contain subtle aspects of behavioural changes in machines. This research proposes a hybrid forecasting and classification model which combines the expressive and extrapolation capability of GNNs, enhanced with estimates of entropy and spectral changes in the sampled data and additional temporal contexts, to reconstruct the likely temporal trajectory of machine behavioural changes. The proposed real-time model belongs to the deep learning category of machine learning and interfaces with the sensors directly or through a 'Process Data Historian', SCADA, etc., to perform forecasting and classification tasks. Results: The model was interfaced with a Data Historian holding time-series data from 4 flow sensors within a water treatment plant for 45 days. The recorded sampling interval for a sensor varied from 10 sec to 30 min. Approximately 65% of the available data was used for training the model, 20% for validation, and the rest for testing. The model identified the anomalies within the water treatment plant and predicted the plant's performance. These results were compared with the data reported by the plant's SCADA-Historian system and the official data reported by the plant authorities. The model's accuracy was much higher (by 20%) than that reported by the SCADA-Historian system and matched the validated results declared by the plant auditors.
Conclusions: The research demonstrates that a hybrid GNN-based approach enhanced with entropy calculation and spectral information can effectively detect and predict a machine's behavioural changes. The model can interface with a plant's process control system in real-time to perform forecasting and classification tasks, helping asset management engineers operate their machines more efficiently and reduce unplanned downtimes. A series of trials is planned for this model in other manufacturing industries.
Keywords: GNN, entropy, anomaly detection, industrial time-series, AI, IoT, Industry 4.0, machine learning
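The entropy and spectral-change estimates are not specified further in the abstract; one common choice consistent with the description is the spectral entropy of a window of samples, sketched here:

```python
import numpy as np

def spectral_entropy(window: np.ndarray) -> float:
    """Shannon entropy of the normalised power spectrum (in bits).
    Low values: energy concentrated in few frequencies (steady machine);
    rising values can flag a change in machine behaviour."""
    psd = np.abs(np.fft.rfft(window - window.mean())) ** 2
    p = psd / psd.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

t = np.linspace(0, 1, 512, endpoint=False)
steady = np.sin(2 * np.pi * 50 * t)                       # pure tone
noisy = steady + 0.5 * np.random.default_rng(0).normal(size=512)
print(spectral_entropy(steady), spectral_entropy(noisy))  # low vs high
```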
Procedia PDF Downloads 150
2001 Possible Exposure of Persons with Cardiac Pacemakers to Extremely Low Frequency (ELF) Electric and Magnetic Fields
Authors: Leena Korpinen, Rauno Pääkkönen, Fabriziomaria Gobba, Vesa Virtanen
Abstract:
The number of persons with implanted cardiac pacemakers (PM) has increased in Western countries. The aim of this paper is to investigate the possible situations where persons with a PM may be exposed to extremely low frequency (ELF) electric (EF) and magnetic fields (MF) that may disturb their PM. Based on our earlier studies, it is possible to find such high public exposure to EFs only in some places near 400 kV power lines, where an EF may disturb a PM in unipolar mode. Such EFs cannot be found near 110 kV power lines. Disturbing MFs can be found near welding machines. However, we do not have measurement data from welding. Based on literature and earlier studies at Tampere University of Technology, it is difficult to find public EF or MF exposure that is high enough to interfere with PMs.
Keywords: cardiac pacemaker, electric field, magnetic field, electrical engineering
Procedia PDF Downloads 432
2000 Weight Comparison of Oil and Dry Type Distribution Transformers
Authors: Murat Toren, Mehmet Çelebi
Abstract:
Reducing the weight of transformers while maintaining good performance is important for cost reduction and increased efficiency. Weight is one of the most significant factors in all electrical machines, and as such, many transformer design parameters are related to weight calculations. This study presents a comparison of the weights of oil-type and dry-type transformers. Oil-type transformers are mainly used in industry; however, dry-type transformers have become more widespread in recent years. MATLAB is typically used for calculating transformer design parameters (rated voltages, core loss, etc.), alongside design in ANSYS Maxwell. Similar to other studies, this study found that the dry-type transformer option is limited. Moreover, the 50 kVA distribution transformers commonly used in industry are oil type; here, transformers of both types were designed and considered in terms of weight. The current preference for low-cost oil-type transformers would change if the costs of dry-type transformers were more competitive. The aim of this study was to compare the weight of transformers, which is a substantial cost factor, and to provide an evaluation of increasing the use of dry-type transformers.
Keywords: weight, optimization, oil-type transformers, dry-type transformers
Procedia PDF Downloads 353
1999 Operator Efficiency Study for Assembly Line Optimization at Semiconductor Assembly and Test
Authors: Rohana Abdullah, Md Nizam Abd Rahman, Seri Rahayu Kamat
Abstract:
Operator efficiency is gaining importance in ensuring the optimized usage of resources, especially in semi-automated manufacturing environments. This paper addresses a case study conducted to solve operator efficiency and line balancing issues at a semiconductor assembly and test manufacturing site. A Man-to-Machine (M2M) work study technique is used to study current operator utilization and determine the optimum allocation of operators to machines. Critical factors such as operator activity, activity frequency, and operator competency level are considered to gain insight into the parameters that affect operator utilization. Equipment standard time and overall equipment efficiency (OEE) information are also gathered and analyzed to achieve a balanced and optimized production.
Keywords: operator efficiency, optimized production, line balancing, industrial and manufacturing engineering
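For reference, OEE as used in such studies is the product of availability, performance, and quality; a one-function sketch with hypothetical shift figures:

```python
def oee(availability: float, performance: float, quality: float) -> float:
    """Overall equipment efficiency = availability x performance x quality."""
    return availability * performance * quality

# Hypothetical shift: 90% uptime, 85% of ideal rate, 98% good units
print(f"OEE = {oee(0.90, 0.85, 0.98):.1%}")  # 75.0%
```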
Procedia PDF Downloads 729
1998 A Group Setting of IED in Microgrid Protection Management System
Authors: Jyh-Cherng Gu, Ming-Ta Yang, Chao-Fong Yan, Hsin-Yung Chung, Yung-Ruei Chang, Yih-Der Lee, Chen-Min Chan, Chia-Hao Hsu
Abstract:
A number of distributed generations (DGs) are installed in a microgrid, which may produce diverse paths and directions of power flow or fault current. The overcurrent protection scheme for the traditional radial-type distribution system will therefore no longer meet the needs of microgrid protection. Integrating intelligent electronic devices (IEDs) and a supervisory control and data acquisition (SCADA) system with the IEC 61850 communication protocol, this paper proposes a microgrid protection management system (MPMS) to protect the power system from faults. In the proposed method, the MPMS performs logic programming of each IED to coordinate their tripping sequence. The GOOSE message defined in IEC 61850 is used as the information transmission medium among IEDs. Moreover, to cope with the difference in microgrid fault current between grid-connected mode and islanded mode, the proposed MPMS applies the group-setting feature of the IEDs, providing system protection and robust adaptability. Whenever the microgrid topology varies, the MPMS recalculates the fault current and updates the group settings of the IEDs; should a fault occur, the IEDs isolate it at once. Finally, Matlab/Simulink and Elipse Power Studio software are used to simulate and demonstrate the feasibility of the proposed method.
Keywords: IEC 61850, IED, group setting, microgrid
Procedia PDF Downloads 461
1997 Study the Dynamic Behavior of Irregular Buildings by the Analysis Method Accelerogram
Authors: Beciri Mohamed Walid
Abstract:
Architectural requirements often impose shapes that lead to an irregular distribution of masses, rigidities, and resistances. The main objective of the present study is to estimate the influence of the irregularity, both in plan and in elevation, exhibited by some structures on their dynamic characteristics and behavior. To do this, both dynamic methods proposed by RPA99 (the modal spectral method and the accelerogram analysis method) are applied to similar prototypes, the parameters measuring the response of these structures are analyzed, and the results are compared.
Keywords: structure, irregular, code, seismic, method, force, period
Procedia PDF Downloads 310
1996 Flammability and Smoke Toxicity of Rainscreen Façades
Authors: Gabrielle Peck, Ryan Hayes
Abstract:
Four façade systems were tested using a reduced-height BS 8414-2 (5 m) test rig. An L-shaped masonry test wall was clad with three types of insulation and an aluminium composite panel with a non-combustible filling (meeting Euroclass A2). A large (3 MW) wooden crib was ignited in a recess at the base of the L, and the fire was allowed to burn for 30 minutes. Air velocity measurements and gas samples were taken from the main ventilation duct and also from a small additional ventilation duct, like those in an apartment bathroom or kitchen. This provided a direct route of travel for smoke from the building façade to a theoretical room, using a design similar to many high-rise buildings where such a vent is connected to (approximately) 30 m³ rooms. The times to incapacitation and lethality of the effluent were calculated for both the main exhaust vent and a vent connected to a theoretical 30 m³ room. The rainscreen façade systems tested were combinations commonly seen in tower blocks across the UK: three tests used ACM A2 with stone wool, phenolic foam (PF), and polyisocyanurate (PIR) foam, and a fourth test was conducted with PIR and ACM-PE (polyethylene core). Measurements in the main exhaust duct were representative of the effluent from the burning wood crib. FEDs showed that incapacitation could occur up to 30 times faster with combustible insulation than with non-combustible insulation, with lethal gas concentrations accumulating up to 2.7 times faster than in other combinations. The PE-cored ACM/PIR combination produced a ferocious fire, resulting in the termination of the test after 13.5 minutes for safety reasons. Occupants of the theoretical room in the PIR/ACM A2 test reached an FED of 1 after 22 minutes; for PF/ACM A2 this took 25 minutes, and for stone wool an FED of 0.6 was reached by the end of the 30-minute test. In conclusion, when measuring smoke toxicity in the exhaust duct, there is little difference between façade systems: toxicity measured in the main exhaust is largely a result of the wood crib used to ignite the façade system. The addition of a vent allowed smoke toxicity to be quantified in the cavity of the façade, providing a realistic way of measuring the toxicity of smoke that could enter an apartment from a façade fire.
Keywords: smoke toxicity, large-scale testing, BS8414, FED
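The FED values quoted are, in the usual ISO 13571 sense, accumulated doses (concentration × time) divided by the dose producing the effect; a simplified single-gas sketch (the study's full multi-gas calculation is not reproduced, and the CO constant below is an assumed placeholder, not a figure from the study):

```python
def fractional_effective_dose(conc_ppm, dt_min, ct_effect_ppm_min):
    """Accumulate FED = sum(C_i * dt) / (C*t causing the effect).
    FED = 1 means the incapacitating or lethal dose has been received."""
    return sum(c * dt_min for c in conc_ppm) / ct_effect_ppm_min

# Hypothetical CO trace sampled each minute during a facade test
co = [200, 800, 2500, 5000, 6000]  # ppm
print(fractional_effective_dose(co, 1.0, 35_000))  # 35,000 ppm*min assumed dose
```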
Procedia PDF Downloads 60