Search results for: benchmark
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 404

194 Comparative Study of Deep Reinforcement Learning Algorithm Against Evolutionary Algorithms for Finding the Optimal Values in a Simulated Environment Space

Authors: Akshay Paranjape, Nils Plettenberg, Robert Schmitt

Abstract:

Traditional optimization methods like evolutionary algorithms are widely used in production processes to find an optimal or near-optimal solution of control parameters based on the simulated environment space of a process. These algorithms are computationally intensive and therefore do not provide the opportunity for real-time optimization. This paper utilizes the Deep Reinforcement Learning (DRL) framework to find an optimal or near-optimal solution for control parameters. A model based on maximum a posteriori policy optimization (Hybrid-MPO) that can handle both numerical and categorical parameters is used as a benchmark for comparison. A comparative study shows that DRL can find optimal solutions of similar quality to those of evolutionary algorithms while requiring significantly less time, making it preferable for real-time optimization. The results are confirmed in a large-scale validation study on datasets from production and other fields. A trained XGBoost model is used as a surrogate for process simulation. Finally, multiple ways to improve the model are discussed.
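As an illustration of the evolutionary baseline the paper benchmarks DRL against, the sketch below runs a minimal elitist evolutionary search over two numerical control parameters. The quadratic `surrogate` function and all settings are hypothetical stand-ins for the trained XGBoost surrogate and the real parameter space.

```python
import random

def surrogate(params):
    """Stand-in for a trained surrogate model: maps control
    parameters to a simulated process cost (lower is better)."""
    x, y = params
    return (x - 3.0) ** 2 + (y + 1.0) ** 2

def evolve(obj, pop_size=20, generations=60, sigma=0.5, seed=0):
    """Minimal (mu+lambda)-style evolutionary search: keep the best
    half of the population, refill with Gaussian-mutated offspring."""
    rng = random.Random(seed)
    pop = [(rng.uniform(-5, 5), rng.uniform(-5, 5)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=obj)                      # rank by objective
        parents = pop[: pop_size // 2]         # elitist selection
        children = [(p[0] + rng.gauss(0, sigma), p[1] + rng.gauss(0, sigma))
                    for p in parents]          # mutation
        pop = parents + children
    return min(pop, key=obj)

best = evolve(surrogate)
print(best, surrogate(best))
```

The many surrogate evaluations inside the loop are exactly the cost that makes such methods hard to run in real time, which is the paper's motivation for DRL.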

Keywords: reinforcement learning, evolutionary algorithms, production process optimization, real-time optimization, hybrid-MPO

Procedia PDF Downloads 77
193 Offline Signature Verification Using Minutiae and Curvature Orientation

Authors: Khaled Nagaty, Heba Nagaty, Gerard McKee

Abstract:

A signature is a behavioral biometric that is used for authenticating users in most financial and legal transactions. Signatures can be easily forged by skilled forgers. Therefore, it is essential to verify whether a signature is genuine or forged. The aim of any signature verification algorithm is to accommodate the differences between signatures of the same person and increase the ability to discriminate between signatures of different persons. The work presented in this paper proposes an automatic signature verification system to indicate whether a signature is genuine or not. The system comprises four phases: (1) the pre-processing phase, in which image scaling, binarization, image rotation, dilation, thinning, and connecting ridge breaks are applied; (2) the feature extraction phase, in which global and local features are extracted (the local features are minutiae points, curvature orientation, and curve plateau; the global features are signature area, signature aspect ratio, and Hu moments); (3) the post-processing phase, in which false minutiae are removed; and (4) the classification phase, in which features are enhanced before being fed into the classifier. K-nearest neighbors and support vector machines are used as classifiers. The classifier was trained on a benchmark dataset to compare the performance of the proposed offline signature verification system against the state-of-the-art. The accuracy of the proposed system is 92.3%.
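The classification phase can be pictured with a bare-bones k-nearest-neighbours vote. The 3-D feature vectors below (standing in for scaled minutiae/aspect-ratio/area features) and their labels are invented for illustration, not the paper's data.

```python
import math

def knn_predict(train, query, k=3):
    """Plain k-nearest-neighbours majority vote over feature vectors.
    train: list of (features, label) pairs; query: a feature vector."""
    nearest = sorted(train, key=lambda s: math.dist(s[0], query))[:k]
    votes = [label for _, label in nearest]
    return max(set(votes), key=votes.count)

# Hypothetical scaled features: (minutiae density, aspect ratio, area fraction).
train = [
    ((0.8, 0.5, 0.6), "genuine"), ((0.7, 0.6, 0.5), "genuine"),
    ((0.9, 0.4, 0.7), "genuine"), ((0.2, 0.9, 0.1), "forged"),
    ((0.3, 0.8, 0.2), "forged"), ((0.1, 0.7, 0.3), "forged"),
]
print(knn_predict(train, (0.75, 0.55, 0.55)))  # expected: genuine
```

A real system would extract these features from the pre-processed signature image and typically compare kNN against an SVM, as the paper does.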

Keywords: signature, ridge breaks, minutiae, orientation

Procedia PDF Downloads 121
192 Mixed Effects Models for Short-Term Load Forecasting for the Spanish Regions: Castilla-Leon, Castilla-La Mancha and Andalucia

Authors: C. Senabre, S. Valero, M. Lopez, E. Velasco, M. Sanchez

Abstract:

This paper focuses on an application of linear mixed models to short-term load forecasting. The challenge of this research is to improve a currently working model at the Spanish Transport System Operator, programmed by us and based on linear autoregressive techniques and neural networks. The forecasting system currently forecasts each of the regions within the Spanish grid separately, even though the behavior of the load in each region is affected by the same factors in a similar way. The load forecasting system has been verified in this work using real data from a utility. As a starting point, this research integrates several regions into a single linear mixed model so that each region's forecast can draw on information from the others: the model first learns the general behavior present in all regions, and then identifies the individual deviation of each region. The technique can be especially useful when modeling the effect of special days with scarce information from the past. The three most relevant regions of the system have been used to test the model, focusing on special days, and the proposed approach improves the performance of both currently working models used as benchmarks. A range of comparisons with different forecasting models has been conducted. The forecasting results demonstrate the superiority of the proposed methodology.
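The mixed-model idea of a behavior shared by all regions plus an individual deviation per region can be caricatured as a pooled mean with regional offsets. The load figures below are invented, and a real linear mixed model would estimate the fixed and random effects jointly with covariates (temperature, calendar effects, etc.).

```python
from statistics import mean

# Hypothetical daily peak loads (GW) for three regions over the same days.
loads = {
    "Castilla-Leon":      [2.1, 2.3, 2.0, 2.4],
    "Castilla-La Mancha": [1.8, 2.0, 1.7, 2.1],
    "Andalucia":          [5.9, 6.2, 5.8, 6.3],
}

# "Fixed effect": behaviour shared by all regions (the pooled mean).
grand_mean = mean(v for series in loads.values() for v in series)

# "Random effects": each region's deviation from the shared behaviour.
offsets = {r: mean(series) - grand_mean for r, series in loads.items()}

def forecast(region):
    """Naive mixed-style level forecast: shared level plus regional offset."""
    return grand_mean + offsets[region]

print({r: round(forecast(r), 2) for r in loads})
```

The benefit the paper exploits is that the shared component is estimated from all regions' data, which helps precisely on special days where any single region has too few past observations.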

Keywords: short-term load forecasting, mixed effects models, neural networks

Procedia PDF Downloads 161
191 Performance Comparison of Thread-Based and Event-Based Web Servers

Authors: Aikaterini Kentroti, Theodore H. Kaskalis

Abstract:

Today, web servers are expected to serve thousands of client requests concurrently within stringent response time limits. In this paper, we experimentally evaluate and compare the performance as well as the resource utilization of popular web servers, which differ in their approach to handling concurrency. More specifically, Central Processing Unit (CPU)- and I/O-intensive tests were conducted against the thread-based Apache and Go as well as the event-based Nginx and Node.js under increasing concurrent load. The tests involved concurrent users requesting a term of the Fibonacci sequence (the 10th, 20th, 30th) and the content of a table from the database. The results show that Go achieved the best performance in all benchmark tests. For example, Go reached two times higher throughput than Node.js and five times higher than Apache and Nginx in the 20th Fibonacci term test. In addition, Go had the smallest memory footprint and demonstrated the most efficient resource utilization in terms of CPU usage. In contrast, Node.js had by far the largest memory footprint, consuming up to 90% more memory than Nginx and Apache. Regarding the performance of Apache and Nginx, our findings indicate that Hypertext Preprocessor (PHP) becomes a bottleneck when the servers are required to respond by performing CPU-intensive tasks under increasing concurrent load.
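The CPU-intensive workload in these tests is the computation of a Fibonacci term. The sketch below uses a deliberately naive recursive implementation in Python (the benchmarked servers themselves run Go, PHP and JavaScript) only to show why cost grows steeply with the term index, which is what stresses the servers' concurrency models.

```python
import time

def fib(n):
    """Deliberately naive recursive Fibonacci: exponential-time,
    pure-CPU work of the kind each concurrent client requests."""
    return n if n < 2 else fib(n - 1) + fib(n - 2)

for n in (10, 20):
    t0 = time.perf_counter()
    value = fib(n)
    elapsed_ms = (time.perf_counter() - t0) * 1e3
    print(f"fib({n}) = {value}  ({elapsed_ms:.2f} ms)")
```

Each +10 on the index multiplies the call count by roughly the golden ratio to the tenth power (~123x), so the 30th-term test is orders of magnitude heavier than the 10th.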

Keywords: Apache, Go, Nginx, Node.js, web server benchmarking

Procedia PDF Downloads 63
190 The Effect of Mandatory International Financial Reporting Standards Reporting on Investors' Herding Practice: Evidence from EU Equity Markets

Authors: Mohammed Lawal Danrimi, Ervina Alfan, Mazni Abdullah

Abstract:

The purpose of this study is to investigate whether the adoption of International Financial Reporting Standards (IFRS) encourages information-based trading and mitigates investors’ herding practice in emerging EU equity markets. Utilizing a modified non-linear model of cross-sectional absolute deviation (CSAD), we find that the hypothesis that mandatory IFRS adoption improves the information set of investors and reduces irrational investment behavior may in some cases be incorrect, and the reverse may be true. For instance, with regard to herding concerns, the new reporting benchmark has rather aggravated investors’ herding practice. However, we also find that mandatory IFRS adoption does not appear to be the only instigator of the observed herding practice; national institutional factors, particularly regulatory quality, political stability and control of corruption, also significantly contribute to investors’ herd formation around the new reporting regime. The findings would be of interest to academics, regulators and policymakers in performing a cost-benefit analysis of the so-called better reporting regime, as well as financial statement users who make decisions based on firms’ fundamental variables, treating them as significant indicators of future market movement.
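The CSAD dispersion measure at the heart of this herding test can be sketched in a few lines. The return series below are invented for illustration; the paper's modified non-linear model then regresses CSAD on |R_m| and R_m², and herding shows up as a negative coefficient on the squared term.

```python
from statistics import mean

def csad(returns):
    """Cross-sectional absolute deviation for one period:
    CSAD_t = (1/N) * sum_i |R_i,t - R_m,t|, where R_m,t is the
    equal-weighted market return. Low CSAD in big market moves
    is the classic herding signature."""
    r_m = mean(returns)
    return mean(abs(r - r_m) for r in returns)

# Hypothetical daily stock returns (%): a dispersed vs. a herded cross-section.
dispersed = [1.5, -2.0, 0.4, 3.1, -1.2]
herded    = [0.9, 1.0, 1.1, 0.9, 1.0]
print(csad(dispersed), csad(herded))
```

Computing this series per trading day, pre- and post-IFRS adoption, is what allows the paper to compare herding across the two reporting regimes.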

Keywords: equity markets, herding, IFRS, CSAD

Procedia PDF Downloads 148
189 Experimental Investigation of Fluid Dynamic Effects on Crystallisation Scale Growth and Suppression in Agitation Tank

Authors: Prasanjit Das, M. M. K. Khan, M. G. Rasul, Jie Wu, I. Youn

Abstract:

Mineral scale formation is undoubtedly a more serious problem in the mineral industry than in other process industries. To better understand scale growth and suppression, an experimental model is proposed in this study for supersaturated crystallised solutions commonly found in mineral process plants. In this experiment, surface crystallisation of potassium nitrate (KNO3) on the wall of the agitation tank and the effects of agitation on scale growth and suppression are studied. The new quantitative scale suppression model predicts that at lower agitation speeds the scale growth rate is enhanced, while at higher agitation speeds the scale suppression rate increases due to the increased flow erosion effect. A lab-scale agitation tank, with and without baffles, was used as a benchmark in this study. The fluid dynamic effects on scale growth and suppression in the agitation tank were investigated with three impellers of different sizes (diameters 86, 114 and 160 mm; model A310 with flow number 0.56) over a range of rotational speeds (up to 700 rpm) and with solutions of different concentrations (4.5, 4.75 and 5.25 mol/dm3). For further elucidation, the effects of impeller size on the wall surface scale growth and suppression rates, as well as the bottom settled scale accumulation rate, are also discussed. Emphasis was placed on applications in the mineral industry, although the results are also relevant to other industrial applications.

Keywords: agitation tank, crystallisation, impeller speed, scale

Procedia PDF Downloads 189
188 Deep Routing Strategy: Deep Learning Based Intelligent Routing in Software Defined Internet of Things

Authors: Zabeehullah, Fahim Arif, Yawar Abbas

Abstract:

Software Defined Network (SDN) is a next-generation networking model which simplifies traditional network complexities and improves the utilization of constrained resources. Currently, most SDN-based Internet of Things (IoT) environments use traditional network routing strategies which work on the basis of a maximum or minimum metric value. However, IoT network heterogeneity, dynamic traffic flow and complexity demand intelligent and self-adaptive routing algorithms, because traditional routing algorithms lack self-adaptation, intelligence and efficient utilization of resources. To some extent, SDN, due to its flexibility and centralized control, has managed IoT complexity and heterogeneity, but Software Defined IoT (SDIoT) still lacks intelligence. To address this challenge, we propose a model called Deep Routing Strategy (DRS) which uses a deep learning algorithm to perform routing in SDIoT intelligently and efficiently. Our model uses real-time traffic for training and learning. Results demonstrate that the proposed model achieves high accuracy and a low packet loss rate during path selection. The proposed model also outperformed the benchmark routing algorithm (OSPF). Moreover, the proposed model provided encouraging results under highly dynamic traffic flow.

Keywords: SDN, IoT, DL, ML, DRS

Procedia PDF Downloads 84
187 Comparison of Sourcing Process in Supply Chain Operation References Model and Business Information Systems

Authors: Batuhan Kocaoglu

Abstract:

Although using powerful systems like ERP (Enterprise Resource Planning), companies still cannot easily benchmark their processes and measure their process performance based on predefined SCOR (Supply Chain Operation References) terms. The purpose of this research is to identify common and corresponding processes and to present a conceptual model for modeling and measuring the purchasing process of an organization. The main steps of the research study are: a literature review of the 'procure to pay' process in ERP systems; a literature review of the 'sourcing' process in the SCOR model; and the development of a conceptual model integrating the 'sourcing' process of the SCOR model with the 'procure to pay' process of the ERP model. In this study, we examined the similarities and differences between these two models. The proposed framework is based on assumptions drawn from (1) the body of literature and (2) the authors' experience working in the field of enterprise and logistics information systems. The modeling framework provides a structured and systematic way to model and decompose necessary information from conceptual representation to process element specification. This conceptual model will help organizations build quality measurement instruments and tools for their purchasing systems, and the proposed adaptations for ERP systems and the SCOR model will yield a more benchmarkable and globally standardized business process.

Keywords: SCOR, ERP, procure to pay, sourcing, reference model

Procedia PDF Downloads 339
186 Normalizing Scientometric Indicators of Individual Publications Using Local Cluster Detection Methods on Citation Networks

Authors: Levente Varga, Dávid Deritei, Mária Ercsey-Ravasz, Răzvan Florian, Zsolt I. Lázár, István Papp, Ferenc Járai-Szabó

Abstract:

One of the major shortcomings of widely used scientometric indicators is that different disciplines cannot be compared with each other. The issue of cross-disciplinary normalization has long been discussed, but even the classification of publications into scientific domains poses problems. Structural properties of citation networks offer new possibilities; however, the large size and constant growth of these networks call for caution. Here we present a new tool that relies on the structural properties of citation networks to perform cross-field normalization of scientometric indicators of individual publications. Due to the large size of the networks, a systematic procedure for identifying scientific domains based on a local community detection algorithm is proposed. The algorithm is tested on different benchmark and real-world networks. Then, using this algorithm, the mechanism of the scientometric indicator normalization process is shown for indicators such as the citation number, the P-index and a local version of the PageRank indicator. The fat-tail trend of the article indicator distribution enables us to successfully perform the indicator normalization process.
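One simple way to picture cross-field normalization once clusters are detected is a within-cluster percentile rank. The citation counts and cluster labels below are hypothetical; the paper's actual indicators (P-index, local PageRank) are more elaborate, but the normalization principle is the same.

```python
def percentile_rank(cluster_citations, value):
    """Fraction of publications in the detected cluster with fewer
    citations than `value`. Ranking within the local cluster makes
    papers from differently citing fields comparable."""
    below = sum(1 for c in cluster_citations if c < value)
    return below / len(cluster_citations)

# Hypothetical citation counts inside two detected clusters (fields).
math_cluster = [0, 1, 2, 3, 5, 8, 12, 20]
bio_cluster  = [5, 15, 30, 60, 90, 150, 300, 600]

# A 12-citation maths paper outranks a 30-citation biology paper
# once each is judged against its own cluster.
print(percentile_rank(math_cluster, 12), percentile_rank(bio_cluster, 30))
```

This is why raw citation counts mislead across fields: the same count sits at very different points of each cluster's (fat-tailed) distribution.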

Keywords: citation networks, cross-field normalization, local cluster detection, scientometric indicators

Procedia PDF Downloads 176
185 Taylor’s Law and Relationship between Life Expectancy at Birth and Variance in Age at Death in Period Life Table

Authors: David A. Swanson, Lucky M. Tedrow

Abstract:

Taylor’s Law is a widely observed empirical pattern that relates variances to means in sets of non-negative measurements via an approximate power function, and it has found application to human mortality. This study adds to this research by showing that Taylor’s Law leads to a model that reasonably describes the relationship between life expectancy at birth (e0, which is also equal to the mean age at death in a life table) and variance in age at death in seven World Bank regional life tables measured at two points in time, 1970 and 2000. Using as a benchmark a non-random sample of four Japanese female life tables covering the period from 1950 to 2004, the study finds that the simple linear model provides reasonably accurate estimates of variance in age at death in a life table from e0, where the latter ranges from 60.9 to 85.59 years. Employing 2017 life tables from the Human Mortality Database, the simple linear model is used to provide estimates of variance in age at death for six countries, three of which have high e0 values and three of which have lower e0 values. The paper provides a substantive interpretation of Taylor’s Law relative to e0 and concludes by arguing that reasonably accurate estimates of variance in age at death in a period life table can be calculated using this approach, which can also be used where e0 itself is estimated rather than generated through the construction of a life table, a useful feature of the model.
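The simple linear model relating e0 to variance in age at death can be sketched with a closed-form least-squares fit. The (e0, variance) pairs below are invented to mimic the well-known negative association (higher life expectancy, more compressed ages at death); they are not the World Bank or HMD figures.

```python
def fit_line(xs, ys):
    """Ordinary least squares for y = a + b*x, in closed form."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = my - b * mx
    return a, b

# Hypothetical (e0 in years, variance in age at death in years^2) pairs.
e0  = [60.9, 68.0, 74.5, 80.2, 85.6]
var = [420.0, 330.0, 250.0, 185.0, 120.0]
a, b = fit_line(e0, var)
print(f"variance ~ {a:.1f} + {b:.2f} * e0")
```

Once fitted on life tables, such a line gives a variance estimate from e0 alone, which is the practical convenience the paper highlights for cases where e0 is estimated without constructing a full life table.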

Keywords: empirical pattern, mean age at death in a life table, mean age of a stationary population, stationary population

Procedia PDF Downloads 296
184 Probabilistic Seismic Loss Assessment of Reinforced Concrete (RC) Frame Buildings Pre- and Post-Rehabilitation

Authors: A. Flora, A. Di Lascio, D. Cardone, G. Gesualdi, G. Perrone

Abstract:

This paper considers the seismic assessment and retrofit of a pilotis-type RC frame building, which was designed for gravity loads only, prior to the introduction of seismic design provisions. Pilotis-type RC frame buildings, featuring a uniform infill throughout the height and an open ground floor, were, and still are, quite popular all over the world, as they offer large open areas very suitable for retail space at the ground floor. These architectural advantages, however, are detrimental to the building's seismic behavior, as they can give rise to a soft-storey collapse mechanism. Extensive numerical analyses are carried out to quantify and benchmark the performance of the selected building, both in terms of overall collapse capacity and expected losses. Alternative retrofit strategies are then examined, including: (i) steel jacketing of RC columns and beam-column joints, (ii) steel bracing and (iii) seismic isolation. The Expected Annual Loss (EAL) of the selected case-study building, pre- and post-rehabilitation, is evaluated following a probabilistic approach. The breakeven time of each solution is computed by comparing the initial cost of the retrofit intervention with the expected benefit in terms of EAL reduction.
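In its simplest (undiscounted) form, the breakeven computation described above reduces to dividing the retrofit cost by the annual EAL saving. The costs and EAL figures below are purely hypothetical, chosen only to show the mechanics.

```python
def breakeven_years(retrofit_cost, eal_before, eal_after):
    """Years until the retrofit pays for itself through reduced
    Expected Annual Loss (no discounting, for illustration)."""
    annual_saving = eal_before - eal_after
    if annual_saving <= 0:
        return float("inf")  # the intervention never pays back
    return retrofit_cost / annual_saving

# Hypothetical figures (EUR): EAL pre/post for two retrofit options.
print(breakeven_years(150_000, 12_000, 4_000))   # e.g. steel jacketing
print(breakeven_years(400_000, 12_000, 1_500))   # e.g. seismic isolation
```

A probabilistic study like the paper's would discount future savings and propagate the uncertainty in EAL, but the ranking logic between alternatives is the same.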

Keywords: expected annual loss, reinforced concrete buildings, seismic loss assessment, seismic retrofit

Procedia PDF Downloads 212
183 Mechanical Properties of Spark Plasma Sintered 2024 AA Reinforced with TiB₂ and Nano Yttrium

Authors: Suresh Vidyasagar Chevuri, D. B. Karunakar Chevuri

Abstract:

The main advantages of Metal Matrix Nano Composites (MMNCs) include excellent mechanical performance, good wear resistance, low creep rate, etc. The fabrication of MMNCs is quite a challenge, involving processing techniques like Spark Plasma Sintering (SPS). The objective of the present work is to fabricate aluminum-based MMNCs with the addition of small amounts of yttrium using Spark Plasma Sintering and to evaluate their mechanical and microstructure properties. Samples of 2024 AA with yttrium ranging from 0.1 to 0.5 wt%, keeping 1 wt% TiB2 constant, are fabricated by SPS. The mechanical property of hardness is determined using a Vickers hardness testing machine. The metallurgical characterization of the samples is evaluated by Optical Microscopy (OM), Field Emission Scanning Electron Microscopy (FE-SEM) and X-Ray Diffraction (XRD). An unreinforced 2024 AA sample is also fabricated as a benchmark to compare its properties with those of the developed composite. It is found that yttrium addition increases the above-mentioned properties to some extent and then decreases them gradually when the yttrium content increases beyond a point between 0.3 and 0.4 wt%. High density is achieved in the samples fabricated by spark plasma sintering when compared to any other fabrication route, and uniform distribution of yttrium is observed.

Keywords: spark plasma sintering, 2024 AA, yttrium addition, microstructure characterization, mechanical properties

Procedia PDF Downloads 206
182 Exploring the Critical Success Factors of Construction Stakeholders Team Effectiveness

Authors: Olusegun Akinsiku, Olukayode Oyediran, Koleola Odusami

Abstract:

A construction project is usually made up of a variety of stakeholders whose interests may positively or negatively impact the outcome of the project execution. The variability of project stakeholders is apparent in their cultural differences, professional background and ethics, and differences in ideas. The need for the effectiveness of construction teams has been investigated, as this is an important aspect of meeting clients' expectations in the construction industry. This study adopts a cross-sectional descriptive survey with the purpose of identifying the critical success factors (CSFs) associated with the team effectiveness of construction project stakeholders, their relationships and their effects on construction project performance. The instrument for data collection was a designed questionnaire administered to construction professionals in the construction industry in Lagos State, Nigeria using proportionate stratified sampling. The highest ranked identified CSFs include 'team trust', 'esprit de corps among members' and 'team cohesiveness'. Using factor analysis and considering the effects of team cohesiveness on project performance, the identified CSFs were categorized into three groups, namely cognitive attributes, behavior and processes attributes, and affective attributes. All three groups were observed to have a strong correlation with project performance. The findings of this study are useful in helping construction stakeholders benchmark the team effectiveness factors that will guarantee project success.

Keywords: construction, critical success factors, performance, stakeholders, team effectiveness

Procedia PDF Downloads 104
181 A Design Decision Framework for Net-Zero Carbon Buildings in Hot Climates: A Modeled Approach and Expert’s Feedback

Authors: Eric Ohene, Albert P. C. Chan, Shu-Chien HSU

Abstract:

The rising building energy consumption and related carbon emissions make it necessary to construct net-zero carbon buildings (NZCBs). The objective of net-zero buildings has raised the benchmark for building performance and will alter how buildings are designed and constructed. However, there have been growing concerns about uncertainty in net-zero building design and its cost implications for decision-making. Lessons from practice have shown that a robust net-zero building design is complex, expensive, and time-consuming. Moreover, climate conditions have an enormous implication for choosing the optimal passive and active solutions that ensure building energy performance while maintaining the indoor comfort of occupants. It is observed that 20% of the design decisions made in the initial design phase influence 80% of all design decisions. To design and construct NZCBs, it is crucial to ensure adequate decision-making during the early design phases. Therefore, this study aims to explore practical strategies for designing NZCBs and to offer a design framework that could support decision-making during the design stage of net-zero buildings. A parametric simulation approach was employed, and the perspectives of experts (i.e., architects and building designers) on the decision framework were solicited. The study could help building designers and architects guide their decision-making during the design stage of NZCBs.

Keywords: net-zero, net-zero carbon building, energy efficiency, parametric simulation, hot climate

Procedia PDF Downloads 70
180 Modeling and Benchmarking the Thermal Energy Performance of Palm Oil Production Plant

Authors: Mathias B. Michael, Esther T. Akinlabi, Tien-Chien Jen

Abstract:

Thermal energy consumption in a palm oil production plant comprises mainly steam, hot water and hot air. In the most efficient plants, hot water and air are generated from the steam supply system. Research has shown that thermal energy use in palm oil production plants accounts for about 70 percent of the total energy consumption of the plant. In order to manage the plants' energy efficiently, the energy systems are modelled and optimized. This paper aims to present a model of the steam supply system of a typical palm oil production plant in Ghana. The models include exergy and energy models of the steam boiler, the steam turbine and the palm oil mill. The paper further simulates the virtual plant model to obtain the thermal energy performance of the plant under study. The simulation results show that, under normal operating conditions, the boiler energy performance is considerably below the expected level as a result of several factors, including intermittent biomass fuel supply, significant moisture content of the biomass fuel and significant heat losses. The total thermal energy performance of the virtual plant is set as a baseline. The study finally recommends a number of energy efficiency measures to improve the plant's energy performance.
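A first-law boiler efficiency of the kind benchmarked here divides the heat absorbed by the steam by the fuel energy input. All the numbers below (steam rate, enthalpies, fuel lower heating value) are hypothetical stand-ins, chosen to show how a wet, low-LHV biomass fuel depresses the figure.

```python
def boiler_efficiency(steam_mass_kg, h_steam_kj_per_kg, h_feed_kj_per_kg,
                      fuel_mass_kg, lhv_kj_per_kg):
    """First-law (energy) boiler efficiency: heat gained by the
    steam divided by the chemical energy of the fuel burned."""
    heat_to_steam = steam_mass_kg * (h_steam_kj_per_kg - h_feed_kj_per_kg)
    fuel_energy = fuel_mass_kg * lhv_kj_per_kg
    return heat_to_steam / fuel_energy

# Hypothetical hourly figures for a biomass-fired mill boiler:
# 20 t/h of steam raised from feedwater, burning 6.5 t/h of moist
# fibre/shell fuel whose LHV is dragged down by moisture content.
eff = boiler_efficiency(20_000, 2_780, 420, 6_500, 12_000)
print(f"boiler energy efficiency: {eff:.1%}")
```

An exergy model, as used in the paper, would additionally weight each stream by its work potential and so expose losses that this energy balance hides.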

Keywords: palm biomass, steam supply, exergy and energy models, energy performance benchmark

Procedia PDF Downloads 323
179 Intelligent Transport System: Classification of Traffic Signs Using Deep Neural Networks in Real Time

Authors: Anukriti Kumar, Tanmay Singh, Dinesh Kumar Vishwakarma

Abstract:

Traffic control has been one of the most common and irritating problems since automobiles first hit the roads. Problems like traffic congestion have become a significant time burden around the world, and one significant solution can be the proper implementation of the Intelligent Transport System (ITS). It involves the integration of various tools like smart sensors, artificial intelligence, positioning technologies and mobile data services to manage traffic flow, reduce congestion and enhance drivers' ability to avoid accidents during adverse weather. Road and traffic sign recognition is an emerging field of research in ITS. The traffic sign classification problem needs to be solved, as it is a major step towards building semi-autonomous/autonomous driving systems. This work implements an approach to traffic sign classification by developing a Convolutional Neural Network (CNN) classifier on the GTSRB (German Traffic Sign Recognition Benchmark) dataset. Rather than relying on hand-crafted features, our model learns features directly from the data, and data augmentation methods are used to address the concern of an exploding number of parameters. Our model achieved an accuracy of around 97.6%, which is comparable to various state-of-the-art architectures.
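The basic building block such a CNN stacks over traffic-sign images is convolution followed by a ReLU activation, sketched below in pure Python on a toy patch. A real classifier like the paper's adds many such layers plus pooling and a softmax output; the edge-detecting kernel here is illustrative, since learned kernels are fitted during training.

```python
def conv2d_relu(image, kernel):
    """Valid-mode 2-D convolution (cross-correlation, as in most
    deep-learning libraries) followed by ReLU: the feature-extraction
    primitive a CNN repeats layer after layer."""
    kh, kw = len(kernel), len(kernel[0])
    out = []
    for i in range(len(image) - kh + 1):
        row = []
        for j in range(len(image[0]) - kw + 1):
            s = sum(image[i + di][j + dj] * kernel[di][dj]
                    for di in range(kh) for dj in range(kw))
            row.append(max(0.0, s))  # ReLU keeps positive responses
        out.append(row)
    return out

# A vertical-edge kernel firing on the boundary of a toy 4x4 patch.
patch = [[0, 0, 9, 9],
         [0, 0, 9, 9],
         [0, 0, 9, 9],
         [0, 0, 9, 9]]
edge = [[-1, 1],
        [-1, 1]]
print(conv2d_relu(patch, edge))
```

Because the same small kernel slides over the whole image, the parameter count stays modest compared to a fully connected layer, which is one reason CNNs fit GTSRB-scale problems well.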

Keywords: multiclass classification, convolutional neural network, OpenCV

Procedia PDF Downloads 146
178 A Comparative Assessment of Some Algorithms for Modeling and Forecasting Horizontal Displacement of Ialy Dam, Vietnam

Authors: Kien-Trinh Thi Bui, Cuong Manh Nguyen

Abstract:

In order to simulate and visually reproduce the operational characteristics of a dam, it is necessary to capture the displacement at different measurement points and analyze the observed movement data promptly to forecast dam safety. The accuracy of the forecasts is further improved by applying machine learning methods to the data analysis process. In this study, the horizontal displacement monitoring data of the Ialy hydroelectric dam (Vietnam) was analyzed with three machine learning algorithms for modelling and forecasting: Gaussian processes (GP), multi-layer perceptron neural networks (MLPs), and the M5-Rules algorithm. The database used in this research was built from time series collected between 2006 and 2021 and divided into two parts: a training dataset and a validation dataset. The final results show that all three algorithms perform well in both training and model validation, with the MLPs being the best model. Their usability is further investigated by comparison with a benchmark model created by multi-linear regression. The results show that the performance obtained from the GP model, the MLPs model and the M5-Rules model is much better; therefore, these three models should be used to analyze and predict the horizontal displacement of the dam.
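Model comparison of this kind typically comes down to an error metric on the held-out validation window. The sketch below computes RMSE for two hypothetical forecasts of a displacement series; the names and numbers are invented for illustration, not the Ialy monitoring data.

```python
import math

def rmse(observed, predicted):
    """Root-mean-square error between observed and predicted values."""
    return math.sqrt(sum((o - p) ** 2
                         for o, p in zip(observed, predicted)) / len(observed))

# Hypothetical horizontal-displacement series (mm) on the validation
# window, with two competing forecasts.
observed    = [12.1, 12.4, 12.9, 13.3, 13.8]
mlp_pred    = [12.0, 12.5, 12.8, 13.4, 13.7]  # nonlinear model
linear_pred = [11.5, 12.0, 12.4, 12.8, 13.2]  # multi-linear benchmark

scores = {"MLP": rmse(observed, mlp_pred),
          "multi-linear": rmse(observed, linear_pred)}
print(min(scores, key=scores.get), scores)
```

Ranking candidate models by such a validation-set score, against the multi-linear regression benchmark, mirrors the comparison the paper reports.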

Keywords: Gaussian processes, horizontal displacement, hydropower dam, Ialy dam, M5-Rules, multi-layer perception neural networks

Procedia PDF Downloads 174
177 Transition Metal Carbodiimide vs. Spinel Matrices for Photocatalytic Water Oxidation

Authors: Karla Lienau, Rafael Müller, René Moré, Debora Ressnig, Dan Cook, Richard Walton, Greta R. Patzke

Abstract:

The increasing demand for renewable energy sources and storable fuels underscores the high potential of artificial photosynthesis. The four-electron transfer process of water oxidation remains the bottleneck of water splitting, so that special emphasis is placed on the development of economic, stable and efficient water oxidation catalysts (WOCs). Our investigations introduced cobalt carbodiimide CoNCN and its transition metal analogues as WOC types, and further studies are focused on the interaction of different transition metals in the convenient all-nitrogen/carbon matrix. This provides further insights into the nature of the 'true catalyst' for cobalt centers in this non-oxide environment. Water oxidation activity is evaluated with complementary methods, namely photocatalytically, using a Ru-dye sensitized standard setup, as well as electrocatalytically, via immobilization of the WOCs on glassy carbon electrodes. To further explore the tuning potential of transition metal combinations, complementary investigations were carried out in oxidic spinel WOC matrices with more versatile host options than the carbodiimide framework. The influence of the preparative history on WOC performance was evaluated with different synthetic methods (e.g., hydrothermal or microwave-assisted synthesis). Moreover, the growth mechanism of nanoscale Co3O4 spinel as a benchmark WOC was investigated with in-situ PXRD techniques.

Keywords: carbodiimide, photocatalysis, spinels, water oxidation

Procedia PDF Downloads 259
176 A Tool to Measure Efficiency and Trust Towards eXplainable Artificial Intelligence in Conflict Detection Tasks

Authors: Raphael Tuor, Denis Lalanne

Abstract:

The Air Traffic Management (ATM) research community is missing suitable tools to design, test, and validate new UI prototypes. Important stakes underline the implementation of both decision support systems (DSS) and XAI methods into current systems. ML-based DSS are gaining in relevance as air traffic flow management (ATFM) becomes increasingly complex. However, these systems only prove useful if a human can understand them, and thus new XAI methods are needed. The human-machine dyad should work as a team, with each understanding the other. We present xSky, a configurable benchmark tool that allows us to compare different versions of an ATC interface in conflict detection tasks. Our main contributions to the ATC research community are (1) a conflict detection task simulator (xSky) that allows testing the applicability of visual prototypes on scenarios of varying difficulty and outputs relevant operational metrics, and (2) a theoretical approach to the explanations of AI-driven trajectory predictions. xSky addresses several issues that were identified within available research tools. Researchers can configure the dimensions affecting scenario difficulty with a simple CSV file. Both the content and appearance of the XAI elements can be customized in a few steps. As a proof-of-concept, we implemented an XAI prototype inspired by the maritime field.

Keywords: air traffic control, air traffic simulation, conflict detection, explainable artificial intelligence, explainability, human-automation collaboration, human factors, information visualization, interpretability, trajectory prediction

Procedia PDF Downloads 131
175 Static and Dynamic Analysis of Hyperboloidal Helix Having Thin Walled Open and Close Sections

Authors: Merve Ermis, Murat Yılmaz, Nihal Eratlı, Mehmet H. Omurtag

Abstract:

The static and dynamic analyses of a hyperboloidal helix having closed and open square box sections are investigated via a mixed finite element formulation based on Timoshenko beam theory. The Frenet triad is considered as the local coordinate system for the helix geometry. The helix domain is discretized with two-noded curved elements, and linear shape functions are used. Each node of the curved element has 12 degrees of freedom, namely three translations, three rotations, two shear forces, one axial force, two bending moments and one torque. Finite element matrices are derived using exact nodal values of curvatures and arc length, interpolated linearly over the element axial length. The torsional moments of inertia for closed and open square box sections are obtained by a finite element solution of the St. Venant torsion formulation. With the proposed method, the torsional rigidity of simply and multiply connected cross-sections can also be calculated in the same manner. The influence of the closed and the open square box cross-sections on the static and dynamic analyses of the hyperboloidal helix is investigated. Benchmark problems are presented for the literature.

Keywords: hyperboloidal helix, squared cross section, thin walled cross section, torsional rigidity

Procedia PDF Downloads 336
174 Understanding Innovation by Analyzing the Pillars of the Global Competitiveness Index

Authors: Ujjwala Bhand, Mridula Goel

Abstract:

The Global Competitiveness Index (GCI), prepared by the World Economic Forum, has become a benchmark for studying the competitiveness of countries and for understanding the factors that enable competitiveness. Innovation is a key pillar of competitiveness and has the unique property of enabling exponential economic growth. This paper analyzes how the pillars comprising the Global Competitiveness Index affect innovation and whether GDP growth can directly affect innovation outcomes for a country. The key objective of the study is to identify areas on which governments of developing countries can focus policies and programs to improve their countries' innovativeness. We compiled a panel data set covering the top innovating countries and the large emerging economies known as BRICS, from 2007-08 to 2014-15, in order to find the significant factors that affect innovation. The results of the regression analysis suggest that governments should make policies to improve labor market efficiency, establish sophisticated business networks, provide basic health and primary education to their people, and strengthen the quality of higher education and training services in the economy. The achievements of smaller economies on innovation suggest that concerted efforts by governments can counter any size-related disadvantage, and in fact can provide greater flexibility and speed in encouraging innovation.

Keywords: innovation, global competitiveness index, BRICS, economic growth

Procedia PDF Downloads 238
173 Overview of Multi-Chip Alternatives for 2.5 and 3D Integrated Circuit Packagings

Authors: Ching-Feng Chen, Ching-Chih Tsai

Abstract:

With the size of the transistor gradually approaching its physical limit, the persistence of Moore's Law is challenged, despite the development of high numerical aperture (high-NA) lithography equipment, by issues such as short-channel effects. In the context of the ever-increasing technical requirements of portable devices and high-performance computing, relying on the law's continuation to enhance chip density will no longer support the prospects of the electronics industry. Weighing a chip's power consumption, performance, area, cost, and cycle time to market (PPACC) is an updated benchmark driving the evolution of advanced nanometer (nm) wafer nodes. The advent of two-and-a-half- and three-dimensional (2.5D and 3D) Very-Large-Scale Integration (VLSI) packaging based on Through-Silicon Via (TSV) technology has updated traditional die assembly methods and provided a solution. This overview investigates up-to-date and cutting-edge packaging technologies for 2.5D and 3D integrated circuits (ICs) based on updated transistor structures and technology nodes. The authors conclude that multi-chip solutions for 2.5D and 3D IC packaging are feasible ways to prolong Moore's Law.

Keywords: moore’s law, high numerical aperture, power consumption-performance-area-cost-cycle time to market, 2.5 and 3D- very-large-scale integration, packaging, through silicon via

Procedia PDF Downloads 96
172 Radial Basis Surrogate Model Integrated to Evolutionary Algorithm for Solving Computation Intensive Black-Box Problems

Authors: Abdulbaset Saad, Adel Younis, Zuomin Dong

Abstract:

For design optimization of high-dimensional, expensive problems, an effective and efficient optimization methodology is desired. This work proposes a series of modifications to the Differential Evolution (DE) algorithm for solving computation-intensive black-box problems. The proposed methodology, called Radial Basis Meta-Model Assisted Differential Evolution (RBF-DE), is a global optimization algorithm based on meta-modeling techniques: a meta-model-assisted DE for solving computationally expensive optimization problems. A Radial Basis Function (RBF) model is used as a surrogate to approximate the expensive objective function, while DE employs a mechanism to dynamically select the best-performing combination of parameters such as the differential rate, crossover probability, and population size. The proposed algorithm is tested on benchmark functions and real-life practical applications. The test results demonstrate that the proposed algorithm is promising and performs well compared to other optimization algorithms, converging to acceptable and good solutions in terms of accuracy, number of evaluations, and time needed to converge.
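As an illustration of the surrogate-assisted scheme described above (not the authors' implementation; the Gaussian kernel, the length-scale heuristic, and the one-true-evaluation-per-generation budget are all assumptions), a minimal RBF-screened DE might look like:

```python
import numpy as np

def rbf_fit(X, y):
    """Fit a Gaussian RBF interpolant to the evaluated samples (the surrogate)."""
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    eps = 1.0 / d[d > 0].mean()                      # length scale from the data
    A = np.exp(-(eps * d) ** 2) + 1e-8 * np.eye(len(X))
    mu = y.mean()
    w = np.linalg.solve(A, y - mu)                   # interpolation weights
    def predict(P):
        dp = np.linalg.norm(P[:, None, :] - X[None, :, :], axis=-1)
        return np.exp(-(eps * dp) ** 2) @ w + mu
    return predict

def rbf_de(f, bounds, n_init=20, generations=30, pop=15, F=0.7, CR=0.9, seed=0):
    """DE/rand/1/bin where the surrogate screens all trial vectors and only the
    most promising one per generation is evaluated with the true (expensive) f."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds, dtype=float).T
    X = rng.uniform(lo, hi, size=(n_init, len(lo)))  # initial random design
    y = np.array([f(x) for x in X])
    for _ in range(generations):
        P = X[np.argsort(y)[:pop]]                   # population: best points so far
        s = rbf_fit(X, y)                            # refit surrogate on all data
        trials = []
        for t in range(pop):
            a, b, c = P[rng.choice(pop, 3, replace=False)]
            mutant = np.clip(a + F * (b - c), lo, hi)        # mutation
            cross = rng.random(len(lo)) < CR
            trials.append(np.where(cross, mutant, P[t]))     # binomial crossover
        trials = np.asarray(trials)
        best = trials[np.argmin(s(trials))]          # cheap surrogate screening
        X = np.vstack([X, best])                     # one true evaluation per gen
        y = np.append(y, f(best))
    return X[np.argmin(y)], y.min()
```

The design point is that the expensive objective is called only `n_init + generations` times, while the surrogate absorbs the cost of ranking all trial vectors.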

Keywords: differential evolution, engineering design, expensive computations, meta-modeling, radial basis function, optimization

Procedia PDF Downloads 362
171 Performance Effects of Demergers in India

Authors: Pavak Vyas, Hiral Vyas

Abstract:

Spin-offs, commonly known as demergers in India, represent the dismantling of conglomerates, a common phenomenon in financial markets across the world. Demergers are carried out with different motives. A demerger generally refers to a corporate restructuring in which a large company divests its stake in its subsidiary and distributes the shares of the subsidiary (the demerged entity) to the existing shareholders without any consideration. Demergers among Indian companies are an over-decade-old phenomenon, with many companies opting for them. This study examines the demerger regulations in Indian capital markets and the announcement-period price reaction of demergers during 2010-2015. We study a total of 97 demerger announcements by companies listed in India and try to establish that demergers result in abnormal returns for the shareholders of the parent company. Using event study methodology, we analyze the security price performance from 10 days prior to the announcement to 10 days after it. We find significant out-performance of the security over the benchmark index following demerger announcements. The cumulative average abnormal returns range from 3.71% on the day of announcement of a private demerger to 2.08% over the 10 days surrounding the announcement, and from 5.67% on the day of announcement of a public demerger to 4.15% over the 10 days surrounding the announcement.
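The core event study computation can be sketched as follows, assuming the standard market model for expected returns (the abstract does not state which expected-return model the authors used, so this is an illustration, not their exact procedure):

```python
import numpy as np

def cumulative_abnormal_return(stock_est, mkt_est, stock_evt, mkt_evt):
    """Market-model event study: fit alpha and beta on the estimation window,
    then sum abnormal returns (actual minus expected) over the event window."""
    stock_est, mkt_est = np.asarray(stock_est), np.asarray(mkt_est)
    # OLS slope and intercept of stock returns on benchmark (market) returns.
    beta = np.cov(stock_est, mkt_est, ddof=1)[0, 1] / np.var(mkt_est, ddof=1)
    alpha = stock_est.mean() - beta * mkt_est.mean()
    expected = alpha + beta * np.asarray(mkt_evt)    # model-implied returns
    abnormal = np.asarray(stock_evt) - expected      # AR per event-window day
    return abnormal.sum()                            # CAR over the window
```

Averaging this quantity across the 97 announcements, over windows such as [-10, +10], yields the cumulative average abnormal returns reported in the study.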

Keywords: demergers, event study, spin offs, stock returns

Procedia PDF Downloads 268
170 On-The-Fly Cross Sections Generation in Neutron Transport with Wide Energy Region

Authors: Rui Chen, Shu-min Zhou, Xiong-jie Zhang, Ren-bo Wang, Fan Huang, Bin Tang

Abstract:

As the temperature changes in a reactor core, the nuclide cross sections in the reactor vary with temperature, which eventually changes the reactivity. To simulate the interaction between incident neutrons and various materials at different temperatures precisely, it is necessary to generate all the relevant temperature-dependent reaction cross sections. Traditionally, real-time cross section generation is used to avoid storing huge amounts of data, but it suffers from low efficiency and is only adapted to narrow energy regions. Focusing on multi-temperature cross section generation in real time during neutron transport, this paper investigates on-the-fly cross section generation for the resolved resonance, thermal, and unresolved resonance regions, and proposes a real-time multi-temperature cross section generation method based on the double-exponential formula for the resolved resonance region, together with Neville interpolation for the thermal and unresolved resonance regions. To prove the correctness and validity of multi-temperature cross section generation over a wide energy region of the incident neutron, the proposed method was applied to criticality safety benchmark tests, which demonstrated its suitability for reactor multi-physics coupled simulation.
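Neville's algorithm, named above for the thermal and unresolved resonance regions, evaluates the interpolating polynomial directly from tabulated values without forming its coefficients. A generic sketch (illustrative only, not the authors' reactor code; here the abscissas would be tabulated temperatures and the ordinates cross section values):

```python
def neville(xs, ys, x):
    """Evaluate at x the unique polynomial through the points (xs[i], ys[i]),
    using Neville's recursive tableau of partial interpolants."""
    p = list(ys)                       # p[i] starts as the degree-0 interpolant
    n = len(xs)
    for k in range(1, n):              # combine interpolants of growing span
        for i in range(n - k):
            p[i] = ((x - xs[i + k]) * p[i] + (xs[i] - x) * p[i + 1]) \
                   / (xs[i] - xs[i + k])
    return p[0]                        # interpolant through all n points
```

Because the tableau is rebuilt per query, nothing beyond the tabulated values needs to be stored, which is the point of an on-the-fly scheme.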

Keywords: cross section, neutron transport, numerical simulation, on-the-fly

Procedia PDF Downloads 173
169 A Hybrid Traffic Model for Smoothing Traffic Near Merges

Authors: Shiri Elisheva Decktor, Sharon Hornstein

Abstract:

Highway merges and unmarked junctions are key components of any urban road network and can act as bottlenecks that create traffic disruption. Inefficient highway merges may trigger traffic instabilities such as stop-and-go waves, pose safety risks, and lead to longer journey times. These phenomena occur spontaneously if the average vehicle density exceeds a certain critical value. This study focuses on modeling the traffic using a microscopic traffic flow model. A hybrid traffic model, which combines human-driven and controlled vehicles, is assumed. The controlled vehicles obey different driving policies when approaching the merge or in the vicinity of other vehicles. We developed a co-simulation model in SUMO (Simulation of Urban Mobility), in which the human-driven cars are modeled with the Intelligent Driver Model (IDM) and the controlled cars with a dedicated controller. The scenario chosen for this study is a closed track with one merge and one exit, which could later be implemented on a scaled infrastructure in our lab setup. This will enable us to benchmark the simulation results of this study against comparable results under similar conditions in the lab. The metrics chosen for comparing the performance of our algorithm on the overall traffic conditions include the average speed, wait time near the merge, and throughput after the merge, measured under different travel demand conditions (low, medium, and heavy traffic).
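The IDM used for the human-driven cars computes a follower's acceleration from its speed, the speed difference to the leader, and the gap. A minimal sketch with commonly used parameter defaults (the defaults are assumptions, not the paper's calibrated values):

```python
import math

def idm_accel(v, v_lead, gap, v0=30.0, T=1.5, a=1.0, b=1.5, s0=2.0, delta=4):
    """Intelligent Driver Model: follower acceleration in m/s^2.
    v, v_lead in m/s; gap is the bumper-to-bumper distance in m.
    v0: desired speed, T: time headway, a: max acceleration,
    b: comfortable deceleration, s0: minimum jam gap."""
    # Desired dynamic gap: jam gap + headway term + braking interaction term.
    s_star = s0 + max(0.0, v * T + v * (v - v_lead) / (2 * math.sqrt(a * b)))
    return a * (1 - (v / v0) ** delta - (s_star / gap) ** 2)
```

On a free road the model accelerates toward `v0`; when the gap shrinks below the desired gap the quadratic interaction term forces braking, which is what produces the stop-and-go waves mentioned above once density is high enough.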

Keywords: highway merges, traffic modeling, SUMO, driving policy

Procedia PDF Downloads 76
168 BIM-Based Tool for Sustainability Assessment and Certification Documents Provision

Authors: Taki Eddine Seghier, Mohd Hamdan Ahmad, Yaik-Wah Lim, Samuel Opeyemi Williams

Abstract:

Assessing a building's sustainability against a specific green benchmark and preparing the documents required to receive a green building certification are both major challenges for a green building design team. However, this labor- and time-consuming process can take advantage of available Building Information Modeling (BIM) features such as material take-off and scheduling. Furthermore, the workflow can be automated to track potentially achievable credit points and provide rating feedback for several design options by using integrated Visual Programming (VP) to handle the parameters stored within the BIM model. Hence, this study proposes a BIM-based tool that uses the Green Building Index (GBI) rating system requirements as its input case to evaluate building sustainability in the design stage of the building project life cycle. The tool comprises two key models: first, a model for data extraction, calculation, and the classification of achievable credit points in a green template; second, a model for generating the documents required for green building certification. The tool was validated on a BIM model of a residential building and serves as proof of concept that building sustainability assessment for GBI certification can be automatically evaluated and documented through BIM.

Keywords: green building rating system, GBRS, building information modeling, BIM, visual programming, VP, sustainability assessment

Procedia PDF Downloads 302
167 Improving the Patient Guidance Satisfaction and Integrity of Patients Hospitalized in Iodine-131 Isolation Rooms

Authors: Yu Sin Syu

Abstract:

Objective: The study aimed to improve patient guidance satisfaction among patients hospitalized in iodine-131 isolation rooms, as well as the patient guidance completion rate for such patients. Method: A patient care guidance checklist and a patient care guidance satisfaction questionnaire were administered to 29 patients who had previously been hospitalized in iodine-131 isolation rooms. The evaluation was conducted on a one-on-one basis, and its results showed that the patients' satisfaction with patient guidance was only 3.7 points and that the completion rate for the patient guidance performed by nurses was only 67%. Therefore, various solutions were implemented to create a more complete patient guidance framework for nurses, including the incorporation of regular care-related training into in-service education courses; the establishment of patient care guidance standards for patients in iodine-131 isolation rooms; the establishment of inpatient care standards and auditing processes for iodine-131 isolation rooms; the creation of an introductory handbook on the ward environment; the involvement of other care team members in revising the iodine-131 health education brochures; the creation of visual cards and videos covering equipment operation procedures; and the introduction of QR codes. Results: Following the implementation of the above measures, the overall satisfaction of patients hospitalized in iodine-131 isolation rooms increased from 3.7 points to 4.6 points, and the completion rate for patient guidance rose from 67% to 100%. Conclusion: Given the excellent results achieved in this study, it is hoped that this nursing project can serve as a benchmark for other relevant departments.

Keywords: admission care guidance, guidance satisfaction, integrity, iodine-131 isolation

Procedia PDF Downloads 93
166 Distances over Incomplete Diabetes and Breast Cancer Data Based on Bhattacharyya Distance

Authors: Loai AbdAllah, Mahmoud Kaiyal

Abstract:

Missing values in real-world datasets are a common problem. Many algorithms have been developed to deal with this problem; most of them replace the missing values with a fixed value computed from the observed values. In our work, we use a distance function based on the Bhattacharyya distance, which measures the similarity of two probability distributions, to measure the distance between objects with missing values. The proposed distance distinguishes between known and unknown values: the distance between two known values is the Mahalanobis distance, while if one of them is missing, the distance is computed from the distribution of the known values for the coordinate that contains the missing value. This method was integrated with Wikaya, a digital health company developing a platform that helps improve the prevention of chronic diseases such as diabetes and cancer. For Wikaya's recommendation system to work, distances between users need to be measured, and since the collected data contain missing values, a distance function between incomplete user profiles is needed. To evaluate the accuracy of the proposed distance function in reflecting the actual similarity between objects when some of them contain missing values, we integrated it within the framework of the k-nearest neighbors (kNN) classifier, since its computation is based only on the similarity between objects. To validate this, we ran the algorithm over the diabetes and breast cancer datasets, standard benchmark datasets from the UCI repository. Our experiments show that the kNN classifier using our proposed distance function outperforms kNN using other existing methods.
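A simplified sketch of the idea follows, using per-coordinate squared differences and column statistics in place of the paper's Mahalanobis/Bhattacharyya machinery (which requires the full covariance structure); the point illustrated is the same: missing values contribute an expectation over the observed distribution rather than an imputed constant.

```python
import numpy as np

def missing_aware_dist(u, v, col_vals):
    """Coordinate-wise distance that distinguishes known from missing (NaN)
    values: known-known pairs contribute a squared difference; when a value
    is missing, the expected squared distance over that column's observed
    values is used instead."""
    d = 0.0
    for i, (a, b) in enumerate(zip(u, v)):
        if not np.isnan(a) and not np.isnan(b):
            d += (a - b) ** 2                          # both values known
        elif np.isnan(a) and np.isnan(b):
            d += 2.0 * np.var(col_vals[i])             # both unknown
        else:
            known = b if np.isnan(a) else a
            d += np.mean((col_vals[i] - known) ** 2)   # expectation over column
    return float(np.sqrt(d))

def knn_predict(X, y, query, k=3):
    """kNN majority vote using the missing-aware distance above."""
    cols = [X[~np.isnan(X[:, j]), j] for j in range(X.shape[1])]
    order = np.argsort([missing_aware_dist(query, x, cols) for x in X])
    labels, counts = np.unique(y[order[:k]], return_counts=True)
    return labels[np.argmax(counts)]
```

Since kNN depends only on pairwise distances, swapping in such a function is the entire integration step, which is exactly why the authors chose kNN as the evaluation framework.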

Keywords: missing values, incomplete data, distance, incomplete diabetes data

Procedia PDF Downloads 188
165 A Dynamic Solution Approach for Heart Disease Prediction

Authors: Walid Moudani

Abstract:

The healthcare environment is generally perceived as information-rich yet knowledge-poor: there is a lack of effective analysis tools to discover hidden relationships and trends in data. In fact, valuable knowledge can be discovered by applying data mining techniques in healthcare systems. This study presents a proficient methodology for extracting significant patterns from coronary heart disease warehouses for heart attack prediction, a condition that unfortunately continues to be a leading cause of mortality worldwide. For this purpose, we propose to dynamically enumerate the optimal subsets of the reduced features of high interest by using rough sets associated with dynamic programming, and then to validate the classification using a Random Forest (RF) decision tree ensemble to identify risky heart disease cases. This work is based on a large amount of data collected from several clinical institutions, drawing on the medical profiles of patients. Moreover, experts' knowledge in this field has been taken into consideration to define the disease and its risk factors, and to establish significant knowledge relationships among the medical factors. A computer-aided system was developed for this purpose based on a population of 525 adults. The performance of the proposed model is analyzed and evaluated against a set of benchmark techniques applied to this classification problem.
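The rough-set feature reduction step can be illustrated with a minimal reduct search: find the smallest feature subset that preserves the dependency degree (the fraction of records whose feature values determine the class unambiguously). This is a brute-force sketch for illustration; the paper uses dynamic programming to make the enumeration tractable, which is not reproduced here.

```python
from itertools import combinations

def dependency(rows, labels, feats):
    """Rough-set dependency degree: fraction of rows whose values on `feats`
    determine the label unambiguously (the positive region)."""
    groups = {}
    for r, lab in zip(rows, labels):
        groups.setdefault(tuple(r[f] for f in feats), set()).add(lab)
    consistent = sum(1 for r, lab in zip(rows, labels)
                     if len(groups[tuple(r[f] for f in feats)]) == 1)
    return consistent / len(rows)

def minimal_reduct(rows, labels):
    """Smallest feature subset preserving the full-feature dependency degree
    (exhaustive search over subsets of increasing size)."""
    n = len(rows[0])
    full = dependency(rows, labels, range(n))
    for k in range(1, n + 1):
        for feats in combinations(range(n), k):
            if dependency(rows, labels, feats) == full:
                return feats
    return tuple(range(n))
```

The reduct then defines the feature subset fed to the Random Forest classifier, discarding attributes that add no discriminating power.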

Keywords: multi-classifier decision trees, feature reduction, dynamic programming, rough sets

Procedia PDF Downloads 381