Search results for: learning-oriented benchmark
215 Advanced Seismic Retrofit of a School Building by a DFP Base Isolation Solution
Authors: Stefano Sorace, Gloria Terenzi
Abstract:
The study of a base isolation seismic retrofit solution for a reinforced concrete school building is presented in this paper. The building was assumed as a benchmark structure for a Research Project financed by the Italian Department of Civil Protection, and is representative of several similar public edifices designed with earlier Technical Standards editions, in Italy as well as in other earthquake-prone European countries. The structural characteristics of the building, and a synthesis of the investigation campaigns developed on it, are initially presented. The mechanical parameters, dimensions, locations and installation details of the base isolation system, incorporating double friction pendulum sliding bearings as protective devices, are then illustrated, along with the performance assessment analyses carried out in original and rehabilitated conditions according to a full non-linear dynamic approach. The results of the analyses show a remarkable enhancement of the seismic response capacities of the structure in base-isolated configuration. This allows reaching the high performance levels postulated in the rehabilitation design with notably lower costs and architectural intrusion as compared to traditional retrofit interventions designed for the same objectives.
Keywords: seismic retrofit, seismic assessment, r/c structures, school buildings, base isolation
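For readers unfamiliar with the mechanics behind such friction pendulum retrofits, the standard bilinear idealization of a friction pendulum bearing can be sketched as below. This is a generic textbook relation, not the authors' model, and the numerical values (effective radius, friction coefficient, design displacement) are purely illustrative:

```python
import math

def dfp_effective_period(r_eff, mu, d, g=9.81):
    """Effective period of a friction pendulum bearing at design
    displacement d, from the bilinear idealization:
    k_eff / W = 1 / R_eff + mu / d,  T_eff = 2*pi*sqrt(1 / (g * k_eff / W))."""
    k_eff_over_w = 1.0 / r_eff + mu / d  # effective stiffness per unit weight (1/m)
    return 2.0 * math.pi * math.sqrt(1.0 / (g * k_eff_over_w))

# Illustrative values: 3.1 m effective radius, 3% friction, 0.25 m displacement
print(round(dfp_effective_period(3.1, 0.03, 0.25), 2))
```

Lengthening the fundamental period into this range is what decouples the superstructure from the ground motion and produces the kind of response reduction the abstract reports.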
Procedia PDF Downloads 270
214 A New 3D Shape Descriptor Based on Multi-Resolution and Multi-Block CS-LBP
Authors: Nihad Karim Chowdhury, Mohammad Sanaullah Chowdhury, Muhammed Jamshed Alam Patwary, Rubel Biswas
Abstract:
In content-based 3D shape retrieval systems, achieving high search performance has become an important research problem. A challenging aspect of this problem is to find an effective shape descriptor which can discriminate similar shapes adequately. To address this problem, we propose a new shape descriptor for 3D shape models by combining multi-resolution with the multi-block center-symmetric local binary pattern (CS-LBP) operator. Given an arbitrary 3D shape, we first apply pose normalization and generate a set of multi-viewed 2D rendered images. Second, we apply a Gaussian multi-resolution filter to generate several levels of images from each 2D rendered image. Then, overlapped sub-images are computed for each image level of a multi-resolution image. Our unique multi-block CS-LBP comes next. It allows the center to be composed of m-by-n rectangular pixels, instead of a single pixel. This process is repeated for all the 2D rendered images, derived from both ‘depth-buffer’ and ‘silhouette’ rendering. Finally, we concatenate all the feature vectors into a one-dimensional histogram as our proposed 3D shape descriptor. Through several experiments, we demonstrate that our proposed 3D shape descriptor outperforms previous methods on a benchmark dataset.
Keywords: 3D shape retrieval, 3D shape descriptor, CS-LBP, overlapped sub-images
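A minimal sketch of the center-symmetric LBP coding underlying the descriptor may help. The neighbourhood values below are illustrative, and in the multi-block variant the single-pixel intensities are simply replaced by m-by-n block means:

```python
def cs_lbp_code(neigh, threshold=0.0):
    """4-bit center-symmetric LBP code from 8 neighbour intensities listed
    clockwise. Opposite pairs (0,4), (1,5), (2,6), (3,7) are compared, so
    the centre pixel itself never enters the code."""
    code = 0
    for i in range(4):
        if neigh[i] - neigh[i + 4] > threshold:
            code |= 1 << i
    return code

def block_mean(img, r0, c0, m, n):
    """Mean intensity of an m-by-n block; the multi-block variant uses
    these means in place of single pixels in the comparisons above."""
    vals = [img[r][c] for r in range(r0, r0 + m) for c in range(c0, c0 + n)]
    return sum(vals) / len(vals)

# 8 neighbour intensities, clockwise (illustrative values) -> code 5
print(cs_lbp_code([90, 10, 80, 20, 10, 90, 20, 80]))
```

Histogramming these 16-valued codes over the overlapped sub-images, across resolutions and views, is what yields the concatenated descriptor the abstract describes.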
Procedia PDF Downloads 443
213 Automated Testing of Workshop Robot Behavior
Authors: Arne Hitzmann, Philipp Wentscher, Alexander Gabel, Reinhard Gerndt
Abstract:
Autonomous mobile robots can be found in a wide field of applications. Their types range from household robots over workshop robots to autonomous cars and many more. All of them undergo a number of testing steps during development, production and maintenance. This paper describes an approach to improve testing of robot behavior. It was inspired by the RoboCup @work competition, which itself constitutes a benchmark for industrial robotics. There, scaled-down versions of mobile industrial robots have to navigate through a workshop-like environment or operation area and have to perform tasks of manipulating and transporting workpieces. This paper will introduce an approach of automated vision-based testing of the behavior of the so-called youBot robot, which is the most widely used robot platform in the RoboCup @work competition. The proposed system allows automated testing of multiple attempts of the robot to perform a specific mission, and it allows for the flexibility of the robot, e.g. selecting different paths between two tasks within a mission. The approach is based on a multi-camera setup using off-the-shelf cameras and optical markers. It has been applied for test-driven development (TDD) and maintenance-like verification of the robot behavior and performance.
Keywords: supervisory control, testing, markers, mono vision, automation
Procedia PDF Downloads 377
212 Violence Detection and Tracking on Moving Surveillance Video Using Machine Learning Approach
Authors: Abe Degale D., Cheng Jian
Abstract:
When creating automated video surveillance systems, violent action recognition is crucial. In recent years, hand-crafted feature detectors have been the primary method for achieving violence detection, such as the recognition of fighting activity. Researchers have also looked into learning-based representational models. On benchmark datasets created especially for the detection of violent sequences in sports and movies, these methods produced good accuracy results. However, the Hockey dataset's videos with surveillance camera motion present challenges for these algorithms in learning discriminative features. Image recognition and human activity detection challenges have shown success with deep representation-based methods. For the purpose of detecting violent images and identifying aggressive human behaviours, this research proposes a deep representation-based model using the transfer learning idea. The results show that the suggested approach outperforms state-of-the-art accuracy levels by learning the most discriminating features, attaining 99.34% and 99.98% accuracy on the Hockey and Movies datasets, respectively.
Keywords: violence detection, faster RCNN, transfer learning, surveillance video
Procedia PDF Downloads 106
211 Competitiveness and Value Creation of Tourism Sector: In the Case of 10 ASEAN Economies
Authors: Apirada Chinprateep
Abstract:
The ASEAN Economic Community (AEC) is the goal of regional economic integration by 2015. Tourism is an activity of growing importance, especially as a source of foreign currency, employment creation and income distribution for the region. Given the preparations of the member countries and the complexity of the issues entailed by the concept of sustainable tourism, this paper tries to assess tourism sustainability based on a number of quantitative indicators for all ten economies: Thailand, compared with the other nine countries of Myanmar, Laos, Vietnam, Malaysia, Singapore, Indonesia, the Philippines, Cambodia, and Brunei. The proposed methodological framework provides a number of benchmarks of tourism activities in the countries assessed. These include identification of the dimensions, for example, economic, socio-ecologic and infrastructure, the indicators, the method of scaling, chart representation and evaluation of the Asian countries. This specification shows that a similar level of tourism activity might involve different forms of implementation of the tourism activity and might have different consequences for the socio-ecological environment and sustainability. The heterogeneity of the developing countries briefly exposed here is useful for detecting, and preparing to cope with, the main problems of each country in its tourism activities, as well as for the competitiveness and value creation of tourism for the ASEAN economic community, which will be compared with other parts of the world and the world benchmark.
Keywords: AEC, ASEAN, sustainable, tourism, competitiveness
Procedia PDF Downloads 426
210 Importance of Standards in Engineering and Technology Education
Authors: Ahmed S. Khan, Amin Karim
Abstract:
During the past several decades, the economy of each nation has been significantly affected by globalization and technology. Government regulations and private sector standards affect a majority of world trade. Countries have been working together to establish international standards in almost every field. As a result, workers in all sectors need to have an understanding of standards. Engineering and technology students must not only possess an understanding of engineering standards and applicable government codes, but also learn to apply them in designing, developing, testing and servicing products, processes and systems. Accreditation Board for Engineering & Technology (ABET) criteria for engineering and technology education require students to learn and apply standards in their class projects. This paper is a follow-up of a 2006-2009 NSF initiative awarded to IEEE to help develop tutorials and case study modules for students and encourage standards education at college campuses. It presents the findings of a faculty/institution survey conducted through various U.S.-based listservs representing the major engineering and technology disciplines. The intent of the survey was to gauge the status of use of standards and regulations in engineering and technology coursework and to identify benchmark practices. In light of the survey findings, recommendations are made to standards development organizations, industry, and academia to help enhance the use of standards in engineering and technology curricula.
Keywords: standards, regulations, ABET, IEEE, engineering, technology curricula
Procedia PDF Downloads 288
209 An Innovation Decision Process View in an Adoption of Total Laboratory Automation
Authors: Chia-Jung Chen, Yu-Chi Hsu, June-Dong Lin, Kun-Chen Chan, Chieh-Tien Wang, Li-Ching Wu, Chung-Feng Liu
Abstract:
With fast advances in healthcare technology, various total laboratory automation (TLA) processes have been proposed. However, adopting TLA needs quite high funding. This study explores an early adoption experience by Taiwan’s large-scale hospital group, the Chimei Hospital Group (CMG), which owns three branch hospitals (Yongkang, Liouying and Chiali, in order of service scale), based on the five stages of Everett Rogers’ Diffusion Decision Process. 1. Knowledge stage: Over the years, two weaknesses have existed in the laboratory department of CMG: 1) only a few examination categories (e.g., sugar testing and HbA1c) can now be completed and reported within a day during an outpatient clinical visit; 2) the Yongkang Hospital laboratory space is dispersed across three buildings, resulting in duplicated investment in analysis instruments and inconvenient manual specimen transportation. Thus, the senior management of the department raised a crucial question: was it time to proceed with the redesign of the laboratory department? 2. Persuasion stage: At the end of 2013, Yongkang Hospital’s new building and restructuring project created a great opportunity for the redesign of the laboratory department. However, not all laboratory colleagues had a consensus for change. Thus, the top managers arranged a series of benchmark visits to stimulate colleagues into being aware of and accepting TLA. Later, the director of the department proposed a formal report to the top management of CMG with the results of the benchmark visits, a preliminary feasibility analysis, potential benefits and so on. 3. Decision stage: This TLA suggestion was well-supported by the top management of CMG and, finally, they made a decision to carry out the project with an instrument-leasing strategy. After the announcement of a request for proposal and several vendor briefings, CMG confirmed their laboratory automation architecture and finally completed the contracts.
At the same time, a cross-department project team was formed and the laboratory department assigned a section leader to the National Taiwan University Hospital for one month of relevant training. 4. Implementation stage: During the implementation, the project team called for regular meetings to review the results of the operations and to offer an immediate response to any adjustment. The main project tasks included: 1) completion of the preparatory work for beginning the automation procedures; 2) ensuring information security and privacy protection; 3) formulating automated examination process protocols; 4) evaluating the performance of new instruments and the instrument connectivity; 5) ensuring good integration with hospital information systems (HIS)/laboratory information systems (LIS); and 6) ensuring continued compliance with ISO 15189 certification. 5. Confirmation stage: In short, the core process changes include: 1) cancellation of signature seals on the specimen tubes; 2) transfer of daily examination reports to a data warehouse; 3) routine pre-admission blood drawing and formal inpatient morning blood drawing can be incorporated into an automatically-prepared tube mechanism. The study summarizes the continuous improvement orientations below: (1) Flexible reference range set-up for new instruments in LIS. (2) Restructuring of the specimen categories. (3) Continuous review and improvement of the examination process. (4) Whether to install the tube (specimen) delivery tracks needs further evaluation.
Keywords: innovation decision process, total laboratory automation, health care
Procedia PDF Downloads 419
208 Lightweight Hybrid Convolutional and Recurrent Neural Networks for Wearable Sensor Based Human Activity Recognition
Authors: Sonia Perez-Gamboa, Qingquan Sun, Yan Zhang
Abstract:
Non-intrusive sensor-based human activity recognition (HAR) is utilized in a spectrum of applications, including fitness tracking devices, gaming, health care monitoring, and smartphone applications. Deep learning models such as convolutional neural networks (CNNs) and long short term memory (LSTM) recurrent neural networks (RNNs) provide a way to achieve HAR accurately and effectively. In this paper, we design a multi-layer hybrid architecture with CNN and LSTM and explore a variety of multi-layer combinations. Based on this exploration, we present a lightweight, hybrid, multi-layer model, which can improve the recognition performance by integrating local, scale-invariant features with dependencies of activities. The experimental results demonstrate the efficacy of the proposed model, which can achieve a 94.7% activity recognition rate on a benchmark human activity dataset. This model outperforms traditional machine learning and other deep learning methods. Additionally, our implementation achieves a balance between recognition rate and training time consumption.
Keywords: deep learning, LSTM, CNN, human activity recognition, inertial sensor
Procedia PDF Downloads 150
207 Comparative Study of Deep Reinforcement Learning Algorithm Against Evolutionary Algorithms for Finding the Optimal Values in a Simulated Environment Space
Authors: Akshay Paranjape, Nils Plettenberg, Robert Schmitt
Abstract:
Traditional optimization methods like evolutionary algorithms are widely used in production processes to find an optimal or near-optimal solution of control parameters based on the simulated environment space of a process. These algorithms are computationally intensive and therefore do not provide the opportunity for real-time optimization. This paper utilizes the Deep Reinforcement Learning (DRL) framework to find an optimal or near-optimal solution for control parameters. A model based on maximum a posteriori policy optimization (Hybrid-MPO) that can handle both numerical and categorical parameters is used as a benchmark for comparison. A comparative study shows that DRL can find optimal solutions of similar quality as compared to evolutionary algorithms while requiring significantly less time, making it preferable for real-time optimization. The results are confirmed in a large-scale validation study on datasets from production and other fields. A trained XGBoost model is used as a surrogate for process simulation. Finally, multiple ways to improve the model are discussed.
Keywords: reinforcement learning, evolutionary algorithms, production process optimization, real-time optimization, hybrid-MPO
Procedia PDF Downloads 112
206 Offline Signature Verification Using Minutiae and Curvature Orientation
Authors: Khaled Nagaty, Heba Nagaty, Gerard McKee
Abstract:
A signature is a behavioral biometric that is used for authenticating users in most financial and legal transactions. Signatures can be easily forged by skilled forgers. Therefore, it is essential to verify whether a signature is genuine or forged. The aim of any signature verification algorithm is to accommodate the differences between signatures of the same person and increase the ability to discriminate between signatures of different persons. The work presented in this paper proposes an automatic signature verification system to indicate whether a signature is genuine or not. The system comprises four phases: (1) The pre-processing phase, in which image scaling, binarization, image rotation, dilation, thinning, and connecting ridge breaks are applied. (2) The feature extraction phase, in which global and local features are extracted. The local features are minutiae points, curvature orientation, and curve plateau. The global features are signature area, signature aspect ratio, and Hu moments. (3) The post-processing phase, in which false minutiae are removed. (4) The classification phase, in which features are enhanced before feeding them into the classifier. k-nearest neighbors and support vector machines are used. The classifier was trained on a benchmark dataset to compare the performance of the proposed offline signature verification system against the state-of-the-art. The accuracy of the proposed system is 92.3%.
Keywords: signature, ridge breaks, minutiae, orientation
Procedia PDF Downloads 146
205 Mixed Effects Models for Short-Term Load Forecasting for the Spanish Regions: Castilla-Leon, Castilla-La Mancha and Andalucia
Authors: C. Senabre, S. Valero, M. Lopez, E. Velasco, M. Sanchez
Abstract:
This paper focuses on an application of linear mixed models to short-term load forecasting. The challenge of this research is to improve a currently working model at the Spanish Transport System Operator, programmed by us, and based on linear autoregressive techniques and neural networks. The forecasting system currently forecasts each of the regions within the Spanish grid separately, even though the behavior of the load in each region is affected by the same factors in a similar way. A load forecasting system has been verified in this work by using real data from a utility. In this research, several regions are integrated into a linear mixed model as a starting point to exploit the information from other regions: firstly, the system learns general behaviors present in all regions, and secondly, it identifies the individual deviation in each region. The technique can be especially useful when modeling the effect of special days with scarce information from the past. The three most relevant regions of the system have been used to test the model, focusing on special days and improving the performance of both currently working models used as benchmarks. A range of comparisons with different forecasting models has been conducted. The forecasting results demonstrate the superiority of the proposed methodology.
Keywords: short-term load forecasting, mixed effects models, neural networks
Procedia PDF Downloads 188
204 Performance Comparison of Thread-Based and Event-Based Web Servers
Authors: Aikaterini Kentroti, Theodore H. Kaskalis
Abstract:
Today, web servers are expected to serve thousands of client requests concurrently within stringent response time limits. In this paper, we evaluate experimentally and compare the performance as well as the resource utilization of popular web servers, which differ in their approach to handling concurrency. More specifically, Central Processing Unit (CPU)- and I/O-intensive tests were conducted against the thread-based Apache and Go as well as the event-based Nginx and Node.js under increasing concurrent load. The tests involved concurrent users requesting a term of the Fibonacci sequence (the 10th, 20th, 30th) and the content of a table from the database. The results show that Go achieved the best performance in all benchmark tests. For example, Go reached two times higher throughput than Node.js and five times higher than Apache and Nginx in the 20th Fibonacci term test. In addition, Go had the smallest memory footprint and demonstrated the most efficient resource utilization in terms of CPU usage. In contrast, Node.js had by far the largest memory footprint, consuming up to 90% more memory than Nginx and Apache. Regarding the performance of Apache and Nginx, our findings indicate that Hypertext Preprocessor (PHP) becomes a bottleneck when the servers are requested to respond by performing CPU-intensive tasks under increasing concurrent load.
Keywords: Apache, Go, Nginx, Node.js, web server benchmarking
Procedia PDF Downloads 97
203 3D Human Reconstruction over Cloud Based Image Data via AI and Machine Learning
Authors: Kaushik Sathupadi, Sandesh Achar
Abstract:
Human action recognition modeling is a critical task in machine learning. These systems require better techniques for recognizing body parts and selecting optimal features based on vision sensors to identify complex action patterns efficiently. Still, there is a considerable gap and there are challenges between images and videos, such as brightness, motion variation, and random clutter. This paper proposes a robust approach for classifying human actions over cloud-based image data. First, we apply pre-processing, followed by human and outer-shape detection techniques. Next, we extract valuable information in terms of cues. We extract two distinct features: fuzzy local binary patterns and sequence representation. Then, we apply a greedy randomized adaptive search procedure for data optimization and dimension reduction, and for classification, we use a random forest. We tested our model on two benchmark datasets, AAMAZ and the KTH multi-view football dataset. Our HMR framework significantly outperforms the other state-of-the-art approaches and achieves better recognition rates of 91% and 89.6% over the AAMAZ and KTH multi-view football datasets, respectively.
Keywords: computer vision, human motion analysis, random forest, machine learning
Procedia PDF Downloads 36
202 The Effect of Mandatory International Financial Reporting Standards Reporting on Investors' Herding Practice: Evidence from EU Equity Markets
Authors: Mohammed Lawal Danrimi, Ervina Alfan, Mazni Abdullah
Abstract:
The purpose of this study is to investigate whether the adoption of International Financial Reporting Standards (IFRS) encourages information-based trading and mitigates investors’ herding practice in emerging EU equity markets. Utilizing a modified non-linear model of cross-sectional absolute deviation (CSAD), we find that the hypothesis that mandatory IFRS adoption improves the information set of investors and reduces irrational investment behavior may in some cases be incorrect, and the reverse may be true. For instance, with regard to herding concerns, the new reporting benchmark has rather aggravated investors’ herding practice. However, we also find that mandatory IFRS adoption does not appear to be the only instigator of the observed herding practice; national institutional factors, particularly regulatory quality, political stability and control of corruption, also significantly contribute to investors’ herd formation around the new reporting regime. The findings would be of interest to academics, regulators and policymakers in performing a cost-benefit analysis of the so-called better reporting regime, as well as financial statement users who make decisions based on firms’ fundamental variables, treating them as significant indicators of future market movement.
Keywords: equity markets, herding, IFRS, CSAD
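The CSAD measure named in the keywords has a simple closed form, which the sketch below computes for one cross-section. The returns are illustrative, not the study's data; the study's actual model then regresses CSAD on market-return terms:

```python
def csad(returns_it, r_m):
    """Cross-sectional absolute deviation of N stock returns from the
    market return at time t: CSAD_t = (1/N) * sum_i |R_it - R_mt|."""
    return sum(abs(r - r_m) for r in returns_it) / len(returns_it)

# In the herding literature, CSAD_t is regressed on |R_mt| and R_mt^2;
# a significantly negative coefficient on R_mt^2 is read as herding,
# because dispersion then grows less than proportionally in extreme markets.
rets = [0.012, -0.004, 0.020, 0.001]
r_m = sum(rets) / len(rets)
print(round(csad(rets, r_m), 5))
```

A low CSAD in large market moves means individual returns cluster around the market return, which is the footprint of herd behavior the study tests for around IFRS adoption.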
Procedia PDF Downloads 178
201 Robust Recognition of Locomotion Patterns via Data-Driven Machine Learning in the Cloud Environment
Authors: Shinoy Vengaramkode Bhaskaran, Kaushik Sathupadi, Sandesh Achar
Abstract:
Human locomotion recognition is important in a variety of sectors, such as robotics, security, healthcare, fitness tracking and cloud computing. With the increasing pervasiveness of peripheral devices, particularly Inertial Measurement Unit (IMU) sensors, researchers have attempted to exploit these advancements in order to precisely and efficiently identify and categorize human activities. This research paper introduces a state-of-the-art methodology for the recognition of human locomotion patterns in a cloud environment. The methodology is based on a publicly available benchmark dataset. The investigation implements a denoising and windowing strategy to deal with the unprocessed data. Next, feature extraction is adopted to abstract the main cues from the data. The SelectKBest strategy is used to abstract optimal features from the data. Furthermore, state-of-the-art ML classifiers, including logistic regression, random forest, gradient boosting and SVM, are investigated to evaluate the performance of the system and accomplish precise locomotion classification. Finally, a detailed comparative analysis of results is presented to reveal the performance of the recognition models.
Keywords: artificial intelligence, cloud computing, IoT, human locomotion, gradient boosting, random forest, neural networks, body-worn sensors
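The SelectKBest step can be pictured with a simple stand-in filter. The scoring function below (absolute Pearson correlation between each feature column and the label) is an assumption for illustration only; scikit-learn's SelectKBest would typically use a score such as the ANOVA F-value instead:

```python
def top_k_features(X, y, k):
    """SelectKBest-style filter: score each feature column by the absolute
    Pearson correlation with the label and keep the k highest-scoring
    column indices (a simple stand-in for sklearn's SelectKBest)."""
    n = len(X)

    def corr(col):
        xs = [row[col] for row in X]
        mx, my = sum(xs) / n, sum(y) / n
        cov = sum((a - mx) * (b - my) for a, b in zip(xs, y))
        vx = sum((a - mx) ** 2 for a in xs)
        vy = sum((b - my) ** 2 for b in y)
        denom = (vx * vy) ** 0.5
        return abs(cov) / denom if denom else 0.0  # constant column scores 0

    scores = [(corr(c), c) for c in range(len(X[0]))]
    return sorted(c for _, c in sorted(scores, reverse=True)[:k])

# Toy windowed-feature matrix: column 0 tracks the label, column 2 is constant
X = [[1, 5, 0.1], [2, 3, 0.1], [3, 6, 0.1], [4, 2, 0.1]]
y = [0, 0, 1, 1]
print(top_k_features(X, y, 2))
```

The reduced feature matrix is then what the logistic regression, random forest, gradient boosting and SVM classifiers would be trained on.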
Procedia PDF Downloads 11
200 Experimental Investigation of Fluid Dynamic Effects on Crystallisation Scale Growth and Suppression in Agitation Tank
Authors: Prasanjit Das, M. M. K. Khan, M. G. Rasul, Jie Wu, I. Youn
Abstract:
Mineral scale formation is undoubtedly a more serious problem in the mineral industry than in other process industries. To better understand scale growth and suppression, an experimental model is proposed in this study for supersaturated crystallised solutions commonly found in mineral process plants. In this experiment, surface crystallisation of potassium nitrate (KNO3) on the wall of the agitation tank and the effects of agitation on scale growth and suppression are studied. The new quantitative scale suppression model predicts that at lower agitation speeds, the scale growth rate is enhanced, and at higher agitation speeds, the scale suppression rate increases due to the increased flow erosion effect. A lab-scale agitation tank, with and without baffles, was used as a benchmark in this study. The fluid dynamic effects on scale growth and suppression in the agitation tank with three different size impellers (diameters 86, 114 and 160 mm, model A310 with flow number 0.56) at various rotational speeds (up to 700 rpm) and solutions of different concentrations (4.5, 4.75 and 5.25 mol/dm3) were investigated. For further elucidation, the effects of impeller size on the wall-surface scale growth and suppression rates, as well as the bottom settled scale accumulation rate, are also discussed. Emphasis was placed on applications in the mineral industry, although the results are also relevant to other industrial applications.
Keywords: agitation tank, crystallisation, impeller speed, scale
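The pumping capacity implied by the quoted flow number follows the standard mixing relation Q = Fl·N·D³; the small sketch below pairs the abstract's largest impeller with the top of the tested speed range (the combination itself is illustrative):

```python
def pumping_capacity(flow_number, rpm, diameter):
    """Impeller pumping capacity Q = Fl * N * D^3 in m^3/s, with N in rev/s
    and D in metres. The abstract quotes Fl = 0.56 for the A310 impeller."""
    n_rev_s = rpm / 60.0
    return flow_number * n_rev_s * diameter ** 3

# Largest impeller (D = 160 mm) at the highest tested speed (700 rpm)
q = pumping_capacity(0.56, 700, 0.160)
print(round(q, 4))
```

Because Q scales with D³, the three impeller diameters span a wide range of circulation rates, which is what lets the study separate flow-erosion effects on scale suppression from growth at low agitation.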
Procedia PDF Downloads 223
199 Deep Routing Strategy: Deep Learning based Intelligent Routing in Software Defined Internet of Things
Authors: Zabeehullah, Fahim Arif, Yawar Abbas
Abstract:
Software Defined Network (SDN) is a next-generation networking model which simplifies traditional network complexities and improves the utilization of constrained resources. Currently, most SDN-based Internet of Things (IoT) environments use traditional network routing strategies which work on the basis of a max or min metric value. However, IoT network heterogeneity, dynamic traffic flow and complexity demand intelligent and self-adaptive routing algorithms, because traditional routing algorithms lack self-adaptation, intelligence and efficient utilization of resources. To some extent, SDN, due to its flexibility and centralized control, has managed the IoT complexity and heterogeneity, but Software Defined IoT (SDIoT) still lacks intelligence. To address this challenge, we propose a model called Deep Routing Strategy (DRS) which uses a deep learning algorithm to perform routing in SDIoT intelligently and efficiently. Our model uses real-time traffic for training and learning. Results demonstrate that the proposed model has achieved high accuracy and a low packet loss rate during path selection. The proposed model has also outperformed a benchmark routing algorithm (OSPF). Moreover, the proposed model provided encouraging results during high dynamic traffic flow.
Keywords: SDN, IoT, DL, ML, DRS
Procedia PDF Downloads 110
198 Comparison of Sourcing Process in Supply Chain Operation References Model and Business Information Systems
Authors: Batuhan Kocaoglu
Abstract:
Although they use powerful systems like ERP (Enterprise Resource Planning), companies still cannot easily benchmark their processes and measure their process performance based on predefined SCOR (Supply Chain Operation References) terms. The purpose of this research is to identify common and corresponding processes and to present a conceptual model for modeling and measuring the purchasing process of an organization. The main steps of the research study are: a literature review related to the 'procure to pay' process in ERP systems; a literature review related to the 'sourcing' process in the SCOR model; and the development of a conceptual model integrating the 'sourcing' process of the SCOR model with the 'procure to pay' process of the ERP model. In this study, we examined the similarities and differences between these two models. The proposed framework is based on assumptions that are drawn from (1) the body of literature and (2) the authors’ experience of working in the field of enterprise and logistics information systems. The modeling framework provides a structured and systematic way to model and decompose the necessary information from conceptual representation to process element specification. This conceptual model will help organizations build quality measurement instruments and tools for their purchasing systems, and the adaptation issues identified for ERP systems and the SCOR model will support a more benchmarkable, worldwide-standard business process.
Keywords: SCOR, ERP, procure to pay, sourcing, reference model
Procedia PDF Downloads 362
197 Normalizing Scientometric Indicators of Individual Publications Using Local Cluster Detection Methods on Citation Networks
Authors: Levente Varga, Dávid Deritei, Mária Ercsey-Ravasz, Răzvan Florian, Zsolt I. Lázár, István Papp, Ferenc Járai-Szabó
Abstract:
One of the major shortcomings of widely used scientometric indicators is that different disciplines cannot be compared with each other. The issue of cross-disciplinary normalization has long been discussed, but even the classification of publications into scientific domains poses problems. Structural properties of citation networks offer new possibilities; however, the large size and constant growth of these networks call for precaution. Here we present a new tool that, in order to perform cross-field normalization of scientometric indicators of individual publications, relies on the structural properties of citation networks. Due to the large size of the networks, a systematic procedure for identifying scientific domains based on a local community detection algorithm is proposed. The algorithm is tested with different benchmark and real-world networks. Then, by the use of this algorithm, the mechanism of the scientometric indicator normalization process is shown for a few indicators like the citation number, P-index and a local version of the PageRank indicator. The fat-tail trend of the article indicator distribution enables us to successfully perform the indicator normalization process.
Keywords: citation networks, cross-field normalization, local cluster detection, scientometric indicators
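One simple, scale-free way to make an indicator comparable across detected clusters is a percentile rank within the publication's own cluster. This is a generic illustration of the idea of cluster-relative normalization, not the exact scheme used in the paper:

```python
def percentile_rank(value, cluster_values):
    """Cluster-normalized indicator: the fraction of publications in the
    same (locally detected) cluster whose indicator value lies below this
    one. Being scale-free, it is comparable across clusters even when raw
    citation counts follow very different fat-tailed distributions."""
    below = sum(1 for v in cluster_values if v < value)
    return below / len(cluster_values)

# Citation counts of 5 papers in a hypothetical cluster
cluster = [0, 2, 3, 10, 40]
print(percentile_rank(10, cluster))  # -> 0.6
```

A paper with 10 citations can thus outrank a 100-citation paper from a denser field once both are expressed relative to their own clusters, which is the point of cross-field normalization.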
Procedia PDF Downloads 203
196 Taylor’s Law and Relationship between Life Expectancy at Birth and Variance in Age at Death in Period Life Table
Authors: David A. Swanson, Lucky M. Tedrow
Abstract:
Taylor’s Law is a widely observed empirical pattern that relates variances to means in sets of non-negative measurements via an approximate power function, and it has found application to human mortality. This study adds to that research by showing that Taylor’s Law leads to a model that reasonably describes the relationship between life expectancy at birth (e0, which also equals mean age at death in a life table) and variance in age at death in seven World Bank regional life tables measured at two points in time, 1970 and 2000. Using as a benchmark a non-random sample of four Japanese female life tables covering the period from 1950 to 2004, the study finds that a simple linear model provides reasonably accurate estimates of variance in age at death in a life table from e0, where e0 ranges from 60.9 to 85.59 years. Employing 2017 life tables from the Human Mortality Database, the simple linear model is used to provide estimates of variance in age at death for six countries, three with high e0 values and three with lower e0 values. The paper provides a substantive interpretation of Taylor’s Law relative to e0 and concludes by arguing that reasonably accurate estimates of variance in age at death in a period life table can be calculated using this approach, which can also be used where e0 itself is estimated rather than generated through the construction of a life table, a useful feature of the model.
Keywords: empirical pattern, mean age at death in a life table, mean age of a stationary population, stationary population
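A minimal sketch of the simple linear model the abstract describes, variance ≈ a + b·e0, fitted by ordinary least squares. The (e0, variance) pairs below are hypothetical placeholders, not the World Bank regional or Japanese life-table data used in the paper.

```python
def fit_line(xs, ys):
    """Ordinary least squares fit of ys on xs; returns intercept a and slope b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    a = my - b * mx
    return a, b

e0 = [60.9, 70.0, 78.0, 85.6]        # life expectancy at birth (years)
var = [420.0, 300.0, 210.0, 120.0]   # variance in age at death (hypothetical)
a, b = fit_line(e0, var)
predicted = a + b * 75.0             # estimated variance at e0 = 75 years
```

With values like these, the fitted slope is negative, reflecting the compression of mortality as life expectancy rises; the fitted line necessarily passes through the means of the data.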
Procedia PDF Downloads 330
195 Probabilistic Seismic Loss Assessment of Reinforced Concrete (RC) Frame Buildings Pre- and Post-Rehabilitation
Authors: A. Flora, A. Di Lascio, D. Cardone, G. Gesualdi, G. Perrone
Abstract:
This paper considers the seismic assessment and retrofit of a pilotis-type RC frame building that was designed for gravity loads only, prior to the introduction of seismic design provisions. Pilotis-type RC frame buildings, featuring a uniform infill throughout the height and an open ground floor, were, and still are, quite popular all over the world, as they offer large open areas very suitable for retail space at the ground floor. These architectural advantages, however, are to the detriment of the building's seismic behavior, as they can give rise to a soft-storey collapse mechanism. Extensive numerical analyses are carried out to quantify and benchmark the performance of the selected building, both in terms of overall collapse capacity and expected losses. Alternative retrofit strategies are then examined, including: (i) steel jacketing of RC columns and beam-column joints, (ii) steel bracing and (iii) seismic isolation. The Expected Annual Loss (EAL) of the selected case-study building, pre- and post-rehabilitation, is evaluated following a probabilistic approach. The breakeven time of each solution is computed by comparing the initial cost of the retrofit intervention with the expected benefit in terms of EAL reduction.
Keywords: expected annual loss, reinforced concrete buildings, seismic loss assessment, seismic retrofit
Procedia PDF Downloads 240
194 Mechanical Properties of Spark Plasma Sintered 2024 AA Reinforced with TiB₂ and Nano Yttrium
Authors: Suresh Vidyasagar Chevuri, D. B. Karunakar Chevuri
Abstract:
The main advantages of Metal Matrix Nano Composites (MMNCs) include excellent mechanical performance, good wear resistance and a low creep rate. The fabrication of MMNCs is quite a challenge; processing techniques include Spark Plasma Sintering (SPS), among others. The objective of the present work is to fabricate aluminum-based MMNCs with the addition of small amounts of yttrium using Spark Plasma Sintering and to evaluate their mechanical and microstructural properties. Samples of 2024 AA with yttrium ranging from 0.1 to 0.5 wt%, keeping TiB₂ constant at 1 wt%, are fabricated by SPS. Hardness is determined using a Vickers hardness testing machine. The metallurgical characterization of the samples is carried out by Optical Microscopy (OM), Field Emission Scanning Electron Microscopy (FE-SEM) and X-Ray Diffraction (XRD). An unreinforced 2024 AA sample is also fabricated as a benchmark against which to compare the properties of the developed composites. It is found that yttrium addition improves the above-mentioned properties up to a point between 0.3 and 0.4 wt%, beyond which they gradually decline. High density is achieved in the samples fabricated by spark plasma sintering compared to any other fabrication route, and uniform distribution of yttrium is observed.
Keywords: spark plasma sintering, 2024 AA, yttrium addition, microstructure characterization, mechanical properties
Procedia PDF Downloads 224
193 Exploring the Critical Success Factors of Construction Stakeholders Team Effectiveness
Authors: Olusegun Akinsiku, Olukayode Oyediran, Koleola Odusami
Abstract:
A construction project is usually made up of a variety of stakeholders whose interests may positively or negatively impact the outcome of project execution. The variability of project stakeholders is apparent in their cultural differences, professional backgrounds and ethics, and differences in ideas. The effectiveness of construction teams has been investigated because it is important to meeting clients' expectations in the construction industry. This study adopts a cross-sectional descriptive survey with the purpose of identifying the critical success factors (CSFs) associated with the team effectiveness of construction project stakeholders, their relationships and their effects on construction project performance. The instrument for data collection was a structured questionnaire administered to construction professionals in the construction industry in Lagos State, Nigeria, using proportionate stratified sampling. The highest-ranked identified CSFs include "team trust", "esprit de corps among members" and "team cohesiveness". Using factor analysis and considering the effects of team cohesiveness on project performance, the identified CSFs were categorized into three groups, namely cognitive attributes, behavioral and process attributes, and affective attributes. All three groups were observed to have a strong correlation with project performance. The findings of this study are useful in helping construction stakeholders benchmark the team effectiveness factors that will guarantee project success.
Keywords: construction, critical success factors, performance, stakeholders, team effectiveness
Procedia PDF Downloads 127
192 A Design Decision Framework for Net-Zero Carbon Buildings in Hot Climates: A Modeled Approach and Expert’s Feedback
Authors: Eric Ohene, Albert P. C. Chan, Shu-Chien HSU
Abstract:
Rising building energy consumption and the related carbon emissions make it necessary to construct net-zero carbon buildings (NZCBs). The objective of net-zero buildings has raised the benchmark for building performance and will alter how buildings are designed and constructed. However, there have been growing concerns about uncertainty in net-zero building design and its cost implications for decision-making. Lessons from practice have shown that a robust net-zero building design is complex, expensive and time-consuming. Moreover, climate conditions have enormous implications for choosing the optimal passive and active solutions that ensure building energy performance while maintaining the indoor comfort of occupants. It is observed that 20% of the design decisions made in the initial design phase influence 80% of all design decisions. To design and construct NZCBs, it is therefore crucial to ensure adequate decision-making during the early design phases. This study aims to explore practical strategies for designing NZCBs and to offer a design framework that can support decision-making during the design stage of net-zero buildings. A parametric simulation approach was employed, and the perspectives of experts (e.g., architects and building designers) on the decision framework were solicited. The study could help building designers and architects guide their decision-making during the design stage of NZCBs.
Keywords: net-zero, net-zero carbon building, energy efficiency, parametric simulation, hot climate
Procedia PDF Downloads 103
191 Modeling and Benchmarking the Thermal Energy Performance of Palm Oil Production Plant
Authors: Mathias B. Michael, Esther T. Akinlabi, Tien-Chien Jen
Abstract:
Thermal energy consumption in a palm oil production plant comprises mainly steam, hot water and hot air. In the most efficient plants, hot water and air are generated from the steam supply system. Research has shown that thermal energy accounts for about 70 percent of the total energy consumption of a palm oil production plant. In order to manage a plant's energy efficiently, its energy systems must be modelled and optimized. This paper presents a model of the steam supply system of a typical palm oil production plant in Ghana. The models include exergy and energy models of the steam boiler, the steam turbine and the palm oil mill. The paper further simulates the virtual plant model to obtain the thermal energy performance of the plant under study. The simulation results show that, under normal operating conditions, the boiler's energy performance is considerably below the expected level as a result of several factors, including intermittent biomass fuel supply, the significant moisture content of the biomass fuel and significant heat losses. The total thermal energy performance of the virtual plant is set as a baseline. The study finally recommends a number of energy efficiency measures to improve the plant's energy performance.
Keywords: palm biomass, steam supply, exergy and energy models, energy performance benchmark
Procedia PDF Downloads 349
190 Intelligent Transport System: Classification of Traffic Signs Using Deep Neural Networks in Real Time
Authors: Anukriti Kumar, Tanmay Singh, Dinesh Kumar Vishwakarma
Abstract:
Traffic control has been one of the most common and irritating problems since automobiles first hit the roads. Problems like traffic congestion impose a significant time burden around the world, and one significant solution is the proper implementation of an Intelligent Transport System (ITS). ITS involves the integration of tools such as smart sensors, artificial intelligence, positioning technologies and mobile data services to manage traffic flow, reduce congestion and enhance drivers' ability to avoid accidents during adverse weather. Road and traffic sign recognition is an emerging field of research in ITS. The traffic sign classification problem needs to be solved, as it is a major step towards building semi-autonomous and autonomous driving systems. This work implements an approach to traffic sign classification by developing a Convolutional Neural Network (CNN) classifier on the GTSRB (German Traffic Sign Recognition Benchmark) dataset. Rather than relying on hand-crafted features, our model addresses the concerns of an exploding parameter count and of data augmentation methods. Our model achieved an accuracy of around 97.6%, which is comparable to various state-of-the-art architectures.
Keywords: multiclass classification, convolution neural network, OpenCV
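As a toy, pure-Python sketch of the building blocks such a CNN classifier stacks (not the paper's actual architecture), the snippet below runs one convolution, ReLU and max-pooling pass over a tiny synthetic "image". The 8x8 input and 3x3 vertical-edge kernel are illustrative assumptions.

```python
def conv2d(img, kernel):
    """Valid (no-padding) 2D convolution over nested lists."""
    kh, kw = len(kernel), len(kernel[0])
    oh, ow = len(img) - kh + 1, len(img[0]) - kw + 1
    return [[sum(img[i + a][j + b] * kernel[a][b]
                 for a in range(kh) for b in range(kw))
             for j in range(ow)] for i in range(oh)]

def relu(x):
    return [[max(v, 0.0) for v in row] for row in x]

def max_pool(x, s=2):
    """Non-overlapping s-by-s max pooling."""
    return [[max(x[i * s + a][j * s + b] for a in range(s) for b in range(s))
             for j in range(len(x[0]) // s)] for i in range(len(x) // s)]

img = [[0.0] * 4 + [1.0] * 4 for _ in range(8)]   # vertical edge at column 4
kernel = [[-1.0, 0.0, 1.0]] * 3                   # vertical-edge detector
feat = relu(conv2d(img, kernel))                  # 6x6 activation map
pooled = max_pool(feat)                           # 3x3 downsampled map
```

The pooled map responds only where the edge lies, illustrating how stacked conv/pool layers learn translation-tolerant sign features without hand-crafted descriptors.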
Procedia PDF Downloads 176
189 A Comparative Assessment of Some Algorithms for Modeling and Forecasting Horizontal Displacement of Ialy Dam, Vietnam
Authors: Kien-Trinh Thi Bui, Cuong Manh Nguyen
Abstract:
In order to simulate and reproduce the operational characteristics of a dam visually, it is necessary to capture the displacement at different measurement points and to analyze the observed movement data promptly in order to forecast dam safety. The accuracy of forecasts is further improved by applying machine learning methods to the data analysis process. In this study, the horizontal displacement monitoring data of the Ialy hydroelectric dam (Vietnam) were analyzed with three machine learning algorithms for modelling and forecasting the dam's horizontal displacement: Gaussian processes (GP), multi-layer perceptron (MLP) neural networks, and the M5-Rules algorithm. The database used in this research was built from time series collected from 2006 to 2021 and divided into two parts: a training dataset and a validating dataset. The final results show that all three algorithms perform well in both training and model validation, with the MLP being the best model. Their usability is further investigated by comparison with a benchmark model created by multi-linear regression. The results show that the performance obtained from the GP, MLP and M5-Rules models is much better; therefore, these three models should be used to analyze and predict the horizontal displacement of the dam.
Keywords: Gaussian processes, horizontal displacement, hydropower dam, Ialy dam, M5-Rules, multi-layer perception neural networks
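A minimal sketch of the benchmarking step the abstract describes: fit a simple linear trend (a stand-in for the multi-linear regression baseline) on a training period, then score it by RMSE on a held-out validation period. The displacement series below is hypothetical, not the Ialy monitoring data.

```python
import math

def fit_trend(ts, ys):
    """Least-squares linear trend; returns intercept a and slope b."""
    n = len(ts)
    mt, my = sum(ts) / n, sum(ys) / n
    b = sum((t - mt) * (y - my) for t, y in zip(ts, ys)) / \
        sum((t - mt) ** 2 for t in ts)
    return my - b * mt, b

def rmse(pred, actual):
    return math.sqrt(sum((p - a) ** 2 for p, a in zip(pred, actual)) / len(pred))

times = list(range(10))                                       # epochs
disp = [0.0, 0.5, 1.1, 1.4, 2.1, 2.4, 3.1, 3.4, 4.1, 4.4]     # mm, hypothetical
train_t, train_y = times[:7], disp[:7]   # training dataset
valid_t, valid_y = times[7:], disp[7:]   # validating dataset
a, b = fit_trend(train_t, train_y)
error = rmse([a + b * t for t in valid_t], valid_y)
```

A GP, MLP or M5-Rules model would be scored with the same train/validation split and RMSE, so the comparison against this linear baseline is like-for-like.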
Procedia PDF Downloads 210
188 Transition Metal Carbodiimide vs. Spinel Matrices for Photocatalytic Water Oxidation
Authors: Karla Lienau, Rafael Müller, René Moré, Debora Ressnig, Dan Cook, Richard Walton, Greta R. Patzke
Abstract:
The increasing demand for renewable energy sources and storable fuels underscores the high potential of artificial photosynthesis. The four-electron transfer process of water oxidation remains the bottleneck of water splitting, so special emphasis is placed on the development of economical, stable and efficient water oxidation catalysts (WOCs). Our investigations introduced cobalt carbodiimide CoNCN and its transition metal analogues as WOC types, and further studies focus on the interaction of different transition metals in the convenient all-nitrogen/carbon matrix. This provides further insights into the nature of the 'true catalyst' for cobalt centers in this non-oxide environment. Water oxidation activity is evaluated with complementary methods, namely photocatalytically, using a Ru-dye-sensitized standard setup, and electrocatalytically, via immobilization of the WOCs on glassy carbon electrodes. To further explore the tuning potential of transition metal combinations, complementary investigations were carried out on oxidic spinel WOC matrices, which offer more versatile host options than the carbodiimide framework. The influence of preparative history on WOC performance was evaluated with different synthetic methods (e.g., hydrothermal or microwave-assisted). Moreover, the growth mechanism of nanoscale Co3O4 spinel as a benchmark WOC was investigated with in-situ PXRD techniques.
Keywords: carbodiimide, photocatalysis, spinels, water oxidation
Procedia PDF Downloads 289
187 Innovative Pedagogy and the Fostering of Soft Skills among Higher Education Students: A Case Study of Ben Ms’Ick Faculty of Sciences
Authors: Azzeddine Atibi, Sara Atibi, Salim Ahmed, Khadija El Kabab
Abstract:
In an educational context where innovation holds a predominant position, political discourses and pedagogical practices are increasingly oriented toward this concept. Innovation has become a benchmark value, gradually replacing the notion of progress. The term is omnipresent in discussions among policymakers, administrators and academic researchers. The pressure to innovate impacts all levels of education, influencing institutional and educational policies, training objectives and teachers' pedagogical practices. The higher education and continuing education sectors are not exempt from this trend: they are compelled to transform in order to attract and retain an audience whose behaviors and expectations have significantly evolved. Indeed, the employability of young graduates has become a crucial issue, prompting us to question the effectiveness of various pedagogical methods in meeting this criterion. In this article, we propose to thoroughly examine the relationship between the pedagogical methods employed in different fields of higher education and the acquisition of interpersonal skills, or "soft skills". Our aim is to determine to what extent these methods contribute to better preparing students for the professional world. We analyze how innovative pedagogical approaches can enhance the acquisition of soft skills, which are essential for the professional success of young graduates.
Keywords: educational context, innovation, higher education, soft skills, pedagogical practices, pedagogical approaches
Procedia PDF Downloads 41
186 A Tool to Measure Efficiency and Trust Towards eXplainable Artificial Intelligence in Conflict Detection Tasks
Authors: Raphael Tuor, Denis Lalanne
Abstract:
The air traffic management (ATM) research community is missing suitable tools to design, test and validate new UI prototypes. Important stakes underlie the integration of both decision support systems (DSS) and XAI methods into current systems. ML-based DSS are gaining in relevance as air traffic flow management (ATFM) becomes increasingly complex. However, these systems only prove useful if a human can understand them, and thus new XAI methods are needed. The human-machine dyad should work as a team, with each side understanding the other. We present xSky, a configurable benchmark tool that allows us to compare different versions of an air traffic control (ATC) interface in conflict detection tasks. Our main contributions to the ATC research community are (1) a conflict detection task simulator (xSky) that allows testing the applicability of visual prototypes on scenarios of varying difficulty and outputs relevant operational metrics, and (2) a theoretical approach to the explanation of AI-driven trajectory predictions. xSky addresses several issues that were identified in available research tools. Researchers can configure the dimensions affecting scenario difficulty with a simple CSV file. Both the content and the appearance of the XAI elements can be customized in a few steps. As a proof of concept, we implemented an XAI prototype inspired by the maritime field.
Keywords: air traffic control, air traffic simulation, conflict detection, explainable artificial intelligence, explainability, human-automation collaboration, human factors, information visualization, interpretability, trajectory prediction
Procedia PDF Downloads 160