Search results for: real world driving data
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 32527

30937 The DAQ Debugger for iFDAQ of the COMPASS Experiment

Authors: Y. Bai, M. Bodlak, V. Frolov, S. Huber, V. Jary, I. Konorov, D. Levit, J. Novy, D. Steffen, O. Subrt, M. Virius

Abstract:

In general, state-of-the-art Data Acquisition Systems (DAQ) in high energy physics experiments must satisfy high requirements in terms of reliability, efficiency and data rate capability. This paper presents the development and deployment of a debugging tool named DAQ Debugger for the intelligent, FPGA-based Data Acquisition System (iFDAQ) of the COMPASS experiment at CERN. Utilizing a hardware event builder, the iFDAQ is designed to read out data at the experiment's average maximum rate of 1.5 GB/s. In complex software such as the iFDAQ, comprising thousands of lines of code, the debugging process is essential to reveal all software issues. Unfortunately, conventional debugging of the iFDAQ is not possible during real data taking. The DAQ Debugger is a tool for identifying a problem, isolating its source, and then either correcting the problem or determining a way to work around it. It provides a layer for easy integration into any process and has no impact on process performance. Based on the handling of system signals, the DAQ Debugger represents an alternative to the conventional debuggers provided by most integrated development environments. Whenever a problem occurs, it generates reports containing all the information needed for deeper investigation and analysis. The DAQ Debugger was fully incorporated into all processes in the iFDAQ during the 2016 run. It helped to reveal remaining software issues and significantly improved the stability of the system compared with the previous run. In this paper, we present the DAQ Debugger from several perspectives and discuss it in detail.
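The abstract gives no implementation details, but the signal-handling idea can be illustrated with a minimal Python sketch; the real iFDAQ tooling is C++/Qt, and the report file, signal choices, and handler below are illustrative assumptions rather than the authors' design.

```python
import datetime
import faulthandler
import signal
import traceback

REPORT_FILE = "daq_debug_report.txt"  # assumed report location

def write_report(signum, frame):
    """Dump the current stack and basic context to a report file."""
    with open(REPORT_FILE, "a") as report:
        report.write(f"--- report {datetime.datetime.now().isoformat()} ---\n")
        report.write(f"received signal {signum}\n")
        traceback.print_stack(frame, file=report)

def install_debugger():
    # Hard crashes (SIGSEGV, SIGFPE, ...) are covered by faulthandler, which
    # writes a low-level traceback even when Python code can no longer run.
    faulthandler.enable(file=open(REPORT_FILE, "a"))
    # Softer signals can be handled in Python and produce a richer report.
    for sig in (signal.SIGTERM, signal.SIGUSR1):
        signal.signal(sig, write_report)

if __name__ == "__main__":
    install_debugger()
    signal.pause()  # stand-in for the real event-building loop
```

The handler stays dormant until a signal arrives, which mirrors the abstract's claim of no impact on normal process performance.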

Keywords: DAQ Debugger, data acquisition system, FPGA, system signals, Qt framework

Procedia PDF Downloads 266
30936 Topology-Based Character Recognition Method for Coin Date Detection

Authors: Xingyu Pan, Laure Tougne

Abstract:

For coin recognition, the engraved release date is important information for precisely identifying the monetary type. However, reading characters on coins presents far more obstacles than traditional character recognition tasks in other fields, such as reading scanned documents or license plates. To address this challenging issue in a numismatic context, we propose a training-free approach dedicated to the detection and recognition of the release date of a coin. In the first step, the date zone is detected by comparing histogram features; in the second step, a topology-based algorithm is introduced to recognize coin digits of various font types represented by a binary gradient map. Our method obtained a recognition rate of 92% on synthetic data and of 44% on real noised data.
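The abstract only names histogram features for the date-zone detection step; a minimal sketch of that idea (array shapes, normalisation, and the similarity threshold are assumptions, not the authors' method) might look like this:

```python
import numpy as np

def column_histogram(patch: np.ndarray) -> np.ndarray:
    """Column-wise sum of a binary image patch, normalised to unit area."""
    hist = patch.sum(axis=0).astype(float)
    return hist / (hist.sum() + 1e-9)

def looks_like_date_zone(patch: np.ndarray, reference: np.ndarray,
                         threshold: float = 0.8) -> bool:
    """Compare a candidate patch against a reference date-zone histogram
    using normalised correlation; the threshold value is illustrative."""
    h1, h2 = column_histogram(patch), column_histogram(reference)
    corr = np.corrcoef(h1, h2)[0, 1]
    return corr >= threshold
```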

Keywords: coin, detection, character recognition, topology

Procedia PDF Downloads 238
30935 Analyzing Current Transformer’s Transient and Steady State Behavior for Different Burdens Using LabVIEW Data Acquisition Tool

Authors: D. Subedi, D. Sharma

Abstract:

Current transformers (CTs) are used to transform large primary currents into a small secondary current. Since most standard equipment is not designed to handle large primary currents, CTs play an important part in any electrical system for the purposes of metering and protection, both of which are integral to the power system. Nowadays, due to advances in solid-state technology, the operating times of protective relays have come down from a few seconds to a few cycles. In such a scenario, it becomes important to study the transient response of current transformers, as it plays a vital role in the operation of protective devices. This paper shows the steady state and transient behavior of current transformers and how it changes with the connected burden. The transient and steady state response is captured using the data acquisition software LabVIEW, and analysis is done on the real-time data gathered with it. The variation of current transformer characteristics with changes in burden is discussed.

Keywords: accuracy, accuracy limiting factor, burden, current transformer, instrument security factor

Procedia PDF Downloads 330
30934 Alternative Robust Estimators for the Shape Parameters of the Burr XII Distribution

Authors: Fatma Zehra Doğru, Olcay Arslan

Abstract:

In this paper, we propose alternative robust estimators for the shape parameters of the Burr XII distribution. We provide a small simulation study and a real data example to illustrate the performance of the proposed estimators over the ML and the LS estimators.

Keywords: Burr XII distribution, robust estimator, M-estimator, least squares

Procedia PDF Downloads 415
30933 Railway Process Automation to Ensure Human Safety with the Aid of IoT and Image Processing

Authors: K. S. Vedasingha, K. K. M. T. Perera, K. I. Hathurusinghe, H. W. I. Akalanka, Nelum Chathuranga Amarasena, Nalaka R. Dissanayake

Abstract:

Railways provide one of the most convenient and economically beneficial modes of transportation and have long been among the most popular. Analysis of past data reveals a considerable number of railway accidents that caused damage not only to precious lives but also to national economies. Several major issues need to be addressed in the railways of South Asian countries, since they fall into the developing category. The goal of this research is to minimize the contributing factors of railway level crossing accidents by developing a railway process automation system, as there are high-risk areas that are prone to accidents, and safety at these places is of utmost significance. This paper describes the implementation methodology and the success of the study. The main purpose of the system is to ensure human safety by using the Internet of Things (IoT) and image processing techniques. The system can detect the current location of the train and close the railway gate automatically; the same decision can also be reached by a decision-making system that uses past data, and notably both processes work in parallel. If the system fails to close the railway gate due to a technical or network failure, the proposed system can still identify the current location and close the gate through the decision-making system, which is a revolutionary feature. The proposed system introduces two further features to reduce the causes of railway accidents: railway track crack detection and motion detection, both of which play a significant role in reducing the risk of railway accidents. Moreover, the system is capable of detecting rule violations at a level crossing by using sensors. The proposed system is implemented as a prototype and tested with real-world scenarios, achieving above 90% accuracy.
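As a rough illustration of the parallel IoT/fallback logic described above, a Python sketch is given below; the sensor inputs, the timetable-based position estimate, and the safety margin are assumptions for illustration, not the authors' design.

```python
from typing import Optional

def estimate_position_from_history(train_id: str, elapsed_min: float) -> float:
    """Fallback path: estimate the train's position (km from origin) from past
    timetable data; a fixed average speed is assumed here for simplicity."""
    average_speed_kmh = 60.0
    return average_speed_kmh * elapsed_min / 60.0

def should_close_gate(sensor_position_km: Optional[float],
                      train_id: str, elapsed_min: float,
                      gate_position_km: float,
                      safety_margin_km: float = 2.0) -> bool:
    """Close the gate when the train (measured or estimated) is within the
    safety margin of the level crossing."""
    if sensor_position_km is not None:        # primary IoT path
        position = sensor_position_km
    else:                                     # sensor/network failure path
        position = estimate_position_from_history(train_id, elapsed_min)
    return abs(gate_position_km - position) <= safety_margin_km
```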

Keywords: crack detection, decision-making, image processing, Internet of Things, motion detection, prototype, sensors

Procedia PDF Downloads 163
30932 Exploring the Social Health and Well-Being Factors of Hydraulic Fracturing

Authors: S. Grinnell

Abstract:

This PhD research project explores the social health and well-being impacts associated with hydraulic fracturing, with the aim of producing best practice support guidance for those anticipating dealing with planning applications or submitting Environmental Impact Assessments (EIAs). Amid a possible global energy crisis, founded upon a number of factors including unstable political situations, increasing world population growth, and people living longer, it is perhaps inevitable that hydraulic fracturing (commonly referred to as ‘fracking’) will become a major player within the global long-term energy and sustainability agenda. As there is currently no best practice guidance for governing bodies, the Best Practice Support Document will be targeted at a number of audiences, including consultants undertaking EIAs, planning officers, those commissioning EIAs, industry, and interested public stakeholders. It will offer robust, evidence-based criteria and recommendations which provide a clear narrative and a consistent, shared approach to the language used, along with an understanding of the issues identified. It is proposed that the Best Practice Support Document will also support the mitigation of the health impacts identified. The document will support the newly amended Environmental Impact Assessment Directive (2011/92/EU), to be transposed into UK law by 2017; a significant amendment focuses on a ‘higher level of protection to the environment and health.’ Methodology: A qualitative research methods approach is being taken, with a number of key stages. A literature review has been undertaken and critically reviewed and analysed. This was followed by a descriptive content analysis of a selection of international and national policies, programmes and strategies, along with published Environmental Impact Assessments and associated planning guidance. In terms of data collection, a number of stakeholders were interviewed, as were several focus groups drawn from local community groups potentially affected by fracking, determined from across the UK. A thematic analysis of all the data collected and the literature review will be undertaken using NVivo. The Best Practice Support Document will be developed based on the outcomes of the analysis and will be tested and piloted in the professional field before a live launch. Concluding statement: Whilst fracking is not a new concept, technology is now driving a new push to use this engineering to supply fuels. A number of countries have pledged moratoria on fracking until the impacts on health have been further investigated, whilst other countries, including Poland and the UK, are pushing to support the use of fracking. If this should be the case, it will be important that the public’s concerns, perceptions, fears and objections regarding the wider social health and well-being impacts are considered along with the more traditional biomedical health impacts.

Keywords: fracking, hydraulic fracturing, socio-economic health, well-being

Procedia PDF Downloads 223
30931 An Extensible Software Infrastructure for Computer Aided Custom Monitoring of Patients in Smart Homes

Authors: Ritwik Dutta, Marylin Wolf

Abstract:

This paper describes the trade-offs and the design from scratch of a self-contained, easy-to-use health dashboard software system that provides customizable data tracking for patients in smart homes. The system is made up of different software modules and comprises a front-end and a back-end component. Built with HTML, CSS, and JavaScript, the front-end allows adding users, logging into the system, selecting metrics, and specifying health goals. The back-end consists of a NoSQL Mongo database, a Python script, and a SimpleHTTPServer written in Python. The database stores user profiles and health data in JSON format. The Python script uses the PyMongo driver library to query the database and displays formatted data as a daily snapshot of user health metrics against target goals. Any number of standard and custom metrics can be added to the system, and the corresponding health data can be fed automatically via sensor APIs or manually as text or picture data files. A real-time METAR request API permits correlating weather data with patient health, and an advanced query system is implemented to allow trend analysis of selected health metrics over custom time intervals. Available on GitHub, the project is free to use for academic purposes of learning and experimentation, or for practical purposes by building on it.
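A minimal sketch of the kind of PyMongo query the back-end script might issue to build the daily snapshot is shown below; the database, collection, and field names are assumptions, since the paper's actual schema is not given here.

```python
import datetime
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")   # assumed local Mongo instance
db = client["smart_home_health"]                    # hypothetical database name

def daily_snapshot(user_id: str, day: datetime.date) -> list[dict]:
    """Return each tracked metric for one user on one day, paired with its goal."""
    start = datetime.datetime.combine(day, datetime.time.min)
    end = start + datetime.timedelta(days=1)
    goals = {g["metric"]: g["target"]
             for g in db.goals.find({"user_id": user_id})}
    snapshot = []
    for doc in db.health_data.find({"user_id": user_id,
                                    "timestamp": {"$gte": start, "$lt": end}}):
        snapshot.append({"metric": doc["metric"],
                         "value": doc["value"],
                         "target": goals.get(doc["metric"])})
    return snapshot
```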

Keywords: flask, Java, JavaScript, health monitoring, long-term care, Mongo, Python, smart home, software engineering, webserver

Procedia PDF Downloads 370
30930 Research and Design of Functional Mixed Community: A Model Based on the Construction of New Districts in China

Authors: Wu Chao

Abstract:

At the city planning level, the urban design of new districts in China differs from that of existing cities such as Beijing, Shanghai, and Guangzhou, yet the urban problems of these super-cities are the same as those of many big cities around the world. The goal of the new district construction plan is to enable people to live comfortably, to improve the well-being of residents, and to create a way of life different from that of other urban communities. To avoid the emergence of a super community, the idea of "decentralization" is taken as the overall planning concept, and the function and form of each community are set up with a homogeneous allocation of resources so that the community can grow naturally. Similar to the growth of vines in nature, the community groups are independent and connected through roads, with clear community boundaries that limit their unlimited expansion. Taking a community of 20,000 people as a case, the community mixes living, production, office, entertainment, and other functions. Building on the development of the Internet, more space can be created for public use, and data can be used to allocate resources in real time; this kind of shared space forms the main part of the activity space in the community. At the same time, the transformation of spatial function can be determined by usage feedback from all kinds of existing space, so the use of space can be changed as the data change. The residential unit is taken as the basic building function mass; the lower three to four floors of each building serve as the main flexible space, hosting functions such as entertainment, services, and offices, while the upper living space is provided with a small amount of indoor and outdoor activity space that is also used as shared space. The transformable space of the lower floors is evenly distributed and, combined with the walking space connecting the community, forms a service and entertainment network across the whole community that can be used in most of the community space. With the basic residential unit as a replicable module, the design of the other residential units runs through the idea of decentralization and the concept of the vine community, and the various units are reasonably combined. At the same time, a small number of office buildings are added to meet special office needs. The new functional mixed community can resolve many problems of the present city in future construction and, at the same time, keep its vitality through the adjustment function of the Internet.

Keywords: decentralization, mixed functional community, shared space, spatial usage data

Procedia PDF Downloads 100
30929 An EWMA P-Chart Based on Improved Square Root Transformation

Authors: Saowanit Sukparungsee

Abstract:

Generally, the traditional Shewhart p chart has been developed for charting binomial data. This chart was developed using the normal approximation under the conditions of a low defect level and a small to moderate sample size. Real applications, however, often depart from these assumptions due to skewness in the exact distribution. In this paper, a modified Exponentially Weighted Moving Average (EWMA) control chart for detecting a change in binomial data is proposed by improving the square root transformation, namely the ISRT p EWMA control chart. The numerical results show that the ISRT p EWMA chart is superior to the ISRT p chart for small to moderate shifts, whereas the latter is better for large shifts.
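The abstract does not spell out the improved square root transformation, so the sketch below uses a generic square-root variance-stabilising transform as a stand-in and the standard EWMA recursion; the smoothing constant and control-limit width are illustrative choices, not the paper's values.

```python
import numpy as np

def ewma_sqrt_p_chart(defect_counts, lam=0.2, L=3.0):
    """EWMA chart on square-root-transformed defect counts.
    The transform y = sqrt(x + 3/8) is a common variance stabiliser and is
    used here only as a placeholder for the paper's improved transformation."""
    y = np.sqrt(np.asarray(defect_counts, dtype=float) + 3.0 / 8.0)
    z = np.empty_like(y)
    z0 = y.mean()                              # in practice: an in-control estimate
    prev = z0
    for t, yt in enumerate(y):
        prev = lam * yt + (1.0 - lam) * prev   # EWMA recursion
        z[t] = prev
    sigma = y.std(ddof=1)
    t_idx = np.arange(1, len(y) + 1)
    width = L * sigma * np.sqrt(lam / (2 - lam) * (1 - (1 - lam) ** (2 * t_idx)))
    return z, z0 - width, z0 + width           # statistic, lower and upper limits
```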

Keywords: number of defects, exponentially weighted moving average, average run length, square root transformations

Procedia PDF Downloads 423
30928 Disrupted or Discounted Cash Flow: Impact of Digitisation on Business Valuation

Authors: Matthias Haerri, Tobias Huettche, Clemens Kustner

Abstract:

This article discusses the impact of digitization on business valuation. In order to become and remain ‘digital’, investments are necessary whose return on investment (ROI) often remains vague. This uncertainty is problematic for a valuation that relies on predictable cash flows, fixed capital structures and a steady state. Digitisation does not make a company valuation impossible, but traditional approaches must be reconsidered. The authors identify four areas that are changing: (1) Tools instead of intuition - In the future, company valuation will be neither art nor science, but craft. This requires not intuition but experience and good tools, so digital valuation tools beyond Excel will gain in importance. (2) Real-time instead of deadline - At present, company valuations are always carried out on a case-by-case basis and for a specific key date. This will change with digitalization and the introduction of web-based valuation tools. Company valuations can thus not only be carried out faster and more efficiently but can also be offered more frequently; instead of calculating the value for a past key date, current and real-time valuations can be carried out. (3) Predictive planning instead of analysis of the past - Past data will still be needed in the future, but its use will not be limited to monovalent time series or key figure analyses. The images of ‘black swans’ and the ‘turkey illusion’ have made clear that we build forecasts on too few data points from the past and underestimate the power of chance. Predictive planning can help here. (4) Convergence instead of residual value - Digital transformation shortens the lifespan of viable business models. If companies want to live forever, they have to change forever. For company valuation, this means that the business model valid on the valuation date has only a limited service life.

Keywords: business valuation, corporate finance, digitisation, disruption

Procedia PDF Downloads 109
30927 The Visible Third: Female Artists’ Participation in the Portuguese Contemporary Art World

Authors: Sonia Bernardo Correia

Abstract:

This paper is part of ongoing research that aims to understand the role of gender in the composition of the Portuguese contemporary art world and the possibilities and limits to the success of the professional paths of women and men artists. The field of visual arts is gender-sensitive, as it differentiates the positions occupied by artists in terms of visibility and recognition. Women artists occupy a peripheral space, which may hinder the progression of their professional careers. Based on data collected on the participation of artists in Portuguese exhibitions, art fairs, auctions, and art awards between 2012 and 2019, the goal of this study is to portray female artists’ participation as a condition of professional, social, and cultural visibility. From the analysis of a significant sample of institutions in the artistic field, it was possible to observe that works by female authors are under-exhibited, never exceeding one-third of total exhibitions. Male artists also enjoy a comfortable majority as gallery artists (around 70%) and in institutional collections (around 80%). However, when analysing the younger age cohorts of artists by gender, there appears to be representation parity, which may be a good sign of change. The data show that there are persistent gender inequalities in accessing the artist profession. Women are not yet occupying positions of exposure, recognition, and legitimation in the market similar to those of their male counterparts, suggesting that they may face greater obstacles in achieving successful professional trajectories.

Keywords: inequalities, invisibility of the woman artist, gender, visual arts

Procedia PDF Downloads 120
30926 Analysis of Truck Drivers’ Distraction on Crash Risk

Authors: Samuel Nderitu Muchiri, Tracy Wangechi Maina

Abstract:

Truck drivers face a myriad of challenges in their profession. Enhancements in logistics effectiveness can be pivotal in propelling economic developments. The specific objective of the study was to assess the influence of driver distraction on crash risk. The study is significant as it elucidates best practices that truck drivers can embrace in an effort to enhance road safety. These include amalgamating behaviors that enable drivers to fruitfully execute multifaceted functions such as finding and following routes, evading collisions, monitoring speed, adhering to road regulations, and evaluating vehicle systems’ conditions. The analysis involved an empirical review of ten previous studies related to the research topic. The articles revealed that driver distraction plays a substantial role in road accidents and other crucial road security incidents across the globe. Africa depends immensely on the freight transport sector to facilitate supply chain operations. Several studies indicate that drivers who operate primarily on rural roads, such as those found in Sub-Saharan Africa, have an increased propensity to engage in distracted activities such as cell phone usage while driving. The findings also identified the need for digitalization in truck driving operations, including carrier management techniques such as fatigue management, artificial intelligence, and automating functions like cell phone usage controls. The recommendations can aid policymakers and commercial truck carriers in deepening their understanding of driver distraction and enforcing mitigations to foster road safety.

Keywords: truck drivers, distraction, digitalization, crash risk, road safety

Procedia PDF Downloads 25
30925 Using Data Mining in Automotive Safety

Authors: Carine Cridelich, Pablo Juesas Cano, Emmanuel Ramasso, Noureddine Zerhouni, Bernd Weiler

Abstract:

Safety is one of the most important considerations when buying a new car. While active safety aims at avoiding accidents, passive safety systems such as airbags and seat belts protect the occupants in case of an accident. In addition to legal regulations, organizations like Euro NCAP provide consumers with an independent assessment of the safety performance of cars and drive the development of safety systems in the automobile industry. Those ratings are mainly based on injury assessment reference values derived from physical parameters measured in dummies during a car crash test. The components and sub-systems of a safety system are designed to achieve the required restraint performance. Sled tests and other types of tests are then carried out by car makers and their suppliers to confirm the protection level of the safety system. A Knowledge Discovery in Databases (KDD) process is proposed in order to minimize the number of tests. The KDD process is based on the data emerging from sled tests carried out according to Euro NCAP specifications. About 30 parameters of the passive safety systems from different data sources (crash data, dummy protocol) are first analysed together with experts' opinions. A procedure is proposed to manage missing data and is validated on real data sets. Finally, a procedure is developed to estimate a set of rough initial parameters of the passive system before testing, aiming at reducing the number of tests.
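The abstract mentions a missing-data procedure and a rough initial-parameter estimate but gives no algorithmic details; a generic sketch of such a step (median imputation plus a regression-based first guess, purely an assumption here, not the authors' procedure) could look like this:

```python
from sklearn.ensemble import RandomForestRegressor
from sklearn.impute import SimpleImputer

def estimate_initial_parameters(X_train, y_train, X_new):
    """Impute missing sled-test parameters, then predict a rough initial
    value of a target injury criterion for a new configuration."""
    imputer = SimpleImputer(strategy="median")
    X_train_filled = imputer.fit_transform(X_train)
    model = RandomForestRegressor(n_estimators=200, random_state=0)
    model.fit(X_train_filled, y_train)
    X_new_filled = imputer.transform(X_new)
    return model.predict(X_new_filled)
```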

Keywords: KDD process, passive safety systems, sled test, dummy injury assessment reference values, frontal impact

Procedia PDF Downloads 365
30924 Influence of Driving Speed on Bearing Capacity Measurement of Intra-Urban Roads with the Traffic Speed Deflectometer (TSD)

Authors: Pahirangan Sivapatham, Barbara Esser, Andreas Grimmel

Abstract:

In times of limited public funds and, in particular, increased social and environmental awareness, as well as the limited availability of construction materials, a sustainable and resource-saving pavement management system is becoming more and more important. Therefore, knowledge about the condition of the structural substance, particularly the bearing capacity, and its consideration when planning maintenance measures for the subordinate network, i.e., state and municipal roads, is unavoidable. According to experience, the recommended driving speed of the Traffic Speed Deflectometer (TSD) should be higher than 40 km/h. Maintaining this speed on intra-urban roads is hardly possible because of intersections and traffic lights as well as speed limits. A sufficient background of experience for the evaluation of bearing capacity measurements with the TSD at lower speeds is not yet available. The aim of this study is to determine the lowest possible driving speed of the TSD for bearing capacity measurements on intra-urban roads. The manufacturer of the TSD used in this study states that measurements can be conducted at driving speeds above 5 km/h. It is well known that with decreasing driving speed, the viscous fractions in the response of the asphalt pavement increase; this must be taken into account when evaluating the bearing capacity data. In the scope of this study, several measurements were carried out at different speeds between 10 km/h and 60 km/h on selected intra-urban roads with the Pavement-Scanner of the University of Wuppertal, which is equipped with a TSD. The Pavement-Scanner is able to continuously determine the deflections of asphalt roads in flowing traffic at speeds of up to 80 km/h. The raw data are then aggregated to 10 m mean values so that, as a rule, a bearing capacity characteristic value can be determined for each 10 m road section. By analysing the obtained test results, the quality and validity of the determined data as a function of the driving speed of the TSD have been assessed. Moreover, the data and pictures from the Pavement-Scanner's additional measuring systems, such as the High-Speed Road Monitor, ground-penetrating radar and front cameras, can be used to determine and eliminate irregularities in the pavement, which could influence the bearing capacity.
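The 10 m aggregation step described above is straightforward; a small pandas sketch is shown for concreteness, with column names assumed for illustration.

```python
import pandas as pd

def aggregate_to_10m(raw: pd.DataFrame) -> pd.DataFrame:
    """Average raw TSD deflection samples into 10 m road sections.
    Expects columns 'chainage_m' (position along the road) and 'deflection'."""
    raw = raw.copy()
    raw["section"] = (raw["chainage_m"] // 10).astype(int) * 10
    return (raw.groupby("section", as_index=False)["deflection"]
               .mean()
               .rename(columns={"deflection": "mean_deflection"}))
```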

Keywords: bearing capacity measurement, traffic speed deflectometer, intra-urban roads, Pavement-Scanner, structural substance

Procedia PDF Downloads 212
30923 Human Creativity through Dooyeweerd's Philosophy: The Case of Creative Diagramming

Authors: Kamaran Fathulla

Abstract:

Human creativity knows no bounds. More than a millennium ago, humans expressed their knowledge on cave walls and on clay artefacts. Visuals such as diagrams and paintings have always provided us with a natural and intuitive medium for expressing such creativity. Making sense of human-generated visualisation has been influenced by western scientific philosophies, which are often reductionist in nature. Theoretical frameworks such as those delivered by Peirce dominate our views of how to make sense of visualisation, where a visual is seen as an emergent property of our thoughts. Others have reduced the richness of human-generated visuals to mere shapes drawn on a piece of paper or on a screen. This paper introduces an alternative framework where the centrality of human functioning is given explicit and richer consideration through the multi-aspectual philosophical works of Herman Dooyeweerd. Dooyeweerd's framework for understanding reality is based on fifteen aspects of reality, each having a distinct core meaning, with the totality of the aspects forming a ‘rainbow’-like spectrum of meaning. The thesis of this approach is that meaningful human functioning in most cases involves the diversity of all aspects working in synergy and harmony. The foundations and applicability of this approach are illustrated through the case of humans' use of diagramming for creative purposes, particularly within an educational context. Diagrams play an important role in education; students and lecturers use them as a powerful tool to aid their thinking. However, research into the role of diagrams used in education continues to reveal difficulties students encounter during both the interpretation and the construction of diagrams. The ever-increasing diversity of diagram types adds to these difficulties, as most real-world diagrams contain a mix of different types, such as boxes and lines, bar charts, surfaces, routes, and shapes dotted around the drawing area, each with its own distinct set of static and dynamic semantics. We argue that the persistence of these problems is grounded in our existing ways of understanding diagrams, which are often reductionist in their underpinnings and driven by a single perspective or formalism. In this paper, we demonstrate the limitations of these approaches in dealing with such problems. Consequently, we propose, discuss, and demonstrate the potential of a non-reductionist framework for understanding diagrams based on Symbolic and Spatial Mappings (SySpM), underpinned by Dooyeweerd's philosophy. The potential of the framework to account for the meaning of diagrams is demonstrated by applying it to a real-world physics diagram case study.

Keywords: SySpM, drawing style, mapping

Procedia PDF Downloads 224
30922 Designing Directed Network with Optimal Controllability

Authors: Liang Bai, Yandong Xiao, Haorang Wang, Songyang Lao

Abstract:

The directedness of links is crucial in determining the controllability of complex networks; edge directions alone can determine whether a network is controllable. For a given network, we therefore wish to design its edge directions so that the network approaches optimal controllability. In this work, we first introduce two methods to enhance network controllability by assigning edge directions. However, these two methods cannot completely mitigate the negative effects of inaccessibility and dilations. To approach optimal network controllability, the edge directions must mitigate these negative effects as much as possible. Finally, we propose an edge direction assignment for optimal controllability. The proposed method is shown to be effective on real-world and synthetic networks.

Keywords: complex network, dynamics, network control, optimization

Procedia PDF Downloads 155
30921 Modelling Fluidization by Data-Based Recurrence Computational Fluid Dynamics

Authors: Varun Dongre, Stefan Pirker, Stefan Heinrich

Abstract:

Over the last decades, the numerical modelling of fluidized bed processes has become feasible even for industrial processes. Commonly, continuum two-fluid models are applied to describe large-scale fluidization. In order to allow for coarse grids, novel two-fluid models account for unresolved sub-grid heterogeneities. However, computational efforts remain high, on the order of several hours of compute time for a few seconds of real time, thus preventing the representation of long-term phenomena such as heating or particle conversion processes. In order to overcome this limitation, data-based recurrence computational fluid dynamics (rCFD) has been put forward in recent years. rCFD can be regarded as a data-based method that relies on the numerical predictions of a conventional short-term simulation. This data is stored in a database and then used by rCFD to efficiently time-extrapolate the flow behavior at high spatial resolution. This study compares the numerical predictions of rCFD simulations with those of corresponding full CFD reference simulations for lab-scale and pilot-scale fluidized beds. In assessing the predictive capabilities of rCFD simulations, we focus on solid mixing and secondary gas holdup. We observed that predictions made by rCFD simulations are highly sensitive to numerical parameters such as the diffusivity associated with face swaps. We achieved a computational speed-up of four orders of magnitude (10,000 times faster than a classical TFM simulation), eventually allowing for real-time simulations of fluidized beds. In the next step, we apply the checkerboarding technique by introducing gas tracers subjected to convection and diffusion. We then analyze the concentration profiles by observing mixing and transport of the gas tracers, gaining insights into their convective and diffusive patterns, and moving further towards heat and mass transfer methods. Finally, we run rCFD simulations and calibrate them with numerical and physical parameters against conventional two-fluid model (full CFD) simulations. As a result, this study gives a clear indication of the applicability, predictive capabilities, and existing limitations of rCFD in the realm of fluidization modelling.

Keywords: multiphase flow, recurrence CFD, two-fluid model, industrial processes

Procedia PDF Downloads 54
30920 Transverse Behavior of Frictional Flat Belt Driven by Tapered Pulley: Change of Transverse Force under Driving State

Authors: Satoko Fujiwara, Kiyotaka Obunai, Kazuya Okubo

Abstract:

Skew is one of the important problems in designing conveyors and transmissions with frictional flat belts, in which the running belt deviates in the width direction due to the transverse force applied to it. Skew often not only degrades the stability of the belt path but also causes damage to the belt and auxiliary machines. However, transverse behavior such as skew has not been discussed quantitatively in detail for frictional belts. The objective of this study is to clarify the transverse behavior of a frictional flat belt driven by a tapered pulley. A commercially available rubber flat belt reinforced with polyamide film was prepared as the test belt, with a thickness of 1.25 mm and a length of 630 mm. The test belt was driven between two pulleys made of aluminum alloy, whose diameter and inter-axial length were 50 mm and 150 mm, respectively. Several tapered pulleys were used, with taper angles of 0 deg (for comparison), 2 deg, 4 deg, and 6 deg. In order to investigate the transverse behavior indirectly, the transverse force applied to the belt was measured while the skew was constrained under the driving state. The transverse force was measured by a load cell with free rollers contacting the side surface of the belt while the displacement in the belt width direction was constrained. The in-plane bending stiffness of the belt was varied by preparing three types of belts with widths of 20, 30, and 40 mm. The contributions of the in-plane bending stiffness of the belt and the initial inter-axial force to the transverse force were examined in the experiments; the inter-axial force was also changed by setting the distance (about 240 mm) between the two pulleys. The experimental results showed that the transverse force increased with increasing in-plane bending stiffness and initial inter-axial force. The transverse force acting on the belt running on the tapered pulley was classified into multiple components: the force arising from the deflection of the inter-axial force according to the change of taper angle, the resultant force of the bending moment applied to the belt winding around the tapered pulley, and the reaction force due to shearing deformation. The calculated transverse force agreed well with the experimental data when these components were formulated, and the largest contribution was found to be the shearing deformation, regardless of the test conditions. This study found that the transverse behavior of a frictional flat belt driven by a tapered pulley is explained by the summation of these force components.
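Stated compactly, the decomposition described above can be written as the sum of three contributions; the symbols below are illustrative placeholders, not the authors' notation.

```latex
% Hedged sketch of the force balance described in the abstract.
% F_axial: contribution from deflection of the inter-axial force with taper angle \alpha
% F_bend:  resultant of the bending moment on the belt winding around the tapered pulley
% F_shear: reaction force due to shearing deformation (reported as the dominant term)
F_{\mathrm{transverse}} \approx F_{\mathrm{axial}}(\alpha) + F_{\mathrm{bend}} + F_{\mathrm{shear}}
```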

Keywords: skew, frictional flat belt, transverse force, tapered pulley

Procedia PDF Downloads 136
30919 Digital Structural Monitoring Tools @ADaPT for Cracks Initiation and Growth due to Mechanical Damage Mechanism

Authors: Faizul Azly Abd Dzubir, Muhammad F. Othman

Abstract:

The conventional structural health monitoring approach for mechanical equipment uses inspection data from Non-Destructive Testing (NDT) during plant shutdown windows and fitness-for-service evaluation to estimate the integrity of equipment that is prone to crack damage. Yet this forecast is fraught with uncertainty because it is often based on assumptions about future operational parameters, and the prediction is not continuous or online. Advanced Diagnostic and Prognostic Technology (ADaPT) uses Acoustic Emission (AE) technology and a stochastic prognostic model to provide real-time monitoring and prediction of mechanical defects or cracks. The forecast can help the plant authority handle cracked equipment before it ruptures and causes an unscheduled shutdown of the facility. ADaPT employs process historical data trending, finite element analysis, fitness-for-service assessment, and probabilistic statistical analysis to develop a prediction model for crack initiation and growth due to mechanical damage. The prediction model is combined with live equipment operating data for real-time prediction of the remaining life span with respect to fracture. ADaPT was first applied at a hot combined feed exchanger (HCFE) that had suffered creep crack damage. The tool predicted the initiation of a crack at the top weldment area by April 2019; during the shutdown window in April 2019, a crack was discovered and repaired. Furthermore, ADaPT successfully advised the plant owner to run at full capacity and improve output by up to 7% by April 2019. ADaPT was also used on a coke drum that had extensive fatigue cracking. The initial cracks were declared safe with ADaPT, with the remaining crack lifetime extended another five months, just in time for a planned facility downtime to execute the repair. The prediction model, when combined with plant information data, allows plant operators to continuously monitor crack propagation caused by mechanical damage for improved maintenance planning and to avoid costly immediate shutdowns for repair.

Keywords: mechanical damage, cracks, continuous monitoring tool, remaining life, acoustic emission, prognostic model

Procedia PDF Downloads 62
30918 A Macroeconomic Analysis of Defense Industry: Comparisons, Trends and Improvements in Brazil and in the World

Authors: J. Fajardo, J. Guerra, E. Gonzales

Abstract:

This paper outlines a study of Brazil's industrial base of defense (IDB) through a bibliographic research method combined with an analysis of macroeconomic data from several publicly available data platforms. The paper begins with a brief study of Brazilian national industry, including analyses of productivity, income, output, and jobs. Next, the research presents a study of the defense industry in Brazil, presenting the main national companies that operate in the aeronautical, army, and naval branches. After establishing the main features of the Brazilian defense industry, data on the productivity of the defense industries of the main countries and of the companies competing with Brazilian industry were analyzed in order to frame the major Brazilian cases within a comparative analysis. Concerning the methodology, bibliographic research and the exploration of historical data series were used to analyze information, identify trends, and make comparisons over time. The research concludes with the main trends for the development of the Brazilian defense industry, comparing the current situation with the perspectives of several countries.

Keywords: economics of defence, industry, trends, market

Procedia PDF Downloads 135
30917 Numerical Analysis of Fire Performance of Timber Structures

Authors: Van Diem Thi, Mourad Khelifa, Mohammed El Ganaoui, Yann Rogaume

Abstract:

An efficient numerical method has been developed to incorporate the effects of heat transfer in timber panels on partition walls exposed to real building fires. The procedure has been added to the software package Abaqus/Standard as a user-defined subroutine (UMATHT) and has been verified using both time- and spatially-dependent heat fluxes in two- and three-dimensional problems. The aim is to contribute to the development of simulation tools needed to assist structural engineers and fire testing laboratories in technical assessment exercises. The presented method can also be used during the developmental stages of building components to optimize performance under real fire conditions. The accuracy of the thermal properties used and of the finite element models was validated by comparing the predicted results with three different fire tests available in the literature. It was found that the model calibrated to results from standard fire conditions provided reasonable predictions of temperatures within assemblies exposed to real building fires.

Keywords: timber panels, heat transfer, thermal properties, standard fire tests

Procedia PDF Downloads 321
30916 Verification of Satellite and Observation Measurements to Build Solar Energy Projects in North Africa

Authors: Samy A. Khalil, U. Ali Rahoma

Abstract:

Alongside ground measurements of solar radiation, satellite data have routinely been utilized to estimate solar energy. However, the temporal coverage of satellite data has some limits. Reanalysis, also known as "retrospective analysis" of the atmosphere's parameters, is produced by fusing the output of NWP (Numerical Weather Prediction) models with observational data from a variety of sources, including ground, satellite, ship, and aircraft observations. The result is a comprehensive record of the parameters affecting weather and climate. The effectiveness of the ERA-5 reanalysis dataset for North Africa was evaluated against high-quality surface measurements using statistical analysis, estimating the distribution of global solar radiation (GSR) over five chosen areas in North Africa during the ten-year period from 2011 to 2020. To investigate seasonal changes in dataset performance, a seasonal statistical analysis was conducted, which showed considerable differences in errors throughout the year. The performance of the dataset also changes with the temporal resolution of the data used for comparison: monthly mean values indicate better performance, but data accuracy is degraded. Solar resource assessment and power estimation are discussed using the ERA-5 solar radiation data. The average values of the mean bias error (MBE), root mean square error (RMSE), and mean absolute error (MAE) of the reanalysis solar radiation data vary from 0.079 to 0.222, 0.055 to 0.178, and 0.0145 to 0.198, respectively, over the study period. The correlation coefficient (R²) varies from 0.93 to 0.99 over the study period. The objective of this research is to provide a reliable representation of solar radiation to aid the use of solar energy in all sectors.
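For reference, the error statistics quoted above are computed in the usual way; a short numpy sketch follows, with variable names assumed and R² taken here as the coefficient of determination (definitions of the correlation measure vary slightly between studies).

```python
import numpy as np

def validation_metrics(observed, estimated):
    """MBE, RMSE, MAE, and R^2 between ground observations and ERA-5 estimates."""
    obs = np.asarray(observed, dtype=float)
    est = np.asarray(estimated, dtype=float)
    error = est - obs
    mbe = error.mean()
    rmse = np.sqrt((error ** 2).mean())
    mae = np.abs(error).mean()
    ss_res = ((obs - est) ** 2).sum()
    ss_tot = ((obs - obs.mean()) ** 2).sum()
    r2 = 1.0 - ss_res / ss_tot
    return {"MBE": mbe, "RMSE": rmse, "MAE": mae, "R2": r2}
```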

Keywords: solar energy, ERA-5 analysis data, global solar radiation, North Africa

Procedia PDF Downloads 80
30915 Supplier Selection by Bi-Objectives Mixed Integer Program Approach

Authors: K.-H. Yang

Abstract:

In the past, many excellent research studies were conducted on topics related to supplier selection. Because the factors considered in supplier selection are complicated and difficult to quantify, most researchers address supplier selection issues with qualitative approaches. Compared to qualitative approaches, quantitative approaches have been less applied in the real world. This study applies a quantitative approach to a supplier selection problem that considers operation cost and delivery reliability. Based on these factors, the study applies the Normalized Normal Constraint Method to solve the bi-objective mixed integer program of the supplier selection problem.

Keywords: bi-objectives MIP, normalized normal constraint method, supplier selection, quantitative approach

Procedia PDF Downloads 395
30914 AER Model: An Integrated Artificial Society Modeling Method for Cloud Manufacturing Service Economic System

Authors: Deyu Zhou, Xiao Xue, Lizhen Cui

Abstract:

With the increasing collaboration among various services and the growing complexity of user demands, there are more and more factors affecting the stable development of the cloud manufacturing service economic system (CMSE). This poses new challenges to the evolution analysis of the CMSE. Many researchers have modeled and analyzed the evolution process of CMSE from the perspectives of individual learning and internal factors influencing the system, but without considering other important characteristics of the system's individuals (such as heterogeneity, bounded rationality, etc.) and the impact of external environmental factors. Therefore, this paper proposes an integrated artificial social model for the cloud manufacturing service economic system, which considers both the characteristics of the system's individuals and the internal and external influencing factors of the system. The model consists of three parts: the Agent model, environment model, and rules model (Agent-Environment-Rules, AER): (1) the Agent model considers important features of the individuals, such as heterogeneity and bounded rationality, based on the adaptive behavior mechanisms of perception, action, and decision-making; (2) the environment model describes the activity space of the individuals (real or virtual environment); (3) the rules model, as the driving force of system evolution, describes the mechanism of the entire system's operation and evolution. Finally, this paper verifies the effectiveness of the AER model through computational and experimental results.
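To make the three-part Agent-Environment-Rules structure concrete, a toy agent-based sketch in the AER spirit is given below; all classes, attributes, and update rules are illustrative assumptions, not the paper's model.

```python
import random

class Agent:
    """Bounded-rational service provider: perceives the market, decides, acts."""
    def __init__(self, name, price):
        self.name, self.price = name, price

    def perceive(self, environment):
        return environment.average_price()

    def decide(self, avg_price):
        # Bounded rationality: nudge the price toward the market average with noise.
        return 0.9 * self.price + 0.1 * avg_price + random.uniform(-1, 1)

    def act(self, new_price):
        self.price = max(new_price, 0.0)

class Environment:
    """Activity space shared by all agents."""
    def __init__(self, agents):
        self.agents = agents

    def average_price(self):
        return sum(a.price for a in self.agents) / len(self.agents)

def rules_step(environment):
    """Rules model: one synchronous round of perceive-decide-act."""
    decisions = [(a, a.decide(a.perceive(environment))) for a in environment.agents]
    for agent, new_price in decisions:
        agent.act(new_price)

if __name__ == "__main__":
    env = Environment([Agent(f"service_{i}", random.uniform(5, 15)) for i in range(10)])
    for _ in range(100):
        rules_step(env)
```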

Keywords: cloud manufacturing service economic system (CMSE), AER model, artificial social modeling, integrated framework, computing experiment, agent-based modeling, social networks

Procedia PDF Downloads 65
30913 Government Big Data Ecosystem: A Systematic Literature Review

Authors: Syed Iftikhar Hussain Shah, Vasilis Peristeras, Ioannis Magnisalis

Abstract:

Data that is high in volume, velocity, and veracity and comes from a variety of sources is generated in all sectors, including the government sector. Globally, public administrations are pursuing (big) data as a new technology and trying to adopt a data-centric architecture for hosting and sharing data. Properly executed, big data and data analytics in the government (big) data ecosystem can lead to data-driven government and have a direct impact on the way policymakers work and citizens interact with governments. In this research paper, we conduct a systematic literature review. The main aims of this paper are to highlight essential aspects of the government (big) data ecosystem and to explore the most critical socio-technical factors that contribute to its successful implementation. The essential aspects of the government (big) data ecosystem include its definition, data types, data lifecycle models, and actors and their roles. We also discuss the potential impact of (big) data in public administration and gaps in the government data ecosystem literature. As this is a new topic, we did not find specific articles on the government (big) data ecosystem and therefore focused our research on various relevant areas such as humanitarian data, open government data, scientific research data, and industry data.

Keywords: applications of big data, big data, big data types, big data ecosystem, critical success factors, data-driven government, e-government, gaps in data ecosystems, government (big) data, literature review, public administration, systematic review

Procedia PDF Downloads 207
30912 A Machine Learning Decision Support Framework for Industrial Engineering Purposes

Authors: Anli Du Preez, James Bekker

Abstract:

Data is currently one of the most critical and influential emerging technologies. However, the true potential of data is yet to be exploited since, currently, only about 1% of generated data is ever actually analyzed for value creation. There is a data gap where data is not explored due to the lack of data analytics infrastructure and the required data analytics skills. This study developed a decision support framework for data analytics by following Jabareen's framework development methodology. The study focused on machine learning algorithms, which are a subset of data analytics. The developed framework is designed to assist data analysts with little experience in choosing the appropriate machine learning algorithm given the purpose of their application.
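The abstract does not list the framework's rules, but the flavour of such decision support can be sketched as a simple selector; the mapping below is a generic heuristic assumed for illustration, not the framework developed in the paper.

```python
def suggest_algorithm(task: str, n_samples: int, needs_interpretability: bool) -> str:
    """Very rough heuristic mapping an application's purpose to a starting algorithm."""
    if task == "classification":
        if needs_interpretability:
            return "decision tree or logistic regression"
        return "random forest" if n_samples < 100_000 else "gradient boosting"
    if task == "regression":
        return "linear regression" if needs_interpretability else "gradient boosting"
    if task == "clustering":
        return "k-means (or DBSCAN for irregular cluster shapes)"
    return "start with exploratory data analysis before choosing an algorithm"

# Example: a small, interpretable classification problem.
print(suggest_algorithm("classification", 5_000, needs_interpretability=True))
```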

Keywords: data analytics, industrial engineering, machine learning, value creation

Procedia PDF Downloads 151
30911 The National Socialist and Communist Propaganda Activities in the Turkish Press during World War II

Authors: Asuman Tezcan Mirer

Abstract:

This paper discusses national socialist and communist propaganda struggles in the Turkish press during World War II. It analyzes how government agencies directed and organized the Turkish press to prevent the "5th column" from influencing public opinion. During the Second World War, one of the most emphasized issues was propaganda and how Turkish citizens could be protected from the effects of disinformation. Istanbul became a significant headquarters for the belligerent countries' intelligence services, and these services were involved in gathering intelligence and disseminating propaganda. The main motive of national socialist propaganda in Turkey was "anti-communism". Subsidizing certain magazines, controlling German companies' advertisements and paper trade, spreading rumors, printing propaganda brochures, and showing German propaganda films were some of the tactics that the national socialists applied before and during the Second World War. On the other hand, the communists targeted Turkish racist/ultra-nationalist groups and their publications, which were influenced by the Nazi regime. They were also involved in distributing Marxist publications, printing brochures, and broadcasting radio programs. This study is composed of three parts. The first part describes the national socialist and communist propaganda activities in Turkey during the Second World War. The second part addresses the debates over propaganda among selected newspapers representing different ideologies. Finally, the last part analyzes the Turkish government's press policy and explains why the government allowed ideological debates in the press despite its authoritarian press policy and its stance of "active neutrality" in the international arena.

Keywords: propaganda, press, 5th column, World War II, Turkey

Procedia PDF Downloads 79
30910 Tracing the Concept of Equivalence in Translation Theories from the Linguistics Oriented Era to Present

Authors: Fatma Ülkü Kavruk

Abstract:

A comparison of old and new approaches reveals that the concept of equivalence has been interpreted and categorized in different ways by different scholars throughout history. The aim of this study is to trace the concept of equivalence in translation theories from the linguistics-oriented era to the present, referring to various translation scholars, and to provide a critical evaluation of the nature and applicability of the concept of equivalence in today's world of translation studies. Within the study, various interpretations of equivalence proposed by international scholars in translation studies are presented. In order to find out how these scholars' approaches are reflected in Turkish scholars' research, the interpretations of equivalence by various Turkish scholars are also examined. At the end of the paper, the applicability of the concept of equivalence in real life is discussed in light of these approaches.

Keywords: translation studies, equivalence, translation theories, evaluation

Procedia PDF Downloads 472
30909 Using Swarm Intelligence to Forecast Outcomes of English Premier League Matches

Authors: Hans Schumann, Colin Domnauer, Louis Rosenberg

Abstract:

In this study, machine learning techniques were deployed on real-time human swarm data to forecast the likelihood of outcomes of English Premier League matches in the 2020/21 season. These techniques included ensemble models in combination with neural networks and were tested against the industry standard of Vegas oddsmakers. Predictions made from the collective intelligence of human swarm participants achieved a positive return on investment over a full season of matches, empirically demonstrating the usefulness of a new form of artificial intelligence that values human instinct and intelligence.
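The abstract names ensemble models combined with neural networks; a generic sketch of that pairing on hypothetical swarm-derived features is shown below, where the feature set, model choices, and hyperparameters are assumptions rather than the study's configuration.

```python
from sklearn.ensemble import GradientBoostingClassifier, VotingClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

def build_match_forecaster():
    """Soft-voting ensemble over a gradient-boosted model and a small neural net.
    Inputs are assumed swarm-derived features (e.g., conviction scores per team);
    targets are match outcomes (home win / draw / away win)."""
    gbm = GradientBoostingClassifier(random_state=0)
    mlp = make_pipeline(StandardScaler(),
                        MLPClassifier(hidden_layer_sizes=(32, 16),
                                      max_iter=2000, random_state=0))
    return VotingClassifier(estimators=[("gbm", gbm), ("mlp", mlp)], voting="soft")

# Usage: forecaster = build_match_forecaster(); forecaster.fit(X_train, y_train)
# probabilities = forecaster.predict_proba(X_test)
```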

Keywords: artificial intelligence, data science, English Premier League, human swarming, machine learning, sports betting, swarm intelligence

Procedia PDF Downloads 190
30908 A Comparative Approach to the Concept of Incarnation of God in Hinduism and Christianity

Authors: Cemil Kutluturk

Abstract:

This is a comparative study of the incarnation of God in Hinduism and Christianity. After dealing with their basic ideas on the concept of the incarnation of God, the main similarities and differences between the two will be examined by quoting references from their sacred texts. In Hinduism, the term avatara is used to indicate the concept of the incarnation of God. The word avatara is derived from ava (down) and tri (to cross, to save, to attain); thus avatara means to come down or to descend. Although an avatara is commonly considered an appearance of any deity on earth, the term refers particularly to descents of Vishnu. According to Hinduism, God becomes an avatara in every age, entering diverse wombs for the sake of establishing righteousness. On the Christian side, the word incarnation means enfleshment. In Christianity, it is believed that the Logos or Word, the Second Person of the Trinity, assumed human reality. Incarnation refers both to the act of God becoming a human being and to the result of his action, namely the permanent union of the divine and human natures in the one Person of the Word. When the doctrines of incarnation and avatara are compared, some similarities and differences can be found. The basic similarity is that, in both doctrines, the incarnate God is not bound by the laws of nature as human beings are. Both reveal God's personal love and concern and emphasize loving devotion. Their entry into the world is generally accompanied by extraordinary signs. In both cases, the descent of God allows human beings to ascend to God. On the other hand, there are some distinctions between the two religious traditions. For instance, according to Hinduism there are many and repeated avataras, while Christ comes only once. Indeed, this is related to the respective cyclic and linear worldviews of the two religions. Another difference is that in Hinduism avataras are real and perfect, while in Christianity Christ is also real, yet imperfect; that is, he has human imperfections, except sin. While Christ has never been thought of as a partial incarnation, in Hinduism there are both partial and full avataras. The other difference is that while the purpose of Christ is primarily ultimate salvation, not every avatara grants ultimate liberation; some come only to save a devotee from a specific predicament.

Keywords: Avatara, Christianity, Hinduism, incarnation

Procedia PDF Downloads 237