Search results for: component analysis
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 28677

25887 Multi-Label Approach to Facilitate Test Automation Based on Historical Data

Authors: Warda Khan, Remo Lachmann, Adarsh S. Garakahally

Abstract:

The increasing complexity of software and its applicability in a wide range of industries, e.g., automotive, call for enhanced quality assurance techniques. Test automation is one option to tackle the prevailing challenges by supporting test engineers with fast, parallel, and repetitive test executions. A high degree of test automation allows for a shift from mundane (manual) testing tasks to a more analytical assessment of the software under test. However, a high initial investment of test resources is required to establish test automation, which is, in most cases, a limitation to the time constraints provided for quality assurance of complex software systems. Hence, a computer-aided creation of automated test cases is crucial to increase the benefit of test automation. This paper proposes the application of machine learning for the generation of automated test cases. It is based on supervised learning to analyze test specifications and existing test implementations. The analysis facilitates the identification of patterns between test steps and their implementation with test automation components. For the test case generation, this approach exploits historical data of test automation projects. The identified patterns are the foundation to predict the implementation of unknown test case specifications. Based on this support, a test engineer solely has to review and parameterize the test automation components instead of writing them manually, resulting in a significant time reduction for establishing test automation. Compared to other generation approaches, this ML-based solution can handle different writing styles, authors, application domains, and even languages. Furthermore, test automation tools require expert knowledge by means of programming skills, whereas this approach only requires historical data to generate test cases. The proposed solution is evaluated using various multi-label evaluation criteria (EC) and two small-sized real-world systems. 
The most prominent EC is ‘Subset Accuracy’. The promising results show an accuracy of at least 86% for test cases where a 1:1 relationship (multi-class) exists between test step specification and test automation component. For complex multi-label problems, i.e., where one test step is implemented by several components, the prediction accuracy is still 60%, which is better than the current state-of-the-art results. The prediction quality is expected to increase for larger systems with correspondingly more historical data. Consequently, this technique reduces the time needed to establish test automation and is independent of the application domain and project. As a work in progress, the next steps are to investigate incremental and active learning as additions to increase the usability of this approach, e.g., in cases where labelled historical data is scarce.
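
The ‘Subset Accuracy’ criterion mentioned above is the strictest multi-label metric: a prediction counts only if the predicted set of automation components exactly matches the true set. A minimal sketch (the label names are invented for illustration, not taken from the paper's data):

```python
def subset_accuracy(y_true, y_pred):
    """Fraction of samples whose predicted label set exactly matches the true set."""
    assert len(y_true) == len(y_pred)
    exact = sum(1 for t, p in zip(y_true, y_pred) if set(t) == set(p))
    return exact / len(y_true)

# Hypothetical predictions: each test step maps to a set of automation components.
true_labels = [{"OpenDoor"}, {"StartEngine", "CheckDisplay"}, {"Brake"}]
predicted   = [{"OpenDoor"}, {"StartEngine"},                 {"Brake"}]

print(subset_accuracy(true_labels, predicted))  # 2 of 3 exact matches -> 0.666...
```

Note that the second sample counts as wrong even though one of its two labels is correct, which is why subset accuracy drops from 86% to 60% when moving from the multi-class to the multi-label setting.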

Keywords: machine learning, multi-class, multi-label, supervised learning, test automation

Procedia PDF Downloads 117
25886 Effect of Out-Of-Plane Deformation on Relaxation Method of Stress Concentration in a Plate with a Circular Hole

Authors: Shingo Murakami, Shinichi Enoki

Abstract:

In structures, stress concentration is a factor in fatigue fracture. Basically, stress concentration is a phenomenon that should be avoided, but avoiding it is difficult; relaxing it is therefore important. Stress concentration arises from notches and circular holes. One relaxation method covers a notch or a circular hole with a composite patch. This method is used to repair aircraft wings, but it has not been systematized, and composites are more expensive than single materials. Accordingly, we propose a relaxation method in which a single-material patch covers a notch or a circular hole, and we aim to systematize this method. We performed FEA (Finite Element Analysis) on a three-dimensional model of the object: a plate with a circular hole to which a patch adheres, with a uniaxial tensile load acting on the patched plate. In a three-dimensional FEA model, it is not easy to model the adhesion layer. Basically, the yield stress of the adhesive is smaller than that of the adherends, so the adhesion layer deforms plastically earlier than the adherends under the adherends' yield load. Therefore, we propose a three-dimensional FEA model that applies a nonlinear elastic region, calculated by a bilinear approximation, to the adhesion layer. We compared the analysis results with tensile test results to confirm whether the analysis model is useful. The analysis results agreed with the tensile test results, confirming the model's usefulness. Using this three-dimensional FEA model, it was confirmed that an out-of-plane deformation occurs in the patched plate with a circular hole, and that this deformation increases the stress in the patched plate.
Therefore, we investigated how the out-of-plane deformation affects relaxation of the stress concentration in the plate with a circular hole under this relaxation method. As a result, it was confirmed that the out-of-plane deformation inhibits relaxation of the stress concentration in the plate with a circular hole.

Keywords: stress concentration, patch, out-of-plane deformation, Finite Element Analysis

Procedia PDF Downloads 293
25885 Time-Evolving Wave Packet in Phase Space

Authors: Mitsuyoshi Tomiya, Kentaro Kawamura, Shoichi Sakamoto

Abstract:

In chaotic billiard systems, scar-like localization has been found in time-evolving wave packets. We may call it the “dynamical scar” to distinguish it from the original scars in stationary states. It also appears along the vicinity of classical unstable periodic orbits when wave packets are launched along those orbits, against the hypothesis that the waves become homogeneous all around the billiard. Time-evolving wave packets are therefore investigated numerically in phase space. The Wigner function is adopted to detect the wave packets in phase space. Two-dimensional Poincaré sections of the four-dimensional phase space are introduced to clarify the dynamical behavior of the wave packets. Poincaré sections in a coordinate (x or y) and a momentum (Px or Py) can visualize the dynamical behavior of the wave packets, including the behavior in the momentum degrees of freedom. For example, in “dynamical scar” states, a slightly larger momentum component arrives first, and then progressively smaller components follow. Sections made in momentum space (Px or Py) elucidate the specific trajectories that contribute most to the “dynamical scar” states; this amounts to observing the momentum degrees of freedom at a specific fixed point (x0, y0) in phase space. Accumulations are also calculated to search for the “dynamical scar” in the Poincaré sections. The scars are found as bright spots in the momentum degrees of freedom of phase space.
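
The Wigner function used above maps a wave function into phase space. A minimal one-dimensional sketch (with ħ = 1 and illustrative parameters, not values from the paper): for a Gaussian wave packet, W(x, p) = (1/π) ∫ dy ψ*(x+y) ψ(x−y) e^{2ipy} is itself a Gaussian peaked at the packet's centre (x0, p0), which is what makes wave packets traceable in Poincaré sections of phase space.

```python
import numpy as np

# Gaussian wave packet centred at (x0, p0); parameters are illustrative only.
x0, p0, sigma = 1.0, 2.0, 0.7

def psi(x):
    norm = (np.pi * sigma ** 2) ** -0.25
    return norm * np.exp(-(x - x0) ** 2 / (2 * sigma ** 2) + 1j * p0 * x)

y = np.linspace(-8.0, 8.0, 2001)   # integration grid for the y variable
dy = y[1] - y[0]

def wigner(x, p):
    # W(x, p) = (1/pi) * Int dy  psi*(x+y) psi(x-y) exp(2 i p y)
    integrand = np.conj(psi(x + y)) * psi(x - y) * np.exp(2j * p * y)
    return (integrand.sum() * dy).real / np.pi

xs = np.linspace(-2.0, 4.0, 61)
ps = np.linspace(-1.0, 5.0, 61)
W = np.array([[wigner(x, p) for p in ps] for x in xs])
ix, ip = np.unravel_index(np.argmax(W), W.shape)
print(xs[ix], ps[ip])   # the peak sits at the packet centre (1.0, 2.0)
```

For a chaotic billiard the same construction is applied to the numerically propagated packet, and the bright spots of W restricted to a Poincaré section reveal the “dynamical scar”.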

Keywords: chaotic billiard, Poincaré section, scar, wave packet

Procedia PDF Downloads 443
25884 The Role of Transport Investment and Enhanced Railway Accessibility in Regional Efficiency Improvement in Saudi Arabia: Data Envelopment Analysis

Authors: Saleh Alotaibi, Mohammed Quddus, Craig Morton, Jobair Bin Alam

Abstract:

This paper explores the role of large-scale investment in transport sectors and the impact of increased railway accessibility on the efficiency of the regional economic productivity in the Kingdom of Saudi Arabia (KSA). There are considerable differences among the KSA regions in terms of their levels of investment and productivity due to their geographical scale and location, which in turn greatly affect their relative efficiency. The study used a non-parametric linear programming technique - Data Envelopment Analysis (DEA) - to measure the regional efficiency change over time and determine the drivers of inefficiency and their scope of improvement. In addition, Window DEA analysis is carried out to compare the efficiency performance change for various time periods. Malmquist index (MI) is also analyzed to identify the sources of productivity change between two subsequent years. The analysis involves spatial and temporal panel data collected from 1999 to 2018 for the 13 regions of the country. Outcomes reveal that transport investment and improved railway accessibility, in general, have significantly contributed to regional economic development. Moreover, the endowment of the new railway stations has spill-over effects. The DEA Window analysis confirmed the dynamic improvement in the average regional efficiency over the study periods. MI showed that the technical efficiency change was the main source of regional productivity improvement. However, there is evidence of investment allocation discrepancy among regions which could limit the achievement of development goals in the long term. These relevant findings will assist the Saudi government in developing better strategic decisions for future transport investments and their allocation at the regional level.
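
The DEA efficiency scores discussed above come from solving one small linear program per region. A hedged sketch of the input-oriented CCR envelopment model with `scipy` (the toy data below is invented; the study's actual inputs and outputs per region are not reproduced here):

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(inputs, outputs, k):
    """Input-oriented CCR efficiency of decision-making unit (DMU) k.

    Solves: min theta  s.t.  X @ lam <= theta * x_k,  Y @ lam >= y_k,  lam >= 0,
    with decision vector z = [theta, lam_1 .. lam_n].
    """
    X = np.asarray(inputs, float)    # shape (n_inputs, n_dmus)
    Y = np.asarray(outputs, float)   # shape (n_outputs, n_dmus)
    m, n = X.shape
    s = Y.shape[0]
    c = np.zeros(1 + n)
    c[0] = 1.0                                 # minimise theta
    A_in = np.hstack([-X[:, [k]], X])          # X lam - theta x_k <= 0
    A_out = np.hstack([np.zeros((s, 1)), -Y])  # -Y lam <= -y_k, i.e. Y lam >= y_k
    res = linprog(c,
                  A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.concatenate([np.zeros(m), -Y[:, k]]),
                  bounds=[(None, None)] + [(0, None)] * n)
    return res.fun

# Toy data: 3 regions, one input (transport investment), one output (productivity).
X = [[2.0, 4.0, 3.0]]
Y = [[4.0, 4.0, 6.0]]
for k in range(3):
    print(f"region {k}: efficiency = {ccr_efficiency(X, Y, k):.3f}")
```

Regions 0 and 2 lie on the efficient frontier (score 1.0); region 1 scores 0.5, meaning it could in principle produce its output with half its input. Window DEA and the Malmquist index then track how such scores move across the 1999-2018 panel.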

Keywords: data envelopment analysis, transport investment, railway accessibility, efficiency

Procedia PDF Downloads 142
25883 Uncertainty and Optimization Analysis Using PETREL RE

Authors: Ankur Sachan

Abstract:

The ability to make quick yet intelligent and value-added decisions to develop new fields has always been of great significance. In situations where the capital expenses and subsurface risk are high, carefully analyzing the inherent uncertainties in the reservoir and how they impact the predicted hydrocarbon accumulation and production becomes a daunting task. The problem is compounded in offshore environments, especially in the presence of heavy oils and disconnected sands, where the margin for error is small. Uncertainty refers to the degree to which the data set may be in error or stray from the predicted values. Understanding and quantifying the uncertainties in a reservoir model is important when estimating reserves. Uncertainty parameters can be geophysical, geological, petrophysical, etc., and identifying them is necessary to carry out the uncertainty analysis. With so many uncertainties acting at different scales, it becomes essential to have a consistent and efficient way of incorporating them into the analysis. Ranking the uncertainties by their impact on reserves helps to prioritize and guide future data gathering and uncertainty reduction efforts. Assigning probabilistic ranges to key uncertainties also enables the computation of probabilistic reserves. With this in mind, this paper, using the uncertainty and optimization process in Petrel RE, shows how the most influential uncertainties can be determined efficiently and how much impact they have on the reservoir model, thus helping to determine a cost-effective and accurate model of the reservoir.

Keywords: uncertainty, reservoir model, parameters, optimization analysis

Procedia PDF Downloads 613
25882 Ground Motion Modelling in Bangladesh Using Stochastic Method

Authors: Mizan Ahmed, Srikanth Venkatesan

Abstract:

Geological and tectonic framework indicates that Bangladesh is one of the most seismically active regions in the world. The Bengal Basin is at the junction of three major interacting plates: the Indian, Eurasian, and Burma Plates. Besides, there are many active faults within the region, e.g., the large Dauki fault in the north. The country has experienced a number of destructive earthquakes due to the movement of these active faults. Current seismic provisions of Bangladesh are mostly based on earthquake data prior to 1990. Given the record of earthquakes post-1990, there is a need to revisit the design provisions of the code. This paper compares the base shear demand of three major cities in Bangladesh: Dhaka (the capital city), Sylhet, and Chittagong, for earthquake scenarios of magnitudes Mw 7.0, 7.5, 8.0, and 8.5 using a stochastic model. In particular, the stochastic model allows the flexibility to input region-specific parameters such as the shear wave velocity profile (developed from the Global Crustal Model CRUST2.0) and to include the effects of attenuation as individual components. Effects of soil amplification were analysed using the Extended Component Attenuation Model (ECAM). Results show that the estimated base shear demand is higher in comparison with code provisions, leading to the suggestion of additional seismic design consideration in the study regions.
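
The stochastic method builds the ground-motion Fourier amplitude spectrum as a product of independent components (source × path × site), which is what lets attenuation be entered as an individual component. A hedged sketch in the spirit of Boore's point-source model, not the paper's ECAM implementation; all parameter values below are illustrative only:

```python
import numpy as np

def fas(f, Mw, R, beta=3.5, Q0=200.0, kappa=0.04, stress_drop=50.0):
    """Illustrative Fourier amplitude spectrum: source x path x site.

    f in Hz, R in km, beta in km/s, stress_drop in bars (illustrative values).
    """
    M0 = 10 ** (1.5 * Mw + 9.05)          # seismic moment, N*m
    # Brune corner frequency (M0 converted to dyne-cm for the bar-based form)
    fc = 4.9e6 * beta * (stress_drop / (M0 * 1e7)) ** (1 / 3)
    source = M0 * (2 * np.pi * f) ** 2 / (1 + (f / fc) ** 2)   # omega-squared source
    path = np.exp(-np.pi * f * R / (Q0 * beta)) / R            # anelastic + geometric
    site = np.exp(-np.pi * kappa * f)                          # high-frequency decay
    return source * path * site

f = np.linspace(0.1, 20.0, 200)
near = fas(f, 7.0, 50.0)    # Mw 7 at 50 km
far = fas(f, 7.0, 150.0)    # same event at 150 km
```

Because path and site enter multiplicatively, swapping in a region-specific attenuation term (as the ECAM approach does) changes only one factor while the source model stays fixed.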

Keywords: attenuation, earthquake, ground motion, stochastic, seismic hazard

Procedia PDF Downloads 239
25881 Additive Manufacturing Optimization Via Integrated Taguchi-Gray Relation Methodology for Oil and Gas Component Fabrication

Authors: Meshal Alsaiari

Abstract:

Fused Deposition Modeling (FDM) is one of the additive manufacturing technologies the industry is shifting to nowadays due to its simplicity and affordable cost. The fabrication processing parameters predominantly influence FDM part strength and mechanical properties. This paper demonstrates the influence of two manufacturing parameters, infill density and printing orientation, on the tensile testing evaluation indexes; these were analyzed to create a piping spacer suitable for oil and gas applications. The tensile specimens are made of two polymers, acrylonitrile styrene acrylate (ASA) and high-impact polystyrene (HIPS), to characterize the mechanical performance for creating the final product. The mechanical testing was carried out per the ASTM D638 testing standard, following Type IV requirements. Taguchi's design of experiments using an L9 orthogonal array was used to evaluate the performance output and identify the optimal manufacturing factors. The experimental results demonstrate that tensile strength is highest at 100% infill for both ASA and HIPS samples. However, the printing orientations produced differing responses: ASA is strongest at 0 degrees, while HIPS shows almost identical values between 45 and 90 degrees. An integrated Taguchi-Gray methodology was then adopted to combine the responses and identify the optimal combination of fabrication factors.
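
The grey relational step of a Taguchi-Gray analysis can be sketched as follows: each response is normalised (larger-the-better here), deviations from the ideal are turned into grey relational coefficients with distinguishing coefficient ζ = 0.5, and their mean gives one grade per run. The response values below are invented for illustration, not the paper's measurements:

```python
import numpy as np

def grey_relational_grade(responses, zeta=0.5):
    """Grey relational grade per experimental run (larger-the-better responses)."""
    r = np.asarray(responses, float)          # rows: runs, columns: responses
    norm = (r - r.min(axis=0)) / (r.max(axis=0) - r.min(axis=0))
    delta = 1.0 - norm                        # deviation from the ideal sequence
    coeff = (delta.min() + zeta * delta.max()) / (delta + zeta * delta.max())
    return coeff.mean(axis=1)                 # one grade per run

# Three hypothetical runs x two responses (e.g. tensile strength, stiffness).
runs = [[30.0, 1.1],
        [65.0, 2.0],
        [50.0, 1.5]]
grades = grey_relational_grade(runs)
best = int(np.argmax(grades))
print(grades, "best run:", best)   # the run best on both responses grades 1.0
```

Ranking runs by grade turns the multi-response problem into a single-response Taguchi analysis, which is the point of the integrated methodology.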

Keywords: FDM, ASTM D638, tensile testing, acrylonitrile styrene acrylate

Procedia PDF Downloads 78
25880 Comparative Analysis of the Computer Methods' Usage for Calculation of Hydrocarbon Reserves in the Baltic Sea

Authors: Pavel Shcherban, Vlad Golovanov

Abstract:

Nowadays, the depletion of hydrocarbon deposits on land in the Kaliningrad region is leading to active geological exploration and development of oil and natural gas reserves in the southeastern part of the Baltic Sea. LLC 'Lukoil-Kaliningradmorneft' is implementing a comprehensive program for the development of the region's shelf in 2014-2023. Due to the heterogeneity of reservoir rocks in the various open fields, as well as ambiguous conclusions on the contours of deposits, additional geological prospecting and refinement of the recoverable oil reserves are carried out. The key element is the use of an effective technique of computer reserve modeling at the first stage of processing the received data. The next step uses this information for cluster analysis, which makes it possible to optimize the field development approaches. The article analyzes the effectiveness of various methods for calculating reserves and of computer modelling methods for offshore hydrocarbon fields. Cluster analysis makes it possible to measure the influence of the obtained data on the development of a technical and economic model for mining deposits. A relationship is observed between the accuracy of the calculation of recoverable reserves and the need for modernization of the existing mining infrastructure, as well as the optimization of the scheme for opening and developing oil deposits.

Keywords: cluster analysis, computer modelling of deposits, correction of the feasibility study, offshore hydrocarbon fields

Procedia PDF Downloads 157
25879 Analysis of Steles with Libyan Inscriptions of Grande Kabylia, Algeria

Authors: Samia Ait Ali Yahia

Abstract:

Several steles with Libyan inscriptions were discovered in Grande Kabylia (Algeria), but very few researchers have taken an interest in these inscriptions. Our work is to list, if possible, all these steles in order to carry out a descriptive study of the corpus. The analysis of the steles will focus on the iconographic and epigraphic levels and on the different forms of Libyan characters, in order to highlight the alphabet used in Grande Kabylia.

Keywords: epigraphy, stele, Libyan inscription, Grande Kabylia

Procedia PDF Downloads 204
25878 Ghost Frequency Noise Reduction through Displacement Deviation Analysis

Authors: Paua Ketan, Bhagate Rajkumar, Adiga Ganesh, M. Kiran

Abstract:

Low gear noise is an important sound quality feature in modern passenger cars. Annoying gear noise from the gearbox is influenced by the gear design, gearbox shaft layout, manufacturing deviations in the components, assembly errors, and the mounting arrangement of the complete gearbox. Geometrical deviations in the form of profile and lead errors are often present on the flanks of the inspected gears. Ghost frequencies of a gear are very challenging to identify in the standard gear measurement and analysis process due to the small wavelengths involved. In this paper, gear whine noise occurring at non-integral multiples of the gear mesh frequency of a passenger car gearbox is investigated, and the root cause is identified using the displacement deviation analysis (DDA) method. The DDA method is applied to identify ghost frequency excitations on the flanks of gears arising from generation grinding. The frequency identified through DDA correlated with the frequency of vibration and noise on the end-of-line machine as well as in vehicle-level measurements. With the application of the DDA method along with standard lead profile measurement, gears with ghost frequency geometry deviations were identified on the production line to eliminate defective parts and thereby eliminate ghost frequency noise from the vehicle. Further, displacement deviation analysis can be used in conjunction with manufacturing process simulation to arrive at suitable countermeasures for arresting the ghost frequency.
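
A toy numerical illustration (not the paper's measurement data) of why ghost frequencies stand out in an order spectrum: a vibration signal is synthesised with mesh harmonics at 1x and 2x the gear mesh frequency plus a ghost component at a non-integral 1.35x, and any spectral peak whose order is not near an integer is flagged. All frequencies and amplitudes are invented:

```python
import numpy as np

fs = 8192.0                      # sampling rate, Hz
t = np.arange(0, 2.0, 1 / fs)
f_mesh = 400.0                   # gear mesh frequency, Hz
signal = (1.0 * np.sin(2 * np.pi * f_mesh * t)
          + 0.5 * np.sin(2 * np.pi * 2 * f_mesh * t)
          + 0.2 * np.sin(2 * np.pi * 1.35 * f_mesh * t))   # ghost component

spec = np.abs(np.fft.rfft(signal)) / len(t)   # one-sided amplitude spectrum
freqs = np.fft.rfftfreq(len(t), 1 / fs)

# Keep peaks above a threshold; flag those whose order f/f_mesh is non-integral.
peaks = freqs[spec > 0.05]
orders = peaks / f_mesh
ghosts = orders[np.abs(orders - np.round(orders)) > 0.05]
print(ghosts)   # expected: the 1.35 order
```

DDA works on the measured flank geometry rather than on vibration, but the signature it predicts is exactly such a non-integral order in the noise spectrum.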

Keywords: displacement deviation analysis, gear whine, ghost frequency, sound quality

Procedia PDF Downloads 134
25877 A Critical Discourse Analysis: Embedded Inequalities in the UK Disability Social Security System

Authors: Cara Williams

Abstract:

In 2006, the UK Labour government published a Green Paper introducing Employment and Support Allowance (ESA) as a replacement for Incapacity Benefit (IB), as well as a new Work Capability Assessment (WCA), signalling a controversial political and economic shift in disability welfare policy. In 2016, the Conservative government published Improving Lives: The Work, Health, and Disability Green Paper as part of its social reform agenda, evidently to address the ‘injustice’ of the ‘disability employment gap’. This paper contextualises ESA in the wider ideology and rhetoric of ‘welfare to work’, ‘dependency’, and ‘responsibility’. Using the British ‘social model of disability’ as a theoretical framework, the study engages in a critical discourse analysis of these two Green Papers. By uncovering the medicalised conceptions embedded in the texts, the analysis has revealed that ESA is linked with late capitalism's concern with the ‘disability category’.

Keywords: disability, employment, social security, welfare

Procedia PDF Downloads 158
25876 Tamper Resistance Evaluation Tests with Noise Resources

Authors: Masaya Yoshikawa, Toshiya Asai, Ryoma Matsuhisa, Yusuke Nozaki, Kensaku Asahi

Abstract:

Recently, side-channel attacks, which estimate secret keys using side-channel information such as the power consumption and compromising emanations of cryptographic circuits embedded in hardware, have become a serious problem. In particular, electromagnetic analysis attacks, which exploit the relationship between the information processing in cryptographic circuits and the electromagnetic fields they emit, both of which are related to the secret keys, are the most threatening side-channel attacks. Therefore, it is important to evaluate the tamper resistance of cryptographic circuits against electromagnetic analysis attacks. The present study performs a basic examination of the tamper resistance of cryptographic circuits using electromagnetic analysis attacks with noise resources.
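
The class of attack being evaluated can be illustrated with a toy correlation analysis. Here noisy traces leak the Hamming weight of (key XOR plaintext), and correlating each key guess's predicted leakage against the traces recovers the key byte. The Hamming-weight model, key value, trace count, and noise level are all assumptions for illustration; real attacks target cipher intermediates (e.g. S-box outputs) rather than a bare XOR, and the noise resources studied in the paper would raise the number of traces needed:

```python
import numpy as np

rng = np.random.default_rng(0)
hw = np.array([bin(v).count("1") for v in range(256)])  # Hamming-weight table

true_key = 0x5A
plaintexts = rng.integers(0, 256, size=2000)
# Simulated side-channel traces: leakage plus Gaussian noise.
traces = hw[plaintexts ^ true_key] + rng.normal(0.0, 1.0, size=plaintexts.size)

def correlate(a, b):
    a = a - a.mean()
    b = b - b.mean()
    return (a @ b) / np.sqrt((a @ a) * (b @ b))

# Correlate every key guess's hypothetical leakage with the measured traces.
scores = [correlate(hw[plaintexts ^ guess].astype(float), traces)
          for guess in range(256)]
recovered = int(np.argmax(scores))
print(hex(recovered))   # the correct key byte gives the highest correlation
```

Tamper-resistance evaluation then asks how many traces (or how much added noise) are needed before this argmax stops pointing at the true key.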

Keywords: tamper resistance, cryptographic circuit, hardware security evaluation, noise resources

Procedia PDF Downloads 492
25875 Extended Shelf Life of Chicken Meat Using Carboxymethyl Cellulose Coated Polypropylene Films Containing Zataria multiflora Essential Oil

Authors: Z. Honarvar, M. Farhoodi, M. R. Khani, S. Shojaee-Aliabadi

Abstract:

The purpose of the present study was to evaluate carboxymethyl cellulose (CMC) coated polypropylene (PP) films containing Zataria multiflora essential oil (ZEO, 4%) as antimicrobial packaging for chicken breast stored at 4 °C. To increase the hydrophilicity of the PP film, it was treated with atmospheric cold plasma prior to coating with CMC. Then, different films, including PP, PP/CMC, and PP/CMC containing 4% ZEO, were used for packaging the chicken meat in the vapor phase. Total viable count, pseudomonads population, and oxidative (TBA) changes of the chicken breast were analyzed during shelf life. Results showed that the shelf life of chicken meat kept in films containing ZEO improved from three to nine days compared to the control sample, without any direct contact between the meat and the film. A study of the oxygen barrier properties of the bilayer film without essential oil (0.096 cm3 μm/m2 d kPa) in comparison with the PP film (416 cm3 μm/m2 d kPa) shows that coating PP with CMC significantly reduces the oxygen permeation of the resulting packaging (P<0.05), which reduced aerobic bacteria growth. The chemical composition of ZEO was also evaluated by gas chromatography-mass spectrometry (GC-MS), which showed that thymol was the main antimicrobial and antioxidant component of the essential oil. The results revealed that PP/CMC containing ZEO has good potential for application as active food packaging in indirect contact, which would also improve the sensory properties of the product.
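
The quoted permeabilities (cm³·μm per m²·day·kPa) can be turned into an oxygen flow through a package via Q = P · ΔP · A / L. The film thickness, package area, and oxygen partial-pressure difference below are assumptions for illustration, not values from the study; only the two permeabilities come from the abstract:

```python
def oxygen_flow(permeability, thickness_um, area_m2, dp_kpa):
    """Oxygen transmission in cm^3 per day: Q = P * dP * A / L."""
    return permeability * dp_kpa * area_m2 / thickness_um

p_pp, p_bilayer = 416.0, 0.096   # cm^3*um/(m^2*d*kPa), from the abstract
L, A, dP = 50.0, 0.05, 21.0      # film thickness (um), area (m^2), dP (kPa): assumed

print(oxygen_flow(p_pp, L, A, dP))        # ~8.7 cm^3/day through plain PP
print(oxygen_flow(p_bilayer, L, A, dP))   # ~0.002 cm^3/day through PP/CMC
```

The roughly 4000-fold drop in daily oxygen ingress is what starves the aerobic spoilage bacteria mentioned in the abstract.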

Keywords: shelf life, chicken breast, polypropylene, carboxymethyl cellulose, essential oil

Procedia PDF Downloads 226
25874 Document-level Sentiment Analysis: An Exploratory Case Study of Low-resource Language Urdu

Authors: Ammarah Irum, Muhammad Ali Tahir

Abstract:

Document-level sentiment analysis in Urdu is a challenging Natural Language Processing (NLP) task due to the difficulty of working with lengthy texts in a language with constrained resources. Deep learning models, which are complex neural network architectures, are well-suited to text-based applications in addition to data formats like audio, image, and video. To investigate the potential of deep learning for Urdu sentiment analysis, we implemented five different deep learning models, including Bidirectional Long Short-Term Memory (BiLSTM), Convolutional Neural Network (CNN), Convolutional Neural Network with Bidirectional Long Short-Term Memory (CNN-BiLSTM), and Bidirectional Encoder Representations from Transformers (BERT). In this study, we developed a hybrid deep learning model called BiLSTM-Single Layer Multi Filter Convolutional Neural Network (BiLSTM-SLMFCNN) by fusing the BiLSTM and CNN architectures. The proposed and baseline techniques were applied to the Urdu Customer Support data set and the IMDB Urdu movie review data set using pre-trained Urdu word embeddings suitable for document-level sentiment analysis. The results of these techniques were evaluated, and our proposed model outperforms all other deep learning techniques for Urdu sentiment analysis. BiLSTM-SLMFCNN outperformed the baseline deep learning models, achieving 83%, 79%, and 83% accuracy on the small, medium, and large sized IMDB Urdu movie review data sets, respectively, and 94% accuracy on the Urdu Customer Support data set.
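
The "single layer multi filter" half of the hybrid can be sketched in isolation: parallel 1-D convolutions with several kernel sizes slide over the sequence of BiLSTM hidden states, are ReLU-activated, max-pooled over time, and concatenated into one document vector. The hidden states below are random stand-ins and the shapes, kernel sizes, and filter counts are assumptions, not the paper's actual configuration:

```python
import numpy as np

rng = np.random.default_rng(1)
T, H = 50, 64                        # sequence length, BiLSTM hidden size (2 x 32)
hidden = rng.normal(size=(T, H))     # stand-in for BiLSTM output states

def multi_filter_features(seq, kernel_sizes=(2, 3, 4), n_filters=8):
    """One conv layer with multiple kernel sizes, ReLU, max-over-time, concat."""
    feats = []
    for k in kernel_sizes:
        W = rng.normal(scale=0.1, size=(n_filters, k, seq.shape[1]))
        # Valid 1-D convolution over time for each filter.
        conv = np.array([[np.sum(W[f] * seq[t:t + k])
                          for t in range(len(seq) - k + 1)]
                         for f in range(n_filters)])
        feats.append(np.maximum(conv, 0).max(axis=1))   # ReLU + max-over-time
    return np.concatenate(feats)

doc_vector = multi_filter_features(hidden)
print(doc_vector.shape)   # (24,) = 3 kernel sizes x 8 filters
```

In the full model this fixed-length vector would feed a classification layer, so documents of any length collapse to the same feature size, which is what makes the architecture suitable for lengthy Urdu documents.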

Keywords: urdu sentiment analysis, deep learning, natural language processing, opinion mining, low-resource language

Procedia PDF Downloads 57
25873 Thermo-Economic Evaluation of Sustainable Biogas Upgrading via Solid-Oxide Electrolysis

Authors: Ligang Wang, Theodoros Damartzis, Stefan Diethelm, Jan Van Herle, François Marechal

Abstract:

Biogas production from anaerobic digestion of organic sludge from wastewater treatment as well as various urban and agricultural organic wastes is of great significance to achieve a sustainable society. Two upgrading approaches for cleaned biogas can be considered: (1) direct H₂ injection for catalytic CO₂ methanation and (2) CO₂ separation from biogas. The first approach usually employs electrolysis technologies to generate hydrogen and increases the biogas production rate; while the second one usually applies commercially-available highly-selective membrane technologies to efficiently extract CO₂ from the biogas with the latter being then sent afterward for compression and storage for further use. A straightforward way of utilizing the captured CO₂ is on-site catalytic CO₂ methanation. From the perspective of system complexity, the second approach may be questioned, since it introduces an additional expensive membrane component for producing the same amount of methane. However, given the circumstance that the sustainability of the produced biogas should be retained after biogas upgrading, renewable electricity should be supplied to drive the electrolyzer. Therefore, considering the intermittent nature and seasonal variation of renewable electricity supply, the second approach offers high operational flexibility. This indicates that these two approaches should be compared based on the availability and scale of the local renewable power supply and not only the technical systems themselves. Solid-oxide electrolysis generally offers high overall system efficiency, and more importantly, it can achieve simultaneous electrolysis of CO₂ and H₂O (namely, co-electrolysis), which may bring significant benefits for the case of CO₂ separation from the produced biogas. 
When taking co-electrolysis into account, two additional upgrading approaches can be proposed: (1) direct steam injection into the biogas, with the mixture going through the SOE, and (2) CO₂ separation from the biogas, with the CO₂ used later for co-electrolysis. A case study integrating an SOE into a wastewater treatment plant is investigated, with wind power as the renewable power source. The dynamic production of biogas is provided on an hourly basis with the corresponding oxygen and heating requirements. All four approaches mentioned above are investigated and compared thermo-economically: (a) steam electrolysis with grid power, as the base case for steam electrolysis, (b) CO₂ separation and co-electrolysis with grid power, as the base case for co-electrolysis, (c) steam electrolysis and CO₂ separation (and storage) with wind power, and (d) co-electrolysis and CO₂ separation (and storage) with wind power. The influence of the scale of the wind power supply is investigated by a sensitivity analysis. The derived results provide a general understanding of the economic competitiveness of SOE for sustainable biogas upgrading, thus assisting decision making for biogas production sites. The research leading to the presented work is funded by the European Union's Horizon 2020 programme under grant agreements n° 699892 (ECo, topic H2020-JTI-FCH-2015-1) and SCCER BIOSWEET.
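
A back-of-envelope stoichiometry for the direct H₂-injection route: the CO₂ in the biogas is methanated via CO₂ + 4 H₂ → CH₄ + 2 H₂O, so every mole of CO₂ needs four moles of electrolytic hydrogen. The 60/40 CH₄/CO₂ biogas composition below is a typical assumed value, not a figure from the paper:

```python
def upgrade(biogas_mol, ch4_frac=0.60):
    """H2 demand and CH4 output for catalytic methanation of biogas CO2."""
    co2 = biogas_mol * (1 - ch4_frac)
    h2_needed = 4 * co2                      # Sabatier stoichiometry
    ch4_out = biogas_mol * ch4_frac + co2    # original CH4 + methanated CO2
    return h2_needed, ch4_out

h2, ch4 = upgrade(100.0)
print(h2, ch4)   # 160 mol H2 turns 100 mol of 60/40 biogas into 100 mol CH4
```

The same arithmetic explains the appeal of co-electrolysis: the SOE can produce the H₂ (or a syngas precursor) from steam and the separated CO₂ in a single unit, and the electricity needed to drive this reaction is what ties upgrading economics to the scale of the wind power supply.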

Keywords: biogas upgrading, solid-oxide electrolyzer, co-electrolysis, CO₂ utilization, energy storage

Procedia PDF Downloads 144
25872 Experimental Validation of a Mathematical Model for Sizing End-of-Production-Line Test Benches for Electric Motors of Electric Vehicle

Authors: Emiliano Lustrissimi, Bonifacio Bianco, Sebastiano Caravaggi, Antonio Rosato

Abstract:

A mathematical framework has been designed to enhance the configuration of an end-of-production-line (EOL) test bench. This system can be used to assess the performance of electric motors (EMs) or axles intended for electric vehicles. The model has been developed to predict the behaviour of EOL test benches and electric motors/axles under various boundary conditions, eliminating the need for extensive physical testing and reducing the corresponding power consumption. The suggested model is versatile, capable of being utilized across various types of electric motors or axles, and adaptable to accommodate varying power ratings. The maximum performance to be guaranteed by the EMs according to the car maker's specifications is taken as the model input. Then, the required performance of each main EOL test bench component is calculated, and the corresponding systems available on the market are selected based on manufacturers' catalogues. In this study, an EOL test bench has been designed according to the proposed model outputs for testing a low-power (about 22 kW) electric axle. The performance of the designed EOL test bench has been measured and used to validate the proposed model and assess both the consistency of the constraints and the accuracy of predictions in terms of electric demands. The comparison between experimental and predicted data exhibited reasonable agreement, demonstrating that, despite some discrepancies, the model gives an accurate representation of EOL test bench performance.
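
The "specifications in, catalogue selection out" flow described above can be sketched for one bench component. Here a load-dynamometer power requirement is derived from a hypothetical motor corner point with a safety margin, and the smallest catalogue unit covering it is chosen; the torque, speed, margin, and catalogue ratings are all invented, not the paper's values:

```python
import math

def required_dyno_kw(max_torque_nm, speed_rpm, margin=1.2):
    """Load-dyno power requirement (kW) from the motor's guaranteed corner point."""
    omega = speed_rpm * 2 * math.pi / 60            # rad/s
    return max_torque_nm * omega * margin / 1000.0  # P = T * omega, with margin

def select_bench(required_kw, catalogue_kw):
    """Pick the smallest catalogue unit that covers the requirement."""
    candidates = [c for c in sorted(catalogue_kw) if c >= required_kw]
    if not candidates:
        raise ValueError("no catalogue unit covers the requirement")
    return candidates[0]

req = required_dyno_kw(max_torque_nm=70.0, speed_rpm=3000.0)  # ~22 kW class axle
bench = select_bench(req, [15.0, 30.0, 55.0, 90.0])
print(round(req, 1), bench)   # ~26.4 kW required -> 30 kW unit selected
```

The full framework repeats this step for every main bench component (power electronics, cooling, sensing), which is how one model accommodates motors of varying power ratings.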

Keywords: electric motors, electric vehicles, end-of-production-line test bench, mathematical model, field tests

Procedia PDF Downloads 39
25871 Design and Analysis of a Rear Bumper of an Automobile with a Hybrid Polymer Composite of Oil Palm Empty Fruit Bunch Fiber/Banana Fibres

Authors: S. O. Ologe, U. P. Anaidhuno, Duru C. A.

Abstract:

This research investigated the design and analysis of a rear bumper of an automobile made from a hybrid polymer composite of oil palm empty fruit bunch fibre (OPEBF) and banana fibre. The OPEBF/banana fibre hybrid polymer composite is low-cost and lightweight, and possesses satisfactory mechanical properties. In this research work, hybrid composites were developed using the hand layup technique based on OPEBF:banana fibre percentage combinations of 10:90, 20:80, 30:70, 40:60, 50:50, 60:40, 70:30, 80:20, 90:10, and 95:5. Mechanical properties of a compressive strength of 65 MPa, a flexural strength of 20 MPa, and an impact strength of 3.25 J were observed, and the simulation analysis under a 500 N load with a factor of safety of 3 showed a strength suitable for an automobile bumper, with the advantage of weight reduction.
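
The static check implied by the abstract compares the stress from the 500 N design load against the measured strength derated by the factor of safety of 3. The load-bearing cross-section below is an assumed value for illustration, not a dimension from the study:

```python
def is_safe(load_n, area_mm2, strength_mpa, fos=3.0):
    """Compare applied stress with the strength derated by the factor of safety."""
    stress = load_n / area_mm2       # N/mm^2 is numerically equal to MPa
    allowable = strength_mpa / fos
    return stress <= allowable, stress, allowable

ok, stress, allow = is_safe(load_n=500.0, area_mm2=100.0, strength_mpa=65.0)
print(ok, stress, allow)   # 5.0 MPa applied vs ~21.7 MPa allowable
```

With the assumed 100 mm² section, the applied stress sits well below the allowable value, consistent with the abstract's conclusion that the composite is strong enough for the bumper at reduced weight.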

Keywords: OPEBF, Banana, fibre, hybrid

Procedia PDF Downloads 98
25870 Definite Article Errors and Effect of L1 Transfer

Authors: Bimrisha Mali

Abstract:

The present study investigates the types of errors that English as a second language (ESL) learners produce when using the definite article ‘the’. The participants were given a questionnaire assessing learner ability, consisting of three cloze tests and two free composition tests. Each participant's response was received in the form of written data. A total of 78 participants from three government schools took part in the study. The participants are high-school students from rural Assam, a north-eastern state of India, and their ages ranged between 14 and 15. The medium of instruction and communication among the students is the local language, Assamese. Pit Corder's steps for conducting error analysis were followed for the analysis procedure. Four types of errors were found: (1) deletion of the definite article, (2) use of the definite article as a modifier, like an adjective, (3) incorrect use of the definite article with singular proper nouns, and (4) substitution of the definite article by the indefinite article ‘a’. Classifiers in Assamese that express definiteness are used with nouns, adjectives, and numerals. It is found that native language (L1) transfer plays a pivotal role in the learners' errors. The analysis reveals the learners' inability to acquire the semantic connotation of definiteness in English due to native language (L1) interference.
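
The tallying step of a Corder-style error analysis can be sketched as follows: each annotated response is labelled with one of the four error types found in the study, and frequencies are aggregated across participants. The labels and counts below are invented sample data, not the study's results:

```python
from collections import Counter

# The four error types reported in the study; category names are ours.
ERROR_TYPES = ("deletion", "modifier_as_adjective",
               "with_singular_proper_noun", "substitution_by_a")

# Hypothetical annotations, one per erroneous article use in the written data.
annotated = ["deletion", "deletion", "substitution_by_a",
             "with_singular_proper_noun", "deletion", "modifier_as_adjective"]

counts = Counter(annotated)
total = sum(counts.values())
for etype in ERROR_TYPES:
    print(f"{etype}: {counts[etype]} ({100 * counts[etype] / total:.0f}%)")
```

Such a frequency profile is what lets the analysis link dominant error types (here, deletion) back to L1 features such as Assamese classifiers expressing definiteness.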

Keywords: definite article error, l1 transfer, error analysis, ESL

Procedia PDF Downloads 116
25869 A Comprehensive Key Performance Indicators Dashboard for Emergency Medical Services

Authors: Giada Feletti, Daniela Tedesco, Paolo Trucco

Abstract:

The present study aims to develop a dashboard of Key Performance Indicators (KPIs) to enhance the information and predictive capabilities of Emergency Medical Services (EMS) systems, supporting both the operational and strategic decisions of different actors. The research methodology began with a review of the technical-scientific literature on the indicators currently used to measure the performance of EMS systems. This literature analysis showed that current studies focus on two distinct perspectives: the ambulance service, a fundamental component of pre-hospital health treatment, and patient care in the Emergency Department (ED). The perspective proposed by this study is an integrated view of the ambulance service process and the ED process, both essential to ensuring high quality of care and patient safety. The proposal thus focuses on the entire healthcare service process and, as such, captures the interconnection between the two EMS processes, pre-hospital and hospital, connected by the assignment of the patient to a specific ED. In this way, the entire patient management can be optimized, and attention is paid to dependencies between decisions that current EMS management models tend to neglect or underestimate. In particular, integrating the two processes makes it possible to evaluate the advantage of an ED selection decision with visibility of each ED's saturation status, considering distance, available resources, and expected waiting times. Starting from a critical review of the KPIs proposed in the extant literature, the dashboard was designed: the large number of analyzed KPIs was reduced by first eliminating those not in line with the aim of the study and then those supporting similar functionality.
The KPIs finally selected were tested on a realistic dataset, which led to the exclusion of additional indicators for which the required data were unavailable. The final dashboard, discussed and validated by experts in the field, includes a variety of KPIs able to support operational and planning decisions, early warning, and citizens' real-time awareness of ED accessibility. By associating each KPI with the EMS phase it refers to, it was also possible to design a well-balanced dashboard covering both the efficiency and the effectiveness of the entire EMS process. Indeed, traditional KPIs mainly cover the initial phases, related to the ambulance service and its interconnection with patient care, rather than the subsequent phases taking place in the hospital ED; this could be addressed in a potential future development of the dashboard. Moreover, the research could proceed by building a multi-layer dashboard, with a first level holding a minimal set of KPIs measuring the basic performance of the EMS system at an aggregate level and further levels holding KPIs that bring additional, more detailed information.
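To make the idea of phase-specific KPIs concrete, the following sketch computes two indicators commonly used for the pre-hospital phase from a toy dataset. This is illustrative only, not the authors' dashboard; the field names and values are assumptions:

```python
# Illustrative sketch: a 90th-percentile ambulance response time and a mean
# ED waiting time, two KPIs of the kind an EMS dashboard might display.
import math

def percentile(values, p):
    """Nearest-rank percentile: smallest value with at least p% of data at or below it."""
    ordered = sorted(values)
    k = max(0, math.ceil(p / 100 * len(ordered)) - 1)
    return ordered[k]

# Toy records: one dict per emergency call (hypothetical values).
records = [
    {"response_min": 7.2, "ed_wait_min": 31.0},
    {"response_min": 9.5, "ed_wait_min": 45.0},
    {"response_min": 6.1, "ed_wait_min": 22.0},
    {"response_min": 12.4, "ed_wait_min": 58.0},
]

response_p90 = percentile([r["response_min"] for r in records], 90)
mean_ed_wait = sum(r["ed_wait_min"] for r in records) / len(records)
```

A multi-layer dashboard could expose only such aggregates at the first level and drill down to per-ED or per-call views at lower levels.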

Keywords: dashboard, decision support, emergency medical services, key performance indicators

Procedia PDF Downloads 100
25868 Analytical Modelling of Surface Roughness during Compacted Graphite Iron Milling Using Ceramic Inserts

Authors: Ş. Karabulut, A. Güllü, A. Güldaş, R. Gürbüz

Abstract:

This study investigates the effects of the lead angle and of chip thickness variation on surface roughness during the machining of compacted graphite iron using ceramic cutting tools under dry cutting conditions. Analytical models were developed for predicting the surface roughness of the specimens after face milling. Experimental data were collected and fed into an artificial neural network model: a multilayer perceptron trained with the back-propagation algorithm, using lead angle, cutting speed, and feed rate, together with chip thickness, as input parameters. Furthermore, analysis of variance was employed to determine the effects of the cutting parameters on surface roughness. Both the artificial neural network and regression analysis were used to predict surface roughness; the predicted values were compared with the collected experimental data, and the corresponding percentage error was computed. The analysis revealed that the lead angle is the dominant factor affecting surface roughness, and the experimental results showed an improvement in the surface roughness value as the lead angle decreases from 88° to 45°.
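Alongside the neural network, the abstract mentions regression analysis; the sketch below fits a simple least-squares line of roughness against lead angle, mirroring the reported trend that Ra improves as the lead angle drops from 88° to 45°. The data points are synthetic placeholders, not the paper's measurements:

```python
# Hedged sketch: ordinary least squares for y = a + b*x, applied to
# synthetic (lead angle, Ra) pairs that follow the reported trend.

def linear_fit(xs, ys):
    """Return intercept a and slope b minimizing squared error of y = a + b*x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    a = my - b * mx
    return a, b

lead_angle = [45, 60, 75, 88]        # degrees (assumed sample points)
roughness = [0.9, 1.2, 1.6, 2.1]     # Ra in micrometers (illustrative values)
a, b = linear_fit(lead_angle, roughness)
predicted_ra_50 = a + b * 50         # roughness predicted at a 50-degree lead angle
```

A positive slope here corresponds to the finding that smaller lead angles yield smoother surfaces; the paper's actual models (MLP and regression) are of course fitted to real experimental data.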

Keywords: CGI, milling, surface roughness, ANN, regression, modeling, analysis

Procedia PDF Downloads 440
25867 Surface Roughness Prediction Using Numerical Scheme and Adaptive Control

Authors: Michael K. O. Ayomoh, Khaled A. Abou-El-Hossein, Sameh F. M. Ghobashy

Abstract:

This paper proposes a numerical modelling scheme for surface roughness prediction. The approach is premised on a 3D difference analysis method enhanced with a feedback control loop in which a set of adaptive weights is generated. The surface roughness values utilized in this paper were adapted from [1], whose experiments were carried out using S55C high carbon steel; a comparison was further carried out between the proposed technique and those utilized in [1]. The experimental design has three cutting parameters, namely depth of cut, feed rate, and cutting speed, with a twenty-seven-point experimental sample space. The simulation trials, conducted using Matlab software, fall into two sub-classes: prediction of the surface roughness readings for the non-boundary cutting combinations (NBCC) with the aid of the known surface roughness readings of the boundary cutting combinations (BCC), followed by use of the predicted NBCC outputs to recover the surface roughness readings for the BCC. The simulation trial for the NBCC attained total stability at the 7th iteration, i.e., the point where the actual and desired roughness readings are equal, with the error driven to zero by a set of dynamic weights generated in each subsequent simulation trial. A comparative study among the three methods showed that the proposed difference analysis technique with adaptive weights from feedback control produced a much more accurate output than the abductive and regression analysis techniques presented in [1].
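The feedback idea described above, a weight updated each iteration until the actual and desired readings coincide, can be sketched in a few lines. This is a conceptual illustration with made-up values, not the paper's 3D difference scheme:

```python
# Conceptual sketch: an adaptive multiplicative weight is corrected in
# proportion to the remaining error until actual ~= desired. With a gain of
# 0.5 the error halves each iteration, so convergence is geometric.

def adapt(desired, actual, gain=0.5, tol=1e-6, max_iter=100):
    """Iteratively update a weight w until w * actual matches desired."""
    w = 1.0
    for i in range(max_iter):
        error = desired - w * actual
        if abs(error) < tol:
            return w, i
        w += gain * error / actual   # proportional correction of the weight
    return w, max_iter

w, iters = adapt(desired=1.25, actual=1.60)   # illustrative roughness readings
```

The converged weight is simply the ratio desired/actual; in the paper, a whole set of such dynamic weights is generated per trial.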

Keywords: difference analysis, surface roughness, mesh analysis, feedback control, adaptive weight, boundary element

Procedia PDF Downloads 613
25866 Fast Detection of Local Fiber Shifts by X-Ray Scattering

Authors: Peter Modregger, Özgül Öztürk

Abstract:

Glass fabric reinforced thermoplastics (GFRT) are composite materials that combine low weight with resilient mechanical properties, rendering them especially suitable for automobile construction. However, defects in the glass fabric as well as in the polymer matrix can occur during manufacturing, which may compromise component lifetime or even safety. One type of defect is a local fiber shift, which can be difficult to detect. Recently, we experimentally demonstrated the reliable detection of local fiber shifts by X-ray scattering based on the edge-illumination (EI) principle. EI is a novel X-ray imaging technique that utilizes two slit masks, one in front of the sample and one in front of the detector, to simultaneously provide absorption, phase, and scattering contrast. Contrast formation works as follows: the incident X-ray beam is split into smaller beamlets by the sample mask; these are distorted by interaction with the sample, and the distortions are scaled up by the detector mask, rendering them visible to a pixelated detector. In the experiment, the sample mask is laterally scanned, resulting in a Gaussian-like intensity distribution in each pixel. The area under the curve represents absorption, the peak offset represents refraction, and the width of the curve represents the scattering occurring in the sample; here, scattering is caused by the numerous glass fiber/polymer matrix interfaces. In our recent publication, we showed that the standard deviations of the absorption and scattering values over a selected field of view can be used to distinguish between intact samples and samples with local fiber shift defects. Defect detection performance was quantified by p-values (p = 0.002 for absorption and p = 0.009 for scattering) and contrast-to-noise ratios (CNR = 3.0 for absorption and CNR = 2.1 for scattering) between the two groups of samples.
This was further improved for the scattering contrast to p = 0.0004 and CNR = 4.2 by a harmonic decomposition analysis of the images. Thus, we concluded that local fiber shifts can be reliably detected by the X-ray scattering contrast provided by EI. However, potential applications in, for example, production monitoring require fast data acquisition. For the results above, the sample mask was scanned over 50 individual steps, resulting in long total scan times. In this paper, we will demonstrate that reliable detection of local fiber shift defects is also possible using single images, which implies a speed-up of the total scan time by a factor of 50. Additional performance improvements will also be discussed, opening the possibility of real-time acquisition. This contributes a vital step toward translating EI to industrial applications for a wide variety of materials containing numerous interfaces on the micrometer scale.
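The per-pixel contrast extraction described above (area for absorption, peak offset for refraction, width for scattering) can be sketched via the moments of the sampled intensity curve. The data below are synthetic, and this moment-based recipe is a simplification of the actual EI fitting procedure:

```python
# Hedged sketch: area, centroid, and spread of a sampled beamlet intensity
# curve stand in for the absorption, refraction, and scattering signals.
import math

def curve_moments(positions, intensities):
    """Return (area, centroid, standard deviation) of a sampled curve."""
    area = sum(intensities)
    centroid = sum(p * i for p, i in zip(positions, intensities)) / area
    variance = sum((p - centroid) ** 2 * i for p, i in zip(positions, intensities)) / area
    return area, centroid, math.sqrt(variance)

positions = [x * 0.5 for x in range(-10, 11)]      # sample-mask scan positions
mu, sigma = 0.4, 1.2                               # synthetic beamlet shift and width
intensities = [math.exp(-((p - mu) ** 2) / (2 * sigma ** 2)) for p in positions]
area, shift, width = curve_moments(positions, intensities)
```

An increase in `width` relative to a reference pixel would indicate additional small-angle scattering from fiber/matrix interfaces, the signal exploited for defect detection.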

Keywords: defects in composites, X-ray scattering, local fiber shifts, X-ray edge illumination

Procedia PDF Downloads 54
25865 Analysis on the Building Energy Performance of a Retrofitted Residential Building with RETScreen Expert Software

Authors: Abdulhameed Babatunde Owolabi, Benyoh Emmanuel Kigha Nsafon, Jeung-Soo Huh

Abstract:

Energy efficiency measures for residential buildings in South Korea are a national issue because most apartments built in past decades were constructed without proper energy efficiency measures, making the energy performance of old buildings very poor compared with new buildings. The adoption of advanced building technologies and regulatory building codes is an effective energy efficiency strategy for new construction, but existing buildings need to be retrofitted with energy conservation measure (ECM) equipment in order to conserve energy and reduce GHG emissions. To achieve this, the Institute for Global Climate Change and Energy (IGCCE), Kyungpook National University (KNU), Daegu, South Korea, employed RETScreen Expert software to carry out a measurement and verification (M&V) analysis on an existing building in Korea, using six years of gas consumption data collected from Daesung Energy Co., Ltd., in order to determine the building's energy performance after the introduction of the ECM. Through the M&V, energy efficiency was attained and resident doubt was reduced. From the analysis, a total of 657 gigajoules (GJ) of liquefied natural gas (LNG) was consumed at a rate of 0.34 GJ/day, peaking in 2015, at a cost to the occupant of $10,821.
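The core M&V calculation behind such an analysis is a baseline-versus-measured comparison. The sketch below uses the baseline rate reported above (0.34 GJ/day); the reporting period and post-retrofit consumption are hypothetical placeholders, not figures from the study:

```python
# Minimal measurement-and-verification sketch: avoided energy is the
# baseline consumption, adjusted to the reporting period, minus the
# measured post-retrofit consumption.

def avoided_energy(baseline_gj_per_day, reporting_days, measured_gj):
    """IPMVP-style avoided energy: adjusted baseline minus measured use."""
    adjusted_baseline = baseline_gj_per_day * reporting_days
    return adjusted_baseline - measured_gj

# 0.34 GJ/day is the rate reported in the abstract; the other two numbers
# are illustrative assumptions.
savings = avoided_energy(baseline_gj_per_day=0.34, reporting_days=365, measured_gj=95.0)
```

In practice, RETScreen Expert also normalizes the baseline for weather and occupancy before subtracting, which this sketch omits.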

Keywords: energy efficiency, measurement and verification, performance analysis, RETScreen Expert

Procedia PDF Downloads 123
25864 Production of Novel Antibiotics by Importing eryK and eryG Genes in Streptomyces fradiae

Authors: Neda Gegar Goshe, Hossein Rassi

Abstract:

The antibacterial properties of macrolide antibiotics (such as erythromycin and tylosin) depend ultimately on the glycosylation of otherwise inactive polyketide lactones. Among the sugars commonly found in such macrolides are various 6-deoxyhexoses, including the 3-dimethylamino sugars mycaminose and desosamine (4-deoxymycaminose). Some macrolides (such as tylosin) possess multiple sugar moieties, whereas others (such as erythromycin) have two sugar substituents. Streptomyces fradiae is an ideal host for the development of generic polyketide-overproducing strains because it contains three of the most common precursors used by modular PKS, malonyl-CoA, methylmalonyl-CoA, and ethylmalonyl-CoA, and is amenable to genetic manipulation. As patterns of glycosylation markedly influence a macrolide's drug activity, there is considerable interest in the possibility of using combinatorial biosynthesis to generate new pairings of polyketide lactones with sugars, especially 6-deoxyhexoses. Here, we report a successful attempt to alter the aminodeoxyhexose-biosynthetic capacity of Streptomyces fradiae (a producer of tylosin) by importing genes from the erythromycin producer Saccharopolyspora erythraea. The biotransformation of erythromycin D into the desired major component erythromycin A involves two final enzymatic reactions: EryK-catalyzed hydroxylation at the C-12 position of the aglycone and EryG-catalyzed O-methylation at the C-3 position of mycarose. The engineered S. fradiae produced substantial amounts of two potentially useful macrolides that had not previously been obtained by fermentation.

Keywords: Streptomyces fradiae, eryK and eryG genes, tylosin, antibiotics

Procedia PDF Downloads 318
25863 End-User Behavior: Analysis of Their Role and Impacts on Energy Savings Achievements

Authors: Margarida Plana

Abstract:

End-user behavior has become one of the main aspects to be addressed in energy efficiency projects. Especially in the residential sector, end-users have a direct impact on whether energy saving targets are achieved. This paper focuses on presenting and quantifying the impact of end-user behavior on the basis of analysis of real project data. The analysis examines the role of building occupants, how their behavior can change the success of energy efficiency projects, and how to limit their impact. The results show two main conclusions. The first is the easier to address: end-users' interaction with equipment operation needs to be controlled and limited in order to reach the fixed targets. The second: as plugged-in equipment is increasing exponentially in the residential sector, major dissemination campaigns are needed to explain to citizens the impact of their day-to-day actions.

Keywords: end-users impacts, energy efficiency, energy savings, impact limitations

Procedia PDF Downloads 347
25862 Design of an Ultra High Frequency Rectifier for Wireless Power Systems by Using Finite-Difference Time-Domain

Authors: Felipe M. de Freitas, Ícaro V. Soares, Lucas L. L. Fortes, Sandro T. M. Gonçalves, Úrsula D. C. Resende

Abstract:

There is dispersed energy in radio frequencies (RF) that can be reused to power electronic circuits such as sensors, actuators, and identification devices, among other systems, without wire connections or a battery supply. In this context, there are different types of energy harvesting systems, including rectennas, coil systems, graphene, and new materials. A secondary step of an energy harvesting system is the rectification of the collected signal, which may be carried out, for example, by a combination of one or more Schottky diodes connected in series or shunt. In a rectenna-based system, for instance, the diode used must be able to receive low-power signals at ultra-high frequencies; therefore, low values of series resistance, junction capacitance, and potential barrier voltage are required. Due to this low-power condition, voltage multiplier configurations are used, such as voltage doublers or modified bridge converters. A low-pass filter (LPF) at the input, a DC output filter, and a resistive load are also commonly used in rectifier design. Electronic circuit designs are commonly analyzed through simulation in a SPICE (Simulation Program with Integrated Circuit Emphasis) environment. Despite the remarkable potential of SPICE-based simulators for complex circuit modeling and for the analysis of quasi-static electromagnetic field interaction, i.e., at low frequency, these simulators cannot properly model microwave hybrid circuits in which there are both lumped and distributed elements. This work therefore proposes the electromagnetic modeling of electronic components in order to create models suitable for simulating circuits at ultra-high frequencies, with application to rectifiers coupled to antennas, as in energy harvesting systems, that is, in rectennas.
For this purpose, the numerical method FDTD (Finite-Difference Time-Domain) is applied, and SPICE computational tools are used for comparison. Initially, the Ampere-Maxwell equation is applied to the equations of current density and electric field within the FDTD method, together with its circuital relation to the voltage drop across the modeled component, following the LE-FDTD (Lumped-Element Finite-Difference Time-Domain) formulations proposed in the literature for passive components and for the diode. Next, a rectifier meeting the essential requirements for operating rectenna energy harvesting systems is built, and the FDTD results are compared with experimental measurements.
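The leapfrog update at the core of FDTD can be illustrated with a minimal 1-D free-space sketch in normalized units. This is a bare-bones illustration only; the paper's LE-FDTD additionally couples lumped-element current terms (resistors, capacitors, the diode) into the Ampere-Maxwell update, which is omitted here:

```python
# Simplified 1-D free-space FDTD in normalized units (Courant number 1).
# Grid size, step count, and the soft Gaussian source are illustrative.
import math

N = 200
ez = [0.0] * N                       # electric field samples
hy = [0.0] * N                       # magnetic field samples
for t in range(300):
    for k in range(N - 1):           # Faraday's law update
        hy[k] += ez[k + 1] - ez[k]
    for k in range(1, N):            # Ampere-Maxwell update
        ez[k] += hy[k] - hy[k - 1]
    ez[100] += math.exp(-((t - 30) ** 2) / 100.0)   # soft Gaussian source
```

In an LE-FDTD scheme, the Ampere-Maxwell line would gain an extra term for the current through the lumped element occupying that cell, which is how the diode model enters the field update.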

Keywords: energy harvesting system, LE-FDTD, rectenna, rectifier, wireless power systems

Procedia PDF Downloads 121
25861 Thermodynamic Modeling of Three Pressure Level Reheat HRSG, Parametric Analysis and Optimization Using PSO

Authors: Mahmoud Nadir, Adel Ghenaiet

Abstract:

The main purpose of this study is the thermodynamic modeling, parametric analysis, and optimization of a three pressure level reheat HRSG (Heat Recovery Steam Generator) using the PSO method (Particle Swarm Optimization). A parametric analysis followed by a thermodynamic optimization is presented. The chosen objective function is the specific work of the steam cycle, which, in the case of a combined cycle (CC), is a good criterion of thermodynamic performance, in contrast to conventional steam turbines, for which thermal efficiency can also be an important criterion. Technological constraints such as the maximal steam cycle temperature, minimal steam fraction at the steam turbine outlet, maximal steam pressure, minimal stack temperature, minimal pinch point, and maximal superheater effectiveness are also considered. The parametric analysis made it possible to understand the effect of the design parameters and constraints on the variation of the steam cycle specific work. The PSO algorithm was used successfully in the HRSG optimization, and the achieved results are in accordance with those of previous studies in which genetic algorithms were used. Moreover, this method is easy to implement compared with other methods.
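The ease of implementation claimed for PSO is visible in how little code a basic version requires. The sketch below minimizes a toy quadratic standing in for the negative of the steam cycle specific work; the actual HRSG objective and constraints are far richer than this placeholder:

```python
# Minimal particle swarm optimizer (toy objective, not the HRSG model).
# Inertia weight w and acceleration coefficients c1, c2 are common defaults.
import random

random.seed(0)  # reproducible run for this illustration

def pso(objective, bounds, n_particles=20, iters=60, w=0.7, c1=1.5, c2=1.5):
    """Minimize objective over box bounds = [(lo, hi), ...] per dimension."""
    dim = len(bounds)
    xs = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vs = [[0.0] * dim for _ in range(n_particles)]
    pbest = [x[:] for x in xs]
    pbest_f = [objective(x) for x in xs]
    g = min(range(n_particles), key=lambda i: pbest_f[i])
    gbest, gbest_f = pbest[g][:], pbest_f[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vs[i][d] = (w * vs[i][d]
                            + c1 * r1 * (pbest[i][d] - xs[i][d])
                            + c2 * r2 * (gbest[d] - xs[i][d]))
                # clamp to bounds, mimicking the technological constraints
                xs[i][d] = min(max(xs[i][d] + vs[i][d], bounds[d][0]), bounds[d][1])
            f = objective(xs[i])
            if f < pbest_f[i]:
                pbest[i], pbest_f[i] = xs[i][:], f
                if f < gbest_f:
                    gbest, gbest_f = xs[i][:], f
    return gbest, gbest_f

# Toy stand-in: maximizing specific work == minimizing its negative.
best, val = pso(lambda x: (x[0] - 3.0) ** 2 + (x[1] - 1.0) ** 2, [(0, 10), (0, 10)])
```

In the HRSG setting, each particle position would encode the design parameters (pressures, pinch points, etc.) and infeasible points would be penalized or clamped, as the bound handling above suggests.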

Keywords: combined cycle, HRSG thermodynamic modeling, optimization, PSO, steam cycle specific work

Procedia PDF Downloads 372
25860 Evaluation Metrics for Machine Learning Techniques: A Comprehensive Review and Comparative Analysis of Performance Measurement Approaches

Authors: Seyed-Ali Sadegh-Zadeh, Kaveh Kavianpour, Hamed Atashbar, Elham Heidari, Saeed Shiry Ghidary, Amir M. Hajiyavand

Abstract:

Evaluation metrics play a critical role in assessing the performance of machine learning models. In this review paper, we provide a comprehensive overview of performance measurement approaches for machine learning models across supervised, unsupervised, and reinforcement learning. For each category, we discuss the most widely used metrics, including their mathematical formulations and interpretation. Additionally, we provide a comparative analysis of performance measurement approaches for metric combinations. Our review aims to give researchers and practitioners a better understanding of performance measurement approaches and to aid in the selection of appropriate evaluation metrics for their specific applications.
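As a small worked example of the supervised-learning metrics such a review covers, the snippet below computes accuracy, precision, recall, and F1 from a confusion matrix, using their standard definitions (the labels are made up):

```python
# Standard binary classification metrics from scratch.

def confusion(y_true, y_pred):
    """Return (tp, tn, fp, fn) counts for binary labels in {0, 1}."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    return tp, tn, fp, fn

y_true = [1, 0, 1, 1, 0, 1, 0, 0]
y_pred = [1, 0, 1, 0, 0, 1, 1, 0]
tp, tn, fp, fn = confusion(y_true, y_pred)
accuracy = (tp + tn) / len(y_true)
precision = tp / (tp + fp)
recall = tp / (tp + fn)
f1 = 2 * precision * recall / (precision + recall)
```

The choice among these depends on the application: precision when false positives are costly, recall when false negatives are, F1 as a balance, which is precisely the kind of guidance such a comparative review provides.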

Keywords: evaluation metrics, performance measurement, supervised learning, unsupervised learning, reinforcement learning, model robustness and stability, comparative analysis

Procedia PDF Downloads 49
25859 Asymptotic Analysis of the Viscous Flow through a Pipe and the Derivation of the Darcy-Weisbach Law

Authors: Eduard Marusic-Paloka

Abstract:

The Darcy-Weisbach formula is used to compute the pressure drop of a fluid in a pipe due to friction against the wall. Because of its simplicity, the Darcy-Weisbach formula became widely accepted by engineers and is used for laminar as well as turbulent flows through pipes, once methods to compute the friction coefficient were derived, particularly in the second half of the 20th century. The formula is empirical, and our goal is to derive it from basic conservation laws via rigorous asymptotic analysis. We consider the case of laminar flow but with a significant Reynolds number. In the case of a perfectly smooth pipe, the situation is trivial, as the Navier-Stokes system can be solved explicitly via the Poiseuille formula, leading to a friction coefficient of the form 64/Re. For a rough pipe, the situation is more complicated, and effects of the roughness appear in the friction coefficient. We start from the Navier-Stokes system in a pipe with a periodically corrugated wall and derive an asymptotic expansion for the pressure and the velocity, using homogenization techniques and boundary layer analysis. The approximation derived by formal analysis is then justified by a rigorous error estimate in the norm of the appropriate Sobolev space, using the energy formulation and classical a priori estimates for the Navier-Stokes system. Our method leads to a formula for the friction coefficient. The formula involves the resolution of appropriate boundary layer problems, namely boundary value problems for the Stokes system in an infinite band, which need to be solved numerically; however, theoretical analysis characterizing their nature can be done without solving them.
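For reference, the Darcy-Weisbach relation discussed above, together with the laminar friction coefficient recovered from the Poiseuille solution, reads:

```latex
\Delta p \;=\; f\,\frac{L}{D}\,\frac{\rho\,v^{2}}{2},
\qquad
f_{\mathrm{laminar}} \;=\; \frac{64}{\mathrm{Re}},
\qquad
\mathrm{Re} \;=\; \frac{\rho\,v\,D}{\mu},
```

where L is the pipe length, D its diameter, ρ the fluid density, v the mean velocity, and μ the dynamic viscosity. The contribution of the paper is to derive f rigorously for a rough (periodically corrugated) pipe, where wall-roughness corrections enter on top of 64/Re.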

Keywords: Darcy-Weisbach law, pipe flow, rough boundary, Navier law

Procedia PDF Downloads 345
25858 Sensitivity Analysis of Prestressed Post-Tensioned I-Girder and Deck System

Authors: Tahsin A. H. Nishat, Raquib Ahsan

Abstract:

Sensitivity analysis of the design parameters of an optimization procedure can be a significant factor in designing any structural system. The objectives of the study are to analyze the sensitivity of the deck slab thickness parameter obtained from both the conventional and the optimum design methodology of a pre-stressed post-tensioned I-girder and deck system, and to compare its relative significance. For the analysis of the conventional method, the values of 14 design parameters obtained by the conventional iterative design of a real-life I-girder bridge project have been considered. For the analysis of the optimization method, cost optimization of this system has been carried out using the global optimization methodology 'Evolutionary Operation (EVOP)'; the problem, from which the optimum values of the 14 design parameters were obtained, contains 14 explicit constraints and 46 implicit constraints. For both types of design parameters, a sensitivity analysis has been conducted on the deck slab thickness, which can be highly sensitive near the obtained optimum solution. Deviations of the slab thickness above and below its optimum value have been considered, reflecting realistic ranges of variation during construction, with the remaining parameters kept unchanged. For small deviations from the optimum value, compliance with the explicit and implicit constraints was examined, and the variations in cost were estimated. It was found that, without violating any constraint, the deck slab thickness obtained by the conventional method can be increased by up to 25 mm, whereas the slab thickness obtained by cost optimization can be increased by only 0.3 mm. This result suggests that the slab thickness is less sensitive in the conventional design method. Therefore, for realistic design purposes, a sensitivity analysis should be conducted for either design procedure of the girder and deck system.
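The perturbation procedure described above (vary one parameter around its nominal value, hold the others fixed, and record constraint compliance and cost change) can be sketched as follows. The cost model and constraint bounds below are illustrative placeholders, not the study's 14-parameter EVOP formulation:

```python
# Conceptual one-at-a-time sensitivity sketch for a single design parameter
# (slab thickness). Cost and feasibility functions are toy stand-ins.

def cost(thickness_mm):
    """Toy cost model: a fixed cost plus a per-mm material cost."""
    return 120.0 + 0.8 * thickness_mm

def feasible(thickness_mm):
    """Toy explicit constraint: thickness allowed within a band."""
    return 175.0 <= thickness_mm <= 225.0

def sensitivity(nominal, deviations):
    """For each deviation, record (deviation, still feasible?, cost change)."""
    rows = []
    for d in deviations:
        t = nominal + d
        rows.append((d, feasible(t), cost(t) - cost(nominal)))
    return rows

report = sensitivity(nominal=200.0, deviations=[-25, -10, 0, 10, 25])
```

In the actual study, the feasibility check spans all 14 explicit and 46 implicit constraints, and the striking finding is how much narrower the feasible band is around the cost-optimized thickness (0.3 mm) than around the conventionally designed one (25 mm).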

Keywords: sensitivity analysis, optimum design, evolutionary operations, PC I-girder, deck system

Procedia PDF Downloads 127