Search results for: life cycle based analysis
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 48924

38424 Comparing Deep Architectures for Selecting Optimal Machine Translation

Authors: Despoina Mouratidis, Katia Lida Kermanidis

Abstract:

Machine translation (MT) is a very important task in Natural Language Processing (NLP). MT evaluation is crucial in MT development, as it constitutes the means to assess the success of an MT system, and it also helps improve system performance. Several methods have been proposed for the evaluation of MT systems. Some of the most popular ones in automatic MT evaluation are score-based, such as the BLEU score, while others are based on lexical or syntactic similarity between the MT outputs and the reference, involving higher-level information like part-of-speech (POS) tagging. This paper presents a language-independent machine learning framework for classifying pairwise translations. The framework uses vector representations of two machine-produced translations, one from a statistical machine translation (SMT) model and one from a neural machine translation (NMT) model. The vector representations consist of automatically extracted word embeddings and string-like language-independent features. These vector representations are used as input to a multi-layer neural network (NN) that models the similarity between each MT output and the reference, as well as between the two MT outputs. To evaluate the proposed approach, a professional translation and a "ground-truth" annotation are used. The parallel corpora used are English-Greek (EN-GR) and English-Italian (EN-IT), in the educational domain and of informal genres (video lecture subtitles, course forum text, etc.) that are difficult to translate reliably. Three basic deep learning (DL) architectures are tested within this schema: (i) fully-connected dense, (ii) Convolutional Neural Network (CNN), and (iii) Long Short-Term Memory (LSTM). Experiments show that all tested architectures achieved better results than some well-known baseline approaches, such as Random Forest (RF) and Support Vector Machine (SVM). The best accuracy results are obtained when LSTM layers are used in the schema, while the most balanced results across classes are obtained when dense layers are used, because that model correctly classifies more sentences of the minority class (SMT). For a more integrated analysis of the accuracy results, a qualitative linguistic analysis is carried out. In this context, problems have been identified with certain figures of speech, such as metaphors, and with certain linguistic phenomena, such as paronyms. It is quite interesting to investigate why all the classifiers led to worse accuracy results in Italian as compared to Greek, given that the linguistic features employed are language-independent.
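A minimal sketch of the pairwise-classification setup described above, assuming each SMT/NMT/reference triple has already been encoded as a fixed-length feature vector; the dimensions, data, and the dense variant shown are illustrative, not the authors' implementation:

```python
# Sketch (not the authors' code): pairwise classification of SMT vs. NMT
# outputs from concatenated vector representations (embeddings + string
# features), using the fully-connected dense variant of the schema.
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

DIM = 300                          # hypothetical feature-vector length per pair
X = np.random.rand(1000, DIM)      # placeholder [SMT; NMT; reference] features
y = np.random.randint(0, 2, 1000)  # 1 if the NMT output is the better translation

model = keras.Sequential([
    keras.Input(shape=(DIM,)),
    layers.Dense(128, activation="relu"),   # fully-connected dense layers
    layers.Dense(64, activation="relu"),
    layers.Dense(1, activation="sigmoid"),  # binary pairwise decision
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(X, y, epochs=5, batch_size=32, validation_split=0.2)
```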

Keywords: machine learning, machine translation evaluation, neural network architecture, pairwise classification

Procedia PDF Downloads 114
38423 To Cloudify or Not to Cloudify

Authors: Laila Yasir Al-Harthy, Ali H. Al-Badi

Abstract:

As an emerging business model, cloud computing has been initiated to satisfy the needs of organizations and to position Information Technology as a utility. The shift to the cloud has changed the way Information Technology departments are traditionally managed and has raised many concerns for both public and private sectors. The purpose of this study is to investigate the possibility of cloud computing services replacing services provided traditionally by IT departments. Therefore, it aims to 1) explore whether organizations in Oman are ready to move to the cloud; 2) identify the deciding factors leading to the adoption or rejection of cloud computing services in Oman; and 3) provide two case studies, one of a successful cloud provider and another of a successful adopter. This paper is based on multiple research methods, including a set of interviews with cloud service providers and current cloud users in Oman, and data collected through questionnaires from experts in the field and potential users of cloud services. Despite the limited bandwidth capacity and Internet coverage in Oman, which create a challenge in adopting the cloud, it was found that many information technology professionals are encouraged to move to the cloud, while a few are resistant to change. The recent launch of a new Omani cloud service provider and the entrance of other international cloud service providers into the Omani market make this research extremely valuable, as it aims to provide real-life experience as well as two case studies on the successful provision of cloud services and the successful adoption of these services.

Keywords: cloud computing, cloud deployment models, cloud service models, deciding factors

Procedia PDF Downloads 280
38422 Multivariate Rainfall Disaggregation Using MuDRain Model: Malaysia Experience

Authors: Ibrahim Suliman Hanaish

Abstract:

Disaggregation of daily rainfall using stochastic models formulated on a multivariate approach (MuDRain) is discussed in this paper. Seven rain gauge stations in Peninsular Malaysia are considered in this study, at distances from the reference station ranging from 4 km to 160 km. The hourly rainfall data used cover the period from 1973 to 2008, and the months of July and November are considered as examples of dry and wet periods, respectively. The cross-correlation among the rain gauges is estimated either from the available hourly rainfall information at the neighboring stations or, where such information is unavailable, from daily rainfall. This paper discusses the applicability of the MuDRain model for disaggregating daily rainfall into hourly rainfall for both sources of cross-correlation. The goodness of fit of the model was based on the reproduction of fitting statistics such as the means, variances, coefficients of skewness, lag-zero cross-correlation coefficients, and lag-one autocorrelation coefficients. It was found that correlation coefficients extracted from daily rainfall are slightly higher than correlations based on the available hourly rainfall, especially for neighboring stations no more than 28 km apart. The results also showed that the MuDRain model did not reproduce the statistics very well, and that the synthetic hourly rainfall data reproduced the actual hyetographs poorly. Meanwhile, a good fit was shown between the distribution functions of the historical and synthetic hourly rainfall. These discrepancies are unavoidable because of the low cross-correlation of hourly rainfall. The overall performance indicated that the MuDRain model would not be an appropriate choice for disaggregating daily rainfall.
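An illustrative sketch of the goodness-of-fit check described above, computing the same fitting statistics for an observed and a synthetic hourly series; the gamma-distributed series are placeholders, not the Malaysian gauge data:

```python
# Illustrative sketch (not the MuDRain implementation): the fitting statistics
# used to compare observed and synthetic hourly rainfall.
import numpy as np
from scipy.stats import skew

def fitting_stats(obs, syn):
    """Mean, variance, skewness, lag-1 autocorrelation, lag-0 cross-correlation."""
    stats = {}
    for name, x in (("observed", obs), ("synthetic", syn)):
        stats[name] = {
            "mean": np.mean(x),
            "variance": np.var(x),
            "skewness": skew(x),
            "lag1_autocorr": np.corrcoef(x[:-1], x[1:])[0, 1],
        }
    stats["lag0_crosscorr"] = np.corrcoef(obs, syn)[0, 1]
    return stats

# Hypothetical hourly series for one wet month at a gauge station
rng = np.random.default_rng(0)
obs = rng.gamma(0.3, 2.0, size=24 * 30)
syn = rng.gamma(0.3, 2.0, size=24 * 30)
print(fitting_stats(obs, syn))
```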

Keywords: rainfall disaggregation, multivariate disaggregation rainfall model, correlation, stochastic model

Procedia PDF Downloads 500
38421 Studying Projection Distance and Flow Properties by Shape Variations of Foam Monitor

Authors: Hyun-Kyu Cho, Jun-Su Kim, Choon-Geun Huh, Geon Lee Young-Chul Park

Abstract:

In this study, the relationship between flow properties and fluid projection distance is investigated for shape variations of a foam monitor. A numerical analysis technique for the fluid analysis of a foam monitor was developed for this prediction. Since the flow path of the fluid depends on the shape of the foam monitor, the fluid losses for each shape were calculated from the flow analysis results. The modified designs comprised a model with an increased flow-path length and a model with a straightened flow path. The inlet pressure was 7 bar and the outlet was at atmospheric condition. The results showed that both the increased-length model and the straightened model improved the nozzle projection distance.

Keywords: injection performance, finite element method, foam monitor, Projection distance

Procedia PDF Downloads 332
38420 Permeable Reactive Pavement for Controlling the Transport of Benzene, Toluene, Ethyl-Benzene, and Xylene (BTEX) Contaminants

Authors: Shengyi Huang, Chenju Liang

Abstract:

Volatile organic compounds such as benzene, toluene, ethyl-benzene, and xylene (BTEX) are common contaminants in the environment, which can come from asphalt concrete or vehicle exhaust emissions. BTEX may invade the subsurface environment via wet and dry atmospheric deposition. Without effective ways of controlling their fate and transport, these contaminants can extensively harm the natural environment. In the 1st phase of this study, various adsorbents were screened to select a suitable one as an additive in the porous asphalt mixture. In the 2nd phase, the selected adsorbent was incorporated into the design of porous asphalt concrete (PAC) to produce the permeable reactive pavement (PRP), which was subsequently tested in the 3rd phase for its potential to adsorb aqueous BTEX as compared to the PAC. The PRP was prepared according to the following steps: firstly, the suitable adsorbent was chosen based on the analytical results of specific surface area analysis, thermogravimetric analysis, adsorption kinetics and isotherms, and thermodynamic analysis; secondly, the materials of coarse aggregate, fine aggregate, filler, asphalt, and fiber were tested in order to meet regulated specifications (e.g., water adsorption, soundness, viscosity, etc.) for preparing the PRP; thirdly, the amount of adsorbent additive in the PRP was determined; fourthly, the prepared PAC and PRP were examined for their physical properties (e.g., abrasion loss, drain-down loss, Marshall stability, Marshall flow, dynamic stability, etc.). As a result, the PRP showed better physical performance than the traditional PAC. Finally, Marshall specimen column tests were conducted to explore the adsorption capacities of the PAC and PRPs. The BTEX adsorption capacities of the PRPs are higher than those obtained from the traditional PAC. In summary, the PRPs showed superior physical performance and adsorption capacities, which exhibit the potential of the PRP to be applied as a replacement for PAC to better control the transport of non-point source pollutants.
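As one illustration of the adsorbent-screening step, a sketch of fitting a Langmuir isotherm to batch equilibrium data; the data points and fitted parameters are hypothetical, not the study's measurements:

```python
# Illustrative sketch (assumed workflow, not the authors' code): fitting a
# Langmuir isotherm to batch adsorption data during adsorbent screening.
import numpy as np
from scipy.optimize import curve_fit

def langmuir(Ce, qmax, KL):
    """qe = qmax*KL*Ce / (1 + KL*Ce); Ce in mg/L, qe in mg/g."""
    return qmax * KL * Ce / (1.0 + KL * Ce)

# Hypothetical equilibrium data for benzene on a candidate adsorbent
Ce = np.array([0.5, 1.0, 2.0, 5.0, 10.0, 20.0])   # equilibrium conc., mg/L
qe = np.array([1.8, 3.1, 4.9, 7.6, 9.2, 10.4])    # adsorbed amount, mg/g

(qmax, KL), _ = curve_fit(langmuir, Ce, qe, p0=(10.0, 0.5))
print(f"qmax = {qmax:.2f} mg/g, KL = {KL:.3f} L/mg")
```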

Keywords: porous asphalt concrete, volatile organic compounds, permeable reactive pavement, non-point source pollution

Procedia PDF Downloads 198
38419 Alternative of Lead-Based Ionization Radiation Shielding Property: Epoxy-Based Composite Design

Authors: Md. Belal Uudin Rabbi, Sakib Al Montasir, Saifur Rahman, Niger Nahid, Esmail Hossain Emon

Abstract:

The practice of radiation shielding protects against the detrimental effects of ionizing radiation by inserting a shield of absorbing material between the radioactive source and the region to be protected. Shielding is a primary concern in several industrial fields that use potent (high-activity) radioisotopes, such as food preservation, cancer treatment, and particle accelerator facilities, and it is essential for users of radiation-emitting equipment to reduce or mitigate radiation damage. Polymer composites (especially epoxy-based ones) with high-atomic-number fillers can replace toxic lead in ionizing radiation shielding applications because of their excellent mechanical properties, superior solvent and chemical resistance, good dimensional stability, strong adhesion, and lower toxicity. Being lightweight and offering neutron shielding ability of almost the same order as concrete, epoxy-based radiation shielding is a promising alternative. Micro- and nano-particle fillers in the epoxy resin increase the radiation shielding property of the epoxy matrix. Considerable attention has recently been paid to polymeric composites as radiation shielding materials. This research will examine the radiation shielding performance of epoxy-based nano-WO3 reinforced composites. The samples will be prepared using the direct pouring method to block radiation.
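For orientation, a back-of-envelope sketch of the attenuation arithmetic such shields are designed around; the attenuation coefficient used here is a hypothetical placeholder, not a measured value for the WO3/epoxy composite:

```python
# Narrow-beam exponential attenuation: I = I0 * exp(-mu * x).
# The coefficient below is hypothetical, for illustration only.
import math

def transmitted_fraction(mu_cm_inv, thickness_cm):
    """Fraction of incident photons transmitted through a shield."""
    return math.exp(-mu_cm_inv * thickness_cm)

mu = 0.8  # hypothetical linear attenuation coefficient, 1/cm
for x in (0.5, 1.0, 2.0):
    print(f"{x:.1f} cm: {transmitted_fraction(mu, x):.1%} transmitted")
# Half-value layer: the thickness that halves the intensity, HVL = ln(2)/mu
print(f"HVL = {math.log(2) / mu:.2f} cm")
```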

Keywords: radiation shielding materials, ionizing radiation, epoxy resin, Tungsten oxide, polymer composites

Procedia PDF Downloads 93
38418 Simulation-Based Unmanned Surface Vehicle Design Using PX4 and Robot Operating System With Kubernetes and Cloud-Native Tooling

Authors: Norbert Szulc, Jakub Wilk, Franciszek Górski

Abstract:

This paper presents an approach for simulating and testing robotic systems based on PX4, using a local Kubernetes cluster. The approach leverages modern cloud-native tools and runs on single-board computers. Additionally, this solution enables the creation of datasets for computer vision and the evaluation of control system algorithms in an end-to-end manner. The paper compares this approach with the commonly used Docker-based approach. The approach was used to develop a simulation environment for an unmanned surface vehicle (USV) for RoboBoat 2023 by running a containerized configuration of the PX4 open-source autopilot connected to ROS and the Gazebo simulation environment.

Keywords: cloud computing, Kubernetes, single board computers, simulation, ROS

Procedia PDF Downloads 62
38417 Examining Pre-Consumer Textile Waste Recycling, Barriers to Implementation, and Participant Demographics: A Review of Literature

Authors: Madeline W. Miller

Abstract:

The global textile industry produces pollutants in the form of liquid discharge, solid waste, and emissions into the natural environment. Textile waste resulting from garment production and other manufacturing processes makes a significant contribution to the amount of waste landfilled globally. While the majority of curbside and other convenient recycling methods cater to post-consumer paper and plastics, pre-consumer textile waste is often discarded with trash and is commonly classified as 'other' in municipal solid waste breakdowns. On a larger scale, many clothing manufacturers and other companies utilizing textiles have not yet identified or begun using the most sustainable methods for discarding their post-industrial, pre-consumer waste. To lessen the amount of waste sent to landfills, there are post-industrial, pre-consumer textile waste recycling methods that can be used to give textiles a new life. This process requires that textile and garment manufacturers redirect their waste to companies that use industrial machinery to shred or fiberize these materials in preparation for their second life. The goal of this literature review is to identify the recycling and reuse challenges faced by producers within the clothing and textile industry that prevent these companies from utilizing the described recycling methods, causing them to opt for landfill. The literature analyzed in this review reflects manufacturer sentiments toward waste disposal and recycling. The results of this review indicate that the cost of logistics is the determining factor in whether companies recycle their pre-consumer textile waste, and that the most applicable and successful textile waste recycling methods require a company separate from the manufacturer to account for waste production, provide receptacles for waste, arrange waste transport, and identify a secondary use for the material at a price point below that of traditional waste disposal service.

Keywords: leadership demographics, post-industrial textile waste, pre-consumer textile waste, industrial shoddy

Procedia PDF Downloads 132
38416 Model-Driven and Data-Driven Approaches for Crop Yield Prediction: Analysis and Comparison

Authors: Xiangtuo Chen, Paul-Henry Cournéde

Abstract:

Crop yield prediction is a paramount issue in agriculture. The main idea of this paper is to find an efficient way to predict the yield of corn based on meteorological records. The prediction models used in this paper can be classified into model-driven approaches and data-driven approaches, according to their different modeling methodologies. The model-driven approaches are based on crop mechanistic modeling. They describe crop growth in interaction with the environment as dynamical systems. However, the calibration of the dynamical system presents much difficulty, because it turns out to be a multidimensional non-convex optimization problem. An original contribution of this paper is to propose a statistical methodology, Multi-Scenarios Parameters Estimation (MSPE), for the parametrization of potentially complex mechanistic models from a new type of dataset (climatic data and final yield in many situations). It is tested with CORNFLO, a crop model for maize growth. On the other hand, the data-driven approach for yield prediction is free of the complex biophysical process but places strict requirements on the dataset. A second contribution of the paper is the comparison of these model-driven methods with classical data-driven methods. For this purpose, we consider two classes of regression methods: methods derived from linear regression (Ridge and Lasso Regression, Principal Components Regression or Partial Least Squares Regression) and machine learning methods (Random Forest, k-Nearest Neighbor, Artificial Neural Network and SVM regression). The dataset consists of 720 records of corn yield at county scale provided by the United States Department of Agriculture (USDA) and the associated climatic data. A 5-fold cross-validation process and two accuracy metrics, root mean square error of prediction (RMSEP) and mean absolute error of prediction (MAEP), were used to evaluate the crop prediction capacity. The results show that among the data-driven approaches, Random Forest is the most robust and generally achieves the best prediction error (MAEP 4.27%). It also outperforms our model-driven approach (MAEP 6.11%). However, the method for calibrating the mechanistic model from easily accessible datasets offers several side perspectives. The mechanistic model can potentially help to identify the stresses suffered by the crop or the biological parameters of interest for breeding purposes. For this reason, an interesting perspective is to combine these two types of approaches.
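A sketch of the data-driven branch as described above (5-fold cross-validation of a Random Forest scored with RMSEP and MAEP); the predictor matrix and yields below are synthetic placeholders, not the USDA county data:

```python
# Sketch of the data-driven setup (assumed shapes, not the authors' code).
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import KFold
from sklearn.metrics import mean_squared_error, mean_absolute_error

rng = np.random.default_rng(0)
X = rng.random((720, 12))                               # placeholder climatic predictors
y = 5.0 + X @ rng.random(12) + rng.normal(0, 0.1, 720)  # placeholder county yields

rmsep, maep = [], []
for train, test in KFold(n_splits=5, shuffle=True, random_state=0).split(X):
    model = RandomForestRegressor(n_estimators=200, random_state=0)
    model.fit(X[train], y[train])
    pred = model.predict(X[test])
    rmsep.append(np.sqrt(mean_squared_error(y[test], pred)))
    maep.append(mean_absolute_error(y[test], pred))
print(f"RMSEP = {np.mean(rmsep):.3f}, MAEP = {np.mean(maep):.3f}")
```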

Keywords: crop yield prediction, crop model, sensitivity analysis, paramater estimation, particle swarm optimization, random forest

Procedia PDF Downloads 219
38415 Regeneration of Geological Models Using Support Vector Machine Assisted by Principal Component Analysis

Authors: H. Jung, N. Kim, B. Kang, J. Choe

Abstract:

History matching is a crucial procedure for predicting reservoir performances and making future decisions. However, it is difficult due to uncertainties in the initial reservoir models. Therefore, it is important to have reliable initial models for successful history matching of highly heterogeneous reservoirs such as channel reservoirs. In this paper, we propose a novel scheme for regenerating geological models using support vector machine (SVM) and principal component analysis (PCA). First, we perform PCA to identify the main geological characteristics of the models. Through this procedure, the permeability values of each model are transformed into new parameters by the principal components with eigenvalues of large magnitude. Secondly, the parameters are projected onto a two-dimensional plane by multi-dimensional scaling (MDS) based on Euclidean distances. Finally, we train an SVM classifier using the 20% of models that show the most similar or dissimilar well oil production rates (WOPR) with respect to the true values (10% for each). Then, the remaining 80% of models are classified by the trained SVM, and we select the models on the side of low WOPR errors. One hundred channel reservoir models are initially generated by single normal equation simulation. By repeating the classification process, we can select models that have a geological trend similar to that of the true reservoir model. The average field of the selected models is utilized as a probability map for regeneration. Newly generated models preserve correct channel features and exclude wrong geological properties while maintaining suitable uncertainty ranges. History matching with the initial models cannot provide trustworthy results; it fails to identify the correct geological features of the true model. However, history matching with the regenerated ensemble offers reliable characterization results by capturing the proper channel trend. Furthermore, it gives dependable predictions of future performance with reduced uncertainties. We propose a novel classification scheme that integrates PCA, MDS, and SVM for regenerating reservoir models. The scheme can easily sort out reliable models that have a channel trend similar to the reference in the lowered-dimension space.
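A minimal sketch of the PCA-MDS-SVM selection pipeline as described (the model count and 20%/80% split follow the abstract; the permeability fields and WOPR errors are random placeholders):

```python
# Sketch of the selection pipeline (assumed shapes, not the authors' code).
import numpy as np
from sklearn.decomposition import PCA
from sklearn.manifold import MDS
from sklearn.svm import SVC

rng = np.random.default_rng(0)
perm = rng.random((100, 50 * 50))   # 100 channel models, flattened permeability
wopr_err = rng.random(100)          # placeholder WOPR error vs. the true model

scores = PCA(n_components=10, random_state=0).fit_transform(perm)
xy = MDS(n_components=2, random_state=0).fit_transform(scores)  # Euclidean MDS

# Label the 10% most similar (0) and 10% most dissimilar (1) models, train SVM
order = np.argsort(wopr_err)
train_idx = np.concatenate([order[:10], order[-10:]])
labels = np.array([0] * 10 + [1] * 10)
svm = SVC(kernel="rbf").fit(xy[train_idx], labels)

# Classify the remaining 80% and keep those on the low-error side
rest = order[10:-10]
selected = rest[svm.predict(xy[rest]) == 0]
print(f"{len(selected)} models selected for regeneration")
```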

Keywords: history matching, principal component analysis, reservoir modelling, support vector machine

Procedia PDF Downloads 146
38414 Novel GPU Approach in Predicting the Directional Trend of the S&P500

Authors: A. J. Regan, F. J. Lidgey, M. Betteridge, P. Georgiou, C. Toumazou, K. Hayatleh, J. R. Dibble

Abstract:

Our goal is the development of an algorithm capable of predicting the directional trend of the Standard and Poor's 500 index (S&P 500). Extensive research has been published attempting to predict different financial markets using historical data, testing on an in-sample and trend basis, with many authors employing excessively complex mathematical techniques. In reviewing and evaluating these in-sample methodologies, it became evident that this approach was unable to achieve sufficiently reliable prediction performance for commercial exploitation. For these reasons, we moved to an out-of-sample strategy based on linear regression analysis of an extensive set of financial data correlated with historical closing prices of the S&P 500. We are pleased to report a directional trend accuracy of greater than 55% for tomorrow (t+1) in predicting the S&P 500.
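An illustrative out-of-sample sketch of the kind of rolling linear-regression test described; the indicators and returns are synthetic placeholders, not the authors' GPU implementation or data:

```python
# Rolling out-of-sample linear regression scored by directional accuracy.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
n, k = 1500, 8
X = rng.normal(size=(n, k))                  # placeholder financial indicators
ret = 0.1 * X[:, 0] + rng.normal(0, 1.0, n)  # next-day S&P 500 return proxy

window, hits, total = 500, 0, 0
for t in range(window, n):
    model = LinearRegression().fit(X[t - window:t], ret[t - window:t])
    pred = model.predict(X[t:t + 1])[0]      # strictly out-of-sample forecast
    hits += int(np.sign(pred) == np.sign(ret[t]))
    total += 1
print(f"directional accuracy: {hits / total:.1%}")
```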

Keywords: financial algorithm, GPU, S&P 500, stock market prediction

Procedia PDF Downloads 338
38413 Effect of Architecture and Operating Conditions of Vehicle on Bulb Lifetime in Automotive

Authors: Hatice Özbek, Caner Çil, Ahmet Rodoplu

Abstract:

Automotive lighting is a leading function in the configuration of vehicle architecture. Headlights and taillights in particular, among the external lighting functions, are among the structures that determine the stylistic character of the vehicle. At the same time, the fact that lighting functions are related to many other functions brings difficulties in design. Customers expect maximum quality from the vehicle. In these circumstances, it is necessary to produce designs that keep the performance of bulbs, which have limited working lives, at the highest level. In this study, the factors that influence the working lives of filament lamps were examined, and bulb explosions that could occur sooner than anticipated were prevented while the vehicle was still in the design phase by determining their relations with electrical, dynamic, and static variables. The filaments of the bulbs used in the front lighting of the vehicle, in particular, deform in a shorter time due to the high voltage requirement. In addition, rear lighting lamps vibrate as a result of the tailgate opening and closing, which exposes the filaments to high stress. In this study, the findings that cause bulb explosions were evaluated. Among the most important findings are: (1) the structure of the cables leading to the lighting functions of the vehicle and the effect of the voltage values; (2) the effect of vibration on the bulb throughout the life of the vehicle; and (3) the effect of the loads imposed on the bulb while the vehicle doors are opened and closed. At the end of the study, maximum bulb lifetime performance was established through optimum changes made in the vehicle architecture based on the findings obtained.

Keywords: vehicle architecture, automotive lighting functions, filament lamps, bulb lifetime

Procedia PDF Downloads 142
38412 An Examination of the Effectiveness of iPad-Based Augmentative and Alternative Intervention on Acquisition, Generalization and Maintenance of the Requesting Information Skills of Children with Autism

Authors: Amaal Almigal

Abstract:

Technology has been argued to offer distinct advantages and benefits for teaching children with autism spectrum disorder (ASD) to communicate. One aspect of this technology is augmentative and alternative communication (AAC) systems, such as picture exchange or speech-generating devices. Whilst there has been significant progress in teaching these children to request their wants and needs with AAC, there remains a need for developing technologies that can really make a difference in teaching them to ask questions. iPad-based AAC can be effective for communication; however, the effectiveness of this type of AAC in teaching children to ask questions needs to be examined. Thus, to examine the effectiveness of iPad-based AAC in teaching children with ASD to ask questions, this research will test whether the iPad leads to more learning than a traditional approach using picture and text cards does. Two groups of children who use AAC will be taught to ask 'What is it?' questions. With the first group, low-tech AAC picture and text cards will be used, while an iPad-based AAC application called Proloquo2Go will be used with the second group. Interviews with teachers and parents will be conducted before and after the experiment. The children's perspectives will also be considered. The initial outcomes of this research indicate that the iPad can be an effective tool to help children with autism to ask questions.

Keywords: autism, communication, information, iPad, pictures, requesting

Procedia PDF Downloads 254
38411 Development of Multi-Leaf Collimator-Based Isocenter Verification Tool Using Electrical Portal Imaging Device for Stereotactic Radiosurgery

Authors: Panatda Intanin, Sangutid Thongsawad, Chirapha Tannanonta, Todsaporn Fuangrod

Abstract:

Stereotactic radiosurgery (SRS) is a high-precision delivery technique that requires comprehensive quality assurance (QA) tests prior to treatment delivery. The isocenter of the delivery beam plays a critical role in treatment accuracy. The uncertainty of the isocenter is traditionally assessed using circular cone equipment, a Winston-Lutz (WL) phantom, and film. This technique is considered time-consuming and highly dependent on the observer. In this work, the development of a multileaf collimator (MLC)-based isocenter verification tool using an electronic portal imaging device (EPID) was proposed and evaluated. In the conventional WL test method, a ball bearing of 5 mm diameter provides the mechanical isocenter alignment and a circular cone of 10 mm diameter fixed to the gantry head defines the radiation field. This conventional setup was compared with the proposed setup, which uses the MLC (10 x 10 mm) instead of the cone to define the radiation field; this represents a more realistic delivery field than circular cone equipment. Image acquisition with the EPID and radiographic film was performed in both experiments. The gantry angles were set as follows: 0°, 90°, 180° and 270°. A software tool was developed in-house using MATLAB/Simulink to determine the centroid of the radiation field and the shadow of the WL phantom automatically, which provides higher accuracy than manual measurement. The deviations between the centroids of the cone-based and MLC-based WL tests were quantified. Comparing film and EPID images, the deviation over all gantry angles was 0.26 ± 0.19 mm for the cone-based and 0.43 ± 0.30 mm for the MLC-based WL tests. The absolute deviation between the cone-based and MLC-based WL tests was 0.59 ± 0.28 mm on EPID images and 0.14 ± 0.13 mm on film images. Therefore, MLC-based isocenter verification using the EPID presents a highly sensitive tool for SRS QA.
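A sketch of the automatic centroid step; the authors implemented this in MATLAB/Simulink, so this Python equivalent (with a hypothetical pixel size and a toy image) is illustrative only:

```python
# Centroids of the radiation field and the ball-bearing (BB) shadow from a
# thresholded EPID image; thresholds and pixel size are assumed values.
import numpy as np
from scipy import ndimage

def wl_deviation(epid_image, pixel_mm=0.5):
    """Field-to-shadow centroid deviation in mm (hypothetical pixel size)."""
    bright = epid_image > 0.5 * epid_image.max()
    field = ndimage.binary_fill_holes(bright)  # field outline incl. BB shadow
    shadow = field & ~bright                   # dark ball-bearing shadow
    cy_f, cx_f = ndimage.center_of_mass(field)
    cy_s, cx_s = ndimage.center_of_mass(shadow)
    return pixel_mm * np.hypot(cy_f - cy_s, cx_f - cx_s)

# Toy image: a bright MLC field with a dark BB shadow slightly off-center
img = np.zeros((200, 200))
img[80:120, 80:120] = 1.0
yy, xx = np.ogrid[:200, :200]
img[(yy - 101) ** 2 + (xx - 99) ** 2 < 5 ** 2] = 0.05
print(f"deviation = {wl_deviation(img):.2f} mm")
```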

Keywords: isocenter verification, quality assurance, EPID, SRS

Procedia PDF Downloads 136
38410 Optimization of Electric Vehicle (EV) Charging Station Allocation Based on Multiple Data - Taking Nanjing (China) as an Example

Authors: Yue Huang, Yiheng Feng

Abstract:

Due to global pressure on climate and energy, many countries are vigorously promoting electric vehicles and building public charging facilities. Faced with the supply-demand gap of existing electric vehicle charging stations and unreasonable space usage in China, this paper takes the central city of Nanjing as an example, establishes a site selection model through multivariate data integration, conducts a multiple linear regression analysis in SPSS, gives quantitative site selection results, and provides optimization models and suggestions for charging station layout planning.
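A sketch of the regression step in such a site-selection model; the factor names, data, and coefficients are hypothetical (the paper performed this analysis in SPSS):

```python
# Multiple linear regression of charging demand on candidate-site factors.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
# Hypothetical factors per candidate grid cell: population density, POI count,
# road density, distance to nearest existing station
X = rng.random((500, 4))
demand = 2.0 * X[:, 0] + 1.5 * X[:, 1] + rng.normal(0, 0.2, 500)

reg = LinearRegression().fit(X, demand)
print("coefficients:", np.round(reg.coef_, 2),
      "R^2 =", round(reg.score(X, demand), 3))
```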

Keywords: electric vehicle, charging station, allocation optimization, urban mobility, urban infrastructure, nanjing

Procedia PDF Downloads 77
38409 Social Crises and Its Impact on the Environment: Case Study of Jos, Plateau State

Authors: A. B. Benshak, M. G. Yilkangnha, V. Y. Nanle

Abstract:

Social crises and violent conflict can inflict direct (short-term) impacts on the environment, such as poisoning water bodies, climate change, deforestation, and destroying the chemical composition of the soil due to the chemical and biological weapons used. They can also impact the environment indirectly (long-term), e.g., through the destruction of the political and economic infrastructure needed to manage environmental resources, the breakdown of traditional conservation practices, and population displacement and refugee flows, which put pressure on already inadequate resources, infrastructure, facilities, amenities, services, etc. This study therefore examines the impact of social crises on the environment in Jos, Plateau State, with emphasis on the long-term impact, analyzes the relationship between crises and the environment, and assesses people's perception of social crises, because much work has concentrated on other repercussions, such as the economy and health, that are more politically expedient. The data for this research were collected mostly through interviews, questionnaires, dailies, and reports on the subject matter. The data and findings were presented in tables, and the results showed that the environment is directly and indirectly impacted by crises and that these impacts can in turn result in a continuous cycle of violent activities if not addressed, because of the inadequacies in the supply of infrastructural facilities, resources, and so on caused by the inflow of displaced population. Recommendations were made on providing security to minimize conflict occurrences in Jos and its environs, minimizing the impact of social crises on the environment, providing adequate infrastructural facilities to cater for population rise, and renewal and regeneration schemes, which will go a long way in mitigating the impact of crises on the environment.

Keywords: environment, impact, long-term, social crises

Procedia PDF Downloads 325
38408 Crack Propagation in Concrete Gravity Dam

Authors: Faramarz Khoshnoudian

Abstract:

A seismic stability assessment of a concrete gravity dam was performed. Initially (Phase 1), a linear response spectrum analysis was performed to verify the potential for crack formation. The result shows the possibility of cracks developing in the upstream face of the dam close to the lowest gallery, which would be sufficiently long that the dam would not be stable following the earthquake. The results indicate that the dam has potentially inadequate seismic and post-earthquake resistance, and an update of the stability analysis is recommended.

Keywords: crack propgation, concrete gravity dam, seismic, assesment

Procedia PDF Downloads 54
38407 Aerothermal Analysis of the Brazilian 14-X Hypersonic Aerospace Vehicle at Mach Number 7

Authors: Felipe J. Costa, João F. A. Martos, Ronaldo L. Cardoso, Israel S. Rêgo, Marco A. S. Minucci, Antonio C. Oliveira, Paulo G. P. Toro

Abstract:

The Prof. Henry T. Nagamatsu Laboratory of Aerothermodynamics and Hypersonics at the Institute for Advanced Studies designed the Brazilian 14-X Hypersonic Aerospace Vehicle, a technological demonstrator endowed with two innovative technologies: waverider technology, which obtains lift from the conical shockwave during hypersonic flight; and a hypersonic airbreathing propulsion system called a scramjet, based on supersonic combustion, to perform flights in Earth's atmosphere at 30 km altitude at Mach numbers 7 and 10. The scramjet is an aeronautical engine without moving parts that promotes compression and deceleration of the freestream atmospheric air at the inlet through the conical/oblique shockwaves generated during hypersonic flight. During high-speed flight, the shock waves and viscous forces yield the phenomenon called aerodynamic heating, whose physical meaning is the friction between the fluid filaments and the body, or the compression at the stagnation regions of the leading edge, which converts kinetic energy into heat within a thin layer of air that blankets the body. The temperature of this layer increases with the square of the speed. This high temperature is concentrated in the boundary layer, from which heat flows readily to the structure of the hypersonic aerospace vehicle. The Fay-Riddell and Eckert methods are applied to the stagnation point and to the flat plate segments, respectively, in order to calculate the aerodynamic heating. Building on the aerodynamic heating, it is important to analyze the heat conduction into the 14-X waverider internal structure. ANSYS Workbench software provides the thermal numerical analysis, using the finite element method, of the unpowered 14-X waverider scramjet at 30 km altitude at Mach numbers 7 and 10 in terms of temperature and heat flux. Finally, it is possible to verify whether the internal temperature complies with the requirements for embedded systems and whether modifications to the structure, in terms of wall thickness and materials, are necessary.
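As a quick check on the statement that the layer temperature grows with the square of the speed, a sketch computing the adiabatic stagnation temperature at the stated flight conditions, assuming a standard-atmosphere static temperature near 30 km:

```python
# Adiabatic stagnation temperature: T0 = T * (1 + (gamma - 1)/2 * M^2).
gamma = 1.4    # ratio of specific heats for air
T_inf = 226.5  # K, static temperature near 30 km altitude (standard atmosphere)
for mach in (7.0, 10.0):
    T0 = T_inf * (1.0 + 0.5 * (gamma - 1.0) * mach**2)
    print(f"Mach {mach:.0f}: T0 ~ {T0:.0f} K")
# Mach 7 gives roughly 2450 K; actual wall temperatures are lower (real-gas
# effects, radiation, conduction), which is why the FEM thermal analysis is needed.
```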

Keywords: aerodynamic heating, hypersonic, scramjet, thermal analysis

Procedia PDF Downloads 432
38406 Intersubjectivity of Forensic Handwriting Analysis

Authors: Marta Nawrocka

Abstract:

In each legal proceeding in which expert evidence is presented, a major concern is the assessment of the evidential value of expert reports. Judicial institutions, while making decisions, rely heavily on expert reports because they usually do not possess the 'special knowledge' from certain fields of science, which makes it impossible for them to verify the results presented in the proceedings. In handwriting studies, standards of analysis have been developed. They unify the procedures used by experts in comparing signs and in constructing expert reports. However, the methods used by experts are usually of a qualitative nature. They rely on the application of the expert's knowledge and experience and, in effect, leave a significant margin of discretion in the assessment. Moreover, the standards used by experts are still not very precise, and the process of reaching conclusions is poorly understood. The above-mentioned circumstances indicate that expert opinions in the field of handwriting analysis may, for many reasons, not be sufficiently reliable. It is assumed that this state of affairs has its source in the very low level of intersubjectivity of the measuring scales and analysis procedures that constitute elements of this kind of analysis. Intersubjectivity is a feature of cognition which (in relation to methods) indicates the degree of consistency of results that different people obtain using the same method. The higher the level of intersubjectivity, the more reliable and credible the method can be considered. The aim of the conducted research was to determine the degree of intersubjectivity of the methods used by experts in handwriting analysis. Thirty experts took part in the study, and each of them received two signatures, with varying degrees of readability, for analysis. Their task was to distinguish the graphic characteristics in each signature, estimate the evidential value of the characteristics found, and estimate the evidential value of the signature as a whole. The obtained results were compared with each other using Krippendorff's alpha statistic, which numerically determines the degree of agreement of the results (assessments) that different people obtain under the same conditions using the same method. Estimating the degree of agreement of the experts' results for each of these tasks made it possible to determine the degree of intersubjectivity of the studied method. The study showed that, during the analysis, the experts identified different signature characteristics and attributed different evidential values to them. In this scope, intersubjectivity turned out to be low. In addition, it turned out that the experts named and described the same characteristics in various ways, and the language used was often inconsistent and imprecise. Thus, significant differences were noted in the language and nomenclature applied. On the other hand, the experts attributed a similar evidential value to the entire signature (the set of characteristics), which indicates that, in this range, they were relatively consistent.
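A minimal sketch of the agreement computation, using the PyPI krippendorff package; the rating matrix below is hypothetical, not the study's data:

```python
# Krippendorff's alpha over expert ratings: each row is one expert's
# evidential-value scores for the same set of signature characteristics.
import numpy as np
import krippendorff

ratings = np.array([   # 3 experts x 5 characteristics, np.nan = not assessed
    [3, 4, 2, 5, np.nan],
    [3, 3, 2, 4, 4],
    [2, 4, 1, 4, 5],
], dtype=float)

alpha = krippendorff.alpha(reliability_data=ratings,
                           level_of_measurement="interval")
print(f"Krippendorff's alpha = {alpha:.3f}")  # low values -> low intersubjectivity
```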

Keywords: forensic sciences experts, handwriting analysis, inter-rater reliability, reliability of methods

Procedia PDF Downloads 138
38405 The Biosphere as a Supercomputer Directing and Controlling Evolutionary Processes

Authors: Igor A. Krichtafovitch

Abstract:

Evolutionary processes are not linear. Long periods of quiet and slow development turn into rather rapid emergences of new species and even phyla. During the Cambrian explosion, 22 new phyla were added to the 3 previously existing phyla. Contrary to common belief, natural selection, or survival of the fittest, cannot account for the dominant evolution vector, which is the steady and accelerating advent of more complex and more intelligent living organisms. Neither Darwinism nor alternative concepts, including panspermia and intelligent design, propose a satisfactory solution for these phenomena. The proposed hypothesis offers a logical and plausible explanation of evolutionary processes in general. It is based on two postulates: a) the Biosphere is a single living organism, all parts of which are interconnected, and b) the Biosphere acts as a giant biological supercomputer, storing and processing information in digital and analog forms. Such a supercomputer surpasses all human-made computers by many orders of magnitude. Living organisms are the product of the intelligent creative action of the biosphere supercomputer. Biological evolution is driven by the growing amount of information stored in living organisms and the increasing complexity of the biosphere as a single organism. The main evolutionary vector is not survival of the fittest but the accelerated growth of the computational complexity of living organisms. The following postulates summarize the proposed hypothesis: biological evolution, as natural life origin and development, is a reality. Evolution is a coordinated and controlled process. One of evolution's main development vectors is the growing computational complexity of living organisms and the biosphere's intelligence. The intelligent matter that conducts and controls global evolution is a gigantic bio-computer combining all living organisms on Earth. The information acts like software stored in and controlled by the biosphere. Random mutations trigger this software, as is stipulated by Darwinian evolution theories, and it is further stimulated by the growing demand for the Biosphere's global memory storage and computational complexity. Greater memory volume requires a greater number of more intellectually advanced organisms for storing and handling it. More intricate organisms require greater computational complexity of the biosphere in order to keep control over the living world. This is an endless recursive endeavor with an accelerating evolutionary dynamic. New species emerge when two conditions are met: a) crucial environmental changes occur and/or the global memory storage volume reaches its limit, and b) the biosphere's computational complexity reaches the critical mass capable of producing more advanced creatures. The hypothesis presented here is a naturalistic concept of life creation and evolution. It logically resolves many puzzling problems of current evolutionary theory, such as speciation, as a result of GM purposeful design; the evolution development vector, as a need for growing global intelligence; punctuated equilibrium, which happens when the two conditions a) and b) above are met; the Cambrian explosion; and mass extinctions, which happen when more intelligent species replace outdated creatures.

Keywords: supercomputer, biological evolution, Darwinism, speciation

Procedia PDF Downloads 149
38404 Predictive Modelling of Aircraft Component Replacement Using Imbalanced Learning and Ensemble Method

Authors: Dangut Maren David, Skaf Zakwan

Abstract:

Adequate monitoring of vehicle components in order to obtain high uptime is the goal of predictive maintenance; the major challenge faced by businesses in industry is the significant cost associated with a delay in service delivery due to system downtime. Most of those businesses are interested in predicting those problems and proactively preventing them before they occur, which is the core advantage of Prognostic Health Management (PHM) applications. The recent emergence of Industry 4.0, or the industrial internet of things (IIoT), has led to the need for monitoring system activities and enhancing system-to-system or component-to-component interactions; this has resulted in the generation of large volumes of data, known as big data. Analysis of big data is increasingly important; however, due to complexity inherent in the dataset, such as imbalanced classification problems, it becomes extremely difficult to build a model with high precision. Data-driven predictive modeling for condition-based maintenance (CBM) has recently drawn research interest, with growing attention from both academia and industry. The large data generated from industrial processes inherently come with different degrees of complexity, which pose a challenge for analytics. Thus, the imbalanced classification problem exists pervasively in industrial datasets and can affect the performance of learning algorithms, yielding poor classifier accuracy in model development. Misclassification of faults can result in unplanned breakdowns, leading to economic loss. In this paper, an advanced approach for handling the imbalanced classification problem is proposed, and a prognostic model for predicting aircraft component replacement is then developed to predict component replacement in advance by exploring historical aircraft data. The approach is based on a hybrid ensemble-based method that improves the prediction of the minority class during learning; we also investigate the impact of our approach on the multiclass imbalance problem. We validate the feasibility and effectiveness, in terms of performance, of our approach using real-world aircraft operation and maintenance datasets, which span over 7 years. Our approach shows better performance compared to other similar approaches. We also validate the strength of our approach for handling multiclass imbalanced datasets, and the results again show good performance compared to other baseline classifiers.
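A schematic sketch of the imbalance-handling idea (a simple SMOTE-plus-ensemble baseline, not the authors' hybrid method; the data shapes and the roughly 3% minority rate are placeholders):

```python
# Oversample the minority "replace" class with SMOTE, then train an ensemble.
import numpy as np
from imblearn.over_sampling import SMOTE
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

rng = np.random.default_rng(0)
X = rng.random((5000, 20))                 # sensor/usage features per record
y = (rng.random(5000) < 0.03).astype(int)  # ~3% component replacements

X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)
X_res, y_res = SMOTE(random_state=0).fit_resample(X_tr, y_tr)  # rebalance training set
clf = RandomForestClassifier(n_estimators=300, random_state=0).fit(X_res, y_res)
print(classification_report(y_te, clf.predict(X_te), digits=3))
```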

Keywords: prognostics, data-driven, imbalance classification, deep learning

Procedia PDF Downloads 158
38403 Photocatalytic Degradation of Nd₂O₃@SiO₂ Core-Shell Nanocomposites Under UV Irradiation Against Methylene Blue and Rhodamine B Dyes

Authors: S. Divya, M. Jose

Abstract:

Over the past years, industrial dyes have emerged as a significant threat to aquatic life; extensively detected in drinking water and groundwater, they contribute to water pollution due to their improper and excessive use. To address this issue, the utilization of core-shell structures has been prioritized because, among the various available photocatalysts, they demonstrate remarkable efficiency in utilizing light energy for catalytic reactions and exhibit excellent photocatalytic activity. This work focuses on the photocatalytic degradation of methylene blue (MB) and rhodamine B (RhB) dyes by Nd₂O₃@SiO₂ CSNs under UV light irradiation. Different characterization techniques, including XRD, FTIR, and TEM analyses, were employed to reveal the material's structure, functional groups, and morphological features. VSM and XPS analyses confirmed the soft, paramagnetic nature and the chemical states with their respective atomic percentages. Optical band gaps, determined using the Tauc plot model, were 4.24 eV and 4.13 eV for the Nd₂O₃ NPs and Nd₂O₃@SiO₂ CSNs, respectively. The reduced band gap energy of the Nd₂O₃@SiO₂ CSNs enhances light absorption in the UV range, potentially leading to improved photocatalytic efficiency. The Nd₂O₃@SiO₂ CSNs exhibited greater degradation efficiency, reaching 95% and 96% against the MB and RhB dyes, while the Nd₂O₃ NPs showed 90% and 92%, respectively. The enhanced efficiency of the Nd₂O₃@SiO₂ CSNs can be attributed to the larger specific surface area provided by the SiO₂ shell, as confirmed by surface area analysis using the BET surface area analyzer through N₂ adsorption-desorption.
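The percentage figures follow from the standard degradation-efficiency arithmetic; a sketch with hypothetical absorbance readings:

```python
# Degradation efficiency eta = (C0 - Ct)/C0 * 100, with dye concentration
# taken proportional to absorbance (Beer-Lambert). Values are illustrative.
import numpy as np

t_min = np.array([0, 20, 40, 60, 80, 100])
A_mb = np.array([1.00, 0.62, 0.35, 0.18, 0.09, 0.05])  # MB absorbance vs. time

eta = (A_mb[0] - A_mb) / A_mb[0] * 100.0
for t, e in zip(t_min, eta):
    print(f"t = {t:3d} min: {e:5.1f}% degraded")  # ends near 95%
```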

Keywords: core shell nanocomposites, rare earth oxides, photocatalysis, advanced oxidation process

Procedia PDF Downloads 46
38402 A Model of Empowerment Evaluation of Knowledge Management in Private Banks Using Fuzzy Inference System

Authors: Nazanin Pilevari, Kamyar Mahmoodi

Abstract:

The purpose of this research is to provide a model based on a fuzzy inference system for evaluating knowledge management empowerment. The first prototype of the research was developed based on the study of the literature. In the next step, the experts were provided with these models, and after consensus-based revision using the fuzzy Delphi technique, the components and indices of the research model were finalized. Culture, structure, IT, and leadership were considered the dimensions of empowerment. Then, in order to collect and extract data for the fuzzy inference system based on knowledge and experience, the experts were interviewed. The values obtained from the designed fuzzy inference system made review and assessment of the organization's knowledge management empowerment possible. After the design and validation of the systems measuring the indices (knowledge management empowerment and the inputs to the fuzzy inference system), a questionnaire was administered at AYANDEH Bank. In the case of this bank, the system output indicates that the empowerment of knowledge management, culture, organizational structure, and leadership is at a moderate level, while information technology empowerment is relatively high. Based on these results, the status of knowledge management empowerment in AYANDEH Bank was moderate. Eventually, some suggestions for improving the current situation of the bank were provided. In light of prior research, the use of powerful fuzzy inference system tools for the assessment of knowledge management and knowledge management empowerment, and such an assessment in the field of banking, are the innovations of this research.
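A toy sketch of such a fuzzy inference system built with scikit-fuzzy; the universes, membership functions, and the three rules are illustrative placeholders, not the finalized Delphi model:

```python
# Two of the empowerment dimensions feeding a fuzzy "empowerment" output.
import numpy as np
import skfuzzy as fuzz
from skfuzzy import control as ctrl

universe = np.arange(0, 11, 1)
culture = ctrl.Antecedent(universe, "culture")
leadership = ctrl.Antecedent(universe, "leadership")
empowerment = ctrl.Consequent(universe, "empowerment")

for var in (culture, leadership, empowerment):
    var["low"] = fuzz.trimf(var.universe, [0, 0, 5])
    var["moderate"] = fuzz.trimf(var.universe, [2, 5, 8])
    var["high"] = fuzz.trimf(var.universe, [5, 10, 10])

rules = [
    ctrl.Rule(culture["high"] & leadership["high"], empowerment["high"]),
    ctrl.Rule(culture["moderate"] | leadership["moderate"], empowerment["moderate"]),
    ctrl.Rule(culture["low"] & leadership["low"], empowerment["low"]),
]
sim = ctrl.ControlSystemSimulation(ctrl.ControlSystem(rules))
sim.input["culture"] = 6
sim.input["leadership"] = 5
sim.compute()
print(f"empowerment score: {sim.output['empowerment']:.2f}")
```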

Keywords: knowledge management, knowledge management empowerment, fuzzy inference system, fuzzy Delphi

Procedia PDF Downloads 346
38401 Water-Repellent Coating Based on Thermoplastic Polyurethane, Silica Nanoparticles and Graphene Nanoplatelets

Authors: S. Naderizadeh, A. Athanassiou, I. S. Bayer

Abstract:

This work describes a layer-by-layer spraying method to produce a non-wetting coating based on thermoplastic polyurethane (TPU) and silica nanoparticles (Si-NPs). The main purpose of this work was to transform a hydrophilic polymer into a superhydrophobic coating. The contact angle of pure TPU was measured at about 77° ± 2°, and water droplets did not roll away upon tilting, even at 90°. After applying a layer of Si-NPs on top of this, not only did the contact angle increase to 165° ± 2°, but water droplets also rolled away at tilt angles below 5°. The most important restriction in this study was the weak interfacial adhesion between the polymer and the nanoparticles, which degraded the durability of the coatings. To overcome this problem, we used a very thin layer of graphene nanoplatelets (GNPs) as an interlayer between the TPU and Si-NP layers, followed by thermal treatment at 150 °C. The samples' morphology and topography were characterized by scanning electron microscopy (SEM), EDX analysis, and atomic force microscopy (AFM). It was observed that the Si-NPs embedded into the polymer phase in the presence of the GNPs layer, probably because of the high surface area and considerable thermal conductivity of the graphene platelets. The contact angle of the sample containing graphene decreased slightly with respect to the coating without graphene, reaching 156.4° ± 2°, due to the reduction in surface roughness. The durability of the coatings against abrasion was evaluated by the Taber® abrasion test, and it was observed that the superhydrophobicity of the coatings lasted longer in the presence of the GNPs layer. Due to the simple fabrication method and good durability, this coating can be used as a durable superhydrophobic coating for metals and can be produced at large scale.

Keywords: graphene, silica nanoparticles, superhydrophobicity, thermoplastic polyurethane

Procedia PDF Downloads 170
38400 A Methodology for Investigating Public Opinion Using Multilevel Text Analysis

Authors: William Xiu Shun Wong, Myungsu Lim, Yoonjin Hyun, Chen Liu, Seongi Choi, Dasom Kim, Kee-Young Kwahk, Namgyu Kim

Abstract:

Recently, many users have begun to frequently share their opinions on diverse issues using various social media. Therefore, numerous governments have attempted to establish or improve national policies according to the public opinions captured from various social media. In this paper, we indicate several limitations of the traditional approaches to analyzing public opinion on science and technology and provide an alternative methodology to overcome these limitations. First, we distinguish between the science and technology analysis phase and the social issue analysis phase, to reflect the fact that public opinion can be formed only when a certain science and technology is applied to a specific social issue. Next, we successively apply a start list and a stop list to acquire clarified and interesting results. Finally, to identify the documents that best fit a given subject, we develop a new logical filter concept that consists not merely of keywords but also of a logical relationship among the keywords. This study then analyzes the possibilities for the practical use of the proposed methodology through its application to discover core issues and public opinions from 1,700,886 documents comprising SNS posts, blogs, news, and discussions.
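A minimal sketch of the logical-filter idea; the filter shape and the example topic are assumptions for illustration, not the paper's implementation:

```python
# A filter is a boolean expression over keywords rather than a flat keyword
# list, so a document must satisfy the stated logical relationship.
def matches(doc: str, flt: dict) -> bool:
    """flt: {"all": [...], "any": [...], "none": [...]} over lowercase keywords."""
    text = doc.lower()
    return (all(k in text for k in flt.get("all", []))
            and (not flt.get("any") or any(k in text for k in flt["any"]))
            and not any(k in text for k in flt.get("none", [])))

# Hypothetical filter: documents about GM food policy, excluding car tuning
gmo_filter = {"all": ["genetically"], "any": ["food", "crop"], "none": ["car"]}
docs = ["Debate on genetically modified food policy",
        "Genetically tuned car modification"]
print([matches(d, gmo_filter) for d in docs])  # [True, False]
```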

Keywords: big data, social network analysis, text mining, topic modeling

Procedia PDF Downloads 276
38399 Evaluation of Adaptive Fitness of Indian Teak (Tectona grandis L. F.) Metapopulation through Inter Simple Sequence Repeat Markers

Authors: Vivek Vaishnav, Shamim Akhtar Ansari

Abstract:

Teak (Tectona grandis L.f.), belonging to the plant family Lamiaceae and the most commercialized timber species, is endemic to South Asia. The adaptive fitness of the species metapopulation was evaluated through its genetic differentiation and by assessing the influence of geo-climatic conditions. 290 genotypes were sampled from 29 locations across its natural distribution, and the genetic data were incorporated with geo-climatic parameters. Through a Bayesian-approach-based analysis of 43 highly polymorphic ISSR markers, six homogeneous clusters (0.8% genetic variability) were identified. The six clusters were found to correspond to various regimes of temperature range, i.e., I - 9.10±1.35⁰C, II - 6.35±0.21⁰C, III - 12.21±0.43⁰C, IV - 10.8±1.06⁰C, V - 11.67±3.04⁰C, and VI - 12.35±0.21⁰C. The population had a very high percentage of LD (21.48%) among the amplified loci, possibly due to restricted gene flow as well as the co-adaptation and association of distant/diverse loci/alleles resulting from stabilized climatic conditions and countless cycles of historical recombination events on a large geological timescale. The same possibly accounts for the narrow distribution of teak as a climax species in the tropical deciduous forests of the country. The regions of strong LD in the teak genome significantly associated with climatic parameters also reflect that the species is tolerant to wide regimes of temperature range and may possibly withstand global warming and climate change in the coming millennium.

Keywords: Bayesian analysis, inter simple sequence repeat, linkage disequilibrium, marker-geoclimatic association

Procedia PDF Downloads 250
38398 The Effectiveness of Prenatal Breastfeeding Education on Breastfeeding Uptake Postpartum: A Systematic Review

Authors: Jennifer Kehinde, Claire O’Donnell, Annmarie Grealish

Abstract:

Introduction: Breastfeeding has been shown to provide numerous health benefits for both infants and mothers. The decision to breastfeed is influenced by physiological, psychological, and emotional factors. However, the importance of equipping mothers with the necessary knowledge for successful breastfeeding practice cannot be ruled out. The decline in the global breastfeeding rate can be linked to a lack of adequate breastfeeding education during the prenatal stage. This systematic review examined the effectiveness of prenatal breastfeeding education on breastfeeding uptake postpartum. Method: This review was undertaken and reported in conformity with the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) statement and was registered on the international prospective register of systematic reviews (PROSPERO: CRD42020213853). A PICO analysis (population, intervention, comparison, outcome) was undertaken to inform the choice of keywords in the search strategy and to formulate the review question, which was aimed at determining the effectiveness of prenatal breastfeeding educational programs in improving breastfeeding uptake following birth. Five databases (Cumulative Index to Nursing and Allied Health Literature, Medline, PsycINFO, and Applied Social Sciences Index and Abstracts) were systematically searched from January 2014 to July 2021 to identify eligible studies. Quality assessment and narrative synthesis were subsequently undertaken. Results: Fourteen studies were included. All 14 studies used different types of breastfeeding programs: eight used a combination of curriculum-based breastfeeding education programs, group prenatal breastfeeding counselling, and one-to-one breastfeeding educational programs, all delivered in person; four studies used web-based learning platforms to deliver breastfeeding education prenatally, delivered both online and face to face over a period of 3 weeks to 2 months with follow-up periods ranging from 3 weeks to 6 months; one study delivered the breastfeeding educational intervention using mother-to-mother breastfeeding support groups to promote exclusive breastfeeding; and one study disseminated breastfeeding education to participants based on the theory of planned behaviour. The most effective interventions were those that included both theory and hands-on demonstrations. Results showed an increase in breastfeeding uptake, breastfeeding knowledge, positive attitudes to breastfeeding, and maternal breastfeeding self-efficacy among mothers who participated in breastfeeding educational programs during prenatal care. Conclusion: Prenatal breastfeeding education increases women's knowledge of breastfeeding. Mothers who are knowledgeable about breastfeeding and hold a positive attitude towards breastfeeding tend to initiate breastfeeding and continue for a lengthened period. Findings demonstrate a general correlation between prenatal breastfeeding education and increased breastfeeding uptake postpartum. The high level of positive breastfeeding outcomes across the included studies can be attributed to prenatal breastfeeding education. This review provides rigorous contemporary evidence that healthcare professionals and policymakers can apply when developing effective strategies to improve breastfeeding rates and ultimately improve the health outcomes of mothers and infants.

Keywords: breastfeeding, breastfeeding programs, breastfeeding self-efficacy, prenatal breastfeeding education

Procedia PDF Downloads 71
38397 Optimal Resource Configuration and Allocation Planning Problem for Bottleneck Machines and Auxiliary Tools

Authors: Yin-Yann Chen, Tzu-Ling Chen

Abstract:

This study presents the case of an actual Taiwanese semiconductor assembly and testing manufacturer. Three major bottleneck manufacturing processes, namely die bond, wire bond, and molding, are analyzed to determine how to use finite resources to achieve the optimal capacity allocation. A medium-term capacity allocation planning model is developed to maximize total profit while satisfying the volumes promised to customers and to obtain the best migration decisions for machines and tools among production lines. Finally, a sensitivity analysis based on the actual case is provided to explore the effect of various parameter levels.
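A stylized sketch of the kind of allocation model involved, reduced to a toy linear program over two hypothetical products and the three bottleneck processes (all coefficients are invented for illustration):

```python
# Maximize profit subject to machine-hour capacities at each bottleneck.
from scipy.optimize import linprog

profit = [-5.0, -8.0]          # negated: linprog minimizes; $/unit of P1, P2
A_ub = [[1.0, 2.0],            # die bond hours per unit
        [2.0, 1.5],            # wire bond hours per unit
        [0.5, 1.0]]            # molding hours per unit
b_ub = [800.0, 900.0, 400.0]   # available machine hours per process

res = linprog(c=profit, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * 2)
print(f"produce {res.x.round(1)} units, profit = ${-res.fun:,.0f}")
```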

Keywords: capacity planning, capacity allocation, machine migration, resource configuration

Procedia PDF Downloads 441
38396 Effect of Co Substitution on Structural, Magnetocaloric, Magnetic, and Electrical Properties of Sm0.6Sr0.4CoxMn1-xO3 Synthesized by Sol-gel Method

Authors: A. A. Azab

Abstract:

In this work, Sm0.6Sr0.4CoxMn1-xO3 (x = 0, 0.1, 0.2 and 0.3) was synthesized by the sol-gel method for magnetocaloric effect (MCE) applications. XRD analysis confirmed the formation of the required orthorhombic perovskite phase and revealed a crystallographic phase transition as a result of the substitution. Maxwell-Wagner interfacial polarisation and Koops' phenomenological theory were used to investigate and analyze the temperature and frequency dependence of the dielectric permittivity. The phase transition from the ferromagnetic to the paramagnetic state was demonstrated to be second order. Based on the isothermal magnetization curves obtained at various temperatures, the magnetic entropy change was calculated. The magnetocaloric effect over a wide temperature range was studied by determining ΔSM and the relative cooling power (RCP).
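A sketch of the standard entropy-change computation from isothermal M(H) curves via the Maxwell relation ΔSM(T, H) = ∫₀ᴴ (∂M/∂T) dH′; the magnetization surface below is synthetic, not the measured Sm0.6Sr0.4CoxMn1-xO3 data:

```python
# Numeric Maxwell-relation evaluation of Delta S_M(T) from M(T, H) curves.
import numpy as np

T = np.linspace(100, 300, 41)   # K
H = np.linspace(0, 5, 51)       # T (applied field)
# Toy M(T, H): a ferro-to-para transition around Tc = 200 K
Tc = 200.0
M = (50.0 / (1.0 + np.exp((T[:, None] - Tc) / 10.0))) * np.tanh(2.0 * H[None, :])

dM_dT = np.gradient(M, T, axis=0)   # (dM/dT) at each field value
dS = np.trapz(dM_dT, H, axis=1)     # integrate over field -> Delta S_M(T)
print(f"peak |Delta S_M| = {np.abs(dS).max():.2f} (units follow those of M and H)")
print(f"at T = {T[np.argmax(np.abs(dS))]:.0f} K")
```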

Keywords: magnetocaloric effect, pperovskite, magnetic phase transition, dielectric permittivity

Procedia PDF Downloads 54
38395 Suggestion for Malware Detection Agent Considering Network Environment

Authors: Ji-Hoon Hong, Dong-Hee Kim, Nam-Uk Kim, Tai-Myoung Chung

Abstract:

The number of smartphone users is increasing rapidly. Accordingly, many companies are running BYOD (Bring Your Own Device: policies allowing private smartphones to be brought to the company) policies to increase work efficiency. However, smartphones are always under the threat of malware; thus, a company network to which smartphones are connected is exposed to serious risks. Most smartphone malware detection techniques perform independent detection (detection of a single target application). In this paper, we analyze a variety of intrusion detection techniques. Based on the results of this analysis, we propose an agent that uses a network IDS.

Keywords: android malware detection, software-defined network, interaction environment, android malware detection, software-defined network, interaction environment

Procedia PDF Downloads 416