Search results for: sequential extraction process
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 16965

15675 A Transformer-Based Question Answering Framework for Software Contract Risk Assessment

Authors: Qisheng Hu, Jianglei Han, Yue Yang, My Hoa Ha

Abstract:

When a company considers purchasing software for commercial use, contract risk assessment is critical to identify risks and mitigate potential adverse business impacts, e.g., security, financial, and regulatory risks. Contract risk assessment requires reviewers with specialized knowledge and the time to evaluate legal documents manually. Specifically, validating contracts for a software vendor involves the following steps: manual screening, interpreting legal documents, and extracting risk-prone segments. To automate this process, we propose a framework that assists legal contract risk identification by leveraging pre-trained deep learning models and natural language processing techniques. Given a set of pre-defined risk evaluation questions, our framework uses pre-trained transformer-based question-answering models to identify risk-prone sections in a contract. The question-answering model encodes the concatenated question-contract text and predicts the start and end positions for clause extraction. Because the labelled dataset available for training was limited, we leveraged transfer learning, fine-tuning the models on the CUAD dataset. On a dataset comprising 287 contract documents and 2,000 labelled samples, our best model achieved an F1 score of 0.687.
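
The start/end position prediction described above can be sketched in a few lines. This is a minimal illustration, not the authors' code: it assumes the transformer has already produced per-token start and end logits (here plain arrays) and only shows the span-decoding step.

```python
import numpy as np

def best_answer_span(start_logits, end_logits, max_answer_len=50):
    """Decode an extractive-QA answer span: pick the (start, end) pair with
    the highest combined logit score, subject to start <= end and a cap on
    the span length."""
    best_span, best_score = (0, 0), -np.inf
    for s in range(len(start_logits)):
        for e in range(s, min(s + max_answer_len, len(end_logits))):
            score = start_logits[s] + end_logits[e]
            if score > best_score:
                best_score, best_span = score, (s, e)
    return best_span

# Toy logits: token 1 is the most likely start, token 2 the most likely end.
span = best_answer_span(np.array([0.0, 5.0, 1.0]), np.array([0.0, 1.0, 6.0]))
```

In a real pipeline the logits would come from a QA head over the concatenated question-contract tokens, and the indices would be mapped back to the contract text to extract the risk-prone clause.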

Keywords: contract risk assessment, NLP, transfer learning, question answering

Procedia PDF Downloads 129
15674 Using Nonhomogeneous Poisson Process with Compound Distribution to Price Catastrophe Options

Authors: Rong-Tsorng Wang

Abstract:

In this paper, we derive a pricing formula for catastrophe equity put options (CatEPut) with non-homogeneous loss arrivals and approximated compound distributions. We assume that the loss-claim arrival process is a nonhomogeneous Poisson process (NHPP) representing the clustered occurrence of loss claims, that claim sizes form a sequence of independent and identically distributed random variables, and that the accumulated loss therefore follows a compound distribution, which we approximate by a heavy-tailed distribution. A numerical example is given to calibrate the parameters, and we discuss how the value of the CatEPut is affected by changes in the parameters of the proposed pricing model.
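
The accumulated-loss model above can be illustrated by simulation (the paper derives a closed-form price; this sketch only generates the NHPP compound loss by Lewis-Shedler thinning, with an assumed sinusoidal intensity and assumed lognormal claim sizes):

```python
import numpy as np

def compound_nhpp_loss(intensity, lam_max, horizon, claim_sampler, rng):
    """Simulate the accumulated loss L(T) = sum of claim sizes at the arrival
    times of a nonhomogeneous Poisson process on [0, horizon].  Arrivals are
    generated by thinning candidates from a homogeneous process of rate
    lam_max >= intensity(t)."""
    t, arrivals = 0.0, []
    while True:
        t += rng.exponential(1.0 / lam_max)          # candidate arrival
        if t > horizon:
            break
        if rng.uniform() < intensity(t) / lam_max:   # accept w.p. ratio
            arrivals.append(t)
    claims = claim_sampler(rng, len(arrivals))
    return np.array(arrivals), float(np.sum(claims))

rng = np.random.default_rng(42)
arrivals, loss = compound_nhpp_loss(
    intensity=lambda t: 2.0 + np.sin(t),             # assumed clustering intensity
    lam_max=3.0, horizon=10.0,
    claim_sampler=lambda r, n: r.lognormal(0.0, 1.0, n),  # assumed iid claims
    rng=rng)
```

Averaging the option payoff over many such paths would give a Monte Carlo benchmark against which an approximate pricing formula can be checked.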

Keywords: catastrophe equity put options, compound distributions, nonhomogeneous Poisson process, pricing model

Procedia PDF Downloads 167
15673 The Effect of Online Learning During the COVID-19 Pandemic on Student Mental

Authors: Adelia Desi Agnesita

Abstract:

The advent of a new disease called COVID-19 brought major changes to the world, one of which concerns the process of learning and teaching. Learning that was formerly offline is now done online, which requires students to adapt to a new learning process. Because the COVID-19 pandemic occurred almost worldwide, activities that involve many people had to be avoided, among them classroom teaching. In Indonesia, since March 2020, college learning has shifted to online/distance learning to prevent the spread of COVID-19. Online learning presents obstacles for students, including poor signal, a heavy task load, lack of focus, difficulty sleeping, and resulting stress.

Keywords: learning, online, covid-19, pandemic

Procedia PDF Downloads 214
15672 Experimental Investigations on the Mechanism of Stratified Liquid Mixing in a Cylinder

Authors: Chai Mingming, Li Lei, Lu Xiaoxia

Abstract:

In this paper, the mechanism of stratified liquid mixing in a cylinder is investigated, focusing on the effects of Rayleigh-Taylor Instability (RTI) and rotation of the cylinder on interface mixing. For miscible liquids, the Planar Laser Induced Fluorescence (PLIF) technique is applied to record the concentration field of one liquid, and the Intensity of Segregation (IOS) is used to describe the mixing status. For immiscible liquids, a high-speed camera is adopted to record the development of the interface. The RTI experiments indicate that it plays a major role in the mixing process: large-scale mixing is triggered first, and the span of the stripes subsequently decreases, showing that mesoscale mixing is emerging. The rotation experiments show that the spin-down process contributes strongly to liquid mixing: the upper liquid falls rapidly along the wall and crashes into the lower liquid, exciting many interface instabilities, so the liquids mix rapidly during spin-down. It can be concluded that, whatever means are adopted to speed up liquid mixing, the fundamental mechanism is interface instability, which increases the interfacial area between the liquids and the relative velocity of the two liquids.
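
The Intensity of Segregation used above to quantify mixing from a PLIF concentration field can be computed roughly as follows. The formula is a commonly used definition (variance ratio); the paper does not spell out its exact expression, so this is an assumed sketch:

```python
import numpy as np

def intensity_of_segregation(conc, unmixed_variance):
    """IOS = var(c) / var(c_unmixed): equals 1 for fully segregated liquids
    and approaches 0 as the concentration field homogenizes."""
    return float(np.var(conc) / unmixed_variance)

# Fully segregated field (half pure liquid A, half pure B) vs. fully mixed.
segregated = np.array([0.0, 0.0, 1.0, 1.0])
mixed = np.array([0.5, 0.5, 0.5, 0.5])
```

For a 50/50 stratification the unmixed variance is 0.25, so the segregated field gives IOS = 1 and the homogeneous field gives IOS = 0.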

Keywords: interface instability, liquid mixing, Rayleigh-Taylor Instability, spin-down process, spin-up process

Procedia PDF Downloads 301
15671 The Volume–Volatility Relationship Conditional to Market Efficiency

Authors: Massimiliano Frezza, Sergio Bianchi, Augusto Pianese

Abstract:

The relation between stock price volatility and trading volume is a controversial issue that has received remarkable attention over the past decades. An extensive literature documents a positive relation between price volatility and trading volume in financial markets, but the causal relationship behind this association remains an open question, from both a theoretical and an empirical point of view. Various models, complementary rather than competitive, have been introduced to explain the relationship: the long-debated Mixture of Distributions Hypothesis (MDH), the Sequential Arrival of Information Hypothesis (SAIH), the Dispersion of Beliefs Hypothesis (DBH), and the Noise Trader Hypothesis (NTH). In this work, we analyze whether stock market efficiency can explain the diversity of results achieved over the years. For this purpose, we propose an alternative measure of market efficiency based on the pointwise regularity of a stochastic process, the dynamic Hurst–Hölder exponent. In particular, we model the stock market by means of multifractional Brownian motion (mBm), which displays a time-changing regularity. Such models have in common the fact that they locally behave as a fractional Brownian motion, in the sense that their local regularity at time t0 (measured by the local Hurst–Hölder exponent in a neighborhood of t0) equals the exponent of a fractional Brownian motion of parameter H(t0). Assuming that the stock price follows an mBm, we introduce and theoretically justify the dynamic Hurst–Hölder exponent as a measure of market efficiency. This allows us to measure, at any time t, the market's departure from the martingale property, i.e., from efficiency as stated by the Efficient Market Hypothesis.
This approach is applied to financial markets. Using data for the S&P 500 index from 1978 to 2017, we find, on the one hand, that when efficiency is not accounted for, a positive contemporaneous relationship emerges and is stable over time. Conversely, it disappears as soon as efficiency is taken into account. In particular, the association is more pronounced during time frames of high volatility and tends to disappear when the market becomes fully efficient.
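
The Hurst–Hölder exponent at the core of the efficiency measure can be estimated in many ways. The sketch below uses an elementary variogram (increment-variance scaling) estimator on a whole series, not the authors' pointwise estimator, purely to illustrate the H ≈ 0.5 martingale benchmark:

```python
import numpy as np

def hurst_variogram(x, max_lag=20):
    """Estimate the Hurst-Holder exponent H from the scaling of increment
    variances, Var(x[t+k] - x[t]) ~ k^(2H): the slope of a log-log
    regression over lags k = 1..max_lag-1 equals 2H."""
    lags = np.arange(1, max_lag)
    v = np.array([np.var(x[k:] - x[:-k]) for k in lags])
    slope = np.polyfit(np.log(lags), np.log(v), 1)[0]
    return slope / 2.0

# Ordinary Brownian motion (a martingale) should give H close to 0.5;
# persistent departures of H(t) from 0.5 would signal inefficiency.
rng = np.random.default_rng(0)
bm = np.cumsum(rng.standard_normal(20000))
```

Applying such an estimator in a sliding window over the price series yields a time-varying H(t), which is the spirit of the dynamic exponent used in the paper.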

Keywords: volume–volatility relationship, efficient market hypothesis, martingale model, Hurst–Hölder exponent

Procedia PDF Downloads 78
15670 Development of Mg-Containing Hydroxyapatite-Based Bioceramics From Phosphate Rock for Bone Applications

Authors: Sara Mercedes Barroso Pinzón, Álvaro Jesús Caicedo Castro, Antonio Javer Sánchez Herencia

Abstract:

In recent years, there has been increased academic and industrial research into the development of orthopaedic implants whose structural properties and functionality, including mechanical strength, osseointegration, thermal stability, and antibacterial capacity, approach those of natural bone. Hydroxyapatite has been considered for decades an ideal biomaterial for bone regeneration due to its chemical and crystallographic similarity to the mineral structure of bone bioapatites. However, the lack of trace elements in the hydroxyapatite structure results in very poor mechanical and biological properties. Under this scenario, the objective of this research is the synthesis of Mg-containing hydroxyapatite from the francolite mineral present in phosphate rock from the central-eastern region of Colombia, taking advantage of the extraction of mineral species as natural precursors of Ca, P, and Mg. The minerals present were studied: fluorapatite, as the mineral of interest, associated with magnesium carbonates and quartz. The chemical and mineralogical composition was determined by X-ray fluorescence (XRF), X-ray diffraction (XRD), scanning electron microscopy (SEM), and energy-dispersive X-ray spectroscopy (EDX), and the optimum conditions for acid leaching in the wet concentration process were established. From the products obtained and characterised by XRD, XRF, SEM, FTIR, and Raman spectroscopy, HAp-Mg biocomposite scaffolds are fabricated, and the influence of Mg on the morphometric parameters and the mechanical and biological properties of the formed materials is evaluated.

Keywords: phosphate rock, hydroxyapatite, magnesium, biomaterials

Procedia PDF Downloads 56
15669 Assessing Acute Toxicity and Endocrine Disruption Potential of Selected Packages Internal Layers Extracts

Authors: N. Szczepanska, B. Kudlak, G. Yotova, S. Tsakovski, J. Namiesnik

Abstract:

In the scientific literature on packaging materials designed to be in contact with food (food contact materials), there is much information on the raw materials used for their production, as well as on their physicochemical properties, types, and parameters. However, little attention is given to the migration of toxic substances from packaging and its actual influence on the health of the final consumer, even though health protection and food safety are priority tasks. The goal of this study was to estimate the impact of packaging type and of food production and storage conditions on the degree of leaching of potentially toxic compounds and endocrine disruptors into foodstuffs, using the Microtox acute toxicity test and the XenoScreen YES/YAS assay. The selected packaging materials were metal cans used for fish storage and Tetra Pak. Five simulants corresponding to specific kinds of food were chosen in order to assess global migration: distilled water for aqueous foods with a pH above 4.5; acetic acid at 3% in distilled water for acidic aqueous foods with a pH below 4.5; ethanol at 5% for any food that may contain alcohol; and dimethyl sulfoxide (DMSO) and artificial saliva, considered for their possible use as simulation media. For each packaging material, a factorial design over temperature and contact time was performed for every simulant. Xenobiotic migration from epoxy resins was studied at three temperatures (25°C, 65°C, and 121°C) and extraction times of 12 h, 48 h, and 2 weeks. This experimental design leads to 9 experiments for each food simulant, since the conditions of each experiment are obtained by combining the temperature and contact-time levels. Each experiment was run in triplicate for acute toxicity and in duplicate for endocrine disruption potential.
Multi-factor analysis of variance (MANOVA) was used to evaluate the effects of the three main factors, simulant, temperature, and contact time, and of their interactions on the respective dependent variable (acute toxicity or endocrine disruption potential). Of all the simulants studied, the most toxic were the acetic acid extracts of the can and Tetra Pak linings, an indication of significant migration of toxic compounds. This migration increased with contact time and temperature, supporting the hypothesis that food products with low pH values significantly damage the internal resin lining. Can lining extracts of all simulation media except distilled water and artificial saliva proved to contain androgen agonists, even at 25°C and an extraction time of 12 h. For Tetra Pak extracts, significant endocrine potential was detected for acetic acid, DMSO, and saliva.

Keywords: food packaging, extraction, migration, toxicity, biotest

Procedia PDF Downloads 181
15668 A Survey of 2nd Year Students' Frequent Writing Error and the Effects of Participatory Error Correction Process

Authors: Chaiwat Tantarangsee

Abstract:

The purposes of this study are 1) to study the effects of a participatory error correction process and 2) to find out the students' satisfaction with such a process. This is quasi-experimental research with a single group, in which data were collected 5 times, preceding and following 4 experimental rounds of the participatory error correction process, which included providing coded indirect corrective feedback on the students' texts together with error treatment activities. The sample comprised 28 second-year English major students of the Faculty of Humanities and Social Sciences, Suan Sunandha Rajabhat University. The tool for the experimental study was the lesson plan of the course Reading and Writing English for Academic Purposes II, and the tools for data collection were 5 writing tests of short texts and a questionnaire. Based on formative evaluation of the students' writing ability prior to and after each of the 4 experiments, the findings disclose higher student scores, with a statistically significant difference at the 0.05 level. In terms of effect size, the means of the students' scores before and after the 4 experiments yield d = 1.0046, 1.1374, 1.297, and 1.0065, respectively. It can be concluded that the participatory error correction process enables all of the students to learn equally well and improves their ability to write short texts. Finally, the students' overall satisfaction with the participatory error correction process is at a high level (Mean = 4.32, S.D. = 0.92).
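
The effect sizes d reported above are Cohen's d values; a minimal computation, assuming the standard pooled-standard-deviation definition (the paper does not state which variant it uses), is:

```python
import numpy as np

def cohens_d(pre, post):
    """Cohen's d effect size between pre-test and post-test scores,
    using the pooled sample standard deviation."""
    pre, post = np.asarray(pre, float), np.asarray(post, float)
    n1, n2 = len(pre), len(post)
    pooled_sd = np.sqrt(((n1 - 1) * pre.var(ddof=1) + (n2 - 1) * post.var(ddof=1))
                        / (n1 + n2 - 2))
    return float((post.mean() - pre.mean()) / pooled_sd)
```

By the usual convention, d around 0.8 or above counts as a large effect, so the reported values of 1.0 to 1.3 indicate strong improvement across all four experiments.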

Keywords: coded indirect corrective feedback, participatory error correction process, error treatment, humanities and social sciences

Procedia PDF Downloads 523
15667 Performance Evaluation of Production Schedules Based on Process Mining

Authors: Kwan Hee Han

Abstract:

The external environment of the enterprise is changing rapidly, driven mainly by global competition, cost-reduction pressures, and new technology. In this situation, the production scheduling function plays a critical role in meeting customer requirements and attaining the goal of operational efficiency. It deals with short-term decision making in the production process of the whole supply chain. The major task of production scheduling is to seek a balance between customer orders and limited resources. In manufacturing companies, this task is difficult because resource capacity must be utilized efficiently under many interacting constraints. At present, many computerized software solutions are used in enterprises to generate realistic production schedules and overcome the complexity of schedule generation. However, most production scheduling systems do not provide sufficient information about the validity of the generated schedule beyond limited statistics. Process mining has only recently emerged as a sub-discipline of both data mining and business process management. Process mining techniques enable the useful analysis of a wide variety of processes, such as process discovery, conformance checking, and bottleneck analysis. In this study, the performance of a generated production schedule is evaluated by mining the event log data of the production scheduling software system using process mining techniques, since every software system generates event logs for further uses such as security investigation, auditing, and debugging. A process mining approach is proposed for validating the goodness of production schedules generated by scheduling software systems.
Using process mining techniques, major evaluation criteria such as workstation utilization, the existence of bottleneck workstations, critical process route patterns, and the work load balance of each machine over time are measured, and finally, the goodness of the production schedule is evaluated. By using the proposed process mining approach, the quality of the production schedules of manufacturing enterprises can be improved.
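
As an illustration of one of the evaluation criteria named above, workstation utilization can be read directly off a scheduling event log. This is a naive sketch that assumes non-overlapping (station, start, end) activity records, not a full process-mining toolchain:

```python
from collections import defaultdict

def workstation_utilization(event_log, horizon):
    """Fraction of the scheduling horizon each workstation is busy,
    computed from (station, start, end) records mined from the event log."""
    busy = defaultdict(float)
    for station, start, end in event_log:
        busy[station] += end - start
    return {s: t / horizon for s, t in busy.items()}

# Hypothetical log: machine M1 busy 7 of 10 time units, M2 busy 2 of 10.
log = [("M1", 0, 5), ("M1", 6, 8), ("M2", 0, 2)]
util = workstation_utilization(log, horizon=10)
```

Large utilization gaps between stations in such a summary would point to the bottleneck and load-balance issues the approach is designed to surface.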

Keywords: data mining, event log, process mining, production scheduling

Procedia PDF Downloads 279
15666 Multinomial Dirichlet Gaussian Process Model for Classification of Multidimensional Data

Authors: Wanhyun Cho, Soonja Kang, Sanggoon Kim, Soonyoung Park

Abstract:

We present a probabilistic multinomial Dirichlet classification model for multidimensional data with Gaussian process priors. We consider an efficient computational method that can be used to obtain the approximate posteriors for the latent variables and parameters needed to define the multiclass Gaussian process classification model. We first investigate the process of inducing a posterior distribution for the various parameters and the latent function by using variational Bayesian approximations and an importance sampling method, and then we derive the predictive distribution of the latent function needed to classify new samples. The proposed model is applied to a synthetic multivariate dataset in order to verify its performance. Experimental results show that our model is more accurate than the other approximation methods.
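
The multinomial-Dirichlet building block of the model has a simple conjugate form that is worth seeing in isolation (this sketch covers only that piece; the Gaussian process prior and the variational/importance-sampling machinery of the paper are not reproduced here):

```python
import numpy as np

def dirichlet_multinomial_predictive(counts, alpha=1.0):
    """Posterior predictive class probabilities for multinomial counts under
    a symmetric Dirichlet(alpha) prior: (n_k + alpha) / (N + K * alpha)."""
    counts = np.asarray(counts, float)
    return (counts + alpha) / (counts.sum() + alpha * len(counts))
```

In the full model, the Dirichlet parameters are themselves driven by latent functions with Gaussian process priors, which is what couples the class probabilities to the input features.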

Keywords: multinomial Dirichlet classification model, Gaussian process priors, variational Bayesian approximation, importance sampling, approximate posterior distribution, marginal likelihood evidence

Procedia PDF Downloads 444
15665 Simulation Study of Asphaltene Deposition and Solubility of CO2 in the Brine during Cyclic CO2 Injection Process in Unconventional Tight Reservoirs

Authors: Rashid S. Mohammad, Shicheng Zhang, Sun Lu, Syed Jamal-Ud-Din, Xinzhe Zhao

Abstract:

A compositional reservoir simulation model (CMG-GEM) was used to study the cyclic CO2 injection process in an unconventional tight reservoir. Cyclic CO2 injection is an enhanced oil recovery process consisting of injection, shut-in, and production stages. Cyclic CO2 injection and hydrocarbon recovery in ultra-low permeability reservoirs are mainly a function of rock, fluid, and operational parameters. CMG-GEM was used to study several design parameters of the cyclic CO2 injection process, to distinguish the parameters with the greatest effect on oil recovery, and to understand the behavior of cyclic CO2 injection in tight reservoirs. On the other hand, permeability reduction induced by asphaltene precipitation is one of the major issues in the oil industry, because plugging of the porous media reduces oil productivity. In addition to asphaltene deposition, solubility of CO2 in the aquifer is one of the safest and most permanent trapping techniques among CO2 storage mechanisms in geological formations. However, the effects of these uncertain parameters on CO2 enhanced oil recovery have not been understood systematically, so it is necessary to study the most significant parameters that dominate the process. The main objective of this study is to improve techniques for designing the cyclic CO2 injection process while considering the effects of asphaltene deposition and the solubility of CO2 in the brine, in order to prevent asphaltene precipitation, minimize CO2 emission, optimize cyclic CO2 injection, and maximize oil production.

Keywords: tight reservoirs, cyclic CO₂ injection, asphaltene, solubility, reservoir simulation

Procedia PDF Downloads 386
15664 Defect Management Life Cycle Process for Software Quality Improvement

Authors: Aedah Abd Rahman, Nurdatillah Hasim

Abstract:

Software quality issues require special attention, especially in view of the demand for quality software products that meet customer satisfaction. Software development projects in most organisations need a proper defect management process in order to produce high-quality software and reduce the number of defects. The research question of this study is how to produce high-quality software while reducing the number of defects. The objective of this paper is therefore to provide a framework for managing software defects by following defined life cycle processes. The methodology starts by reviewing defects, defect models, best practices, and standards. A framework for the defect management life cycle is then proposed. The major contribution of this study is to define a defect management road map for software development. The adoption of an effective defect management process helps to achieve the ultimate goal of producing high-quality software products and contributes towards continuous software process improvement.

Keywords: defects, defect management, life cycle process, software quality

Procedia PDF Downloads 306
15663 Treadmill Negotiation: The Stagnation of the Israeli – Palestinian Peace Process

Authors: Itai Kohavi, Wojciech Nowiak

Abstract:

This article explores the stagnation of the Israeli–Palestinian peace negotiation process and the reasons behind the failure of more than 12 international initiatives to resolve the conflict. Twenty-seven top members of the Israeli national security elite (INSE) were interviewed, including heads of the negotiation teams, the National Security Council, the Mossad, and other intelligence and planning arms. The interviewees provided their insights on the Israeli challenges in reaching a sustainable and stable peace agreement and in dealing with international pressure on Israel to negotiate a peace agreement while preventing anti-Israeli UN decisions and sanctions. The findings revealed a decision tree with red herring deception strategies implemented to postpone the negotiation process and to delay major decisions during it. Beyond the possible applications to the Israeli–Palestinian conflict, the findings shed more light on the phenomenon of rational deception of allies in a negotiation process, a subject less frequently researched than deception of rivals.

Keywords: deception, Israeli-Palestinian conflict, negotiation, red herring, terrorist state, treadmill negotiation

Procedia PDF Downloads 303
15662 Distribution-Free Exponentially Weighted Moving Average Control Charts for Monitoring Process Variability

Authors: Chen-Fang Tsai, Shin-Li Lu

Abstract:

Distribution-free control charts have been an emerging area of statistical process control in recent years. Researchers have developed various nonparametric control charts and investigated their detection capability. The major advantage of nonparametric control charts is that the underlying process is not assumed to follow a normal or any other parametric distribution. In this paper, two nonparametric exponentially weighted moving average (EWMA) control charts based on nonparametric tests, namely the NE-S and NE-M control charts, are proposed for monitoring process variability. They are further extended to generally weighted moving average (GWMA) control charts, namely the NG-S and NG-M control charts, by utilizing design and adjustment parameters for monitoring changes in process variability. The statistical performance of the NG-S and NG-M control charts with run rules is also investigated. Moreover, a sensitivity analysis is performed to show the effects of the design parameters on the nonparametric NG-S and NG-M control charts.
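
For readers unfamiliar with the EWMA scheme that underlies these charts, the classical parametric version for monitoring a mean is sketched below with time-varying control limits. This is the textbook chart, not the paper's NE-S/NE-M charts, which replace the raw observations with nonparametric test statistics:

```python
import numpy as np

def ewma_chart(x, mu0, sigma, lam=0.2, L=3.0):
    """Classical EWMA chart: z_t = lam*x_t + (1-lam)*z_{t-1}, signalling when
    |z_t - mu0| exceeds L * sigma * sqrt(lam/(2-lam) * (1 - (1-lam)^(2t)))."""
    z, stats, signals = mu0, [], []
    for t, xt in enumerate(x, start=1):
        z = lam * xt + (1 - lam) * z
        width = L * sigma * np.sqrt(lam / (2 - lam) * (1 - (1 - lam) ** (2 * t)))
        stats.append(z)
        signals.append(abs(z - mu0) > width)
    return np.array(stats), np.array(signals)

# In-control N(0, 1) data followed by a sustained mean shift of +3 sigma.
rng = np.random.default_rng(1)
data = np.concatenate([rng.standard_normal(50), rng.standard_normal(20) + 3.0])
stats, signals = ewma_chart(data, mu0=0.0, sigma=1.0)
```

The GWMA generalization replaces the geometric weights (1-lam)^k with a more flexible weighting scheme governed by the design and adjustment parameters mentioned above.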

Keywords: distribution-free control chart, EWMA control charts, GWMA control charts

Procedia PDF Downloads 272
15661 Lung Cancer Detection and Multi Level Classification Using Discrete Wavelet Transform Approach

Authors: V. Veeraprathap, G. S. Harish, G. Narendra Kumar

Abstract:

Uncontrolled growth of abnormal cells in the lung in the form of a tumor can be either benign (non-cancerous) or malignant (cancerous). Early diagnosis, detection, and prediction of lung cancer (LC) improve the average five-year survival expectancy and reduce the reliance on risky invasive surgery. Computed Tomography (CT), Positron Emission Tomography (PET), and Magnetic Resonance Imaging (MRI) are commonly used for earlier detection of cancer. In the proposed approach, a Gaussian filter together with a median filter is used for smoothing and noise removal, and Histogram Equalization (HE) is used for image enhancement. The lung cavities are extracted, the background other than the two lung cavities is completely removed, and the right and left lungs are segmented separately. Region-property measurements, namely area, perimeter, diameter, centroid, and eccentricity, are computed for the segmented tumor image, while texture is characterized by Gray-Level Co-occurrence Matrix (GLCM) functions; the extracted features of the Region of Interest (ROI) are given as input to the classifiers. Two levels of classification are employed: K-Nearest Neighbor (KNN) determines whether the patient's condition is normal or abnormal, while an Artificial Neural Network (ANN) identifies the cancer stage. A Discrete Wavelet Transform (DWT) algorithm is used for the main feature extraction, leading to the best efficiency. The developed technique shows encouraging results for real-time information and online detection in future research.
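
The GLCM texture features mentioned above can be sketched from first principles. This minimal version computes the co-occurrence matrix for a single pixel offset and one Haralick-style feature (contrast); a real pipeline would aggregate several offsets and features:

```python
import numpy as np

def glcm(img, dx=1, dy=0, levels=8):
    """Normalized gray-level co-occurrence matrix for pixel offset (dx, dy):
    counts how often gray level i occurs at distance (dx, dy) from level j."""
    g = np.zeros((levels, levels))
    h, w = img.shape
    for y in range(h - dy):
        for x in range(w - dx):
            g[img[y, x], img[y + dy, x + dx]] += 1
    return g / g.sum()

def glcm_contrast(g):
    """Haralick contrast: sum over (i, j) of P(i, j) * (i - j)^2."""
    i, j = np.indices(g.shape)
    return float(np.sum(g * (i - j) ** 2))
```

Feature vectors built from such statistics, together with the region-property measurements, are what the KNN and ANN classifiers consume.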

Keywords: artificial neural networks, ANN, discrete wavelet transform, DWT, gray-level co-occurrence matrix, GLCM, k-nearest neighbor, KNN, region of interest, ROI

Procedia PDF Downloads 153
15660 Experimental Modeling and Simulation of Zero-Surface Temperature of Controlled Water Jet Impingement Cooling System for Hot-Rolled Steel Plates

Authors: Thomas Okechukwu Onah, Onyekachi Marcel Egwuagu

Abstract:

The zero-surface temperature, which controls the cooling profile, was modeled and used to investigate the effect of process parameters on hot-rolled steel plates. The parameters include impingement gaps of 40 mm to 70 mm; pipe diameters of 20 mm to 45 mm feeding a jet nozzle with 30 holes of 8 mm diameter each; and flow rates between 2.896x10⁻⁶ m³/s and 3.13x10⁻⁵ m³/s. The developed simulation model of the zero-surface temperature, upon validation, showed 99% prediction accuracy, with dimensional homogeneity established. The evaluated zero-surface temperature of the controlled water jet impingement on steel plates showed a high cooling rate of 36.31 °C/s at an optimal cooling nozzle diameter of 20 mm, an impingement gap of 70 mm, and a flow rate of 1.77x10⁻⁵ m³/s, yielding a Reynolds number of 2758.586 in the turbulent regime. It was also deduced that as the nozzle diameter increases, the impingement gap reduces. This achieved a faster rate of cooling to an optimum temperature of 300 °C irrespective of the starting surface cooling temperature. The results additionally showed that, with a tested-plate initial temperature of 550 °C, a controlled cooling temperature of about 160 °C produced film and nucleate boiling heat extraction that was particularly beneficial at the end of controlled cooling and influenced the microstructural properties of the test plates.
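
The Reynolds number quoted above follows from the flow rate and a characteristic diameter. The sketch below uses the standard pipe-flow form Re = Vd/ν with the mean velocity V = Q/(πd²/4) and an assumed kinematic viscosity of water; the paper's exact choice of characteristic length and viscosity is not stated, so the numbers here are illustrative rather than a reproduction of Re = 2758.586:

```python
import math

def reynolds_number(flow_rate, diameter, kinematic_viscosity=1.0e-6):
    """Re = V * d / nu for pipe flow, with mean velocity V = Q / (pi d^2 / 4).
    The default nu is roughly that of water at 20 C (an assumption)."""
    velocity = flow_rate / (math.pi * diameter ** 2 / 4.0)
    return velocity * diameter / kinematic_viscosity
```

With Re below about 2300 the flow is laminar and above roughly 4000 fully turbulent, so values near 2758 sit in the transitional-to-turbulent band the abstract refers to.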

Keywords: temperature, mechanistic-model, plates, impingements, dimensionless-numbers

Procedia PDF Downloads 46
15659 Web Data Scraping Technology Using Term Frequency Inverse Document Frequency to Enhance the Big Data Quality on Sentiment Analysis

Authors: Sangita Pokhrel, Nalinda Somasiri, Rebecca Jeyavadhanam, Swathi Ganesan

Abstract:

Tourism is a booming industry with huge future potential for global wealth and employment. Countless data are generated on social media sites every day, creating numerous opportunities to bring more insight to decision-makers. The integration of big data technology into the tourism industry will allow companies to learn where their customers have been and what they like. This information can then be used by businesses such as those managing visitor centers or hotels, and tourists can get a clear idea of places before visiting. On the technical side, natural language is processed by analysing the sentiment features of online reviews from tourists, and we supply an enhanced long short-term memory (LSTM) framework for sentiment feature extraction from travel reviews. We constructed a web review database using a crawler and web scraping techniques for experimental validation to evaluate the effectiveness of our methodology. The sentences were first classified through the VADER and RoBERTa models to get the polarity of the reviews. In this paper, we study feature extraction methods, namely count vectorization and TF-IDF vectorization, and implement a Convolutional Neural Network (CNN) classifier for sentiment analysis to decide whether a tourist's attitude towards a destination is positive, negative, or simply neutral, based on the review text posted online. The results demonstrate that, after pre-processing and cleaning the dataset, the CNN algorithm achieved an accuracy of 96.12% for positive and negative sentiment analysis.
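
The TF-IDF feature extraction step named above can be written from scratch in a few lines. This is a minimal, unsmoothed variant for illustration; the paper presumably uses standard library vectorizers, whose exact IDF smoothing may differ:

```python
import math
from collections import Counter

def tfidf(docs):
    """Minimal TF-IDF vectorizer: tf = raw term count per document,
    idf = log(N / document frequency), no smoothing."""
    tokenized = [doc.lower().split() for doc in docs]
    n = len(docs)
    df = Counter(term for toks in tokenized for term in set(toks))
    vectors = []
    for toks in tokenized:
        tf = Counter(toks)
        vectors.append({t: c * math.log(n / df[t]) for t, c in tf.items()})
    return vectors
```

Terms that appear in every review (like "trip" in the toy example below) get zero weight, while distinctive sentiment-bearing words are up-weighted before being fed to the classifier.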

Keywords: count vectorization, convolutional neural network, crawler, data technology, long short-term memory, web scraping, sentiment analysis

Procedia PDF Downloads 88
15658 A System for Visual Management of Research Resources Focusing on Accumulation of Polish Processes

Authors: H. Anzai, H. Nakayama, H. Kaminaga, Y. Morimoto, Y. Miyadera, S. Nakamura

Abstract:

Various research resources, such as papers and presentation slides, are handled in the course of research activities. Skillful management of these research resources and their utilization in further investigations are extremely important for the smooth progress of research. However, the number of research resources keeps increasing, and the various kinds of resources differ in their usage and accumulation styles, so it is difficult in practice to manage and use the accumulated resources satisfactorily. A lack of tidiness of the resources then causes problems such as overlooking issues that need polishing. Although research projects on supporting the management of research resources and the sharing of know-how have existed, most existing systems have not been effective enough, since they have not sufficiently considered the polish process. This paper mainly describes a system that enables the strategic management of research resources together with their polish processes and their practical use.

Keywords: research resource, polish process, information sharing, knowledge management, information visualization

Procedia PDF Downloads 389
15657 Role of Process Parameters on Pocket Milling with Abrasive Water Jet Machining Technique

Authors: T. V. K. Gupta, J. Ramkumar, Puneet Tandon, N. S. Vyas

Abstract:

Abrasive Water Jet Machining (AWJM) is an unconventional machining process well known for machining hard-to-cut materials. The primary research focus for the process has been through-cutting, and very limited literature is available on pocket milling using AWJM. The present work is an attempt to use this process for milling applications by considering a set of process parameters. Four input parameters, which researchers have considered for part separation, are selected for this application: abrasive size, flow rate, standoff distance, and traverse speed. Pockets of definite size are machined to investigate surface roughness, material removal rate, and pocket depth. Based on the data obtained through experiments on SS304 material, it is observed that higher traverse speeds give a better finish, because of the reduction in particle energy density, and a lower depth is also observed. Increases in standoff distance and abrasive flow rate reduce the rate of material removal, as the jet loses its focus and collisions occur among the particles. An ANOVA for each output parameter has been performed to identify the significant process parameters.

Keywords: abrasive flow rate, surface finish, abrasive size, standoff distance, traverse speed

Procedia PDF Downloads 304
15656 Influence of Ligature Tightening on Bone Fracture Risk in Interspinous Process Surgery

Authors: Dae Kyung Choi, Won Man Park, Kyungsoo Kim, Yoon Hyuk Kim

Abstract:

Interspinous process devices have recently been used owing to advantages such as minimal invasiveness and less subsidence of the implant into osteoporotic bone. In this paper, we analyze the influence of ligature tightening for several interspinous process devices using finite element analysis. Four types of interspinous process implants were inserted into the L3-4 spinal motion segment according to their surgical protocols. The inferior plane of the L4 vertebra was fixed, and an extension moment of 7.5 Nm was applied to the superior plane of the L3 vertebra, together with a 400 N compressive load along the follower load direction and pretension in the ligature. The stability of the spinal unit was higher than that of the intact model. Higher pretension in the ligature led to a decrease in the dynamic stabilization effect for the WallisTM, DiamTM, Viking, and Spear® devices. The results of the present study could be used to evaluate surgical options and validate the biomechanical characteristics of the spinal implants.

Keywords: interspinous process device, bone fracture risk, lumbar spine, finite element analysis

Procedia PDF Downloads 400
15655 Communication Policies of Turkey Related to European Union

Authors: Muhammet Erbay

Abstract:

The phenomenon of communication, studied by different disciplines, has social, political, and economic aspects. The scope of communication has extended from traditional content to the modern world, which is under the control of mass media. Nowadays, thanks to globalization and technological facilities, many companies and public or international institutions take advantage of new communication technologies and overhaul their policies. The European Union (EU) is one of the most effective institutions in this sphere. It aims to harmonize the communication infrastructure and policies of member countries that have gone through the process of political unification. Legal restrictions or critical differences in communication facilities among countries pose a significant problem for EU unification, since technology stands at the center of economic and social life. Therefore, EU institutions attach particular importance to their communication policies. Moreover, communication processes are vital for creating a European public opinion in the process of political integration. Based on the evaluation above, the aim of this paper is to analyze the cohesion process of Turkey, which tries to take an active role in EU communication policies and has ongoing accession negotiations. The article does not confine itself to the technical details of communication policies but also evaluates the socio-political dimension of the process. Therefore, the study features an institutional review, and Turkey's compliance process regarding European Union communication policies is evaluated by means of the deductive method. Some problematic areas in the compliance process on communication policies have been identified, such as human rights and minority rights, whereas compliance on communication infrastructure and technology proceeds effectively.

Keywords: communication policies, European Union, integration, Turkey

Procedia PDF Downloads 411
15654 Single Event Transient Tolerance Analysis in 8051 Microprocessor Using Scan Chain

Authors: Jun Sung Go, Jong Kang Park, Jong Tae Kim

Abstract:

As semiconductor manufacturing technology evolves, the single event transient problem becomes a more significant issue. Single event transients have a critical impact on both combinational and sequential logic circuits, so it is important to evaluate the soft error tolerance of a circuit at the design stage. In this paper, we present a soft error detection simulation using a scan chain. The simulation model generates a single event transient at a random location in the circuit and detects the resulting soft errors during the execution of the test patterns. We verified this model by inserting a scan chain into an 8051 microprocessor implemented in 65 nm CMOS technology. While the test patterns generated by an ATPG program pass through the scan chain, we inject a single event transient and count the number of soft errors per sub-module. The experiments show that the soft error rate per cell area of the SFR module is 277% higher than that of the other modules.
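The core simulation idea, injecting one transient and diffing against the golden state, can be sketched in a few lines; the register width and bit pattern below are hypothetical toys, not the authors' 8051 model:

```python
import random

# Toy sketch (an assumption, not the authors' tool): inject a random
# single-bit transient into a register snapshot captured via a scan
# chain, then count soft errors against the golden (fault-free) state.

def inject_set(bits, rng):
    """Flip one random bit, modelling a single event transient."""
    flipped = bits[:]
    i = rng.randrange(len(flipped))
    flipped[i] ^= 1
    return flipped

def soft_errors(golden, observed):
    """Number of bit positions that differ between the two scan-outs."""
    return sum(g != o for g, o in zip(golden, observed))

rng = random.Random(42)
golden = [0, 1, 1, 0, 1, 0, 0, 1]   # hypothetical scan-out pattern
faulty = inject_set(golden, rng)
print(soft_errors(golden, faulty))   # a single injected SET -> 1
```

Repeating the injection over many random locations and binning the error counts per sub-module gives the kind of per-module rates the abstract reports.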

Keywords: scan chain, single event transient, soft error, 8051 processor

Procedia PDF Downloads 347
15653 Correlations between Obesity Indices and Cardiometabolic Risk Factors in Obese Subgroups in Severely Obese Women

Authors: Seung Hun Lee, Sang Yeoup Lee

Abstract:

Objectives: To investigate associations between degrees of obesity using correlations between obesity indices and cardiometabolic risk factors. Methods: BMI, waist circumference (WC), fasting insulin, fasting glucose, lipids, and visceral adipose tissue (VAT) area, measured from computed tomographic images, were obtained in 113 obese women without cardiovascular disease (CVD). Correlations between obesity indices and cardiometabolic risk factors were analyzed in obese subgroups defined by sequential obesity-index cut-offs. Results: Mean BMI and WC were 29.6 kg/m2 and 92.8 cm, respectively. BMI showed significant correlations with all five cardiometabolic risk factors up to a BMI cut-off of 27 kg/m2, but when the cut-off exceeded 30 kg/m2, the correlations no longer existed. WC was significantly correlated with all five cardiometabolic risk factors up to a cut-off of 85 cm, but when it exceeded 90 cm, the correlations no longer existed. Conclusions: Our data suggest that moderate weight-loss goals may not be enough to ameliorate cardiometabolic markers in severely obese patients. Therefore, individualized weight-loss goals should be recommended to such patients to improve health benefits.
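The subgroup correlation analysis can be illustrated with a small sketch; the subject data below are invented for illustration (only the 27 kg/m2 cut-off is taken from the abstract), and the paper's actual statistics were computed on its own 113-subject dataset:

```python
from math import sqrt

# Illustrative sketch with made-up data: Pearson correlation between an
# obesity index (BMI) and one risk factor (fasting glucose), restricted
# to the subgroup below a BMI cut-off.

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

bmi     = [24.1, 25.3, 26.0, 26.8, 31.2, 33.5]    # hypothetical subjects
glucose = [88.0, 92.0, 95.0, 99.0, 101.0, 100.0]  # fasting glucose, mg/dL

# Subgroup below the 27 kg/m^2 cut-off, where the paper reports the
# correlations remain significant.
sub = [(b, g) for b, g in zip(bmi, glucose) if b < 27.0]
r = pearson([b for b, _ in sub], [g for _, g in sub])
print(round(r, 2))
```

Repeating this for each sequential cut-off (27, 30 kg/m2, ...) reproduces the shape of the analysis, though significance testing would additionally need the sample size.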

Keywords: correlation, cardiovascular disease, risk factors, obesity

Procedia PDF Downloads 357
15652 Multi-source Question Answering Framework Using Transformers for Attribute Extraction

Authors: Prashanth Pillai, Purnaprajna Mangsuli

Abstract:

Oil exploration and production companies invest considerable time and effort to extract essential well attributes (such as well status, surface and target coordinates, wellbore depths, and event timelines) from unstructured data sources like technical reports, which are often non-standardized, multimodal, and highly domain-specific in nature. It is also important to consider context when extracting attribute values from reports that contain information on multiple wells or wellbores. Moreover, semantically similar information may be depicted in different syntactic representations across multiple pages and document sources. We propose a hierarchical multi-source fact extraction workflow based on a deep learning framework to extract essential well attributes at scale. An information retrieval module based on the transformer architecture ranks relevant pages in a document source using page image embeddings and semantic text embeddings. A question answering framework utilizing the LayoutLM transformer extracts attribute-value pairs, incorporating text semantics and layout information from the top relevant pages of a document. To better handle context in multi-well reports, we incorporate a dynamic query generation module to resolve ambiguities. The attribute information extracted from various pages and documents is standardized to a common representation using a parser module to facilitate comparison and aggregation. Finally, we use a probabilistic approach to fuse information extracted from multiple sources into a coherent well record. The applicability of the proposed approach and its performance were studied on several real-life well technical reports.
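The final fusion step, combining candidate values from multiple sources into one record, can be sketched as confidence-weighted voting; the attribute name, values, and scores below are hypothetical, and the paper's actual probabilistic model may be more elaborate:

```python
from collections import defaultdict

# Hedged sketch of multi-source fusion: candidate attribute values
# extracted from several pages/documents are fused by summing model
# confidences, and the highest-scoring value wins.

def fuse(candidates):
    """candidates: list of (value, confidence) pairs for one attribute.

    Returns the winning value and its normalized belief.
    """
    scores = defaultdict(float)
    for value, conf in candidates:
        scores[value] += conf
    best = max(scores, key=scores.get)
    return best, scores[best] / sum(scores.values())

# Hypothetical extractions of a "well status" attribute from three pages:
well_status = [("producing", 0.75), ("producing", 0.5), ("shut-in", 0.25)]
value, belief = fuse(well_status)
print(value, round(belief, 2))
```

Agreement across sources reinforces a value even when no single extraction is highly confident, which is the main appeal of fusing before, rather than after, thresholding.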

Keywords: natural language processing, deep learning, transformers, information retrieval

Procedia PDF Downloads 193
15651 Utilizing Reflection as a Tool for Experiential Learning through a Simulated Activity

Authors: Nadira Zaidi

Abstract:

The aim of this study is to gain direct feedback from interviewees in a simulated interview process. Reflection based on qualitative data analysis has been employed through the Gibbs Reflective Cycle, with 30 undergraduate students as respondents. The respondents reflected on the positive and negative aspects of this active learning process in order to improve their performance in actual job interviews. Results indicate that students engaged in the process successfully imbibed the feedback they received from the interviewers and also identified the areas that needed improvement.

Keywords: experiential learning, positive and negative impact, reflection, simulated

Procedia PDF Downloads 143
15650 Optimization of End Milling Process Parameters for Minimization of Surface Roughness of AISI D2 Steel

Authors: Pankaj Chandna, Dinesh Kumar

Abstract:

The present work analyzes different end milling parameters to minimize the surface roughness of AISI D2 steel. D2 steel is generally used for stamping or forming dies, punches, forming rolls, knives, slitters, shear blades, tools, scrap choppers, tyre shredders, etc. Surface roughness is one of the main indices that determine the quality of machined products and is influenced by various cutting parameters. In machining operations, achieving the desired surface quality through optimization of machining parameters is a challenging job. For mating components, surface roughness becomes even more essential, because these quality characteristics are highly correlated and are expected to be influenced by the process parameters directly or through their interactive effects on the process environment. In this work, the effects of selected process parameters on surface roughness, and the subsequent setting of parameter levels, have been accomplished by Taguchi's parameter design approach. The experiments were performed according to the combinations of process parameter levels suggested by an L9 orthogonal array. End milling of AISI D2 steel with a carbide tool was investigated experimentally by varying feed, speed, and depth of cut, and the surface roughness was measured with a surface roughness tester. Analyses of variance were performed on the mean and the signal-to-noise ratio to estimate the contribution of the different process parameters to the process.
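Taguchi's "smaller-the-better" signal-to-noise ratio is the natural statistic when minimizing surface roughness; a minimal sketch follows, with hypothetical Ra replicates rather than the paper's measurements:

```python
from math import log10

# Sketch of the Taguchi smaller-the-better signal-to-noise ratio used
# when minimizing a response such as surface roughness (Ra).

def sn_smaller_better(values):
    """S/N = -10 * log10(mean of squared responses), in dB."""
    return -10 * log10(sum(v * v for v in values) / len(values))

# Hypothetical replicated roughness measurements (um) for one L9 trial:
ra = [0.8, 0.9, 0.85]
print(round(sn_smaller_better(ra), 2))
```

Computing this S/N value for each of the nine L9 trials, then averaging it per factor level, yields the main-effects ranking that the ANOVA in the paper quantifies.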

Keywords: D2 steel, orthogonal array, optimization, surface roughness, Taguchi methodology

Procedia PDF Downloads 544
15649 Process Assessment Model for Process Capability Determination Based on ISO/IEC 20000-1:2011

Authors: Harvard Najoan, Sarwono Sutikno, Yusep Rosmansyah

Abstract:

Most enterprises now use information technology services as assets to support their business objectives. These services are provided either by an internal service provider (inside the enterprise) or by an external service provider (outside the enterprise). To deliver quality information technology services, the service provider (hereafter, 'the organization'), whether internal or external, must have a standard for its service management system. At present, the standard recognized as best practice for an organization's service management system is the international standard ISO/IEC 20000:2011. The most important part of this standard is the first part, ISO/IEC 20000-1:2011 (Service Management System Requirements), because it defines 22 organizational processes as requirements to be implemented in an organizational environment in order to build, manage, and deliver quality service to the customer. Assessing an organization's management processes is the first step toward implementing ISO/IEC 20000:2011. This assessment requires a Process Assessment Model (PAM) as an assessment instrument. A PAM comprises two parts: a Process Reference Model (PRM) and a Measurement Framework (MF). The PRM is built by transforming the 22 processes of ISO/IEC 20000-1:2011, and the MF is based on ISO/IEC 33020. This assessment instrument was designed to assess the capability of the service management processes in Divisi Teknologi dan Sistem Informasi (Information Systems and Technology Division), an internal organization of PT Pos Indonesia. The results of this assessment model can be used to propose improvements to the capability of the service management system.
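The Measurement Framework step can be illustrated with the ISO/IEC 33020 ordinal rating scale; this sketch assumes the standard's usual >15%/>50%/>85% achievement thresholds, and the scores fed to it are hypothetical:

```python
# Hedged sketch of the ISO/IEC 33020 rating scale used by a measurement
# framework: a process attribute's achievement percentage maps to
# N (Not), P (Partially), L (Largely), or F (Fully) achieved.
# Boundaries assume the standard's 15% / 50% / 85% thresholds.

def rate_attribute(achievement_pct):
    if achievement_pct <= 15:
        return "N"
    if achievement_pct <= 50:
        return "P"
    if achievement_pct <= 85:
        return "L"
    return "F"

# Hypothetical achievement scores for one service management process:
for pct in (10, 40, 70, 95):
    print(pct, rate_attribute(pct))
```

A process's capability level is then determined from the ratings of its process attributes (e.g., level 1 requires the process performance attribute to be at least Largely achieved), which is the aggregation the PAM formalizes.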

Keywords: ISO/IEC 20000-1:2011, ISO/IEC 33020:2015, process assessment, process capability, service management system

Procedia PDF Downloads 465
15648 Effect of Two Different Method for Juice Processing on the Anthocyanins and Polyphenolics of Blueberry (Vaccinium corymbosum)

Authors: Onur Ercan, Buket Askin, Erdogan Kucukoner

Abstract:

Blueberry (Vaccinium corymbosum, bluegold) has become a popular beverage fruit due to its nutritional value, including vitamins, minerals, and antioxidants. In this study, the effects of pressing, mashing, enzymatic treatment, and pasteurization on the anthocyanins, colour, and polyphenolics of blueberry juice (BJ) were examined. BJ was produced by two different methods: direct juice extraction (DBJ) and a mash treatment process (MTP). After crude blueberry juice production, the samples were first treated with a commercial enzyme [Novoferm-61 (Novozymes A/S), 2–10 mL/L] to break down the hydrocolloid polysaccharides, mainly pectin and starch. The enzyme was added at various concentrations; the highest transmittance, 66.53%, was obtained with Novoferm-61 at a concentration of 2 mL/L. After enzymatic treatment, clarification trials were carried out on the enzymatically treated juices by adding various amounts of bentonite (10%, w/v), gelatin (1%, w/v), and kieselsol (15%, v/v), and the turbidities of the clarified samples were determined. However, there were no significant differences in transmittance among the samples, so only enzymatic treatment was applied in blueberry juice processing (DDBJ, depectinized direct blueberry juice). An initial pressing trial made to evaluate press function showed that pressing fresh blueberries without any other processing did not yield adequate juice because of the lack of liquefaction. Therefore, the blueberries were mashed into small pieces (3 mm) before the enzymatic treatments and clarification trials were performed. Finally, both BJ samples were pasteurized, and compositional analyses, colour properties, polyphenols, and antioxidant properties were compared. Enzymatic treatment caused a significant reduction (30%) in anthocyanin (ACN) content in direct blueberry juice processing (DBJ), whereas a significant increase was observed in the mash treatment process (MTP).
Overall anthocyanin levels were higher in treated samples after each processing step in the MTP samples, and polyphenolic levels were slightly higher for both processes (DBJ and MTP); a reduction in ACNs and polyphenolics occurred only after pasteurization. The results show that both methods are suitable for obtaining fresh blueberry juice. However, since the anthocyanin content, phenolic content, antioxidant activity, and yield of the juice are higher with the MTP method than with the DBJ method across the processing stages examined, the MTP method should be preferred for processing blueberries into juice.

Keywords: anthocyanins, blueberry, depectinization, polyphenols

Procedia PDF Downloads 94
15647 Efficient Video Compression Technique Using Convolutional Neural Networks and Generative Adversarial Network

Authors: P. Karthick, K. Mahesh

Abstract:

Video has become an increasingly significant component of our everyday digital communication. With richer content and higher resolutions, its sheer volume poses serious obstacles to receiving, distributing, compressing, and displaying high-quality video content. In this paper, we propose an end-to-end deep video compression model that jointly optimizes all video compression components. The method involves splitting the video into frames, comparing the frames using convolutional neural networks (CNN) to remove duplicates, and replacing runs of duplicate frames with a single reference frame by recognizing and detecting minute changes using a generative adversarial network (GAN) tracked with long short-term memory (LSTM). Instead of the complete image, only the small changes generated using the GAN are substituted, which enables frame-level compression. Pixel-wise comparison is performed over each frame using K-nearest neighbours (KNN), the frames are clustered with K-means, and singular value decomposition (SVD) is applied to every frame in all three colour channels [Red, Green, Blue] to reduce the dimension of the utility matrix [R, G, B] by extracting its latent factors. Video frames are packed with parameters by a codec and converted back to video format, and the results are compared with the original video. Repeated experiments on several videos of different sizes, durations, frames per second (FPS), and quality levels demonstrate a significant resampling rate. On average, the result deviated by approximately 10% in quality and by more than 50% in size compared with the original video.
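The duplicate-frame screening step can be sketched with a plain pixel-difference test; this is a deliberate simplification for illustration (the paper compares frames with CNNs rather than raw pixel differences), using tiny hypothetical frames:

```python
# Simplified sketch (not the authors' CNN/GAN pipeline) of duplicate-
# frame screening: a frame whose mean absolute pixel difference from the
# previously kept frame falls below a threshold is treated as a
# duplicate and dropped. Frames here are flat lists of grayscale pixels.

def mean_abs_diff(a, b):
    return sum(abs(x - y) for x, y in zip(a, b)) / len(a)

def dedupe(frames, threshold=2.0):
    """Keep the first frame and every frame that changes noticeably."""
    kept = [frames[0]]
    for frame in frames[1:]:
        if mean_abs_diff(frame, kept[-1]) >= threshold:
            kept.append(frame)
    return kept

static = [10, 10, 10, 10]               # hypothetical 2x2 grayscale frame
moved  = [10, 10, 90, 90]               # same scene after motion
video  = [static, static, static, moved]
print(len(dedupe(video)))               # three near-identical frames collapse
```

In the full pipeline, the dropped frames would be reconstructed at decode time from the kept reference plus the GAN-generated change signal, which is where most of the reported size reduction comes from.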

Keywords: video compression, K-means clustering, convolutional neural network, generative adversarial network, singular value decomposition, pixel visualization, stochastic gradient descent, frame per second extraction, RGB channel extraction, self-detection and deciding system

Procedia PDF Downloads 187
15646 Environmental Cost and Benefits Analysis of Different Electricity Option: A Case Study of Kuwait

Authors: Mohammad Abotalib, Hamid Alhamadi

Abstract:

In Kuwait, electricity is generated from two primary sources: heavy fuel combustion and natural gas combustion. As Kuwait relies mainly on petroleum-based products for electricity generation, the environmental trade-offs of such operations should be carefully investigated. The life cycle assessment (LCA) tool is applied to identify the potential environmental impacts of electricity generation under three scenarios by considering the material flow through the various stages involved, such as raw-material extraction, transportation, operations, and waste disposal. The three scenarios represent current and future electricity grid mixes. The analysis targets six environmental impact categories per kWh of electricity generated: (1) global warming potential (GWP), (2) acidification potential (AP), (3) water depletion (WD), (4) eutrophication potential (EP), (5) human health particulate matter (HHPM), and (6) smog air (SA). Results indicate that one kWh of electricity generated has a GWP of (881-1030) g CO₂-eq, mainly from the fuel combustion process; water depletion of (0.07-0.1) m³, about 68% from cooling processes; AP of (15.3-17.9) g SO₂-eq; EP of (0.12-0.14) g N eq.; HHPM of (1.13-1.33) g PM₂.₅ eq.; and SA of (64.8-75.8) g O₃ eq. The variation in results depends on the scenario investigated. The analysis shows that introducing solar photovoltaic and wind power to the electricity grid mix improves the performance of scenarios 2 and 3, where 15% of the electricity coming from renewables corresponds to a further decrease in the LCA results.
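The way a grid-mix scenario maps to a per-kWh impact score can be sketched as a weighted sum of per-source emission factors; the shares and factors below are illustrative assumptions rather than the paper's inventory data (only the 881-1030 g CO₂-eq/kWh range above comes from the abstract):

```python
# Back-of-envelope sketch: a scenario's GWP per kWh as the share-weighted
# sum of per-source emission factors. All numbers are hypothetical.

def grid_gwp(mix, factors):
    """mix: {source: share of generation}, factors: {source: g CO2-eq/kWh}."""
    assert abs(sum(mix.values()) - 1.0) < 1e-9, "shares must sum to 1"
    return sum(share * factors[src] for src, share in mix.items())

factors = {"heavy_fuel": 1030.0, "natural_gas": 881.0,
           "solar_pv": 45.0, "wind": 12.0}   # hypothetical g CO2-eq/kWh

baseline  = {"heavy_fuel": 0.6, "natural_gas": 0.4}
renewable = {"heavy_fuel": 0.5, "natural_gas": 0.35,
             "solar_pv": 0.10, "wind": 0.05}  # 15% renewables

print(round(grid_gwp(baseline, factors), 1))
print(round(grid_gwp(renewable, factors), 1))
```

Even with modest renewable shares, the weighted sum drops noticeably because the renewable factors are one to two orders of magnitude below the combustion factors, which mirrors the improvement the abstract reports for scenarios 2 and 3.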

Keywords: energy, functional unit, global warming potential, life cycle assessment

Procedia PDF Downloads 135