Search results for: computer processing of large databases
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 12745

11995 The Output Fallacy: An Investigation into Input, Noticing, and Learners’ Mechanisms

Authors: Samantha Rix

Abstract:

The purpose of this research paper is to investigate the cognitive processing of learners who receive input but produce very little or no output, and who, when they do produce output, exhibit language proficiency similar to that of learners who produced output more regularly in the language classroom. Previous studies have investigated the benefits of output (with somewhat differing results); therefore, the presentation begins with an investigation of what may underlie gains in proficiency without output. A pilot study was then designed and conducted to gain insight into the cognitive processing of low-output language learners, looking, for example, at the quantity and quality of noticing. This was carried out within the paradigm of classroom action research, observing and interviewing low-output language learners in an intensive English program at a small Midwestern university. The results of the pilot study indicated that autonomy in language learning, specifically the use of strategies such as self-monitoring, self-talk, and thinking 'out loud', was crucial in the development of language proficiency for academic-level performance. The presentation concludes with an examination of pedagogical implications for classroom use in order to aid students in their language development.

Keywords: cognitive processing, language learners, language proficiency, learning strategies

Procedia PDF Downloads 475
11994 Impact of Radiation Usage on Anti-Nutritional Compounds (Antitrypsin and Phytic Acid) of Livestock and Poultry Foods

Authors: Mohammad Khosravi, Ali Kiani, Behroz Dastar, Parvin Showrang

Abstract:

A review was carried out on important anti-nutritional compounds of livestock and poultry foods and the effect of radiation usage on them. Nowadays, with advances in technology, different methods have been considered for the optimum usage of nutrients in livestock and poultry foods. Steaming, extruding, pelleting, and the use of chemicals are the most common and popular methods in food processing. The use of radiation in food processing research in the livestock and poultry industry is currently highly regarded. Ionizing (electron, gamma) and non-ionizing beams (microwave and infrared) are the rays most widely used in animal food processing. In recent research, these beams have been used to remove or reduce anti-nutritional factors and microbial contamination and to improve the digestibility of nutrients in poultry and livestock food. The evidence presented will help researchers to recognize techniques of relevance to them. Simplification of some of these techniques, especially in developing countries, must be addressed so that they can be used more widely.

Keywords: antitrypsin, gamma, anti-nutritional components, phytic acid, radiation

Procedia PDF Downloads 343
11993 Profiling Risky Code Using Machine Learning

Authors: Zunaira Zaman, David Bohannon

Abstract:

This study explores the application of machine learning (ML) for detecting security vulnerabilities in source code. The research aims to assist organizations with large application portfolios and limited security testing capabilities in prioritizing security activities. ML-based approaches offer benefits such as increased confidence scores, tuning of false positives and negatives, and automated feedback. The initial approach, using natural language processing techniques to extract features, achieved 86% accuracy during the training phase but suffered from overfitting and performed poorly on unseen datasets during testing. To address these issues, the study proposes using the abstract syntax tree (AST) for Java and C++ codebases to capture code semantics and structure and to generate path-context representations for each function. The Code2Vec model architecture is used to learn distributed representations of source code snippets for training a machine learning classifier for vulnerability prediction. The study evaluates the performance of the proposed methodology using two datasets and compares the results with existing approaches. The Devign dataset yielded 60% accuracy in predicting vulnerable code snippets and helped resist overfitting, while the Juliet Test Suite enabled prediction of specific vulnerabilities such as OS command injection, cryptographic, and cross-site scripting vulnerabilities. The Code2Vec model achieved 75% accuracy and a 98% recall rate in predicting OS command injection vulnerabilities. The study concludes that even partial AST representations of source code can be useful for vulnerability prediction. The approach has the potential for automated intelligent analysis of source code, including vulnerability prediction on unseen source code. State-of-the-art models using natural language processing techniques and CNN models with ensemble modelling techniques did not generalize well on unseen data and faced overfitting issues. However, predicting vulnerabilities in source code using machine learning poses challenges such as the high dimensionality and complexity of source code, imbalanced datasets, and identifying specific types of vulnerabilities. Future work will address these challenges and expand the scope of the research.
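
As a toy illustration of the path-context idea behind Code2Vec, the sketch below extracts simplified path contexts from a Python snippet using the standard library ast module. This is an assumption-laden stand-in: the study targets Java and C++ codebases, and a full pipeline trims each path at the lowest common ancestor and feeds the contexts to an attention-based embedding model.

```python
# Simplified path-context extraction; a sketch, not the paper's extractor.
import ast
from itertools import combinations

def leaf_paths(tree):
    """Collect root-to-leaf sequences of AST node type names."""
    paths = []
    def walk(node, prefix):
        children = list(ast.iter_child_nodes(node))
        label = type(node).__name__
        if not children:
            paths.append(prefix + [label])
        for child in children:
            walk(child, prefix + [label])
    walk(tree, [])
    return paths

def path_contexts(source):
    """Yield (leaf, connecting-path, leaf) triples for one snippet."""
    paths = leaf_paths(ast.parse(source))
    for a, b in combinations(paths, 2):
        # Simplification: join both root paths instead of trimming them
        # at the lowest common ancestor as the full method does.
        yield a[-1], "^".join(a[:-1]) + "_" + "v".join(b[:-1]), b[-1]

sample = "def f(x):\n    return x + 1\n"
for context in list(path_contexts(sample))[:3]:
    print(context)
```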

Keywords: code embeddings, neural networks, natural language processing, OS command injection, software security, code properties

Procedia PDF Downloads 107
11992 Application of an Optical Method Based on a Laser Device as Non-Destructive Testing for Calculation of Mechanical Deformation

Authors: R. Daïra, V. Chalvidan

Abstract:

We present the speckle interferometry method to determine the deformation of a piece. This method of holographic imaging uses a CCD camera for simultaneous digital recording of two states, object and reference. The reconstruction is obtained numerically. This method has the advantage of being simpler than the methods currently available, and it does not suffer from the faults of in-line holographic configurations. Furthermore, it is entirely digital and avoids heavy analysis after recording the hologram. This work was carried out in the HOLO 3 laboratory (optical metrology laboratory in Saint-Louis, France), and it consists in controlling qualitatively and quantitatively the deformation of an object by using a CCD camera connected to a computer equipped with fringe analysis software.
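
For readers unfamiliar with the technique, the sketch below shows two of its basic building blocks in numpy: subtraction-mode correlation fringes from the two recorded states, and the standard four-step phase-shifting formula used by fringe-analysis software. The toy intensities are illustrative assumptions, not HOLO 3 data.

```python
import numpy as np

def correlation_fringes(reference, deformed):
    """Fringes appear where the surface moved between the two states."""
    return np.abs(deformed.astype(float) - reference.astype(float))

def four_step_phase(i1, i2, i3, i4):
    """Standard four-step phase-shifting formula (pi/2 steps)."""
    return np.arctan2(i4 - i2, i1 - i3)

# Recover a known toy phase from four phase-shifted intensity samples:
phi = 0.3
frames = [1 + np.cos(phi + k * np.pi / 2) for k in range(4)]
print(four_step_phase(*frames))          # ~0.3 rad

# Toy speckle frames standing in for the two recorded CCD states:
ref = np.random.default_rng(0).random((64, 64))
print(correlation_fringes(ref, np.roll(ref, 1, axis=0)).mean())
```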

Keywords: speckle, nondestructive testing, interferometry, image processing

Procedia PDF Downloads 497
11991 Development of Pasta Production by Using Hard and Soft Domestic Sorts of Wheat

Authors: A.N. Zhilkaidarov, G.K. Iskakova, V.Y. Chernyh

Abstract:

High-quality, inexpensive products of daily use are in great demand on the food market. This is especially true of an independent and irreplaceable product such as pasta. Pasta is a product of preserved dough made from wheat flour through a special milling process. A wide assortment and pleasant taste properties allow pasta products to be used in very different combinations with other food products. The pasta industry of Kazakhstan has large prospects for development. There are many premises for this, first among them the importance of pasta as a social product. Due to its nutritional and energy value, pasta is part of the essential diet. Besides that, pasta production in Kazakhstan has a traditional basis, and nowadays the market for this product is developing rapidly in both quantity and quality. Moreover, one of the advantages of this branch is economic: pasta is a product of secondary processing, and therefore its selling price is much higher than its production cost.

Keywords: pasta, new wheat sorts, domestic sorts of wheat, macaroni flour

Procedia PDF Downloads 526
11990 Implications of Learning Resource Centre in a Web Environment

Authors: Darshana Lal, Sonu Rana

Abstract:

Learning Resource Centers (LRCs) acquire different kinds of documents, such as books, journals, theses, dissertations, standards, databases, etc., in print and electronic form. This article deals with the different types of sources available in an LRC. It also discusses the concept of the web as a tool and as a multimedia system, and the different interfaces available on the web. The reasons for establishing LRCs are highlighted along with the assignments of an LRC. Different features of LRCs, such as self-learning and group learning, are described, together with the activities they support, such as reading, learning, and education. The use of LRCs by students and faculty is described, and the benefits are summarized.

Keywords: internet, search engine, resource centre, OPAC, self-learning, group learning

Procedia PDF Downloads 378
11989 Extracting Actions with Improved Part of Speech Tagging for Social Networking Texts

Authors: Yassine Jamoussi, Ameni Youssfi, Henda Ben Ghezala

Abstract:

With the growing interest in social networking, the interaction of social actors has evolved into a source of knowledge in which it becomes possible to perform context-aware reasoning. Information extraction from social networks, especially Twitter and Facebook, is one of the problems in this area. To extract text from social networks, we need several lexical features and large-scale word clustering. We attempt to expand an existing tokenizer and to develop our own tagger in order to support the nonstandard words common on Facebook and Twitter. Our goal in this work is to benefit from the lexical features developed for Twitter and online conversational text in previous works, and to develop an extraction model for constructing a large knowledge base of actions.
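
The sketch below shows the kind of pipeline described, with NLTK's TweetTokenizer and perceptron tagger standing in for the expanded tokenizer and custom tagger developed in the paper; the sample post and the resource name are illustrative assumptions (newer NLTK releases name the tagger resource differently).

```python
import nltk
from nltk.tokenize import TweetTokenizer

nltk.download("averaged_perceptron_tagger", quiet=True)

tokenizer = TweetTokenizer(preserve_case=False, reduce_len=True)
post = "loool @sam we r meeting 2moro at 5pm #excited http://t.co/xyz"
tokens = tokenizer.tokenize(post)   # keeps mentions, hashtags, URLs intact
print(nltk.pos_tag(tokens))         # noisy tokens get (often wrong) tags,
                                    # which is what a custom tagger must fix
```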

Keywords: social networking, information extraction, part-of-speech tagging, natural language processing

Procedia PDF Downloads 305
11988 Computer Network Applications, Practical Implementations and Structural Control System Representations

Authors: El Miloudi Djelloul

Abstract:

Computer networks play an important role in the practical implementation of different systems. To implement a system in a network, it is above all necessary to know all the configurations that are part of the system, and to give adequate information and solutions in real time. So, to implement such a system, for example in a school or a relevant institution, the first step is to analyze the types of models that need to be configured, and another important step is to organize the work in terms of devices as parts of the general system. Often, an important point before configuration is the description and documentation of all the work in the respective process, organized from a problem-solving perspective. The computer network, as critical infrastructure, is very specific, so the paper presents effective solutions from a structural point of view on one side, and on the other side it reflects the positive aspects of modeling and block-schema presentations as a better alternative for solving specific problems caused by continual distortions of the system from the side of devices, programs, and signals, or packet collisions, which move from one computer node to other nodes.

Keywords: local area networks, LANs, block schema presentations, computer network system, computer node, critical infrastructure, packet collisions, structural control system representations, computer network, implementations, modeling structural representations, companies, computers, context, control systems, internet, software

Procedia PDF Downloads 365
11987 Wasteless Solid-Phase Method for Conversion of Iron Ores Contaminated with Silicon and Phosphorus Compounds

Authors: A. V. Panko, E. V. Ablets, I. G. Kovzun, M. A. Ilyashov

Abstract:

Based on a generalized analysis of modern know-how in the sphere of processing, concentration, and purification of iron-ore raw materials (IORM), in particular the most widespread ferrioxide-silicate materials (FOSM) containing impurities of phosphorus and other elements' compounds, the special role of nanotechnological initiatives in improving such processes is noted. Ideas on the role of nanoparticles in the carbonization of FOSM with subsequent direct reduction of the ferric oxides contained in them to the metal phase, as well as in the alkali treatment and separation of powdered iron from phosphorus compounds, are considered. Using the obtained results, a wasteless solid-phase method for the processing, concentration, and purification of IORM and FOSM from compounds of phosphorus, silicon, and other impurities was developed, excelling known methods of direct iron reduction from iron ores and metallurgical slimes.

Keywords: iron ores, solid-phase reduction, nanoparticles in reduction and purification of iron from silicon and phosphorus, wasteless method of ores processing

Procedia PDF Downloads 488
11986 Genomic Sequence Representation Learning: An Analysis of K-Mer Vector Embedding Dimensionality

Authors: James Jr. Mashiyane, Risuna Nkolele, Stephanie J. Müller, Gciniwe S. Dlamini, Rebone L. Meraba, Darlington S. Mapiye

Abstract:

When performing language tasks in natural language processing (NLP), the dimensionality of word embeddings is chosen either ad hoc or by optimizing the Pairwise Inner Product (PIP) loss. The PIP loss is a metric that measures the dissimilarity between word embeddings, and it is obtained through matrix perturbation theory by utilizing the unitary invariance of word embeddings. In genomics, especially in genome sequence processing, unlike in natural language processing, there is no notion of a “word”; rather, there are sequence substrings of length k called k-mers. K-mer sizes matter, and they vary depending on the goal of the task at hand. The dimensionality of word embeddings in NLP has been studied using matrix perturbation theory and the PIP loss. In this paper, the sufficiency and reliability of applying word-embedding algorithms to various genomic sequence datasets are investigated to understand the relationship between the k-mer size and the embedding dimension. This is done by studying the scaling capability of three embedding algorithms, namely Latent Semantic Analysis (LSA), Word2Vec, and Global Vectors (GloVe), with respect to the k-mer size. Utilising the PIP loss as a metric to train embeddings on different datasets, we also show that Word2Vec outperforms LSA and GloVe in accurately computing embeddings as both the k-mer size and the vocabulary increase. Finally, the shortcomings of natural language processing embedding algorithms in performing genomic tasks are discussed.
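
As a concrete illustration of the k-mer embedding setup, the sketch below tokenizes toy reads into overlapping k-mers and trains a gensim Word2Vec model on them; the sequences, k, and hyperparameters are illustrative assumptions rather than the paper's settings.

```python
from gensim.models import Word2Vec

def kmers(seq, k):
    """Slide a window of size k over a genomic sequence."""
    return [seq[i:i + k] for i in range(len(seq) - k + 1)]

sequences = ["ATGCGTACGTTAGC", "GGCATCGTAGCTAG"]    # toy reads
corpus = [kmers(s, k=4) for s in sequences]         # read -> k-mer "sentence"

model = Word2Vec(corpus, vector_size=16, window=5, min_count=1, sg=1)
print(model.wv["ATGC"][:4])                         # embedding of one 4-mer
```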

Keywords: word embeddings, k-mer embedding, dimensionality reduction

Procedia PDF Downloads 138
11985 GIS Database Creation for Impacts of Domestic Wastewater Disposal on Bida Town, Niger State, Nigeria

Authors: Ejiobih Hyginus Chidozie

Abstract:

A Geographic Information System (GIS) is a configuration of computer hardware and software specifically designed to effectively capture, store, update, manipulate, analyse, and display all forms of spatially referenced information. The GIS database is referred to as the heart of a GIS. It holds location data, attribute data, and the spatial relationships between the objects and their attributes. Sewage and wastewater management have assumed increased importance lately as a result of general concern expressed worldwide about the problems of pollution of the environment: contamination of the atmosphere, rivers, lakes, oceans, and groundwater. In this research, a GIS database was created to study the impacts of domestic wastewater disposal methods on Bida town, Niger State, as a model for investigating similar impacts on other cities in Nigeria. Results from a GIS database are very useful to decision makers and researchers. Bida town was subdivided into four regions, eight zones, and 24 sectors based on the prevailing natural morphology of the town. A GPS receiver and a structured questionnaire were used to collect information and attribute data from 240 households in the study area. Domestic wastewater samples were collected from the twenty-four sectors of the study area for laboratory analysis. ArcView 3.2a GIS software was used to create the GIS databases for the ecological, health, and socioeconomic impacts of domestic wastewater disposal methods in Bida town.
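
To make the "heart of GIS" description concrete, here is a minimal sketch of a GIS table coupling geometry with attributes, using geopandas; the coordinates and attributes are invented illustrations, not the Bida survey data.

```python
import geopandas as gpd
import pandas as pd
from shapely.geometry import Point

# Each record couples location data (geometry) with attribute data.
households = pd.DataFrame({
    "sector": ["A1", "A2", "B1"],
    "disposal_method": ["soakaway", "open drain", "septic tank"],
    "lon": [6.01, 6.02, 6.03],
    "lat": [9.08, 9.09, 9.10],
})
gdf = gpd.GeoDataFrame(
    households,
    geometry=[Point(xy) for xy in zip(households.lon, households.lat)],
    crs="EPSG:4326",
)
# Attribute queries and spatial relationships can then be combined:
print(gdf[gdf.disposal_method == "open drain"].geometry)
```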

Keywords: environment, GIS, pollution, software, wastewater

Procedia PDF Downloads 421
11984 Cost Effective Real-Time Image Processing Based Optical Mark Reader

Authors: Amit Kumar, Himanshu Singal, Arnav Bhavsar

Abstract:

In this modern era of automation, most academic and competitive exams use Multiple Choice Questions (MCQs). The responses to these MCQ-based exams are recorded on Optical Mark Reader (OMR) sheets. Evaluation of OMR sheets requires separate specialized machines for scanning and marking. The sheets used by these machines are special and cost more than a normal sheet. The available process is uneconomical and dependent on paper thickness, scanning quality, paper orientation, special hardware, and customized software. This study tries to tackle the problem of evaluating OMR sheets without any special hardware, making the whole process economical. We propose an image processing based algorithm which can be used to read and evaluate scanned OMR sheets with no special hardware required. It eliminates the use of special OMR sheets: responses recorded on a normal sheet are enough for evaluation. The proposed system takes care of color, brightness, rotation, and small imperfections in the OMR sheet images.
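
The two detection steps named in the keywords, binary thresholding and the Hough circle transform, can be sketched with OpenCV as below; the file name, parameter values, and the fill heuristic are illustrative assumptions rather than the authors' calibrated pipeline.

```python
import cv2
import numpy as np

img = cv2.imread("omr_sheet.png", cv2.IMREAD_GRAYSCALE)
img = cv2.medianBlur(img, 5)

# Binarize so filled bubbles stand out from the paper background.
_, binary = cv2.threshold(img, 0, 255, cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)

# Detect candidate answer bubbles as circles.
circles = cv2.HoughCircles(img, cv2.HOUGH_GRADIENT, dp=1, minDist=20,
                           param1=50, param2=30, minRadius=8, maxRadius=20)

if circles is not None:
    for x, y, r in np.round(circles[0]).astype(int):
        roi = binary[max(y - r, 0):y + r, max(x - r, 0):x + r]
        # A bubble counts as marked if most pixels inside it are dark.
        print((x, y), "marked" if roi.mean() > 127 else "blank")
```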

Keywords: OMR, image processing, Hough circle transform, interpolation, detection, binary thresholding

Procedia PDF Downloads 173
11983 Treating Voxels as Words: Word-to-Vector Methods for fMRI Meta-Analyses

Authors: Matthew Baucum

Abstract:

With the increasing popularity of fMRI as an experimental method, psychology and neuroscience can greatly benefit from advanced techniques for summarizing and synthesizing large amounts of data from brain imaging studies. One promising avenue is automated meta-analyses, in which natural language processing methods are used to identify the brain regions consistently associated with certain semantic concepts (e.g. “social”, “reward”) across large corpora of studies. This study builds on this approach by demonstrating how, in fMRI meta-analyses, individual voxels can be treated as vectors in a semantic space and evaluated for their “proximity” to terms of interest. In this technique, a low-dimensional semantic space is built from brain imaging study texts, allowing words in each text to be represented as vectors (where words that frequently appear together are near each other in the semantic space). Consequently, each voxel in a brain mask can be represented as a normalized vector sum of all of the words in the studies that showed activation in that voxel. The entire brain mask can then be visualized in terms of each voxel’s proximity to a given term of interest (e.g., “vision”, “decision making”) or collection of terms (e.g., “theory of mind”, “social”, “agent”), as measured by the cosine similarity between the voxel’s vector and the term vector (or the average of multiple term vectors). Analysis can also proceed in the opposite direction, allowing word cloud visualizations of the nearest semantic neighbors for a given brain region. This approach allows for continuous, fine-grained metrics of voxel-term associations, and relies on state-of-the-art “open vocabulary” methods that go beyond mere word counts. An analysis of over 11,000 neuroimaging studies from an existing meta-analytic fMRI database demonstrates that this technique can be used to recover known neural bases for multiple psychological functions, suggesting this method’s utility for efficient, high-level meta-analyses of localized brain function. While automated text analytic methods are no replacement for deliberate, manual meta-analyses, they seem to show promise for the efficient aggregation of large bodies of scientific knowledge, at least on a relatively general level.
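
To make the voxel-as-vector construction concrete, the numpy sketch below scores one voxel against a term by cosine similarity; the random 50-dimensional embeddings and word lists are toy assumptions standing in for the semantic space learned from study texts.

```python
import numpy as np

rng = np.random.default_rng(0)
vocab = ["social", "reward", "vision", "decision", "face"]
emb = {w: rng.normal(size=50) for w in vocab}      # toy semantic space

def normalize(v):
    return v / np.linalg.norm(v)

# Words from all studies reporting activation in one voxel:
voxel_words = ["social", "face", "social", "reward"]
voxel_vec = normalize(sum(emb[w] for w in voxel_words))

term_vec = normalize(emb["social"])
print("cosine similarity:", float(voxel_vec @ term_vec))
```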

Keywords: fMRI, machine learning, meta-analysis, text analysis

Procedia PDF Downloads 449
11982 Mixotrophic Growth of Chlorella sp. on Raw Food Processing Industrial Wastewater: Effect of COD Tolerance

Authors: Suvidha Gupta, R. A. Pandey, Sanjay Pawar

Abstract:

The effluents from various food processing industries are found to have high BOD, COD, suspended solids, nitrate, and phosphate. Mixotrophic growth of microalgae using food processing industrial wastewater as an organic carbon source has emerged as a more effective and energy-efficient means of nutrient removal and COD reduction. The present study details the treatment of non-sterilized, unfiltered food processing industrial wastewater by microalgae for nutrient removal, as well as the determination of COD tolerance using different dilutions of the wastewater. In addition, the effect of different inoculum percentages of microalgae on the removal efficiency of nutrients at a given dilution has been studied. To see the effect of dilution and COD tolerance, the wastewater, having an initial COD of 5000 mg/L (±5), nitrate of 28 mg/L (±10), and phosphate of 24 mg/L (±10), was diluted to obtain CODs of 3000 mg/L and 1000 mg/L. The experiments were carried out in 1 L conical flasks under intermittent aeration with different inoculum percentages, i.e., 10%, 20%, and 30%, of Chlorella sp. isolated from an area near NEERI, Nagpur. The experiments were conducted for 6 days with a 12:12 light-dark period, and various parameters such as COD, TOC, NO3--N, PO4--P, and total solids were determined on a daily basis. Results revealed that, for 10% and 20% inoculum, over 90% COD and TOC reduction was obtained with wastewater containing a COD of 3000 mg/L, whereas over 80% COD and TOC reduction was obtained with wastewater containing a COD of 1000 mg/L. Moreover, the microalgae were found to tolerate wastewater containing a COD of 5000 mg/L, with over 60% and 80% reduction in COD and TOC, respectively. Similar results were obtained with 10% and 20% inoculum at all COD dilutions, whereas for 30% inoculum over 60% COD and 70% TOC reduction was obtained. In the case of nutrient removal, over 70% nitrate removal and 45% phosphate removal were obtained with 20% inoculum at all dilutions. The obtained results indicated that microalgae-assisted nutrient removal gives maximum COD and TOC reduction at 3000 mg/L COD and 20% inoculum. Hence, microalgae-assisted wastewater treatment is not only effective for the removal of nutrients but can also tolerate COD up to 5000 mg/L and high solid content.

Keywords: Chlorella sp., chemical oxygen demand, food processing industrial wastewater, mixotrophic growth

Procedia PDF Downloads 333
11981 A Case Study of Open Source Development Practices within a Large Company Setting

Authors: Alma Orucevic-Alagic, Martin Höst

Abstract:

Open source communities have demonstrated that complex, enterprise-grade software can be produced, supported, and maintained by self-organizing groups of developers using a primarily electronic form of communication. Due to the inherent nature of open source development, a specific set of open source software development practices has evolved. While there is ongoing research on the applicability of open source development practices within a company setting, still little is known about their benefits and challenges. The objective of this research is to understand if, and to what degree, open source development practices observed within a mature open source community are aligned with development practices within a large software and hardware company setting. For the purpose of this case study, a set of open source development practices that are present in a mature open source community was identified. Then, the development practices of a large, international hardware and software company based in Sweden were assessed and compared to the identified open source community practices. It is shown that there are many similarities between a mature open source community and a large company setting in regard to software development practices. We also identify practices that exist in open source communities but are not standard within a company setting, and whose implementation can result in improved software development efficiency within the company setting.

Keywords: development practices, open source software, innersource, closed open source

Procedia PDF Downloads 558
11980 Tool Condition Monitoring of Ceramic Inserted Tools in High Speed Machining through Image Processing

Authors: Javier A. Dominguez Caballero, Graeme A. Manson, Matthew B. Marshall

Abstract:

Cutting tools with ceramic inserts are often used in the machining of many types of superalloy, mainly due to their high strength and thermal resistance. Nevertheless, during the cutting process, the plastic flow wear generated in these inserts enhances and propagates cracks due to high temperature and high mechanical stress. This leads to highly variable failure of the cutting tool. This article explores the relationship between the continuous wear that ceramic SiAlON (solid solutions based on the Si3N4 structure) inserts experience during a high-speed machining process and the evolution of the sparks created during the same process. These sparks were analysed through pictures of the cutting process recorded using an SLR camera. Features relating to the intensity and area of the cutting sparks were extracted from the individual pictures using image processing techniques. These features were then related to the ceramic insert's crater wear area.
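
The spark-feature extraction step can be sketched as below: threshold the bright spark region of a frame and measure its area and summed intensity. The file name and threshold are illustrative assumptions, not the calibration used in the study.

```python
import cv2
import numpy as np

frame = cv2.imread("cutting_frame.jpg", cv2.IMREAD_GRAYSCALE)

# Sparks are far brighter than the tool and workpiece background.
_, mask = cv2.threshold(frame, 200, 255, cv2.THRESH_BINARY)

spark_area = int(np.count_nonzero(mask))        # pixels above threshold
spark_intensity = float(frame[mask > 0].sum())  # summed brightness

print(spark_area, spark_intensity)  # features to correlate with crater wear
```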

Keywords: ceramic cutting tools, high speed machining, image processing, tool condition monitoring, tool wear

Procedia PDF Downloads 298
11979 Partial Privatization, Control Rights of Large Shareholders and Privatized Shares Transfer: Evidence from Chinese State-Owned Listed Companies

Authors: Tingting Zhou

Abstract:

The partial privatization of state-owned enterprises (SOEs) is a dynamic process. The main features of this process lie not only in gradual and sequential privatizations but also in privatized shares transfer. For partially privatized SOEs, the introduction of private sector ownership is not the end of the story, because the previously introduced private owners may choose to leave the SOEs by transferring the privatized shares after privatization, a process that is called "privatized shares transfer". This paper investigates the determinants of privatized shares transfer from the perspective of large shareholders' control rights. The results capture the fact that higher control rights of large shareholders lead to more privatized shares transfer. After exploring the impacts of excessive control rights, the results provide evidence supporting the idea that firms with excessive numbers of directors, senior managers, or supervisors who also hold positions in the largest controlling shareholder's entity are more likely to transfer privatized shares owned by private owners. In addition, the largest shareholders' ownership also plays a role in privatized shares transfer. This evidence suggests that large shareholders' control rights should be limited to an appropriate range during the process of privatization, thereby giving private shareholders more opportunity to participate in the operation of firms, strengthening the state, and enhancing the competitiveness of state capital.

Keywords: control rights of large shareholders, partial privatization, privatized shares transfer, state-owned listed companies

Procedia PDF Downloads 284
11978 Optimizing Machine Learning Through Python Based Image Processing Techniques

Authors: Srinidhi. A, Naveed Ahmed, Twinkle Hareendran, Vriksha Prakash

Abstract:

This work reviews some of the advanced image processing techniques for deep learning applications. Object detection by template matching, image denoising, edge detection, and super-resolution modelling are but a few of the tasks covered. The paper looks into these in great detail, given that such tasks are crucial preprocessing steps that increase the quality and usability of image datasets in subsequent deep learning tasks. We review some of the methods for the assessment of image quality, more specifically sharpness, which is crucial to ensure robust model performance. Further, we discuss the development of deep learning models specific to facial emotion detection, age classification, and gender classification, showing how the preprocessing techniques are interrelated with model performance. Conclusions from this study pinpoint the best practices in the preparation of image datasets, targeting the best trade-off between computational efficiency and the retention of important image features critical for effective training of deep learning models.
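
Two of the reviewed building blocks can be sketched in a few lines of OpenCV: template matching for object detection and a variance-of-Laplacian sharpness score for image-quality assessment. File names and the sharpness cutoff are illustrative assumptions.

```python
import cv2

img = cv2.imread("scene.png", cv2.IMREAD_GRAYSCALE)
template = cv2.imread("object.png", cv2.IMREAD_GRAYSCALE)

# Normalized cross-correlation; the best match sits at the maximum response.
res = cv2.matchTemplate(img, template, cv2.TM_CCOEFF_NORMED)
_, max_val, _, max_loc = cv2.minMaxLoc(res)
print("match score:", max_val, "at", max_loc)

# Sharpness check: blurry images have low Laplacian variance.
sharpness = cv2.Laplacian(img, cv2.CV_64F).var()
print("sharp enough for training:", sharpness > 100.0)
```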

Keywords: image processing, machine learning applications, template matching, emotion detection

Procedia PDF Downloads 17
11977 Models and Metamodels for Computer-Assisted Natural Language Grammar Learning

Authors: Evgeny Pyshkin, Maxim Mozgovoy, Vladislav Volkov

Abstract:

The paper follows a discourse on computer-assisted language learning. We examine problems of foreign language teaching and learning and introduce a metamodel that can be used to define learning models of language grammar structures in order to support teacher/student interaction. Special attention is paid to the concept of a virtual language lab. Our approach to language education aims to encourage learners to experiment with a language and to learn by discovering patterns of grammatically correct structures created and managed by a language expert.

Keywords: computer-assisted instruction, language learning, natural language grammar models, HCI

Procedia PDF Downloads 519
11976 High-Temperature Behavior of Boiler Steel by Friction Stir Processing

Authors: Supreet Singh, Manpreet Kaur, Manoj Kumar

Abstract:

High-temperature corrosion is a severe material degradation mode experienced in thermal power plants and other energy generation sectors. Metallic materials such as ferritic steels have desirable properties such as easy fabrication and machinability and low cost, but a serious drawback of these materials is the deterioration of properties arising from interaction with the environment. These metallic materials do not endure high temperatures for extended periods of time because of their poor corrosion resistance. Friction Stir Processing (FSP) has emerged as a potent means of surface modification and microstructure control in the thermomechanically affected zones of various metal alloys. In the current research work, FSP was performed on boiler tube of SA 210 Grade A1 material, which is regularly used in thermal power plants. The aim was to strengthen SA 210 Grade A1 boiler steel through microstructural refinement by FSP and to analyze its effect on high-temperature corrosion behavior. The high-temperature corrosion performance of the unprocessed and FSPed specimens was evaluated in the laboratory using a molten salt environment of Na₂SO₄-82%Fe₂(SO₄)₃. The unprocessed and FSPed low carbon steel Grade A1 were evaluated in terms of microstructure, corrosion resistance, and mechanical properties such as hardness and tensile strength. In-depth characterization was done by EBSD, SEM/EDS, and X-ray mapping analyses with an aim to propose the mechanism behind the high-temperature corrosion behavior of the FSPed steel.

Keywords: boiler steel, characterization, corrosion, EBSD/SEM/EDS/XRD, friction stir processing

Procedia PDF Downloads 238
11975 Reduction of Residual Stress by Variothermal Processing and Validation via Birefringence Measurement Technique on Injection Molded Polycarbonate Samples

Authors: Christoph Lohr, Hanna Wund, Peter Elsner, Kay André Weidenmann

Abstract:

Injection molding is one of the most commonly used techniques in industrial polymer processing. In the conventional injection molding process, the liquid polymer is injected into the cavity of the mold, where the polymer immediately starts hardening at the cooled walls. To compensate for the shrinkage, which is caused predominantly by the immediate cooling, holding pressure is applied. Throughout this process, residual stresses are produced by the temperature difference between the polymer melt and the injection mold and by the relocation of the polymer chains, which were oriented by the high process pressures and injection speeds. These residual stresses often weaken or change the structural behavior of the parts or lead to deformation of components. One solution for reducing the residual stresses is variothermal processing. Here the mold is heated, i.e. near or above the glass transition temperature of the polymer, the polymer is injected, and before opening the mold and ejecting the part, the mold is cooled. For the next cycle, the mold is heated again and the procedure repeats. The rapid heating and cooling of the mold are realized indirectly by convection of heated and cooled liquid (here: water), which is pumped through fluid channels underneath the mold surface. In this paper, the influences of variothermal processing on the residual stresses are analyzed with samples at a larger scale (500 mm x 250 mm x 4 mm). In addition, the influence of functional elements, such as abrupt changes in wall thickness, bosses, and ribs, on the residual stress is examined. For this purpose, polycarbonate samples were produced by variothermal and isothermal processing. The melt is injected into a heated mold, which in our case has a temperature varying between 70 °C and 160 °C. After the filling of the cavity, the closed mold is cooled down, varying from 70 °C to 100 °C. The pressure and temperature inside the mold are monitored and evaluated with cavity sensors. The residual stresses of the produced samples are visualized by birefringence, which exploits the effect of stress on the refractive index of the polymer. The colorful spectrum can be revealed by placing the sample between a polarized light source and a second polarization filter. To show the effect of variothermal processing on the reduction of residual stress, the birefringence images of the isothermally and variothermally produced samples are compared and evaluated. In this comparison, the variothermally produced samples show fewer maxima of each color spectrum than the isothermally produced samples, which leads to the conclusion that the residual stress of the variothermally produced samples is lower.

Keywords: birefringence, injection molding, polycarbonate, residual stress, variothermal processing

Procedia PDF Downloads 283
11974 Understanding the Heart of the Matter: A Pedagogical Framework for Apprehending Successful Second Language Development

Authors: Cinthya Olivares Garita

Abstract:

Untangling language processing in second language development has been either a taken-for-granted and overlooked task for some English language teaching (ELT) instructors or a considerable feat for others. From the most traditional language instruction to the most communicative methodologies, how to assist L2 learners in processing language in the classroom has become a challenging matter in second language teaching. Amidst an ample array of methods, strategies, and techniques for teaching a target language, finding a suitable model to lead learners to process, interpret, and negotiate meaning to communicate in a second language has imposed a great responsibility on language teachers; committed teachers are those who are aware of their role in equipping learners with the appropriate tools to communicate in the target language in a 21st-century society. Unfortunately, one might find some English language teachers convinced that their job is only to lecture students; others are advocates of textbook-based instruction that might hinder second language processing, and just a few might courageously struggle to facilitate second language learning effectively. Grounded in the most representative empirical studies on comprehensible input, processing instruction, and focus on form, this analysis aims to facilitate the understanding of how second language learners process and automatize input and proposes a pedagogical framework for the successful development of a second language. In light of this, this paper is structured to tackle noticing and attention and structured input as the heart of processing instruction, comprehensible input as the missing link in second language learning, and form-meaning connections as opposed to traditional grammar approaches to language teaching. The author finishes by suggesting a pedagogical framework involving noticing, attention, comprehensible input, and form (NACIF, based on the acronym) to support ELT instructors, teachers, and scholars in the challenging task of facilitating the understanding of effective second language development.

Keywords: second language development, pedagogical framework, noticing, attention, comprehensible input, form

Procedia PDF Downloads 30
11973 Effectiveness of the Bundle Care to Relieve the Thirst for Intensive Care Unit Patients: Meta-Analysis

Authors: Wen Hsin Hsu, Pin Lin

Abstract:

Objective: Thirst discomfort is the most common yet often overlooked symptom in patients in the intensive care unit (ICU), with an incidence rate of 69.8%. If not properly cared for, it can easily lead to irritability, affect sleep quality, and increase the incidence of delirium, thereby extending the length of hospital stay. Research points out that the sensation of coldness is an effective strategy to alleviate thirst. Using a bundled care approach for thirst can prolong the sensation of coldness in the mouth and reduce thirst discomfort; its effectiveness therefore needs to be further analyzed and reviewed. Methods: This study used systematic literature review and meta-analysis methodologies and searched databases including PubMed, MEDLINE, EMBASE, Cochrane, CINAHL, and two Chinese databases (CEPS and CJTD) using keywords. JBI criteria were used to appraise the quality of the literature. The RevMan 5.4 software package was used, and a fixed-effect model was applied for data analysis. We selected experimental articles, in English and Chinese, that met the inclusion and exclusion criteria. Three research articles were included in total, with a sample size of 416 people: two were randomized controlled trials, and one was a quasi-experimental design. Results: The results show that bundled care for thirst, which includes ice water spray or oral swab wipes, menthol mouthwash, and lip balm, can significantly relieve thirst intensity, MD=-1.36 (3 studies, 95% CI (-1.77, -0.95), p<0.001), and thirst distress, MD=-0.71 (2 studies, 95% CI (-1.32, -0.10), p=0.02). Therefore, it is recommended that medical staff identify high-risk groups for thirst early on. Implications for Practice: For patients who cannot eat orally, providing bundled care for thirst can increase oral comfort and improve the quality of care.

Keywords: thirst bundle care, intensive care units, meta-analysis, ice water spray, menthol

Procedia PDF Downloads 77
11972 Nano-Enhanced In-Situ and Field Up-Gradation of Heavy Oil

Authors: Devesh Motwani, Ranjana S. Baruah

Abstract:

The prime incentive behind the upgradation of heavy oil is to increase its API gravity for ease of transportation to refineries, thus expanding the market access of bitumen-based crude. There has always been a demand for an integrated approach that aims at simplifying the upgrading scheme, making it adaptable to the production site in terms of economics, environment, and personnel safety. Recent advances in nanotechnology have facilitated the development of two lines of heavy oil upgrading processes that make use of nano-catalysts for producing upgraded oil: in-situ upgrading and field upgrading. The in-situ upgrading scheme makes use of the Hot Fluid Injection (HFI) technique, where heavy fractions separated from the produced oil are injected into the formations to reintroduce heat into the reservoir along with suspended nano-catalysts and hydrogen. In the presence of hydrogen, catalytic exothermic hydro-processing reactions occur that produce light gases and volatile hydrocarbons, which contribute to increased oil detachment from the rock, resulting in enhanced recovery. In this way, the process combines enhanced heavy oil recovery with upgradation, effectively handling the heat load within the reservoirs, reducing hydrocarbon waste generation, and minimizing the need for diluents. By eliminating most of the residual oil, the synthetic crude oil (SCO) is much easier to transport and more amenable to processing in refineries. For heavy oil reservoirs seriously impacted by the presence of aquifers, the nano-catalytic technology can still be implemented in the field, though with some additional investment and reduced synergies; however, it still significantly serves the purpose of producing transportable oil, with substantial benefits with respect to both large-scale upgrading and known commercial field upgrading technologies currently on the market. The paper aims to delve deeper into the technology discussed and its future compatibility.

Keywords: upgrading, synthetic crude oil, nano-catalytic technology, compatibility

Procedia PDF Downloads 408
11971 Image Processing of Scanning Electron Microscope Micrograph of Ferrite and Pearlite Steel for Recognition of Micro-Constituents

Authors: Subir Gupta, Subhas Ganguly

Abstract:

In this paper, we demonstrate a new area of application of image processing to metallurgical images, creating more opportunities for structure-property-correlation-based approaches to alloy design. The present exercise focuses on the development of image processing tools suitable for phase segmentation, grain boundary detection, and recognition of micro-constituents in SEM micrographs of ferrite and pearlite steels. A comprehensive set of micrographs has been experimentally developed, encompassing variation of the ferrite and pearlite volume fractions and taking images at different magnifications (500X, 1000X, 1500X, 2000X, 3000X and 5000X) under a scanning electron microscope. The variation in volume fraction has been achieved using four different plain carbon steels containing 0.1, 0.22, 0.35 and 0.48 wt% C, heat treated under annealing and normalizing treatments. The obtained pool of micrographs was arbitrarily divided into two parts for developing training and testing sets of micrographs. The statistical recognition features for the ferrite and pearlite constituents have been developed by learning from the training set of micrographs. The obtained features for microstructure pattern recognition are then applied to the test set of micrographs. The analysis of the results shows that the developed strategy can successfully detect the micro-constituents across the wide range of magnifications and volume fractions of the constituents in the structure, with an accuracy of about +/- 5%.
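
As a sketch of the phase-segmentation step, Otsu binarization separates the two phase-intensity populations of a ferrite-pearlite micrograph, and a pixel count estimates the volume fraction. The file name is an illustrative assumption, and which phase maps to which gray level depends on the imaging contrast.

```python
import cv2
import numpy as np

micrograph = cv2.imread("sem_1000x.png", cv2.IMREAD_GRAYSCALE)
micrograph = cv2.GaussianBlur(micrograph, (5, 5), 0)

# Otsu picks the threshold between the two phase intensity populations.
_, phases = cv2.threshold(micrograph, 0, 255,
                          cv2.THRESH_BINARY + cv2.THRESH_OTSU)

pearlite_fraction = np.count_nonzero(phases == 0) / phases.size
print(f"estimated pearlite volume fraction: {pearlite_fraction:.2%}")
```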

Keywords: SEM micrograph, metallurgical image processing, ferrite pearlite steel, microstructure

Procedia PDF Downloads 199
11970 A Design of the Infrastructure and Computer Network for Distance Education, Online Learning via New Media, E-Learning and Blended Learning

Authors: Sumitra Nuanmeesri

Abstract:

The research focuses on studying, analyzing, and designing a model of the infrastructure and computer networks for distance education, online learning via new media, e-learning, and blended learning. The information collected from the study and analysis process was evaluated using the index of item-objective congruence (IOC) by 9 specialists in order to design the model. The model was then evaluated by the sample of 9 specialists in terms of mean and standard deviation, with a mean value of 3.85. The results showed that the designed infrastructure and computer networks are appropriate to a great extent.

Keywords: blended learning, new media, infrastructure and computer network, tele-education, online learning

Procedia PDF Downloads 402
11969 Development of Fake News Model Using Machine Learning through Natural Language Processing

Authors: Sajjad Ahmed, Knut Hinkelmann, Flavio Corradini

Abstract:

Fake news detection research is still at an early stage, as this is a relatively new phenomenon in the interest raised by society. Machine learning helps to solve complex problems and to build AI systems nowadays, especially in those cases where we have tacit knowledge or knowledge that is not explicitly known. We used machine learning algorithms for the identification of fake news; we applied three classifiers: Passive Aggressive, Naïve Bayes, and Support Vector Machine. Simple classification is not completely adequate for fake news detection because generic classification methods are not specialized for fake news. With the integration of machine learning and text-based processing, we can detect fake news and build classifiers that can classify the news data. Text classification mainly focuses on extracting various features of text and then incorporating those features into classification. The big challenge in this area is the lack of an efficient way to differentiate between fake and non-fake news due to the unavailability of corpora. We applied the three machine learning classifiers to two publicly available datasets. Experimental analysis based on the existing datasets indicates very encouraging and improved performance.
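
The classification setup maps directly onto scikit-learn, as in the sketch below: TF-IDF features feeding the three named classifiers. The four toy headlines are invented stand-ins for the two public datasets.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import PassiveAggressiveClassifier
from sklearn.naive_bayes import MultinomialNB
from sklearn.svm import LinearSVC

texts = ["Aliens endorse candidate", "Parliament passes budget bill",
         "Miracle cure found in kitchen", "Central bank raises rates"]
labels = [1, 0, 1, 0]                   # 1 = fake, 0 = real

X = TfidfVectorizer(stop_words="english").fit_transform(texts)

for clf in (PassiveAggressiveClassifier(), MultinomialNB(), LinearSVC()):
    clf.fit(X, labels)
    print(type(clf).__name__, clf.predict(X))
```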

Keywords: fake news detection, natural language processing, machine learning, classification techniques

Procedia PDF Downloads 167
11968 Induction Machine Bearing Failure Detection Using Advanced Signal Processing Methods

Authors: Abdelghani Chahmi

Abstract:

This article examines the detection and localization of faults in electrical systems, particularly those using asynchronous machines. First, the failure process is characterized and relevant symptoms are defined; based on those processes and symptoms, a model of the malfunctions is obtained. Second, the development of the machine diagnosis is presented. As studies of malfunctions in electrical systems can rely on only a small amount of experimental data, it has been essential to develop simulation tools that allow the faulty behavior to be characterized. Fault detection uses signal processing techniques in known operating phases.
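
As one hedged example of such a signal-processing step, motor current signature analysis inspects the stator-current spectrum for fault-related components; the synthetic signal, sampling rate, and fault frequency below are illustrative assumptions, not measurements from the study.

```python
import numpy as np

fs = 10_000                                 # sampling rate, Hz
t = np.arange(0, 1.0, 1 / fs)
supply = np.sin(2 * np.pi * 50 * t)         # 50 Hz supply component
fault = 0.05 * np.sin(2 * np.pi * 157 * t)  # weak bearing-fault component
current = supply + fault + 0.01 * np.random.default_rng(0).normal(size=t.size)

spectrum = np.abs(np.fft.rfft(current)) / t.size
freqs = np.fft.rfftfreq(t.size, 1 / fs)

peak = freqs[np.argmax(spectrum * (freqs > 100))]  # strongest peak above 100 Hz
print(f"dominant high-frequency component: {peak:.0f} Hz")
```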

Keywords: induction motor, modeling, bearing damage, airgap eccentricity, torque variation

Procedia PDF Downloads 139
11967 Normalized P-Laplacian: From Stochastic Game to Image Processing

Authors: Abderrahim Elmoataz

Abstract:

More and more contemporary applications involve data in the form of functions defined on irregular and topologically complicated domains (images, meshes, point clouds, networks, etc.). Such data are not organized as familiar digital signals and images sampled on regular lattices. However, they can be conveniently represented as graphs, where each vertex represents measured data and each edge represents a relationship (connectivity or certain affinities or interactions) between two vertices. Processing and analyzing these types of data is a major challenge for both the image and machine learning communities. Hence, it is very important to transfer to graphs and networks many of the mathematical tools which were initially developed on usual Euclidean spaces and proven to be efficient for many inverse problems and applications dealing with usual image and signal domains. Historically, the main tools for the study of graphs or networks come from combinatorics and graph theory. In recent years, there has been an increasing interest in the investigation of one of the major mathematical tools for signal and image analysis: Partial Differential Equations (PDEs) and variational methods on graphs. The normalized p-Laplacian operator has recently been introduced to model a stochastic game called the tug-of-war game with noise. Part of the interest in this class of operators arises from the fact that it includes, as particular cases, the infinity Laplacian, the mean curvature operator, and the traditional Laplacian operator, which has been extensively used to model and solve problems in image processing. The purpose of this paper is to introduce and to study a new class of normalized p-Laplacians on graphs. The introduction is based on the extension of p-harmonious functions, introduced as discrete approximations for both infinity Laplacian and p-Laplacian equations. Finally, we propose to use these operators as a framework for solving many inverse problems in image processing.
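
As background for readers, the tug-of-war-with-noise connection is usually stated through the mean-value property of p-harmonious functions, sketched below in its continuous form; on a graph, the ball is replaced by a vertex neighborhood and the integral by a weighted average. The constants follow the standard formulation and are not the paper's exact graph definition.

```latex
u(x) = \frac{\alpha}{2}\Big( \max_{y \in B_\varepsilon(x)} u(y)
     + \min_{y \in B_\varepsilon(x)} u(y) \Big)
     + \beta \, \frac{1}{|B_\varepsilon(x)|} \int_{B_\varepsilon(x)} u(y)\,dy,
\qquad
\alpha = \frac{p-2}{p+n}, \quad \beta = \frac{n+2}{p+n}.
```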

Keywords: normalized p-laplacian, image processing, stochastic game, inverse problems

Procedia PDF Downloads 512
11966 General Purpose Graphic Processing Units Based Real Time Video Tracking System

Authors: Mallikarjuna Rao Gundavarapu, Ch. Mallikarjuna Rao, K. Anuradha Bai

Abstract:

Real-time video tracking is a challenging task for computing professionals. The performance of video tracking techniques is greatly affected by the background detection and elimination process. Local regions of the image frame contain vital information about background and foreground. However, pixel-level processing of local regions consumes a good amount of computational time and memory space in traditional approaches. In our approach, we have explored the concurrent computational ability of General Purpose Graphic Processing Units (GPGPU) to address this problem. The Gaussian Mixture Model (GMM) with adaptive weighted kernels is used for detecting the background. The weights of the kernels are influenced by local regions and are updated by inter-frame variations of the corresponding regions. The proposed system has been tested with GPU devices such as the GeForce GTX 280 and Quadro K2000. The results are encouraging, with a maximum speed-up of 10X compared to the sequential approach.
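
OpenCV ships a per-pixel Gaussian-mixture background subtractor (MOG2) that serves as a CPU sketch of this approach; the video path and parameters below are illustrative assumptions, and OpenCV's CUDA build offers a GPU counterpart of the same algorithm.

```python
import cv2

cap = cv2.VideoCapture("traffic.mp4")
mog2 = cv2.createBackgroundSubtractorMOG2(history=500, varThreshold=16)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    mask = mog2.apply(frame)    # per-pixel GMM update + fg/bg classification
    print("foreground pixels:", cv2.countNonZero(mask))

cap.release()
```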

Keywords: connected components, embrace threads, local weighted kernel, structuring elements

Procedia PDF Downloads 440