Search results for: sentence processing
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3760

2830 Computational and Experimental Determination of Acoustic Impedance of Internal Combustion Engine Exhaust

Authors: A. O. Glazkov, A. S. Krylova, G. G. Nadareishvili, A. S. Terenchenko, S. I. Yudin

Abstract:

The presented materials concern the design of the exhaust system for a particular internal combustion engine. The exhaust system can be divided into two parts. The first, comprising the engine exhaust manifold, turbocharger, and catalytic converters, is called the "hot part." The second is the gas exhaust system, which contains elements intended exclusively for reducing exhaust noise (mufflers, resonators) and is conventionally designated the "cold part." Designing the exhaust system from the acoustic point of view, that is, reducing the exhaust noise to a predetermined level, consists of working on the second part. Modern computer technology and software make it possible to design the "cold part" with high accuracy in a given frequency range, but only if the input parameters are specified accurately, namely the amplitude spectrum of the input noise and the acoustic impedance of the noise source, i.e., the engine with its "hot part." Obtaining these data is a difficult problem: high temperatures, high exhaust gas velocities (turbulent flows), and high sound pressure levels (nonlinear regime) do not allow calculated results to be applied with sufficient accuracy. The aim of this work is to obtain the most reliable acoustic output parameters of an engine with its "hot part" based on a set of computational and experimental studies. The presented methodology includes several parts. The first is a finite element simulation of the "cold part" of the exhaust system (taking into account the acoustic radiation impedance of the outlet pipe into open space), yielding the input impedance of the "cold part." The second is a finite element simulation of the "hot part" of the exhaust system (taking into account the acoustic characteristics of the catalytic units and the geometry of the turbocharger), yielding the input impedance of the "hot part." The third part of the technique consists of mathematical processing of these results using the proposed formula for the convergence of the series obtained by summing the multiple reflections of the acoustic signal between the "cold part" and the "hot part." This is followed by a set of tests on an engine stand, with two high-temperature pressure sensors measuring pulsations in the nozzle between the "hot part" and the "cold part" of the exhaust system, and subsequent processing of the test results according to a well-known technique in order to separate the "incident" and "reflected" waves. The final stage consists of mathematical processing of all calculated and experimental data to obtain the amplitude spectrum of the engine noise and its acoustic impedance.
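The separation of incident and reflected waves from two pressure sensors, as mentioned above, is commonly based on a plane-wave decomposition. The block below gives one standard form of that decomposition, purely as an illustration assuming lossless one-dimensional propagation between the sensor positions x1 and x2; it is not necessarily the exact formulation used by the authors.

```latex
% Plane-wave field between the two sensors (assumed lossless, 1-D):
%   p(x,\omega) = A(\omega) e^{-jkx} + B(\omega) e^{jkx}, \quad k = \omega / c
% With measured spectra p_1 at x_1 and p_2 at x_2, and s = x_2 - x_1:
\begin{align}
A(\omega) &= \frac{p_1 e^{jk x_2} - p_2 e^{jk x_1}}{2j\,\sin(k s)} &\text{(incident wave)}\\
B(\omega) &= \frac{p_2 e^{-jk x_1} - p_1 e^{-jk x_2}}{2j\,\sin(k s)} &\text{(reflected wave)}
\end{align}
% The source impedance at the measurement plane then follows from the ratio of
% the total pressure to the volume velocity reconstructed from A and B.
```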

Keywords: acoustic impedance, engine exhaust system, FEM model, test stand

Procedia PDF Downloads 38
2829 The Differences and Similarities in Neurocognitive Deficits in Mild Traumatic Brain Injury and Depression

Authors: Boris Ershov

Abstract:

Depression is the most common mood disorder experienced by patients who have sustained a traumatic brain injury (TBI) and is associated with poorer cognitive functional outcomes. However, in some cases, similar cognitive impairments can also be observed in depression. There is not enough information about the features of the cognitive deficit in patients with TBI in relation to patients with depression. TBI patients without depressive symptoms (TBInD, n = 25), TBI patients with depressive symptoms (TBID, n = 31), and 28 patients with bipolar II disorder (BP) were included in the study. There were no significant differences between participants with respect to age, handedness, or educational level. The patients' clinical status was determined using the Montgomery-Asberg Depression Rating Scale (MADRS). All participants completed a cognitive battery (The Brief Assessment of Cognition in Affective Disorders (BAC-A)). Additionally, the Rey-Osterrieth Complex Figure (ROCF) was used to assess visuospatial construction abilities and visual memory, as well as planning and organizational skills. Compared to BP, TBInD and TBID showed significant impairments in visuomotor abilities and in verbal and visual memory. There were no significant differences between the BP and TBID groups in working memory, speed of information processing, or problem solving. The interference effect (cognitive inhibition) was significantly greater in TBInD and TBID compared to BP. Memory bias towards mood-related information in BP and TBID was greater in comparison with TBInD. These results suggest that depressive symptoms are associated with impairments of some executive functions combined with a decrease in the speed of information processing.

Keywords: bipolar II disorder, depression, neurocognitive deficits, traumatic brain injury

Procedia PDF Downloads 333
2828 A Review on Cloud Computing and Internet of Things

Authors: Sahar S. Tabrizi, Dogan Ibrahim

Abstract:

Cloud computing is a convenient model for on-demand network access to shared pools of configurable virtual computing resources, such as servers, networks, storage devices, and applications. The cloud serves as an environment for companies and organizations to use infrastructure resources without making any purchases, and they can access such resources wherever and whenever they need them. Cloud computing is useful for overcoming a number of problems in various Information Technology (IT) domains such as Geographical Information Systems (GIS), scientific research, e-governance systems, decision support systems, ERP, web application development, and mobile technology. Companies can use cloud computing services to store large amounts of data that can be accessed from anywhere and at any time. Such services are rented by client companies, where the actual rent depends on the amount of data stored on the cloud and the amount of processing power used in a given time period. The resources offered by cloud service companies are flexible in the sense that user companies can increase or decrease their storage or processing power requirements at any time, thus minimizing the overall rental cost of the service they receive. In addition, cloud computing service providers offer fast processors and application software that can be shared by their clients. This is especially important for small companies with limited budgets which cannot afford to purchase their own expensive hardware and software. This paper is an overview of cloud computing, giving its types, principles, advantages, and disadvantages. In addition, the paper gives some example engineering applications of cloud computing and makes suggestions for possible future applications in the field of engineering.

Keywords: cloud computing, cloud systems, cloud services, IaaS, PaaS, SaaS

Procedia PDF Downloads 221
2827 An Image Processing Scheme for Skin Fungal Disease Identification

Authors: A. A. M. A. S. S. Perera, L. A. Ranasinghe, T. K. H. Nimeshika, D. M. Dhanushka Dissanayake, Namalie Walgampaya

Abstract:

Nowadays, skin fungal diseases are mostly found in people of tropical countries like Sri Lanka. A skin fungal disease is an illness caused by a fungus. These diseases have various harmful effects on the skin and keep spreading over time, so it is important to identify them at an early stage in order to keep them from spreading. This paper presents an automated skin fungal disease identification system implemented to speed up the diagnosis process by identifying skin fungal infections in digital images. An image of the diseased skin lesion is acquired, and a comprehensive computer vision and image processing scheme is used to process the image for disease identification. This includes colour analysis using the RGB and HSV colour models; texture classification using the Grey Level Run Length Matrix, the Grey Level Co-Occurrence Matrix and the Local Binary Pattern; object detection; shape identification; and more. The paper presents the approach and its outcome for the identification of four of the most common skin fungal infections, namely Tinea Corporis, Sporotrichosis, Malassezia and Onychomycosis. The main intention of this research is to provide an automated skin fungal disease identification system that increases diagnostic quality, shortens the time to diagnosis, and improves the efficiency of detection and successful treatment of skin fungal diseases.
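The texture descriptors named above (GLCM, LBP) are standard image-processing tools; a minimal sketch of how two of them could be computed for a lesion image is given below, assuming a recent scikit-image is available. The file name and parameter choices are illustrative, not the authors' actual pipeline.

```python
# Minimal sketch: colour conversion plus GLCM and LBP texture features
# for a lesion image, using scikit-image (illustrative parameters only).
import numpy as np
from skimage import io, color
from skimage.feature import graycomatrix, graycoprops, local_binary_pattern

image = io.imread("lesion.jpg")              # hypothetical input file
hsv = color.rgb2hsv(image)                   # HSV colour analysis
gray = (color.rgb2gray(image) * 255).astype(np.uint8)

# Grey Level Co-Occurrence Matrix: contrast/homogeneity at a few offsets
glcm = graycomatrix(gray, distances=[1, 3], angles=[0, np.pi / 2],
                    levels=256, symmetric=True, normed=True)
contrast = graycoprops(glcm, "contrast").mean()
homogeneity = graycoprops(glcm, "homogeneity").mean()

# Local Binary Pattern histogram as a texture signature
lbp = local_binary_pattern(gray, P=8, R=1, method="uniform")
lbp_hist, _ = np.histogram(lbp, bins=10, range=(0, 10), density=True)

features = np.concatenate([[contrast, homogeneity],
                           [hsv[..., 0].mean(), hsv[..., 1].mean()],
                           lbp_hist])
print(features)  # feature vector that a classifier could consume
```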

Keywords: Circularity Index, Grey Level Run Length Matrix, Grey Level Co-Occurrence Matrix, Local Binary Pattern, Object detection, Ring Detection, Shape Identification

Procedia PDF Downloads 212
2826 Experimental Correlation for Erythrocyte Aggregation Rate in Population Balance Modeling

Authors: Erfan Niazi, Marianne Fenech

Abstract:

Red blood cells (RBCs), or erythrocytes, tend to form chain-like aggregates called rouleaux under low shear rates. This is a reversible process, and rouleaux disaggregate at high shear rates. RBC aggregation therefore occurs in the microcirculation, where low shear rates are present, but does not occur under normal physiological conditions in large arteries. Numerical modeling of RBC interactions is fundamental to analytical models of blood flow in the microcirculation. Population Balance Modeling (PBM) is particularly useful for studying problems where particles agglomerate and break up in two-phase flow systems in order to find flow characteristics. In this method, the elementary particles lose their individual identity due to continuous destruction and recreation by break-up and agglomeration. The aim of this study is to find the RBC aggregation rate in a dynamic situation. A simplified PBM was used previously to find the aggregation rate from a static observation of RBC aggregation in a drop of blood under the microscope. To find the aggregation rate in a dynamic situation, we propose an experimental set-up testing RBC sedimentation. In this test, RBCs interact and aggregate to form rouleaux. In this configuration, disaggregation can be neglected due to the low shear stress. A high-speed camera is used to acquire video-microscopic pictures of the process. The sizes of the aggregates and the sedimentation velocity are extracted using image processing techniques. Based on data collected from five healthy human blood samples, the aggregation rate was estimated as 2.7×10³ (±0.3×10³) s⁻¹.
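When break-up is neglected, as in the sedimentation test described above, population balance modeling of aggregation reduces to a Smoluchowski-type coagulation system. The sketch below integrates a small discrete version with a constant aggregation kernel, purely to illustrate where an experimentally fitted rate constant enters the model; the class count, time span and normalised concentrations are arbitrary and not taken from the study.

```python
# Discrete Smoluchowski coagulation equations with a constant kernel,
# illustrating where an experimentally estimated aggregation rate enters.
import numpy as np
from scipy.integrate import solve_ivp

K = 2.7e3          # aggregation rate constant, 1/s (order of magnitude from the study)
n_classes = 10     # aggregate sizes tracked: 1..n_classes cells per rouleau

def pbm_rhs(t, n):
    dn = np.zeros_like(n)
    for k in range(n_classes):
        # birth: aggregates of sizes i+1 and k-i combine to give size k+1
        birth = 0.5 * sum(n[i] * n[k - 1 - i] for i in range(k))
        # death: class k aggregates with any other class
        death = n[k] * n.sum()
        dn[k] = K * (birth - death)
    return dn

n0 = np.zeros(n_classes)
n0[0] = 1.0        # start with single cells only (normalised concentration)
sol = solve_ivp(pbm_rhs, (0.0, 1e-3), n0, max_step=1e-5)
print(sol.y[:, -1])  # size distribution at the final time
```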

Keywords: red blood cell, rouleaux, microfluidics, image processing, population balance modeling

Procedia PDF Downloads 337
2825 Arabicization and Terminology with Reference to Social Media Terms

Authors: Ahmed Al-Awthan

Abstract:

This study addresses the prevalence of English terminology in published Arabic documentation on social media. Although the problem of using English terms in translation instead of existing native ones has been addressed in general by researchers around the world, to the best of the author's knowledge the attitude of translators as professionals to this phenomenon in Qatar and Yemen has not received a detailed study. This study examines the impact of the use of English social media terms in the Arab world on aspiring and professional translators; it explores the benefits and drawbacks of linguistic borrowing as identified by the translators and investigates whether translators consider any means of resisting linguistic borrowing and prioritizing Arabic. It also aims to answer the following questions: i. Is there any prevalence of English social media terms in Arabic translation? Why or why not? ii. Do Arabic translators prefer using English social media terms to their equivalents in Arabic? If so, why? iii. Which measures could be adopted to help reduce the frequently observed borrowing of English terms? In particular, how do translators see the role of the Arabic Language Academies in preserving Arabic? This research is descriptive, comparative and analytical in nature, and is both qualitative and quantitative. To validate the problem, the researcher will analyze articles published by Al-Jazeera in 2016-2018 that refer to the use of social media in diplomacy. It will be examined whether the increased international discussion of political events on social media increased the amount of transliterated English terminology referring to this mode of communication. To investigate whether translators recognize the phenomenon of borrowing, the researcher proposes to use a survey with multiple-choice questions. It will target 20 aspiring translators from Yemen and 20 participants from Qatar. It will offer 15 English social media terms used in discourse in 15 sentences. For each sentence, the researcher will provide three different translations and ask the translators to rate them and offer their own renditions. After collecting all the answers online, the researcher will analyze the data. The results are expected to confirm whether there is a prevalence of English terms in translation into Arabic. They are also expected to show what measures the translators used to render the English social media terms. The study raises awareness of the borrowing of English terms and will guide translators toward using Arabicization methods in order to contribute to preserving Arabic.

Keywords: Arabicization, translingual borrowing, social media terms, terminology

Procedia PDF Downloads 137
2824 Using Artificial Intelligence Technology to Build the User-Oriented Platform for Integrated Archival Service

Authors: Lai Wenfang

Abstract:

This study describes how artificial intelligence (AI) technology is used to build a user-oriented platform for integrated archival services. The platform will be launched in 2020 by the National Archives Administration (NAA) in Taiwan. With the progression of information and communication technology (ICT), the NAA has built many systems to provide archival services. In order to cope with new challenges, such as new ICT, artificial intelligence, and blockchain, the NAA will use natural language processing (NLP) and machine learning (ML) to build a training model and propose suggestions based on the data sent to the platform. The NAA expects that the platform will not only automatically inform the sending agencies' staff which records catalogues are against the transfer or destruction rules, but also use the model to find details hidden in the catalogues and suggest to NAA staff whether the records should be kept or not, in order to shorten the auditing time. The platform keeps all users' browsing trails, so that it can predict which kinds of archives a user could be interested in, recommend search terms through visualization, and inform users of newly arrived archives. In addition, according to the Archives Act, NAA staff must spend a lot of time marking or removing personal data, classified data, etc., before archives are provided. To upgrade the archives access service process, the platform will use text recognition patterns to black out such data automatically; staff only need to correct errors and upload the corrected version, and as the platform learns, the accuracy will become higher. In short, the purpose of the platform is to advance the government's digital transformation and implement the vision of a service-oriented smart government.
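As a rough illustration of the NLP/ML idea described above (flagging catalogue entries that may conflict with transfer or destruction rules), the sketch below trains a simple TF-IDF plus logistic-regression classifier on labelled catalogue descriptions. The toy texts, labels and thresholding comment are hypothetical stand-ins for whatever model and data the NAA actually uses.

```python
# Hypothetical sketch: flag catalogue entries that may conflict with
# transfer/destruction rules, using TF-IDF features + logistic regression.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy training data; in practice these would be catalogue descriptions
# labelled by archives staff (1 = flagged, 0 = compliant).
texts = [
    "personnel medical files, retention period expired",
    "routine meeting agenda, 2015",
    "classified correspondence pending declassification review",
    "public press release archive",
]
labels = [1, 0, 1, 0]

model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(texts, labels)

new_entry = ["medical records of retired staff"]
prob = model.predict_proba(new_entry)[0, 1]
print(f"flag probability: {prob:.2f}")   # staff review entries above a chosen threshold
```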

Keywords: artificial intelligence, natural language processing, machine learning, visualization

Procedia PDF Downloads 152
2823 Visco-Plastic Transition and Transfer of Plastic Material with SGF in Case of Linear Dry Friction Contact on Steel Surfaces

Authors: Lucian Capitanu, Virgil Florescu

Abstract:

Laboratory modeling of specific tribological processes often raises special problems. One such problem is the modeling of the extremely high contact temperatures and pressures at which the injection or extrusion processing of thermoplastic materials takes place. Tribological problems occur mainly with thermoplastic materials reinforced with glass fibers, which cause advanced wear of the barrels and screws of processing machines in a short time. Obtaining temperatures around 210 °C and higher, as well as pressures around 100 MPa, is very difficult in the laboratory. This paper reports a simple and convenient solution for reaching these conditions, using sliding friction couples with linear contact: cylindrical liners of glass-fiber-filled plastic on flat steel samples, polished and superfinished. C120 steel, a steel for moulds, and Rp3 steel, a high-speed tool steel, were used. The pressure was obtained by continuously loading the liner in rotational movement up to its elasticity limit, when the dry friction coefficient reaches or exceeds a value of 0.5. By dissipation of the power lost through friction on the flat steel sample, contact temperatures at the metal surface reach and exceed 230 °C, which places them in the temperature range of injection processing. Contact pressures (under the load and material conditions used) ranging from 16.3 to 36.4 MPa were obtained, depending on the plastic material used and the glass fiber content.
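For reference, the contact temperature obtained in this way is driven by the frictional power dissipated at the interface, which for a sliding couple can be estimated with the usual relation below. This is an illustrative textbook formula, not the authors' thermal model.

```latex
% Frictional power dissipated at the sliding interface
P_f = \mu \, F_N \, v
% \mu  - dry friction coefficient (here reaching about 0.5)
% F_N  - normal load on the liner--plate contact
% v    - sliding velocity of the rotating liner
```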

Keywords: plastics with glass fibers, dry friction, linear contact, contact temperature, contact pressure, experimental simulation

Procedia PDF Downloads 290
2822 Preparation of Carbon Nanofiber Reinforced HDPE Using Dialkylimidazolium as a Dispersing Agent: Effect on Thermal and Rheological Properties

Authors: J. Samuel, S. Al-Enezi, A. Al-Banna

Abstract:

High-density polyethylene reinforced with carbon nanofibers (HDPE/CNF) was prepared via melt processing using dialkylimidazolium tetrafluoroborate (an ionic liquid) as a dispersion agent. The prepared samples were characterized by thermogravimetric (TGA) and differential scanning calorimetric (DSC) analyses. The samples blended with the imidazolium ionic liquid exhibit higher thermal stability. DSC analysis showed clear miscibility of the ionic liquid in the HDPE matrix, with a single endothermic peak. The melt rheological analysis of the HDPE/CNF composites was performed using an oscillatory rheometer. The influence of CNF and ionic liquid concentration (0, 0.5, and 1 wt%) on the viscoelastic parameters was investigated at 200 °C over an angular frequency range of 0.1 to 100 rad/s. The rheological analysis shows shear-thinning behavior for the composites. An improvement in the viscoelastic properties was observed as the nanofiber concentration increased. The increase in the modulus values was attributed to the structural rigidity imparted by the high-aspect-ratio CNF. The modulus values and complex viscosity of the composites increased significantly at low frequencies. Composites blended with the ionic liquid exhibit slightly lower complex viscosity and modulus values than the corresponding HDPE/CNF compositions. This reduction in melt viscosity, a result of the wetting effect of the polymer-ionic liquid combination, is an additional benefit for polymer composite processing.
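The shear-thinning behaviour reported above is often summarised by fitting a power law to the complex-viscosity curve. The sketch below shows such a fit on a hypothetical frequency sweep; the array values are placeholders, not the measured HDPE/CNF data.

```python
# Power-law fit of complex viscosity vs angular frequency, eta* = K * omega**(n-1),
# done in log-log space; the data arrays stand in for a 200 C frequency sweep.
import numpy as np

omega = np.logspace(-1, 2, 10)            # rad/s, 0.1 .. 100
eta_star = 1.2e4 * omega**(-0.35)         # Pa.s, hypothetical shear-thinning curve

slope, intercept = np.polyfit(np.log10(omega), np.log10(eta_star), 1)
n = slope + 1                             # power-law index (n < 1 => shear thinning)
K = 10**intercept                         # consistency coefficient

print(f"n = {n:.2f}, K = {K:.3g} Pa.s^n")
```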

Keywords: high-density polyethylene, carbon nanofibers, ionic liquid, complex viscosity

Procedia PDF Downloads 110
2821 Task Scheduling and Resource Allocation in Cloud Based on AHP Method

Authors: Zahra Ahmadi, Fazlollah Adibnia

Abstract:

Scheduling of tasks and the optimal allocation of resources in the cloud must account for the dynamic nature of tasks and the heterogeneity of resources. Applications based on scientific workflows are among the most widely used applications in this field and are characterized by high processing power and storage requirements. In order to increase their efficiency, it is necessary to plan the tasks properly and select the best virtual machine in the cloud. The goals of the system are effective factors in task scheduling and resource selection, which depend on various criteria such as time, cost, current workload and processing power. Multi-criteria decision-making methods are a good choice in this field. In this research, a new method of task scheduling and resource allocation in a heterogeneous environment based on a modified AHP algorithm is proposed. In this method, the scheduling of input tasks is based on two criteria: execution time and size. Resource allocation combines the AHP algorithm with a first-come, first-served policy. Resources are prioritized using the criteria of main memory size, processor speed and bandwidth. To modify the AHP algorithm, the Linear Max-Min and Linear Max normalization methods, which have a great impact on the ranking, are used. The simulation results show a decrease in the average response time, return time and execution time of input tasks in the proposed method compared to similar (baseline) methods.
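To make the resource-prioritisation step concrete, the sketch below ranks candidate virtual machines by memory, processor speed and bandwidth using Linear Max and Linear Max-Min normalisation followed by a weighted sum. The weights and machine data are invented for illustration, and the AHP pairwise-comparison step that would normally produce the weights is omitted.

```python
# Illustrative VM ranking with Linear Max and Linear Max-Min normalisation.
# Criteria: main memory (GB), processor speed (GHz), bandwidth (Mbps); all benefit criteria.
import numpy as np

vms = np.array([
    [16.0, 2.4, 1000.0],   # VM1
    [8.0,  3.2,  500.0],   # VM2
    [32.0, 2.0,  750.0],   # VM3
])
weights = np.array([0.5, 0.3, 0.2])   # hypothetical AHP-derived criterion weights

def linear_max(m):
    return m / m.max(axis=0)                       # x / max(x)

def linear_max_min(m):
    rng = m.max(axis=0) - m.min(axis=0)
    return (m - m.min(axis=0)) / rng               # (x - min) / (max - min)

for name, norm in [("Linear Max", linear_max), ("Linear Max-Min", linear_max_min)]:
    scores = norm(vms) @ weights
    ranking = np.argsort(-scores)                  # best VM first
    print(name, "ranking:", [f"VM{i + 1}" for i in ranking], scores.round(3))
```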

Keywords: hierarchical analytical process, work prioritization, normalization, heterogeneous resource allocation, scientific workflow

Procedia PDF Downloads 129
2820 Towards Law Data Labelling Using Topic Modelling

Authors: Daniel Pinheiro Da Silva Junior, Aline Paes, Daniel De Oliveira, Christiano Lacerda Ghuerren, Marcio Duran

Abstract:

The Courts of Accounts are institutions responsible for overseeing Public Administration expenses and pointing out irregularities. They face a high demand of processes to be analyzed, whose decisions must be grounded on severity laws. Among the large number of processes, there are several cases reporting similar subjects; thus, previous decisions on already analyzed processes can be a precedent for current processes that refer to similar topics. Identifying similar topics is an open yet essential task for identifying similarities between processes. Since the actual number of topics is considerably large, it is tedious and error-prone to identify topics using a purely manual approach. This paper presents a tool based on machine learning and natural language processing to assist in building a labeled dataset. The tool relies on topic modeling with Latent Dirichlet Allocation to find the topics underlying a document, followed by the Jensen-Shannon distance metric to generate a probability of similarity between document pairs. Furthermore, in a case study with a corpus of decisions of the Rio de Janeiro State Court of Accounts, it was noted that data pre-processing plays an essential role in modeling relevant topics. Also, the combination of topic modeling with a distance metric calculated over the document representation in the generated topic space proved useful in helping to construct a labeled base of similar and non-similar document pairs.
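A compact version of the pipeline described above (LDA topics followed by a Jensen-Shannon comparison of document-topic distributions) can be written with scikit-learn and SciPy as below. The corpus snippets, topic count and interpretation of the distances are placeholders rather than the tool's actual configuration.

```python
# Sketch: LDA topic distributions per decision, then Jensen-Shannon distance
# between document pairs as a similarity signal for labelling.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation
from scipy.spatial.distance import jensenshannon

docs = [
    "irregularity in public works contract payment",
    "overpriced contract for road construction works",
    "pension fund accounting review",
]  # placeholder decisions; the real corpus is the Court of Accounts archive

counts = CountVectorizer(stop_words="english").fit_transform(docs)
lda = LatentDirichletAllocation(n_components=2, random_state=0)
theta = lda.fit_transform(counts)          # document-topic distributions (rows sum to 1)

d01 = jensenshannon(theta[0], theta[1])    # small distance -> likely similar subject
d02 = jensenshannon(theta[0], theta[2])
print(f"docs 0-1: {d01:.3f}, docs 0-2: {d02:.3f}")
```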

Keywords: courts of accounts, data labelling, document similarity, topic modeling

Procedia PDF Downloads 158
2819 The Reasons for Food Losses and Waste and the Trends of Their Management in Basic Vegetal Production in Poland

Authors: Krystian Szczepanski, Sylwia Łaba

Abstract:

Production of fruit and vegetables, food cereals or oilseeds affects the natural environment through the intake of nutrients contained in the soil and the use of water, fertilizers, plant protection products, and energy. Limiting these effects requires the introduction of cultivation techniques and methods that are friendly to the environment and counteract losses and waste of agricultural raw materials, as well as the appropriate management of food waste at every stage of the agri-food supply chain. Basic production includes obtaining a vegetal raw material, its storage on the agricultural farm, and its transport to a collecting point. The starting point is when the plants are ready to be harvested; the stage before harvesting is not considered in the system of measuring and monitoring food losses. The successive stage is the transport of the collected crops to the collecting point, or their storage and transport. The moment at which the raw material enters the stage of processing, i.e., its receipt at the gate of the processing plant, is considered the end point of basic production; processing is understood as the change of the raw material into food products. According to Regulation (EC) No 178/2002 of the European Parliament and of the Council of 28 January 2002, Art. 2, "food" means any substance or product intended to be, or reasonably expected to be, consumed by humans. For the needs of these studies, raw material is considered food from the moment the plants (fruit, vegetables, cereals, oilseeds), after being harvested, arrive at storehouses. The aim of the studies was to determine the reasons for loss generation and to analyze the directions of their management in basic vegetal production in Poland in the years 2017 and 2018. The studies on food losses and waste in basic vegetal production were carried out in three sectors: fruit and vegetables, cereals, and oilseeds. The studies of basic production were conducted during March-May 2019 across the whole country on a representative sample of 250 farms in each sector. The surveys were carried out using questionnaires by the PAPI (Paper & Pen Personal Interview) method; the pollsters conducted direct questionnaire interviews. The studies show that in 19% of the examined farms no losses were recorded during preparation, loading, and transport of the raw material to the manufacturing plant. In the farms where losses were indicated, the main reason in the production of fruit and vegetables was rotting, which constituted more than 20% of the reported reasons, while in the case of cereals and oilseeds the respondents identified damage, moisture, and pests as the most frequent reasons. The losses and waste generated in vegetal production, as well as in the processing and trade of fruit, vegetables, and cereal products, should be appropriately managed or recovered. The respondents indicated composting (more than 60%) as the main direction of waste management in all categories; animal feed and landfill were the other indicated directions. Prevention and minimization of loss generation are important at every stage of production, including basic production. Knowing the reasons for loss generation, preventive measures can be introduced, mainly connected with appropriate storage conditions and methods.
Acknowledgement: The article was prepared within the project "Development of a waste food monitoring system and an effective program to rationalize losses and reduce food waste" (acronym PROM), implemented under the GOSPOSTRATEG strategic scientific and learning program financed by the National Center for Research and Development, in accordance with agreement Gospostrateg1/385753/1/2018.

Keywords: food losses, food waste, PAP method, vegetal production

Procedia PDF Downloads 100
2818 Simplifying Writing Composition to Assist Students in Rural Areas: An Experimental Study for the Comparison of Guided and Unguided Instruction

Authors: Neha Toppo

Abstract:

Teaching methods and strategies strongly influence student learning. In second language teaching, a number of ways and methods have been suggested by different scholars and researchers over time. The present article deals with the role of teaching instruction in developing students' compositional ability in writing. It focuses on secondary-level students in rural areas, whose exposure to English is limited and who face challenges even in simple compositions. Students up to high school struggle to write formal letters, applications, essays, paragraphs, etc. They face problems in note making and in writing examination answers in their own words, and depend fully on rote learning. It is difficult for them to give language to their own ideas. Teaching writing composition deserves special attention, as writing is an integral part of language learning and students at this level are expected to have sound compositional ability, which is useful in numerous domains. An effective method of instruction could help students learn self-expression, correct selection of vocabulary and grammar, contextual writing, and the composition of formal and informal texts. Writing is not limited to school but continues to be important in various other fields, such as newspapers and magazines, official work, legislative work, material writing, academic writing, and personal writing. The study is based on an experimental method and hypothesizes that guided instruction will be more effective in teaching writing composition than the usual instruction, in which students are left to compose on their own without any help. In the test, students of one section are asked to write an essay on a given topic without guidance, while students of another section are asked to write on the same topic with the assistance of guided instruction, in which they are provided with some vocabulary and sentence structures. This process is repeated in a few more schools to obtain generalizable data. The study shows the difference in students' performance under both kinds of instruction, guided and unguided. The study concludes with the finding that the students' writing skill is quite poor, but that with the help of guided instruction they perform better. The students need better teaching instruction to develop their writing skills.

Keywords: composition, essay, guided instruction, writing skill

Procedia PDF Downloads 264
2817 Quantification of E-Waste: A Case Study in Federal University of Espírito Santo, Brazil

Authors: Andressa S. T. Gomes, Luiza A. Souza, Luciana H. Yamane, Renato R. Siman

Abstract:

The segregation of waste electrical and electronic equipment (WEEE) at the generating source, its quali-quantitative characterization and the identification of its origin, besides being integral parts of classification reports, are crucial steps for the success of its integrated management. The aim of this paper was to quantify WEEE generation at the Federal University of Espírito Santo (UFES), Brazil, as well as to identify sources, temporary storage sites, main transportation routes and destinations, the most generated types of WEEE, and their recycling potential. The quantification of WEEE generated at the University between 2010 and 2015 was performed using data provided by UFES's sector of assets management. Information on the flow of EEE and WEEE on the campuses was obtained through questionnaires applied to University workers. A total of 6,028 units of data-processing WEEE disposed of by the University between 2010 and 2015 were recorded. Among this waste, the most generated items were CRT screens, desktops, keyboards and printers. Furthermore, it was observed that these WEEE are temporarily stored in inappropriate places on the University campuses. In general, these WEEE units are donated to NGOs of the city or sold through auctions (2010 and 2013). As for recycling potential, the primary processing and further sale of printed circuit boards (PCB) from the computers could yield up to US$ 27,839.23. The results highlight the importance of a WEEE management policy at the University.

Keywords: solid waste, waste of electrical and electronic equipment, waste management, institutional solid waste generation

Procedia PDF Downloads 239
2816 Opinion Mining to Extract Community Emotions on Covid-19 Immunization Possible Side Effects

Authors: Yahya Almurtadha, Mukhtar Ghaleb, Ahmed M. Shamsan Saleh

Abstract:

The world witnessed a fierce attack from the Covid-19 virus, which affected public life socially, economically, psychologically, and in terms of health. The world's governments tried to confront the pandemic by imposing a number of precautionary measures such as general lockdowns, curfews and social distancing. Scientists also made strenuous efforts to develop an effective vaccine to train the immune system to develop antibodies to combat the virus, thus reducing its symptoms and limiting its spread. Artificial intelligence, along with researchers and medical authorities, accelerated the vaccine development process through big data processing and simulation. On the other hand, one of the most important negative impacts of Covid-19 was the state of anxiety and fear caused by the spread of rumors through social media, which prompted governments to try to reassure the public by the means available to them. This study proposes using sentiment analysis (also known as opinion mining) and deep learning as efficient artificial intelligence techniques to retrieve public tweets from Twitter and then analyze them automatically to extract opinions, expressions and feelings, negative or positive, about the symptoms people may feel after vaccination. Sentiment analysis is characterized by its ability to access what the public posts on social media in record time and at a lower cost than traditional means such as questionnaires and interviews, not to mention the accuracy of the information, as it comes from what the public expresses voluntarily.
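As a minimal illustration of the proposed approach, the snippet below runs an off-the-shelf sentiment model over a few example posts about post-vaccination symptoms. The Hugging Face `transformers` pipeline (which downloads a default pretrained model) and the sample texts are stand-ins for the tweet-retrieval and deep-learning components the study would actually build.

```python
# Illustrative sentiment scoring of vaccination-related posts with an
# off-the-shelf model (stand-in for the study's own deep-learning classifier).
from transformers import pipeline

classifier = pipeline("sentiment-analysis")   # default pretrained sentiment model

tweets = [
    "Got my second dose, only a sore arm so far.",
    "Fever and headache all night after the vaccine, feeling awful.",
]
for tweet, result in zip(tweets, classifier(tweets)):
    print(result["label"], f"{result['score']:.2f}", "-", tweet)
```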

Keywords: deep learning, opinion mining, natural language processing, sentiment analysis

Procedia PDF Downloads 152
2815 Enhanced Tensor Tomographic Reconstruction: Integrating Absorption, Refraction and Temporal Effects

Authors: Lukas Vierus, Thomas Schuster

Abstract:

A general framework is examined for dynamic tensor field tomography within an inhomogeneous medium characterized by refraction and absorption, treated as an inverse source problem for the associated transport equation. Guided by Fermat's principle, the Riemannian metric within the specified domain is determined by the medium's refractive index. While considerable literature exists on the inverse problem of reconstructing a tensor field from its longitudinal ray transform within a static Euclidean environment, only limited inversion formulas and algorithms are available for general Riemannian metrics and time-varying tensor fields. It is established that tensor field tomography, viewed as an inverse source problem for a transport equation, persists in dynamic scenarios; framing dynamic tensor tomography as an inverse source problem embodies a comprehensive perspective within this domain. Ensuring well-defined forward mappings necessitates establishing existence and uniqueness for the underlying transport equations. However, the bilinear forms of the associated weak formulations fail to meet the coercivity condition. Consequently, recourse is taken to viscosity solutions, demonstrating their unique existence within suitable Sobolev spaces (in the static case) and Sobolev-Bochner spaces (in the dynamic case), under a specific assumption restricting variations in the refractive index. Notably, the adjoint problem can also be reformulated as a transport equation, with analogous uniqueness results. Analytical solutions are expressed as integrals over geodesics, facilitating more efficient evaluation of the forward and adjoint operators compared to solving partial differential equations. Numerical experiments are conducted using a Nesterov-accelerated Landweber method, encompassing various fields, absorption coefficients, and refractive indices, thereby illustrating the enhanced reconstruction achieved through this holistic modeling approach.
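For orientation, the longitudinal ray transform underlying the forward model discussed above can be written, in its standard static two-tensor form, as below; the dynamic, absorbing, refractive setting of the paper generalizes this along geodesics of the metric induced by the refractive index. The notation is a generic textbook formulation, not the authors' exact operator.

```latex
% Longitudinal ray transform of a symmetric 2-tensor field f along a unit-speed
% geodesic \gamma (determined by the refractive index via Fermat's principle):
I f(\gamma) = \int f_{ij}\big(\gamma(t)\big)\, \dot{\gamma}^{i}(t)\, \dot{\gamma}^{j}(t)\, \mathrm{d}t
% In the dynamic, absorbing case the integrand additionally carries an
% attenuation factor and an explicit time dependence, f = f(x, t).
```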

Keywords: attenuated refractive dynamic ray transform of tensor fields, geodesics, transport equation, viscosity solutions

Procedia PDF Downloads 25
2814 Architecture - Performance Relationship in GPU Computing - Composite Process Flow Modeling and Simulations

Authors: Ram Mohan, Richard Haney, Ajit Kelkar

Abstract:

Current developments in computing have shown the advantage of using one or more Graphics Processing Units (GPUs) to boost the performance of many computationally intensive applications, but there are still limits to these GPU-enhanced systems. The major factors that contribute to the limitations of GPUs for High Performance Computing (HPC) can be categorized as hardware- and software-oriented in nature. Understanding how these factors affect performance is essential to developing efficient and robust application codes that employ one or more GPU devices as powerful co-processors for HPC computational modeling. This research and technical presentation focuses on the analysis and understanding of the intrinsic interrelationship of both hardware and software categories on computational performance for single and multiple GPU-enhanced systems, using a computationally intensive application that is representative of a large portion of the challenges confronting modern HPC. The representative application uses unstructured finite element computations for transient composite resin infusion process flow modeling as its computational core; its characteristics and results reflect many other HPC applications via the sparse matrix system used for the solution of the linear system of equations. This work describes these various software and hardware factors and how they interact to affect the performance of computationally intensive applications, enabling more efficient development and porting of High Performance Computing applications, including current, legacy, and future large-scale computational modeling applications in various engineering and scientific disciplines.
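The sparse linear solves at the core of such applications are exactly the kind of kernel that benefits from GPU offload. The sketch below shows a plain conjugate-gradient loop written once and runnable on either NumPy/SciPy (CPU) or CuPy (GPU) arrays; it is a generic illustration, not the actual resin-infusion code, and the CuPy path is only indicated in a comment.

```python
# Generic conjugate gradient for a sparse SPD system, array-library agnostic:
# pass SciPy/NumPy operands for CPU execution or CuPy operands for GPU execution.
import numpy as np
import scipy.sparse as sp

def cg(A, b, xp, tol=1e-8, max_iter=1000):
    x = xp.zeros_like(b)
    r = b - A @ x
    p = r.copy()
    rs_old = float(r @ r)
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs_old / float(p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = float(r @ r)
        if rs_new ** 0.5 < tol:
            break
        p = r + (rs_new / rs_old) * p
        rs_old = rs_new
    return x

# CPU example with a 1-D Laplacian; for the GPU path one would build the same
# operands with cupy / cupyx.scipy.sparse and pass xp=cupy (assuming CuPy is installed).
n = 1000
A = sp.diags([-1.0, 2.0, -1.0], offsets=[-1, 0, 1], shape=(n, n), format="csr")
b = np.ones(n)
x = cg(A, b, xp=np)
print(float(np.linalg.norm(A @ x - b)))   # residual norm of the solution
```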

Keywords: graphical processing unit, software development and engineering, performance analysis, system architecture and software performance

Procedia PDF Downloads 351
2813 Processing Methods for Increasing the Yield, Nutritional Value and Stability of Coconut Milk

Authors: Archana G. Lamdande, Shyam R. Garud, K. S. M. S. Raghavarao

Abstract:

Coconut has two edible parts: a white kernel (solid endosperm) and coconut water (liquid endosperm). The white kernel is generally used in fresh or dried form for culinary purposes. Coconut testa is the brown skin covering the coconut kernel. It is removed by paring the wet coconut and is obtained as a by-product in coconut processing industries during the production of products such as desiccated coconut, coconut milk, whole coconut milk powder and virgin coconut oil. At present, it is used as an animal feed component after drying and recovering the residual oil (by expelling). Experiments were carried out on expelling coconut milk from shredded coconut with and without testa removal, in order to explore the possibility of increasing the milk yield and adding value in terms of increased polyphenol content. The color characteristics of coconut milk obtained from grating without removal of testa were L* 82.79, a* 0.0125, b* 6.245, while those obtained from grating with removal of testa were L* 83.24, a* -0.7925, b* 3.1. A significant increase was observed in the total phenol content of coconut milk obtained from grating with testa (833.8 µl/ml) when compared to that obtained without testa (521.3 µl/ml). However, no significant difference was observed in the protein content of coconut milk obtained from grating with and without testa (4.9 and 5.0% w/w, respectively). Coconut milk obtained from grating without removal of testa showed a higher milk yield (62% w/w) than that obtained from grating with removal of testa (60% w/w). The fat content in coconut milk was observed to be 32% (w/w), at which the milk is unstable. Therefore, several experiments were carried out to examine its stability by adjusting the fat content to different levels (32, 28, 24, and 20% w/w). It was found that the coconut milk was more stable with a fat content of 24% (w/w). Homogenization and ultrasonication, and their combinations, were used to explore the possibility of further increasing the stability of coconut milk. A microscopic study was carried out to analyze the size of the fat globules and the degree of their uniform distribution.

Keywords: coconut milk, homogenization, stability, testa, ultrasonication

Procedia PDF Downloads 299
2812 Managing Data from One Hundred Thousand Internet of Things Devices Globally for Mining Insights

Authors: Julian Wise

Abstract:

Newcrest Mining is one of the world's top five gold and rare earth mining organizations by production, reserves and market capitalization. This paper elaborates on the data acquisition processes employed by Newcrest, in collaboration with the Fortune 500 listed organization Insight Enterprises, to standardize machine learning solutions that process data from over one hundred thousand distributed Internet of Things (IoT) devices located at mine sites globally. Through the utilization of cloud software architecture and edge computing, these technological developments enable standardized machine learning applications to influence the strategic optimization of mineral processing. Target objectives of the machine learning optimizations include time savings in mineral processing, production efficiencies, risk identification, and increased production throughput. The data acquired and utilized for predictive modelling is processed through edge computing and collectively stored within a data lake. Involvement in this digital transformation has necessitated standardizing the software architecture that manages the machine learning models submitted by vendors, to ensure effective automation and continuous improvement of the mineral process models. Operating at scale, the system processes hundreds of gigabytes of data per day from distributed mine sites across the globe, for the purposes of improved worker safety and production efficiency through big data applications.
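A very small sketch of the edge-side step described above (pre-aggregating raw sensor readings before they land in the central data lake) is given below. The payload layout, window length, sensor identifiers and upload stub are invented purely for illustration and do not describe Newcrest's actual system.

```python
# Hypothetical edge pre-aggregation: reduce raw IoT readings to per-window
# statistics before shipping them to the central data lake.
from statistics import mean
from collections import defaultdict

def aggregate(readings, window_s=60):
    """readings: iterable of (timestamp_s, sensor_id, value)."""
    buckets = defaultdict(list)
    for ts, sensor_id, value in readings:
        buckets[(sensor_id, int(ts // window_s))].append(value)
    return [
        {"sensor": sid, "window": w, "mean": mean(vals),
         "min": min(vals), "max": max(vals), "count": len(vals)}
        for (sid, w), vals in buckets.items()
    ]

raw = [(0, "mill-7-vibration", 0.41), (12, "mill-7-vibration", 0.44),
       (61, "mill-7-vibration", 0.39)]
for record in aggregate(raw):
    print(record)          # in practice: queue these records for upload to the data lake
```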

Keywords: mineral technology, big data, machine learning operations, data lake

Procedia PDF Downloads 94
2811 Contribution of Spatial Teledetection to the Geological Mapping of the Imiter Buttonhole: Application to the Mineralized Structures of the Principal Corps B3 (CPB3) of the Imiter Mine (Anti-atlas, Morocco)

Authors: Bouayachi Ali, Alikouss Saida, Baroudi Zouhir, Zerhouni Youssef, Zouhair Mohammed, El Idrissi Assia, Essalhi Mourad

Abstract:

The world-class Imiter silver deposit is located on the northern flank of the Precambrian Imiter buttonhole (inlier). This deposit is formed by epithermal veins hosted in the sandstone-pelite formations of the lower complex and in the basal conglomerates of the upper complex; these veins are controlled by a regional-scale fault cluster oriented N70°E to N90°E. The present work deals with the contribution of remote sensing to the geological mapping of the Imiter buttonhole and its application to the mineralized structures of the Principal Corps B3. Mapping on satellite images is a very important tool in mineral prospecting. It allows the localization of zones of interest in order to orient field missions by helping to locate the major structures, which facilitates the interpretation, programming and orientation of the mining works. The predictive map also allows for the correction of field mapping work, especially the direction and dimensions of structures such as dykes, corridors or scrapings. The use of a series of treatments such as SAM, PCA, MNF and unsupervised and supervised classification on a Landsat 8 satellite image of the study area allowed us to highlight the main facies of the Imiter area. To improve the exploration research, we applied further processing to map the spatial distribution of alteration mineral indices, and applied several filters to the different bands to obtain lineament maps.
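Of the treatments listed above, PCA is the easiest to sketch: given a Landsat 8 scene already loaded as a band stack, the principal components can be computed as below. Band loading, scene size and component count are placeholders (random data stands in for the scene), and SAM and MNF are not shown.

```python
# Sketch: principal component analysis of a Landsat 8 band stack for
# lineament/alteration mapping; assumes the bands are already in a NumPy array.
import numpy as np
from sklearn.decomposition import PCA

# bands: shape (n_bands, rows, cols); random data stands in for the real scene
n_bands, rows, cols = 7, 200, 200
bands = np.random.rand(n_bands, rows, cols).astype(np.float32)

pixels = bands.reshape(n_bands, -1).T        # (n_pixels, n_bands)
pca = PCA(n_components=3)
pcs = pca.fit_transform(pixels)              # first three principal components
pc_images = pcs.T.reshape(3, rows, cols)     # back to image form for display

print("explained variance ratios:", pca.explained_variance_ratio_.round(3))
```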

Keywords: principal corps B3, teledetection, Landsat 8, Imiter II, silver mineralization, lineaments

Procedia PDF Downloads 81
2810 A U-Net Based Architecture for Fast and Accurate Diagram Extraction

Authors: Revoti Prasad Bora, Saurabh Yadav, Nikita Katyal

Abstract:

In the context of educational data mining, the use case of extracting information from images containing both text and diagrams is of high importance. Document analysis therefore requires extracting the diagrams from such images and processing the text and diagrams separately. To the authors' best knowledge, none of the many approaches for extracting tables, figures, etc., satisfies the need for real-time processing with high accuracy, as required in multiple applications. In the education domain, diagrams can have varied characteristics, e.g., line-based geometric diagrams, chemical bonds, mathematical formulas, etc. There are two broad categories of approaches that try to solve similar problems: traditional computer vision based approaches and deep learning approaches. The traditional computer vision based approaches mainly leverage connected components and distance-transform based processing and hence perform well only in very limited scenarios. The existing deep learning approaches leverage either YOLO or Faster R-CNN architectures. These approaches suffer from a performance-accuracy tradeoff. This paper proposes a U-Net based architecture that formulates diagram extraction as a segmentation problem. The proposed method provides similar accuracy with a much faster extraction time compared to the mentioned state-of-the-art approaches. Further, the segmentation mask in this approach allows the extraction of diagrams of irregular shapes.
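A pared-down member of the segmentation network family referred to above is sketched below in PyTorch. The depth, channel widths, single-channel output and input size are illustrative choices, not the architecture actually trained in the paper.

```python
# Minimal two-level U-Net style segmentation network (illustrative only).
import torch
import torch.nn as nn

def conv_block(c_in, c_out):
    return nn.Sequential(
        nn.Conv2d(c_in, c_out, 3, padding=1), nn.ReLU(inplace=True),
        nn.Conv2d(c_out, c_out, 3, padding=1), nn.ReLU(inplace=True),
    )

class TinyUNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.enc1 = conv_block(1, 16)
        self.enc2 = conv_block(16, 32)
        self.pool = nn.MaxPool2d(2)
        self.bottleneck = conv_block(32, 64)
        self.up2 = nn.ConvTranspose2d(64, 32, 2, stride=2)
        self.dec2 = conv_block(64, 32)
        self.up1 = nn.ConvTranspose2d(32, 16, 2, stride=2)
        self.dec1 = conv_block(32, 16)
        self.head = nn.Conv2d(16, 1, 1)   # 1-channel mask: diagram vs background

    def forward(self, x):
        e1 = self.enc1(x)
        e2 = self.enc2(self.pool(e1))
        b = self.bottleneck(self.pool(e2))
        d2 = self.dec2(torch.cat([self.up2(b), e2], dim=1))   # skip connection
        d1 = self.dec1(torch.cat([self.up1(d2), e1], dim=1))  # skip connection
        return torch.sigmoid(self.head(d1))

net = TinyUNet()
page = torch.rand(1, 1, 256, 256)         # grayscale document image
mask = net(page)                          # per-pixel diagram probability
print(mask.shape)                         # torch.Size([1, 1, 256, 256])
```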

Keywords: computer vision, deep-learning, educational data mining, faster-RCNN, figure extraction, image segmentation, real-time document analysis, text extraction, U-Net, YOLO

Procedia PDF Downloads 114
2809 Interlanguage Acquisition of a Postposition ‘e’ in Korean: Analysis of the Korean Novice Learners’ Output

Authors: Eunjung Lee

Abstract:

This study aims to analyze the sentences produced by beginners learning ‘e,’ a postposition in Korean, and to find the regularities in the learners’ interlanguage by investigating the uses of ‘e’ that appear, by meaning and function, in their interlanguage and the conditions under which ‘e’ is used. The study was conducted with two main assumptions: first, that the learner’s language has a specific type of interlanguage; and second, that there is regularity in the interlanguage when students produce ‘e’ under specific conditions. Learners’ output has various values and can be used as useful data for understanding interlanguage. Therefore, all sentences containing the postposition ‘e’ produced by English-speaking learners were searched in the Learners’ Corpus Sharing Center of The National Institute of Korean Language in Korea, and the data were collected from learners at levels 1 and 2. In total, 789 sentences containing ‘e’ were selected as the final subjects of the analysis. To understand the environments in which the postposition ‘e’ is used, after summarizing the 13 meanings and functions of ‘e’ that appear in three Korean dictionaries summarizing Korean grammar, 1) the meaning function of ‘e’ used in each sentence was classified; 2) the nouns combined with ‘e,’ the keywords of the sentences, and the characteristics of the modifiers, linkers, and predicates appearing around ‘e’ were analyzed; 3) the regularities of the novice learners’ meanings and functions were reviewed; and 4) the differences in the regularities between level 1 and level 2 learners were examined. The results show that the novice learners 1) mainly used nouns related to ‘time(시간), before(전), after(후), next(다음), the next(그다음), then(때), day of the week(요일), and season(계절)’ in front of ‘e’ when they used ‘e’ with the meaning function of time; 2) mainly used the verbs ‘go(가다),’ ‘come(오다),’ and ‘go round(다니다)’ as the predicates matching ‘e’ with the meaning function of direction and destination; and 3) mainly used nouns related to locations or countries in front of ‘e’ as a meaning function postposition of place, mainly used the verbs ‘be(있다), not be(없다), live(살다), be many(많다)’ after ‘e,’ combined ‘i(이) or ka(가)’ mainly with the subject words in the case of ‘be(있다), not be(없다)’ or ‘be many(많다),’ and combined ‘eun(은) or nun(는)’ mainly with the subject words in front of ‘live(살다).’ In addition, 4) they used ‘e’ indicating cause or reason in the form of ‘because(때문에),’ and 5) used ‘e’ of subjects with predicates such as ‘treat(대하다), like(들다), and catch(걸리다).’ From these results, the ‘e’ usage patterns of Korean novice learners differ considerably by meaning function, and the learners’ interlanguage regularity could be deduced. However, little difference in interlanguage regularity was found between levels 1 and 2. This study attempts to understand the interlanguage system and its regularity in learners’ acquisition of the postposition ‘e,’ and the findings can be utilized to reduce learners’ errors.

Keywords: interlanguage, interlanguage analysis, postposition ‘e’, Korean acquisition

Procedia PDF Downloads 111
2808 The Importance of Visual Communication in Artificial Intelligence

Authors: Manjitsingh Rajput

Abstract:

Visual communication plays an important role in artificial intelligence (AI) because it enables machines to understand and interpret visual information, similar to how humans do. This abstract explores the importance of visual communication in AI, emphasizing applications such as computer vision, object recognition, image classification and autonomous systems, together with the deep learning techniques and neural networks that underpin visual understanding. It also discusses challenges facing visual interfaces for AI, such as data scarcity, domain optimization, and interpretability, as well as the integration of visual communication with other modalities such as natural language processing and speech recognition. Overall, the abstract highlights the critical role that visual communication plays in advancing AI capabilities and enabling machines to perceive and understand the world around them. The methodology explores the importance of visual communication in AI development and implementation, highlighting its potential to enhance the effectiveness and accessibility of AI systems, and provides a comprehensive approach to integrating visual elements into AI systems to make them more user-friendly and efficient. In conclusion, visual communication is crucial in AI systems for object recognition, facial analysis, and augmented reality, but challenges such as data quality, interpretability, and ethics must be addressed. Visual communication enhances user experience, decision-making, accessibility, and collaboration, and developers can integrate visual elements to build efficient and accessible AI systems.

Keywords: visual communication AI, computer vision, visual aid in communication, essence of visual communication

Procedia PDF Downloads 69
2807 Development of Mobile Application for Internship Program Management Using the Concept of Model View Controller (MVC) Pattern

Authors: Shutchapol Chopvitayakun

Abstract:

Nowadays, and especially over the last five years, mobile devices, mobile applications and mobile users, supported by the deployment of wireless communication and mobile phone cellular networks, have all been growing significantly bigger and stronger. They are being integrated with each other to serve multiple purposes and are being deployed pervasively in every business and non-business sector, such as education, medicine, travel, finance, real estate and many more. The objective of this study was to develop a mobile application for seniors, i.e., last-year students, who enroll in the internship program at a tertiary (undergraduate) school and do onsite practice at real field sites, real organizations and real workspaces. During the internship session, all students, as interns, are required to exercise, drill and train onsite at specific locations and on specific tasks, possibly with some assignments from their supervisors. Their workplaces are both private and government corporations and enterprises. This mobile application is developed under the schema of a transactional processing system that enables users to keep a daily work or practice log, monitor true working locations and follow the daily tasks of each trainee. Moreover, it provides useful guidance from each intern's advisor in case of emergency. Finally, it can summarize all transactional data and then calculate each intern's cumulated internship hours from the field practice session.
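The hours-summarization feature mentioned at the end can be illustrated in a few lines of code. The sketch below totals cumulated practice hours per intern from daily log records; the record layout is hypothetical, and the real application would implement this inside its MVC model layer on Android rather than in Python.

```python
# Hypothetical model-layer helper: total cumulated practice hours per intern
# from the daily work-log records kept by the mobile application.
from collections import defaultdict
from datetime import datetime

logs = [
    {"intern": "S001", "start": "2016-06-01 09:00", "end": "2016-06-01 16:30"},
    {"intern": "S001", "start": "2016-06-02 09:00", "end": "2016-06-02 17:00"},
    {"intern": "S002", "start": "2016-06-01 08:30", "end": "2016-06-01 15:30"},
]

def cumulated_hours(records):
    totals = defaultdict(float)
    for r in records:
        start = datetime.strptime(r["start"], "%Y-%m-%d %H:%M")
        end = datetime.strptime(r["end"], "%Y-%m-%d %H:%M")
        totals[r["intern"]] += (end - start).total_seconds() / 3600.0
    return dict(totals)

print(cumulated_hours(logs))   # e.g. {'S001': 15.5, 'S002': 7.0}
```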

Keywords: internship, mobile application, Android OS, smart phone devices, mobile transactional processing system, guidance and monitoring, tertiary education, senior students, model view controller (MVC)

Procedia PDF Downloads 294
2806 X-Ray Diffraction, Microstructure, and Mössbauer Studies of Nanostructured Materials Obtained by High-Energy Ball Milling

Authors: N. Boudinar, A. Djekoun, A. Otmani, B. Bouzabata, J. M. Greneche

Abstract:

High-energy ball milling is a solid-state powder processing technique that allows a variety of equilibrium and non-equilibrium alloy phases to be synthesized starting from elemental powders. The advantages of this processing technology are that the powder can be produced in large quantities and the processing parameters can be easily controlled, making it suitable for commercial applications. It can also be used to produce amorphous and nanocrystalline materials in commercially relevant amounts and is amenable to the production of a variety of alloy compositions. Mechanical alloying (high-energy ball milling) provides an inter-dispersion of elements through repeated cold welding and fracturing of free powder particles; the grain size decreases to the nanometric scale and the elements mix together. Progressively, the concentration gradients disappear, and eventually the elements are mixed at the atomic scale. The end products depend on many parameters, such as the milling conditions and the thermodynamic properties of the milled system. Here, the mechanical alloying technique has been used to prepare nanocrystalline Fe–50 and Fe–64 wt.% Ni alloys from powder mixtures. Scanning electron microscopy (SEM) with energy-dispersive X-ray analysis and Mössbauer spectroscopy were used to study the mixing at the nanometric scale. Mössbauer spectroscopy confirmed the ferromagnetic ordering and was used to calculate the hyperfine field distribution. The Mössbauer spectra of both alloys show the existence of a ferromagnetic phase attributed to a γ-Fe-Ni solid solution.

Keywords: nanocrystalline, mechanical alloying, X-ray diffraction, Mössbauer spectroscopy, phase transformations

Procedia PDF Downloads 422
2805 Vehicle Speed Estimation Using Image Processing

Authors: Prodipta Bhowmik, Poulami Saha, Preety Mehra, Yogesh Soni, Triloki Nath Jha

Abstract:

In India, the smart city concept is growing day by day, and smart city development requires a better traffic management and monitoring system. Road accidents are increasing as more vehicles take to the road, and reckless driving is responsible for a large share of them, so an efficient traffic management system is required on all kinds of roads to control traffic speed. The speed limit varies from road to road. Radar systems have been used previously, but their high cost and limited precision have prevented them from becoming favorable in traffic management systems. Traffic management faces different types of problems every day, and how to solve them has become an active research topic. This paper proposes a computer vision and machine learning-based automated system for detecting, tracking, and estimating the speed of multiple vehicles using image processing. Detecting vehicles and estimating their speed from real-time video is a challenging task, and the objective of this paper is to do so as accurately as possible. A real-time video is first captured, frames are extracted from the video, vehicles are detected in those frames, the detected vehicles are then tracked, and finally the speed of the moving vehicles is estimated. The goal of this method is to develop a cost-friendly system that can detect multiple types of vehicles at the same time.
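The frame-by-frame pipeline described above can be sketched as follows; the cascade file, video path, and pixel-to-metre calibration constant are illustrative assumptions, and the naive nearest-centroid matching stands in for the paper's tracker rather than reproducing it.

```python
# Minimal sketch of the detect-track-estimate pipeline (illustrative only).
import math
import cv2

cap = cv2.VideoCapture("traffic.mp4")                 # hypothetical input video
fps = cap.get(cv2.CAP_PROP_FPS) or 30.0
detector = cv2.CascadeClassifier("cars.xml")          # hypothetical Haar cascade
METRES_PER_PIXEL = 0.05                               # assumed calibration constant

prev_centroids = []
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    boxes = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=4)
    centroids = [(x + w / 2, y + h / 2) for (x, y, w, h) in boxes]

    # Naive centroid tracking: match each detection to the nearest previous centroid
    for cx, cy in centroids:
        if prev_centroids:
            px, py = min(prev_centroids,
                         key=lambda p: math.hypot(p[0] - cx, p[1] - cy))
            pixels_moved = math.hypot(cx - px, cy - py)
            speed_kmh = pixels_moved * METRES_PER_PIXEL * fps * 3.6
            print(f"~{speed_kmh:.1f} km/h at ({int(cx)}, {int(cy)})")
    prev_centroids = centroids

cap.release()
```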

Keywords: OpenCV, Haar Cascade classifier, DLIB, YOLOV3, centroid tracker, vehicle detection, vehicle tracking, vehicle speed estimation, computer vision

Procedia PDF Downloads 63
2804 Isolation and Selection of Strains Perspective for Sewage Sludge Processing

Authors: A. Zh. Aupova, A. Ulankyzy, A. Sarsenova, A. Kussayin, Sh. Turarbek, N. Moldagulova, A. Kurmanbayev

Abstract:

One method of bioconversion of organic waste into environmentally friendly fertilizer is composting. Microorganisms that produce hydrolytic enzymes play a significant role in accelerating the composting of organic waste. We studied the enzymatic potential (amylase, protease, cellulase, lipase, and urease activity) of bacteria isolated from the sewage sludge of the cities of Nur-Sultan, Rudny, and Fort-Shevchenko, the dacha soil of Nur-Sultan city, and freshly cut grass from the dacha, with the aim of processing organic waste and identifying active strains. Microorganisms were isolated by enrichment culture in liquid nutrient media, followed by inoculation on different solid media to obtain individual colonies. As a result, sixty-one microorganisms were isolated, three of which were thermophiles (DS1, DS2, and DS3). The highest numbers of isolates, twenty-one and eighteen, were obtained from the sewage sludge of Nur-Sultan and Rudny, respectively. Ten isolates came from the wastewater of the sewage treatment plant in Fort-Shevchenko, and nine and five isolates from the dacha soil of Nur-Sultan city and the freshly cut grass, respectively. The lipolytic, proteolytic, amylolytic, cellulolytic, ureolytic, and oil-oxidizing activities of the isolates were studied. Starch hydrolysis (amylolytic activity) was found in two isolates, CB2/2 and CB2/1. Three isolates, CB2, CB2/1, and CB1/1, were selected for the highest ability to break down casein. Among the 61 isolated bacterial cultures, three isolates could break down fats: CB3, CBG1/1, and IL3. Seven strains had cellulolytic activity: DS1, DS2, IL3, IL5, P2, P5, and P3. Six isolates rapidly decomposed urea. Isolate P1 could break down casein and cellulose, and isolate DS3 was a thermophile with cellulolytic activity. Thus, based on these studies, 15 isolates were selected as potential candidates for sewage sludge composting: CB2, CB3, CB1/1, CB2/2, CBG1/1, CB2/1, DS1, DS2, DS3, IL3, IL5, P1, P2, P5, and P3. The selected strains were identified by MALDI-TOF mass spectrometry. Isolate CB3 was assigned to Rhodococcus rhodochrous; isolates CB2 and CB1/1 to Bacillus cereus; CB2/2 to Chryseobacterium arachidis; CBG1/1 to Pseudoxanthomonas sp.; CB2/1 to Bacillus megaterium; DS1 to Pediococcus acidilactici; DS2 to Paenibacillus residui; DS3 to Brevibacillus invocatus; the three strains IL3, P5, and P3 to Enterobacter cloacae; the two strains IL5 and P2 to Ochrobactrum intermedium; and P1 to Bacillus licheniformis. Hence, 60 isolates were obtained from the wastewater of the cities of Nur-Sultan, Rudny, and Fort-Shevchenko, the dacha soil of Nur-Sultan city, and freshly cut grass from the dacha. Based on the highest enzymatic activity, 15 active isolates were selected and identified. These strains may become candidates for a biopreparation for sewage sludge processing.

Keywords: sewage sludge, composting, bacteria, enzymatic activity

Procedia PDF Downloads 88
2803 Low Temperature Biological Treatment of Chemical Oxygen Demand for Agricultural Water Reuse Application Using Robust Biocatalysts

Authors: Vedansh Gupta, Allyson Lutz, Ameen Razavi, Fatemeh Shirazi

Abstract:

The agriculture industry is especially vulnerable to forecasted water shortages. In the fresh and fresh-cut produce sector, conventional flume-based washing with recirculation exhibits high water demand. This leads to a large water footprint and possible cross-contamination of pathogens. These problems can be alleviated through advanced water reuse processes, such as membrane technologies including reverse osmosis (RO). Water reuse technologies effectively remove dissolved constituents but can easily foul without pre-treatment. Biological treatment is effective for the removal of the organic compounds responsible for fouling, but not at the low temperatures encountered at most produce processing facilities. This study showed that the Microvi MicroNiche Engineering (MNE) technology effectively removes organic compounds (>80%) from wash water at low temperatures (6-8 °C). The MNE technology uses synthetic microorganism-material composites with negligible solids production, making it well suited as a bio-pretreatment for RO. A preliminary technoeconomic analysis showed 60-80% savings in operation and maintenance costs (OPEX) when using the Microvi MNE technology for organics removal. This study and the accompanying economic analysis indicated that the proposed process will substantially reduce the cost barrier to adopting water reuse practices, thereby contributing to increased food safety and furthering sustainable water reuse across the agricultural industry.
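As a back-of-the-envelope illustration of how the two headline figures are defined (not a reproduction of the study's data), the removal efficiency and OPEX savings follow from simple ratios; all input values below are assumed purely for illustration.

```python
# Illustrative sketch of the two reported metrics; inputs are assumptions,
# not measurements from the study.
cod_in, cod_out = 1000.0, 180.0                  # assumed COD in/out, mg/L
removal = (cod_in - cod_out) / cod_in * 100      # e.g. 82%, consistent with ">80%"

baseline_opex, treated_opex = 100.0, 30.0        # assumed relative O&M costs
savings = (baseline_opex - treated_opex) / baseline_opex * 100  # 70%, within 60-80%

print(f"COD removal: {removal:.0f}%  OPEX savings: {savings:.0f}%")
```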

Keywords: biological pre-treatment, innovative technology, vegetable processing, water reuse, agriculture, reverse osmosis, MNE biocatalysts

Procedia PDF Downloads 114
2802 Quality Analysis of Lake Malawi's Diplotaxodon Fish Species Processed in Solar Tent Dryer versus Open Sun Drying

Authors: James Banda, Jupiter Simbeye, Essau Chisale, Geoffrey Kanyerere, Kings Kamtambe

Abstract:

Improved solar tent dryers for processing small fish species were designed under the CultiAF project to reduce post-harvest fish losses and improve the supply of quality fish products in the southern part of Lake Malawi. A comparative analysis of the quality of Diplotaxodon (Ndunduma) from Lake Malawi processed in a solar tent dryer and by open sun drying was conducted using proximate analysis, microbial analysis, and sensory evaluation. Proximate values for solar tent dried and open sun dried fish were 63.3±0.15% and 63.3±0.34% for protein, 19.6±0.09% and 19.9±0.25% for fat, 8.3±0.12% and 17.0±0.01% for moisture, and 15.6±0.61% and 21.9±0.91% for ash, respectively. Crude protein and crude fat showed non-significant differences (p = 0.05), while moisture and ash content were significantly different (p = 0.001). Open sun dried fish had significantly higher viable bacterial counts (5.2×10⁶ CFU) than solar tent dried fish (3.9×10² CFU). Counts of bacteria isolated from solar tent dried and open sun dried fish were, respectively, 1.0×10¹ and 7.2×10³ for total coliforms, 0 and 4.5×10³ for Escherichia coli, 0 and 7.5×10³ for Salmonella, 0 and 5.7×10² for Shigella, 4.0×10¹ and 6.1×10³ for Staphylococcus, and 1.0×10¹ and 7.0×10² for Vibrio. Qualitative evaluation of sensory properties showed higher acceptability for solar tent dried fish (3.8) than for open sun dried fish (1.7). It is concluded that promoting solar tent drying for processing small fish species in Malawi would help small-scale fish processors produce quality fish in terms of nutritive value, reduced microbial contamination, sensory acceptability, and reduced moisture content.

Keywords: diplotaxodon, Malawi, open sun drying, solar tent drying

Procedia PDF Downloads 312
2801 Design and Implementation of Collaborative Editing System Based on Physical Simulation Engine Running State

Authors: Zhang Songning, Guan Zheng, Ci Yan, Ding Gangyi

Abstract:

Physical simulation engines play an important role in collaborative editing systems. First, they provide realistic physical simulations, enabling users to interact and collaborate in real time in virtual environments; this gives collaborative editing systems a more intuitive and immersive experience, allowing users to perceive and understand the elements and operations involved in collaborative editing more accurately. Second, through physical simulation engines, different users can share a virtual space and perform real-time collaborative editing within it; this real-time sharing and collaborative editing helps synchronize information among team members and improves the efficiency of collaborative work. In experiments, the average single-user model transmission speed in the collaborative editing system increased by 141.91%, the average single-user model processing speed by 134.2%, the average single-user processing flow rate by 175.19%, and the overall single-user efficiency by 150.43%. As the number of users increases, overall efficiency remains stable, and the collaborative editing system based on the physical simulation engine running state also scales horizontally. The design and implementation of a collaborative editing system based on physical simulation engines thus not only enriches the user experience but also improves the effectiveness of team collaboration, opening new possibilities for collaborative work.

Keywords: physics engine, simulation technology, collaborative editing, system design, data transmission

Procedia PDF Downloads 56