Search results for: semantic processing
2964 The Reasons for Food Losses and Waste and the Trends of Their Management in Basic Vegetal Production in Poland
Authors: Krystian Szczepanski, Sylwia Łaba
Abstract:
Production of fruit, vegetables, food cereals, and oilseeds affects the natural environment through the uptake of nutrients contained in the soil and the use of water, fertilizers, plant protection products, and energy. Limiting these effects requires the introduction of environmentally friendly cultivation techniques and methods that counteract losses and waste of agricultural raw materials, together with appropriate management of food waste at every stage of the agri-food supply chain. Basic production comprises obtaining the vegetal raw material, storing it on the farm, and transporting it to a collecting point. The starting point is the moment the plants are ready to be harvested; the pre-harvest stage is not considered in the system for measuring and monitoring food losses. The subsequent stage is the transport of the harvested crops to the collecting point, or their storage and transport. The moment the raw material enters processing, i.e., its receipt at the gate of the processing plant, is taken as the end point of basic production; processing is understood as the transformation of the raw material into food products. According to Regulation (EC) No 178/2002 of the European Parliament and of the Council of 28 January 2002, Art. 2, "food" means any substance or product intended to be, or reasonably expected to be, consumed by humans. For the purposes of these studies, it was determined that the raw material is considered food from the moment the harvested plants (fruit, vegetables, cereals, oilseeds) arrive at storehouses. The aim of the studies was to determine the reasons for loss generation and to analyze the directions of loss management in basic vegetal production in Poland in 2017 and 2018. The studies on food losses and waste in basic vegetal production covered three sectors: fruit and vegetables, cereals, and oilseeds. They were conducted from March to May 2019 across the whole country on a representative sample of 250 farms in each sector. The surveys were carried out with questionnaires using the PAPI (Paper & Pen Personal Interview) method; pollsters conducted direct questionnaire interviews. The studies show that in 19% of the examined farms, no losses were recorded during preparation, loading, and transport of the raw material to the manufacturing plant. On the farms that did report losses, the main reason in fruit and vegetable production was rotting, accounting for more than 20% of the reported reasons, while in cereal and oilseed production the respondents identified damage, moisture, and pests as the most frequent reasons. Losses and waste generated in vegetal production, as well as in the processing and trade of fruit, vegetables, and cereal products, should be appropriately managed or recovered. The respondents indicated composting (more than 60%) as the main direction of waste management in all categories; animal feed and landfill were the other indicated directions. Prevention and minimization of loss generation are important at every stage of production, including basic production. Knowing the reasons for loss generation makes it possible to introduce preventive measures, mainly concerning appropriate storage conditions and methods.
ACKNOWLEDGEMENT: The article was prepared within the project "Development of a food waste monitoring system and an effective program to rationalize losses and reduce food waste" (acronym PROM), implemented under the STRATEGIC SCIENTIFIC AND LEARNING PROGRAM - GOSPOSTRATEG, financed by the National Center for Research and Development in accordance with the provisions of Gospostrateg1/385753/1/2018.
Keywords: food losses, food waste, PAPI method, vegetal production
Procedia PDF Downloads 116
2963 Cognitive Semantics Study of Conceptual and Metonymical Expressions in Johnson's Speeches about COVID-19
Authors: Hussain Hameed Mayuuf
Abstract:
The study is an attempt to investigate the conceptual metonymies used in political discourse about COVID-19. It analyzes how the conceptual metonymies in Johnson's speeches about the coronavirus are constructed. The study aims to identify how metonymies are relevant to understanding the messages in Boris Johnson's speeches, to find out how conceptual blending theory can help people understand the messages in political speech about COVID-19, and, lastly, to point out which kinds of integration networks are common in political speech. The study is based on the hypotheses that conceptual blending theory is a powerful tool for investigating the intended messages in Johnson's speech and that different processes of blending networks and conceptual mapping enable listeners to identify the messages in political speech. The study presents a qualitative and quantitative analysis of four speeches about COVID-19 delivered by Boris Johnson. The selected data are tackled from a cognitive-semantic perspective, adopting Conceptual Blending Theory (CBT) as the model of analysis. The study concludes that CBT is applicable to the analysis of metonymies in political discourse: its mechanisms enable listeners to analyze and understand these speeches, and listeners can identify and understand the hidden messages in Johnson's discourse about COVID-19 by using different conceptual networks. Finally, it is concluded that double-scope networks are the most common type of blending of metonymies in political speech.
Keywords: cognitive, semantics, conceptual, metonymical, Covid-19
Procedia PDF Downloads 130
2962 Quantification of E-Waste: A Case Study in Federal University of Espírito Santo, Brazil
Authors: Andressa S. T. Gomes, Luiza A. Souza, Luciana H. Yamane, Renato R. Siman
Abstract:
The segregation of waste electrical and electronic equipment (WEEE) at the generating source, its quali-quantitative characterization, and the identification of its origin, besides being integral parts of classification reports, are crucial steps for the success of its integrated management. The aim of this paper was to quantify WEEE generation at the Federal University of Espírito Santo (UFES), Brazil, as well as to identify sources, temporary storage sites, main transportation routes and destinations, the most generated types of WEEE, and their recycling potential. WEEE generated at the University between 2010 and 2015 was quantified using data provided by UFES's asset management sector, and information on the flow of EEE and WEEE on the campuses was obtained through questionnaires applied to University workers. A total of 6,028 units of data processing equipment were disposed of by the University between 2010 and 2015; the most generated items were CRT screens, desktops, keyboards, and printers. Furthermore, it was observed that this WEEE is temporarily stored in inappropriate places on the University campuses. In general, these units are donated to NGOs in the city or sold through auctions (2010 and 2013). As for recycling potential, the primary processing and subsequent sale of printed circuit boards (PCBs) from the computers could raise as much as US$ 27,839.23. The results highlight the importance of a WEEE management policy at the University.
Keywords: solid waste, waste of electrical and electronic equipment, waste management, institutional solid waste generation
Procedia PDF Downloads 260
2961 Opinion Mining to Extract Community Emotions on Covid-19 Immunization Possible Side Effects
Authors: Yahya Almurtadha, Mukhtar Ghaleb, Ahmed M. Shamsan Saleh
Abstract:
The world witnessed a fierce attack from the COVID-19 virus, which affected public life socially, economically, and in terms of health and psychology. The world's governments tried to confront the pandemic by imposing precautionary measures such as general closures, curfews, and social distancing. Scientists also made strenuous efforts to develop an effective vaccine to train the immune system to develop antibodies that combat the virus, thus reducing its symptoms and limiting its spread. Artificial intelligence, along with researchers and medical authorities, accelerated the vaccine development process through big data processing and simulation. On the other hand, one of the most damaging effects of COVID-19 was the state of anxiety and fear caused by the spread of rumors through social media, which prompted governments to try to reassure the public by the available means. This study proposes using sentiment analysis (also known as opinion mining) and deep learning as efficient artificial intelligence techniques to retrieve public tweets from Twitter and then analyze them automatically to extract opinions, expressions, and feelings, negative or positive, about the symptoms people may feel after vaccination. Sentiment analysis is characterized by its ability to access what the public posts on social media in record time and at a lower cost than traditional means such as questionnaires and interviews, not to mention the accuracy of the information, as it comes from what the public expresses voluntarily.
Keywords: deep learning, opinion mining, natural language processing, sentiment analysis
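As a minimal sketch of the kind of pipeline this abstract describes (not the authors' implementation; the default model and the sample tweets are illustrative assumptions), a pre-trained transformer sentiment classifier can be applied to vaccination-related tweets:

    # Hedged sketch: classify tweet sentiment with a pre-trained transformer.
    # The default model and the example tweets are illustrative assumptions,
    # not the authors' data or configuration.
    from transformers import pipeline

    classifier = pipeline("sentiment-analysis")  # loads a default English model

    tweets = [
        "Got my second dose yesterday, only a sore arm so far.",
        "Fever and chills all night after the vaccine, feeling awful.",
    ]

    for tweet in tweets:
        result = classifier(tweet)[0]
        print(f"{result['label']:>8}  {result['score']:.3f}  {tweet}")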
Procedia PDF Downloads 172
2960 Architecture - Performance Relationship in GPU Computing - Composite Process Flow Modeling and Simulations
Authors: Ram Mohan, Richard Haney, Ajit Kelkar
Abstract:
Current developments in computing have shown the advantage of using one or more Graphic Processing Units (GPUs) to boost the performance of many computationally intensive applications, but there are still limits to these GPU-enhanced systems. The major factors that contribute to the limitations of GPUs for High Performance Computing (HPC) can be categorized as hardware or software oriented in nature. Understanding how these factors affect performance is essential to developing efficient and robust application codes that employ one or more GPU devices as powerful co-processors for HPC computational modeling. This research and technical presentation focuses on the analysis and understanding of the intrinsic interrelationship of both hardware and software categories on computational performance for single and multiple GPU-enhanced systems, using a computationally intensive application that is representative of a large portion of challenges confronting modern HPC. The representative application uses unstructured finite element computations for transient composite resin infusion process flow modeling as its computational core, whose characteristics and results reflect many other HPC applications via the sparse matrix system used for the solution of the linear system of equations. This work describes these various software and hardware factors and how they interact to affect the performance of computationally intensive applications, enabling more efficient development and porting of High Performance Computing applications, including current, legacy, and future large-scale computational modeling applications in various engineering and scientific disciplines.
Keywords: graphical processing unit, software development and engineering, performance analysis, system architecture and software performance
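The computational core described here is the repeated solution of sparse linear systems arising from unstructured finite element meshes. A hedged CPU-side sketch with SciPy is shown below; the matrix is a synthetic stand-in rather than an actual resin-infusion system, and a GPU port would substitute an equivalent GPU sparse library:

    # Hedged sketch: the sparse linear solve at the core of FE flow modeling.
    # The matrix values are random stand-ins, not an assembled FE system.
    import numpy as np
    import scipy.sparse as sp
    import scipy.sparse.linalg as spla

    n = 10_000
    A = sp.random(n, n, density=1e-3, format="csr", random_state=0)
    A = A + A.T + sp.identity(n) * 10.0   # symmetrize, make well conditioned
    b = np.ones(n)

    x = spla.spsolve(A.tocsc(), b)        # direct sparse solve
    print("residual norm:", np.linalg.norm(A @ x - b))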
Procedia PDF Downloads 364
2959 Processing Methods for Increasing the Yield, Nutritional Value and Stability of Coconut Milk
Authors: Archana G. Lamdande, Shyam R. Garud, K. S. M. S. Raghavarao
Abstract:
Coconut has two edible parts, a white kernel (solid endosperm) and coconut water (liquid endosperm). The white kernel is generally used in fresh or dried form for culinary purposes. Coconut testa is the brown skin covering the coconut kernel. It is removed by paring wet coconut and is obtained as a by-product in coconut processing industries during the production of products such as desiccated coconut, coconut milk, whole coconut milk powder, and virgin coconut oil. At present, it is used as an animal feed component after drying and recovering the residual oil (by expelling). Experiments were carried out on expelling coconut milk from shredded coconut with and without testa removal, in order to explore the possibility of increasing the milk yield and adding value in terms of increased polyphenol content. The color characteristics of coconut milk obtained by grating without removal of testa were L* 82.79, a* 0.0125, b* 6.245, while those of milk obtained by grating with removal of testa were L* 83.24, a* -0.7925, b* 3.1. A significant increase was observed in the total phenol content of coconut milk obtained by grating with testa (833.8 µl/ml) compared to that without testa (521.3 µl/ml). However, no significant difference was observed in the protein content of coconut milk obtained by grating with and without testa (4.9 and 5.0% w/w, respectively). Coconut milk obtained by grating without removal of testa showed a higher milk yield (62% w/w) than that obtained by grating with removal of testa (60% w/w). The fat content of the coconut milk was 32% (w/w), which makes it unstable. Therefore, several experiments were carried out to examine its stability with the fat content adjusted to different levels (32, 28, 24, and 20% w/w). It was found that the coconut milk was more stable with a fat content of 24% (w/w). Homogenization and ultrasonication and their combinations were used to explore the possibility of increasing the stability of the coconut milk. A microscopic study was carried out to analyze the size of fat globules and the degree of their uniform distribution.
Keywords: coconut milk, homogenization, stability, testa, ultrasonication
Procedia PDF Downloads 315
2958 Managing Data from One Hundred Thousand Internet of Things Devices Globally for Mining Insights
Authors: Julian Wise
Abstract:
Newcrest Mining is one of the world's top five gold and rare earth mining organizations by production, reserves, and market capitalization. This paper elaborates on the data acquisition processes employed by Newcrest, in collaboration with the Fortune 500 listed organization Insight Enterprises, to standardize machine learning solutions that process data from over one hundred thousand distributed Internet of Things (IoT) devices located at mine sites globally. Through the utilization of cloud software architecture and edge computing, these technological developments enable standardized processes of machine learning application to influence the strategic optimization of mineral processing. Target objectives of the machine learning optimizations include time savings in mineral processing, production efficiencies, risk identification, and increased production throughput. The data acquired and used for predictive modeling is processed through edge computing, with resources collectively stored within a data lake. Being involved in the digital transformation has necessitated standardizing the software architecture to manage the machine learning models submitted by vendors, to ensure effective automation and continuous improvement of the mineral process models. Operating at scale, the system processes hundreds of gigabytes of data per day from distributed mine sites across the globe for the purposes of improved worker safety and production efficiency through big data applications.
Keywords: mineral technology, big data, machine learning operations, data lake
Procedia PDF Downloads 112
2957 Contribution of Spatial Teledetection to the Geological Mapping of the Imiter Buttonhole: Application to the Mineralized Structures of the Principal Corps B3 (CPB3) of the Imiter Mine (Anti-Atlas, Morocco)
Authors: Bouayachi Ali, Alikouss Saida, Baroudi Zouhir, Zerhouni Youssef, Zouhair Mohammed, El Idrissi Assia, Essalhi Mourad
Abstract:
The world-class Imiter silver deposit is located on the northern flank of the Precambrian Imiter buttonhole. This deposit is formed by epithermal veins hosted in the sandstone-pelite formations of the lower complex and in the basal conglomerates of the upper complex; these veins are controlled by a regional-scale fault cluster oriented N70°E to N90°E. The present work concerns the contribution of remote sensing to the geological mapping of the Imiter buttonhole, applied to the mineralized structures of the Principal Corps B3. Mapping from satellite images is a very important tool in mineral prospecting: it allows the localization of zones of interest in order to orient field missions, helping to locate major structures and thereby facilitating the interpretation, programming, and orientation of mining works. A predictive map also allows correction of field mapping work, especially of the direction and dimensions of structures such as dykes, corridors, and scrapings. The use of a series of processing techniques such as SAM, PCA, MNF, and unsupervised and supervised classification on a Landsat 8 satellite image of the study area allowed us to highlight the main facies of the Imiter area. To improve exploration research, we used further processing to map the spatial distribution of alteration mineral indices, and we applied several filters to the different bands to obtain lineament maps.
Keywords: Principal Corps B3, teledetection, Landsat 8, Imiter II, silver mineralization, lineaments
Procedia PDF Downloads 96
2956 Integrating Optuna and Synthetic Data Generation for Optimized Medical Transcript Classification Using BioBERT
Authors: Sachi Nandan Mohanty, Shreya Sinha, Sweeti Sah, Shweta Sharma
Abstract:
The advancement of natural language processing has majorly influenced the field of medical transcript classification, providing a robust framework for enhancing the accuracy of clinical data processing, with enormous potential to transform healthcare and improve people's livelihoods. This research focuses on improving the accuracy of medical transcript categorization using Bidirectional Encoder Representations from Transformers (BERT) and its specialized variants, including BioBERT, ClinicalBERT, SciBERT, and BlueBERT. The experimental work employs Optuna, an optimization framework, for hyperparameter tuning to identify the most effective variant, concluding that BioBERT yields the best performance. Furthermore, various optimizers, including Adam, RMSprop, and Layerwise Adaptive Large Batch optimization (LAMB), were evaluated alongside BERT's default AdamW optimizer. The findings show that the LAMB optimizer achieves performance on par with AdamW's. Synthetic data generation techniques from Gretel were utilized to augment the dataset, expanding it from 5,000 to 10,000 rows. Subsequent evaluations demonstrated that the model maintained its performance with synthetic data, with the LAMB optimizer showing marginally better results. The enhanced dataset and optimized model configurations improved classification accuracy, showcasing the efficacy of the BioBERT variant and the LAMB optimizer, and resulting in accuracies of up to 98.2% and 90.8% for the original and combined datasets, respectively.
Keywords: BioBERT, clinical data, healthcare AI, transformer models
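As a hedged sketch of the hyperparameter search described here (the search ranges, trial count, and the placeholder training function are assumptions; the abstract does not publish the authors' exact setup), an Optuna study wrapping a fine-tuning run looks roughly like this:

    # Hedged sketch: Optuna hyperparameter tuning around a fine-tuning run.
    # fine_tune_and_eval is a hypothetical stand-in that would train BioBERT
    # and return validation accuracy; here it returns a synthetic score so
    # the sketch runs end to end without a GPU or dataset.
    import optuna

    def fine_tune_and_eval(lr, batch_size, epochs):
        # Placeholder for the real training loop (synthetic score).
        return 0.9 - abs(lr - 3e-5) * 1e3 + 0.001 * epochs

    def objective(trial):
        lr = trial.suggest_float("learning_rate", 1e-5, 5e-5, log=True)
        batch_size = trial.suggest_categorical("batch_size", [8, 16, 32])
        epochs = trial.suggest_int("epochs", 2, 5)
        return fine_tune_and_eval(lr, batch_size, epochs)

    study = optuna.create_study(direction="maximize")
    study.optimize(objective, n_trials=20)
    print("best params:", study.best_params)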
Procedia PDF Downloads 4
2955 English Pashto Contact: Morphological Adaptation of Bilingual Compound Words in Pashto
Authors: Imran Ullah Imran
Abstract:
Language contact is a familiar concept in the present global world. Across the globe, languages get mixed at different levels; borrowing and code-switching are some of the means through which languages interact. This study examines Pashto-English contact at the word and syllable levels. By recording the speech of 30 Pashto native speakers, selected via 'social network' sampling, the study located a number of Pashto-English compound words, a contact phenomenon unique of its kind. In the data analysis, tokens were categorized on the basis of their pattern and morphological structure. The study shows that Pashto-English bilingual compound words (BCWs) are very prevalent in the Pashto language, that BCWs in Pashto are fully productive, and that they have their own meanings. It also shows that the dominant pattern of hybrid words in Pashto is an independent English root word followed by a Pashto inflectional morpheme, with the root word contributing the core semantic content of the construction. The BCW construction shows how close the two languages are to each other. Pashto-English contact results in bilingual compound and hybrid words, which form a considerable number of tokens in present-day spoken Pashto. On the basis of these findings, the study assumes that this phenomenon may increase with the passage of time, which would, in turn, result in the formation of more bilingual compound and hybrid words.
Keywords: code-mixing, bilingual compound words, Pashto-English contact, hybrid words, inflectional lexical morpheme
Procedia PDF Downloads 250
2954 A U-Net Based Architecture for Fast and Accurate Diagram Extraction
Authors: Revoti Prasad Bora, Saurabh Yadav, Nikita Katyal
Abstract:
In the context of educational data mining, the use case of extracting information from images containing both text and diagrams is of high importance. Document analysis therefore requires extracting the diagrams from such images and processing the text and diagrams separately. To the authors' best knowledge, none of the many approaches for extracting tables, figures, etc., satisfies the need for real-time processing with high accuracy, as required in multiple applications. In the education domain, diagrams can have varied characteristics, viz. line-based content such as geometric diagrams, chemical bonds, mathematical formulas, etc. There are two broad categories of approaches to similar problems: traditional computer vision based approaches and deep learning approaches. The traditional computer vision based approaches mainly leverage connected components and distance transform based processing and hence perform well only in limited scenarios. The existing deep learning approaches leverage either YOLO or Faster R-CNN architectures and suffer from a performance-accuracy tradeoff. This paper proposes a U-Net based architecture that formulates diagram extraction as a segmentation problem. The proposed method provides similar accuracy with much faster extraction time compared to the mentioned state-of-the-art approaches. Further, the segmentation mask in this approach allows the extraction of diagrams of irregular shapes.
Keywords: computer vision, deep learning, educational data mining, Faster R-CNN, figure extraction, image segmentation, real-time document analysis, text extraction, U-Net, YOLO
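A hedged sketch of the architectural idea follows: a small encoder-decoder with a skip connection, treating diagram extraction as binary segmentation. The depth, channel counts, and single-channel mask output are assumptions, since the abstract does not give the exact configuration:

    # Minimal U-Net-style segmentation network in PyTorch, treating diagram
    # extraction as binary segmentation (diagram vs. background). Channel
    # sizes and depth are illustrative, not the authors' configuration.
    import torch
    import torch.nn as nn

    def conv_block(in_ch, out_ch):
        return nn.Sequential(
            nn.Conv2d(in_ch, out_ch, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(out_ch, out_ch, 3, padding=1), nn.ReLU(inplace=True),
        )

    class TinyUNet(nn.Module):
        def __init__(self):
            super().__init__()
            self.enc1 = conv_block(1, 16)
            self.enc2 = conv_block(16, 32)
            self.pool = nn.MaxPool2d(2)
            self.up = nn.ConvTranspose2d(32, 16, 2, stride=2)
            self.dec1 = conv_block(32, 16)      # 16 skip + 16 upsampled
            self.head = nn.Conv2d(16, 1, 1)     # 1-channel diagram mask

        def forward(self, x):
            e1 = self.enc1(x)
            e2 = self.enc2(self.pool(e1))
            d1 = self.dec1(torch.cat([self.up(e2), e1], dim=1))
            return torch.sigmoid(self.head(d1))

    mask = TinyUNet()(torch.randn(1, 1, 128, 128))
    print(mask.shape)   # torch.Size([1, 1, 128, 128])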
Procedia PDF Downloads 140
2953 Factor Analysis Based on Semantic Differential of the Public Perception of Public Art: A Case Study of the Malaysia National Monument
Authors: Yuhanis Ibrahim, Sung-Pil Lee
Abstract:
This study attempts to identify the factors that contribute to an assessment framework for public art, memorial monuments specifically. Memorial monuments hold significant and rich messages, whether the intention of the art is to mark and commemorate an important event or to inform younger generations about the past. A public monument should relate to the public and raise awareness about significant issues. Therefore, investigating the impact of existing public memorial art will hopefully shed some light for the stakeholders of upcoming public art projects, helping to ensure that a lucid memorial message is delivered directly to the public. The public is the main actor, since the public is the fundamental purpose for which the art was created. Perception is framed as one of the reliable evaluation tools for assessing public art impact factors. The Malaysia National Monument was selected as the case study for the investigation. The public's perceptions were gathered using a questionnaire involving 115 participants (n = 115) to obtain keywords, and the Semantic Differential Methodology (SDM) was then adopted to evaluate perceptions of the memorial monument. These perceptions were tested for reliability and then factorized using the Principal Component Analysis (PCA) method of factor analysis to acquire concise factors for monument assessment. The results revealed four factors that influence the public's perception of the monument: aesthetics, audience, topology, and public reception. The study concludes by proposing these factors for public memorial art assessment in future public memorial projects, especially in Malaysia.
Keywords: factor analysis, public art, public perception, semantical differential methodology
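As a hedged sketch of the analysis pipeline (synthetic ratings stand in for the questionnaire data; the four retained components mirror the reported four-factor solution, though the toy data will not reproduce it):

    # Illustrative factor extraction from semantic-differential ratings via
    # PCA. The ratings matrix is random stand-in data (115 respondents x 12
    # adjective scales on a 7-point scale), not the study's survey results.
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(0)
    ratings = rng.integers(1, 8, size=(115, 12)).astype(float)

    z = StandardScaler().fit_transform(ratings)   # standardize each scale
    pca = PCA(n_components=4).fit(z)              # retain four components

    print("explained variance ratios:", pca.explained_variance_ratio_.round(3))
    print("loadings shape:", pca.components_.T.shape)  # (12 scales, 4 factors)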
Procedia PDF Downloads 502
2952 The Importance of Visual Communication in Artificial Intelligence
Authors: Manjitsingh Rajput
Abstract:
Visual communication plays an important role in artificial intelligence (AI) because it enables machines to understand and interpret visual information, similar to how humans do. This abstract explores the importance of visual communication in AI, emphasizing applications such as computer vision, object recognition, image classification, and autonomous systems, together with the deep learning techniques and neural networks that underpin visual understanding. It also discusses the challenges facing visual interfaces for AI, such as data scarcity, domain optimization, and interpretability, and explores the integration of visual communication with other modalities, such as natural language processing and speech recognition. Overall, the abstract highlights the critical role that visual communication plays in advancing AI capabilities and enabling machines to perceive and understand the world around them. The methodology examines the importance of visual communication in AI development and implementation, highlighting its potential to enhance the effectiveness and accessibility of AI systems, and provides a comprehensive approach to integrating visual elements into AI systems, making them more user-friendly and efficient. In conclusion, visual communication is crucial in AI systems for object recognition, facial analysis, and augmented reality, but challenges such as data quality, interpretability, and ethics must be addressed. Visual communication enhances user experience, decision-making, accessibility, and collaboration, and developers can integrate visual elements to build efficient and accessible AI systems.
Keywords: visual communication AI, computer vision, visual aid in communication, essence of visual communication
Procedia PDF Downloads 97
2951 Development of Mobile Application for Internship Program Management Using the Concept of Model View Controller (MVC) Pattern
Authors: Shutchapol Chopvitayakun
Abstract:
Nowadays, and especially over the last five years, mobile devices, mobile applications, and mobile users, supported by the deployment of wireless communication and cellular phone networks, have been growing significantly bigger and stronger, becoming integrated with each other for multiple purposes and pervasively deployed in every business and non-business sector, such as education, medicine, travel, finance, real estate, and many more. The objective of this study was to develop a mobile application for seniors, or last-year students, who enroll in the internship program at a tertiary (undergraduate) school and practice onsite at real field sites, real organizations, and real workplaces. During the internship session, all students, as interns, are required to exercise, drill, and train onsite at specific locations and on specific tasks, possibly including assignments from their supervisors. Their workplaces include both private and government corporations and enterprises. The mobile application is developed under the schema of a transactional processing system that enables users to keep a daily work or practice log, monitor true working locations, and follow the daily tasks of each trainee. Moreover, it provides useful guidance from each intern's advisor in case of emergency. Finally, it can summarize all transactional data and calculate each intern's cumulative hours from the field practice session.
Keywords: internship, mobile application, Android OS, smart phone devices, mobile transactional processing system, guidance and monitoring, tertiary education, senior students, model view controller (MVC)
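As a minimal, hedged sketch of the MVC separation the application is built around (written in Python for brevity; the class and field names are invented, and the real application targets Android):

    # Toy Model-View-Controller split for an intern daily work log.
    # Names and fields are illustrative; the actual app is an Android app.
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class LogEntry:                      # --- Model: the data ---
        date: str
        hours: float
        task: str

    @dataclass
    class InternLog:
        entries: List[LogEntry] = field(default_factory=list)

        def total_hours(self) -> float:
            return sum(e.hours for e in self.entries)

    class LogView:                       # --- View: presentation only ---
        def render(self, log: InternLog) -> None:
            for e in log.entries:
                print(f"{e.date}  {e.hours:>4.1f}h  {e.task}")
            print(f"cumulative: {log.total_hours():.1f}h")

    class LogController:                 # --- Controller: mediates input ---
        def __init__(self, model: InternLog, view: LogView):
            self.model, self.view = model, view

        def add_entry(self, date: str, hours: float, task: str) -> None:
            self.model.entries.append(LogEntry(date, hours, task))
            self.view.render(self.model)

    ctrl = LogController(InternLog(), LogView())
    ctrl.add_entry("2016-06-01", 7.5, "Onsite training at host company")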
Procedia PDF Downloads 315
2950 X-Ray Diffraction, Microstructure, and Mössbauer Studies of Nanostructured Materials Obtained by High-Energy Ball Milling
Authors: N. Boudinar, A. Djekoun, A. Otmani, B. Bouzabata, J. M. Greneche
Abstract:
High-energy ball milling is a solid-state powder processing technique that allows synthesizing a variety of equilibrium and non-equilibrium alloy phases starting from elemental powders. The advantage of this processing technology is that the powder can be produced in large quantities and the processing parameters can be easily controlled; thus, it is a suitable method for commercial applications. It can also be used to produce amorphous and nanocrystalline materials in commercially relevant amounts and is amenable to a variety of alloy compositions. Mechanical alloying (high-energy ball milling) provides an inter-dispersion of elements through repeated cold welding and fracture of free powder particles; the grain size decreases to the nanometric scale and the elements mix together. Progressively, the concentration gradients disappear and eventually the elements are mixed at the atomic scale. The end products depend on many parameters, such as the milling conditions and the thermodynamic properties of the milled system. Here, the mechanical alloying technique was used to prepare nanocrystalline Fe-50 wt.% Ni and Fe-64 wt.% Ni alloys from powder mixtures. Scanning electron microscopy (SEM) with energy-dispersive X-ray analysis and Mössbauer spectroscopy were used to study the mixing at the nanometric scale. Mössbauer spectroscopy confirmed the ferromagnetic ordering and was used to calculate the hyperfine field distribution. The Mössbauer spectrum of both alloys shows the existence of a ferromagnetic phase attributed to a γ-Fe-Ni solid solution.
Keywords: nanocrystalline, mechanical alloying, X-ray diffraction, Mössbauer spectroscopy, phase transformations
Procedia PDF Downloads 437
2949 Vehicle Speed Estimation Using Image Processing
Authors: Prodipta Bhowmik, Poulami Saha, Preety Mehra, Yogesh Soni, Triloki Nath Jha
Abstract:
In India, the smart city concept is growing day by day, and for smart city development, a better traffic management and monitoring system is a very important requirement. Nowadays, road accidents are increasing due to the growing number of vehicles on the road, with reckless driving mainly responsible for a huge number of accidents. An efficient traffic management system is therefore required for all kinds of roads to control traffic speed, and the speed limit varies from road to road. Radar systems have been used previously, but due to their high cost and limited precision, they have not become favored in traffic management. Traffic management faces different types of problems every day, and how to solve them has become a research topic. This paper proposes a computer vision and machine learning based automated system for multiple vehicle detection, tracking, and speed estimation using image processing. Detecting vehicles and estimating their speed from real-time video is a difficult task, and the objective of this paper is to do both as accurately as possible. To this end, a real-time video is first captured, frames are extracted from the video, vehicles are detected in those frames, tracking of the vehicles starts, and finally the speed of the moving vehicles is estimated. The goal of this method is to develop a cost-friendly system that can detect multiple types of vehicles at the same time.
Keywords: OpenCV, Haar Cascade classifier, DLIB, YOLOV3, centroid tracker, vehicle detection, vehicle tracking, vehicle speed estimation, computer vision
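A hedged sketch of the speed-estimation step follows: centroid displacement between frames is converted to speed using a scene-scale calibration. The pixels-per-meter value and frame rate are assumptions that would have to be measured for a real deployment:

    # Hedged sketch: vehicle speed from tracked centroid displacement.
    # PIXELS_PER_METER and FPS are illustrative calibration values; in
    # practice they come from camera calibration and the capture device.
    import math

    PIXELS_PER_METER = 20.0   # assumed scene scale
    FPS = 30.0                # assumed capture frame rate

    def speed_kmh(prev_centroid, curr_centroid, frames_elapsed=1):
        dx = curr_centroid[0] - prev_centroid[0]
        dy = curr_centroid[1] - prev_centroid[1]
        meters = math.hypot(dx, dy) / PIXELS_PER_METER
        seconds = frames_elapsed / FPS
        return meters / seconds * 3.6    # m/s -> km/h

    # Example: a centroid moving 12 px to the right between two frames.
    print(f"{speed_kmh((310, 220), (322, 222)):.1f} km/h")   # ~65.7 km/h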
Procedia PDF Downloads 85
2948 Isolation and Selection of Strains Perspective for Sewage Sludge Processing
Authors: A. Zh. Aupova, A. Ulankyzy, A. Sarsenova, A. Kussayin, Sh. Turarbek, N. Moldagulova, A. Kurmanbayev
Abstract:
One of the methods of bioconversion of organic waste into environmentally friendly fertilizer is composting. Microorganisms that produce hydrolytic enzymes play a significant role in accelerating the composting of organic waste. We studied the enzymatic potential (amylase, protease, cellulase, lipase, and urease activity) of bacteria isolated from the sewage sludge of the cities of Nur-Sultan, Rudny, and Fort-Shevchenko, from the dacha soil of Nur-Sultan city, and from freshly cut grass from the dacha, with the aim of processing organic waste and identifying active strains. Microorganism isolation was carried out by the enrichment culture method on liquid nutrient media, followed by inoculation on different solid media to isolate individual colonies. As a result, sixty-one microorganisms were isolated, three of which were thermophiles (DS1, DS2, and DS3). The highest numbers of isolates, twenty-one and eighteen, came from the sewage sludge of Nur-Sultan and Rudny, respectively. Ten isolates were obtained from the wastewater of the sewage treatment plant in Fort-Shevchenko, and nine and five isolates were obtained from the dacha soil of Nur-Sultan city and from freshly cut grass, respectively. The lipolytic, proteolytic, amylolytic, cellulolytic, ureolytic, and oil-oxidizing activities of the isolates were studied. According to the experimental results, starch hydrolysis (amylolytic activity) was found in two isolates, CB2/2 and CB2/1. Three isolates, CB2, CB2/1, and CB1/1, were selected for the highest ability to break down casein. Among the 61 isolated bacterial cultures, three isolates, CB3, CBG1/1, and IL3, could break down fats. Seven strains had cellulolytic activity: DS1, DS2, IL3, IL5, P2, P5, and P3. Six isolates rapidly decomposed urea. Isolate P1 could break down both casein and cellulose, and isolate DS3 was a thermophile with cellulolytic activity. Thus, based on the conducted studies, 15 isolates were selected as candidates for sewage sludge composting: CB2, CB3, CB1/1, CB2/2, CBG1/1, CB2/1, DS1, DS2, DS3, IL3, IL5, P1, P2, P5, and P3. The selected strains were identified on a mass spectrometer (MALDI-TOF). Isolate CB3 was assigned to Rhodococcus rhodochrous; two isolates, CB2 and CB1/1, to Bacillus cereus; CB2/2 to Chryseobacterium arachidis; CBG1/1 to Pseudoxanthomonas sp.; CB2/1 to Bacillus megaterium; DS1 to Pediococcus acidilactici; DS2 to Paenibacillus residui; DS3 to Brevibacillus invocatus; three strains, IL3, P5, and P3, to Enterobacter cloacae; two strains, IL5 and P2, to Ochrobactrum intermedium; and P1 to Bacillus licheniformis. Hence, 61 isolates were obtained from the wastewater of the cities of Nur-Sultan, Rudny, and Fort-Shevchenko, the dacha soil of Nur-Sultan city, and freshly cut grass from the dacha. Based on the highest enzymatic activity, 15 active isolates were selected and identified. These strains may become candidates for a biopreparation for sewage sludge processing.
Keywords: sewage sludge, composting, bacteria, enzymatic activity
Procedia PDF Downloads 103
2947 Text as Reader Device: Improving Subjectivity on the Role of Attestation between Interpretative Semiotics and Discursive Linguistics
Authors: Marco Castagna
Abstract:
The proposed paper inquires into the relation between text and reader, focusing on the concept of 'attestation'. Despite being widely accepted in semiotic research, even today the concept of text remains uncertainly defined. It seems undeniable that what is called 'text' offers an image of internal cohesion and coherence that makes it possible to analyze it as an object. Nevertheless, this same object becomes problematic when it is pragmatically activated by the act of reading. Like the T.A.R.D.I.S., the unique space-time vehicle used by the well-known BBC character Doctor Who in his adventures, every text appears to its readers not only 'bigger inside than outside' but also offering spaces that change according to the traveller standing in it. In a few words, this singular condition raises questions about the gnosiological relation between text and reader. How can a text be considered the 'same', even if it can be read in different ways by different subjects? How can readers be provided in advance with the knowledge required for 'understanding' a text and at the same time learn something more from it? To explain this singular condition, it seems useful to start thinking of text as a device more than an object. In other words, this unique status is more clearly understandable when 'text' ceases to be considered as a box designed to move meaning from a sender to a recipient (marking the semiotic priority of the 'code') and starts to be recognized as a performative meaning hypothesis, discursively configured by one or more forms and empirically perceivable by means of one or more substances. Thus, a text appears as a 'semantic hanger', potentially offered to the 'unending deferral of the interpretant' and from time to time fixed as an 'instance of Discourse'. In this perspective, every reading can be considered as an answer to the continuous request to confirm or deny the meaning configuration (the meaning hypothesis) expressed by the text. Finally, 'attestation' is exactly what regulates this dynamic of request and answer, through which the reader is able to confirm previous hypotheses about reality or acquire new ones.
Keywords: attestation, meaning, reader, text
Procedia PDF Downloads 237
2946 Low Temperature Biological Treatment of Chemical Oxygen Demand for Agricultural Water Reuse Application Using Robust Biocatalysts
Authors: Vedansh Gupta, Allyson Lutz, Ameen Razavi, Fatemeh Shirazi
Abstract:
The agriculture industry is especially vulnerable to forecasted water shortages. In the fresh and fresh-cut produce sector, conventional flume-based washing with recirculation exhibits high water demand, leading to a large water footprint and possible cross-contamination with pathogens. These issues can be alleviated through advanced water reuse processes, such as membrane technologies including reverse osmosis (RO). Water reuse technologies effectively remove dissolved constituents but can easily foul without pre-treatment. Biological treatment is effective for the removal of the organic compounds responsible for fouling, but not at the low temperatures encountered at most produce processing facilities. This study showed that the Microvi MicroNiche Engineering (MNE) technology effectively removes organic compounds (>80%) from wash water at low temperatures (6-8 °C). The MNE technology uses synthetic microorganism-material composites with negligible solids production, making it well suited as an effective bio-pretreatment for RO. A preliminary technoeconomic analysis showed 60-80% savings in operation and maintenance costs (OPEX) when using the Microvi MNE technology for organics removal. This study and the accompanying economic analysis indicate that the proposed technology will substantially reduce the cost barrier to adopting water reuse practices, thereby contributing to increased food safety and furthering sustainable water reuse across the agricultural industry.
Keywords: biological pre-treatment, innovative technology, vegetable processing, water reuse, agriculture, reverse osmosis, MNE biocatalysts
Procedia PDF Downloads 129
2945 Quality Analysis of Lake Malawi's Diplotaxodon Fish Species Processed in Solar Tent Dryer versus Open Sun Drying
Authors: James Banda, Jupiter Simbeye, Essau Chisale, Geoffrey Kanyerere, Kings Kamtambe
Abstract:
Improved solar tent dryers for processing small fish species were designed under the CultiAF project to reduce post-harvest fish losses and improve the supply of quality fish products in the southern part of Lake Malawi. A comparative analysis of the quality of Diplotaxodon (Ndunduma) from Lake Malawi processed in a solar tent dryer and by open sun drying was conducted using proximate analysis, microbial analysis, and sensory evaluation. Proximates for solar tent dried and open sun dried fish in terms of protein, fat, moisture, and ash were 63.3±0.15% and 63.3±0.34%, 19.6±0.09% and 19.9±0.25%, 8.3±0.12% and 17.0±0.01%, and 15.6±0.61% and 21.9±0.91%, respectively. Crude protein and crude fat showed no significant differences (p = 0.05), while moisture and ash content differed significantly (p = 0.001). Open sun dried fish had significantly higher viable bacterial counts (5.2×10⁶ CFU) than solar tent dried fish (3.9×10² CFU). Counts of bacteria isolated from solar tent dried and open sun dried fish were, respectively, 1.0×10¹ and 7.2×10³ for total coliforms, 0 and 4.5×10³ for Escherichia coli, 0 and 7.5×10³ for Salmonella, 0 and 5.7×10² for Shigella, 4.0×10¹ and 6.1×10³ for Staphylococcus, and 1.0×10¹ and 7.0×10² for Vibrio. Qualitative evaluation of sensory properties showed higher acceptability for solar tent dried fish (3.8) than for open sun dried fish (1.7). It is concluded that promoting solar tent drying for processing small fish species in Malawi would support small-scale fish processors in producing quality fish in terms of nutritive value, reduced microbial contamination, sensory acceptability, and reduced moisture content.
Keywords: Diplotaxodon, Malawi, open sun drying, solar tent drying
Procedia PDF Downloads 337
2944 Design and Implementation of Collaborative Editing System Based on Physical Simulation Engine Running State
Authors: Zhang Songning, Guan Zheng, Ci Yan, Ding Gangyi
Abstract:
The application of physical simulation engines in collaborative editing systems has an important background and role. First, physical simulation engines can provide real-world physical simulation, enabling users to interact and collaborate in real time in virtual environments. This provides a more intuitive and immersive experience for collaborative editing systems, allowing users to perceive and understand the elements and operations of collaborative editing more accurately. Second, through physical simulation engines, different users can share a virtual space and perform real-time collaborative editing within it. This real-time sharing and collaborative editing helps synchronize information among team members and improves the efficiency of collaborative work. In experiments, the average model transmission speed per user in the collaborative editing system increased by 141.91%; the average model processing speed per user increased by 134.2%; the average processing flow rate per user increased by 175.19%; and the overall per-user efficiency increased by 150.43%. As the number of users increases, the overall efficiency remains stable, and the collaborative editing system based on physical simulation engine running state also scales horizontally. It is not difficult to see that the design and implementation of a collaborative editing system based on physical simulation engines not only enriches the user experience but also optimizes the effectiveness of team collaboration, opening new possibilities for collaborative work.
Keywords: physics engine, simulation technology, collaborative editing, system design, data transmission
Procedia PDF Downloads 88
2943 Colored Image Classification Using Quantum Convolutional Neural Networks Approach
Authors: Farina Riaz, Shahab Abdulla, Srinjoy Ganguly, Hajime Suzuki, Ravinesh C. Deo, Susan Hopkins
Abstract:
Recently, quantum machine learning has received significant attention, and numerous quantum machine learning (QML) models have been created and tested for various types of data, including text and images. Images are exceedingly complex data that demand more processing power. Despite being mature, classical machine learning still has difficulties with big data applications; furthermore, quantum technology has revolutionized how machine learning is thought of, by employing quantum features to address optimization issues. Since quantum hardware is currently extremely noisy, it is not practicable to run machine learning algorithms on it without risking inaccurate results. To discover the advantages of quantum versus classical approaches, this research concentrated on colored image data. Deep learning classification models are currently being created on quantum platforms, but they are still at a very early stage. Black-and-white benchmark image datasets such as MNIST and Fashion-MNIST have been used in recent research; MNIST and CIFAR-10 have been compared for binary classification, with MNIST classified more accurately than the colored CIFAR-10. This research evaluates the performance of a QML algorithm on the colored benchmark dataset CIFAR-10 to advance the real-time applicability of QML. However, deep learning classification models such as the Quantum Convolutional Neural Network (QCNN) have not yet been systematically compared on colored images to determine how much better the quantum approach is than the classical one; only a few models, such as quantum variational circuits, take colored images. The methodology adopted in this research is a hybrid approach using PennyLane as a simulator. To process the 10 classes of CIFAR-10, the image data were converted to greyscale, and 28 × 28-pixel images comprising 50,000 training and 10,000 test images were used. The objective of this work is to determine how much a quantum approach can outperform a classical approach on a comprehensive dataset of color images. After pre-processing the 50,000 images on a classical computer, the QCNN model, following the hybrid method, encoded the images into a quantum simulator for feature extraction using quantum gate rotations; the measurements were then carried out on the classical computer. According to the results, the QCNN approach is ~12% more effective than traditional classical CNN approaches, and applying data augmentation may further increase the accuracy. This study demonstrates that quantum machine and deep learning models can be relatively superior to classical machine learning approaches in terms of processing speed and accuracy when used for classification of colored classes.
Keywords: CIFAR-10, quantum convolutional neural networks, quantum deep learning, quantum machine learning
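As a hedged sketch of the hybrid approach described (angle-encoding pixel values into qubit rotations and measuring expectation values as features; the 4-qubit layout and gate pattern are assumptions, not the study's actual circuit):

    # Tiny hybrid quantum feature extractor in PennyLane: classical inputs
    # are encoded as rotation angles, entangled, and measured. The circuit
    # size and gate layout are illustrative only.
    import pennylane as qml
    from pennylane import numpy as np

    n_qubits = 4
    dev = qml.device("default.qubit", wires=n_qubits)

    @qml.qnode(dev)
    def quantum_features(pixels, weights):
        for i in range(n_qubits):
            qml.RY(np.pi * pixels[i], wires=i)      # angle-encode pixels
        for i in range(n_qubits - 1):
            qml.CNOT(wires=[i, i + 1])              # entangle neighbours
        for i in range(n_qubits):
            qml.RY(weights[i], wires=i)             # trainable rotations
        return [qml.expval(qml.PauliZ(i)) for i in range(n_qubits)]

    pixels = np.array([0.1, 0.5, 0.9, 0.3])          # normalized pixel patch
    weights = np.array([0.2, 0.4, 0.6, 0.8], requires_grad=True)
    print(quantum_features(pixels, weights))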
Procedia PDF Downloads 130
2942 Characteristic Sentence Stems in Academic English Texts: Definition, Identification, and Extraction
Authors: Jingjie Li, Wenjie Hu
Abstract:
Phraseological units in academic English texts have been a central focus of recent corpus linguistic research, and a wide variety of phraseological units have been explored, including collocations, chunks, lexical bundles, patterns, semantic sequences, etc. This paper describes a special category of clause-level phraseological units, namely Characteristic Sentence Stems (CSSs), with a view to describing their defining criteria and extraction method. CSSs are contiguous lexico-grammatical sequences that contain a subject-predicate structure and are frame expressions characteristic of academic writing. The extraction of CSSs consists of six steps: part-of-speech tagging, n-gram segmentation, structure identification, significance-of-occurrence calculation, text range calculation, and overlapping sequence reduction. The significance-of-occurrence calculation is the crux of this study: it computes both the internal association and the boundary independence of a CSS, testing the significance of its occurrence from both inside and outside perspectives. A new normalization algorithm is also introduced into the calculation of LocalMaxs for reducing overlapping sequences. It is argued that many sentence stems are so recurrent in academic texts that the most typical of them have become habitual ways of making meaning in academic writing. Therefore, studies of CSSs could have implications and reference value for academic discourse analysis, English for Academic Purposes (EAP) teaching, and writing.
Keywords: characteristic sentence stem, extraction method, phraseological unit, the statistical measure
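A hedged sketch of the first two extraction steps listed above (POS tagging and n-gram segmentation; NLTK is an assumed tool choice, and the example sentence is invented):

    # POS tagging and n-gram segmentation, the first two steps of the
    # described pipeline. NLTK is an assumed tool; the sentence is invented.
    import nltk

    nltk.download("punkt", quiet=True)
    nltk.download("averaged_perceptron_tagger", quiet=True)

    sentence = "The results suggest that the proposed method is effective."
    tokens = nltk.word_tokenize(sentence)
    tagged = nltk.pos_tag(tokens)                # step 1: POS tagging
    print(tagged[:4])

    trigrams = list(nltk.ngrams(tokens, 3))      # step 2: n-gram segmentation
    for gram in trigrams[:3]:
        print(" ".join(gram))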
Procedia PDF Downloads 170
2941 Microfluidic Impedimetric Biochip and Related Methods for Measurement Chip Manufacture and Counting Cells
Authors: Amina Farooq, Nauman Zafar Butt
Abstract:
This paper concerns methods and tools for counting particles of interest, such as cells. It describes a microfluidic system with interconnected electronics on a flexible substrate, inlet-outlet ports and interface schemes, sensitive and selective detection of cells, and processing of cell counts at polymer interfaces in a microscale biosensor, for use in the detection of target biological and non-biological cells. The fabrication process includes the development of fluidic channels, planar fluidic contact ports, integrated metal electrodes on a flexible substrate for impedance measurements, and a surface-modification plasma treatment as an intermediate bonding layer. Magnetron DC sputtering is used to deposit a double metal layer (Ti/Pt) on the polypropylene film, and defined zones are established and etched using a photoresist layer. Small fluid volumes, a reduced detection region, and electrical impedance measurements over a range of frequencies improve the sensitivity and specificity of cell counting. The procedure involves continuous flow of fluid samples containing the particles of interest through the microfluidic channels; counting all types of particles in a portion of the sample with the electrical differential counter, which generates a bipolar pulse for each passing cell; and calculating the total number of particles of interest originally present in the fluid sample using a MATLAB program and signal processing. With these methods and similar devices, it is indeed possible to develop a robust and economical kit for cell counting in whole-blood samples.
Keywords: impedance, biochip, cell counting, microfluidics
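As a hedged sketch of the counting step (Python with SciPy standing in for the MATLAB processing the abstract mentions; the synthetic trace and the detection threshold are assumptions):

    # Hedged sketch: counting cells as pulses in an impedance trace. SciPy
    # stands in for the MATLAB processing mentioned in the abstract; the
    # signal is synthetic, with three injected pulses plus noise.
    import numpy as np
    from scipy.signal import find_peaks

    rng = np.random.default_rng(1)
    t = np.linspace(0, 1, 5000)
    trace = 0.02 * rng.standard_normal(t.size)
    for center in (0.2, 0.5, 0.8):                     # three passing "cells"
        trace += np.exp(-((t - center) ** 2) / 2e-6)   # positive pulse lobe

    peaks, _ = find_peaks(trace, height=0.5, distance=50)
    print("cells counted:", len(peaks))                # expect 3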
Procedia PDF Downloads 162
2940 Interaction Design in Home Appliance: An Integrated Approach in Kansei and Hedonomic “Cases: Rice Cooker, Juicer, Mixer”
Authors: Sara Mostowfi, Hassan Sadeghinaeini, Sana Behnamasl, Leila Ensaniat, Maryam Mostafaee
Abstract:
Nowadays, most product manufacturers, e.g., of home appliances, electronic machines, and vehicles, focus on quality and comfort and promise consumers ease of use and pleasurable experiences during product use. Consumers make their purchase decisions according to two kinds of needs: functional and emotional. Functional needs are fulfilled by product functionality, while emotional needs relate to the psychological aspects of the product. Emotions are distinctive elements which should be added to products and services to elevate them. In this case, the authors surveyed the pleasurable and hedonomic aspects of the products of a home appliance company in Iran. Three samples of home appliance were selected: mixer, rice cooker, and iron. Fifteen women (aged 20-60) participated in this study. Every user evaluated each product with a questionnaire based on a seven-point semantic differential scale. After analyzing the results with statistical methods, it was found that 90% of users are not satisfied with the hedonic and pleasurable criteria of their interaction with these products. They indicated that attention to hedonomic and pleasurable criteria would give them better ease of use and functionality. Our findings show a significant association between product features and user satisfaction. It seems that industrial design has a significant impact on the company's products, and by addressing pleasurable criteria, the company's sales will be more successful.
Keywords: home appliance, interaction, pleasure, hedonomy, ergonomy
Procedia PDF Downloads 383
2939 Saudi Arabia Border Security Informatics: Challenges of a Harsh Environment
Authors: Syed Ahsan, Saleh Alshomrani, Ishtiaq Rasool, Ali Hassan
Abstract:
In this oral presentation, we will provide an overview of the technical and semantic architecture of a desert border security and critical infrastructure protection system. Modern border security systems are designed to reduce dependence on, and intervention by, human operators. To achieve this, different types of sensors are used along with video surveillance technologies. Applying these technologies in the harsh desert environment of Saudi Arabia poses unique challenges. Environmental and geographical factors, including high temperatures, desert storms, temperature variations, and remoteness, adversely affect the reliability of surveillance systems. To successfully implement a reliable, effective system in a harsh desert environment, the following must be achieved: i) selection of technology, including sensors, video cameras, and communication infrastructure, suited to desert environments; ii) reduced power consumption and efficient usage of equipment to increase battery life; iii) a reliable and robust communication network with efficient usage of bandwidth. Also, to reduce the expert bottleneck, an ontology-based intelligent information system needs to be developed. Domain knowledge unique and peculiar to Saudi Arabia needs to be formalized to develop an expert system that can detect abnormal activities and any intrusion, as illustrated by the sketch below.
Keywords: border security, sensors, abnormal activity detection, ontologies
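As a purely hypothetical illustration of the kind of rule such an expert system might encode (the abstract does not describe the ontology in detail), the Python sketch below requires corroboration across independent sensor types before declaring an intrusion and discounts sensors operating beyond an assumed thermal rating, reflecting the desert failure modes discussed above. All names and thresholds are invented.

```python
from dataclasses import dataclass

@dataclass
class SensorEvent:
    sensor_id: str
    kind: str        # e.g. "seismic", "infrared", "camera" (hypothetical)
    confidence: float
    temp_c: float    # ambient temperature at the sensor

def is_credible(e: SensorEvent, heat_threshold=50.0):
    """Discount detections from sensors operating beyond their assumed
    temperature rating, a plausible desert failure mode."""
    derate = 0.5 if e.temp_c > heat_threshold else 1.0
    return e.confidence * derate >= 0.7

def abnormal_activity(events, min_corroborating=2):
    """Flag an intrusion only when several independent sensor types agree,
    reducing false alarms from dust storms and thermal noise."""
    kinds = {e.kind for e in events if is_credible(e)}
    return len(kinds) >= min_corroborating

events = [SensorEvent("s1", "seismic", 0.9, 44.0),
          SensorEvent("c7", "camera", 0.8, 52.0)]
print(abnormal_activity(events))
```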
Procedia PDF Downloads 481
2938 COVID_ICU_BERT: A Fine-Tuned Language Model for COVID-19 Intensive Care Unit Clinical Notes
Authors: Shahad Nagoor, Lucy Hederman, Kevin Koidl, Annalina Caputo
Abstract:
Doctors’ notes reflect their impressions, attitudes, clinical sense, and opinions about patients’ conditions and progress, as well as other information essential for doctors’ daily clinical decisions. Despite their value, clinical notes are insufficiently researched within the language processing community. Automatically extracting information from unstructured text data is known to be a difficult task, as opposed to dealing with structured information such as vital physiological signs, images, and laboratory results. The aim of this research is to investigate how Natural Language Processing (NLP) and machine learning techniques applied to clinician notes can assist doctors’ decision-making in the Intensive Care Unit (ICU) for coronavirus disease 2019 (COVID-19) patients. The hypothesis is that clinical outcomes like survival or mortality can be useful in influencing the judgement of clinical sentiment in ICU clinical notes. This paper makes two contributions. First, we introduce COVID_ICU_BERT, a fine-tuned version of clinical transformer models that can reliably predict clinical sentiment for notes of COVID patients in the ICU. We train the model on clinical notes for COVID-19 patients, a type of note not previously seen by clinicalBERT or Bio_Discharge_Summary_BERT. The model, which is based on clinicalBERT, achieves higher predictive accuracy (Acc 93.33%, AUC 0.98, precision 0.96). Second, we perform data augmentation using clinical contextual word embeddings, based on a pre-trained clinical model, to balance the samples in each class of the data (survived vs. deceased patients). Data augmentation improves the accuracy of prediction slightly (Acc 96.67%, AUC 0.98, precision 0.92).
Keywords: BERT fine-tuning, clinical sentiment, COVID-19, data augmentation
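For readers unfamiliar with the setup, the sketch below shows the generic Hugging Face pattern for turning a clinical BERT checkpoint into a two-class sentiment classifier. The checkpoint name, label scheme, and example note are assumptions for illustration, not the authors' exact configuration; emilyalsentzer/Bio_ClinicalBERT is the standard publicly available clinicalBERT checkpoint.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Assumed checkpoint; the classification head below starts untrained and
# would be fine-tuned on labelled ICU notes (e.g. via transformers.Trainer).
name = "emilyalsentzer/Bio_ClinicalBERT"
tok = AutoTokenizer.from_pretrained(name)
model = AutoModelForSequenceClassification.from_pretrained(name, num_labels=2)

note = "Pt improving on day 3, weaned to nasal cannula, tolerating feeds."
batch = tok(note, truncation=True, max_length=512, return_tensors="pt")
with torch.no_grad():
    probs = model(**batch).logits.softmax(-1)
print(probs)  # [P(negative sentiment), P(positive sentiment)]
```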
Procedia PDF Downloads 209
2937 Geographic Information System (GIS) for Structural Typology of Buildings
Authors: Néstor Iván Rojas, Wilson Medina Sierra
Abstract:
The management of spatial information for some neighborhoods of the city of Tunja, in relation to the structural typology of the buildings, is described through a Geographic Information System (GIS). The use of GIS provides tools that facilitate the capture, processing, analysis, and dissemination of cartographic information and the quality evaluation of the building classification, and it allows the development of a method that unifies and standardizes information processes. The project aims to generate a geographic database that is useful to the entities responsible for planning, disaster prevention, and care for vulnerable populations; it also seeks to be a basis for seismic vulnerability studies that can contribute to a study of urban seismic microzonation. The methodology consists of capturing the plat, including road names, neighborhoods, blocks, and buildings, to which were added, as attributes, the evaluated data for each dwelling, such as the number of inhabitants and their classification, the year of construction, the predominant structural system, the type of mezzanine slab and its state of favorability, the presence of geotechnical problems, the type of cover, the use of each building, and damage to structural and non-structural elements. These data are tabulated in a spreadsheet keyed by cadastral number, through which each record is systematically linked to the respective building, which carries the same attribute. A geo-referenced database is obtained, from which graphical outputs are generated, producing thematic maps for each evaluated attribute that clearly show the spatial distribution of the information obtained. Using GIS offers important advantages for spatial information management and facilitates consultation and updating. The usefulness of the project is recognized as a basis for studies on issues of planning and prevention.
Keywords: microzonation, buildings, geo-processing, cadastral number
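The cadastral-number join and thematic mapping described above follow a standard GIS pattern; a minimal sketch in Python with GeoPandas is shown below. The file names and the "cadastral_id" column are hypothetical placeholders, not the project's actual data layout.

```python
import geopandas as gpd
import pandas as pd
import matplotlib.pyplot as plt

# Hypothetical inputs: building footprints and the tabulated survey data,
# joined on the cadastral number as in the workflow above.
buildings = gpd.read_file("buildings.shp")   # geometries + cadastral_id
survey = pd.read_csv("survey.csv")           # evaluated attributes
gdf = buildings.merge(survey, on="cadastral_id", how="left")

# One thematic map per evaluated attribute, e.g. the structural system
gdf.plot(column="structural_system", legend=True, figsize=(8, 8))
plt.title("Predominant structural system by building")
plt.show()
```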
Procedia PDF Downloads 334
2936 Text Analysis to Support Structuring and Modelling a Public Policy Problem - Outline of an Algorithm to Extract Inferences from Textual Data
Authors: Claudia Ehrentraut, Osama Ibrahim, Hercules Dalianis
Abstract:
Policy-making situations are real-world problems that exhibit complexity in that they are composed of many interrelated problems and issues. To be effective, policies must holistically address the complexity of the situation rather than propose solutions to single problems. Formulating and understanding the situation and its complex dynamics is therefore key to finding holistic solutions. Analysis of text-based information on the policy problem, using Natural Language Processing (NLP) and text analysis techniques, can support the modelling of public policy problem situations in a more objective way, based on domain experts' knowledge and scientific evidence. The objective of this study is to support the modelling of public policy problem situations using text analysis of verbal descriptions of the problem. We propose a formal methodology for the analysis of qualitative data from multiple information sources on a policy problem to construct a causal diagram of the problem. The analysis process aims at identifying key variables, linking them by cause-effect relationships, and mapping that structure into a graphical representation that is adequate for designing action alternatives, i.e., policy options. This study describes the outline of an algorithm used to automate the initial step of a larger methodological approach, which is so far done manually. In this initial step, inferences about key variables and their interrelationships are extracted from textual data to support better problem structuring. A small prototype for this step is also presented.
Keywords: public policy, problem structuring, qualitative analysis, natural language processing, algorithm, inference extraction
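To give a flavour of the inference-extraction step, here is a deliberately minimal Python sketch that pulls (cause, effect) pairs out of sentences via a handful of causal connectives. The paper's algorithm is more elaborate (NLP parsing, variable normalization, diagram construction), so this is only an illustration of the idea; the connective list and example sentences are invented.

```python
import re

# Match "X <connective> Y" for a few causal connectives.
CAUSAL = re.compile(
    r"(?P<cause>[\w\s]+?)\s+(?:leads to|causes|results in|increases)\s+"
    r"(?P<effect>[\w\s]+)",
    re.IGNORECASE,
)

def extract_edges(sentences):
    """Return (cause, effect) pairs, i.e. candidate edges of the causal diagram."""
    edges = []
    for s in sentences:
        m = CAUSAL.search(s)
        if m:
            edges.append((m["cause"].strip(), m["effect"].strip()))
    return edges

docs = ["Unemployment leads to poverty.",
        "Higher fuel taxes reduce congestion."]  # second matches no connective
print(extract_edges(docs))  # [('Unemployment', 'poverty')]
```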
Procedia PDF Downloads 590
2935 Effect of Anisotropy on Steady Creep in a Whisker Reinforced Functionally Graded Composite Disc
Authors: V. K. Gupta, Tejeet Singh
Abstract:
In many whisker-reinforced composites, anisotropy may result from material flow during processing operations such as forging, extrusion, etc. The consequence of anisotropy introduced during processing of the disc material has been investigated for the steady-state creep deformation of a rotating disc. The disc material is assumed to undergo plastic deformation according to Hill's anisotropic criterion. Steady-state creep has been analyzed in a constant-thickness rotating disc made of functionally graded 6061Al-SiCw (where the subscript 'w' stands for whisker) using Hill's anisotropic yield criterion. The content of reinforcement (SiCw) in the disc is assumed to decrease linearly from the inner to the outer radius. The stresses and strain rates in the disc are estimated by solving the force equilibrium equation along with the constitutive equations describing multi-axial creep. The results obtained for the anisotropic FGM disc have been compared with those estimated for an isotropic FGM disc having the same average whisker content. The anisotropic constants appearing in Hill's yield criterion have been obtained from available experimental results. The results show that the presence of anisotropy reduces the tangential stress in the middle of the disc, but near the inner and outer radii the tangential stress is higher when compared to the isotropic disc. On the other hand, the steady-state creep rates in the anisotropic disc are reduced significantly over the entire disc radius, with the maximum reduction observed at the inner radius. Further, in the presence of anisotropy, the distribution of strain rate becomes relatively uniform over the entire disc, which may be responsible for reducing the extent of distortion in the disc.
Keywords: anisotropy, creep, functionally graded composite, rotating disc
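For reference, the standard quadratic form of Hill's (1948) anisotropic yield criterion invoked above is given below; F, G, H, L, M, N are the material anisotropy constants, which the authors fit from experimental data (the fitted values are not reproduced here).

```latex
F(\sigma_{yy}-\sigma_{zz})^2 + G(\sigma_{zz}-\sigma_{xx})^2 + H(\sigma_{xx}-\sigma_{yy})^2
  + 2L\,\sigma_{yz}^2 + 2M\,\sigma_{zx}^2 + 2N\,\sigma_{xy}^2 = 1
```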
Procedia PDF Downloads 392