Search results for: rice processing
878 Web Data Scraping Technology Using Term Frequency Inverse Document Frequency to Enhance the Big Data Quality on Sentiment Analysis
Authors: Sangita Pokhrel, Nalinda Somasiri, Rebecca Jeyavadhanam, Swathi Ganesan
Abstract:
Tourism is a booming industry with huge future potential for global wealth and employment. Countless data are generated over social media sites every day, creating numerous opportunities to bring more insights to decision-makers. The integration of Big Data technology into the tourism industry will allow companies to determine where their customers have been and what they like. This information can then be used by businesses, such as those in charge of managing visitor centers or hotels, and tourists can get a clear idea of places before visiting. The technical perspective of natural language is processed by analysing the sentiment features of online reviews from tourists, and we then supply an enhanced long short-term memory (LSTM) framework for sentiment feature extraction of travel reviews. We constructed a web review database using a crawler and web scraping techniques for experimental validation to evaluate the effectiveness of our methodology. The text of the sentences was first classified through the VADER and RoBERTa models to obtain the polarity of the reviews. In this paper, we have studied feature-extraction methods such as Count Vectorization and TF-IDF Vectorization and implemented a Convolutional Neural Network (CNN) classifier for sentiment analysis, to decide whether the tourist's attitude towards a destination is positive, negative, or simply neutral based on the review text posted online. The results demonstrate that, with the CNN algorithm, after pre-processing and cleaning the dataset, we obtained an accuracy of 96.12% for the positive and negative sentiment analysis.
Keywords: count vectorization, convolutional neural network, crawler, data technology, long short-term memory, web scraping, sentiment analysis
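As a minimal sketch of the TF-IDF feature-extraction step described in this abstract, the snippet below vectorizes a few invented review strings with scikit-learn; a logistic regression stands in for the paper's CNN classifier, and the VADER/RoBERTa labelling step is not reproduced. All data and parameters are illustrative assumptions.

```python
# Hedged sketch: TF-IDF feature extraction for review sentiment.
# Reviews and labels below are invented examples, not the paper's data.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

reviews = [
    "The hotel staff were friendly and the beach was beautiful",
    "Terrible service, dirty room, would not recommend",
    "Average stay, nothing special but nothing bad either",
    "Loved the food and the view from the visitor center",
]
labels = [1, 0, 1, 1]  # 1 = positive, 0 = negative (toy labels)

# TF-IDF turns each review into a sparse weighted term-frequency vector;
# the paper feeds such features to a CNN, here a logistic regression is used.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2), min_df=1),
                      LogisticRegression(max_iter=1000))
model.fit(reviews, labels)
print(model.predict(["friendly staff and great view"]))
```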
Procedia PDF Downloads 88
877 The Effect of Mixing and Degassing Conditions on the Properties of Epoxy/Anhydride Resin System
Authors: Latha Krishnan, Andrew Cobley
Abstract:
Epoxy resin is most widely used as a matrix for composites in aerospace, automotive and electronic applications due to its outstanding mechanical properties. These properties are chiefly predetermined by the chemical structure of the prepolymer and the type of hardener, but can also be varied by processing conditions such as prepolymer and hardener mixing, degassing and curing conditions. In this research, the effect of degassing on the curing behaviour and void occurrence is experimentally evaluated for an epoxy/anhydride resin system. The epoxy prepolymer was mixed with an anhydride hardener and accelerator in appropriate quantities. In order to investigate the effect of degassing on the curing behaviour and void content of the resin, uncured resin samples were prepared using three different methods: 1) no degassing, 2) degassing of the prepolymer, and 3) degassing of the mixed solution of prepolymer and hardener with an accelerator. The uncured resins were tested in a differential scanning calorimeter (DSC) to observe the changes in curing behaviour of the above three resin samples by analysing factors such as gel temperature, peak cure temperature and heat of reaction/heat flow during curing. Additionally, the completely cured samples were tested by DSC to identify the changes in the glass transition temperature (Tg) between the three samples. In order to evaluate the effect of degassing on the void content and morphology changes in the cured epoxy resin, the fractured surfaces of the cured epoxy resin were examined under a scanning electron microscope (SEM). Also, the changes in the mechanical properties of the cured resin were studied by a three-point bending test. It was found that degassing at different stages of resin mixing had significant effects on properties such as the glass transition temperature, the void content and the void size of the epoxy/anhydride resin system. For example, degassing (vacuum applied to the mixed resin) resulted in a higher glass transition temperature (Tg) with lower void content.
Keywords: anhydride epoxy, curing behaviour, degassing, void occurrence
Procedia PDF Downloads 347
876 Experimental Optimization in Diamond Lapping of Plasma Sprayed Ceramic Coatings
Authors: S. Gowri, K. Narayanasamy, R. Krishnamurthy
Abstract:
Plasma spraying, from the point of view of value engineering, is considered a cost-effective technique to deposit high-performance ceramic coatings on ferrous substrates for use in the aero, automobile, electronics and semiconductor industries. High-performance ceramics such as alumina, zirconia, and titania-based ceramics have become a key part of turbine blades, automotive cylinder liners, and microelectronic and semiconductor components due to their ability to insulate and distribute heat. However, as these industries continue to advance, improved methods are needed to increase both the flexibility and speed of ceramic processing in these applications. The ceramics mentioned were individually coated on a structural steel substrate with a NiCr bond coat of 50-70 micron thickness, with the final thickness in the range of 150 to 200 microns. Optimal spray parameters were selected based on bond strength and porosity. The optimally processed specimens were superfinished by lapping using diamond and green SiC abrasives. Interesting results could be observed as follows: green SiC could improve the surface finish of lapped surfaces almost as well as diamond in the case of alumina- and titania-based ceramics, but the diamond abrasives could improve the surface finish of PSZ better than green SiC. The conventional random scratches were absent in the alumina and titania ceramics, while in PSZ those marks were found to be fewer. However, the flatness accuracy could be improved by up to 60 to 85%. The surface finish and geometrical accuracy were measured and modeled. Abrasives in the mid-range of their particle size could improve the surface quality faster and better than particles in the low and high size ranges. From the experimental investigations after the lapping process, the optimal lapping time, abrasive size, lapping pressure, etc. could be evaluated.
Keywords: atmospheric plasma spraying, ceramics, lapping, surface quality, optimization
Procedia PDF Downloads 414
875 Impacts of Urbanization on Forest and Agriculture Areas in Savannakhet Province, Lao People's Democratic Republic
Authors: Chittana Phompila
Abstract:
The current increased population pushes increasing demands for natural resources and living space. In Laos, urban areas have been expanding rapidly in recent years. This rapid urbanization can have negative impacts on landscapes, including forest and agricultural lands. The primary objectives of this research were 1) to map current urban areas in a large city in Savannakhet province, Laos, 2) to compare changes in urbanization between 1990 and 2018, and 3) to estimate forest and agriculture areas lost due to expansions of urban areas during the last over twenty years within the study area. Landsat 8 data were used, and existing GIS data were collected, including spatial data on rivers, lakes, roads, vegetated areas and other land use/land covers. The GIS data were obtained from government sectors. An object-based classification (OBC) approach was applied in eCognition for image processing and analysis of urban areas. Historical data from other Landsat instruments (Landsat 5 and 7) allowed us to compare changes in urbanization in 1990, 2000, 2010 and 2018 in this study area. Only three main land cover classes were classified, namely forest, agriculture and urban areas. A change detection approach was applied to illustrate changes in built-up areas over these periods. Our study shows that the overall accuracy of the map was assessed at 95% (kappa ≈ 0.8). It is found that there is ineffective control over forest and land-use conversions from forests and agriculture to urban areas in many main cities across the province. A large area of agriculture and forest has been lost due to this conversion. Uncontrolled urban expansion and inappropriate land use planning can put pressure on our resource utilisation. As a consequence, it can lead to food insecurity and national economic downturn in the long term.
Keywords: urbanisation, forest cover, agriculture areas, Landsat 8 imagery
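The reported accuracy figures (overall accuracy 95%, kappa ≈ 0.8) are derived from a classification error matrix; the sketch below shows the arithmetic on an invented three-class (forest, agriculture, urban) confusion matrix, not the study's data.

```python
import numpy as np

# Hypothetical confusion matrix for forest / agriculture / urban
# (rows = reference, columns = classified). Values are invented.
cm = np.array([[48,  1,  1],
               [ 2, 44,  4],
               [ 1,  3, 46]])

n = cm.sum()
overall_accuracy = np.trace(cm) / n

# Cohen's kappa: agreement beyond chance, from row and column marginals.
expected = (cm.sum(axis=0) * cm.sum(axis=1)).sum() / n**2
kappa = (overall_accuracy - expected) / (1 - expected)

print(f"Overall accuracy: {overall_accuracy:.2%}, kappa: {kappa:.2f}")
```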
Procedia PDF Downloads 158
874 In situ Real-Time Multivariate Analysis of Methanolysis Monitoring of Sunflower Oil Using FTIR
Authors: Pascal Mwenge, Tumisang Seodigeng
Abstract:
The combination of world population growth and the third industrial revolution has led to high demand for fuels. On the other hand, the decrease of global fossil fuel deposits and the environmental air pollution caused by these fuels have compounded the challenges the world faces due to its need for energy. Therefore, new forms of environmentally friendly and renewable fuels such as biodiesel are needed. The primary analytical techniques for methanolysis yield monitoring have been chromatography and spectroscopy; these methods have been proven reliable but are demanding and costly and do not provide real-time monitoring. In this work, the in situ monitoring of biodiesel production from sunflower oil using FTIR (Fourier Transform Infrared) spectroscopy has been studied; the study was performed using an EasyMax Mettler Toledo reactor equipped with a DiComp (diamond) probe. The quantitative monitoring of methanolysis was performed by building a quantitative model with multivariate calibration using the iC Quant module from iC IR 7.0 software. 15 samples of known concentrations were used for the modelling, taken in duplicate for model calibration and cross-validation; data were pre-processed using mean centering and variance scaling, spectrum math square root and solvent subtraction. These pre-processing methods improved the performance indexes from 7.98 to 0.0096, 11.2 to 3.41, 6.32 to 2.72, and 0.9416 to 0.9999 for RMSEC, RMSECV, RMSEP and R²cum, respectively. The R² values of 1 (training), 0.9918 (test) and 0.9946 (cross-validation) indicated the fitness of the model built. The model was tested against a univariate model; small discrepancies were observed at low concentrations due to unmodelled intermediates, but the models were quite close at concentrations above 18%. The software eliminated the complexity of the Partial Least Squares (PLS) chemometrics. It was concluded that the model obtained could be used to monitor the methanolysis of sunflower oil at industrial and lab scale.
Keywords: biodiesel, calibration, chemometrics, methanolysis, multivariate analysis, transesterification, FTIR
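The multivariate calibration workflow described above can be sketched with scikit-learn's PLSRegression as a stand-in for the iC Quant module; the "spectra" below are random placeholders, and RMSEC/RMSECV are computed exactly as root-mean-square errors of calibration and cross-validation.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

# Placeholder data: 15 "spectra" (200 wavenumbers) with known concentrations,
# standing in for the pre-processed FTIR calibration set described above.
rng = np.random.default_rng(0)
concentrations = np.linspace(0, 30, 15)              # assumed % conversion values
spectra = np.outer(concentrations, rng.normal(size=200)) \
          + rng.normal(scale=0.5, size=(15, 200))    # synthetic absorbances

pls = PLSRegression(n_components=3)
pls.fit(spectra, concentrations)

pred_cal = pls.predict(spectra).ravel()
pred_cv = cross_val_predict(pls, spectra, concentrations, cv=5).ravel()

rmsec = np.sqrt(np.mean((pred_cal - concentrations) ** 2))   # calibration error
rmsecv = np.sqrt(np.mean((pred_cv - concentrations) ** 2))   # cross-validation error
print(f"RMSEC = {rmsec:.3f}, RMSECV = {rmsecv:.3f}")
```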
Procedia PDF Downloads 148
873 Saving Energy through Scalable Architecture
Authors: John Lamb, Robert Epstein, Vasundhara L. Bhupathi, Sanjeev Kumar Marimekala
Abstract:
In this paper, we focus on the importance of scalable architecture for data centers and buildings in general to help an enterprise achieve environmental sustainability. A scalable architecture helps in many ways, such as adaptability to business and user requirements, and it promotes high-availability and disaster recovery solutions that are cost effective and low maintenance. A scalable architecture also plays a vital role in the three core areas of sustainability: economic, environmental, and social, which are also known as the three pillars of a sustainability model. If the architecture is scalable, it has many advantages. A few examples are that scalable architecture helps businesses and industries adapt to changing technology, drive innovation, promote platform independence, and build resilience against natural disasters. Most importantly, having a scalable architecture helps industries bring in cost-effective measures for energy consumption, reduce wastage, increase productivity, and enable a robust environment. It also helps in the reduction of carbon emissions with advanced monitoring and metering capabilities. Scalable architectures help in reducing waste by optimizing designs to utilize materials efficiently, minimize resources, and decrease carbon footprints by using low-impact, environmentally friendly materials. In this paper, we also emphasize the importance of a cultural shift towards the reuse and recycling of natural resources for a balanced ecosystem and maintaining a circular economy. Also, since all of us are involved in the use of computers, much of the scalable architecture we have studied is related to data centers.
Keywords: scalable architectures, sustainability, application design, disruptive technology, machine learning and natural language processing, AI, social media platform, cloud computing, advanced networking and storage devices, advanced monitoring and metering infrastructure, climate change
Procedia PDF Downloads 106
872 Multi-source Question Answering Framework Using Transformers for Attribute Extraction
Authors: Prashanth Pillai, Purnaprajna Mangsuli
Abstract:
Oil exploration and production companies invest considerable time and effort to extract essential well attributes (such as well status, surface and target coordinates, wellbore depths, event timelines, etc.) from unstructured data sources like technical reports, which are often non-standardized, multimodal, and highly domain-specific by nature. It is also important to consider context when extracting attribute values from reports that contain information on multiple wells/wellbores. Moreover, semantically similar information may often be depicted in different data syntax representations across multiple pages and document sources. We propose a hierarchical multi-source fact-extraction workflow based on a deep learning framework to extract essential well attributes at scale. An information retrieval module based on the transformer architecture was used to rank relevant pages in a document source, utilizing the page image embeddings and semantic text embeddings. A question answering framework utilizing the LayoutLM transformer was used to extract attribute-value pairs, incorporating the text semantics and layout information from the top relevant pages in a document. To better handle context while dealing with multi-well reports, we incorporate a dynamic query generation module to resolve ambiguities. The extracted attribute information from various pages and documents is standardized to a common representation using a parser module to facilitate information comparison and aggregation. Finally, we use a probabilistic approach to fuse information extracted from multiple sources into a coherent well record. The applicability of the proposed approach and its performance were studied on several real-life well technical reports.
Keywords: natural language processing, deep learning, transformers, information retrieval
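A hedged sketch of the page-ranking idea described above: pages are represented by embedding vectors (here random placeholders standing in for the image and text embeddings produced by the transformer), and relevance to a query embedding is scored by cosine similarity. The LayoutLM extraction stage is not reproduced.

```python
import numpy as np

def rank_pages(query_vec: np.ndarray, page_vecs: np.ndarray) -> np.ndarray:
    """Return page indices sorted by cosine similarity to the query embedding."""
    q = query_vec / np.linalg.norm(query_vec)
    p = page_vecs / np.linalg.norm(page_vecs, axis=1, keepdims=True)
    scores = p @ q
    return np.argsort(scores)[::-1]

# Placeholder embeddings: 20 report pages and one query (e.g. "wellbore depth"),
# each as a 384-dimensional vector (dimension chosen arbitrarily).
rng = np.random.default_rng(1)
pages = rng.normal(size=(20, 384))
query = rng.normal(size=384)
print("Top-3 pages:", rank_pages(query, pages)[:3])
```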
Procedia PDF Downloads 193
871 Intuition in Negotiation within Ghanaian Social Contexts: Exploring Female Leadership Strategies for Conflict Transformation
Authors: Nadia Naadu Nartey, Esther A.O.G. Tetteh
Abstract:
Male negotiator representations and the appreciation of masculine traits in negotiation contexts dominate negotiation research in the field of conflict management and resolution. This study switches focus to rarely examined gendered criteria and social contexts in negotiation research by investigating how intuition has been used in negotiations by female leaders toward conflict transformation in Ghanaian social contexts. Using the theoretical lenses of Klein's Recognition-Primed Decision (RPD) and Unconscious Information Processing (UIP) models, this study employs narrative inquiry in qualitative research. Semi-structured interviews of five (5) female leaders of Ghanaian social contexts in the United States (US) revealed that the use of intuition is necessary for effective negotiation outcomes due to its primary focus on relationship-building toward transforming conflicts. The knowledge this study adds to the body of research is summed up in the study's conceptual framework. Female leaders, in negotiation situations where there are conflicting parties, prioritize the greater need for stronger relationships and win-win outcomes. The participant female leaders in negotiation contexts utilize their intuition as a bonding mechanism by effectively timing their actions, using an appropriate communication tone, emphasizing relationship building, and drawing from experience to make sound situational judgments (as in assessing a situation in the RPD model). Female leaders' use of intuition in negotiations then translates into creating a force that bridges the gap between the conflicting parties. That force is noticed as conflict transformation, which manifests as a reduction in anger and the promotion of trust and mutual understanding toward strengthening relationships. Future studies can expand the scope of the findings of this research by conducting a comparative analysis between male and female leaders on their use of intuition in negotiations in Ghanaian contexts.
Keywords: intuition, negotiation, conflict transformation, female leaders, Ghanaian social contexts
Procedia PDF Downloads 11
870 Thermo-Oxidative Degradation of Esterified Starch (with Lauric Acid)-Plastic Composite Assembled with Pro-Oxidants and Elastomers
Authors: R. M. S. Sachini Amararathne
Abstract:
This research strives to develop a thermo-degradable starch-plastic compound/masterbatch for industrial packaging applications. A native corn starch, modified by an esterification reaction with lauric acid, is melt blended with an unsaturated elastomer (styrene-butadiene-rubber/styrene-butadiene-styrene). A trace amount of metal salt is added into the internal mixer to study the effect of pro-oxidants in a thermo-oxidative environment. The granulated polymer composite, which consists of 80-86% polyolefin (LLDPE/LDPE/PP) as the pivotal agent, is then extruded with processing aids, antioxidants and some other additives in a co-rotating twin-screw extruder. The pelletized composite is subjected to compression molding, injection molding or blown film extrusion processes to acquire the samples/specimens for tests. The degradation process is explicated by analyzing the results of Fourier transform infrared spectroscopy (FTIR) measurements and thermo-oxidative aging studies (placing the dumb-bell specimens in an air oven at 70 °C for four weeks of exposure), governed by tensile and impact strength test reports. Furthermore, the samples were exposed outdoors in manifold conditions to inspect the degradation process. This industrial process is implemented to reduce the volume of fossil-based garbage by achieving biodegradability and compostability in the natural cycle. Hence the research leads to manufacturing a degradable plastic packaging compound, which is now available in the Sri Lankan market.
Keywords: blown film extrusion, compression moulding, polyolefin, pro-oxidant, styrene-butadiene-rubber, styrene-butadiene-styrene, thermo oxidative aging, unsaturated elastomer
Procedia PDF Downloads 95
869 Bacteriological Quality of Commercially Prepared Fermented Ogi (AKAMU) Sold in Some Parts of South Eastern Nigeria
Authors: Alloysius C. Ogodo, Ositadinma C. Ugbogu, Uzochukwu G. Ekeleme
Abstract:
Food poisoning and infection by bacteria are of public health significance to both developing and developed countries. Samples of ogi (akamu) prepared from white and yellow varieties of maize sold in Uturu and Okigwe were analyzed, together with laboratory-prepared ogi, for microbial quality using standard microbiological methods. The analyses showed that the white and yellow varieties had total bacterial counts (cfu/g) of 4.0 × 10⁷ and 3.9 × 10⁷ for the laboratory-prepared ogi, while the commercial ogi had 5.2 × 10⁷ and 4.9 × 10⁷, 4.9 × 10⁷ and 4.5 × 10⁷, and 5.4 × 10⁷ and 5.0 × 10⁷ for the Eke-Okigwe, Up-gate and Nkwo-Achara markets, respectively. The Staphylococcal counts ranged from 2.0 × 10² to 5.0 × 10² and 1.0 × 10² to 4.0 × 10² for the white and yellow varieties from the different markets, while no Staphylococcal growth was recorded on the laboratory-prepared ogi. The laboratory-prepared ogi had no coliform growth, while the commercially prepared ogi had counts of 0.5 × 10³ to 1.6 × 10³ for the white variety and 0.3 × 10³ to 1.1 × 10³ for the yellow variety, respectively. Lactic acid bacterial counts of 3.5 × 10⁶ and 3.0 × 10⁶ were recorded for the laboratory ogi, while the commercially prepared ogi ranged from 3.2 × 10⁶ to 4.2 × 10⁶ (white variety) and 3.0 × 10⁶ to 3.9 × 10⁶ (yellow). The bacterial isolates from the commercial and laboratory-fermented ogi showed that Lactobacillus sp., Leuconostoc sp. and Citrobacter sp. were present in all the samples; Micrococcus sp. and Klebsiella sp. were isolated from the Eke-Okigwe and ABSU-up-gate market varieties, respectively; E. coli and Staphylococcus sp. were present in the Eke-Okigwe and Nkwo-Achara markets; and Salmonella sp. were isolated from all three markets. Hence, there are chances of contracting food-borne diseases from commercially prepared ogi. Therefore, there is the need for sanitary measures in the production of fermented cereals so as to minimize the rate of food-borne pathogens during processing and storage.
Keywords: ogi, fermentation, bacterial quality, lactic acid bacteria, maize
Procedia PDF Downloads 407
868 Exploration of Environmental Parameters on the Evolution of Vernacular Building Techniques in East Austria
Authors: Hubert Feiglstorfer
Abstract:
Due to its location in a transition zone from the Pannonian to the pre-Alpine region, the east of Austria shows a small-scale diversity in the regional development of certain vernacular building techniques. In this article the relationship between natural building material resources, topography and climate will be examined. Besides environmental preconditions, social and economic historical factors have led to different construction techniques within certain regions in the Weinviertel and Burgenland, the two eastern federal states of Austria. But even within these regions, varying building techniques were found, due to the locally different use of raw materials like wood, stone, clay, lime, or organic fibres. Within these small-scale regions, building traditions were adapted over the course of time due to changes in the use of the building material, for example from wood to brick or from wood to earth. The processing of the raw materials varies from region to region, for example as rammed earth, cob, log, or brick construction. Environmental preconditions cross national borders. For that reason, developments in the neighbouring countries, the Czech Republic, Slovakia, Hungary and Slovenia, are included in this analysis. As an outcome of this research, a map was drawn which shows the interrelation between locally available building materials, topography, climate and local building techniques. As a result of this study, which covers the last 300 years, one can see how the local population used natural resources very sensitively, adapted to local environmental preconditions. In the case of clay, for example, changes in the proportions of lime and particular minerals cause structural changes that differ from region to region. Based on material analyses in the field of clay mineralogy, on ethnographic research, literature and archive research, explanations for certain local structural developments will be given for the first time over the region of East Austria.
Keywords: European crafts, material culture, architectural history, earthen architecture, earth building history
Procedia PDF Downloads 238
867 Film Dosimetry – An Asset for Collaboration Between Cancer Radiotherapy Centers at Established Institutions and Those Located in Low- and Middle-Income Countries
Authors: A. Fomujong, P. Mobit, A. Ndlovu, R. Teboh
Abstract:
Purpose: Film's unique qualities, such as tissue equivalence, high spatial resolution, near energy independence and comparatively low cost, ought to make it the preferred and most widely used dosimeter in radiotherapy centers in low- and middle-income countries (LMICs). This, however, is not always the case, as other factors that are often taken for granted in advanced radiotherapy centers remain a challenge in LMICs. We explored the unique qualities of film dosimetry that can make it possible for one institution to benefit from another's protocols via collaboration. Methods: For simplicity, two institutions were considered in this work. We used a single batch of films (EBT-XD) and established a calibration protocol, including scan protocols and calibration curves, using the radiotherapy delivery system at Institution A. We then proceeded to perform patient-specific QA for patients treated on system A (PSQA-A-A). Films from the same batch were then sent to a remote center for PSQA on radiotherapy delivery system B. Irradiations were done at Institution B, and the films were then returned to Institution A for processing and analysis (PSQA-B-A). The following points were taken into consideration throughout the process: (a) a reference film was irradiated to a known dose on the same system irradiating the PSQA film; (b) for calibration, we utilized the one-scan protocol and maintained the same scan orientation of the calibration, PSQA and reference films. Results: Gamma index analysis using a dose threshold of 10% and 3%/2mm criteria showed gamma passing rates of 99.8% and 100% for the PSQA-A-A and PSQA-B-A, respectively. Conclusion: This work demonstrates that one could use established film dosimetry protocols in one institution, e.g., an advanced radiotherapy center, and apply similar accuracies to irradiations performed at another institution, e.g., a center located in an LMIC, thus encouraging collaboration between the two for worldwide patient benefit.
Keywords: collaboration, film dosimetry, LMIC, radiotherapy, calibration
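The gamma-index analysis quoted above (3%/2 mm criteria, 10% dose threshold) can be sketched for one-dimensional dose profiles as below; the measured and calculated doses are invented arrays, and a full 2-D film analysis would apply the same formula over a dose plane.

```python
import numpy as np

def gamma_passing_rate(ref_dose, eval_dose, positions,
                       dose_crit=0.03, dta_mm=2.0, threshold=0.10):
    """1-D global gamma analysis: fraction of reference points with gamma <= 1."""
    norm = ref_dose.max()
    passed, evaluated = 0, 0
    for x_r, d_r in zip(positions, ref_dose):
        if d_r < threshold * norm:          # skip the low-dose region
            continue
        dist2 = ((positions - x_r) / dta_mm) ** 2
        dose2 = ((eval_dose - d_r) / (dose_crit * norm)) ** 2
        gamma = np.sqrt(np.min(dist2 + dose2))
        evaluated += 1
        if gamma <= 1.0:
            passed += 1
    return passed / evaluated

# Invented 1-D profiles (dose in Gy, positions in mm) for illustration only.
x = np.arange(0, 100, 1.0)
reference = 2.0 * np.exp(-((x - 50) / 20) ** 2)
evaluated = 1.02 * reference               # evaluated profile 2% above reference
print(f"Gamma passing rate: {gamma_passing_rate(reference, evaluated, x):.1%}")
```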
Procedia PDF Downloads 75
866 Signed Language Phonological Awareness: Building Deaf Children's Vocabulary in Signed and Written Language
Authors: Lynn Mcquarrie, Charlotte Enns
Abstract:
The goal of this project was to develop a visually based, signed language phonological awareness training program and to pilot the intervention with signing deaf children (ages 6-10 years / grades 1-4) who were beginning readers, in order to assess the effects of systematic, explicit American Sign Language (ASL) phonological instruction on both ASL vocabulary and English print vocabulary learning. Growing evidence that signing learners utilize visually based signed language phonological knowledge (homologous to the sound-based phonological level of spoken language processing) when reading underscores the critical need for further research on the innovation of reading instructional practices for visual language learners. Multiple single-case studies using a multiple probe design across content (i.e., sign and print targets incorporating specific ASL phonological parameters - handshapes) were implemented to examine whether a functional relationship existed between instruction and acquisition of these skills. The results indicated that, for all cases, representing a variety of language abilities, the visually based phonological teaching approach was exceptionally powerful in helping children to build their sign and print vocabularies. Although intervention/teaching studies have been essential in testing hypotheses about the spoken language phonological processes supporting non-deaf children's reading development, there are no parallel intervention/teaching studies exploring hypotheses about signed language phonological processes in supporting deaf children's reading development. This study begins to provide the evidence needed to pursue innovative teaching strategies that incorporate the strengths of visual learners.
Keywords: American Sign Language phonological awareness, dual language strategies, vocabulary learning, word reading
Procedia PDF Downloads 333
865 ARABEX: Automated Dotted Arabic Expiration Date Extraction using Optimized Convolutional Autoencoder and Custom Convolutional Recurrent Neural Network
Authors: Hozaifa Zaki, Ghada Soliman
Abstract:
In this paper, we introduce an approach for Automated Dotted Arabic Expiration Date Extraction using an Optimized Convolutional Autoencoder (ARABEX) with bidirectional LSTM. This approach is used for translating Arabic dot-matrix expiration dates into their corresponding filled-in dates. A custom lightweight Convolutional Recurrent Neural Network (CRNN) model is then employed to extract the expiration dates. Due to the lack of available dataset images for the Arabic dot-matrix expiration date, we generated synthetic images by creating an Arabic dot-matrix True Type Font (TTF) matrix to address this limitation. Our model was trained on a realistic synthetic dataset of 3287 images, covering the period from 2019 to 2027, represented in the format yyyy/mm/dd. We then trained our custom CRNN model using the generated synthetic images to assess the performance of our model (ARABEX) by extracting expiration dates from the translated images. Our proposed approach achieved an accuracy of 99.4% on the test dataset of 658 images, while also achieving a Structural Similarity Index (SSIM) of 0.46 for image translation on our dataset. The ARABEX approach demonstrates its ability to be applied to various downstream learning tasks, including image translation and reconstruction. Moreover, this pipeline (ARABEX+CRNN) can be seamlessly integrated into automated sorting systems to extract expiry dates and sort products accordingly during the manufacturing stage. By eliminating the need for manual entry of expiration dates, which can be time-consuming and inefficient for merchants, our approach offers significant gains in efficiency and accuracy for Arabic dot-matrix expiration date recognition.
Keywords: computer vision, deep learning, image processing, character recognition
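The Structural Similarity Index reported above can be computed for any pair of images with scikit-image, as sketched below on two synthetic grayscale arrays; the arrays are placeholders, not the ARABEX dataset, and the ARABEX/CRNN models themselves are not reproduced.

```python
import numpy as np
from skimage.metrics import structural_similarity

# Placeholder 8-bit grayscale images standing in for a dot-matrix date image
# and its translated (filled-in) counterpart.
rng = np.random.default_rng(0)
original = rng.integers(0, 256, size=(64, 256), dtype=np.uint8)
noise = rng.integers(-30, 30, size=original.shape)
translated = np.clip(original.astype(int) + noise, 0, 255).astype(np.uint8)

ssim = structural_similarity(original, translated, data_range=255)
print(f"SSIM = {ssim:.2f}")
```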
Procedia PDF Downloads 82
864 Preferred Character Size for Oblique Angles
Authors: Photjanat Phimnom, Haruetai Lohasiriwat
Abstract:
In today's world, LED displays are used for presenting visual information under various circumstances. Such information is an important intermediary in human information processing. Researchers have investigated diverse factors that influence the effectiveness of this process. Letter size is undoubtedly one major factor that has been tested and recommended by many standards and guidelines. However, viewing information on a display from a directly perpendicular position is a typical assumption, whereas many actual situations require viewing from an angle. This current research aims to study the effect of oblique viewing angle and viewing distance on the ability to recognize alphabet letters, numbers, and English words. A total of ten participants volunteered for our 3 × 4 × 4 within-subject study. Independent variables included three distance levels (2, 6, and 12 m), four oblique angles (0, 45, 60, and 75 degrees), and four target types (alphabet letters, numbers, short words, and long words). Following the method of constant stimuli, we found that a larger oblique angle, ranging from 0 to 75 degrees from the line of sight, results in a significantly higher legibility threshold, i.e., a larger font size required (p-value < 0.05). The viewing distance factor also showed a significant effect on the threshold (p-value < 0.05). However, the effect of the distance factor is expected to be confounded by the quality of the screen used in our experiment. Lastly, our results show that single alphabet letters as well as single numbers are recognized at a significantly lower threshold (smaller font size) compared to both short and long words (p-value < 0.05). Therefore, it is recommended that when designing information to be presented on an LED display, the full range of possible oblique angles should be taken into account in order to specify the preferred letter size. Additionally, the recommended letter sizes for 100% readability in our tested conditions are provided in the paper.
Keywords: letter size, oblique angle, viewing distance, legibility threshold
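With the method of constant stimuli, a legibility threshold is typically estimated by fitting a psychometric (logistic) function to the proportion of correct responses at each tested letter size and reading off the size at a criterion level; the sketch below does this with SciPy on invented response data, not the study's measurements.

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic(size, threshold, slope):
    """Psychometric function: probability of correct recognition vs letter size."""
    return 1.0 / (1.0 + np.exp(-slope * (size - threshold)))

# Invented data: letter heights (mm) and proportion correct at one viewing angle.
sizes = np.array([10, 15, 20, 25, 30, 35, 40], dtype=float)
p_correct = np.array([0.05, 0.20, 0.45, 0.70, 0.90, 0.97, 0.99])

(threshold, slope), _ = curve_fit(logistic, sizes, p_correct, p0=[22.0, 0.3])
print(f"Estimated legibility threshold: {threshold:.1f} mm (slope {slope:.2f})")
```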
Procedia PDF Downloads 394
863 In vitro Effects of Salvia officinalis on Bovine Spermatozoa
Authors: Eva Tvrdá, Boris Botman, Marek Halenár, Tomáš Slanina, Norbert Lukáč
Abstract:
In vitro storage and processing of animal semen represent a risk factor to spermatozoa vitality, potentially leading to reduced fertility. A variety of substances isolated from natural sources may exhibit protective or antioxidant properties on the spermatozoon, thus extending the lifespan of stored ejaculates. This study compared the effects of different concentrations of Salvia officinalis extract on the motility, mitochondrial activity, viability and reactive oxygen species (ROS) production of bovine spermatozoa during different time periods (0, 2, 6 and 24 h) of in vitro culture. Spermatozoa motility was assessed using the computer-assisted sperm analysis (CASA) system. Cell viability was examined using the metabolic activity MTT assay, the eosin-nigrosin staining technique was used to evaluate sperm viability, and ROS generation was quantified using luminometry. The CASA analysis revealed that motility in the experimental groups supplemented with 0.5-2 µg/mL Salvia extract was significantly lower in comparison with the control (P<0.05; Time 24 h). At the same time, long-term exposure of spermatozoa to concentrations ranging between 0.05 µg/mL and 2 µg/mL had a negative impact on mitochondrial metabolism (P<0.05; Time 24 h). The viability staining revealed that 0.001-1 µg/mL Salvia extract had no effects on bovine male gametes; however, 2 µg/mL Salvia had a persistent negative effect on spermatozoa (P<0.05). Furthermore, 0.05-2 µg/mL Salvia exhibited an immediate ROS-promoting effect on the sperm culture (P>0.05; Time 0 h and 2 h), which remained significant throughout the entire in vitro culture (P<0.05; Time 24 h). Our results point to the necessity of examining the specific effects the biomolecules present in Salvia officinalis may have, individually or collectively, on in vitro sperm vitality and oxidative profile.
Keywords: bulls, CASA, MTT test, reactive oxygen species, sage, Salvia officinalis, spermatozoa
Procedia PDF Downloads 338
862 A Comparative Study on Fish Raised with Feed Formulated with Various Organic Wastes and Commercial Feed
Authors: Charles Chijioke Dike, Hugh Clifford Chima Maduka, Chinwe A. Isibor
Abstract:
Fish is among the products consumed at a very high rate. In most countries of the world, fish is part of the daily meal. The high cost of commercial fish feeds in Africa has made it necessary to develop alternative fish feeds processed from organic waste. The objective of this research is to investigate the efficacy of fish feeds processed from various animal wastes in order to know whether those feeds can be alternatives to commercial feeds. This work shall be carried out at the Research Laboratory Unit of the Department of Human Biochemistry, Faculty of Basic Medical Sciences, College of Health Sciences, Nnamdi Azikiwe University (NAU), Nnewi Campus, Anambra State. The fingerlings to be used shall be obtained from the Agricultural Department of NAU, Awka, Anambra State, and allowed to acclimatize for 14 d. Animal and food wastes shall be obtained from Nnewi. The fish shall be divided into groups 1-13 (chicken manure only, cow dung only, pig manure only, chicken manure + yeast, cow dung + yeast, pig manure + yeast, chicken manure + other wastes + yeast, cow dung + other wastes + yeast, and pig manure + other wastes + yeast). Feed assessment shall be carried out by determining bulk density, feed water absorption, feed hardness, feed oil absorption, and feed water stability. Nutritional analysis shall be carried out on the feeds processed. Risk assessment shall be done on the fish by determining methylmercury (MeHg), polycyclic aromatic hydrocarbons (PAHs), and dichloro-diphenyl-trichloroethane (DDT) in the fish. The results from this study shall be analyzed statistically using SPSS statistical software, version 25. The hypothesis is that fish feeds processed from animal wastes are efficient in raising catfish. The outcome of this study shall provide the basis for the formulation of fish feeds from organic wastes.
Keywords: assessment, feeds, health risk, wastes
Procedia PDF Downloads 103
861 X-Ray Diffraction and Crosslink Density Analysis of Starch/Natural Rubber Polymer Composites Prepared by Latex Compounding Method
Authors: Raymond Dominic Uzoh
Abstract:
Starch fillers were extracted from three plant sources, namely amora tuber (a wild variety of Irish potato), sweet potato and yam, and their particle size, pH, and percentage amylose and amylopectin composition were determined by high-performance liquid chromatography (HPLC). The starch was introduced into natural rubber in the liquid phase (through gelatinization) by the latex compounding method and compounded according to the standard method. The prepared starch/natural rubber composites were characterized with an Instron universal testing machine (UTM) for tensile mechanical properties. The composites were further characterized by X-ray diffraction and crosslink density analysis. The particle size determination showed that amora starch granules have the largest particle size (156 × 47 μm), followed by yam starch (155 × 40 μm) and then sweet potato starch (153 × 46 μm). The pH test also revealed that amora starch has a near-neutral pH of 6.9, yam 6.8, and sweet potato 5.2, respectively. Amylose and amylopectin determination showed that yam starch has the highest percentage of amylose (29.68), followed by potato (22.34) and then amora starch with the lowest value (14.86), respectively. Tensile mechanical property testing revealed that yam starch produced the best tensile mechanical properties, followed by amora starch and then sweet potato starch. The structure and crystalline/amorphous nature of the product composite were confirmed by X-ray diffraction, while the nature of crosslinking was confirmed by a swelling test in toluene solvent using the Flory-Rehner approach. This research study has rendered a workable strategy for enhancing interfacial interaction between a hydrophilic filler (starch) and a hydrophobic polymeric matrix (natural rubber), yielding moderately good tensile mechanical properties for further exploitation, development and application in the rubber processing industry.
Keywords: natural rubber, fillers, starch, amylose, amylopectin, crosslink density
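A hedged sketch of the Flory-Rehner estimate of crosslink density from the toluene swelling test mentioned above; the molar volume of toluene and the rubber-toluene interaction parameter are typical literature values taken as assumptions, and the rubber volume fraction in the example call is invented.

```python
import math

def crosslink_density(v_r, chi=0.39, v_solvent=106.3):
    """Flory-Rehner crosslink density (mol/cm^3) from the rubber volume fraction
    v_r in the swollen gel. chi = polymer-solvent interaction parameter (assumed
    ~0.39 for natural rubber in toluene); v_solvent = molar volume of toluene (cm^3/mol)."""
    numerator = -(math.log(1.0 - v_r) + v_r + chi * v_r ** 2)
    denominator = v_solvent * (v_r ** (1.0 / 3.0) - v_r / 2.0)
    return numerator / denominator

# Example with an assumed rubber volume fraction of 0.25 in the swollen sample.
print(f"Crosslink density: {crosslink_density(0.25):.2e} mol/cm^3")
```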
Procedia PDF Downloads 169
860 Integration of Big Data to Predict Transportation for Smart Cities
Authors: Sun-Young Jang, Sung-Ah Kim, Dongyoun Shin
Abstract:
Intelligent transportation systems are essential to building smarter cities. Machine learning based transportation prediction is a highly promising approach because it makes invisible aspects of the system visible. In this context, this research aims to build a prototype model that predicts the transportation network by using big data and machine learning technology. In detail, among urban transportation systems, this research focuses on the bus system. The research problem is that existing headway models cannot respond to dynamic transportation conditions; thus, bus delays often occur. To overcome this problem, a prediction model is presented to find patterns of bus delay by using machine learning on the following data sets: traffic, weather, and bus status. This research presents a flexible headway model to predict bus delay and analyzes the result. The prototype model is composed of real-time bus data. The data are gathered through public data portals and a real-time Application Program Interface (API) provided by the government. These data are fundamental resources for organizing interval pattern models of bus operations together with traffic environment factors (road speeds, station conditions, weather, and real-time operating information of buses). The prototype model is designed with a machine learning tool (RapidMiner Studio), and tests were conducted for bus delay prediction. This research presents experiments to increase the prediction accuracy for bus headway by analyzing urban big data. Big data analysis is important to predict the future and to find correlations by processing huge amounts of data. Therefore, based on this analysis method, this research represents an effective use of machine learning and urban big data to understand urban dynamics.
Keywords: big data, machine learning, smart city, social cost, transportation network
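The flexible headway idea above can also be prototyped outside RapidMiner; the sketch below trains a random-forest regressor (a stand-in for the study's learner) on invented records of road speed, weather and station conditions to predict bus delay in minutes. All features, coefficients and sample sizes are assumptions for illustration.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

# Invented feature table: [road speed (km/h), rainfall (mm/h), passengers waiting]
rng = np.random.default_rng(42)
X = np.column_stack([rng.uniform(10, 60, 500),
                     rng.exponential(1.0, 500),
                     rng.integers(0, 40, 500)])
# Toy ground truth: delays grow when speeds drop and rain/passenger load rise.
y = 12 - 0.15 * X[:, 0] + 2.0 * X[:, 1] + 0.1 * X[:, 2] + rng.normal(0, 1, 500)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print(f"R^2 on held-out trips: {model.score(X_te, y_te):.2f}")
```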
Procedia PDF Downloads 260
859 Bridge Damage Detection and Stiffness Reduction Using Vibration Data: Experimental Investigation on a Small Scale Steel Bridge
Authors: Mirco Tarozzi, Giacomo Pignagnoli, Andrea Benedetti
Abstract:
Planning the maintenance of civil structures often requires the evaluation of their level of safety in order to be able to choose which structure, and to what extent, needs a structural retrofit. This work deals with the evaluation of the stiffness reduction of a scaled steel deck due to the presence of localized damage. The dynamic tests performed on it have shown the variability of its main frequencies linked to the gradual reduction of its rigidity. The deck consists of a steel grillage of four secondary beams and three main beams linked to a concrete slab. The steel deck is 6 m long and 3 m wide, and it rests on two abutments made of concrete. By processing the acceleration signals recorded under random excitation of the deck, the main natural frequencies of this bridge have been extracted. In order to assign more reliable parameters to the numerical model of the deck, some load tests were performed, and the mechanical properties of the materials and the supports were obtained. The two external beams were cut at one third of their length, and the structural strength was restored by the design of a bolted plate. The gradual removal of the bolts and the plates made the simulation of localized damage possible. In order to define the relationship between frequency variation and loss in stiffness, the natural frequencies were identified before and after the occurrence of the damage corresponding to each step. The study of the relationship between stiffness losses and frequency shifts is reported in this paper: the square of the frequency variation due to the presence of the damage is proportional to the ratio between the rigidities. This relationship can be used to quantify the loss in stiffness of a real-scale bridge in an efficient way.
Keywords: damage detection, dynamic test, frequency shifts, operational modal analysis, steel bridge
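The relationship reported above, that the squared frequency ratio tracks the stiffness ratio, follows from the single-degree-of-freedom approximation f ∝ √(k/m) with unchanged modal mass; the snippet below applies it to invented frequency readings to estimate the stiffness loss.

```python
def stiffness_ratio(f_damaged: float, f_intact: float) -> float:
    """SDOF approximation: k_damaged / k_intact = (f_damaged / f_intact)**2,
    assuming the modal mass is unchanged by the damage."""
    return (f_damaged / f_intact) ** 2

# Invented example: first natural frequency drops from 12.4 Hz to 11.8 Hz.
ratio = stiffness_ratio(11.8, 12.4)
print(f"Estimated residual stiffness: {ratio:.1%} (loss {1 - ratio:.1%})")
```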
Procedia PDF Downloads 160
858 Hedgerow Detection and Characterization Using Very High Spatial Resolution SAR Data
Authors: Saeid Gharechelou, Stuart Green, Fiona Cawkwell
Abstract:
Hedgerows play an important role for a wide range of ecological habitats, landscape, agricultural management, carbon sequestration and wood production. Detecting hedgerows accurately from satellite imagery is a challenging remote sensing problem, because spatially a hedge is very similar to a linear object like a road, while from a spectral viewpoint a hedge is very similar to a forest. Remote sensors with very high spatial resolution (VHR) have recently enabled the automatic detection of hedges through the acquisition of images with sufficient spectral and spatial resolution. Indeed, VHR remote sensing data have recently provided the opportunity to detect hedgerows as line features, but difficulties remain in monitoring their characterization at the landscape scale. This research uses TerraSAR-X Spotlight and Staring mode data with 3-5 m resolution, acquired in the wet and dry seasons of 2014-2015, over a test site in Fermoy, Ireland, to detect hedgerows. Both dual polarizations (HH/VV) of the Spotlight data are used for the detection of hedgerows. Various SAR image analysis techniques, explored by trial and error and integrating classification algorithms such as texture analysis, support vector machines, k-means and random forest, are used to detect hedgerows and characterize them. We apply Shannon entropy (ShE) and backscattering analysis of single and double bounce in the polarimetric analysis to drive the object-oriented classification and finally extract the hedgerow network. The results are still in progress, and other methods need to be applied as well to find the best method for the study area. This research is under way, and here we present only the preliminary work, which indicates that polarimetric TSX imagery can potentially detect hedgerows.
Keywords: TerraSAR-X, hedgerow detection, high resolution SAR image, dual polarization, polarimetric analysis
Procedia PDF Downloads 230
857 Enzyme Producing Psychrophilic Pseudomonas spp. Isolated from Poultry Meats
Authors: Ali Aydin, Mert Sudagidan, Aysen Coban, Alparslan Kadir Devrim
Abstract:
Pseudomonas spp. (specifically, P. fluorescens and P. fragi) are considered the principal spoilage microorganisms of refrigerated poultry meats. Higher levels of psychrophilic spoilage Pseudomonas spp. on carcasses at the end of processing lead to a decreased shelf life of the refrigerated product. The aims of the study were the identification of psychrophilic Pseudomonas spp. having proteolytic and lipolytic activities from poultry meats by 16S rRNA and rpoB gene sequencing, the investigation of protease- and lipase-related genes, and the determination of the proteolytic activity of Pseudomonas spp. In the isolation procedure, chicken meat samples collected from local markets and slaughterhouses were homogenized, and the lysates were incubated on standard methods agar and skim milk agar for the selection of proteolytic bacteria, and on tributyrin agar for the selection of lipolytic bacteria, at +4 °C for 7 days. After detection of proteolytic and lipolytic colonies, the isolates were first analyzed by biochemical tests such as Gram staining and catalase and oxidase tests. DNA sequencing analysis and comparison with GenBank revealed that 126 strongly enzyme-producing Pseudomonas strains were identified as predominantly P. fluorescens (n=55), P. fragi (n=42), Pseudomonas spp. (n=24), P. cedrina (n=2), P. poae (n=1), P. koreensis (n=1), and P. gessardi (n=1). Additionally, the protease-related aprX gene was screened in the strains and was detected in 69/126 strains, whereas the lipase-related lipA gene was found in 9 Pseudomonas strains. Protease activity was determined using a commercially available protease assay kit, and 5 strains showed high protease activity. The results showed that psychrophilic Pseudomonas strains were present in chicken meat samples and that they can produce substantial levels of proteases and lipases that contribute to food spoilage, decreasing food quality and safety.
Keywords: Pseudomonas, chicken meat, protease, lipase
Procedia PDF Downloads 387
856 Tool for Maxillary Sinus Quantification in Computed Tomography Exams
Authors: Guilherme Giacomini, Ana Luiza Menegatti Pavan, Allan Felipe Fattori Alves, Marcela de Oliveira, Fernando Antonio Bacchim Neto, José Ricardo de Arruda Miranda, Seizo Yamashita, Diana Rodrigues de Pina
Abstract:
The maxillary sinus (MS), part of the paranasal sinus complex, is one of the most enigmatic structures in modern humans. The literature has suggested that MSs function as olfaction accessories, to heat or humidify inspired air, for thermoregulation, to impart resonance to the voice, and so on. Thus, the real function of the MS is still uncertain. Furthermore, the MS anatomy is complex and varies from person to person. Many diseases may affect the development process of the sinuses. The incidence of rhinosinusitis and other pathoses in the MS is comparatively high, so volume analysis has clinical value. Providing volume values for the MS could be helpful in evaluating the presence of any abnormality and could be used for treatment planning and evaluation of the outcome. Computed tomography (CT) has allowed a more exact assessment of this structure, which enables a quantitative analysis. However, this is not always possible in the clinical routine, and if possible, it involves much effort and/or time. Therefore, it is necessary to have a convenient, robust, and practical tool correlated with the MS volume, allowing clinical applicability. Nowadays, the available methods for MS segmentation are manual or semi-automatic. Additionally, manual methods present inter- and intra-individual variability. Thus, the aim of this study was to develop an automatic tool to quantify the MS volume in CT scans of the paranasal sinuses. This study was developed with ethical approval from the authors' institutions and national review panels. The research involved 30 retrospective exams from the University Hospital, Botucatu Medical School, São Paulo State University, Brazil. The tool for automatic MS quantification, developed in Matlab®, uses a hybrid method combining different image processing techniques. For MS detection, the algorithm uses a Support Vector Machine (SVM) with features such as pixel value, spatial distribution, shape and others. The detected pixels are used as seed points for a region growing (RG) segmentation. Then, morphological operators are applied to reduce false-positive pixels, improving the segmentation accuracy. These steps are applied to all slices of the CT exam, obtaining the MS volume. To evaluate the accuracy of the developed tool, the automatic method was compared with manual segmentation performed by an experienced radiologist. For comparison, we used Bland-Altman statistics, linear regression, and the Jaccard similarity coefficient. From the statistical analyses for the comparison between both methods, the linear regression showed a strong association and low dispersion between variables. The Bland-Altman analyses showed no significant differences between the analyzed methods. The Jaccard similarity coefficient was > 0.90 in all exams. In conclusion, the developed tool to automatically quantify MS volume proved to be robust, fast, and efficient when compared with manual segmentation. Furthermore, it avoids the intra- and inter-observer variations caused by manual and semi-automatic methods. As future work, the tool will be applied in clinical practice. Thus, it may be useful in the diagnosis and treatment determination of MS diseases.
Keywords: maxillary sinus, support vector machine, region growing, volume quantification
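As a hedged sketch of the region-growing step described above (the SVM seed detection and morphological clean-up are omitted), the function below grows a region from a seed pixel over a 2-D CT slice, accepting neighbours whose intensity stays within a tolerance of the seed value; the slice and parameters are placeholders.

```python
import numpy as np
from collections import deque

def region_grow(image: np.ndarray, seed: tuple, tol: float = 100.0) -> np.ndarray:
    """Grow a connected region from `seed`, keeping pixels whose intensity differs
    from the seed intensity by at most `tol` (e.g. Hounsfield units)."""
    mask = np.zeros(image.shape, dtype=bool)
    seed_val = float(image[seed])
    queue = deque([seed])
    mask[seed] = True
    while queue:
        r, c = queue.popleft()
        for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):   # 4-connectivity
            nr, nc = r + dr, c + dc
            if (0 <= nr < image.shape[0] and 0 <= nc < image.shape[1]
                    and not mask[nr, nc]
                    and abs(float(image[nr, nc]) - seed_val) <= tol):
                mask[nr, nc] = True
                queue.append((nr, nc))
    return mask

# Placeholder "CT slice": an air-filled (-1000 HU) sinus cavity inside bone (300 HU).
slice_hu = np.full((128, 128), 300.0)
slice_hu[40:90, 50:100] = -1000.0
sinus_mask = region_grow(slice_hu, seed=(60, 70), tol=200.0)
print(f"Segmented area: {sinus_mask.sum()} pixels")
```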
Procedia PDF Downloads 504
855 Intrinsically Dual-Doped Conductive Polymer System for Electromagnetic Shielding Applications
Authors: S. Koul, Joshua Adedamola
Abstract:
Electromagnetic pollution (EMP) is currently a global concern: it not only adversely affects human health but also causes the malfunctioning of sensitive equipment at both local and global levels. The market offers many incumbent technologies to address these issues, but a processable, sustainable material solution with acceptable limits for GHG emissions is still at an exploratory stage. The present work offers a sustainable material solution with a wide range of processability in terms of the polymeric resin matrix and shielding operational efficiency across the electromagnetic spectrum, covering both ionizing and non-ionizing electromagnetic radiation. The present work offers an in-situ synthesized conducting polyaniline (PANI), prepared in the presence of a hybrid dual dopant system, with tuned conductivity and a high shielding efficiency between 89 and 92 decibels, depending upon the EMI frequency range. The conductive polymer synthesized in the presence of the hybrid dual dopant system via the in-situ emulsion polymerization method offers a surface resistance of 1.0 ohm/cm with thermal stability up to 245 °C in its powder form. This conductive polymer with the hybrid dual dopant system was used as a filler material with different polymeric thermoplastic resin systems for the preparation of conductive composites. Intrinsically conductive polymer (ICP) composites based on the hybrid dual dopant system were prepared using melt blending, extrusion, and finally compression molding processing techniques. ICP composites with the hybrid dual dopant system offered good mechanical, thermal, structural and weathering properties and stable surface resistivity over a period of time. The preliminary shielding behavior of the ICP composites between frequencies of 10 GHz and 24 GHz showed a shielding efficiency of more than 90 dB.
Keywords: ICP, dopant, EMI, shielding
Procedia PDF Downloads 81
854 Modeling in the Middle School: Eighth-Grade Students’ Construction of the Summer Job Problem
Authors: Neslihan Sahin Celik, Ali Eraslan
Abstract:
Mathematical models and modeling are among the topics that have been intensively discussed in recent years. In line with the results of the PISA studies, researchers in many countries have begun to question how well students in the school education system are prepared to solve the real-world problems they will encounter in their future professional lives. As a result, many mathematics educators have begun to emphasize the importance of new skills and understandings, such as constructing, hypothesizing, describing, manipulating, predicting, and working together on complex and multifaceted problems, for success beyond school. As students increasingly face these kinds of situations in their daily lives, it is important to make sure that they have enough experience to work together and interpret mathematical situations that enable them to think in different ways and share their ideas with their peers. Thus, model-eliciting activities are one of the main tools that help students gain such experiences and the new skills required. This research study was carried out in the town center of a big city located in the Black Sea region of Turkey. The participants were eighth-grade students in a middle school. After a six-week preliminary study, three students in an eighth-grade classroom were selected using a criterion sampling technique and placed in a focus group. The focus group of three students was videotaped as they worked on a model-eliciting activity, the Summer Job Problem. The conversation of the group was transcribed, examined together with the students' written work, and then qualitatively analyzed through the lens of Blum's (1996) modeling processing cycle. The study results showed that eighth-grade students can successfully work with the model-eliciting activity, develop a model based on two parameters, and review the whole process. On the other hand, they had difficulties relating the parameters to each other and taking all parameters into account to establish the model.
Keywords: middle school, modeling, mathematical modeling, summer job problem
Procedia PDF Downloads 337853 Preparing Data for Calibration of Mechanistic-Empirical Pavement Design Guide in Central Saudi Arabia
Authors: Abdulraaof H. Alqaili, Hamad A. Alsoliman
Abstract:
Through progress in pavement design, a design method titled the Mechanistic-Empirical Pavement Design Guide (MEPDG) was developed. The road and highway network in Saudi Arabia is currently evolving as a result of increasing traffic volume, and the MEPDG is therefore being implemented for flexible pavement design by the Saudi Ministry of Transportation. Implementing the MEPDG for local pavement design requires calibrating the distress models to local conditions (traffic, climate, and materials). This paper aims to prepare the data for calibrating the MEPDG in central Saudi Arabia. The first goal is therefore the collection of flexible pavement design data for the local conditions of the Riyadh region. Since the collected data must be converted into MEPDG input data, the main goal of this paper is the analysis of the collected data. The data analysis covers truck classification, the traffic growth factor, annual average daily truck traffic (AADTT), monthly adjustment factors (MAF), vehicle class distribution (VCD), truck hourly distribution factors, axle load distribution factors (ALDF), the number of axles of each type (single, tandem, and tridem) per truck class, cloud cover percent, and the road sections selected for the local calibration. Detailed descriptions of the input parameters are provided, leading to an approach for the successful implementation of the MEPDG. Local calibration of the MEPDG to the conditions of the Riyadh region can be performed based on the findings of this paper. Keywords: mechanistic-empirical pavement design guide (MEPDG), traffic characteristics, materials properties, climate, Riyadh
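As an illustration of how two of these traffic inputs are typically derived, the Python sketch below computes monthly adjustment factors and a simple AADTT estimate from monthly truck counts, using the commonly cited definition MAF_i = 12·MADTT_i / ΣMADTT. The sample counts, station, and function names are hypothetical and are not taken from the Riyadh data set; the sketch is a minimal stand-in for the authors' processing, not their code.

```python
from statistics import mean

def monthly_adjustment_factors(monthly_avg_daily_trucks: dict[int, float]) -> dict[int, float]:
    """MAF_i = 12 * MADTT_i / sum(MADTT), following the usual MEPDG convention."""
    total = sum(monthly_avg_daily_trucks.values())
    return {m: 12.0 * v / total for m, v in monthly_avg_daily_trucks.items()}

def vehicle_class_distribution(truck_counts_by_class: dict[int, int]) -> dict[int, float]:
    """Percentage of trucks observed in each FHWA vehicle class (4-13)."""
    total = sum(truck_counts_by_class.values())
    return {c: 100.0 * n / total for c, n in truck_counts_by_class.items()}

if __name__ == "__main__":
    # Hypothetical monthly average daily truck counts for one count station.
    madtt = {m: v for m, v in enumerate(
        [950, 980, 1010, 1000, 990, 960, 940, 955, 1005, 1020, 1015, 985], start=1)}
    aadtt = mean(madtt.values())  # simple annual average as a first approximation
    maf = monthly_adjustment_factors(madtt)
    print(f"AADTT ~ {aadtt:.0f} trucks/day")
    print("MAF:", {m: round(f, 3) for m, f in maf.items()})
```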
Procedia PDF Downloads 226852 Vitamin Content of Swordfish (Xhiphias gladius) Affected by Salting and Frying
Authors: L. Piñeiro, N. Cobas, L. Gómez-Limia, S. Martínez, I. Franco
Abstract:
The swordfish (Xiphias gladius) is a large oceanic fish of high commercial value that is widely distributed in the world's oceans. It is considered an important source of high-quality proteins, vitamins, and essential fatty acids, although only half of the population follows nutritionists' recommendation to consume fish at least twice a week. Swordfish is consumed worldwide because of its low fat content and high protein content and is generally sold fresh or frozen, as pieces or slices. The aim of this study was to evaluate the effect of salting and frying on the content of the water-soluble vitamins (B2, B3, B9, and B12) and fat-soluble vitamins (A, D, and E) of swordfish. Three swordfish loins from the Pacific Ocean were analyzed. All the fish weighed between 50 and 70 kg and were transported to the laboratory frozen (-18 °C). Before processing, they were defrosted at 4 °C. Each loin was sliced and salted in brine. After cleaning, the slices were divided into portions (10×2 cm) and fried in olive oil. The identification and quantification of vitamins were carried out by high-performance liquid chromatography (HPLC), using methanol and 0.010% trifluoroacetic acid as mobile phases at a flow rate of 0.7 mL min-1. A UV-Vis detector was used to detect the water-soluble vitamins and the fat-soluble vitamins A and D, and a fluorescence detector was used for vitamin E. During salting, the water- and fat-soluble vitamin contents remained essentially constant, except for an evident decrease in vitamin B2. The diffusion of salt into the interior of the pieces and the loss of constitutional water during this stage would be related to this significant decrease. In general, after frying, the water- and fat-soluble vitamins showed thermolability, with retention percentages ranging from 50% to 100%. Vitamin B3 exhibited the highest retention, with values close to 100%, whereas vitamin B9 presented the greatest losses, with a retention of less than 20%. Keywords: frying, HPLC, salting, swordfish, vitamins
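Retention percentages of this kind are typically computed with a true-retention formula that accounts for the weight change of the portion during cooking. The Python sketch below is a minimal illustration with hypothetical concentrations and weights; it shows the general calculation only and is not the authors' procedure.

```python
def true_retention_percent(conc_cooked_mg_per_100g: float,
                           weight_cooked_g: float,
                           conc_raw_mg_per_100g: float,
                           weight_raw_g: float) -> float:
    """True retention (%): vitamin remaining in the cooked portion relative to the raw portion."""
    retained = conc_cooked_mg_per_100g * weight_cooked_g
    initial = conc_raw_mg_per_100g * weight_raw_g
    return 100.0 * retained / initial

if __name__ == "__main__":
    # Hypothetical example: a 100 g raw slice loses moisture while frying.
    print(f"{true_retention_percent(0.9, 80.0, 1.0, 100.0):.1f}% retained")  # -> 72.0%
```

Comparing concentrations alone would overstate retention for fried samples, since moisture loss concentrates the remaining vitamins; weighting by portion mass avoids that bias.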
Procedia PDF Downloads 126851 Interactive IoT-Blockchain System for Big Data Processing
Authors: Abdallah Al-ZoubI, Mamoun Dmour
Abstract:
The spectrum of IoT devices is becoming widely diversified, entering almost all possible fields and finding applications in industry, health, finance, logistics, and education, to name a few. Active IoT endpoint sensors and devices exceeded the 12 billion mark in 2021 and are expected to reach 27 billion by 2025, with over $34 billion in total market value. This sheer rise in the number and use of IoT devices brings with it considerable concerns regarding data storage, analysis, manipulation, and protection. IoT blockchain-based systems have recently been proposed as a decentralized solution for large-scale data storage and protection. COVID-19 has in fact accelerated the desire to utilize IoT devices, as it impacted both demand and supply and significantly affected several regions for logistical reasons such as supply chain interruptions, shortages of shipping containers, and port congestion. An IoT-blockchain system is proposed here to handle, in an interactive manner, the big data generated by a distributed network of sensors and controllers. The system is designed on the Ethereum platform and utilizes smart contracts, programmed in Solidity, to execute and manage data generated by IoT sensors and devices such as the Raspberry Pi 4 running Raspbian with add-on hardware security modules. The proposed system runs a number of applications hosted on a local machine used to validate transactions and then sends data to the rest of the network through the InterPlanetary File System (IPFS) and Ethereum Swarm, forming a closed, blockchain-run IoT ecosystem in which a number of distributed IoT devices can communicate and interact in a controlled environment. A prototype with three IoT handling units distributed over a wide geographical area was deployed in order to examine its feasibility, performance, and costs. Initial results indicated that big IoT data retrieval and storage are feasible and that interactivity is possible, provided that certain conditions of cost, speed, and throughput are met. Keywords: IoT devices, blockchain, Ethereum, big data
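To make the off-chain/on-chain split concrete, the sketch below shows in plain Python how an edge node might package a sensor reading and derive a deterministic content digest: the bulky payload would be stored off-chain (e.g., in IPFS or Swarm) while only the short, tamper-evident digest is anchored on-chain. This is a conceptual sketch, not the authors' implementation: the digest is a plain SHA-256 hash used as a stand-in for a real IPFS content identifier, the device names and fields are hypothetical, and no Ethereum or IPFS client code is shown.

```python
import hashlib
import json
import time

def make_reading(device_id: str, value: float, unit: str) -> dict:
    """Package one sensor reading the way an edge node might before publishing."""
    return {"device": device_id, "value": value, "unit": unit, "ts": int(time.time())}

def content_digest(payload: dict) -> str:
    """Deterministic SHA-256 digest of the payload (a stand-in for an IPFS CID)."""
    canonical = json.dumps(payload, sort_keys=True, separators=(",", ":")).encode()
    return hashlib.sha256(canonical).hexdigest()

if __name__ == "__main__":
    reading = make_reading("rpi4-node-01", 23.7, "C")
    digest = content_digest(reading)
    # The payload lives off-chain; a smart contract would record only the digest,
    # letting any peer verify later that the stored data has not been altered.
    print("off-chain payload:", reading)
    print("on-chain digest  :", digest)
```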
Procedia PDF Downloads 150850 Keynote Talk: The Role of Internet of Things in the Smart Cities Power System
Authors: Abdul-Rahman Al-Ali
Abstract:
As the number of mobile devices grows exponentially, it is estimated that about 50 billion devices will be connected to the Internet by the year 2020, and by the end of this decade an average of eight connected devices per person is expected worldwide. These 50 billion devices are not only mobile phones and data-browsing gadgets but also machine-to-machine and man-to-machine devices. With such growing numbers of devices, the Internet of Things (IoT) concept has recently become one of the key emerging technologies. Within smart grid technologies, smart home appliances, intelligent electronic devices (IEDs), and distributed energy resources (DERs) are major IoT objects that can be addressed using IPv6. These objects are collectively called the smart grid Internet of Things (SG-IoT). The SG-IoT generates big data that requires high-speed computing infrastructure, widespread computer networks, big data storage, software, and platform services. A utility company's control and data centers cannot handle such a large number of devices, such high-speed processing, and such massive data storage. Building large data center infrastructure takes a long time and requires widespread communication networks and huge capital investment. Maintaining and upgrading the control and data centers' infrastructure and communication networks, as well as updating and renewing software licenses, collectively requires additional cost. This can be overcome by utilizing emerging computing paradigms such as cloud computing, which can serve as a smart grid enabler and replace the legacy utility data centers. The talk will highlight the role of IoT, cloud computing services, and their development models within smart grid technologies. Keywords: intelligent electronic devices (IED), distributed energy resources (DER), internet, smart home appliances
Procedia PDF Downloads 324849 A Thermo-mechanical Finite Element Model to Predict Thermal Cycles and Residual Stresses in Directed Energy Deposition Technology
Authors: Edison A. Bonifaz
Abstract:
In this work, a numerical procedure is proposed for designing dense multi-material structures using the directed energy deposition (DED) process. A thermo-mechanical finite element model to predict thermal cycles and residual stresses is presented. A numerical layer build-up procedure coupled with a moving heat flux was constructed to minimize the strains and residual stresses that result from the multi-layer deposition of AISI 316 austenitic steel on an AISI 304 austenitic steel substrate. To simulate the DED process, the automated interface of the ABAQUS AM module was used to define element activation and heat input event data as a function of time and position. In this manner, the construction of ABAQUS user-defined subroutines was not necessary. The thermal cycles and thermally induced stresses created during melt pool crystallization in the multi-layer metal AM deposition were predicted and validated. Results were analyzed in three independent metal layers of three different experiments. The one-way heat and material deposition toolpath used in the analysis was created with a MATLAB path script. An optimal combination of feedstock and heat input printing parameters suitable for fabricating dense multi-material structures in the DED metal AM process was established. At constant power, it can be concluded that the lower the heat input, the lower the peak temperatures and residual stresses; since heat input per unit length decreases as travel speed increases at constant power, this means that, from a design point of view, the one-way heat and material deposition toolpath with the higher welding speed should be selected. Keywords: event series, thermal cycles, residual stresses, multi-pass welding, abaqus am modeler
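The one-way toolpath is essentially an event series of time, position, and laser power values consumed by the layer build-up procedure. The Python sketch below is a simplified stand-in for such a path generator (the study used a MATLAB script, which is not reproduced here); the geometry, speed, power, and dwell values are hypothetical, and the heat input per unit length is taken simply as power divided by travel speed.

```python
from dataclasses import dataclass

@dataclass
class PathEvent:
    """One row of a heat-input event series: time, position, and laser power."""
    t_s: float
    x_mm: float
    y_mm: float
    z_mm: float
    power_w: float

def one_way_toolpath(track_length_mm: float, n_layers: int, layer_height_mm: float,
                     speed_mm_s: float, power_w: float, dwell_s: float = 5.0,
                     step_mm: float = 1.0) -> list[PathEvent]:
    """Unidirectional (one-way) single-track, multi-layer deposition path."""
    events, t = [], 0.0
    for layer in range(n_layers):
        z = (layer + 1) * layer_height_mm
        x = 0.0
        while x <= track_length_mm:
            events.append(PathEvent(t, x, 0.0, z, power_w))
            x += step_mm
            t += step_mm / speed_mm_s
        t += dwell_s  # inter-layer dwell; the heat source is off, so no event is emitted
    return events

if __name__ == "__main__":
    # Hypothetical parameters; heat input per unit length = power / speed.
    path = one_way_toolpath(40.0, 3, 0.5, speed_mm_s=8.0, power_w=1000.0)
    print(f"{len(path)} events, heat input ~ {1000.0 / 8.0:.0f} J/mm")
```

At the same power, raising the travel speed lowers the joules deposited per millimeter, which is the mechanism behind the recommendation to prefer the higher welding speed.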
Procedia PDF Downloads 69