Search results for: thin film processing
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 5559

849 Integration of Big Data to Predict Transportation for Smart Cities

Authors: Sun-Young Jang, Sung-Ah Kim, Dongyoun Shin

Abstract:

The intelligent transportation system is essential to building smarter cities. Machine-learning-based transportation prediction is a highly promising approach because it makes invisible aspects of the system visible. In this context, this research aims to build a prototype model that predicts transportation network behavior by using big data and machine learning technology. Among urban transportation systems, this research focuses on the bus system. The research problem is that the existing headway model cannot respond to dynamic transportation conditions, so bus delays occur frequently. To overcome this problem, a prediction model is presented that finds patterns of bus delay by applying machine learning to the following data sets: traffic, weather, and bus status. This research presents a flexible headway model to predict bus delay and analyzes the results. The prototype model is built on real-time bus data. The data are gathered through public data portals and real-time Application Program Interfaces (APIs) provided by the government. These data are the fundamental resources for organizing interval pattern models of bus operations from traffic environment factors (road speeds, station conditions, weather, and real-time operating information of buses). The prototype model is designed in a machine learning tool (RapidMiner Studio), and tests for bus delay prediction were conducted. This research presents experiments to increase prediction accuracy for bus headway by analyzing urban big data. Big data analysis is important for predicting the future and for finding correlations by processing huge amounts of data. Therefore, based on this analysis method, this research demonstrates an effective use of machine learning and urban big data to understand urban dynamics.
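
A minimal sketch of the kind of delay-prediction model described above, written outside RapidMiner Studio purely for illustration; the feature names and the toy rows are assumptions, not the authors' actual data schema or tool chain.

# Hedged sketch: predict bus delay (minutes) from traffic, weather and bus-status features.
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

# Toy merged data set standing in for the portal/API feeds described in the abstract.
df = pd.DataFrame({
    "road_speed_kmh":    [42, 18, 55, 23, 35, 12, 50, 27],
    "rain_mm_per_h":     [0.0, 4.2, 0.0, 1.1, 0.0, 6.5, 0.2, 2.0],
    "sched_headway_min": [10, 10, 15, 10, 15, 10, 15, 10],
    "delay_min":         [1.0, 6.5, 0.5, 4.0, 1.5, 9.0, 0.8, 3.5],
})
X, y = df.drop(columns="delay_min"), df["delay_min"]
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

model = GradientBoostingRegressor(n_estimators=200, learning_rate=0.05, random_state=0)
model.fit(X_tr, y_tr)
print("MAE (min):", round(mean_absolute_error(y_te, model.predict(X_te)), 2))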

Keywords: big data, machine learning, smart city, social cost, transportation network

Procedia PDF Downloads 260
848 Bridge Damage Detection and Stiffness Reduction Using Vibration Data: Experimental Investigation on a Small Scale Steel Bridge

Authors: Mirco Tarozzi, Giacomo Pignagnoli, Andrea Benedetti

Abstract:

The planning of maintenance for civil structures often requires the evaluation of their level of safety in order to decide which structure needs a structural retrofit, and to what extent. This work deals with the evaluation of the stiffness reduction of a scaled steel deck due to the presence of localized damage. The dynamic tests performed on it have shown the variability of its main frequencies linked to the gradual reduction of its rigidity. The deck consists of a steel grillage of four secondary beams and three main beams connected to a concrete slab. It is 6 m long and 3 m wide, and it rests on two concrete abutments. By processing the acceleration signals due to a random excitation of the deck, its main natural frequencies have been extracted. In order to assign more reliable parameters to the numerical model of the deck, load tests have been performed, and the mechanical properties of the materials and the supports have been obtained. The two external beams have been cut at one third of their length, and the structural strength has been restored by a bolted plate designed for the purpose. The gradual loss of the bolts and the removal of the plates have made it possible to simulate localized damage. In order to define the relationship between frequency variation and loss in stiffness, the natural frequencies have been identified before and after the occurrence of the damage corresponding to each step. The relationship between stiffness losses and frequency shifts is reported in this paper: the square of the frequency variation due to the presence of the damage is proportional to the ratio between the rigidities. This relationship can be used to quantify the loss in stiffness of a full-scale bridge in an efficient way.
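
Read in standard single-degree-of-freedom terms (an interpretation added here for clarity, not a formula quoted from the authors), the reported proportionality can be written in LaTeX form as

(f_d / f_0)^2 \propto k_d / k_0,

where f_0 and k_0 are the natural frequency and stiffness of the undamaged deck and f_d and k_d those of the damaged one; since f = \frac{1}{2\pi}\sqrt{k/m} for an unchanged mass m, the squared frequency ratio tracks the stiffness ratio.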

Keywords: damage detection, dynamic test, frequency shifts, operational modal analysis, steel bridge

Procedia PDF Downloads 160
847 Hedgerow Detection and Characterization Using Very High Spatial Resolution SAR DATA

Authors: Saeid Gharechelou, Stuart Green, Fiona Cawkwell

Abstract:

Hedgerows play an important role in a wide range of ecological habitats, in landscape and agricultural management, in carbon sequestration, and in wood production. Accurate hedgerow detection from satellite imagery is a challenging remote sensing problem because, spatially, a hedge is very similar to a linear object such as a road, while, spectrally, it is very similar to a forest. Remote sensors with very high spatial resolution (VHR) have recently enabled the automatic detection of hedges by acquiring images with sufficient spectral and spatial resolution. Indeed, VHR remote sensing data now provide the opportunity to detect hedgerows as line features, but difficulties remain in monitoring their characteristics at the landscape scale. In this research, TerraSAR-X Spotlight and Staring mode data with 3-5 m resolution, acquired in the wet and dry seasons of 2014-2015, are used to detect hedgerows in the test site of Fermoy, Ireland. Both channels of the dual-polarization Spotlight data (HH/VV) are used for the detection of hedgerows. Various SAR image techniques are tested in a trial-and-error manner, integrating classification algorithms such as texture analysis, support vector machines, k-means, and random forests, to detect hedgerows and characterize them. Shannon entropy (ShE) and backscattering analysis of single and double bounce in a polarimetric analysis are applied for object-oriented classification and, finally, extraction of the hedgerow network. The work is still in progress, and other methods need to be applied as well to find the best approach for the study area. This paper presents the preliminary finding that polarimetric TSX imagery can potentially detect hedgerows.

Keywords: TerraSAR-X, hedgerow detection, high resolution SAR image, dual polarization, polarimetric analysis

Procedia PDF Downloads 230
846 Enzyme Producing Psychrophilic Pseudomonas spp. Isolated from Poultry Meats

Authors: Ali Aydin, Mert Sudagidan, Aysen Coban, Alparslan Kadir Devrim

Abstract:

Pseudomonas spp. (specifically, P. fluorescens and P. fragi) are considered the principal spoilage microorganisms of refrigerated poultry meat. Higher levels of psychrophilic spoilage Pseudomonas spp. on carcasses at the end of processing lead to a decrease in the shelf life of the refrigerated product. The aims of the study were the identification of psychrophilic Pseudomonas spp. with proteolytic and lipolytic activities from poultry meat by 16S rRNA and rpoB gene sequencing, the investigation of protease- and lipase-related genes, and the determination of the proteolytic activity of the Pseudomonas spp. In the isolation procedure, chicken meat samples collected from local markets and slaughterhouses were homogenized, and the lysates were incubated on standard methods agar and skim milk agar for the selection of proteolytic bacteria and on tributyrin agar for the selection of lipolytic bacteria at +4 °C for 7 days. After detection of proteolytic and lipolytic colonies, the isolates were first analyzed by biochemical tests such as Gram staining, catalase and oxidase tests. DNA sequencing analysis and comparison with GenBank revealed that 126 strongly enzyme-producing Pseudomonas strains were identified, predominantly P. fluorescens (n=55), P. fragi (n=42), Pseudomonas spp. (n=24), P. cedrina (n=2), P. poae (n=1), P. koreensis (n=1), and P. gessardi (n=1). Additionally, the protease-related aprX gene was screened for in the strains and was detected in 69/126 strains, whereas the lipase-related lipA gene was found in 9 Pseudomonas strains. Protease activity was determined using a commercially available protease assay kit, and 5 strains showed high protease activity. The results showed that psychrophilic Pseudomonas strains are present in chicken meat samples and can produce important levels of proteases and lipases for food spoilage, decreasing food quality and safety.

Keywords: Pseudomonas, chicken meat, protease, lipase

Procedia PDF Downloads 387
845 Tool for Maxillary Sinus Quantification in Computed Tomography Exams

Authors: Guilherme Giacomini, Ana Luiza Menegatti Pavan, Allan Felipe Fattori Alves, Marcela de Oliveira, Fernando Antonio Bacchim Neto, José Ricardo de Arruda Miranda, Seizo Yamashita, Diana Rodrigues de Pina

Abstract:

The maxillary sinus (MS), part of the paranasal sinus complex, is one of the most enigmatic structures in modern humans. The literature has suggested that MSs function as olfaction accessories, heat or humidify inspired air, assist thermoregulation, and impart resonance to the voice, among other roles. Thus, the real function of the MS is still uncertain. Furthermore, MS anatomy is complex and varies from person to person. Many diseases may affect the development process of the sinuses. The incidence of rhinosinusitis and other pathoses in the MS is comparatively high, so volume analysis has clinical value. Providing volume values for the MS could be helpful in evaluating the presence of any abnormality and could be used for treatment planning and evaluation of the outcome. Computed tomography (CT) has allowed a more exact assessment of this structure, which enables a quantitative analysis. However, this is not always possible in the clinical routine, and when it is possible, it involves much effort and/or time. Therefore, it is necessary to have a convenient, robust, and practical tool correlated with the MS volume, allowing clinical applicability. Nowadays, the available methods for MS segmentation are manual or semi-automatic. Additionally, manual methods present inter- and intraindividual variability. Thus, the aim of this study was to develop an automatic tool to quantify the MS volume in CT scans of the paranasal sinuses. This study was developed with ethical approval from the authors’ institutions and national review panels. The research involved 30 retrospective exams from the University Hospital, Botucatu Medical School, São Paulo State University, Brazil. The tool for automatic MS quantification, developed in Matlab®, uses a hybrid method combining different image processing techniques. For MS detection, the algorithm uses a Support Vector Machine (SVM) with features such as pixel value, spatial distribution, and shape, among others. The detected pixels are used as seed points for a region growing (RG) segmentation. Then, morphological operators are applied to reduce false-positive pixels, improving the segmentation accuracy. These steps are applied to all slices of the CT exam, yielding the MS volume. To evaluate the accuracy of the developed tool, the automatic method was compared with manual segmentation performed by an experienced radiologist. For the comparison, we used Bland-Altman statistics, linear regression, and the Jaccard similarity coefficient. The linear regression showed a strong association and low dispersion between the variables. The Bland-Altman analyses showed no significant differences between the methods. The Jaccard similarity coefficient was > 0.90 in all exams. In conclusion, the developed tool to automatically quantify MS volume proved to be robust, fast, and efficient when compared with manual segmentation. Furthermore, it avoids the intra- and inter-observer variations caused by manual and semi-automatic methods. As future work, the tool will be applied in clinical practice, where it may be useful in the diagnosis and treatment determination of MS diseases.
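
A minimal sketch of the region-growing and morphological-cleanup steps described above, applied slice by slice; the SVM seed detector is replaced here by a hypothetical seed point per slice, and the voxel spacing, tolerance and structuring-element sizes are illustrative assumptions rather than the study's Matlab® implementation.

import numpy as np
from skimage.segmentation import flood
from skimage.morphology import binary_opening, disk, remove_small_objects

def sinus_volume(ct_slices, seeds, voxel_mm3=0.5 * 0.5 * 1.0):
    """ct_slices: list of 2-D HU arrays; seeds: one (row, col) seed per slice."""
    total_voxels = 0
    for img, seed in zip(ct_slices, seeds):
        grown = flood(img, seed, tolerance=150)          # region growing from the seed
        cleaned = binary_opening(grown, disk(2))         # remove thin false positives
        cleaned = remove_small_objects(cleaned, min_size=50)
        total_voxels += int(cleaned.sum())
    return total_voxels * voxel_mm3                      # volume in mm^3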

Keywords: maxillary sinus, support vector machine, region growing, volume quantification

Procedia PDF Downloads 504
844 Intrinsically Dual-Doped Conductive Polymer System for Electromagnetic Shielding Applications

Authors: S. Koul, Joshua Adedamola

Abstract:

A global concern about electromagnetic pollution (EMP) is that it not only adversely affects human health but also causes malfunctioning of sensitive equipment, both locally and globally. The market offers many incumbent technologies to address the issue, but a processable, sustainable material solution within acceptable limits of GHG emission is still at an exploratory stage. The present work offers a sustainable material solution with a wide range of processability in terms of the polymeric resin matrix and with shielding efficiency across the electromagnetic spectrum, covering both ionizing and non-ionizing electromagnetic radiation. The work presents an in-situ synthesized conducting polyaniline (PANI) prepared in the presence of a hybrid dual-dopant system, with tuned conductivity and high shielding efficiency between 89 and 92 decibels, depending upon the EMI frequency range. The conductive polymer synthesized in the presence of the hybrid dual-dopant system via the in-situ emulsion polymerization method offers a surface resistance of 1.0 ohm/cm with thermal stability up to 245 °C in powder form. This conductive polymer with the hybrid dual-dopant system was used as a filler material with different thermoplastic resin systems for the preparation of conductive composites. Intrinsically conductive polymer (ICP) composites based on the hybrid dual-dopant system were prepared using melt blending, extrusion, and, finally, compression molding. The ICP composites with the hybrid dual-dopant system offered good mechanical, thermal, structural, and weathering properties, as well as stable surface resistivity over time. The preliminary shielding behavior of the ICP composites between 10 GHz and 24 GHz showed a shielding efficiency of more than 90 dB.

Keywords: ICP, dopant, EMI, shielding

Procedia PDF Downloads 81
843 Modeling in the Middle School: Eighth-Grade Students’ Construction of the Summer Job Problem

Authors: Neslihan Sahin Celik, Ali Eraslan

Abstract:

Mathematical models and modeling are among the topics that have been intensively discussed in recent years. In line with the results of the PISA studies, researchers in many countries have begun to question how well students in the school education system are prepared to solve the real-world problems they will encounter in their future professional lives. As a result, many mathematics educators have begun to emphasize the importance of new skills and understandings, such as constructing, hypothesizing, describing, manipulating, predicting, and working together on complex and multifaceted problems, for success beyond school. As students increasingly face these kinds of situations in their daily lives, it is important to make sure that they have enough experience to work together and interpret mathematical situations that enable them to think in different ways and share their ideas with their peers. Thus, model-eliciting activities are one of the main tools that help students gain experience and the new skills required. This research study was carried out in the town center of a big city located in the Black Sea region of Turkey. The participants were eighth-grade students in a middle school. After a six-week preliminary study, three students in an eighth-grade classroom were selected using a criterion sampling technique and placed in a focus group. The focus group was videotaped as the students worked on a model-eliciting activity, the Summer Job Problem. The conversation of the group was transcribed, examined together with the students’ written work, and then qualitatively analyzed through the lens of Blum’s (1996) modeling process cycle. The results showed that eighth-grade students can successfully work with the model-eliciting activity, develop a model based on two parameters, and review the whole process. On the other hand, they had difficulties relating the parameters to each other and taking all parameters into account to establish the model.

Keywords: middle school, modeling, mathematical modeling, summer job problem

Procedia PDF Downloads 337
842 Preparing Data for Calibration of Mechanistic-Empirical Pavement Design Guide in Central Saudi Arabia

Authors: Abdulraaof H. Alqaili, Hamad A. Alsoliman

Abstract:

Through progress in pavement design, a new design method was developed, titled the Mechanistic-Empirical Pavement Design Guide (MEPDG). Nowadays, an evolution of the road and highway network is observed in Saudi Arabia as a result of increasing traffic volume. Therefore, the MEPDG is currently implemented for flexible pavement design by the Saudi Ministry of Transportation. Implementation of the MEPDG for local pavement design requires the calibration of its distress models under local conditions (traffic, climate, and materials). This paper aims to prepare data for the calibration of the MEPDG in central Saudi Arabia. Thus, the first goal is data collection for the design of flexible pavement from the local conditions of the Riyadh region. Since the collected data must be converted into model inputs, the main goal of this paper is the analysis of the collected data. The data analysis in this paper includes processing of each of the following: truck classification, traffic growth factor, Annual Average Daily Truck Traffic (AADTT), Monthly Adjustment Factors (MAFi), Vehicle Class Distribution (VCD), truck hourly distribution factors, Axle Load Distribution Factors (ALDF), the number of axles per type (single, tandem, and tridem) per truck class, cloud cover percent, and the road sections selected for the local calibration. Detailed descriptions of the input parameters are given in this paper, which provides an approach for successful implementation of the MEPDG. Local calibration of the MEPDG to the conditions of the Riyadh region can be performed based on the findings in this paper.
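
Two of the traffic inputs listed above can be summarized by the usual MEPDG-style relationships, written here in LaTeX only for orientation (the paper's own procedure may differ in detail):

MAF_i = 12 \cdot AMDTT_i / \sum_{j=1}^{12} AMDTT_j, \qquad AADTT_n = AADTT_0 (1 + g)^n,

where AMDTT_i is the average monthly daily truck traffic of month i, g the annual traffic growth rate, and n the number of years from the base year.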

Keywords: mechanistic-empirical pavement design guide (MEPDG), traffic characteristics, materials properties, climate, Riyadh

Procedia PDF Downloads 226
841 Vitamin Content of Swordfish (Xiphias gladius) Affected by Salting and Frying

Authors: L. Piñeiro, N. Cobas, L. Gómez-Limia, S. Martínez, I. Franco

Abstract:

The swordfish (Xiphias gladius) is a large oceanic fish of high commercial value, which is widely distributed in the world’s oceans. Swordfish are considered an important source of high-quality proteins, vitamins and essential fatty acids, although only half of the population follows the recommendation of nutritionists to consume fish at least twice a week. Swordfish is consumed worldwide because of its low fat content and high protein content. It is generally sold fresh, frozen, and as pieces or slices. The aim of this study was to evaluate the effect of salting and frying on the content of water-soluble vitamins (B2, B3, B9 and B12) and fat-soluble vitamins (A, D, and E) in swordfish. Three swordfish loins from the Pacific Ocean were analyzed. All the fish weighed between 50 and 70 kg and were transported to the laboratory frozen (-18 ºC). Before processing, they were defrosted at 4 ºC. Each loin was sliced and salted in brine. After cleaning, the slices were divided into portions (10×2 cm) and fried in olive oil. The identification and quantification of vitamins were carried out by high-performance liquid chromatography (HPLC), using methanol and 0.010% trifluoroacetic acid as mobile phases at a flow rate of 0.7 mL min⁻¹. A UV-Vis detector was used for the water-soluble vitamins and the fat-soluble vitamins A and D, and a fluorescence detector was used for vitamin E. During salting, the water- and fat-soluble vitamin contents remained largely constant, although an evident decrease was observed in vitamin B2. The diffusion of salt into the interior of the pieces and the loss of constitution water that occur during this stage would be related to this significant decrease. In general, after frying, the water-soluble and fat-soluble vitamins showed some thermolability, although retention percentages remained between 50 and 100%. Vitamin B3 exhibited the highest retention, with values close to 100%, whereas vitamin B9 presented the greatest losses, with a retention of less than 20%.
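
For orientation only, cooking retention is commonly expressed in the food-composition literature as true retention; the authors' exact calculation may differ, so the LaTeX formula below is merely the conventional definition behind percentages of this kind:

TR(\%) = 100 \cdot (C_{cooked} W_{cooked}) / (C_{raw} W_{raw}),

where C is the vitamin concentration per gram of tissue and W the weight of the portion before and after processing.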

Keywords: frying, HPLC, salting, swordfish, vitamins

Procedia PDF Downloads 126
840 Interactive IoT-Blockchain System for Big Data Processing

Authors: Abdallah Al-ZoubI, Mamoun Dmour

Abstract:

The spectrum of IoT devices is becoming widely diversified, entering almost all possible fields and finding applications in industry, health, finance, logistics, and education, to name a few. The number of active IoT endpoint sensors and devices exceeded the 12 billion mark in 2021 and is expected to reach 27 billion in 2025, with over $34 billion in total market value. This sheer rise in the number and use of IoT devices brings with it considerable concerns regarding data storage, analysis, manipulation, and protection. IoT blockchain-based systems have recently been proposed as a decentralized solution for large-scale data storage and protection. COVID-19 has actually accelerated the desire to utilize IoT devices, as it impacted both demand and supply and significantly affected several regions due to logistic reasons such as supply chain interruptions, shortage of shipping containers, and port congestion. An IoT-blockchain system is proposed to handle big data generated by a distributed network of sensors and controllers in an interactive manner. The system is designed using the Ethereum platform, which utilizes smart contracts, programmed in Solidity, to execute and manage data generated by IoT sensors and devices such as the Raspberry Pi 4 running Raspbian with add-on hardware security modules. The proposed system runs a number of applications hosted by a local machine used to validate transactions. It then sends data to the rest of the network through the InterPlanetary File System (IPFS) and Ethereum Swarm, forming a closed IoT ecosystem run by blockchain, where a number of distributed IoT devices can communicate and interact in a closed, controlled environment. A prototype has been deployed with three IoT handling units distributed over a wide geographical area in order to examine its feasibility, performance, and cost. Initial results indicate that big IoT data retrieval and storage is feasible and that interactivity is possible, provided that certain conditions of cost, speed, and throughput are met.
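
A minimal sketch of the off-chain/on-chain split described above: a sensor reading is serialized and content-addressed, and only the digest would be anchored by a smart contract. The device id, payload fields and SHA-256 hashing are illustrative assumptions; the authors' actual IPFS and Ethereum client calls are not reproduced here.

import hashlib, json, time

def package_reading(device_id, value):
    payload = {"device": device_id, "value": value, "ts": int(time.time())}
    blob = json.dumps(payload, sort_keys=True).encode()
    digest = hashlib.sha256(blob).hexdigest()   # stand-in for an IPFS content identifier
    return payload, digest

payload, digest = package_reading("rpi4-unit-01", 23.7)
print("store off-chain:", payload)
print("anchor on-chain:", digest)               # a contract call would persist this digest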

Keywords: IoT devices, blockchain, Ethereum, big data

Procedia PDF Downloads 150
839 Keynote Talk: The Role of Internet of Things in the Smart Cities Power System

Authors: Abdul-Rahman Al-Ali

Abstract:

As the number of mobile devices grows exponentially, about 50 billion devices are estimated to be connected to the Internet by the year 2020, and by the end of this decade an average of eight connected devices per person is expected worldwide. These 50 billion devices are not only mobile phones and data browsing gadgets but also machine-to-machine and man-to-machine devices. With such growing numbers of devices, the Internet of Things (IoT) has recently become one of the emerging technologies. Within smart grid technologies, smart home appliances, Intelligent Electronic Devices (IED), and Distributed Energy Resources (DER) are major IoT objects that can be made addressable using IPv6. These objects are called the smart grid Internet of things (SG-IoT). The SG-IoT generates big data that requires high-speed computing infrastructure, widespread computer networks, big data storage, software, and platform services. A utility company's control and data centers cannot handle such a large number of devices, high-speed processing, and massive data storage. Building large data center infrastructure takes a long time, and it also requires widespread communication networks and huge capital investment. Maintaining and upgrading control and data center infrastructure and communication networks, as well as updating and renewing software licenses, collectively require additional cost. This can be overcome by utilizing emerging computing paradigms such as cloud computing, which can serve as a smart grid enabler to replace legacy utility data centers. The talk will highlight the role of IoT, cloud computing services, and their development models within smart grid technologies.

Keywords: intelligent electronic devices (IED), distributed energy resources (DER), internet, smart home appliances

Procedia PDF Downloads 324
838 A Thermo-mechanical Finite Element Model to Predict Thermal Cycles and Residual Stresses in Directed Energy Deposition Technology

Authors: Edison A. Bonifaz

Abstract:

In this work, a numerical procedure is proposed to design dense multi-material structures using the Directed Energy Deposition (DED) process. A thermo-mechanical finite element model to predict thermal cycles and residual stresses is presented. A numerical layer build-up procedure coupled with a moving heat flux was constructed to minimize the strains and residual stresses that result from the multi-layer deposition of an AISI 316 austenitic steel on an AISI 304 austenitic steel substrate. To simulate the DED process, the automated interface of the ABAQUS AM module was used to define element activation and heat input event data as functions of time and position. In this manner, the construction of ABAQUS user-defined subroutines was not necessary. The thermal cycles and thermally induced stresses created during crystallization of the metal AM pool in multi-layer deposition were predicted and validated. Results were analyzed in three independent metal layers of three different experiments. The one-way heat and material deposition toolpath used in the analysis was created with a MATLAB path script. An optimal combination of feedstock and heat input printing parameters suitable for fabricating multi-material dense structures in the directed energy deposition metal AM process was established. At constant power, it can be concluded that the lower the heat input, the lower the peak temperatures and residual stresses. This means that, from a design point of view, the one-way heat and material deposition toolpath with the higher welding speed should be selected.
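
The heat-input argument in the closing sentences follows the conventional welding/DED relation, stated here in LaTeX as the usual definition rather than the exact expression used in the model:

HI = \eta P / v,

where P is the source power, v the travel (welding) speed, and \eta the transfer efficiency; at constant power, a higher travel speed lowers the heat input and, per the results above, the peak temperatures and residual stresses.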

Keywords: event series, thermal cycles, residual stresses, multi-pass welding, abaqus am modeler

Procedia PDF Downloads 69
837 Short Text Classification Using Part of Speech Feature to Analyze Students' Feedback of Assessment Components

Authors: Zainab Mutlaq Ibrahim, Mohamed Bader-El-Den, Mihaela Cocea

Abstract:

Students' textual feedback can hold unique patterns and useful information about the learning process: it can hold information about the advantages and disadvantages of teaching methods, assessment components, facilities, and other aspects of teaching. The results of analysing such feedback can be a key input for institutions' decision makers to advance and update their systems accordingly. This paper proposes a data mining framework for analysing end-of-unit general textual feedback using a part-of-speech (PoS) feature with four machine learning algorithms: support vector machines, decision trees, random forests, and naive Bayes. The proposed framework has two tasks. The first is to use the above algorithms to build an optimal model that automatically classifies the whole data set into two subsets: one tailored to assessment practices (assessment related) and the other containing the non-assessment-related data. The second task is to use the same algorithms to build optimal models for the whole data set and the new data subsets to automatically detect their sentiment. The significance of this paper is to compare the performance of the above four algorithms using the part-of-speech feature with the performance of the same algorithms using an n-gram feature. The paper follows the Knowledge Discovery and Data Mining (KDDM) framework to construct the classification and sentiment analysis models, that is, understanding the assessment domain, cleaning and pre-processing the data set, selecting and running the data mining algorithms, interpreting the mined patterns, and consolidating the discovered knowledge. The experimental results show that the models using either feature performed very well on the first task. However, on the second task, the models that used the part-of-speech feature underperformed in comparison with the models that used unigrams and bigrams.
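
A minimal sketch of the part-of-speech feature idea, using NLTK tagging and one of the four classifiers (a linear SVM); the two toy feedback strings, their labels and the simple whitespace tokenization are assumptions for illustration only.

import nltk
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.pipeline import make_pipeline
from sklearn.svm import LinearSVC

# The tagger resource name differs across NLTK versions; request both quietly.
nltk.download("averaged_perceptron_tagger", quiet=True)
nltk.download("averaged_perceptron_tagger_eng", quiet=True)

def pos_sequence(text):
    # Replace each token with its PoS tag, e.g. "the exam was hard" -> "DT NN VBD JJ"
    return " ".join(tag for _, tag in nltk.pos_tag(text.split()))

docs = ["The exam questions were far too long", "The library opening hours are great"]
labels = ["assessment", "non-assessment"]       # invented labels

clf = make_pipeline(CountVectorizer(ngram_range=(1, 2)), LinearSVC())
clf.fit([pos_sequence(d) for d in docs], labels)
print(clf.predict([pos_sequence("The coursework deadline was unclear")]))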

Keywords: assessment, part of speech, sentiment analysis, student feedback

Procedia PDF Downloads 142
836 Olive-Mill Wastewater and Organo-Mineral Fertilizers Application for the Control of Parasitic Weed Phelipanche ramosa L. Pomel in Tomato

Authors: Grazia Disciglio, Francesco Lops, Annalisa Tarantino, Emanuele Tarantino

Abstract:

The parasitic weed species Phelipanche ramosa (L.) Pomel is one of the major constraints on the tomato crop in the Apulia region (southern Italy). A field experiment was carried out to investigate the effect of six organic products (olive-mill wastewater, Allil isothiocyanate®, Alfa plus K®, Radicon®, Rizosum Max®, Kendal Nem®) on a naturally infested tomato field during the 2016 growing season. A randomized block design with 3 replicates was adopted. Tomato seedlings were transplanted on 19 May 2016. During the growing cycle of the tomato, at 74, 81, 93, and 103 days after transplantation (DAT), the number of parasitic shoots (branched plants) that had emerged in each plot was determined. At harvest, on 13 September 2016, the major quantitative and qualitative yield parameters were determined, including marketable yield, mean fruit weight, dry matter, soluble solids, fruit colour, pH, and titratable acidity. The results show that none of the treatments provided complete control of P. ramosa. However, among the products tested, olive-mill wastewater, Alfa plus K®, Rizosum Max®, and Kendal Nem® applied to the soil gave a number of emerged shoots significantly lower than Radicon® and, especially, than the Allil isothiocyanate® treatment and the untreated control. Regarding the effect of the different treatments on the tomato productive parameters, the marketable yield was significantly higher in the same treatments that gave the lowest P. ramosa infestation. No significant differences in the other fruit characteristics were observed.

Keywords: processing tomato crop, Phelipanche ramosa, olive-mill wastewater, organic fertilizers

Procedia PDF Downloads 325
835 Integrated Decision Support for Energy/Water Planning in Zayandeh Rud River Basin in Iran

Authors: Safieh Javadinejad

Abstract:

In order to make well-informed decisions about long-term system planning, resource managers and policy makers need to understand the interconnections between energy and water use and production, i.e., the energy-water nexus. Planning and assessment issues include the development of strategies for reducing the water and energy systems' vulnerabilities to climate change while also decreasing greenhouse gas emissions. In order to deliver useful decision support for climate adaptation policy and planning, it is essential to understand the regionally specific features of the energy-water nexus and the past and future of the water and energy supply systems. This helps decision makers understand current water-energy system conditions and the capacity for future adaptation plans. This research presents an integrated hydrology/energy modeling platform that can carry out water-energy analyses based on a detailed representation of local circumstances. The modeling links the Water Evaluation and Planning (WEAP) system and the Long Range Energy Alternatives Planning (LEAP) system to create a full picture of water-energy processes. This allows water managers and policy makers to readily understand the links between energy system improvements and hydrological processes and to see how future climate change will affect water-energy systems. The Zayandeh Rud river basin in Iran is selected as a case study to show the results and application of the analysis. This region is known as an area with a large integration of both the electric power and water sectors. The linkages between water, energy, and climate change and possible adaptation strategies are described, along with early insights from applications of the integrated modeling system.

Keywords: climate impacts, hydrology, water systems, adaptation planning, electricity, integrated modeling

Procedia PDF Downloads 292
834 AI-Driven Solutions for Optimizing Master Data Management

Authors: Srinivas Vangari

Abstract:

In the era of big data, ensuring the accuracy, consistency, and reliability of critical data assets is crucial for data-driven enterprises. Master Data Management (MDM) plays a crucial role in this endeavor. This paper investigates the role of Artificial Intelligence (AI) in enhancing MDM, focusing on how AI-driven solutions can automate and optimize various stages of the master data lifecycle. By integrating AI (Quantitative and Qualitative Analysis) into processes such as data creation, maintenance, enrichment, and usage, organizations can achieve significant improvements in data quality and operational efficiency. Quantitative analysis is employed to measure the impact of AI on key metrics, including data accuracy, processing speed, and error reduction. For instance, our study demonstrates an 18% improvement in data accuracy and a 75% reduction in duplicate records across multiple systems post-AI implementation. Furthermore, AI’s predictive maintenance capabilities reduced data obsolescence by 22%, as indicated by statistical analyses of data usage patterns over a 12-month period. Complementing this, a qualitative analysis delves into the specific AI-driven strategies that enhance MDM practices, such as automating data entry and validation, which resulted in a 28% decrease in manual errors. Insights from case studies highlight how AI-driven data cleansing processes reduced inconsistencies by 25% and how AI-powered enrichment strategies improved data relevance by 24%, thus boosting decision-making accuracy. The findings demonstrate that AI significantly enhances data quality and integrity, leading to improved enterprise performance through cost reduction, increased compliance, and more accurate, real-time decision-making. These insights underscore the value of AI as a critical tool in modern data management strategies, offering a competitive edge to organizations that leverage its capabilities.
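
A minimal sketch of one MDM task mentioned above, duplicate-record detection, using simple fuzzy string similarity; the record fields, threshold and toy data are illustrative assumptions, not the study's actual pipeline or models.

from difflib import SequenceMatcher
from itertools import combinations

records = [
    {"id": 1, "name": "Acme Corporation", "city": "Berlin"},
    {"id": 2, "name": "ACME Corp.", "city": "Berlin"},
    {"id": 3, "name": "Globex GmbH", "city": "Munich"},
]

def similarity(a, b):
    key = lambda r: f'{r["name"].lower()} {r["city"].lower()}'
    return SequenceMatcher(None, key(a), key(b)).ratio()

# Flag candidate duplicates above a similarity threshold for steward review.
for r1, r2 in combinations(records, 2):
    score = similarity(r1, r2)
    if score > 0.75:
        print(f"possible duplicate: {r1['id']} ~ {r2['id']} (score={score:.2f})")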

Keywords: artificial intelligence, master data management, data governance, data quality

Procedia PDF Downloads 17
833 Effect of Microstructure and Texture of Magnesium Alloy Due to Addition of Pb

Authors: Yebeen Ji, Jimin Yun, Kwonhoo Kim

Abstract:

Magnesium alloys have had limited industrial application due to their limited slip systems and high plastic anisotropy. It is known that specific textures form during processing (rolling, etc.), and these textures cause poor formability. To solve these problems, many researchers have studied controlling texture by adding rare-earth elements. However, their high cost limits their use; therefore, alternatives are needed to replace them. Although Pb addition does not directly improve magnesium properties, it is known to suppress the diffusion of other alloying elements and to reduce grain boundary energy. These characteristics are similar to those of rare-earth additions, and a similar texture behavior is expected as well. However, there is insufficient research on this. Therefore, this study investigates the development of texture and microstructure after adding Pb to magnesium. The AZ61 alloy and an Mg-15wt%Pb alloy were compared and analyzed to determine the effect of adding solute elements. The alloys were hot rolled and annealed to form a single phase and an initial texture. Afterward, the specimens were set so as to contract parallel to the rolling surface and elongate parallel to the rolling direction, and then subjected to high-temperature plane-strain compression at 723 K and 0.05/s. Microstructural analysis and texture measurements were performed by SEM-EBSD. The peak stress in the true stress-strain curve after compression was higher for AZ61, but the shape of the flow curve was similar for both alloys. For both alloys, continuous dynamic recrystallization was confirmed to occur during the compression process. The basal texture developed parallel to the compression surface, and the pole density was lower in the Mg-15wt%Pb alloy. It is confirmed that this change in behavior occurs because the orientation distribution of the recrystallized grains is more random than that of the parent grains when Pb is added.

Keywords: Mg, texture, Pb, DRX

Procedia PDF Downloads 49
832 Unsupervised Classification of DNA Barcodes Species Using Multi-Library Wavelet Networks

Authors: Abdesselem Dakhli, Wajdi Bellil, Chokri Ben Amar

Abstract:

A DNA barcode is a short mitochondrial DNA fragment whose nucleotides are each made up of three subunits: a phosphate group, a sugar, and a nucleic base (A, T, C, or G). DNA barcodes provide a good source of the information needed to classify living species, an intuition that has been confirmed by many experimental results. Species classification with DNA barcode sequences has been studied by several researchers. The classification problem assigns unknown species to known ones by analyzing their barcodes, a task that has to be supported with reliable methods and algorithms. To analyze species regions or entire genomes, it becomes necessary to use sequence similarity methods. A large set of sequences can be compared simultaneously using Multiple Sequence Alignment, which is known to be NP-complete. To make this type of analysis feasible, heuristics like progressive alignment have been developed. Another tool for similarity search against a database of sequences is BLAST, which outputs shorter regions of high similarity between a query sequence and matched sequences in the database. However, all these methods are still computationally very expensive and require significant computational infrastructure. Our goal is to build predictive models that are highly accurate and interpretable. This method makes it possible to avoid the complex problem of form and structure in different classes of organisms. It is evaluated on empirical data, and its classification performance is compared with other methods. Our system consists of three phases. The first is called transformation and is composed of three steps: Electron-Ion Interaction Pseudopotential (EIIP) coding of the DNA barcodes, Fourier transform, and power spectrum signal processing. The second is called approximation, which is empowered by the use of Multi-Library Wavelet Neural Networks (MLWNN). The third is the classification of DNA barcodes, which is realized by applying a hierarchical classification algorithm.
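
A minimal sketch of the transformation phase described above: EIIP encoding of a DNA barcode followed by a Fourier power spectrum. The EIIP values are the ones commonly used in genomic signal processing, and the toy barcode fragment is invented; the wavelet-network approximation and hierarchical classification stages are not reproduced here.

import numpy as np

EIIP = {"A": 0.1260, "C": 0.1340, "G": 0.0806, "T": 0.1335}

def power_spectrum(barcode):
    signal = np.array([EIIP[b] for b in barcode.upper() if b in EIIP])
    signal -= signal.mean()                 # remove the DC component
    spectrum = np.abs(np.fft.fft(signal)) ** 2
    return spectrum[: len(signal) // 2]     # one-sided power spectrum

ps = power_spectrum("ACGTTGCAGTACGTTAGC")   # toy barcode fragment
print(ps[:5])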

Keywords: DNA barcode, electron-ion interaction pseudopotential, Multi Library Wavelet Neural Networks (MLWNN)

Procedia PDF Downloads 317
831 Minimizing the Drilling-Induced Damage in Fiber Reinforced Polymeric Composites

Authors: S. D. El Wakil, M. Pladsen

Abstract:

Fiber reinforced polymeric (FRP) composites are finding widespread industrial applications because of their exceptionally high specific strength and specific modulus of elasticity. Nevertheless, ready-to-use components or products made of FRP composites are very seldom obtained directly. Secondary processing by machining, particularly drilling, is almost always required to make holes for fastening components together to produce assemblies. That creates problems, since FRP composites are neither homogeneous nor isotropic. Some of the problems encountered include damage in the region around the drilled hole and drilling-induced delamination of the plies, which occurs at both the entrance and exit planes of the workpiece. Evidently, the functionality of the workpiece would be detrimentally affected. The current work was carried out with the aim of eliminating, or at least minimizing, the workpiece damage associated with drilling of FRP composites. Each test specimen involves a woven-reinforced graphite fiber/epoxy composite having a thickness of 12.5 mm (0.5 inch). A large number of test specimens were subjected to drilling operations with different combinations of feed rates and cutting speeds. The drilling-induced damage was taken as the absolute value of the difference between the drilled hole diameter and the nominal one, expressed as a percentage of the nominal diameter. This damage value was determined for each combination of feed rate and cutting speed, and a matrix comprising those values was established, where the columns indicate varying feed rates and the rows indicate varying cutting speeds. Next, the analysis of variance (ANOVA) approach was employed using Minitab software in order to obtain the combination that would minimize the drilling-induced damage. Experimental results show that low feed rates coupled with low cutting speeds yielded the best results.
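
The damage metric described above can be written explicitly in LaTeX (a direct restatement of the definition in the abstract):

D(\%) = 100 \cdot |d_{drilled} - d_{nominal}| / d_{nominal},

where d_{drilled} is the measured hole diameter and d_{nominal} the nominal one; this is the response analyzed by ANOVA over the feed-rate/cutting-speed matrix.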

Keywords: drilling of composites, dimensional accuracy of holes drilled in composites, delamination and charring, graphite-epoxy composites

Procedia PDF Downloads 389
830 Optimizing Energy Efficiency: Leveraging Big Data Analytics and AWS Services for Buildings and Industries

Authors: Gaurav Kumar Sinha

Abstract:

In an era marked by increasing concerns about energy sustainability, this research endeavors to address the pressing challenge of energy consumption in buildings and industries. This study delves into the transformative potential of AWS services in optimizing energy efficiency. The research is founded on the recognition that effective management of energy consumption is imperative for both environmental conservation and economic viability. Buildings and industries account for a substantial portion of global energy use, making it crucial to develop advanced techniques for analysis and reduction. This study sets out to explore the integration of AWS services with big data analytics to provide innovative solutions for energy consumption analysis. Leveraging AWS's cloud computing capabilities, scalable infrastructure, and data analytics tools, the research aims to develop efficient methods for collecting, processing, and analyzing energy data from diverse sources. The core focus is on creating predictive models and real-time monitoring systems that enable proactive energy management. By harnessing AWS's machine learning and data analytics capabilities, the research seeks to identify patterns, anomalies, and optimization opportunities within energy consumption data. Furthermore, this study aims to propose actionable recommendations for reducing energy consumption in buildings and industries. By combining AWS services with metrics-driven insights, the research strives to facilitate the implementation of energy-efficient practices, ultimately leading to reduced carbon emissions and cost savings. The integration of AWS services not only enhances the analytical capabilities but also offers scalable solutions that can be customized for different building and industrial contexts. The research also recognizes the potential for AWS-powered solutions to promote sustainable practices and support environmental stewardship.
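
A minimal sketch of the kind of anomaly detection the abstract points to (identifying unusual readings in energy-consumption data), using a rolling z-score as a simple stand-in; the synthetic hourly series, window length and threshold are illustrative assumptions, and no AWS-specific service calls are shown.

import numpy as np
import pandas as pd

rng = pd.date_range("2024-01-01", periods=24 * 14, freq="h")
load = pd.Series(100 + 20 * np.sin(np.arange(len(rng)) * 2 * np.pi / 24), index=rng)
load.iloc[100] += 80                         # inject one anomalous spike

rolling_mean = load.rolling(48, min_periods=24).mean()
rolling_std = load.rolling(48, min_periods=24).std()
z = (load - rolling_mean) / rolling_std

anomalies = load[z.abs() > 3]                # readings far from recent behavior
print(anomalies)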

Keywords: energy consumption analysis, big data analytics, AWS services, energy efficiency

Procedia PDF Downloads 64
829 Influence of Glenohumeral Joint Approximation Technique on the Cardiovascular System in the Acute Phase after Stroke

Authors: Iva Hereitova, Miroslav Svatek, Vit Novacek

Abstract:

Background and Aim: Autonomic imbalance is one of the complications in immobilized patients in the acute stage after a stroke. The predominance of sympathetic activity significantly increases cardiac activity. The technique of glenohumeral joint approximation may contribute in a non-pharmacological way to the regulation of blood pressure and heart rate in patients in this risk group. The aim of the study was to evaluate the effect of glenohumeral joint approximation on the change in heart rate and blood pressure in immobilized patients in the acute phase after a stroke. Methods: The experimental study bilaterally evaluated heart rate and systolic and diastolic pressure values before and after glenohumeral joint approximation in 40 immobilized participants (72.6 ± 10.2 years) in the acute phase after stroke. The experimental group was compared with 40 healthy participants in the control group (68.6 ± 14.2 years). An SpO2 vital signs monitor and a validated Microlife WatchBP Office blood pressure monitor were used for the evaluation. Statistical processing and evaluation were performed in MATLAB R2019 (The MathWorks®, Inc., Natick, MA, USA). Results: Approximation of the glenohumeral joint resulted in a statistically significant decrease in systolic and diastolic pressure. The average decrease in systolic pressure for the individual groups ranged from 8.2 to 11.3 mmHg (p < 0.001). For diastolic pressure, the average decrease ranged from 5.0 to 14.2 mmHg (p < 0.001). A statistically significant reduction in heart rate (p < 0.01) was found only in patients after ischemic stroke in the inferior cerebral artery, with an average decrease of 3.9 beats per minute (median 4 beats per minute). Conclusion: Approximation of the glenohumeral joint leads to a statistically significant decrease in systolic and diastolic pressure in immobilized patients in the acute phase after stroke.

Keywords: approximation technique, cardiovascular system, glenohumeral joint, stroke

Procedia PDF Downloads 216
828 The Effect of Restaurant Residuals on Performance of Japanese Quail

Authors: A. A. Saki, Y. Karimi, H. J. Najafabadi, P. Zamani, Z. Mostafaie

Abstract:

Restaurant residuals are of interest for reasons such as the competition between human and animal consumption of cereals, increasing environmental pollution, and the high cost of production of livestock products. Restaurant residuals have a high nutritional value (high protein and energy), so they could replace part of poultry diets, especially for Japanese quail. Today, the challenges of processing and using these residuals in modern industry must be confronted. Increasing costs, pressures, and problems associated with waste disposal reinforce the need for re-evaluation and utilization of waste as livestock and poultry feed. This study aimed to investigate the effects of different levels of restaurant residuals on the performance of 300 laying Japanese quails. The experiment included 5 treatments, 4 replicates, and 15 quails in each, from 10 to 18 weeks of age, in a completely randomized design (CRD). The treatments consisted of a basal diet of corn and soybean meal (without restaurant residuals) and treatments 2, 3, 4, and 5, which included the basal diet containing 5, 10, 15, and 20% restaurant residuals, respectively. There were no significant effects of restaurant residual level on body weight (BW), feed conversion ratio (FCR), egg production percentage (EP), or egg mass (EM) between treatments (P > 0.05). However, feed intake (FI) with 5% restaurant residuals was significantly higher than with the 20% treatment (P < 0.05). Egg weight (EW) was also higher with 20% restaurant residuals compared with 10% (P < 0.05). Yolk weight (YW) in the treatments containing 10 and 20% restaurant residuals was significantly higher than in the control (P < 0.05). Egg white weight (EWW) in the 20 and 5% treatments was significantly higher than in the 10% treatment (P < 0.05). Furthermore, EW, egg weight to shell surface area, and egg surface area in the 20% treatment were significantly higher than in the control and the 10% treatment (P < 0.05). The overall results of this study show that restaurant residuals could replace 10 to 15 percent of the laying quail ration without any adverse effect.

Keywords: by-product, laying quail, performance, restaurant residuals

Procedia PDF Downloads 164
827 The Web of Injustice: Untangling Violations of Personality Rights in European International Private Law

Authors: Sara Vora (Hoxha)

Abstract:

Defamation, invasion of privacy, and cyberbullying have all increased in tandem with the growth of the internet. European international private law may struggle to deal with such transgressions if they occur in many jurisdictions. The current study examines how effectively the legal system of European international private law addresses abuses of personality rights in cyberspace. The study starts by discussing how established legal frameworks are being threatened by online personality rights abuses. The article then looks into the rules and regulations of European international private law that are in place to handle overseas lawsuits. This article examines the different elements that courts evaluate when deciding which law to use in a particular case, focusing on the concepts of jurisdiction, choice of law, and recognition and execution of foreign judgements. Next, the research analyses the function of the European Union in preventing and punishing online personality rights abuses. Key pieces of law that control the collecting and processing of personal data on the Internet, including the General Data Protection Regulation (GDPR) and the e-Commerce Directive, are discussed. In addition, this article investigates how the ECtHR handles cases involving the infringement of personal freedoms, including privacy and speech. The article finishes with an assessment of how well the legal framework of European international private law protects individuals' right to privacy online. It draws attention to problems with the present legal structure, such as the inability to enforce international judgements, the inconsistency between national laws, and the necessity for stronger measures to safeguard people' rights online. This paper concludes that while European international private law provides a useful framework for dealing with violations of personality rights online, further harmonisation and stronger enforcement mechanisms are necessary to effectively protect individuals' rights in the digital age.

Keywords: European international private law, personality rights, internet, jurisdiction, cross-border disputes, data protection

Procedia PDF Downloads 73
826 The Predictive Implication of Executive Function and Language in Theory of Mind Development in Preschool Age Children

Authors: Michael Luc Andre, Célia Maintenant

Abstract:

Theory of mind is a milestone in child development that allows children to understand that others may have mental states different from their own. Understanding the developmental stages of theory of mind in children has led researchers to two connected research problems: on the one hand, the link between executive function and theory of mind, and, on the other hand, the relationship between theory of mind and syntax processing. These two lines of research have produced a large literature full of important results, despite a certain level of disagreement between researchers. For a long time, these two research perspectives continued to grow separately, despite conclusions suggesting that the three variables involve the same developmental period. Indeed, our goal was to study the relation between theory of mind, executive function, and language through a single research question, supposing that, between executive function and language, one of the two variables could play a critical role in the relationship between theory of mind and the other variable. Thus, 112 children aged between three and six years old were recruited to complete a receptive and an expressive vocabulary task, a syntax understanding task, a theory of mind task, and three executive function tasks (inhibition, cognitive flexibility, and working memory). The results showed significant correlations between performance on the theory of mind task and performance on the executive function tasks, except for the cognitive flexibility task. We also found significant correlations between success on the theory of mind task and performance on all language tasks. Multiple regression analysis identified only syntax and general language abilities as possible predictors of theory of mind performance in our sample of preschool-age children. The results are discussed from the perspective of a major role of language abilities in theory of mind development. We also discuss possible reasons that could explain why the executive domains did not significantly predict theory of mind performance, and the implications of our results for the literature.

Keywords: child development, executive function, general language, syntax, theory of mind

Procedia PDF Downloads 64
825 Changes in Textural Properties of Zucchini Slices Under Effects of Partial Predrying and Deep-Fat-Frying

Authors: E. Karacabey, Ş. G. Özçelik, M. S. Turan, C. Baltacıoğlu, E. Küçüköner

Abstract:

Changes in the textural properties of any food material during processing are significant for consumers' evaluation and directly affect their decisions. Thus, any food material should be considered in terms of its textural properties after any process. In the present study, zucchini slices were partially predried to control and reduce the product's final oil content. A conventional oven was used for the partial dehydration of the zucchini slices. The subsequent frying was carried out in an industrial fryer with a temperature controller. This study examined the effect of this predrying process on the textural properties of fried zucchini slices. Texture profile analysis was performed. Hardness, elasticity, chewiness, and cohesiveness were the texture parameters studied for the fried zucchini slices. Temperature and weight loss were the monitored parameters of the predrying process, whereas in frying the oil temperature and process time were controlled. Optimization of the two successive processes was done by response surface methodology, one of the commonly used statistical process optimization tools. The models developed for each texture parameter were highly successful in predicting its value as a function of the studied process conditions. Process optimization was performed according to target values for each property, determined from directly fried zucchini slices that received the highest score in sensory evaluation. The results indicated that the textural properties of predried and then fried zucchini slices could be controlled by well-established equations. This is thought to be significant for the fried-food industry, where controlling sensorial properties is crucial to guiding consumer perception and texture-related properties are the leading ones. This project (113R015) has been supported by TUBITAK.
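
Response surface methodology of the kind used here typically fits a second-order polynomial to each texture response; the general LaTeX form is given only for orientation, with the actual factors and coefficients being those of the study:

y = \beta_0 + \sum_i \beta_i x_i + \sum_i \beta_{ii} x_i^2 + \sum_{i<j} \beta_{ij} x_i x_j + \varepsilon,

where y is a texture parameter (e.g., hardness), the x_i are the process variables (e.g., predrying and frying conditions), and the \beta terms are regression coefficients estimated from the experimental design.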

Keywords: optimization, response surface methodology, texture profile analysis, conventional oven, modelling

Procedia PDF Downloads 433
824 Influence of Magnetic Field on Microstructure and Properties of Copper-Silver Composites

Authors: Engang Wang

Abstract:

Cu-alloy composites are high-strength, high-conductivity Cu-based alloys with excellent mechanical and electrical properties, widely used in the electronic, electrical, and machinery industries. The solidification microstructure of these composites, such as the primary or secondary dendrite arm spacing, plays an important role in their tensile strength and conductivity and is affected by the fabrication method. In this paper, two directional solidification methods, the exothermic powder (EP) method and the liquid metal cooling (LMC) method, were used to fabricate Cu-alloy composites under different applied magnetic fields in order to investigate their influence on the solidification microstructure. The fabricated composites were then drawn into wires to investigate the influence of the fabrication method and the magnetic field on the drawn microstructure and properties of the fiber-reinforced Cu-alloy composites. Experiments on Cu-Ag alloy under directional solidification and horizontal magnetic fields with different processing parameters show that: 1) for the Cu-Ag alloy produced by the EP method, the dendrites develop directionally in the cooling copper mould, and the solidification microstructure is effectively refined by applying horizontal magnetic fields; 2) for the Cu-Ag alloy produced by the LMC method, the primary dendrite arm spacing decreases and the Ag content in the dendrites increases with increasing drawing velocity during solidification; 3) the dendrites are refined and the Ag content in the dendrites increases with increasing magnetic flux density, while the dendrite growth direction is also affected by the magnetic field. Results on the Cu-Ag in situ composites produced by the drawing deformation process show that the micro-hardness of the alloy increases as the dendrite arm spacing decreases. When the dendrite growth orientation is consistent with the axial direction of the samples, the conductivity of the composites increases as the secondary dendrite arm spacing increases. However, the conductivity is reduced under the applied magnetic fields owing to the disruption of the dendrite growth orientation.

Keywords: Cu-Ag composite, magnetic field, microstructure, solidification

Procedia PDF Downloads 214
823 Bias Minimization in Construction Project Dispute Resolution

Authors: Keyao Li, Sai On Cheung

Abstract:

Incorporation of alternative dispute resolution (ADR) mechanisms has been the main feature of the current trend in construction project dispute resolution (CPDR). ADR approaches have been identified as efficient mechanisms and suitable alternatives to litigation and arbitration. However, the use of ADR in this multi-tiered dispute resolution process often leads to repeated evaluations of the same dispute, and multi-tiered CPDR may thus become a breeding ground for cognitive biases. When complete knowledge is not available at the early tiers of construction dispute resolution, the disputing parties may form preconceptions of the dispute matter or of the counterpart. These preconceptions influence their information processing in subsequent tiers: the parties tend to search for and interpret further information in a self-defensive way that confirms their early positions. Their imbalanced information collection boosts their confidence in the assessments they hold, so their attitudes harden and become difficult to compromise. The occurrence of cognitive bias therefore impedes efficient dispute settlement. This study aims to explore ways to minimize bias in CPDR. Based on a comprehensive literature review, three types of bias-minimizing approaches were collected: strategy-based, attitude-based, and process-based. These approaches were further operationalized into bias-minimizing measures. To verify the usefulness and practicability of these measures, semi-structured interviews were conducted with ten CPDR third-party neutral professionals, all of whom have at least twenty years of experience in facilitating the settlement of construction disputes. The usefulness of the bias-minimizing measures was validated, and their implications suggested, by these experts. There are few studies on cognitive bias in construction management in general and in CPDR in particular; this study would be the first of its type to enhance the efficiency of construction dispute resolution by highlighting strategies to minimize the biases therein.

Keywords: bias, construction project dispute resolution, minimization, multi-tiered, semi-structured interview

Procedia PDF Downloads 186
822 Sustainable Crop Mechanization among Small Scale Rural Farmers in Nigeria: The Hurdles

Authors: Charles Iledun Oyewole

Abstract:

The daunting challenges that the 'man with the hoe' will face in the coming decades are complex and interwoven. With the global population already above 7 billion, it has been estimated that food (crop) production must more than double by 2050 to meet the world's food requirements. Nigeria's population is also expected to exceed 240 million people by 2050 at the current annual growth rate of 2.61 per cent. The country's farming population is estimated at over 65 per cent, yet the country still depends on food imports to complement domestic production. The small-scale farmer, who depends on simple hand tools such as hoes and cutlasses, remains the centre of agricultural production, accounting for 90 per cent of total agricultural output and 80 per cent of market flow. While the hoe may once have been a tool for sustainable development, that role has been smothered by population growth, which has brought too many mouths to feed (over 170 million) as well as many industries to supply with raw materials. It may then be argued that the hoe is not a tool for the coming challenges and that agricultural mechanization should be the focus. However, agriculture as an enterprise is a 'complete wheel' that does not work when broken, particularly with respect to mechanization. Generally, mechanization will prompt increased production where land is readily available; increased production will in turn require post-harvest handling mechanisms, crop processing, and subsequent storage. An important aspect of this is readily available and favourable markets for the produce, supported by sound agricultural policies. A break in this wheel will cause the mechanization process to crash back to subsistence production, and probably to a reversal to the hoe. The focus of any agricultural policy should be to chart a course for sustainable, environmentally friendly mechanization that can close Nigeria's food and raw material gaps. This is the focal point of this article.

Keywords: crop production, farmer, hoes, mechanization, policy framework, population growth, rural areas

Procedia PDF Downloads 220
821 Experimenting the Influence of Input Modality on Involvement Load Hypothesis

Authors: Mohammad Hassanzadeh

Abstract:

As far as incidental vocabulary learning is concerned, the basic contention of the Involvement Load Hypothesis (ILH) is that retention of unfamiliar words is generally conditional upon the degree of involvement in processing them. This study examined input modality and incidental vocabulary uptake in a task-induced setting in which three variously loaded task types (marginal glosses, fill-in task, and sentence writing) were alternately assigned to one group of students at Allameh Tabataba'i University (n = 21) during six classroom sessions. One round of exposure comprised audiovisual material (TV talk shows), while the second round consisted of textual materials with approximately the same subject matter (reading texts); in both conditions, the tasks were equivalent to one another. The study pursued two objectives: first, to establish a litmus test for the ILH and its proposed components of 'need', 'search', and 'evaluation'; second, to examine whether exposure to audiovisual input is superior to written input when tasks are incorporated. At the end of each treatment session, a vocabulary active-recall test was administered to measure incidental gains. A one-way analysis of variance revealed that the audiovisual intervention yielded higher gains than the written version even when differing tasks were included. Meanwhile, task three (sentence writing) turned out to be the most efficient in tapping learners' active recall of the target vocabulary items. In addition to shedding light on the superiority of audiovisual input over written input when circumstances are held relatively constant, this study, for the most part, supported the underlying tenets of the ILH.
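
The sketch below shows the shape of such a one-way ANOVA comparison; the recall scores are fabricated placeholders for illustration only and do not reproduce the study's data.

```python
# Illustrative one-way ANOVA on active-recall scores; values are placeholders.
from scipy import stats

# Hypothetical per-learner recall scores under each input condition.
audiovisual = [14, 12, 15, 11, 13, 16, 12, 14, 15, 13]
written     = [10,  9, 12,  8, 11, 10,  9, 12, 10, 11]

f_stat, p_value = stats.f_oneway(audiovisual, written)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
# With only two groups, a one-way ANOVA is equivalent to an independent-samples t-test;
# with the three task types added as a second factor, a two-way design would apply.
```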

Keywords: evaluation, incidental vocabulary learning, input mode, Involvement Load Hypothesis, need, search

Procedia PDF Downloads 279
820 The Multiple Sclerosis Condition and the Role of Varicella-Zoster Virus in Its Progression

Authors: Sina Mahdavi, Mahdi Asghari Ozma

Abstract:

Multiple sclerosis (MS) is the most common inflammatory autoimmune disease of the central nervous system (CNS), affecting the myelination process of the CNS. Complex interactions among various environmental or infectious factors may act as triggers of autoimmunity and disease progression. The association between viral infections, especially human varicella-zoster virus (VZV), and MS is one potential cause that is not well understood. This study aims to summarize the available data on the role of VZV infection in MS disease progression. The keywords "multiple sclerosis", "human varicella-zoster virus", and "central nervous system" were searched in the PubMed, Google Scholar, SID, and MagIran databases between 2016 and 2022, and 14 articles were selected, studied, and analyzed. Analysis of the amino acid sequences of HNRNPA1 and VZV proteins has shown 62% amino acid sequence similarity between VZV gE and the PrLD/M9 epitope region (the TNPO1-binding domain) of mutant HNRNPA1. Heterogeneous nuclear ribonucleoprotein A1 (hnRNP A1), the product of HNRNPA1, is involved in the processing and transport of mRNA and pre-mRNA. Mutant hnRNP A1 mimics the gE antigen of VZV, which leads to autoantibody production. Mutant hnRNP A1 translocates to the cytoplasm and, after aggregation, is presented by MHC class I and targeted by CD8+ cells. Antibodies and immune cells directed against the gE epitopes of VZV persist owing to the memory immune response, causing neurodegeneration and the development of MS in genetically predisposed individuals. The expression of VZV during the course of MS in genetically predisposed individuals carrying the HNRNPA1 mutation suggests a link between VZV and MS, and this virus may play a role in the development of MS by inducing an inflammatory state. Therefore, measures that modulate VZV expression may be effective in reducing inflammatory processes in the demyelinated areas of genetically predisposed MS patients.
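
To make the kind of comparison behind a figure such as the reported 62% concrete, the toy sketch below computes simple percent identity over two equal-length placeholder peptides; the sequences are invented, and a real analysis would align the actual gE and HNRNPA1 regions and score similarity with a substitution matrix, which this sketch does not attempt.

```python
# Toy percent-identity comparison; the peptide strings are placeholders, not real epitopes.
def percent_identity(seq_a: str, seq_b: str) -> float:
    """Share of matching residues over the shorter of the two sequences."""
    length = min(len(seq_a), len(seq_b))
    matches = sum(a == b for a, b in zip(seq_a[:length], seq_b[:length]))
    return 100.0 * matches / length

vzv_ge_fragment  = "TNPSRLIAKQGV"   # placeholder peptide
hnrnpa1_fragment = "TNPQRLIGKQSV"   # placeholder peptide

print(f"{percent_identity(vzv_ge_fragment, hnrnpa1_fragment):.0f}% identity")
```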

Keywords: multiple sclerosis, varicella-zoster virus, central nervous system, autoimmunity

Procedia PDF Downloads 76