Search results for: fully spatial signal processing
7192 Reading Comprehension in Profound Deaf Readers
Authors: S. Raghibdoust, E. Kamari
Abstract:
Research shows that reduced functional hearing has a detrimental influence on an individual's ability to establish proper phonological representations of words, since phonological representations are claimed to mediate the conceptual processing of written words. Word processing efficiency is therefore expected to decrease with a decrease in functional hearing. In other words, it is predicted that hearing individuals would be more capable of word processing than individuals with hearing loss, as their functional hearing works normally. Studies also demonstrate that the quality of functional hearing affects reading comprehension via its effect on word processing skills. In other words, better hearing facilitates the development of phonological knowledge and can promote enhanced strategies for the recognition of written words, which in turn positively affect the higher-order processes underlying reading comprehension. The aims of this study were to investigate and compare the effect of deafness on the participants’ abilities to process written words at the lexical and sentence levels, using two online tests and one offline reading comprehension test. The performance of a group of 8 deaf male students (ages 8-12) was compared with that of a control group of normal-hearing male students. All the participants had normal IQ and visual status and came from an average socioeconomic background. None were diagnosed with a particular learning or motor disability. The language spoken in the homes of all participants was Persian. Two tests of word processing were developed and presented to the participants using OpenSesame software in order to measure the speed and accuracy of their performance at the perceptual and conceptual levels. In the third, offline test of reading comprehension, which comprised semantically plausible and semantically implausible subject relative clauses, the participants had to select the correct answer out of two choices. The data derived from the statistical analysis using SPSS software indicated that hearing and deaf participants had similar word processing performance both in terms of speed and accuracy of their responses. The results also showed that there was no significant difference between the performance of the deaf and hearing participants in comprehending semantically plausible sentences (p > 0.05). However, a significant difference between the performances of the two groups was observed with respect to their comprehension of semantically implausible sentences (p < 0.05). In sum, the findings revealed that the seriously impoverished sentence reading ability characterizing the profoundly deaf subjects of the present research reflected their reliance on reading strategies based on insufficient or deviant structural knowledge, particularly in processing semantically implausible sentences, rather than a failure to efficiently process written words at the lexical level. This conclusion, of course, does not mean that deaf individuals never experience deficits at the word processing level, deficits that impede their understanding of written texts. However, as stated in previous research, it seems reasonable to assume that the more familiar deaf individuals become with written words, the better they can recognize them, despite having a profound phonological weakness. Keywords: deafness, reading comprehension, reading strategy, word processing, subject and object relative sentences
Procedia PDF Downloads 338
7191 Bamboo: A Trendy and New Alternative to Wood
Authors: R. T. Aggangan, R. J. Cabangon
Abstract:
Bamboo is getting worldwide attention over the last 20 to 30 years due to numerous uses and it is regarded as the closest material that can be used as substitute to wood. In the domestic market, high quality bamboo products are sold in high-end markets while lower quality products are generally sold to medium and low income consumers. The global market in 2006 stands at about 7 billion US dollars and was projected to increase to US$ 17 B from 2015 to 2020. The Philippines had been actively producing and processing bamboo products for the furniture, handicrafts and construction industry. It was however in 2010 that the Philippine bamboo industry was formalized by virtue of Executive Order 879 that stated that the Philippine bamboo industry development is made a priority program of the government and created the Philippine Bamboo Industry Development Council (PBIDC) to provide the overall policy and program directions of the program for all stakeholders. At present, the most extensive use of bamboo is for the manufacture of engineered bamboo for school desks for all public schools as mandated by EO 879. Also, engineered bamboo products are used for high-end construction and furniture as well as for handicrafts. Development of cheap adhesives, preservatives, and finishing chemicals from local species of plants, development of economical methods of drying and preservation, product development and processing of lesser-used species of bamboo, development of processing tools, equipment and machineries are the strategies that will be employed to reduce the price and mainstream engineered bamboo products in the local and foreign market. In addition, processing wastes from bamboo can be recycled into fuel products such as charcoal are already in use. The more exciting possibility, however, is the production of bamboo pellets that can be used as a substitute for wood pellets for heating, cooking and generating electricity.Keywords: bamboo charcoal and light distillates, engineered bamboo, furniture and handicraft industries, housing and construction, pellets
Procedia PDF Downloads 248
7190 Detecting Music Enjoyment Level Using Electroencephalogram Signals and Machine Learning Techniques
Authors: Raymond Feng, Shadi Ghiasi
Abstract:
An electroencephalogram (EEG) is a non-invasive technique that records electrical activity in the brain using scalp electrodes. Researchers have studied the use of EEG to detect emotions and moods by collecting signals from participants and analyzing how those signals correlate with their activities. In this study, researchers investigated the relationship between EEG signals and music enjoyment. Participants listened to music while data was collected. During the signal-processing phase, power spectral densities (PSDs) were computed from the signals, and dominant brainwave frequencies were extracted from the PSDs to form a comprehensive feature matrix. A machine learning approach was then taken to find correlations between the processed data and the music enjoyment level indicated by the participants. To improve on previous research, multiple machine learning models were employed, including K-Nearest Neighbors Classifier, Support Vector Classifier, and Decision Tree Classifier. Hyperparameters were used to fine-tune each model to further increase its performance. The experiments showed that a strong correlation exists, with the Decision Tree Classifier with hyperparameters yielding 85% accuracy. This study proves that EEG is a reliable means to detect music enjoyment and has future applications, including personalized music recommendation, mood adjustment, and mental health therapy.Keywords: EEG, electroencephalogram, machine learning, mood, music enjoyment, physiological signals
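As an illustration of the pipeline this abstract describes, the minimal Python sketch below computes Welch power spectral densities, extracts band-power features, and tunes a Decision Tree classifier with a grid search. It is not the authors' code: the sampling rate, channel count, frequency bands, labels, and hyperparameter grid are assumptions chosen for demonstration on synthetic stand-in data.

```python
import numpy as np
from scipy.signal import welch
from sklearn.model_selection import GridSearchCV, cross_val_score
from sklearn.tree import DecisionTreeClassifier

FS = 256                                                   # assumed sampling rate (Hz)
BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}  # assumed bands

def band_power_features(epochs):
    """epochs: (n_trials, n_channels, n_samples) -> (n_trials, n_channels * n_bands)."""
    feats = []
    for trial in epochs:
        f, psd = welch(trial, fs=FS, nperseg=FS)           # PSD per channel
        row = [psd[:, (f >= lo) & (f < hi)].mean(axis=1)   # mean power per band
               for lo, hi in BANDS.values()]
        feats.append(np.concatenate(row))
    return np.array(feats)

# Synthetic stand-in data: 60 trials, 8 channels, 4 s of EEG each
rng = np.random.default_rng(0)
epochs = rng.standard_normal((60, 8, 4 * FS))
labels = rng.integers(0, 2, 60)                            # 0 = low, 1 = high enjoyment

X = band_power_features(epochs)
grid = GridSearchCV(DecisionTreeClassifier(random_state=0),
                    {"max_depth": [3, 5, None], "min_samples_leaf": [1, 5, 10]},
                    cv=5)
grid.fit(X, labels)
print(grid.best_params_, cross_val_score(grid.best_estimator_, X, labels, cv=5).mean())
```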
Procedia PDF Downloads 62
7189 Augmented Reality to Support the Design of Innovative Agroforestry Systems
Authors: Laetitia Lemiere, Marie Gosme, Gerard Subsol, Marc Jaeger
Abstract:
Agroforestry is recognized as a way of developing sustainable and resilient agriculture that can fight against climate change. However, the number of species combinations, spatial configurations, and management options for trees and crops is vast. These choices must be adapted to the pedoclimatic and socio-economic contexts and to the objectives of the farmer, who therefore needs support in designing his system. Participative design workshops are a good way to integrate the knowledge of several experts in order to design such complex systems. The design of agroforestry systems should take into account both spatial aspects (e.g., spacing of trees within the lines and between lines, tree line orientation, tree-crop distance, species spatial patterns) and temporal aspects (e.g., crop rotations, tree thinning and pruning, tree planting in the case of successional agroforestry). Furthermore, the interactions between trees and crops evolve as the trees grow. However, agroforestry design workshops generally emphasize the spatial aspect only through the use of static tokens to represent the different species when designing the spatial configuration of the system. Augmented reality (AR) may overcome this limitation, allowing to visualize dynamic representations of trees and crops, and also their interactions, while at the same time retaining the possibility to physically interact with the system being designed (i.e., move trees, add or remove species, etc.). We propose an ergonomic digital solution capable of assisting a group of agroforestry experts to design an agroforestry system and to represent it. We investigated the use of web-based marker-based AR that does not require specific hardware and does not require specific installation so that all users could use their own smartphones right out of the pocket. We developed a prototype mobilizing the AR.js, ArToolKit.js, and Three.js open source libraries. In our implementation, we gradually build a virtual agroforestry system pattern scene from the users' interactions. A specific set of markers initialize the scene properties, and the various plant species are added and located during the workshop design session. The full virtual scene, including the trees positions with their neighborhood, are saved for further uses, such as virtual, augmented instantiation in the farmer fields. The number of tree species available in the application is gradually increasing; we mobilize 3D digital models for walnut, poplar, wild cherry, and other popular species used in agroforestry systems. The prototype allows shadow computations and the representation of trees at various growth stages, as well as different tree generations, and is thus able to visualize the dynamics of the system over time. Future work will focus on i) the design of complex patterns mobilizing several tree/shrub organizations, not restricted to lines; ii) the design of interfaces related to cultural practices, such as clearing or pruning; iii) the representation of tree-crop interactions. Beside tree shade (light competition), our objective is to represent also below-ground competitions (water, nitrogen) or other variables of interest for the design of agroforestry systems (e.g., predicted crop yield).Keywords: agroforestry system design, augmented reality, marker-based AR, participative design, web-based AR
Procedia PDF Downloads 175
7188 Nanorods Based Dielectrophoresis for Protein Concentration and Immunoassay
Authors: Zhen Cao, Yu Zhu, Junxue Fu
Abstract:
Immunoassay, i.e., antigen-antibody reaction, is crucial for disease diagnostics. To achieve the adequate signal of the antigen protein detection, a large amount of sample and long incubation time is needed. However, the amount of protein is usually small at the early stage, which makes it difficult to detect. Unlike cells and DNAs, no valid chemical method exists for protein amplification. Thus, an alternative way to improve the signal is through particle manipulation techniques to concentrate proteins, among which dielectrophoresis (DEP) is an effective one. DEP is a technique that concentrates particles to the designated region through a force created by the gradient in a non-uniform electric field. Since DEP force is proportional to the cube of particle size and square of electric field gradient, it is relatively easy to capture larger particles such as cells. For smaller ones like proteins, a super high gradient is then required. In this work, three-dimensional Ag/SiO2 nanorods arrays, fabricated by an easy physical vapor deposition technique called as oblique angle deposition, have been integrated with a DEP device and created the field gradient as high as of 2.6×10²⁴ V²/m³. The nanorods based DEP device is able to enrich bovine serum albumin (BSA) protein by 1800-fold and the rate has reached 180-fold/s when only applying 5 V electric potential. Based on the above nanorods integrated DEP platform, an immunoassay of mouse immunoglobulin G (IgG) proteins has been performed. Briefly, specific antibodies are immobilized onto nanorods, then IgG proteins are concentrated and captured, and finally, the signal from fluorescence-labelled antibodies are detected. The limit of detection (LoD) is measured as 275.3 fg/mL (~1.8 fM), which is a 20,000-fold enhancement compared with identical assays performed on blank glass plates. Further, prostate-specific antigen (PSA), which is a cancer biomarker for diagnosis of prostate cancer after radical prostatectomy, is also quantified with a LoD as low as 2.6 pg/mL. The time to signal saturation has been significantly reduced to one minute. In summary, together with an easy nanorod fabrication and integration method, this nanorods based DEP platform has demonstrated highly sensitive immunoassay performance and thus poses great potentials in applications for early point-of-care diagnostics.Keywords: dielectrophoresis, immunoassay, oblique angle deposition, protein concentration
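The scaling invoked above (a force growing with the cube of the particle radius and with the gradient of the squared field) corresponds to the standard time-averaged dielectrophoretic force on a spherical particle. The expression below is the textbook form, not a formula taken from the paper, with r the particle radius, ε_m the medium permittivity, and K(ω) the Clausius-Mossotti factor; it makes clear why a field-gradient term of order 10²⁴ V²/m³ is needed to act on protein-sized particles.

```latex
\mathbf{F}_{\mathrm{DEP}} = 2\pi r^{3}\varepsilon_{m}\,\mathrm{Re}\!\left[K(\omega)\right]\,\nabla\lvert\mathbf{E}\rvert^{2},
\qquad
K(\omega)=\frac{\varepsilon_{p}^{*}-\varepsilon_{m}^{*}}{\varepsilon_{p}^{*}+2\varepsilon_{m}^{*}}
```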
Procedia PDF Downloads 103
7187 (Re)Processing of ND-Fe-B Permanent Magnets Using Electrochemical and Physical Approaches
Authors: Kristina Zuzek, Xuan Xu, Awais Ikram, Richard Sheridan, Allan Walton, Saso Sturm
Abstract:
Recycling of end-of-life REE-based Nd-Fe-B magnets is an important strategy for reducing the environmental dangers associated with rare-earth mining and overcoming the well-documented supply risks related to the REEs. However, challenges in their reprocessing still remain. We report on the possibility of direct electrochemical recycling and reprocessing of Nd-Fe(B)-based magnets. In this investigation, we were able first to electrochemically leach the end-of-life NdFeB magnet and to electrodeposit Nd–Fe using a 1-ethyl-3-methyl imidazolium dicyanamide ([EMIM][DCA]) ionic liquid-based electrolyte. We observed that Nd(III) could not be reduced independently. However, it can be co-deposited on a substrate with the addition of Fe(II). Using the advanced TEM technique of electron energy-loss spectroscopy (EELS), it was shown that Nd(III) is reduced to Nd(0) during the electrodeposition process. This gave a new insight into determining the Nd oxidation state, as X-ray photoelectron spectroscopy (XPS) has certain limitations. This is because the binding energies of metallic Nd (Nd0) and neodymium oxide (Nd₂O₃) are very close, i.e., 980.5-981.5 eV and 981.7-982.3 eV, respectively, making it almost impossible to differentiate between the two states. These new insights into the electrodeposition process represent an important step closer to efficient recycling of rare earths in metallic form at mild temperatures, thus providing an alternative to high-temperature molten-salt electrolysis and a step closer to depositing Nd-Fe-based magnetic materials. Further, we propose a new concept of recycling the sintered Nd-Fe-B magnets by directly recovering the 2:14:1 matrix phase. Via an electrochemical etching method, we are able to recover pure individual 2:14:1 grains that can be re-used for new types of magnet production. In the frame of physical reprocessing, we have successfully synthesized new magnets from hydrogen (HDDR)-recycled stocks with the contemporary technique of pulsed electric current sintering (PECS). The optimal PECS conditions yielded fully dense Nd-Fe-B magnets with a coercivity Hc = 1060 kA/m, which was boosted to 1160 kA/m after the post-PECS thermal treatment. The Br and Hc were improved further, and increased applied pressures of 100-150 MPa resulted in Br = 1.01 T. We showed that with fine tuning of the PECS and post-annealing, it is possible to revitalize end-of-life Nd-Fe-B magnets. By applying advanced TEM, i.e., atomic-scale Z-contrast STEM combined with EDXS and EELS, the resulting magnetic properties were critically assessed against various types of structural and compositional discontinuities down to the atomic scale, which we believe control the microstructure evolution during the PECS processing route. Keywords: electrochemistry, Nd-Fe-B, pulsed electric current sintering, recycling, reprocessing
Procedia PDF Downloads 157
7186 Map UI Design of IoT Application Based on Passenger Evacuation Behaviors in Underground Station
Authors: Meng-Cong Zheng
Abstract:
When a public space faces an emergency, quickly establishing spatial cognition and finding emergency shelter in a closed underground space is an urgent task. This study takes Taipei Station as the research base and aims to apply an Internet of Things (IoT) application to underground evacuation mobility design. The first experiment identified passengers' evacuation behaviors and spatial cognition in underground spaces through wayfinding tasks and thinking aloud, then defined the design conditions of the User Interface (UI) and proposed the UI design. The second experiment evaluated the UI design based on passengers' evacuation behaviors through wayfinding tasks and thinking aloud, in the same manner as the first experiment. The first experiment found that the design condition the subjects were most concerned about was the "map": they hoped to learn their position relative to other landmarks from the map and to view the overall route. "Position" needs to be accurately labeled to determine the location in underground space. Each step of the escape instructions should be presented clearly in the "navigation bar." The "message bar" should inform users of the next or final target exit. In the second experiment with the UI design, we found that the "spatial map" distinguishing between walking and non-walking areas with shades of color is useful. The addition of 2.5D maps to the UI design increased the users' perception of space. Amending the color of the corner diagram in the "escape route" also reduces the confusion between the symbol and other diagrams. The larger volumes of toilets and elevators can help users judge their relative location under "Hardware facilities." The fire extinguisher icon should be highlighted. "Fire point tips" of the UI design, indicating fire with a graphical fireball, can convey precise information to the person escaping. However, "Compass and return to present location" is less used in underground space. Keywords: evacuation behaviors, IoT application, map UI design, underground station
Procedia PDF Downloads 207
7185 A Case Study on Quantitatively and Qualitatively Increasing Student Output by Using Available Word Processing Applications to Teach Reluctant Elementary School-Age Writers
Authors: Vivienne Cameron
Abstract:
Background: Between 2010 and 2017, teachers in a suburban public school district struggled to get students to consistently produce adequate writing samples as measured by the Pennsylvania state writing rubric for measuring focus, content, organization, style, and conventions. A common thread in all of the data was the need to develop stamina in the student writers. Method: All of the teachers used the traditional writing process model (prewrite, draft, revise, edit, final copy) during writing instruction. One teacher taught the writing process using word processing and incentivizing with publication instead of the traditional pencil/paper/grading method. Students did not have instruction in typing/keyboarding. The teacher submitted resulting student work to real-life contests, magazines, and publishers. Results: Students in the test group increased both the quantity and quality of their writing over a seven month period as measured by the Pennsylvania state writing rubric. Reluctant writers, as well as students with autism spectrum disorder, benefited from this approach. This outcome was repeated consistently over a five-year period. Interpretation: Removing the burden of pencil and paper allowed students to participate in the writing process more fully. Writing with pencil and paper is physically tiring. Students are discouraged when they submit a draft and are instructed to use the Add, Remove, Move, Substitute (ARMS) method to revise their papers. Each successive version becomes shorter. Allowing students to type their papers frees them to quickly and easily make changes. The result is longer writing pieces in shorter time frames, allowing the teacher to spend more time working on individual needs. With this additional time, the teacher can concentrate on teaching focus, content, organization, style, conventions, and audience. S/he also has a larger body of works from which to work on whole group instruction such as developing effective leads. The teacher submitted the resulting student work to contests, magazines, and publishers. Although time-consuming, the submission process was an invaluable lesson for teaching about audience and tone. All students in the test sample had work accepted for publication. Students became highly motivated to succeed when their work was accepted for publication. This motivation applied to special needs students, regular education students, and gifted students.Keywords: elementary-age students, reluctant writers, teaching strategies, writing process
Procedia PDF Downloads 175
7184 Towards Long-Range Pixels Connection for Context-Aware Semantic Segmentation
Authors: Muhammad Zubair Khan, Yugyung Lee
Abstract:
Deep learning has recently achieved enormous response in semantic image segmentation. The previously developed U-Net inspired architectures operate with continuous stride and pooling operations, leading to spatial data loss. Also, the methods lack establishing long-term pixels connection to preserve context knowledge and reduce spatial loss in prediction. This article developed encoder-decoder architecture with bi-directional LSTM embedded in long skip-connections and densely connected convolution blocks. The network non-linearly combines the feature maps across encoder-decoder paths for finding dependency and correlation between image pixels. Additionally, the densely connected convolutional blocks are kept in the final encoding layer to reuse features and prevent redundant data sharing. The method applied batch-normalization for reducing internal covariate shift in data distributions. The empirical evidence shows a promising response to our method compared with other semantic segmentation techniques.Keywords: deep learning, semantic segmentation, image analysis, pixels connection, convolution neural network
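To make the architectural idea concrete, the PyTorch sketch below shows a small encoder-decoder in which the long skip connection passes the encoder feature map through a bidirectional LSTM (image rows treated as sequences) before concatenation in the decoder. It is a minimal sketch under assumed channel counts and a single scale; the network described in the abstract is deeper and also uses densely connected convolution blocks, which are omitted here.

```python
import torch
import torch.nn as nn

class BiLSTMSkip(nn.Module):
    """Pass an encoder feature map through a bidirectional LSTM along the width
    axis so the skip connection carries long-range pixel context."""
    def __init__(self, channels):
        super().__init__()
        self.lstm = nn.LSTM(channels, channels // 2, batch_first=True,
                            bidirectional=True)

    def forward(self, x):                                   # x: (B, C, H, W)
        b, c, h, w = x.shape
        seq = x.permute(0, 2, 3, 1).reshape(b * h, w, c)    # rows as sequences
        out, _ = self.lstm(seq)                             # (B*H, W, C)
        return out.reshape(b, h, w, c).permute(0, 3, 1, 2)  # back to (B, C, H, W)

class TinyUNetBiLSTM(nn.Module):
    def __init__(self, in_ch=3, num_classes=2):
        super().__init__()
        self.enc1 = nn.Sequential(nn.Conv2d(in_ch, 32, 3, padding=1),
                                  nn.BatchNorm2d(32), nn.ReLU())
        self.enc2 = nn.Sequential(nn.Conv2d(32, 64, 3, padding=1),
                                  nn.BatchNorm2d(64), nn.ReLU())
        self.pool = nn.MaxPool2d(2)
        self.skip1 = BiLSTMSkip(32)                         # BiLSTM inside the long skip
        self.up = nn.Upsample(scale_factor=2, mode="bilinear", align_corners=False)
        self.dec1 = nn.Sequential(nn.Conv2d(64 + 32, 32, 3, padding=1),
                                  nn.BatchNorm2d(32), nn.ReLU())
        self.head = nn.Conv2d(32, num_classes, 1)

    def forward(self, x):
        e1 = self.enc1(x)                                   # (B, 32, H, W)
        e2 = self.enc2(self.pool(e1))                       # (B, 64, H/2, W/2)
        d1 = self.up(e2)                                    # (B, 64, H, W)
        d1 = self.dec1(torch.cat([d1, self.skip1(e1)], dim=1))
        return self.head(d1)

logits = TinyUNetBiLSTM()(torch.randn(1, 3, 64, 64))        # (1, 2, 64, 64)
print(logits.shape)
```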
Procedia PDF Downloads 102
7183 A Geographic Information System Mapping Method for Creating Improved Satellite Solar Radiation Dataset Over Qatar
Authors: Sachin Jain, Daniel Perez-Astudillo, Dunia A. Bachour, Antonio P. Sanfilippo
Abstract:
The future of solar energy in Qatar is evolving steadily. Hence, high-quality spatial solar radiation data is of the uttermost requirement for any planning and commissioning of solar technology. Generally, two types of solar radiation data are available: satellite data and ground observations. Satellite solar radiation data is developed by the physical and statistical model. Ground data is collected by solar radiation measurement stations. The ground data is of high quality. However, they are limited to distributed point locations with the high cost of installation and maintenance for the ground stations. On the other hand, satellite solar radiation data is continuous and available throughout geographical locations, but they are relatively less accurate than ground data. To utilize the advantage of both data, a product has been developed here which provides spatial continuity and higher accuracy than any of the data alone. The popular satellite databases: National Solar radiation Data Base, NSRDB (PSM V3 model, spatial resolution: 4 km) is chosen here for merging with ground-measured solar radiation measurement in Qatar. The spatial distribution of ground solar radiation measurement stations is comprehensive in Qatar, with a network of 13 ground stations. The monthly average of the daily total Global Horizontal Irradiation (GHI) component from ground and satellite data is used for error analysis. The normalized root means square error (NRMSE) values of 3.31%, 6.53%, and 6.63% for October, November, and December 2019 were observed respectively when comparing in-situ and NSRDB data. The method is based on the Empirical Bayesian Kriging Regression Prediction model available in ArcGIS, ESRI. The workflow of the algorithm is based on the combination of regression and kriging methods. A regression model (OLS, ordinary least square) is fitted between the ground and NSBRD data points. A semi-variogram is fitted into the experimental semi-variogram obtained from the residuals. The kriging residuals obtained after fitting the semi-variogram model were added to NSRBD data predicted values obtained from the regression model to obtain the final predicted values. The NRMSE values obtained after merging are respectively 1.84%, 1.28%, and 1.81% for October, November, and December 2019. One more explanatory variable, that is the ground elevation, has been incorporated in the regression and kriging methods to reduce the error and to provide higher spatial resolution (30 m). The final GHI maps have been created after merging, and NRMSE values of 1.24%, 1.28%, and 1.28% have been observed for October, November, and December 2019, respectively. The proposed merging method has proven as a highly accurate method. An additional method is also proposed here to generate calibrated maps by using regression and kriging model and further to use the calibrated model to generate solar radiation maps from the explanatory variable only when not enough historical ground data is available for long-term analysis. The NRMSE values obtained after the comparison of the calibrated maps with ground data are 5.60% and 5.31% for November and December 2019 month respectively.Keywords: global horizontal irradiation, GIS, empirical bayesian kriging regression prediction, NSRDB
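A minimal Python sketch of the regression-plus-residual-kriging merge described above is given below, using scikit-learn for the OLS trend and the pykrige package as a stand-in for the ArcGIS Empirical Bayesian Kriging Regression Prediction tool. The station coordinates, GHI values, elevations, and variogram model are placeholder assumptions.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from pykrige.ok import OrdinaryKriging

# Stand-in data: 13 ground stations with satellite (NSRDB) GHI, elevation, ground GHI
rng = np.random.default_rng(1)
x, y = rng.uniform(0, 100, 13), rng.uniform(0, 100, 13)     # projected coords (km)
elev = rng.uniform(0, 100, 13)                              # station elevation (m)
sat = rng.uniform(5.0, 7.0, 13)                             # monthly-mean satellite GHI
ground = sat + rng.normal(0.0, 0.2, 13)                     # ground measurements

nrmse = np.sqrt(np.mean((sat - ground) ** 2)) / ground.mean() * 100
print(f"NRMSE of satellite product alone: {nrmse:.2f}%")

# 1) OLS regression of ground GHI on the satellite GHI and elevation
X = np.column_stack([sat, elev])
ols = LinearRegression().fit(X, ground)
resid = ground - ols.predict(X)

# 2) Ordinary kriging of the regression residuals
ok = OrdinaryKriging(x, y, resid, variogram_model="spherical")

# 3) Merged map = regression trend + kriged residual, evaluated on a coarse grid
gx, gy = np.meshgrid(np.linspace(0, 100, 50), np.linspace(0, 100, 50))
sat_grid = np.full(gx.size, sat.mean())                     # stand-in satellite layer
elev_grid = np.full(gx.size, elev.mean())                   # stand-in elevation layer
trend = ols.predict(np.column_stack([sat_grid, elev_grid]))
kriged_resid, _ = ok.execute("points", gx.ravel(), gy.ravel())
merged_map = (trend + kriged_resid).reshape(gx.shape)
print("Merged GHI map:", merged_map.shape)
```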
Procedia PDF Downloads 89
7182 Geospatial Analysis of Hydrological Response to Forest Fires in Small Mediterranean Catchments
Authors: Bojana Horvat, Barbara Karleusa, Goran Volf, Nevenka Ozanic, Ivica Kisic
Abstract:
Forest fire is a major threat in many regions in Croatia, especially in coastal areas. Although they are often caused by natural processes, the most common cause is the human factor, intentional or unintentional. Forest fires drastically transform landscapes and influence natural processes. The main goal of the presented research is to analyse and quantify the impact of the forest fire on hydrological processes and propose the model that best describes changes in hydrological patterns in the analysed catchments. Keeping in mind the spatial component of the processes, geospatial analysis is performed to gain better insight into the spatial variability of the hydrological response to disastrous events. In that respect, two catchments that experienced severe forest fire were delineated, and various hydrological and meteorological data were collected both attribute and spatial. The major drawback is certainly the lack of hydrological data, common in small torrential karstic streams; hence modelling results should be validated with the data collected in the catchment that has similar characteristics and established hydrological monitoring. The event chosen for the modelling is the forest fire that occurred in July 2019 and burned nearly 10% of the analysed area. Surface (land use/land cover) conditions before and after the event were derived from the two Sentinel-2 images. The mapping of the burnt area is based on a comparison of the Normalized Burn Index (NBR) computed from both images. To estimate and compare hydrological behaviour before and after the event, curve number (CN) values are assigned to the land use/land cover classes derived from the satellite images. Hydrological modelling resulted in surface runoff generation and hence prediction of hydrological responses in the catchments to a forest fire event. The research was supported by the Croatian Science Foundation through the project 'Influence of Open Fires on Water and Soil Quality' (IP-2018-01-1645).Keywords: Croatia, forest fire, geospatial analysis, hydrological response
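The two core calculations mentioned, the Normalized Burn Index from the Sentinel-2 near-infrared and shortwave-infrared bands and a curve-number runoff estimate, can be sketched in Python as below. The reflectance arrays, the dNBR threshold, the curve numbers, and the 50 mm storm are placeholder assumptions, and the study's full hydrological model is not reproduced here.

```python
import numpy as np

def nbr(nir, swir):
    """Standard NBR formula: (NIR - SWIR) / (NIR + SWIR)."""
    return (nir - swir) / (nir + swir + 1e-9)

def scs_cn_runoff(p_mm, cn):
    """SCS curve-number direct runoff (mm) for a rainfall depth p_mm."""
    s = 25400.0 / cn - 254.0                 # potential maximum retention (mm)
    ia = 0.2 * s                             # initial abstraction
    return np.where(p_mm > ia, (p_mm - ia) ** 2 / (p_mm - ia + s), 0.0)

# Placeholder Sentinel-2 reflectance tiles (band 8 = NIR, band 12 = SWIR)
rng = np.random.default_rng(2)
nir_pre = rng.uniform(0.3, 0.5, (100, 100))
swir_pre = rng.uniform(0.1, 0.2, (100, 100))
nir_post, swir_post = nir_pre * 0.6, swir_pre * 1.4          # crude "burned" response

dnbr = nbr(nir_pre, swir_pre) - nbr(nir_post, swir_post)     # delta-NBR burn severity
burned = dnbr > 0.27                                         # assumed severity threshold

# Raise CN on burned pixels and compare runoff for an assumed 50 mm storm
cn = np.where(burned, 85, 70)                                # assumed CN values
runoff_after = scs_cn_runoff(50.0, cn)
runoff_before = scs_cn_runoff(50.0, np.full_like(cn, 70))
print("Mean runoff increase (mm):", (runoff_after - runoff_before).mean())
```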
Procedia PDF Downloads 136
7181 Evaluation of Condyle Alterations after Orthognathic Surgery with a Digital Image Processing Technique
Authors: Livia Eisler, Cristiane C. B. Alves, Cristina L. F. Ortolani, Kurt Faltin Jr.
Abstract:
Purpose: This paper proposes a technically simple diagnosis method among orthodontists and maxillofacial surgeons in order to evaluate discrete bone alterations. The methodology consists of a protocol to optimize the diagnosis and minimize the possibility for orthodontic and ortho-surgical retreatment. Materials and Methods: A protocol of image processing and analysis, through ImageJ software and its plugins, was applied to 20 pairs of lateral cephalometric images obtained from cone beam computerized tomographies, before and 1 year after undergoing orthognathic surgery. The optical density of the images was analyzed in the condylar region to determine possible bone alteration after surgical correction. Results: Image density was shown to be altered in all image pairs, especially regarding the condyle contours. According to measures, condyle had a gender-related density reduction for p=0.05 and condylar contours had their alterations registered in mm. Conclusion: A simple, viable and cost-effective technique can be applied to achieve the more detailed image-based diagnosis, not depending on the human eye and therefore, offering more reliable, quantitative results.Keywords: bone resorption, computer-assisted image processing, orthodontics, orthognathic surgery
Procedia PDF Downloads 160
7180 Urban Enclaves Caused by Migration: Little Aleppo in Ankara, Turkey
Authors: Sezen Aslan, N. Aydan Sat
Abstract:
The society of 21st century constantly faces with complex otherness that emerges in various forms and justifications. Otherness caused by class, race or ethnicity inevitably reflects to urban areas, and in this way, cities are diversified into totally self-centered and closed-off urban enclaves. One of the most important dynamics that creates otherness in contemporary society is migration. Immigration on an international scale is one of the most important events that have reshaped the world, and the number of immigrants in the world is increasing day by day. Forced migration and refugee statements constitute the major part of countries' immigration policies and practices. Domestic problems such as racism, violence, war, censorship and silencing, attitudes contrary to human rights, different cultural or religious identities cause populations to migrate. Immigration is one of the most important reasons for the formation of urban enclaves within cities. Turkey, which was used to face a higher rate of outward migration, has begun to host immigrant groups from foreign countries. 1980s is the breaking point about the issue as a result of internal disturbances in the Middle East. After Iranian, Iraqi and Afghan immigrants, Turkey faces the largest external migration in its story with Syrian population. Turkey has been hosting approximate three million Syrian people after Syrian Civil War which started in 2011. 92% of Syrian refugees are currently living in different urban areas in Turkey instead of camps. Syrian refugees are experiencing a spontaneous spatiality due to the lack of specific settlement and housing policies of the country. This spontaneity is one of the most important factors in the creation of urban enclaves. From this point of view, the aim of this study is to clarify processes that lead the creation of urban enclaves and to explain socio-spatial effects of these urban enclaves to the other parts of the cities. Ankara, which is one of the most registered Syrian hosting Province in Turkey, is selected as a case study area. About 55% of the total Syrian population lives in the Altındağ district in Ankara. They settled specifically in two neighborhoods in Altındağ district, named as Önder and Ulubey. These neighborhoods are old slum areas, and they were evacuated due to urban renewal on the same dates with the migration of the Syrians. Before demolition of these old slums, Syrians are settled into them as tenants. In the first part of the study, a brief explanation of the concept of urban enclave, its occurrence parameters and possible socio-spatial threats, examples from previous immigrant urban enclaves caused internal migration will be given. Emergence of slums, planning history and social processes in the case study area will be described in the second part of the study. The third part will be focused on the Syrian refugees and their socio-spatial relationship in the case study area and in-depth interviews with refugees and spatial analysis will be realized. Suggestions for the future of the case study area and recommendations to prevent immigrant groups from social and spatial exclusion will be discussed in the conclusion part of the study.Keywords: migration, immigration, Syrian refugees, urban enclaves, Ankara
Procedia PDF Downloads 208
7179 An Approach to Apply Kernel Density Estimation Tool for Crash Prone Location Identification
Authors: Kazi Md. Shifun Newaz, S. Miaji, Shahnewaz Hazanat-E-Rabbi
Abstract:
In this study, the kernel density estimation tool has been used to identify most crash prone locations in a national highway of Bangladesh. Like other developing countries, in Bangladesh road traffic crashes (RTC) have now become a great social alarm and the situation is deteriorating day by day. Today’s black spot identification process is not based on modern technical tools and most of the cases provide wrong output. In this situation, characteristic analysis and black spot identification by spatial analysis would be an effective and low cost approach in ensuring road safety. The methodology of this study incorporates a framework on the basis of spatial-temporal study to identify most RTC occurrence locations. In this study, a very important and economic corridor like Dhaka to Sylhet highway has been chosen to apply the method. This research proposes that KDE method for identification of Hazardous Road Location (HRL) could be used for all other National highways in Bangladesh and also for other developing countries. Some recommendations have been suggested for policy maker to reduce RTC in Dhaka-Sylhet especially in black spots.Keywords: hazardous road location (HRL), crash, GIS, kernel density
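A minimal Python sketch of the kernel density step is shown below, using scikit-learn's KernelDensity on synthetic projected crash coordinates and flagging the top-density cells as candidate hazardous road locations. The 500 m bandwidth, the evaluation grid, and the 95th-percentile threshold are assumptions for illustration; the study itself performs this analysis in a GIS environment.

```python
import numpy as np
from sklearn.neighbors import KernelDensity

# Stand-in projected crash coordinates (metres) along a highway corridor
rng = np.random.default_rng(3)
crashes = np.vstack([
    rng.normal([5_000, 1_000], 300, (60, 2)),          # a dense crash cluster
    rng.uniform([0, 0], [60_000, 2_000], (140, 2)),    # scattered crashes
])

# Kernel density estimate; the 500 m bandwidth is an assumed search radius
kde = KernelDensity(kernel="gaussian", bandwidth=500.0).fit(crashes)

# Evaluate density along the corridor and flag the highest-density cells
xs = np.linspace(0, 60_000, 600)
grid = np.column_stack([xs, np.full_like(xs, 1_000.0)])
density = np.exp(kde.score_samples(grid))
threshold = np.quantile(density, 0.95)
hotspots = xs[density >= threshold]
print(f"Candidate hazardous road locations near chainage: {hotspots[:5]} m")
```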
Procedia PDF Downloads 314
7178 An Artificially Intelligent Teaching-Agent to Enhance Learning Interactions in Virtual Settings
Authors: Abdulwakeel B. Raji
Abstract:
This paper introduces a concept of an intelligent virtual learning environment that involves communication between learners and an artificially intelligent teaching agent in an attempt to replicate classroom learning interactions. The benefit of this technology over current e-learning practices is that it creates a virtual classroom where real-time adaptive learning interactions are made possible. This is a move away from the static learning practices currently being adopted by e-learning systems. Over the years, artificial intelligence has been applied to various fields, including but not limited to medicine, military applications, psychology, and marketing. The purpose of e-learning applications is to ensure users are able to learn outside of the classroom, but a major limitation has been the inability to fully replicate classroom interactions between teacher and students. This study used comparative surveys to gain information and understanding of the current learning practices in Nigerian universities and how these practices compare to the use of the developed e-learning system. The study was conducted by attending several lectures and noting the interactions between lecturers and tutors; subsequently, software was developed that deploys an artificially intelligent teaching-agent alongside an e-learning system to enhance the user learning experience and attempts to create learning interactions similar to those found in classroom and lecture hall settings. Dialogflow has been used to implement the teaching-agent, which has been developed using JSON and serves as a virtual teacher. Course content has been created using HTML, CSS, PHP, and JavaScript as a web-based application. This technology can run on handheld devices and Google-based home technologies to give learners access to the teaching agent at any time. It also implements definite clause grammars and natural language processing to match user inputs and requests against defined rules so as to replicate learning interactions. The developed technology covers familiar classroom scenarios such as answering users’ questions, asking ‘do you understand’ at regular intervals and answering subsequent requests, and taking advanced user queries to give feedback at other times. The software uses deep learning techniques to learn user interactions and patterns in order to subsequently enhance the user learning experience. System testing was carried out with undergraduate students in the UK and Nigeria on the course ‘Introduction to Database Development’. Test results and feedback from users show that this study and the developed software represent a significant improvement on existing e-learning systems. Further experiments are to be run using the software with different students and more course content. Keywords: virtual learning, natural language processing, definite clause grammars, deep learning, artificial intelligence
Procedia PDF Downloads 135
7177 Study of Effects of 3D Semi-Spherical Basin-Shape-Ratio on the Frequency Content and Spectral Amplitudes of the Basin-Generated Surface Waves
Authors: Kamal, J. P. Narayan
Abstract:
In the present work, the effects of basin-shape-ratio on the frequency content and spectral amplitudes of the basin-generated surface waves and the associated spatial variation of ground motion amplification and differential ground motion in a 3D semi-spherical basin have been studied. A recently developed 3D fourth-order spatially accurate time-domain finite-difference (FD) algorithm based on the parsimonious staggered-grid approximation of the 3D viscoelastic wave equations was used to estimate seismic responses. The simulated results demonstrated an increase of both the frequency content and the spectral amplitudes of the basin-generated surface waves, and of the duration of ground motion in the basin, with an increase of the shape-ratio of the semi-spherical basin. An increase of the average spectral amplification (ASA), differential ground motion (DGM) and the average aggravation factor (AAF) towards the centre of the semi-spherical basin was obtained. Keywords: 3D viscoelastic simulation, basin-generated surface waves, basin-shape-ratio effects, average spectral amplification, aggravation factors and differential ground motion
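One common way to quantify the average spectral amplification reported above is to take the ratio of the Fourier amplitude spectrum of a trace inside the basin to that of a reference trace and average it over a frequency band. The numpy sketch below illustrates only that post-processing step on synthetic traces; the frequency band and trace shapes are assumptions, and the finite-difference simulation itself is not reproduced.

```python
import numpy as np

def spectral_amplification(basin_tr, ref_tr, dt, band=(0.5, 5.0)):
    """Average ratio |FFT(basin)| / |FFT(reference)| over a frequency band (Hz)."""
    freqs = np.fft.rfftfreq(len(basin_tr), dt)
    ratio = np.abs(np.fft.rfft(basin_tr)) / (np.abs(np.fft.rfft(ref_tr)) + 1e-12)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return ratio[mask].mean()

# Synthetic stand-in traces: the "basin" trace is amplified and decays more slowly
dt, n = 0.005, 4096
t = np.arange(n) * dt
ref = np.exp(-t) * np.sin(2 * np.pi * 2.0 * t)                   # reference motion
basin = 2.5 * np.exp(-0.5 * t) * np.sin(2 * np.pi * 2.0 * t)     # trapped surface waves

print("Average spectral amplification:", spectral_amplification(basin, ref, dt))
```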
Procedia PDF Downloads 508
7176 The Structural Pattern: An Event-Related Potential Study on Tang Poetry
Authors: ShuHui Yang, ChingChing Lu
Abstract:
Measuring event-related potentials (ERPs) has been fundamental to our understanding of how people process language. One specific ERP component, the P600, has been hypothesized to be associated with syntactic reanalysis processes. We, however, propose that the P600 is not restricted to reanalysis processes but is an index of structural pattern processing. To investigate structural pattern processing, we utilized the effects of stimulus degradation in structural priming. To put it another way, there was no P600 effect if the structure of the prime was the same as the structure of the target. Otherwise, there would be a P600 effect if the structures of the prime and the target differed. In the experiment, twenty-two participants were presented with four sentences of Tang poetry. The first two sentences, serving as primes, were constructed with SVO+VP. The last two sentences, being the targets, were divided into three types. Type one of the targets was SVO+VP. Type two of the targets was SVO+VPVP. Type three of the targets was VP+VP. The result showed that both of the targets, SVO+VPVP and VP+VP, elicited a positive-going brainwave, a P600 effect, in the 600~900 ms time window. Furthermore, the P600 component was larger for the target ‘VP+VP’ than for the target ‘SVO+VPVP’. That meant the more dissimilar the structure was, the larger the P600 effect obtained. These results indicate that the P600 is an index of structural pattern processing and that the intensity of the P600 effect varies with the degree of structural heterogeneity. Keywords: ERPs, P600, structural pattern, structural priming, Tang poetry
Procedia PDF Downloads 140
7175 Beyond Geometry: The Importance of Surface Properties in Space Syntax Research
Authors: Christoph Opperer
Abstract:
Space syntax is a theory and method for analyzing the spatial layout of buildings and urban environments to understand how they can influence patterns of human movement, social interaction, and behavior. While direct visibility is a key factor in space syntax research, important visual information such as light, color, texture, etc., are typically not considered, even though psychological studies have shown a strong correlation to the human perceptual experience within physical space – with light and color, for example, playing a crucial role in shaping the perception of spaciousness. Furthermore, these surface properties are often the visual features that are most salient and responsible for drawing attention to certain elements within the environment. This paper explores the potential of integrating these factors into general space syntax methods and visibility-based analysis of space, particularly for architectural spatial layouts. To this end, we use a combination of geometric (isovist) and topological (visibility graph) approaches together with image-based methods, allowing a comprehensive exploration of the relationship between spatial geometry, visual aesthetics, and human experience. Custom-coded ray-tracing techniques are employed to generate spherical panorama images, encoding three-dimensional spatial data in the form of two-dimensional images. These images are then processed through computer vision algorithms to generate saliency-maps, which serve as a visual representation of areas most likely to attract human attention based on their visual properties. The maps are subsequently used to weight the vertices of isovists and the visibility graph, placing greater emphasis on areas with high saliency. Compared to traditional methods, our weighted visibility analysis introduces an additional layer of information density by assigning different weights or importance levels to various aspects within the field of view. This extends general space syntax measures to provide a more nuanced understanding of visibility patterns that better reflect the dynamics of human attention and perception. Furthermore, by drawing parallels to traditional isovist and VGA analysis, our weighted approach emphasizes a crucial distinction, which has been pointed out by Ervin and Steinitz: the difference between what is possible to see and what is likely to be seen. Therefore, this paper emphasizes the importance of including surface properties in visibility-based analysis to gain deeper insights into how people interact with their surroundings and to establish a stronger connection with human attention and perception.Keywords: space syntax, visibility analysis, isovist, visibility graph, visual features, human perception, saliency detection, raytracing, spherical images
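The weighting idea can be illustrated with a toy example: each cell of a visibility graph carries a saliency weight sampled from the panorama-based saliency map, and instead of counting how many cells a vertex can see, one sums the saliency of the visible cells. The visibility matrix, weights, and normalisation in the Python sketch below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(4)
n_cells = 50

# Toy symmetric inter-visibility matrix (True = cell j is visible from cell i)
vis = rng.random((n_cells, n_cells)) < 0.3
vis = np.triu(vis, 1)
vis = vis | vis.T

# Saliency weight per cell, e.g. mean saliency sampled from the spherical panorama
saliency = rng.random(n_cells)

classic_visibility = vis.sum(axis=1)                  # plain isovist/VGA-style count
weighted_visibility = vis.astype(float) @ saliency    # saliency-weighted count
weighted_norm = weighted_visibility / (classic_visibility + 1e-9)

top = np.argsort(weighted_norm)[-5:]
print("Cells most likely to attract attention:", top)
```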
Procedia PDF Downloads 74
7174 Effect of Traffic Composition on Delay and Saturation Flow at Signal Controlled Intersections
Authors: Arpita Saha, Apoorv Jain, Satish Chandra, Indrajit Ghosh
Abstract:
Level of service at a signal controlled intersection is directly measured from the delay. Similarly, saturation flow rate is a fundamental parameter to measure the intersection capacity. The present study calculates vehicle arrival rate, departure rate, and queue length for every five seconds interval in each cycle. Based on the queue lengths, the total delay of the cycle has been calculated using Simpson’s 1/3rd rule. Saturation flow has been estimated in terms of veh/hr of green/lane for every five seconds interval of the green period until at least three vehicles are left to cross the stop line. Vehicle composition shows an immense effect on total delay and saturation flow rate. The increase in two-wheeler proportion increases the saturation flow rate and reduces the total delay per vehicle significantly. Additionally, an increase in the heavy vehicle proportion reduces the saturation flow rate and increases the total delay for each vehicle.Keywords: delay, saturation flow, signalised intersection, vehicle composition
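A minimal Python sketch of the delay and saturation-flow computations described above is given below: queue lengths sampled every five seconds are integrated over the cycle with Simpson's 1/3 rule to obtain total delay, and departures counted in five-second slices of green are scaled to veh/hr of green per lane. All counts and the treatment of the first and last slices are assumed values for illustration.

```python
import numpy as np
from scipy.integrate import simpson

# Queue length (vehicles) sampled every 5 s over one 60 s cycle
t = np.arange(0, 65, 5)                       # 13 samples, 5 s apart
queue = np.array([0, 3, 6, 9, 12, 14, 15, 12, 9, 6, 3, 1, 0])

# Total delay = area under the queue-length curve (vehicle-seconds), Simpson's 1/3 rule
total_delay = simpson(queue, x=t)
arrivals = 15                                 # vehicles served in the cycle (assumed)
print(f"Total delay: {total_delay:.0f} veh-s, average {total_delay / arrivals:.1f} s/veh")

# Saturation flow: departures counted in each 5 s slice of green, per lane
departures_per_5s = np.array([2, 4, 4, 4, 4, 3])          # assumed counts
sat_flow = departures_per_5s[1:-1].mean() * (3600 / 5)    # drop start-up/clearance slices
print(f"Saturation flow: {sat_flow:.0f} veh/hr of green/lane")
```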
Procedia PDF Downloads 464
7173 Biobased Toughening Filler for Polylactic Acid from Ultrafine Fully Vulcanized Powder Natural Rubber Grafted with Polymethylmethacrylate
Authors: Panyawutthi Rimdusit, Krittapas Charoensuk, Sarawut Rimdusit
Abstract:
A biobased toughening filler for polylactic acid (PLA) based on natural rubber is developed in this work. Deproteinized natural rubber (DPNR) was modified by graft polymerization with methyl methacrylate monomer (MMA) and further crosslinked by e-beam irradiation and a spray-drying process to achieve ultrafine fully vulcanized powdered natural rubber grafted with polymethylmethacrylate (UFPNR-g-PMMA), in order to solve the challenge of incompatibility between natural rubber and PLA. Intriguingly, UFPNR-g-PMMA revealed outstanding and unique properties with minimal particle aggregation. The average particle size of the rubber powder obtained from UFPNR-g-PMMA at a PMMA grafting content of 20 phr was reduced to 3.3±1.2 µm, compared to 5.3±2.3 µm for neat UFPNR, which also showed partial particle aggregation. It was also found that the impact strength of the filled PLA was enhanced to 33.4±5.6 kJ/m2 at 20 wt% UFPNR-g-PMMA in PLA, compared to 9.6±3 kJ/m2 for neat PLA. The thermal degradation temperature of the PLA composites was enhanced with increasing UFPNR-g-PMMA content without affecting the glass transition temperature of the composites. The fracture surface of PLA/UFPNR-g-PMMA suggested that internal cavitation and crazing are the main mechanisms of rubber toughening in PLA, with substantial interfacial interaction between the filler and the matrix. Keywords: natural rubber, ultrafine fully vulcanized powder rubber, polylactic acid, polymer composites
Procedia PDF Downloads 11
7172 Roughness Discrimination Using Bioinspired Tactile Sensors
Authors: Zhengkun Yi
Abstract:
Surface texture discrimination using artificial tactile sensors has attracted increasing attentions in the past decade as it can endow technical and robot systems with a key missing ability. However, as a major component of texture, roughness has rarely been explored. This paper presents an approach for tactile surface roughness discrimination, which includes two parts: (1) design and fabrication of a bioinspired artificial fingertip, and (2) tactile signal processing for tactile surface roughness discrimination. The bioinspired fingertip is comprised of two polydimethylsiloxane (PDMS) layers, a polymethyl methacrylate (PMMA) bar, and two perpendicular polyvinylidene difluoride (PVDF) film sensors. This artificial fingertip mimics human fingertips in three aspects: (1) Elastic properties of epidermis and dermis in human skin are replicated by the two PDMS layers with different stiffness, (2) The PMMA bar serves the role analogous to that of a bone, and (3) PVDF film sensors emulate Meissner’s corpuscles in terms of both location and response to the vibratory stimuli. Various extracted features and classification algorithms including support vector machines (SVM) and k-nearest neighbors (kNN) are examined for tactile surface roughness discrimination. Eight standard rough surfaces with roughness values (Ra) of 50 μm, 25 μm, 12.5 μm, 6.3 μm 3.2 μm, 1.6 μm, 0.8 μm, and 0.4 μm are explored. The highest classification accuracy of (82.6 ± 10.8) % can be achieved using solely one PVDF film sensor with kNN (k = 9) classifier and the standard deviation feature.Keywords: bioinspired fingertip, classifier, feature extraction, roughness discrimination
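As a hedged illustration of the classification step, the Python sketch below trains a k-nearest-neighbours classifier (k = 9) on the standard-deviation feature of synthetic stand-in PVDF signals, one class per standard roughness value. The signal model and trial counts are assumptions and do not reflect the real sensor data.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(5)
ra_classes = [50, 25, 12.5, 6.3, 3.2, 1.6, 0.8, 0.4]       # surface roughness Ra (um)

# Synthetic stand-in PVDF recordings: rougher surfaces -> larger vibration amplitude
signals, labels = [], []
for cls, ra in enumerate(ra_classes):
    for _ in range(20):                                     # 20 sliding trials per surface
        signals.append(rng.normal(0.0, 0.05 + 0.01 * ra, 2000))
        labels.append(cls)

# Single feature per trial: standard deviation of the PVDF signal
X = np.array([[s.std()] for s in signals])
y = np.array(labels)

knn = KNeighborsClassifier(n_neighbors=9)
print("Mean CV accuracy:", cross_val_score(knn, X, y, cv=5).mean())
```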
Procedia PDF Downloads 312
7171 A Novel PWM/PFM Controller for PSR Fly-Back Converter Using a New Peak Sensing Technique
Authors: Sanguk Nam, Van Ha Nguyen, Hanjung Song
Abstract:
For low-power applications such as adapters for portable devices and USB chargers, the primary side regulation (PSR) fly-back converter is widely used in lieu of the conventional fly-back converter using an opto-coupler because of its simpler structure and lower cost. In the literature, there have been studies focusing on the design of the PSR circuit; however, the conventional sensing method in the PSR circuit using an RC delay has lower accuracy compared to the conventional fly-back converter using an opto-coupler. In this paper, we propose a novel PWM/PFM controller using a new sensing technique for the PSR fly-back converter, which can regulate the output voltage accurately. The conventional PSR circuit senses the output voltage information from the auxiliary winding to regulate the duty cycle of the clock that controls the output voltage. In the sensing signal waveform, there are two transient points, at the times when the voltage equals Vout+VD and Vout, respectively. In order to sense the output voltage, the PSR circuit must detect the time at which the current of the output diode equals zero. In the conventional PSR fly-back converter, the sensing signal at this time has a non-sharp negative slope that might cause difficulty in detecting the output voltage information, since a delay of the sensing signal or the switching clock may exist, which leads to unstable operation of the PSR fly-back converter. In this paper, instead of detecting the output voltage at a non-sharp negative slope, a sharp positive slope is used to sense the proper output voltage information. The proposed PSR circuit consists of a saw-tooth generator, a summing circuit, a sample-and-hold circuit and a peak detector. In addition, there is a start-up circuit which protects the chip from high surge current when the converter is turned on. Additionally, to reduce the standby power loss, a second mode which operates at a low frequency is designed alongside the main mode at high frequency. In general, the operation of the proposed PSR circuit can be summarized as follows: at the time the output information is sensed from the auxiliary winding, a saw-tooth signal is generated by the saw-tooth generator. Then, both of these signals are summed using a summing circuit. After this process, the slope of the peak of the sensing signal at the time the diode current is zero becomes positive and sharp, which makes the peak easy to detect. The output of the summing circuit is then fed into a peak detector and the sample-and-hold circuit; hence, the output voltage can be properly sensed. In this way, we can sense more accurate output voltage information and extend the margin even if the circuit is delayed or noise is present, using only a simple circuit structure as compared with conventional circuits, while the performance can be sufficiently enhanced. Circuit verification was carried out using a 0.35 μm 700 V Magnachip process. The simulation result of the sensing signal shows a maximum error of 5 mV under various load and line conditions, which means the operation of the converter is stable. Compared to conventional circuits, we achieved a very small error while using only analog circuits. In summary, a PWM/PFM controller using a simple and effective sensing method for the PSR fly-back converter has been presented. The circuit structure is simple as compared with conventional designs. The results gained from simulation confirmed the idea of the design. Keywords: primary side regulation, PSR, sensing technique, peak detector, PWM/PFM control, fly-back converter
Procedia PDF Downloads 338
7170 Memory Retrieval and Implicit Prosody during Reading: Anaphora Resolution by L1 and L2 Speakers of English
Authors: Duong Thuy Nguyen, Giulia Bencini
Abstract:
The present study examined structural and prosodic factors on the computation of antecedent-reflexive relationships and sentence comprehension in native English (L1) and Vietnamese-English bilinguals (L2). Participants read sentences presented on the computer screen in one of three presentation formats aimed at manipulating prosodic parsing: word-by-word (RSVP), phrase-segment (self-paced), or whole-sentence (self-paced), then completed a grammaticality rating and a comprehension task (following Pratt & Fernandez, 2016). The design crossed three factors: syntactic structure (simple; complex), grammaticality (target-match; target-mismatch) and presentation format. An example item is provided in (1): (1) The actress that (Mary/John) interviewed at the awards ceremony (about two years ago/organized outside the theater) described (herself/himself) as an extreme workaholic). Results showed that overall, both L1 and L2 speakers made use of a good-enough processing strategy at the expense of more detailed syntactic analyses. L1 and L2 speakers’ comprehension and grammaticality judgements were negatively affected by the most prosodically disrupting condition (word-by-word). However, the two groups demonstrated differences in their performance in the other two reading conditions. For L1 speakers, the whole-sentence and the phrase-segment formats were both facilitative in the grammaticality rating and comprehension tasks; for L2, compared with the whole-sentence condition, the phrase-segment paradigm did not significantly improve accuracy or comprehension. These findings are consistent with the findings of Pratt & Fernandez (2016), who found a similar pattern of results in the processing of subject-verb agreement relations using the same experimental paradigm and prosodic manipulation with English L1 and L2 English-Spanish speakers. The results provide further support for a Good-Enough cue model of sentence processing that integrates cue-based retrieval and implicit prosodic parsing (Pratt & Fernandez, 2016) and highlights similarities and differences between L1 and L2 sentence processing and comprehension.Keywords: anaphora resolution, bilingualism, implicit prosody, sentence processing
Procedia PDF Downloads 152
7169 Production of Cellulose Nanowhiskers from Red Algae Waste and Its Application in Polymer Composite Development
Authors: Z. Kassab, A. Aboulkas, A. Barakat, M. El Achaby
Abstract:
The red algae are available enormously around the world and their exploitation for the production of agar product has become as an important industry in recent years. However, this industrial processing of red algae generated a large quantity of solid fibrous wastes, which constitute a source of a serious environmental problem. For this reason, the exploitation of this solid waste would help to i) produce new value-added materials and ii) to improve waste disposal from environment. In fact, this solid waste can be fully utilized for the production of cellulose microfibers and nanocrystals because it consists of large amount of cellulose component. For this purpose, the red algae waste was chemically treated via alkali, bleaching and acid hydrolysis treatments with controlled conditions, in order to obtain pure cellulose microfibers and cellulose nanocrystals. The raw product and the as-extracted cellulosic materials were successively characterized using serval analysis techniques, including elemental analysis, X-ray diffraction, thermogravimetric analysis, infrared spectroscopy and transmission electron microscopy. As an application, the as extracted cellulose nanocrystals were used as nanofillers for the production of polymer-based composite films with improved thermal and tensile properties. In these composite materials, the adhesion properties and the large number of functional groups that are presented in the CNC’s surface and the macromolecular chains of the polymer matrix are exploited to improve the interfacial interactions between the both phases, improving the final properties. Consequently, the high performances of these composite materials can be expected to have potential in packaging material applications.Keywords: cellulose nanowhiskers, food packaging, polymer composites, red algae waste
7168 Neural Correlates of Arabic Digits Naming
Authors: Fernando Ojedo, Alejandro Alvarez, Pedro Macizo
Abstract:
In the present study, we explored the electrophysiological correlates of Arabic digit naming to assess the semantic processing of numbers. Participants named Arabic digits grouped by category or intermixed with exemplars of other semantic categories while the N400 event-related potential was examined. Around 350-450 ms after the presentation of the Arabic digits, brain waves were more positive in anterior regions and more negative in posterior regions when stimuli were grouped by category relative to the mixed condition. Contrary to what was found in other studies, the electrophysiological results suggested that the production of numerals involves semantic mediation.Keywords: Arabic digit naming, event-related potentials, semantic processing, number production
7167 Efficient Fuzzy Classified Cryptographic Model for Intelligent Encryption Technique towards E-Banking XML Transactions
Authors: Maher Aburrous, Adel Khelifi, Manar Abu Talib
Abstract:
Transactions performed by financial institutions on a daily basis require XML encryption on a large scale. Encrypting large volumes of messages in full results in both performance and resource issues. In this paper, a novel approach is presented for securing financial XML transactions using classification data mining (DM) algorithms. Our strategy defines the complete process of classifying XML transactions using a set of classification algorithms; the classified XML documents are then processed at a later stage using element-wise encryption. Classification algorithms were used to identify the XML transaction rules and factors in order to classify the message content and fetch the important elements within it. We implemented four classification algorithms to assign an importance-level value within each XML document. The classified content is processed using element-wise encryption for the selected parts with "High", "Medium", or "Low" importance-level values. Element-wise encryption is performed using the AES symmetric encryption algorithm and a proposed modified AES algorithm that overcomes the problem of computational overhead: the substitute-bytes and shift-rows steps remain as in the original AES, while the mix-columns operation is replaced by a 128-bit permutation operation followed by the add-round-key operation. An implementation was conducted using a data set obtained from an e-banking service to demonstrate the system's functionality and efficiency. The results of our implementation showed a clear improvement in the processing time for encrypting XML documents.Keywords: XML transaction, encryption, Advanced Encryption Standard (AES), XML classification, e-banking security, fuzzy classification, cryptography, intelligent encryption
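As a concrete illustration of the element-wise step, the following Python sketch classifies XML elements by importance and encrypts only the "High"-importance ones. It is a minimal mock-up, not the authors' implementation: the element names, the classify() stub, and the use of standard AES-GCM (rather than the proposed modified AES round structure) are assumptions made for demonstration.

```python
# Illustrative sketch: selective, element-wise AES encryption of XML content
# whose importance level comes from a classifier. Element names, importance
# labels, and the classify() stub are hypothetical placeholders.
import os
import xml.etree.ElementTree as ET
from cryptography.hazmat.primitives.ciphers.aead import AESGCM  # pip install cryptography

def classify(element):
    """Stub for the data-mining classifier: returns 'High', 'Medium' or 'Low'."""
    return "High" if element.tag in {"account", "amount"} else "Low"

def encrypt_element(element, aesgcm):
    """Replace an element's text with its AES-GCM ciphertext (hex-encoded)."""
    nonce = os.urandom(12)                                   # unique nonce per element
    ciphertext = aesgcm.encrypt(nonce, (element.text or "").encode(), None)
    element.text = (nonce + ciphertext).hex()
    element.set("encrypted", "true")

key = AESGCM.generate_key(bit_length=128)
aesgcm = AESGCM(key)

tree = ET.fromstring(
    "<transaction><account>123-456</account><amount>950.00</amount>"
    "<memo>monthly transfer</memo></transaction>"
)
for elem in tree.iter():
    if classify(elem) == "High":                             # encrypt only important elements
        encrypt_element(elem, aesgcm)

print(ET.tostring(tree, encoding="unicode"))
```

Encrypting only the elements the classifier marks as important is what reduces the processing time relative to encrypting each document in full.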
7166 Spatial Pattern of Environmental Noise Levels and Auditory Ailments in Abeokuta Metropolis, Southwestern Nigeria
Authors: Olusegun Oguntoke, Aramide Y. Tijani, Olayide R. Adetunji
Abstract:
Environmental noise has become a major threat to the quality of human life, and it is generally more severe in cities. This study assessed the level of environmental noise, mapped its spatial pattern at different times of the day, and examined its association with the morbidity of auditory ailments in Abeokuta metropolis. The entire metropolis was divided into 80 cells (areas) of 1000 m by 1000 m, out of which 33 were randomly selected for noise level assessment. A portable noise meter (AR824) was used to measure noise levels, and a Global Positioning System receiver (Garmin GPS-72H) was employed to record the coordinates of the sample sites for mapping. A risk map of the noise levels was produced using Kriging interpolation based on the spatial spread of the measured noise values across the study area. Data on cases of hearing impairment were collected from four major hospitals in the city. Data collected from field measurements and medical records were subjected to descriptive (frequency and percentage) and inferential (mean, ANOVA, and correlation) statistics using SPSS (version 20.0). ArcMap 10.1 was employed for spatial analysis and mapping. Results showed that mean noise levels across the study area ranged from 42.4 ± 4.14 to 88.2 ± 15.1 dBA in the morning, 45.0 ± 6.72 to 86.4 ± 12.5 dBA in the afternoon, and 51.0 ± 6.55 to 84.4 ± 5.19 dBA in the evening. The interpolated maps identified Kuto, Okelowo, Isale-Igbein, and Sapon as high noise-risk areas. These are the central business district and nucleus of Abeokuta metropolis, where commercial activities, high traffic volume, and clustered buildings exist. The monitored noise levels varied significantly among the sampled areas in the morning, afternoon, and evening (p < 0.05). A significant correlation was found between diagnosed cases of auditory ailments and noise levels measured in the morning (r = 0.39, p < 0.05). Common auditory ailments found across the metropolis included impaired hearing (25.8%), tinnitus (16.4%), and otitis (15.0%). The most affected age group was 11-30 years, and males had more cases of hearing impairment (51.2%) than females. The study revealed that environmental noise levels exceeded the recommended standards in the morning, afternoon, and evening in 60.6%, 61%, and 72.7% of the sampled areas, respectively. In summary, environmental noise in the study area is high and contributes to the morbidity of auditory ailments. Noise-sensitive activities should not be located in areas identified as noise pollution hotspots, and environmental noise monitoring should be included in the mandate of the regulatory agencies in Nigeria.Keywords: noise pollution, associative analysis, auditory impairment, urban, human exposure
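The interpolation step can be illustrated with a short script. The sketch below performs ordinary Kriging on point noise measurements using the pykrige package; the coordinates and dBA values are made-up placeholders, and the study itself carried out this step in ArcMap 10.1.

```python
# Minimal sketch of ordinary Kriging interpolation of point noise measurements
# onto a grid. Coordinates and noise values are illustrative placeholders.
import numpy as np
from pykrige.ok import OrdinaryKriging  # pip install pykrige

# Hypothetical sample-site coordinates (km) and morning noise levels (dBA)
x = np.array([0.5, 1.2, 2.8, 3.6, 4.1, 5.0])
y = np.array([0.8, 2.5, 1.1, 3.9, 0.6, 2.2])
noise_dba = np.array([62.0, 71.5, 88.2, 54.3, 79.8, 66.1])

ok = OrdinaryKriging(x, y, noise_dba, variogram_model="spherical")
gridx = np.linspace(0.0, 5.0, 50)
gridy = np.linspace(0.0, 4.0, 40)
z_interp, variance = ok.execute("grid", gridx, gridy)  # interpolated noise surface

print(z_interp.shape)  # (40, 50): one estimated dBA value per grid cell
```

The resulting surface is what a noise-risk map visualizes: estimated levels at unsampled locations, with the Kriging variance indicating where estimates are least certain.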
7165 Elevated Temperature Shot Peening for M50 Steel
Authors: Xinxin Ma, Guangze Tang, Shuxin Yang, Jinguang He, Fan Zhang, Peiling Sun, Ming Liu, Minyu Sun, Liqin Wang
Abstract:
As a traditional surface hardening technique, shot peening is widely used in industry. Shot peening forms a residual compressive stress in the surface, which is beneficial for improving the fatigue life of metallic materials. At the same time, very fine grains and a high density of defects are generated in the surface layer, which also enhances the surface hardness. However, most of these processes are carried out at room temperature, and for a high-strength steel such as M50, the thickness of the strengthened layer is limited. In order to obtain a thicker strengthened surface layer, elevated temperature shot peening was carried out in this work using Φ1 mm cast iron balls at a speed of 80 m/s. Considering that the tempering temperature of M50 steel is about 550 °C, the processing temperature was in the range of 300 to 500 °C. The effects of the shot peening temperature and time on the distribution of residual stress and on surface hardness were investigated. The working temperature of M50 steel can be as high as 315 °C. Because the defects formed by shot peening become unstable as the working temperature rises, it is worth understanding what happens during the shot peening process and what happens when the strengthened samples are held at a given temperature. In our work, the shot peening time was selected from 2 to 10 min, and after the strengthening process, the samples were annealed at various temperatures from 200 to 500 °C for up to 60 h. The results show that the maximum residual compressive stress is near 900 MPa. Compared with room temperature shot peening, the strengthened layer of the 500 °C shot-peened sample is about twice as deep. The surface hardness increased with processing temperature, and the saturation peening time decreased. After annealing, the residual compressive stress decreased; however, for the 500 °C peened sample, even after annealing at 500 °C for 20 h, the residual compressive stress remained above 600 MPa. Moreover, SEM observations show that the grain size of the surface layers remains very small.Keywords: shot peening, M50 steel, residual compressive stress, elevated temperature
7164 Comparing Image Processing and AI Techniques for Disease Detection in Plants
Authors: Luiz Daniel Garay Trindade, Antonio De Freitas Valle Neto, Fabio Paulo Basso, Elder De Macedo Rodrigues, Maicon Bernardino, Daniel Welfer, Daniel Muller
Abstract:
Agriculture plays an important role in society since it is one of the main sources of food in the world. To support crop production and yield, precision agriculture makes use of technologies aimed at improving the productivity and quality of agricultural commodities. One of the problems hampering the quality of agricultural production is disease affecting crops. Failure to detect diseases quickly can result in minor or major damage to production, causing financial losses to farmers. In order to map the contributions devoted to the early detection of plant diseases and to compare the accuracy of the selected studies, a systematic literature review was performed, covering digital image processing and neural network techniques. We found 35 tool-support alternatives for detecting diseases in 19 plant species. Our comparison of these studies yielded an overall average accuracy of 87.45%, with two studies coming very close to 100%.Keywords: pattern recognition, image processing, deep learning, precision agriculture, smart farming, agricultural automation
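As an illustration of the deep-learning techniques covered by the review, the sketch below defines a small convolutional classifier for leaf images. It is a generic example rather than a reproduction of any reviewed study; the image size, class count, and dataset folder are assumptions.

```python
# Illustrative sketch only: a small convolutional network of the kind applied
# to leaf-image disease classification. Folder layout, image size, and class
# count are hypothetical.
import tensorflow as tf
from tensorflow.keras import layers

NUM_CLASSES = 4          # e.g. healthy + three hypothetical diseases
IMG_SIZE = (128, 128)

model = tf.keras.Sequential([
    layers.Rescaling(1.0 / 255, input_shape=(*IMG_SIZE, 3)),
    layers.Conv2D(32, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Conv2D(64, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(128, activation="relu"),
    layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Hypothetical dataset folder with one sub-folder per class of leaf images
train_ds = tf.keras.utils.image_dataset_from_directory(
    "leaf_images/train", image_size=IMG_SIZE, batch_size=32)
model.fit(train_ds, epochs=10)
```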
7163 Greenland Monitoring Using Vegetation Index: A Case Study of Lal Suhanra National Park
Authors: Rabia Munsaf Khan, Eshrat Fatima
Abstract:
The analysis of the spatial extent and temporal change of vegetation cover using remotely sensed data is of critical importance to the agricultural sciences. Pakistan, being an agricultural country, depends on this resource, which accounts for 70% of its GDP. The case study concerns Lal Suhanra National Park, which is the biggest forest reserve not only of Pakistan but also of Asia. The study is performed using Landsat images from different dates, and the Landsat results are cross-checked using Sentinel-2 imagery, which has both higher spectral and higher spatial resolution. Vegetation can easily be detected using the Normalized Difference Vegetation Index (NDVI), computed from the near-infrared and red bands as (NIR - Red)/(NIR + Red); it is a common and important vegetation index, widely applied in research on global environmental and climatic change. The images are then classified to observe the change that occurred over the study period (2000-2016). Vegetation cover maps for 2000 and 2016 are used to generate a vegetation change detection map for the respective years and to identify the changing pattern of vegetation cover. The NDVI values also aided in detecting the percentage decrease in vegetation cover. The study reveals that the vegetation cover of the area decreased significantly between 2000 and 2016.Keywords: Landsat, normalized difference vegetation index (NDVI), sentinel 2, Greenland monitoring
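As an illustration of the index itself, the sketch below computes NDVI from the red and near-infrared bands of a raster scene. It is not the authors' workflow; the file name, the Landsat 8 band numbering (band 4 = red, band 5 = NIR), and the 0.3 vegetation threshold are assumptions.

```python
# Minimal sketch of computing NDVI from a multi-band raster scene.
# File name, band numbers, and threshold are illustrative assumptions.
import numpy as np
import rasterio  # pip install rasterio

with rasterio.open("landsat8_scene.tif") as src:
    red = src.read(4).astype("float32")   # band 4: red (Landsat 8 OLI)
    nir = src.read(5).astype("float32")   # band 5: near-infrared (Landsat 8 OLI)

# NDVI = (NIR - Red) / (NIR + Red), ranging from -1 to 1;
# higher values indicate denser, healthier vegetation.
denom = nir + red
ndvi = np.divide(nir - red, denom, out=np.zeros_like(denom), where=denom != 0)

vegetated_fraction = np.mean(ndvi > 0.3)   # 0.3 is an illustrative threshold
print(f"Fraction of pixels classified as vegetation: {vegetated_fraction:.2%}")
```

Comparing the vegetated fraction (or per-pixel NDVI) between two dates is the basic operation behind the change detection maps described above.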