Search results for: computer processing of large databases
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 12335

12125 Agricultural Cooperative Model: A Panacea for Economic Development of Small Scale Business Farmers in Ilesha, Osun State, Nigeria

Authors: Folasade Adegbaju, Olusola Arowolo, Olufisayo Onawumi

Abstract:

The Owolowo ile-ege garri processing industry, a small-scale cassava processing cooperative located in Ilesha, Osun State, was purposively selected as a case study because it is a cooperative business. The industry was established in 1991 by eight (8) men, most of them retirees. A researcher-made questionnaire was used to collect information from thirty (30) respondents: the manager, four official staff, and 25 randomly selected processors in the industry. The study found that within twelve years of utilizing its self-raised initial capital of N240,000 (two hundred and forty thousand naira), this cassava-based industry had impacted on and attracted the involvement of many more people: within the period of the study (2007-2011), the processors had nearly quadrupled in number (from 8 to 30), and the facilities (equipment) in use had grown from one machine and a frying pot to many. This translated into the capacity to produce large quantities of fried garri, fufu, and starch for marketing to the people of Ilesha and neighbouring cities such as Ibadan and Lagos, which is indicative of economic growth. The industry also became a source of employment for community members; at the time of the study, four staff were employed to run and coordinate the industry. It was observed that despite the odds facing small-scale industry and the problem of rural-to-urban migration, this agro-based industry still operated successfully in the community, and many such industries could be replicated by agricultural cooperative groups nationwide to further boost the productivity and economy of the area and of the nation at large. However, government and individuals still have major roles to play in ensuring growth and development in this respect. Local agricultural cooperative groups should form regional cooperative consortia with more networking for the farmers, in order to create more jobs for young people and to increase agricultural productivity in the country, resulting in a better and more sustainable economy.

Keywords: agricultural cooperative, cassava processing industry, model, small scale enterprise

Procedia PDF Downloads 260
12124 A U-Net Based Architecture for Fast and Accurate Diagram Extraction

Authors: Revoti Prasad Bora, Saurabh Yadav, Nikita Katyal

Abstract:

In the context of educational data mining, extracting information from images that contain both text and diagrams is a use case of high importance. Document analysis therefore requires extracting the diagrams from such images and processing the text and diagrams separately. To the authors' best knowledge, none of the many approaches for extracting tables, figures, etc., satisfies the need for real-time processing with high accuracy, as required by multiple applications. In the education domain, diagrams can have varied characteristics, e.g., line-based content such as geometric diagrams, chemical bonds, and mathematical formulas. Two broad categories of approaches address similar problems: traditional computer vision based approaches and deep learning approaches. The traditional computer vision based approaches mainly leverage connected components and distance-transform-based processing and hence perform well only in very limited scenarios. The existing deep learning approaches leverage either YOLO or Faster-RCNN architectures and suffer from a performance-accuracy tradeoff. This paper proposes a U-Net based architecture that formulates diagram extraction as a segmentation problem. The proposed method provides similar accuracy with a much faster extraction time compared to the mentioned state-of-the-art approaches. Further, the segmentation mask in this approach allows the extraction of diagrams of irregular shapes.
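The paper's exact network is not specified in this abstract; the sketch below is a minimal U-Net-style encoder-decoder in PyTorch (an assumption, as is every name in it) showing how diagram extraction can be cast as per-pixel binary segmentation, with the predicted mask thresholded afterwards to recover diagram regions of arbitrary shape.

```python
# Minimal U-Net-style binary segmenter (illustrative; the paper's exact
# architecture, depth, and training setup are not given in the abstract).
import torch
import torch.nn as nn

def block(c_in, c_out):
    # two 3x3 convolutions with ReLU: the basic U-Net building unit
    return nn.Sequential(
        nn.Conv2d(c_in, c_out, 3, padding=1), nn.ReLU(inplace=True),
        nn.Conv2d(c_out, c_out, 3, padding=1), nn.ReLU(inplace=True))

class TinyUNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.enc1, self.enc2 = block(1, 16), block(16, 32)
        self.pool = nn.MaxPool2d(2)
        self.up = nn.ConvTranspose2d(32, 16, 2, stride=2)
        self.dec = block(32, 16)          # 32 = 16 skip channels + 16 upsampled
        self.head = nn.Conv2d(16, 1, 1)   # per-pixel diagram/background logit

    def forward(self, x):
        e1 = self.enc1(x)                 # full-resolution features
        e2 = self.enc2(self.pool(e1))     # half-resolution features
        d = self.up(e2)                   # back to full resolution
        d = self.dec(torch.cat([d, e1], dim=1))  # skip connection
        return self.head(d)               # threshold these logits to get the mask

page = torch.randn(1, 1, 256, 256)        # stand-in for a grayscale page image
mask_logits = TinyUNet()(page)             # shape (1, 1, 256, 256)
```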

Keywords: computer vision, deep-learning, educational data mining, faster-RCNN, figure extraction, image segmentation, real-time document analysis, text extraction, U-Net, YOLO

Procedia PDF Downloads 101
12123 Algorithm for Information Retrieval Optimization

Authors: Kehinde K. Agbele, Kehinde Daniel Aruleba, Eniafe F. Ayetiran

Abstract:

When using Information Retrieval Systems (IRS), users often present search queries made of ad-hoc keywords. It is then up to the IRS to obtain a precise representation of the user's information need and the context of the information. This paper investigates optimizing an IRS for individual information needs in order of relevance. The study addresses the development of algorithms that optimize the ranking of documents retrieved from an IRS, and discusses and describes a Document Ranking Optimization (DROPT) algorithm for information retrieval (IR) in an Internet-based or designated-database environment. As the volume of information available online and in designated databases grows continuously, ranking algorithms can play a major role in the context of search results. In this paper, a DROPT technique for documents retrieved from a corpus is developed with respect to document index keywords and the query vectors. This is based on calculating the weight of each document index keyword relative to the query vector.
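The abstract stops short of the weighting formula, so the following is only one common instantiation, not the authors' DROPT algorithm: tf-idf keyword weights combined with cosine similarity between document and query vectors, over a toy corpus invented for the example.

```python
# Hedged sketch: tf-idf keyword weighting plus cosine-similarity ranking,
# a common baseline for the kind of weight calculation the abstract names.
import math

docs = [["cassava", "processing", "industry"],
        ["information", "retrieval", "ranking", "information"],
        ["ranking", "documents", "query"]]
query = ["information", "ranking"]

def tfidf(term, doc, corpus):
    # term frequency in this document, damped by how many documents use the term
    tf = doc.count(term) / len(doc)
    df = sum(term in d for d in corpus)
    return tf * math.log(len(corpus) / df) if df else 0.0

def score(query, doc, corpus):
    # cosine similarity between query and document weight vectors
    vocab = set(query) | set(doc)
    q = [tfidf(t, query, corpus) for t in vocab]
    d = [tfidf(t, doc, corpus) for t in vocab]
    num = sum(a * b for a, b in zip(q, d))
    den = math.sqrt(sum(a * a for a in q)) * math.sqrt(sum(b * b for b in d))
    return num / den if den else 0.0

ranked = sorted(docs, key=lambda d: score(query, d, docs), reverse=True)
print(ranked[0])   # the document most relevant to the query comes first
```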

Keywords: information retrieval, document relevance, performance measures, personalization

Procedia PDF Downloads 211
12122 An Experimental Investigation of Air Entrainment Due to Water Jets in Crossflows

Authors: Mina Esmi Jahromi, Mehdi Khiadani

Abstract:

Vertical water jets discharging into free-surface turbulent crossflows entrain a large amount of air into the body of water and form a region of two-phase air-water flow with a considerable interfacial area. This research presents an experimental study of the two-phase bubbly flow using an image processing technique. The air ingression and the trajectories of bubble swarms under different experimental conditions are evaluated. The rate of air entrainment and the bubble characteristics, such as penetration depth and dispersion pattern, were found to be affected by the most influential parameters of the water jet and crossflow, including the jet-to-crossflow velocity ratio, the jet falling height, and the crossflow depth. This research improves understanding of the underwater flow structure due to water jet impingement in crossflow and advances practical applications of water jets, such as artificial aeration, circulation, and mixing where crossflow is present.

Keywords: air entrainment, image processing, jet in cross flow, two-phase flow

Procedia PDF Downloads 337
12121 Dynamic Environmental Impact Study during the Construction of the French Nuclear Power Plants

Authors: A. Er-Raki, D. Hartmann, J. P. Belaud, S. Negny

Abstract:

This paper has a double purpose: firstly, a literature review of life cycle analysis (LCA), and secondly, a comparison between conventional (static) LCA and multi-level dynamic LCA on the following items: (i) the evolution of inventories with time and (ii) the temporal evolution of the databases. The first part of the paper summarizes the state of the art of the static LCA approach. The limits of static LCA have been identified, especially the non-consideration of spatial and temporal evolution in the inventory, in the characterization factors (CFs), and in the databases. A description of the different levels of integration of the notion of temporality in life cycle analysis studies follows. In the second part, the dynamic inventory is evaluated, firstly for a single nuclear plant and secondly for the entire French nuclear power fleet, taking into account the construction durations of all the plants. In addition, the databases have been adapted by integrating the temporal variability of the French energy mix, with several iterations used to converge towards the real environmental impact of the mix. The databases were further adapted to take into account the temporal evolution of market data for raw materials. The energy mix of the period studied was identified based on an extrapolation of the production reference values of each means of production. An application to the construction of the French nuclear power plants from 1971 to 2000 has been performed, in which a dynamic inventory of raw materials was evaluated. The impacts were then characterized with the ILCD 2011 characterization method. In order to compare with a purely static approach, a static impact assessment was made with the Ecoinvent V3.4 data sheets without adaptation and a static inventory assuming all the power stations had been built at the same time. Finally, the static and dynamic LCA approaches were compared to determine the gap between them for each of the two levels of integration. The results were analyzed to identify the contribution of the evolving construction of the nuclear power fleet to the total environmental impacts of the French energy mix during the same period. An equivalent dynamic strategy will further be applied to identify the environmental impacts of different energy-transition scenarios, making it possible to choose the best energy mix from an environmental viewpoint.
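To make the static/dynamic contrast concrete, here is a hedged toy calculation (all numbers are invented placeholders, not the study's data): the dynamic inventory weights each construction year's energy use by that year's grid-mix impact factor, while the static one applies a single factor to everything.

```python
# Illustrative static vs. dynamic inventory: the dynamic version weights each
# construction year's electricity use by that year's grid-mix emission factor.
construction = {1975: 120.0, 1980: 300.0, 1985: 260.0, 1990: 180.0}  # GWh used per year (placeholder)
mix_factor   = {1975: 0.55, 1980: 0.35, 1985: 0.15, 1990: 0.10}      # ktCO2e/GWh as the nuclear share grows (placeholder)

static_impact = sum(construction.values()) * mix_factor[1990]        # one fixed factor for all years
dynamic_impact = sum(e * mix_factor[y] for y, e in construction.items())

print(f"static:  {static_impact:8.1f} ktCO2e")
print(f"dynamic: {dynamic_impact:8.1f} ktCO2e")  # the gap is the error a static LCA hides
```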

Keywords: LCA, static, dynamic, inventory, construction, nuclear energy, energy mix, energy transition

Procedia PDF Downloads 80
12120 Analyzing the Risk Based Approach in General Data Protection Regulation: Basic Challenges Connected with Adapting the Regulation

Authors: Natalia Kalinowska

Abstract:

The adoption of the General Data Protection Regulation (GDPR) concluded four years of work by the European Commission in this area of the European Union. Considering the far-reaching changes the GDPR introduces, the European legislator envisaged a two-year transitional period: member states and companies had to prepare for the new regulation by 25 May 2018. The idea behind the new attitude to data protection in the European Union is the risk-based approach. Until now, as a result of the implementation of Directive 95/46/EC, many European countries (including Poland) adopted very particular regulations specifying technical and organisational security measures; the Polish implementing rules, for example, indicate even how long a password should be. Under the new approach, from May 2018 controllers and processors are obliged to apply security measures adequate to the level of risk associated with the specific data processing. Risk in the GDPR should be interpreted as the likelihood of a breach of the rights and freedoms of the data subject. According to Recital 76, the likelihood and severity of the risk to the rights and freedoms of the data subject should be determined by reference to the nature, scope, context and purposes of the processing. The GDPR does not prescribe security measures; its recitals give only examples, such as anonymisation or encryption. It is the controller's decision which security measures are considered sufficient, and the controller is liable if these measures prove insufficient or if the identification of the risk level is incorrect. The regulation indicates several levels of risk. Recital 76 mentions risk and high risk, but some lawyers argue that there is one more category, low risk/no risk: data processing that is unlikely to result in a risk to the rights and freedoms of natural persons. The GDPR also lists types of processing for which a controller does not have to evaluate the level of risk because they are classified as 'high risk' processing, e.g., large-scale processing of special categories of data, or processing using new technologies. The methodology includes analysis of legal regulations, e.g., the GDPR and the Polish Act on the Protection of Personal Data, as well as ICO guidelines and articles concerning the risk-based approach in the GDPR. The main conclusion is that an appropriate risk assessment is the key to keeping data safe and avoiding financial penalties. On the one hand, this approach seems more equitable, not only for controllers and processors but also for data subjects; on the other hand, it increases controllers' uncertainty in the assessment, which could have a direct impact on incorrect data protection and potential liability for infringement of the regulation.

Keywords: general data protection regulation, personal data protection, privacy protection, risk based approach

Procedia PDF Downloads 221
12119 Metaphorical Perceptions of Middle School Students regarding Computer Games

Authors: Ismail Celik, Ismail Sahin, Fetah Eren

Abstract:

The computer, among the most important inventions of the twentieth century, has become an increasingly important part of everyday life. Computer games, too, have become increasingly popular, owing to their realistic virtual environments, audio and visual features, and the roles they offer players. The present study investigates the metaphors middle school students hold for computer games, in an effort to fill a gap in the literature. Students were asked to complete the sentence 'A computer game is like/similar to….because….' to determine their metaphorical images of the concept of 'computer game'. The metaphors created by the students were grouped into six categories, based on the source of the metaphor. Ordered by the number of metaphors they included, these categories were 'computer game as a means of entertainment', 'computer game as a beneficial means', 'computer game as a basic need', 'computer game as a source of evil', 'computer game as a means of withdrawal', and 'computer game as a source of addiction'.

Keywords: computer game, metaphor, middle school students, virtual environments

Procedia PDF Downloads 499
12118 Study on Shape Coefficient of Large Statue Building Based on CFD

Authors: Wang Guangda, Ma Jun, Zhao Caiqi, Pan Rui

Abstract:

Wind load is the main governing load for large statue structures. Because of their irregular plans and elevations and uneven outer contours, the shape coefficients of statues cannot be taken from the current load code, and the common practice of relying on wind tunnel tests is time-consuming and costly. In this paper, based on the fundamental theory of CFD and using the fluid dynamics software Fluent 15.0, several large statue structures located in China, 40 to 70 m high, including large fairy statues and large Buddha statues, are analyzed with a numerical wind tunnel. The results are compared with the values recommended in the load code and with wind tunnel test results, respectively. The results show that the shape coefficient obtained by the numerical wind tunnel method is reliable for this kind of building, providing a useful reference for the wind load values of large statue structures.
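As a reference for what the reported coefficient means, the sketch below applies the standard pressure-coefficient definition, Cp = (p - p_inf) / (0.5 ρ U²), to a few made-up CFD surface samples and area-averages them; the paper's actual Fluent post-processing is not shown.

```python
# Minimal post-processing sketch: surface pressures -> shape (pressure)
# coefficient. The surface samples are invented; in practice they come from
# the CFD solution on the statue's surface.
import numpy as np

rho, U, p_inf = 1.225, 30.0, 0.0        # air density (kg/m3), reference wind speed (m/s)
q = 0.5 * rho * U**2                    # dynamic pressure (Pa)

p_face = np.array([410.0, 365.0, -220.0, -180.0])  # static pressure on surface patches (Pa)
a_face = np.array([12.0, 9.5, 14.0, 11.5])         # patch areas (m2)

cp = (p_face - p_inf) / q                           # local pressure coefficients
shape_coeff = np.sum(cp * a_face) / np.sum(a_face)  # area-weighted overall coefficient
print(cp, shape_coeff)
```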

Keywords: large statue structure, shape coefficient, irregular structure, wind tunnel test, numerical wind tunnel simulation

Procedia PDF Downloads 344
12117 Use of Satellite Imaging to Understand Earth’s Surface Features: A Roadmap

Authors: Sabri Serkan Gulluoglu

Abstract:

With Geographic Information Systems (GIS), information about all natural and artificial resources on the earth can be obtained from satellite images acquired by remote sensing techniques. However, identifying unknown resources, mapping their distribution, and evaluating them efficiently may not be possible with the original image alone. For these reasons, processing steps such as transformation, pre-processing, image enhancement, and classification are needed to provide the most accurate assessment, both numerically and visually. Many studies presenting the phases of obtaining and processing satellite images were examined in the literature study. The research showed that when these process steps follow a common, unified framework, the necessary and possible studies on this subject can progress rapidly.

Keywords: remote sensing, satellite imaging, GIS, computer science, information

Procedia PDF Downloads 290
12116 Brain Computer Interface Implementation for Affective Computing Sensing: Classifiers Comparison

Authors: Ramón Aparicio-García, Gustavo Juárez Gracia, Jesús Álvarez Cedillo

Abstract:

One research line of computer science involves the study of Human-Computer Interaction (HCI), which seeks to recognize and interpret user intent through the storage and subsequent analysis of the electrical signals of the brain, in order to use them in the control of electronic devices. Affective computing research, in turn, brings human emotions into the HCI process, helping to reduce user frustration. This paper shows the results obtained during the hardware and software development of a Brain Computer Interface (BCI) capable of recognizing human emotions through the association of brain electrical activity patterns. The hardware involves the sensing stage and analog-to-digital conversion. The interface software involves algorithms for pre-processing of the signal in time and frequency analysis, and for the classification of patterns associated with the electrical brain activity. The methods used for analysis and classification of the signal have been tested separately using a publicly accessible database, along with a comparison among classifiers to determine the best performing one.
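The abstract does not name the compared classifiers, so the sketch below is an assumption-laden stand-in: three classifiers commonly applied to EEG-derived features, compared by cross-validated accuracy on synthetic band-power data.

```python
# Hedged sketch of a classifier-comparison stage for EEG features; the
# features and labels are synthetic stand-ins, not the paper's database.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 8))                 # stand-in per-trial band-power features
y = (X[:, :2].sum(axis=1) > 0).astype(int)    # stand-in binary emotion labels

for name, clf in [("SVM", SVC(kernel="rbf")),
                  ("kNN", KNeighborsClassifier(n_neighbors=5)),
                  ("LDA", LinearDiscriminantAnalysis())]:
    acc = cross_val_score(clf, X, y, cv=5).mean()
    print(f"{name}: {acc:.2f}")               # pick the best performer, as the paper does
```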

Keywords: affective computing, interface, brain, intelligent interaction

Procedia PDF Downloads 356
12115 A Survey of Semantic Integration Approaches in Bioinformatics

Authors: Chaimaa Messaoudi, Rachida Fissoune, Hassan Badir

Abstract:

Technological advances in computer science and data analysis are helping to provide continuously huge volumes of biological data, which are available on the web. Such advances involve and require powerful techniques for data integration to extract pertinent knowledge and information for a specific question. Biomedical exploration of these big data often requires the use of complex queries across multiple autonomous, heterogeneous and distributed data sources. Semantic integration is an active area of research in several disciplines, such as databases, information integration, and ontology. We provide a survey of approaches and techniques for integrating biological data, focusing on those developed in the ontology community.

Keywords: biological ontology, linked data, semantic data integration, semantic web

Procedia PDF Downloads 415
12114 The Challenges of Teaching First Year Accounting with a Lecturer-Student Ratio of 1:1248

Authors: Hanli Joubert

Abstract:

In South Africa, teaching large classes is a reality that lecturers face in most higher institutions. In the literature, a 'large group' normally refers to a class of about 50 to 500 students; at the University of the Free State, the first-year accounting group comprises around 1300 students. Apart from the extremely large class, the problem is exacerbated by the diversity of students' previous schooling in accounting as well as their socio-economic backgrounds. The university scenario is further complicated by a lack of venues, compressed timetables, and a lack of resources. This study investigates the challenges and effectiveness of teaching a large and diverse group of first-year accounting students by drawing on personal experience, a literature study, and interviews with other lecturers as well as students registered for first-year accounting. The results reveal that teaching first-year accounting students in a large group is not ideal, but that it can be effective if managed correctly.

Keywords: diverse backgrounds, large groups, limited resources, first-year accounting students

Procedia PDF Downloads 22
12113 Low-Cost Image Processing System for Evaluating Pavement Surface Distress

Authors: Keerti Kembhavi, M. R. Archana, V. Anjaneyappa

Abstract:

Most asphalt pavement condition evaluations use rating frameworks in which distress is estimated by type, extent, and severity. Rating is carried out via the pavement condition rating (PCR), which is tedious and expensive. This paper presents the development of a low-cost technique for pavement distress image analysis that permits the identification of potholes and cracks, exploring the application of image processing tools for their detection. Longitudinal cracking and potholes are detected using Fuzzy C-Means (FCM) clustering, followed by a spectral clustering algorithm. The framework comprises three phases: image acquisition, processing, and feature extraction. A digital camera (GoPro) with a holder mounted on a moving vehicle is used to capture pavement distress images. The FCM classifier and spectral clustering algorithms are used to compute features and classify the longitudinal cracking and potholes. The MATLAB R2016a image processing toolkit is used for performance analysis on selected urban stretches of Bengaluru city, India. The outcomes of image evaluation with the semi-automated image handling framework represented the features of longitudinal cracks and potholes with an accuracy of about 80%. Further, the detected images are validated against the actual dimensions, with a dimension variability of about 0.46. The linear regression model y=1.171x-0.155 is obtained from the existing and experimental (image processing) areas. The R² of the best-fit line is 0.807, which in the linear regression model is considered a 'large positive linear association'.
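As an illustration of the clustering step, here is a minimal NumPy Fuzzy C-Means over pixel intensities; the paper's full pipeline (image acquisition, the spectral step, feature extraction) is not reproduced, and the synthetic intensities are invented.

```python
# Minimal Fuzzy C-Means over 1-D pixel intensities: dark distress pixels vs.
# lighter pavement surface. Illustrative only.
import numpy as np

def fcm(x, c=2, m=2.0, iters=100, tol=1e-5, seed=0):
    rng = np.random.default_rng(seed)
    u = rng.random((c, x.size)); u /= u.sum(axis=0)   # fuzzy memberships, columns sum to 1
    for _ in range(iters):
        um = u ** m
        centers = um @ x / um.sum(axis=1)             # membership-weighted cluster centers
        d = np.abs(x[None, :] - centers[:, None]) + 1e-12
        # standard FCM membership update: u_ij = d_ij^(-2/(m-1)) / sum_k d_kj^(-2/(m-1))
        inv = d ** (-2 / (m - 1))
        u_new = inv / inv.sum(axis=0)
        if np.abs(u_new - u).max() < tol:
            u = u_new; break
        u = u_new
    return centers, u

rng = np.random.default_rng(1)
pixels = np.concatenate([rng.normal(60, 10, 500),     # dark crack/pothole pixels
                         rng.normal(180, 15, 4500)])  # lighter pavement surface
centers, u = fcm(pixels)
labels = u.argmax(axis=0)   # hard assignment: which pixels fall in the dark cluster
print(centers)
```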

Keywords: crack detection, pothole detection, spectral clustering, fuzzy-c-means

Procedia PDF Downloads 152
12112 The Role of Intermediaries in E-Government Adoption in India: Bridging the Digital Divide

Authors: Rajiv Kumar, Amit Sachan, Arindam Mukherjee

Abstract:

Despite the transparency and benefits of e-government, and its potential to serve citizens better, the diffusion and adoption of e-government services in India is low. Limited access to computers and the internet, a lack of computer and internet skills, low trust in technology, and the risk associated with using e-government services are major hindrances to e-government adoption in India. Although a large number of citizens belong to the non-adopter category, the government has made some services mandatory to access online, leaving citizens no other choice; and despite the digital divide, a large number of citizens prefer online access to government services. In such cases, intermediaries like common service centers, internet cafés, and service agents play a significant role in accessing e-government services, and research is needed to explore this. This study investigates the role of intermediaries in citizens' online access to public services, using a qualitative research methodology based on semi-structured interviews. The results show that intermediaries play an important role in bridging the digital divide. The study also highlights the circumstances under which citizens take the help of these intermediaries, and then discusses its limitations and the scope for future study.

Keywords: adoption, digital divide, e-government, India, intermediaries

Procedia PDF Downloads 248
12111 Computational Fluid Dynamics Simulations of Thermal and Flow Fields inside a Desktop Personal Computer Cabin

Authors: Mohammad Salehi, Mohammad Erfan Doraki

Abstract:

In this paper, the airflow inside a desktop computer case is analyzed using computational fluid dynamics simulation. The purpose is to investigate the cooling of a central processing unit (CPU) with thermal powers of 80 and 130 watts. The airflow inside the computer enclosure, a microATX model, passes the main heat-producing components: the CPU, hard disk drive, CD drive, floppy drive, memory card, and power supply unit. According to the thermal power produced by the CPU (80 or 130 watts), two different heat sink geometries, direct and radial, are used. First, mesh independence and validation of the solution were established; after ensuring the correctness of the numerical solution, the results were analyzed. The simulation results showed that the temperatures of the CPU and other components increase linearly with increasing CPU heat output. The ambient air temperature also has a significant effect on the maximum processor temperature.

Keywords: computational fluid dynamics, CPU cooling, computer case simulation, heat sink

Procedia PDF Downloads 85
12110 An Evaluation of Neural Network Efficacies for Image Recognition on Edge-AI Computer Vision Platform

Authors: Jie Zhao, Meng Su

Abstract:

Image recognition, one of the most critical technologies in computer vision, helps machines such as robots understand a scene and, if deployed appropriately, will help trigger a revolution in remote sensing and industrial automation. With the development of AI technologies, many sophisticated neural networks are available for image recognition. However, the computer vision platforms, the hardware supporting those neural networks, are as crucial as the network technologies themselves and deserve to be addressed as research subjects in their own right, since different platforms determine how well different neural networks perform. In this paper, three computer vision platforms, a Jetson Nano (4 GB), a standalone laptop (RTX 3000-series GPU, using CUDA), and Google Colab (web-based, using a GPU), are explored, and four prominent neural network architectures (AlexNet, VGG-16/19, GoogleNet, and ResNet-18/34/50) are investigated. The performance of each pairing of platform and network is evaluated on the merits of recognition accuracy and time efficiency. In a case study using public ImageNet data, our findings provide a nuanced perspective on optimizing image recognition tasks across Edge-AI platforms, offering guidance on selecting appropriate neural network structures to maximize performance under hardware constraints.
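A hedged sketch of the latency half of such a benchmark (assuming torchvision >= 0.13 for the `weights` API): time one forward pass of several pretrained architectures on whatever device is available. Accuracy evaluation against ImageNet labels is omitted for brevity.

```python
# Illustrative per-image latency benchmark for pretrained torchvision models;
# weights download on first use, and results vary by platform, as the paper studies.
import time
import torch
from torchvision import models

device = "cuda" if torch.cuda.is_available() else "cpu"
batch = torch.randn(1, 3, 224, 224, device=device)   # one ImageNet-sized image

candidates = {
    "alexnet":  models.alexnet(weights="IMAGENET1K_V1"),
    "vgg16":    models.vgg16(weights="IMAGENET1K_V1"),
    "resnet18": models.resnet18(weights="IMAGENET1K_V1"),
}

for name, net in candidates.items():
    net = net.to(device).eval()
    with torch.no_grad():
        net(batch)                                    # warm-up pass
        t0 = time.perf_counter()
        net(batch)
        dt = time.perf_counter() - t0
    print(f"{name}: {1000 * dt:.1f} ms/image on {device}")
```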

Keywords: alexNet, VGG, googleNet, resNet, Jetson nano, CUDA, COCO-NET, cifar10, imageNet large scale visual recognition challenge (ILSVRC), google colab

Procedia PDF Downloads 50
12109 The Effect of Parameters on Production of NiO/Al2O3/B2O3/SiO2 Composite Nanofibers by Using Sol-Gel Processing and Electrospinning Technique

Authors: F. Sevim, E. Sevimli, F. Demir, T. Çalban

Abstract:

For the first time, nanofibers of a PVA/nickel nitrate/silica/aluminum isopropoxide/boric acid composite were prepared by sol-gel processing and the electrospinning technique. By high-temperature calcination of the precursor fibers, nanofibers of the NiO/Al2O3/B2O3/SiO2 composite with diameters of 500 nm were successfully obtained. The fibers were characterized by TG/DTA, FT-IR, XRD and SEM analyses.

Keywords: nanofibers, NiO/Al2O3/B2O3/SiO2 composite, sol-gel processing, electrospinning

Procedia PDF Downloads 302
12108 Meta-analysis of Technology Acceptance for Mobile and Digital Libraries in Academic Settings

Authors: Nosheen Fatima Warraich

Abstract:

One of the most often used models in information systems (IS) research is the technology acceptance model (TAM). This meta-analysis measures the relationship of the TAM variables Perceived Ease of Use (PEOU) and Perceived Usefulness (PU) with users' attitudes and behavioral intention (BI) in the context of mobile and digital libraries. It also examines the relationship of external variables (information quality and system quality) with PEOU and PU in digital library settings. The meta-analysis was performed following the PRISMA-P guidelines. Four databases (Google Scholar, Web of Science, Scopus, and LISTA) were searched according to defined criteria. The findings revealed a large effect size of PU and PEOU on BI, and likewise a large effect size of PU and PEOU on attitude. Medium effect sizes were found for SysQ -> PU, InfoQ -> PU, and SysQ -> PEOU, and a small effect size for InfoQ -> PEOU. The study fills a literature gap and confirms that TAM is a valid model for the acceptance and use of technology in the mobile and digital library context. Its findings should be helpful for developers and designers of mobile library apps, and for library authorities and system librarians designing and developing digital libraries in academic settings.

Keywords: technology acceptance model (TAM), perceived ease of use, perceived usefulness, information quality, system quality, meta-analysis, systematic review, digital libraries, mobile library apps

Procedia PDF Downloads 44
12107 Shape Management Method of Large Structure Based on Octree Space Partitioning

Authors: Gichun Cha, Changgil Lee, Seunghee Park

Abstract:

The objective of this study is to construct a shape management method contributing to the safety of large structures. In Korea, research on shape management is scarce because the technology is newly attempted. Terrestrial Laser Scanning (TLS) is used for the measurement of large structures; it provides an efficient way to actively acquire accurate point clouds of object surfaces or environments. The point clouds provide a basis for rapid modeling in industrial automation, architecture, construction, and maintenance of civil infrastructure. However, TLS produces a huge amount of point-cloud data, and registration, extraction, and visualization all require processing this massive amount of scan data. The octree can be applied to the shape management of large structures because it reduces the size of the scan data while maintaining the data attributes. Octree space partitioning generates voxels of 3D space, and each voxel is recursively subdivided into eight sub-voxels; the point cloud of the scan data was converted to voxels and sampled. The experimental site is located at Sungkyunkwan University; the scanned structure is a steel-frame bridge, and the TLS used was a Leica ScanStation C10/C5. The scan data was condensed by 92%, and the octree model was constructed at a resolution of 2 millimeters. This study presents octree space partitioning for handling point clouds, creating a basis for the shape management of large structures such as double-deck tunnels, buildings, and bridges. The research is expected to improve the efficiency of structural health monitoring and maintenance. This work is financially supported by the 'U-City Master and Doctor Course Grant Program' and the National Research Foundation of Korea (NRF) grant funded by the Korea government (MSIP) (NRF-2015R1D1A1A01059291).
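A minimal NumPy sketch of the idea (not the Leica/TLS pipeline): recursively split occupied cubes into eight children down to a target cell size and keep one representative point per occupied leaf, condensing the cloud while preserving its geometry.

```python
# Octree-style condensation of a point cloud: empty octants are pruned,
# occupied leaves keep their mean point. Illustrative, not the paper's code.
import numpy as np

def octree_condense(points, origin, size, min_size):
    if len(points) == 0:
        return []                                 # empty octant: prune
    if size <= min_size or len(points) == 1:
        return [points.mean(axis=0)]              # leaf keeps one representative point
    half = size / 2.0
    out = []
    for i in range(8):                            # the eight child octants
        offset = np.array([(i >> 2) & 1, (i >> 1) & 1, i & 1]) * half
        lo = origin + offset
        inside = np.all((points >= lo) & (points < lo + half), axis=1)
        out += octree_condense(points[inside], lo, half, min_size)
    return out

cloud = np.random.rand(100000, 3) * 10.0          # stand-in for TLS scan points (meters)
leaves = np.array(octree_condense(cloud, np.zeros(3), 10.0, 0.2))
print(len(cloud), "->", len(leaves), "points")    # heavy reduction, geometry preserved
```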

Keywords: 3D scan data, octree space partitioning, shape management, structural health monitoring, terrestrial laser scanning

Procedia PDF Downloads 274
12106 Optimization in Friction Stir Processing with Emphasis on Optimized Process Parameters: Laboratory Research

Authors: Atabak Rahimzadeh Ilkhch

Abstract:

Friction stir processing (FSP) is a promising thermo-mechanical processing technique that aims to change the microstructural and mechanical properties of materials in order to obtain high performance while reducing production time and cost. Many studies have focused on the microstructure of friction stir welded aluminum alloys. The main focus of this research is the grain size obtained in the weld zone; the second part focuses on the temperature distribution over the entire weld zone and its effect on the microstructure. There is also a need for more effort on investigating the optimal values of effective parameters, such as rotational speed, on the microstructure, and on using an optimal tool design method. The final results of this study will present the variation of the structural and mechanical properties of the material resulting from FSP, and the effect of FSP and tensile testing on surface quality. In particular, this research addresses the FSP of AA-7020 aluminum and the variation of the ratio of rotational to translational speed.

Keywords: friction stir processing, AA-7020, thermo-mechanical, microstructure, temperature

Procedia PDF Downloads 249
12105 Effect of Plasma Treatment on UV Protection Properties of Fabrics

Authors: Sheila Shahidi

Abstract:

UV protection by fabrics has recently become a focus of great interest, particularly in connection with environmental degradation and ozone layer depletion. Fabrics provide simple and convenient protection against UV radiation (UVR), but not all fabrics offer sufficient UV protection. To describe the degree of UVR protection offered by clothing materials, the ultraviolet protection factor (UPF) is commonly used. UV-protective fabric can be produced by applying a chemical finish using normal wet-processing methodologies. However, traditional wet processing is known to consume large quantities of water and energy and may adversely alter the bulk properties of the substrate. Recently, the use of plasmas to generate physicochemical surface modifications of textile substrates has become an intriguing approach to replace or enhance conventional wet processing. In this research work, the effect of plasma treatment on the UV protection properties of fabrics was investigated. DC magnetron sputtering was used, and plasma parameters such as gas type, electrodes, exposure time, and power were studied. The morphological and chemical properties of the samples were analyzed using scanning electron microscopy (SEM) and Fourier transform infrared spectroscopy (FTIR), respectively. The transmittance and UPF values of the original and plasma-treated samples were measured using a Shimadzu UV3101 PC UV-Vis-NIR scanning spectrophotometer (190-2,100 nm range). It was concluded that plasma, an eco-friendly, cost-effective, and dry technique already used in different branches of industry, will conquer the textile industry in the near future and is a promising method for the preparation of UV-protective textiles.

Keywords: fabric, plasma, textile, UV protection

Procedia PDF Downloads 497
12104 Post-Processing Method for Performance Improvement of Aerial Image Parcel Segmentation

Authors: Donghee Noh, Seonhyeong Kim, Junhwan Choi, Heegon Kim, Sooho Jung, Keunho Park

Abstract:

In this paper, we describe an image post-processing method to enhance the performance of the deep learning-based parcel segmentation of aerial images developed in previous studies. The results were evaluated using a confusion matrix, IoU, precision, recall, and F1-score. In the confusion matrix, the false positive count, i.e., misclassifications, was greatly reduced by the image post-processing. The average IoU was 0.9688 with post-processing, higher than the deep learning result of 0.8362, and the F1-score was 0.9822 with post-processing, also higher than the deep learning result of 0.8850. The experiment showed that the proposed technique positively complements the deep learning results in segmenting the parcels of interest.
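For reference, the reported metrics can be computed from binary masks as follows; the two tiny masks here are synthetic, not the study's data.

```python
# IoU, precision, recall, and F1 from binary segmentation masks.
import numpy as np

pred = np.array([[1, 1, 0], [0, 1, 0], [0, 0, 1]], dtype=bool)   # predicted parcel mask
true = np.array([[1, 1, 0], [0, 0, 0], [0, 1, 1]], dtype=bool)   # ground-truth mask

tp = np.sum(pred & true)          # confusion-matrix cells
fp = np.sum(pred & ~true)         # false positives: what the post-processing reduces
fn = np.sum(~pred & true)

iou = tp / (tp + fp + fn)
precision = tp / (tp + fp)
recall = tp / (tp + fn)
f1 = 2 * precision * recall / (precision + recall)
print(f"IoU={iou:.3f} P={precision:.3f} R={recall:.3f} F1={f1:.3f}")
```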

Keywords: aerial image, image process, machine vision, open field smart farm, segmentation

Procedia PDF Downloads 46
12103 Evaluation of Condyle Alterations after Orthognathic Surgery with a Digital Image Processing Technique

Authors: Livia Eisler, Cristiane C. B. Alves, Cristina L. F. Ortolani, Kurt Faltin Jr.

Abstract:

Purpose: This paper proposes a technically simple diagnosis method for orthodontists and maxillofacial surgeons to evaluate discrete bone alterations. The methodology consists of a protocol to optimize the diagnosis and minimize the possibility of orthodontic and ortho-surgical retreatment. Materials and Methods: A protocol of image processing and analysis using the ImageJ software and its plugins was applied to 20 pairs of lateral cephalometric images obtained from cone beam computerized tomographies, before and 1 year after orthognathic surgery. The optical density of the images was analyzed in the condylar region to determine possible bone alteration after surgical correction. Results: Image density was altered in all image pairs, especially regarding the condyle contours. According to the measurements, the condyles showed a gender-related density reduction at p=0.05, and the alterations of the condylar contours were registered in mm. Conclusion: A simple, viable and cost-effective technique can be applied to achieve a more detailed image-based diagnosis that does not depend on the human eye and therefore offers more reliable, quantitative results.

Keywords: bone resorption, computer-assisted image processing, orthodontics, orthognathic surgery

Procedia PDF Downloads 124
12102 A NoSQL-Based Approach for Real-Time Management of Robotics Data

Authors: Gueidi Afef, Gharsellaoui Hamza, Ben Ahmed Samir

Abstract:

This paper deals with the continual progression of data, from which new data management solutions have emerged: NoSQL databases. They have spread across several areas, such as personalization, profile management, big data in real time, content management, catalogs, customer views, mobile applications, the internet of things, digital communication, and fraud detection. Nowadays, these database management systems are growing in number: they store data very well, and with the trend of big data, new storage challenges demand new structures and methods for managing enterprise data. The new intelligent machines in the e-learning sector thrive on more data, so smart machines can learn more and faster. Robotics is the use case on which we focus our tests. The implementation of NoSQL for robotics wrestles all the data robots acquire into a usable form, because with ordinary approaches we face severe limits in managing and finding the exact information in real time. Our proposed approach was demonstrated by experimental studies and a running example used as a use case.
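A hedged illustration of the document-store pattern the paper leans on (assumes a locally running MongoDB and the pymongo driver; the database, collection, and field names are invented for the example, not taken from the paper): schemaless inserts of per-robot sensor payloads plus an indexed "latest reading" query.

```python
# Illustrative document-store pattern for robot telemetry with pymongo.
from datetime import datetime, timezone
from pymongo import MongoClient, DESCENDING

client = MongoClient("mongodb://localhost:27017")
telemetry = client["robotics"]["telemetry"]
telemetry.create_index([("robot_id", 1), ("ts", DESCENDING)])   # fast "latest reading" lookups

# Schemaless insert: each robot can report a different sensor payload.
telemetry.insert_one({
    "robot_id": "arm-07",
    "ts": datetime.now(timezone.utc),
    "sensors": {"joint_temp_c": 41.2, "torque_nm": 3.8},
})

# Real-time-style query: the most recent reading for one robot.
latest = telemetry.find_one({"robot_id": "arm-07"}, sort=[("ts", DESCENDING)])
print(latest["sensors"])
```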

Keywords: NoSQL databases, database management systems, robotics, big data

Procedia PDF Downloads 320
12101 Prediction of Fire Growth of the Office by Real-Scale Fire Experiment

Authors: Kweon Oh-Sang, Kim Heung-Youl

Abstract:

Estimating the engineering properties of fires is important in preparing for the complex and varied fire risks of large-scale structures such as super-tall buildings, large stadiums, and multi-purpose structures. In this study, a mock-up of a compartment measuring 2.4 (L) x 3.6 (W) x 2.4 (H) meters was fabricated at the 10 MW LSC (Large Scale Calorimeter), and combustible office supplies were placed in the compartment for a real-scale fire test. The maximum heat release rate was 4.1 MW, and the total energy release obtained through the application of the t² fire growth rate was 6705.9 MJ.
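For context, the t-squared design-fire model referenced here is Q(t) = αt²; the sketch below uses the standard NFPA growth coefficients to show how long each growth class would take to reach the measured 4.1 MW peak. It is an illustration, not the authors' fitted curve.

```python
# The t-squared design-fire model: Q(t) = alpha * t^2, with the conventional
# NFPA growth coefficients. Only the 4.1 MW peak comes from the abstract.
alphas = {"slow": 0.00293, "medium": 0.01172, "fast": 0.0469, "ultrafast": 0.1876}  # kW/s^2
q_peak = 4100.0   # measured maximum heat release rate, kW

for cls, a in alphas.items():
    t = (q_peak / a) ** 0.5            # time to reach the peak under Q = a * t^2
    print(f"{cls:9s}: {t:6.0f} s to {q_peak / 1000:.1f} MW")
```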

Keywords: fire growth, fire experiment, t² curve, large scale calorimeter

Procedia PDF Downloads 306
12100 Thermo-Mechanical Treatment of Chromium Alloyed Low Carbon Steel

Authors: L. Kučerová, M. Bystrianský, V. Kotěšovec

Abstract:

Thermo-mechanical processing with various parameters was applied to a 0.2%C-0.6%Mn-2%Si-0.8%Cr low-alloyed high-strength steel. The aim of the processing was to achieve microstructures typical of transformation induced plasticity (TRIP) steels. The thermo-mechanical processing used in this work incorporated two or three deformation steps, carried out in all cases during cooling from the soaking temperature to various bainite hold temperatures. In this way, 4-10% retained austenite was kept in the final microstructures, which further consisted of ferrite, bainite, martensite and pearlite. The complex character of the TRIP steel microstructure is responsible for its good strength and ductility. The strengths achieved in this work were in the range of 740-836 MPa, with ductility A5mm of 31-41%.

Keywords: pearlite, retained austenite, thermo-mechanical treatment, TRIP steel

Procedia PDF Downloads 265
12099 Microstructure and Mechanical Evaluation of PMMA/Al₂O₃ Nanocomposite Fabricated via Friction Stir Processing

Authors: Reham K. El Sawah, N. S. M. El-Tayeb

Abstract:

This study aims to produce a polymer matrix composite reinforced with Al₂O₃ nanoparticles in order to enhance the mechanical properties of PMMA. The composite was fabricated via friction stir processing to ensure a homogeneous dispersion of the Al₂O₃ nanoparticles in the polymer, and the processing was performed submerged to prevent the sputtering of nanoparticles. The surface quality, microstructure, impact energy, and hardness of the prepared samples were investigated. Good surface quality and dispersion of the nanoparticles were attained by employing suitable processing conditions. The experimental results indicated that as the percentage of nanoparticles increased, the impact energy and hardness increased, reaching 2 kJ/m² and 14.7 HV at a nanoparticle concentration of 25%, meaning that the toughness and hardness of the produced polymer-ceramic composite are higher than those of unprocessed PMMA by 66% and 33%, respectively.

Keywords: friction stir processing, polymer matrix nanocomposite, mechanical properties, microstructure

Procedia PDF Downloads 142
12098 Biosensors as Analytical Tools in Legume Processing

Authors: S. V. Ncube, A. I. O. Jideani, E. T. Gwata

Abstract:

The plight of food insecurity in developing countries has led to renewed interest in underutilized legumes. Their nutritional versatility, desirable functionality, pharmaceutical value, and inherent bioactive compounds have drawn the attention of researchers, provoking the development of value-added products aimed at commercially exploiting their full potential. However, processing of these legumes leads to changes in nutritional composition as affected by processing variables like pH, temperature, and pressure. There is therefore a need for process control and quality assurance during the production of the value-added products, yet conventional methods for microbiological and biochemical identification are labour-intensive and time-consuming. Biosensors offer rapid and affordable methods to assure the quality of the products: they may be used to quantify nutrients and anti-nutrients in the products while manipulating and monitoring variables such as pH, temperature, pressure, and oxygen that affect the quality of the final product. This review gives an overview of the types of biosensors used in the food industry, their advantages and disadvantages, and their possible application in legume processing.

Keywords: legume processing, biosensors, quality control, nutritional versatility

Procedia PDF Downloads 463
12097 Improving Image Summarization Using Image Processing and the Particle Swarm Optimization Algorithm

Authors: Hooman Torabifard

Abstract:

In the last few years, with the progress of technology, computers, and artificial intelligence entering all kinds of scientific and industrial fields, lifestyles have changed and, in general, the way humans live has seen many changes and developments. Some of these changes have occurred in the context of digital images and image processing, and the development continues. Besides all the benefits, however, there have been disadvantages, one of which is the multiplicity of images with high volumes of data; the focus of this paper is on improving and developing a method for summarizing these images and enhancing their productivity. The general method used for this purpose consists of a set of techniques based on data obtained from image processing together with the PSO (particle swarm optimization) algorithm. In the remainder of this paper, the method used is elaborated in detail.
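Since the keywords point at image thresholding, here is one plausible (assumed, not the authors') pairing of the two ingredients: a minimal PSO that searches for the grayscale threshold maximizing Otsu's between-class variance on a synthetic bimodal histogram.

```python
# Minimal particle swarm optimization over a 1-D search space: candidate
# grayscale thresholds, scored by Otsu's between-class variance.
import numpy as np

rng = np.random.default_rng(1)
image = np.concatenate([rng.normal(70, 12, 3000), rng.normal(170, 18, 5000)])
image = np.clip(image, 0, 255)            # synthetic bimodal pixel intensities

def between_class_variance(t):
    lo, hi = image[image < t], image[image >= t]
    if lo.size == 0 or hi.size == 0:
        return 0.0
    w0, w1 = lo.size / image.size, hi.size / image.size
    return w0 * w1 * (lo.mean() - hi.mean()) ** 2      # Otsu criterion

n = 20
pos = rng.uniform(1, 254, n)              # particle positions = candidate thresholds
vel = np.zeros(n)
pbest = pos.copy()
pbest_val = np.array([between_class_variance(p) for p in pos])
gbest = pbest[pbest_val.argmax()]

for _ in range(50):
    r1, r2 = rng.random(n), rng.random(n)
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, 1, 254)
    vals = np.array([between_class_variance(p) for p in pos])
    improved = vals > pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    gbest = pbest[pbest_val.argmax()]

print(f"PSO threshold: {gbest:.1f}")      # should land between the two intensity modes
```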

Keywords: image summarization, particle swarm optimization, image threshold, image processing

Procedia PDF Downloads 102
12096 A Genetic-Neural-Network Modeling Approach for Self-Heating in GaN High Electron Mobility Transistors

Authors: Anwar Jarndal

Abstract:

In this paper, a genetic-neural-network (GNN) based large-signal model for GaN HEMTs is presented, along with its parameter extraction procedure. The model is easy to construct and implement in CAD software and requires only DC and S-parameter measurements. An improved decomposition technique is used to model the self-heating effect: two GNN models are constructed to simulate the isothermal drain current and the power dissipation, respectively, and the two models are then composed to simulate the drain current. The modeling procedure was applied to a packaged GaN-on-Si HEMT, and the developed model is validated by comparing its large-signal simulation with measured data; a very good agreement between simulation and measurement is obtained.
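A toy version of the genetic-neural coupling (purely illustrative; the authors model GaN drain current and power dissipation from DC and S-parameter data, none of which is reproduced here): a genetic search evolves the weights of a one-hidden-layer network to fit a nonlinear I-V-like curve.

```python
# Toy genetic optimization of neural-network weights: evolve a one-hidden-layer
# net to fit a tanh-shaped characteristic. Illustrative stand-in data only.
import numpy as np

rng = np.random.default_rng(2)
x = np.linspace(-2, 2, 80)
y = np.tanh(2 * x) + 0.05 * rng.normal(size=x.size)   # stand-in nonlinear characteristic

H = 6                                  # hidden units
n_genes = 3 * H + 1                    # w1, b1, w2 (H values each) plus scalar b2

def net(genes, x):
    w1, b1, w2 = genes[:H], genes[H:2*H], genes[2*H:3*H]
    b2 = genes[-1]
    hidden = np.tanh(np.outer(x, w1) + b1)             # shape (80, H)
    return hidden @ w2 + b2

def fitness(genes):
    return -np.mean((net(genes, x) - y) ** 2)          # negative mean squared error

pop = rng.normal(size=(60, n_genes))
for gen in range(200):
    scores = np.array([fitness(g) for g in pop])
    elite = pop[scores.argsort()[-20:]]                # keep the best third
    parents = elite[rng.integers(0, 20, size=(60, 2))]
    cut = rng.random((60, n_genes)) < 0.5              # uniform crossover
    pop = np.where(cut, parents[:, 0], parents[:, 1])
    pop += 0.1 * rng.normal(size=pop.shape) * (rng.random(pop.shape) < 0.2)  # sparse mutation

best = pop[np.array([fitness(g) for g in pop]).argmax()]
print("final MSE:", -fitness(best))
```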

Keywords: GaN HEMT, computer-aided design and modeling, neural networks, genetic optimization

Procedia PDF Downloads 352