Search results for: computer processing of large databases
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 12525

11565 The Effect of Object Presentation on Action Memory in School-Aged Children

Authors: Farzaneh Badinlou, Reza Kormi-Nouri, Monika Knopf

Abstract:

Enacted tasks are typically remembered better than when the same task materials are only verbally encoded, a robust finding referred to as the enactment effect. It has been assumed that the enactment effect is independent of object presence, but the size of the enactment effect can be increased by providing objects at the study phase in adults. To clarify these issues in children, free recall and cued recall performance of action phrases with or without real objects were compared in 410 school-aged children from four age groups (8, 10, 12 and 14 years old). In this study, subjects were instructed to learn a series of action phrases under three encoding conditions: participants listened to verbal action phrases (VTs), performed the phrases (SPTs: subject-performed tasks), or observed the experimenter perform the phrases (EPTs: experimenter-performed tasks). Then, free recall and cued recall memory tests were administered. The results revealed that real objects, compared with imaginary objects, improved recall performance in SPTs and EPTs, but more so in VTs. It was also found that object presence was not necessary for the occurrence of the enactment effect, but it changed the size of the enactment effect in all age groups. The size of the enactment effect was more pronounced for imaginary objects than for real objects in both free recall and cued recall memory tests in children. It is discussed that SPTs and EPTs differentially facilitate item-specific and relational information processing and that providing objects can moderate the processing underlying the encoding conditions.

Keywords: action memory, enactment effect, item-specific processing, object, relational processing, school-aged children

Procedia PDF Downloads 232
11564 Pressure Gradient Prediction of Oil-Water Two Phase Flow through Horizontal Pipe

Authors: Ahmed I. Raheem

Abstract:

In this thesis, stratified and stratified wavy flow regimes have been investigated numerically for oil (1.57 mPa·s viscosity and 780 kg/m3 density) and water two-phase flow in small and large horizontal steel pipes with diameters between 0.0254 and 0.508 m, using ANSYS Fluent software. The volume of fluid (VOF) approach for two-phase flow was used with two-equation family turbulence models (Realizable k-ε).

Keywords: CFD, two-phase flow, pressure gradient, volume of fluid, large diameter, horizontal pipe, oil-water stratified and stratified wavy flow

Procedia PDF Downloads 424
11563 Affective Transparency in Compound Word Processing

Authors: Jordan Gallant

Abstract:

In the compound word processing literature, much attention has been paid to the relationship between a compound’s denotational meaning and that of its morphological whole-word constituents, which is referred to as ‘semantic transparency’. However, the parallel relationship between a compound’s connotation and that of its constituents has not been addressed at all. For instance, while a compound like ‘painkiller’ might be semantically transparent, it is not ‘affectively transparent’. That is, both constituents have primarily negative connotations, while the whole compound has a positive one. This paper investigates the role of affective transparency in compound processing using two methodologies commonly employed in this field: a lexical decision task and a typing task. The critical stimuli used were 112 English bi-constituent compounds that differed in terms of the affective transparency of their constituents. Of these, 36 stimuli contained constituents with connotations similar to the compound (e.g., ‘dreamland’), 36 contained constituents with more positive connotations (e.g., ‘bedpan’), and 36 contained constituents with more negative connotations (e.g., ‘painkiller’). The connotations of whole-word constituents and compounds were operationalized via valence ratings taken from an off-line ratings database. In Experiment 1, compound stimuli and matched non-word controls were presented visually to participants, who were then asked to indicate whether each item was a real word in English. Response times and accuracy were recorded. In Experiment 2, participants typed compound stimuli presented to them visually. Individual keystroke response times and typing accuracy were recorded. The results of both experiments provided positive evidence that compound processing is influenced by affective transparency. In Experiment 1, compounds in which both constituents had more negative connotations than the compound itself were responded to significantly more slowly than compounds in which the constituents had similar or more positive connotations. Typed responses from Experiment 2 showed that inter-keystroke intervals at the morphological constituent boundary were significantly longer when the connotation of the head constituent was either more positive or more negative than that of the compound. The interpretation of this finding is discussed in the context of previous compound typing research. Taken together, these findings suggest that affective transparency plays a role in the recognition, storage, and production of English compound words. This study provides a promising first step in a new direction for research on compound words.

Keywords: compound processing, semantic transparency, typed production, valence

Procedia PDF Downloads 117
11562 Reliability of Intra-Logistics Systems – Simulating Performance Availability

Authors: Steffen Schieweck, Johannes Dregger, Sascha Kaczmarek, Michael ten Hompel

Abstract:

Logistics distributors face the issue of having to provide increasing service levels while being forced to reduce costs at the same time. Same-day delivery, quick order processing and rapidly growing ranges of articles are only some of the prevailing challenges. One key aspect of the performance of an intra-logistics system is how often, and with which amplitude, congestions and dysfunctions affect the processing operations. By gaining knowledge of the so-called ‘performance availability’ of such a system during the planning stage, oversizing and waste can be reduced and planning transparency is increased. The state of the art for determining this KPI is simulation studies. However, their structure, and therefore their results, may vary unforeseeably. This article proposes a concept for the establishment of ‘certified’, and hence reliable and comparable, simulation models.

Keywords: intra-logistics, performance availability, simulation, warehousing

Procedia PDF Downloads 447
11561 Enhancement of Mechanical and Dissolution Properties of a Cast Magnesium Alloy via Equal Channel Angular Processing

Authors: Tim Dunne, Jiaxiang Ren, Lei Zhao, Peng Cheng, Yi Song, Yu Liu, Wenhan Yue, Xiongwen Yang

Abstract:

Two decades of the Shale Revolution have transformed the global energy market, in part through the adoption of multi-stage dissolvable frac plugs. Magnesium has been favored for the bulk of these plugs, requiring the development of materials to suit specific field requirements. Herein, the mechanical and dissolution results from equal channel angular pressing (ECAP) of two cast dissolvable magnesium alloys are described. ECAP was selected as a route to increase the mechanical properties of two formulations of dissolvable magnesium, as solutionizing failed. In this study, 1” square cross-section samples of cast Mg alloy formulations containing rare earth elements were processed at temperatures ranging from 200 to 350 °C, at a rate of 0.005”/s, with a backpressure from 0 to 70 MPa, in a brass or brass + graphite sheet. Generally, the yield and ultimate tensile strength (UTS) doubled for all conditions. For formulation DM-2, the yield increased from 100 MPa to 250 MPa and the UTS from 175 MPa to 325 MPa, but the strain fell from 2% to 1%. For formulation DM-3, the yield increased from 75 MPa to 200 MPa and the UTS from 150 MPa to 275 MPa, with the strain increasing from 1% to 3%. Meanwhile, ECAP was also found to reduce the dissolution rate significantly. A microstructural analysis showed grain refinement of the alloy and the movement of secondary phases away from the grain boundary. It is believed that reconfiguration of the grain boundary phases increased the mechanical properties and decreased the dissolution rate. ECAP processing of dissolvable high rare earth content magnesium is possible despite the brittleness of the material. ECAP is a possible processing route to increase the mechanical properties of dissolvable aluminum alloys that do not extrude.

Keywords: equal channel angular processing, dissolvable magnesium, frac plug, mechanical properties

Procedia PDF Downloads 109
11560 Remote Sensing and GIS Based Methodology for Identification of Low Crop Productivity in Gautam Buddha Nagar District

Authors: Shivangi Somvanshi

Abstract:

Poor crop productivity in salt-affected environments in the country is due to insufficient and untimely canal supply to agricultural land and inefficient field water management practices. This could degrade further due to inadequate maintenance of the canal network, ongoing secondary soil salinization and waterlogging, and worsening groundwater quality. Large patches of low productivity in irrigation commands occur due to waterlogging and salt-affected soil, particularly in years of scarce rainfall. Satellite remote sensing has been used for mapping areas of low crop productivity, waterlogging and salinity in irrigation commands. The spatial results obtained for these problems so far are less reliable for further use due to the rapid change in soil quality parameters over the years. The existing spatial databases of canal network and flow data, groundwater quality and salt-affected soil were obtained from the central and state line departments/agencies and were integrated with GIS. An integrated methodology based on remote sensing and GIS has therefore been developed in the ArcGIS environment on the basis of canal supply status, groundwater quality, salt-affected soils, and the satellite-derived vegetation index (NDVI), salinity index (NDSI) and waterlogging index (NSWI). This methodology was tested for the identification and delineation of areas of low productivity in the Gautam Buddha Nagar district (Uttar Pradesh). It was found that the area affected by this problem lies mainly in the Dankaur and Jewar blocks of the district. The problem area was verified with ground data and was found to be approximately 78% accurate. The methodology has the potential to be used in other irrigation commands in the country to obtain reliable spatial data on low crop productivity.
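
For illustration only, a minimal numpy sketch of how normalized-difference indices of the kind named above (e.g., NDVI) can be computed and combined to flag low-productivity pixels; the band pairing chosen for the salinity index and the threshold values are assumptions, not the paper's calibrated methodology.

```python
import numpy as np

def normalized_difference(band_a, band_b):
    """Generic normalized-difference index (a - b) / (a + b) with a divide-by-zero guard."""
    a = band_a.astype(np.float64)
    b = band_b.astype(np.float64)
    denom = a + b
    return np.divide(a - b, denom, out=np.zeros_like(denom), where=denom != 0)

# Synthetic reflectance rasters standing in for satellite bands (values in [0, 1]).
rng = np.random.default_rng(0)
red = rng.uniform(0.05, 0.4, size=(100, 100))
nir = rng.uniform(0.1, 0.6, size=(100, 100))

ndvi = normalized_difference(nir, red)   # vegetation vigour
ndsi = normalized_difference(red, nir)   # one common salinity-index form (assumption)

# Flag pixels of potentially low productivity: sparse vegetation and a high salinity signal.
low_productivity = (ndvi < 0.2) & (ndsi > 0.1)
print("Share of flagged pixels:", low_productivity.mean())
```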

Keywords: remote sensing, GIS, salt affected soil, crop productivity, Gautam Buddha Nagar

Procedia PDF Downloads 279
11559 Detection of Intentional Attacks in Images Based on Watermarking

Authors: Hazem Munawer Al-Otum

Abstract:

In this work, an efficient watermarking technique is proposed and can be used for detecting intentional attacks in RGB color images. The proposed technique can be implemented for image authentication and exhibits high robustness against unintentional common image processing attacks. It deploys two measures to discern between intentional and unintentional attacks based on using a quantization-based technique in a modified 2D multi-pyramidal DWT transform. Simulations have shown high accuracy in detecting intentionally attacked regions while exhibiting high robustness under moderate to severe common image processing attacks.
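
As a rough illustration of the general idea (not the authors' modified multi-pyramidal scheme), here is a minimal Python sketch of quantization-based watermark embedding and extraction in a single-level 2D DWT using PyWavelets; the wavelet, quantization step and bit layout are assumed.

```python
import numpy as np
import pywt  # PyWavelets

def embed_qim(image, bits, step=12.0):
    """Embed a binary watermark into the LL sub-band of a single-level 2D DWT
    using quantization index modulation (QIM). Simplified illustration only."""
    cA, (cH, cV, cD) = pywt.dwt2(image.astype(np.float64), 'haar')
    flat = cA.ravel()
    for i, bit in enumerate(bits):
        q = np.round(flat[i] / step)
        if int(q) % 2 != bit:      # force quantizer parity to encode the bit (even=0, odd=1)
            q += 1
        flat[i] = q * step
    return pywt.idwt2((flat.reshape(cA.shape), (cH, cV, cD)), 'haar')

def extract_qim(image, n_bits, step=12.0):
    cA, _ = pywt.dwt2(image.astype(np.float64), 'haar')
    q = np.round(cA.ravel()[:n_bits] / step).astype(int)
    return (q % 2).tolist()

img = np.random.default_rng(1).integers(0, 256, size=(64, 64)).astype(np.float64)
wm = [1, 0, 1, 1, 0, 0, 1, 0]
marked = embed_qim(img, wm)
print(extract_qim(marked, len(wm)))   # recovered bits; mismatches in a region would flag tampering
```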

Keywords: image authentication, copyright protection, semi-fragile watermarking, tamper detection

Procedia PDF Downloads 248
11558 Character and Evolution of Electronic Waste: A Technologically Developing Country's Experience

Authors: Karen C. Olufokunbi, Odetunji A. Odejobi

Abstract:

The discourse of this paper is the examination of the generation, accumulation and growth of e-waste in a developing country. Images and other data about computer e-waste were collected using a digital camera, 290 copies of questionnaire and three structured interviews using Obafemi Awolowo University (OAU), Ile-Ife, Nigeria environment as a case study. The numerical data were analysed using R data analysis and process tool. Automata-based techniques and Petri net modeling tool were used to design and simulate a computational model for the recovery of saleable materials from e-waste. The R analysis showed that at a 95 percent confidence level, the computer equipment that will be disposed by 2020 will be 417 units. Compared to the 800 units in circulation in 2014, 50 percent of personal computer components will become e-waste. This indicates that personal computer components were in high demand due to their low costs and will be disposed more rapidly when replaced by new computer equipment Also, 57 percent of the respondents discarded their computer e-waste by throwing it into the garbage bin or by dumping it. The simulated model using Coloured Petri net modelling tool for the process showed that the e-waste dynamics is a forward sequential process in the form of a pipeline meaning that an e-waste recovery of saleable materials process occurs in identifiable discrete stages indicating that e-waste will continue to accumulate and grow in volume with time.

Keywords: Coloured Petri net, computational modelling, electronic waste, electronic waste process dynamics

Procedia PDF Downloads 158
11557 SC-LSH: An Efficient Indexing Method for Approximate Similarity Search in High Dimensional Space

Authors: Sanaa Chafik, Imane Daoudi, Mounim A. El Yacoubi, Hamid El Ouardi

Abstract:

Locality Sensitive Hashing (LSH) is one of the most promising techniques for solving the nearest neighbour search problem in high dimensional space. Euclidean LSH is the most popular variation of LSH and has been successfully applied in many multimedia applications. However, Euclidean LSH presents limitations that affect structure and query performance. Its main limitation is large memory consumption: in order to achieve good accuracy, a large number of hash tables is required. In this paper, we propose a new hashing algorithm to overcome the storage space problem and improve query time, while keeping an accuracy similar to that achieved by the original Euclidean LSH. Experimental results on a real large-scale dataset show that the proposed approach achieves good performance and consumes less memory than Euclidean LSH.
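
For readers unfamiliar with the baseline scheme the paper improves upon, here is a minimal Python sketch of Euclidean (p-stable) LSH; the table count, hash count and bucket width are illustrative choices, not the paper's parameters.

```python
import numpy as np
from collections import defaultdict

class EuclideanLSH:
    """Minimal p-stable (Euclidean) LSH index: h(v) = floor((a.v + b) / w)."""
    def __init__(self, dim, n_tables=8, n_hashes=4, w=4.0, seed=0):
        rng = np.random.default_rng(seed)
        self.w = w
        self.a = rng.standard_normal((n_tables, n_hashes, dim))  # Gaussian projections
        self.b = rng.uniform(0, w, size=(n_tables, n_hashes))
        self.tables = [defaultdict(list) for _ in range(n_tables)]

    def _keys(self, v):
        return [tuple(np.floor((self.a[t] @ v + self.b[t]) / self.w).astype(int))
                for t in range(len(self.tables))]

    def index(self, vectors):
        self.vectors = np.asarray(vectors, dtype=float)
        for i, v in enumerate(self.vectors):
            for t, key in enumerate(self._keys(v)):
                self.tables[t][key].append(i)

    def query(self, q, k=5):
        q = np.asarray(q, dtype=float)
        candidates = {i for t, key in enumerate(self._keys(q))
                      for i in self.tables[t].get(key, [])}
        # Rank the candidate set by true Euclidean distance.
        return sorted(candidates, key=lambda i: np.linalg.norm(self.vectors[i] - q))[:k]

data = np.random.default_rng(2).standard_normal((1000, 32))
lsh = EuclideanLSH(dim=32)
lsh.index(data)
print(lsh.query(data[0]))   # indices of approximate nearest neighbours (should include 0)
```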

Keywords: approximate nearest neighbor search, content based image retrieval (CBIR), curse of dimensionality, locality sensitive hashing, multidimensional indexing, scalability

Procedia PDF Downloads 316
11556 Error Probability of Multi-User Detection Techniques

Authors: Komal Babbar

Abstract:

Multiuser detection is the intelligent estimation/demodulation of transmitted bits in the presence of multiple access interference. The authors present the bit-error rate (BER) achieved by linear multi-user detectors: the matched filter (which treats the MAI as AWGN), the decorrelating detector and the MMSE detector. In this work, the authors investigate the bit error probability analysis for the matched filter, decorrelating and MMSE detectors. This problem arises in several practical CDMA applications where the receiver may not have full knowledge of the number of active users and their signature sequences. In particular, the behavior of MAI at the output of the multi-user detectors (MUD) is examined under various asymptotic conditions, including large signal-to-noise ratio, large near-far ratios, and a large number of users. In the last section, the authors also show MATLAB simulation results for the multiuser detection techniques, i.e., matched filter, decorrelating and MMSE, for 2 users and 10 users.
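
A hedged numpy sketch of the three linear detectors compared above for a small synchronous CDMA setup; the signature length, amplitudes and noise level are assumed, and the MMSE form shown is the standard textbook variant rather than the authors' exact formulation.

```python
import numpy as np

rng = np.random.default_rng(3)
K, N, n_bits, sigma = 4, 16, 20000, 0.6     # users, chips per bit, bits, noise std (assumed)

S = rng.choice([-1.0, 1.0], size=(N, K)) / np.sqrt(N)   # random signature sequences
A = np.diag([1.0, 0.8, 0.6, 0.5])                       # received amplitudes (near-far scenario)
R = S.T @ S                                             # cross-correlation matrix

b = rng.choice([-1.0, 1.0], size=(K, n_bits))           # transmitted bits
r = S @ (A @ b) + sigma * rng.standard_normal((N, n_bits))
y = S.T @ r                                             # matched-filter bank output

detectors = {
    "matched filter": y,
    "decorrelating":  np.linalg.solve(R, y),
    "MMSE":           np.linalg.solve(R + sigma**2 * np.linalg.inv(A @ A), y),
}
for name, z in detectors.items():
    ber = np.mean(np.sign(z) != b)
    print(f"{name:15s} BER = {ber:.4f}")
```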

Keywords: code division multiple access, decorrelating, matched filter, minimum mean square detection (MMSE) detection, multiple access interference (MAI), multiuser detection (MUD)

Procedia PDF Downloads 513
11555 Control of Belts for Classification of Geometric Figures by Artificial Vision

Authors: Juan Sebastian Huertas Piedrahita, Jaime Arturo Lopez Duque, Eduardo Luis Perez Londoño, Julián S. Rodríguez

Abstract:

The process of generating computer vision is called artificial vision. Artificial vision is a branch of artificial intelligence that allows the obtaining, processing, and analysis of any type of information, especially information obtained through digital images. Currently, artificial vision is used in manufacturing areas for quality control and production, as these processes can be realized through counting algorithms, positioning, and recognition of objects that can be measured by a single camera (or more). On the other hand, companies use assembly lines formed by conveyor systems with actuators on them for moving pieces from one location to another in their production. These devices must be programmed beforehand for good performance and must have a programmed logic routine. Nowadays, production is the main target of every industry, together with quality and the fast elaboration of the different stages and processes in the chain of production of any product or service being offered. The principal aim of this project is to program a computer that recognizes geometric figures (circle, square, and triangle) through a camera, each one with a different color, and to link it with a group of conveyor systems to organize the mentioned figures into cubicles, which also differ from one another by having different colors. This project is based on artificial vision; therefore the methodology needed to develop it must be strict and is detailed below. 1. Methodology: 1.1 The software used in this project is Qt Creator, which is linked with OpenCV libraries; together, these tools are used to build the program that identifies colors and forms directly from the camera. 1.2 Image acquisition: to start using the OpenCV libraries, it is necessary to acquire images, which can be captured by a computer’s web camera or a different specialized camera. 1.3 The recognition of RGB colors is realized in code by traversing the matrices of the captured images and comparing pixels, identifying the primary colors red, green, and blue. 1.4 To detect forms it is necessary to segment the images: the first step is converting the image from RGB to grayscale to work with the dark tones of the image; then the image is binarized, which means having the figure of the image in a white tone with a black background; finally, the contours of the figure in the image are found to detect the number of edges and identify which figure it is. 1.5 After the color and figure have been identified, the program links with the conveyor systems, which through the actuators will classify the figures into their respective cubicles. Conclusions: The OpenCV library is a useful tool for projects in which an interface between a computer and the environment is required, since the camera obtains external characteristics and any process can be realized. With the program developed for this project, any type of assembly line can be optimized, because images from the environment can be obtained and the process becomes more accurate.
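
A minimal Python/OpenCV sketch of the grayscale-binarize-contour pipeline described in steps 1.3-1.4 (OpenCV 4.x assumed); the area threshold and polygon-approximation tolerance are illustrative, and the original project was built with Qt Creator rather than Python.

```python
import cv2
import numpy as np

def classify_shapes(bgr_image):
    """Sketch of the described pipeline: grayscale -> binarize -> contours -> edge count,
    plus a dominant-colour check on each detected figure."""
    gray = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2GRAY)
    _, binary = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)

    results = []
    for cnt in contours:
        if cv2.contourArea(cnt) < 100:        # ignore small noise blobs
            continue
        approx = cv2.approxPolyDP(cnt, 0.04 * cv2.arcLength(cnt, True), True)
        if len(approx) == 3:
            shape = "triangle"
        elif len(approx) == 4:
            shape = "square"
        else:
            shape = "circle"
        mask = np.zeros(gray.shape, np.uint8)
        cv2.drawContours(mask, [cnt], -1, 255, -1)
        b, g, r, _ = cv2.mean(bgr_image, mask=mask)
        colour = ["blue", "green", "red"][int(np.argmax([b, g, r]))]
        results.append((shape, colour))
    return results

# Synthetic test frame: a red square on a white background.
frame = np.full((200, 200, 3), 255, np.uint8)
cv2.rectangle(frame, (50, 50), (150, 150), (0, 0, 255), -1)
print(classify_shapes(frame))   # e.g. [('square', 'red')]
```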

Keywords: artificial intelligence, artificial vision, binarized, grayscale, images, RGB

Procedia PDF Downloads 373
11554 Tool Wear Monitoring of High Speed Milling Based on Vibratory Signal Processing

Authors: Hadjadj Abdechafik, Kious Mecheri, Ameur Aissa

Abstract:

The objective of this study is to develop a process for treating the vibratory signals generated during horizontal high-speed milling without applying any coolant, in order to establish a monitoring system able to improve machining performance. Many tests were carried out on a horizontal high-speed centre (PCI Météor 10), under given cutting conditions, using a milling cutter with only one insert and measuring its frontal wear from its new state, considered as the reference state, until a worn state considered unsuitable for the tool to be used. The results obtained show that the first harmonic follows the evolution of frontal wear well; on the other hand, a wavelet transform is used for signal processing and is found to be useful for observing the evolution of the wavelet approximations through the cutting tool life. The power and Root Mean Square (RMS) values of the wavelet-transformed signal gave the best results and can be used for tool wear estimation. All these features can constitute suitable indicators for an effective detection of tool wear and can then be used as input parameters of an online monitoring system. We also noted the remarkable influence of the machining cycle on the quality of the measurements through the introduction of a bias on the signal; this phenomenon appears in particular in horizontal milling and is ignored in the majority of studies.
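
As a sketch of the kind of indicator described above (not the authors' exact processing chain), the RMS and power of the wavelet approximation can be computed with PyWavelets; the wavelet family, decomposition level and the synthetic signals are assumptions.

```python
import numpy as np
import pywt

def wear_indicators(vib_signal, wavelet='db4', level=4):
    """Illustrative indicators from a vibration record: RMS and power of the
    wavelet approximation at the chosen decomposition level."""
    coeffs = pywt.wavedec(vib_signal, wavelet, level=level)
    approx = coeffs[0]                      # approximation coefficients cA_level
    rms = np.sqrt(np.mean(approx ** 2))
    power = np.sum(approx ** 2) / len(approx)
    return rms, power

# Synthetic signals standing in for a fresh and a worn tool (higher low-frequency energy when worn).
t = np.arange(0, 1, 1 / 10_000)   # 1 s sampled at 10 kHz
fresh = 0.2 * np.sin(2 * np.pi * 50 * t) + 0.05 * np.random.default_rng(4).standard_normal(t.size)
worn = 0.8 * np.sin(2 * np.pi * 50 * t) + 0.05 * np.random.default_rng(5).standard_normal(t.size)

print("fresh tool:", wear_indicators(fresh))
print("worn tool: ", wear_indicators(worn))   # indicators grow as flank wear develops
```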

Keywords: flank wear, vibration, milling, signal processing, monitoring

Procedia PDF Downloads 593
11553 Analysis of Public Space Usage Characteristics Based on Computer Vision Technology - Taking Shaping Park as an Example

Authors: Guantao Bai

Abstract:

Public space is an indispensable and important component of the urban built environment. Evaluating the usage characteristics of public space more accurately can help improve its spatial quality. Compared to traditional survey methods, computer vision technology based on deep learning has advantages such as dynamic observation and low cost. This study takes the public space of Shaping Park as an example and, based on deep learning computer vision technology, processes and analyzes image data of the public space to obtain its usage characteristics and spatiotemporal characteristics. The research found that the timing of spontaneous activity in public spaces is relatively random, with a relatively short average activity time, while social activities have relatively stable timing and a longer average activity time. Computer vision technology based on deep learning can effectively describe the spatial usage characteristics of the research area, making up for the shortcomings of traditional research methods and providing relevant support for creating good public space.

Keywords: computer vision, deep learning, public spaces, using features

Procedia PDF Downloads 61
11552 Automatic Detection of Sugarcane Diseases: A Computer Vision-Based Approach

Authors: Himanshu Sharma, Karthik Kumar, Harish Kumar

Abstract:

The major problem in crop cultivation is the occurrence of multiple crop diseases. During the growth stage, timely identification of crop diseases is paramount to ensure high crop yield, lower production costs, and minimal pesticide usage. In most cases, crop diseases produce observable characteristics and symptoms. Surveyors usually diagnose crop diseases when they walk through the fields. However, surveyor inspections tend to be biased and error-prone due to the monotonous nature of the task and the subjectivity of individuals. In addition, visual inspection of each leaf or plant is costly, time-consuming, and labour-intensive. Furthermore, the plant pathologists and experts who can often identify a disease from its symptoms at an early stage are not readily available in remote regions. Therefore, this study specifically addressed early detection of the leaf scald, red rot, and eyespot diseases of sugarcane plants. The study proposes a computer vision-based approach using a convolutional neural network (CNN) for automatic identification of crop diseases. To facilitate this, images of sugarcane diseases were first taken from Google, without modifying the scene or background or controlling the illumination, to build the training dataset. The testing dataset was then developed from real-time images collected in sugarcane fields in India. The image dataset was pre-processed for feature extraction and selection. Finally, a CNN-based Visual Geometry Group (VGG) model was deployed on the training and testing datasets to classify the images into diseased and healthy sugarcane plants, and the model's performance was measured using various parameters, i.e., accuracy, sensitivity, specificity, and F1-score. The promising results of the proposed model lay the groundwork for automatic early detection of sugarcane disease. The proposed research directly sustains an increase in crop yield.
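
A hedged Keras sketch of a VGG16 transfer-learning classifier of the sort described; the class list, image size, directory layout and training schedule are assumptions rather than the paper's settings.

```python
import tensorflow as tf
from tensorflow.keras import layers, models
from tensorflow.keras.applications import VGG16

NUM_CLASSES = 4   # assumed: leaf scald, red rot, eyespot, healthy

base = VGG16(weights="imagenet", include_top=False, input_shape=(224, 224, 3))
base.trainable = False                      # freeze the pretrained convolutional features

model = models.Sequential([
    base,
    layers.GlobalAveragePooling2D(),
    layers.Dense(256, activation="relu"),
    layers.Dropout(0.5),
    layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam", loss="categorical_crossentropy", metrics=["accuracy"])

# Assumed directory layout: one sub-folder per class for training and test images.
train_ds = tf.keras.utils.image_dataset_from_directory(
    "sugarcane/train", image_size=(224, 224), label_mode="categorical", batch_size=32)
test_ds = tf.keras.utils.image_dataset_from_directory(
    "sugarcane/test", image_size=(224, 224), label_mode="categorical", batch_size=32)

model.fit(train_ds, validation_data=test_ds, epochs=10)
model.evaluate(test_ds)   # accuracy; sensitivity, specificity and F1 follow from the predictions
```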

Keywords: automatic classification, computer vision, convolutional neural network, image processing, sugarcane disease, visual geometry group

Procedia PDF Downloads 110
11551 Precious and Rare Metals in Overburden Carbonaceous Rocks: Methods of Extraction

Authors: Tatyana Alexandrova, Alexandr Alexandrov, Nadezhda Nikolaeva

Abstract:

The development of complex mineral resources is an urgent priority aimed at realizing processes for their ecologically safe exploitation; one of its components is revealing the influence of the forms of element compounds in the raw materials and in the processing products. In view of the depletion of precious metal reserves in traditional deposits, in the XXI century the large open-cast deposits localized in black shale strata have begun to play the leading role. Carbonaceous (black) shales carry a heightened metallogenic potential. Black shales with a high content of carbon are widely distributed within the Bureinsky massif. According to academician Hanchuk's data, the black shales of the Sutirskaya series contain PGEs generally in native form. The presence in crude ore of gold and PGE compounds that are highly absorptive towards carbonaceous matter results in a decrease in the extraction of valuable components because of their sorption onto dispersed carbonaceous matter.

Keywords: carbonaceous rocks, bitumens, precious metals, concentration, extraction

Procedia PDF Downloads 240
11550 The Systems Biology Verification Endeavor: Harness the Power of the Crowd to Address Computational and Biological Challenges

Authors: Stephanie Boue, Nicolas Sierro, Julia Hoeng, Manuel C. Peitsch

Abstract:

Systems biology relies on large numbers of data points and sophisticated methods to extract biologically meaningful signal and mechanistic understanding. For example, analyses of transcriptomics and proteomics data enable insights into the molecular differences in tissues exposed to diverse stimuli or test items. Whereas the interpretation of endpoints specifically measuring a mechanism is relatively straightforward, the interpretation of big data is more complex and would benefit from comparing results obtained with diverse analysis methods. The sbv IMPROVER project was created to implement solutions to verify systems biology data, methods, and conclusions. Computational challenges leveraging the wisdom of the crowd allow the benchmarking of methods for specific tasks, such as signature extraction and/or sample classification. Four challenges have already been successfully conducted and confirmed that the aggregation of predictions often leads to better results than individual predictions and that methods perform best in specific contexts. Whenever the scientific question of interest does not have a gold standard, but may greatly benefit from the scientific community coming together to discuss approaches and results, datathons are set up. The inaugural sbv IMPROVER datathon was held in Singapore on 23-24 September 2016. It allowed bioinformaticians and data scientists to consolidate their ideas and work on the most promising methods as teams, after having initially reflected on the problem on their own. The outcome is a set of visualization and analysis methods that will be shared with the scientific community via the Garuda platform, an open connectivity platform that provides a framework to navigate through different applications, databases and services in biology and medicine. We will present the results we obtained when analyzing data with our network-based method, and introduce a datathon that will take place in Japan to encourage the analysis of the same datasets with other methods to allow for the consolidation of conclusions.

Keywords: big data interpretation, datathon, systems toxicology, verification

Procedia PDF Downloads 271
11549 Big Data Analysis with RHadoop

Authors: Ji Eun Shin, Byung Ho Jung, Dong Hoon Lim

Abstract:

With traditional technologies, it is almost impossible to store or analyze big data, which is increasing exponentially. Hadoop is a new technology that makes this possible. The R programming language is by far the most popular statistical tool for big data analysis based on distributed processing with Hadoop. With RHadoop, which integrates the R and Hadoop environments, we implemented parallel multiple regression analysis with different sizes of actual data. Experimental results showed that our RHadoop system became much faster as the number of data nodes increased. We also compared the performance of our RHadoop with the lm function and the biglm package based on bigmemory. The results showed that our RHadoop was faster than the other packages owing to parallel processing, with the number of map tasks increasing as the size of the data increases.
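
The paper's implementation uses RHadoop; as a language-neutral illustration of the underlying map-reduce idea, here is a Python sketch in which each "node" contributes partial cross-product matrices and the reducer solves the normal equations. Node count, data sizes and the true coefficients are invented.

```python
import numpy as np

def map_partial(chunk_X, chunk_y):
    """Mapper: each data node emits its partial cross-product matrices."""
    return chunk_X.T @ chunk_X, chunk_X.T @ chunk_y

def reduce_and_solve(partials):
    """Reducer: sum the partial matrices and solve the normal equations."""
    XtX = sum(p[0] for p in partials)
    Xty = sum(p[1] for p in partials)
    return np.linalg.solve(XtX, Xty)

# Synthetic data split across four "nodes".
rng = np.random.default_rng(6)
beta_true = np.array([2.0, -1.0, 0.5])
chunks = []
for _ in range(4):
    X = np.column_stack([np.ones(50_000), rng.standard_normal((50_000, 2))])
    y = X @ beta_true + 0.1 * rng.standard_normal(50_000)
    chunks.append((X, y))

partials = [map_partial(X, y) for X, y in chunks]   # would run in parallel on Hadoop nodes
print(reduce_and_solve(partials))                   # close to [2.0, -1.0, 0.5]
```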

Keywords: big data, Hadoop, parallel regression analysis, R, RHadoop

Procedia PDF Downloads 427
11548 Human Motion Capture: New Innovations in the Field of Computer Vision

Authors: Najm Alotaibi

Abstract:

Human motion capture has become one of the major areas of interest in the field of computer vision. Some of the major application areas that have been rapidly evolving include advanced human interfaces, virtual reality and security/surveillance systems. This study provides a brief overview of the techniques and applications used for markerless human motion capture, which deals with analyzing human motion in the form of mathematical formulations. The major contribution of this research is that it classifies the computer vision based techniques of human motion capture according to a taxonomy, and then breaks it down into four systematically different categories: tracking, initialization, pose estimation and recognition. Detailed descriptions, and descriptions of the relationships between them, are given for the techniques of tracking and pose estimation. The subcategories of each process are further described. The various hypotheses used by researchers in this domain are surveyed, and the evolution of these techniques is explained. The survey concludes that most researchers have focused on using mathematical body models for markerless motion capture.

Keywords: human motion capture, computer vision, vision-based, tracking

Procedia PDF Downloads 308
11547 Analyzing the Significance of Religion in Economic Development in East and Southeast Asia: Case Study of the City of Wenzhou in China

Authors: Wenting Pan, Fang Chen

Abstract:

The aim is to increase understanding of the potential effects of religion on economic development in East and Southeast Asia. Religion as developed in East and Southeast Asia is intensively connected with the community, especially through the activities of women, and it can facilitate spiritual awakening in the community and economic empowerment. The theories were assessed using survey information for Wenzhou, the legendary city of Chinese economic development, measuring attendance at formal religious services, religious beliefs, and self-identification as religious. Wenzhou's chambers of commerce are found all over the world. Apart from large and small processing factories, Wenzhou is dotted with temples and Taoist temples. In the survey, four of the control variables (size of temples, profitability, multiple densities, type of industry and so on) were significant for finding a relationship between local people and the culture of local religion. Moreover, the role of women should be taken seriously into account. This study has socio-economic implications for Wenzhou as well as for a number of other countries in East and Southeast Asia.

Keywords: East and Southeast Asia, economy development, Religion, Wenzhou

Procedia PDF Downloads 308
11546 A Convolutional Deep Neural Network Approach for Skin Cancer Detection Using Skin Lesion Images

Authors: Firas Gerges, Frank Y. Shih

Abstract:

Malignant melanoma, known simply as melanoma, is a type of skin cancer that appears as a mole on the skin. It is critical to detect this cancer at an early stage because it can spread across the body and may lead to the patient's death. When detected early, melanoma is curable. In this paper, we propose a deep learning model (convolutional neural networks) in order to automatically classify skin lesion images as malignant or benign. Images underwent certain pre-processing steps to diminish the effect of the normal skin region on the model. The result of the proposed model showed a significant improvement over previous work, achieving an accuracy of 97%.
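
A minimal Keras sketch in the spirit of the described approach; the center-crop pre-processing stands in for the paper's (unspecified) steps for reducing the influence of the surrounding normal skin, and the architecture, input size and random smoke-test data are assumptions.

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models

def center_crop(image, fraction=0.8):
    """Assumed pre-processing: keep the central region of the lesion photograph so the
    surrounding normal skin contributes less to the model's decision."""
    h, w = image.shape[:2]
    ch, cw = int(h * fraction), int(w * fraction)
    top, left = (h - ch) // 2, (w - cw) // 2
    return image[top:top + ch, left:left + cw]

# Compact binary CNN (malignant vs. benign); architecture details are illustrative only.
model = models.Sequential([
    layers.Input(shape=(128, 128, 3)),
    layers.Rescaling(1.0 / 255),
    layers.Conv2D(32, 3, activation="relu"), layers.MaxPooling2D(),
    layers.Conv2D(64, 3, activation="relu"), layers.MaxPooling2D(),
    layers.Conv2D(128, 3, activation="relu"), layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(128, activation="relu"),
    layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# Smoke test with random arrays standing in for real dermoscopy images.
images = np.random.randint(0, 256, (8, 180, 180, 3)).astype("float32")
x = np.stack([tf.image.resize(center_crop(img), (128, 128)).numpy() for img in images])
y = np.random.randint(0, 2, 8)
model.fit(x, y, epochs=1, verbose=0)
```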

Keywords: deep learning, skin cancer, image processing, melanoma

Procedia PDF Downloads 132
11545 Use of Telehealth for Facilitating the Diagnostic Assessment of Autism Spectrum Disorder: A Scoping Review

Authors: Manahil Alfuraydan, Jodie Croxall, Lisa Hurt, Mike Kerr, Sinead Brophy

Abstract:

Autism Spectrum Disorder (ASD) is a developmental condition characterised by impairment in terms of social communication, social interaction, and a repetitive or restricted pattern of interest, behaviour, and activity. There is a significant delay between seeking help and a confirmed diagnosis of ASD. This may result in delay in receiving early intervention services, which are critical for positive outcomes. The long wait times also cause stress for the individuals and their families. Telehealth potentially offers a way of improving the diagnostic pathway for ASD. This review of the literature aims to examine which telehealth approaches have been used in the diagnosis and assessment of autism in children and adults, whether they are feasible and acceptable, and how they compare with face-to-face diagnosis and assessment methods. A comprehensive search was conducted of the following databases (MEDLINE, CINAHL Plus with Full Text, Business Source Complete, Web of Science, Scopus, PsycINFO) and of trial and systematic review databases including the Cochrane Library, Health Technology Assessment, Database of Abstracts and Reviews of Effectiveness and NHS Economic Evaluation, combining the terms autism and telehealth, from 2000 to 2018. A total of 10 studies were identified for inclusion in the review. This review of the literature found there to be two methods of using telehealth: (a) video conferencing to enable teams in different areas to consult with the families and to assess the child/adult in real time, and (b) a video upload to a web portal that enables the clinical assessment of behaviours in the family home. The findings were positive, with high agreement in terms of the diagnosis between remote methods and face-to-face methods and high levels of satisfaction among the families and clinicians. This field is in its very early stages, and so only studies with small sample sizes were identified, but the findings suggest that there is potential for telehealth methods to improve the assessment and diagnosis of autism when used in conjunction with existing methods, especially for those with clear autism traits and for adults with autism. Larger randomised controlled trials of this technology are warranted.

Keywords: assessment, autism spectrum disorder, diagnosis, telehealth

Procedia PDF Downloads 121
11544 The Implementation of Sexual and Reproductive Health Education Policy in Schools in Asia and Africa: A Scoping Review

Authors: Rhea Khosla, Victoria Tzortziou-Brown

Abstract:

Introduction: Adolescent SRH has been neglected since the start of the millennium. Adolescents comprise 16% of the global population, with the largest proportion living in Asia (650 million). By late adolescence, individuals in these regions are likely to become sexually active, and thus they must understand their SRH rights. Many lack knowledge of SRH and use unreliable sources for such information. Sex education is necessary to standardize and inform sexual knowledge, which empowers adolescents to make informed SRH decisions. School is an appropriate environment for this; however, SRH education requires effective policy to enforce it. Nonetheless, this issue remains of low political priority in Asia and Africa. Current literature on sex education policy in schools in these regions is scarce and tends to have broad aims. Thus, a scoping review was necessary. Methods: Literature searches were conducted in February 2023 using six databases, including grey literature databases (PubMed, Scopus, Embase, Web of Science, Google Scholar, Global Index Medicus), returning a total of 1537 unique articles. After screening titles, abstracts and full texts, 17 articles remained. The references of included articles were additionally searched, producing a further 7 articles, which then underwent thematic analysis. Results: Most countries in Africa and Asia did not have studies on this topic. Studies derived data from interviews with key stakeholders, and quantitative methods quantified questionnaire responses. Barriers were: policy/curriculum issues, societal opinions, teaching discomfort, and lack of educator training. Limitations were insufficient timing, inconsistent implementation, insufficient hours dedicated to teaching, education received late into schooling, and discrepancies between teachers, schools, and students about whether policies were being implemented. Discussion: Based on the existing limited evidence, a cultural shift to reduce stigma seems necessary, alongside teacher and student involvement in policy formulation, with effective implementation monitoring and educator training.

Keywords: adolescent, Africa, Asia, education, sexual and reproductive health, policy

Procedia PDF Downloads 40
11543 Performance Degradation for the GLR Test-Statistics for Spatial Signal Detection

Authors: Olesya Bolkhovskaya, Alexander Maltsev

Abstract:

Antenna arrays are widely used in modern radio systems, sonar and communications. The detection of a useful signal against a background of noise is based on the GLRT method. There is a large number of detection problems, which depend on the a priori information that is known. In this work, in contrast to the majority of already solved problems, only the difference in the spatial properties of the signal and noise is used for detection. We analyze the influence of the degree of non-coherence of the signal and of noise inhomogeneity on the performance characteristics of different GLRT statistics. The signal and noise are described by means of spatial covariance matrices C, for cases with different amounts of known information. The partially coherent signal is simulated as a plane wave with a random angle of incidence with respect to the normal. Background noise is simulated as a random process with a uniform distribution function in each element. The results of the investigation of the degradation of the performance characteristics for the different cases are presented in this work.
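
One common GLRT-type statistic for a rank-one plane-wave signal in white noise (the largest eigenvalue of the sample covariance over its trace) is sketched below in numpy for a uniform linear array; the array size, SNR and this particular statistic are illustrative assumptions, not the specific statistics analyzed in the paper.

```python
import numpy as np

rng = np.random.default_rng(7)
M, N, snr = 8, 64, 1.0          # array elements, snapshots, per-element SNR (assumed)

def glr_statistic(snapshots):
    """GLRT-type statistic for a rank-one (plane-wave) signal in white noise with
    unknown direction: largest eigenvalue of the sample covariance over its trace."""
    C_hat = snapshots @ snapshots.conj().T / snapshots.shape[1]
    eigvals = np.linalg.eigvalsh(C_hat)     # ascending, real for Hermitian C_hat
    return eigvals[-1] / eigvals.sum()

def simulate(signal_present):
    noise = (rng.standard_normal((M, N)) + 1j * rng.standard_normal((M, N))) / np.sqrt(2)
    if not signal_present:
        return noise
    theta = rng.uniform(-np.pi / 3, np.pi / 3)                 # random angle of incidence
    steering = np.exp(1j * np.pi * np.arange(M) * np.sin(theta))[:, None]
    signal = np.sqrt(snr) * steering * np.exp(1j * rng.uniform(0, 2 * np.pi, (1, N)))
    return signal + noise

t0 = np.array([glr_statistic(simulate(False)) for _ in range(2000)])
t1 = np.array([glr_statistic(simulate(True)) for _ in range(2000)])
threshold = np.quantile(t0, 0.99)           # fix the false-alarm rate at 1 %
print("detection probability:", float(np.mean(t1 > threshold)))
```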

Keywords: GLRT, Neumann-Pearson’s criterion, Test-statistics, degradation, spatial processing, multielement antenna array

Procedia PDF Downloads 377
11542 A Comparison of Bias Among Relaxed Divisor Methods Using 3 Bias Measurements

Authors: Sumachaya Harnsukworapanich, Tetsuo Ichimori

Abstract:

The apportionment method is used by many countries to calculate the distribution of seats in political bodies. For example, this method is used in the United States (U.S.) to distribute House seats proportionally based on the population of each electoral district. Famous apportionment methods include the divisor methods called the Adams method, Dean method, Hill method, Jefferson method and Webster method. Sometimes the results from the implementation of these divisor methods are unfair and include errors. Therefore, it is important to examine the optimization of these methods by using a bias measurement to arrive at precise and fair results. In this research, we investigate the bias of divisor methods in the U.S. House of Representatives toward large and small states by applying the Stolarsky mean method. We compare the bias of the apportionment methods by using two famous bias measurements: the Balinski and Young measurement and the Ernst measurement. Both measurements have a formula for large and small states. The third measurement, however, which was created by the researchers, did not factor the element of large and small states into the formula. All three measurements are compared, and the results show that our measurement produces results similar to the other two famous measurements.
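
A minimal Python sketch of the generic divisor (signpost) framework that these methods share; the Webster, Hill and Jefferson signposts are shown, and the Stolarsky-mean variants studied in the paper would simply substitute a different signpost function. The populations below are invented.

```python
import heapq
import math

def apportion(populations, seats, signpost):
    """Generic divisor method: repeatedly award the next seat to the state with the
    highest priority value population / signpost(current_seats)."""
    alloc = {state: 0 for state in populations}
    heap = [(-pop / signpost(0), state) for state, pop in populations.items()]
    heapq.heapify(heap)
    for _ in range(seats):
        _, state = heapq.heappop(heap)
        alloc[state] += 1
        heapq.heappush(heap, (-populations[state] / signpost(alloc[state]), state))
    return alloc

# Signposts d(s) governing the step from s to s+1 seats.
webster = lambda s: s + 0.5                         # arithmetic mean of s and s+1
hill = lambda s: math.sqrt(s * (s + 1)) or 1e-9     # geometric mean; 0 forces a first seat
jefferson = lambda s: s + 1.0

pops = {"A": 5_300_000, "B": 1_700_000, "C": 900_000, "D": 100_000}   # invented populations
for name, d in [("Webster", webster), ("Hill", hill), ("Jefferson", jefferson)]:
    print(name, apportion(pops, 10, d))
```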

Keywords: apportionment, bias, divisor, fair, measurement

Procedia PDF Downloads 360
11541 Enhancing Word Meaning Retrieval Using FastText and Natural Language Processing Techniques

Authors: Sankalp Devanand, Prateek Agasimani, Shamith V. S., Rohith Neeraje

Abstract:

Machine translation has witnessed significant advancements in recent years, but the translation of languages with distinct linguistic characteristics, such as English and Sanskrit, remains a challenging task. This research presents the development of a dedicated English-to-Sanskrit machine translation model, aiming to bridge the linguistic and cultural gap between these two languages. Using a variety of natural language processing (NLP) approaches, including FastText embeddings, this research proposes a thorough method to improve word meaning retrieval. Data preparation, part-of-speech tagging, dictionary searches, and transliteration are all included in the methodology. The study also addresses the implementation of an interpreter pattern and uses a word similarity task to assess the quality of word embeddings. The experimental outcomes show how the suggested approach may be used to enhance word meaning retrieval tasks with greater efficacy, accuracy, and adaptability. Evaluation of the model's performance is conducted through rigorous testing, comparing its output against existing machine translation systems. The assessment includes quantitative metrics such as BLEU scores, METEOR scores, Jaccard Similarity, etc.
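
A hedged sketch of FastText-based word-similarity retrieval using gensim on a toy corpus; the corpus, hyperparameters and query words are placeholders, not the project's English-Sanskrit data.

```python
from gensim.models import FastText

# Toy corpus standing in for the project's training data (illustrative only).
corpus = [
    ["the", "king", "protects", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["rama", "is", "a", "king"],
    ["sita", "is", "a", "queen"],
    ["dharma", "guides", "the", "king"],
]

model = FastText(sentences=corpus, vector_size=50, window=3, min_count=1, epochs=50, sg=1)

# Word-similarity queries of the kind used to assess embedding quality.
print(model.wv.similarity("king", "queen"))
print(model.wv.most_similar("king", topn=3))

# Because FastText builds vectors from character n-grams, even unseen inflected or
# transliterated forms (e.g. "kingly") still receive a usable vector.
print(model.wv.most_similar("kingly", topn=3))
```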

Keywords: machine translation, English to Sanskrit, natural language processing, word meaning retrieval, fastText embeddings

Procedia PDF Downloads 32
11540 Pedestrian Safe Bumper Design from Commingled Glass Fiber/Polypropylene Reinforced Sandwich Composites

Authors: L. Onal

Abstract:

The aim of this study is to optimize the manufacturing process for thermoplastic sandwich composite structures intended for the pedestrian safety of automobiles subjected to collision conditions. In particular, cost-effective manufacturing techniques for sandwich structures with commingled GF/PP skins and low-density foam cores are being investigated, and the performance of these structures under bending load is being studied. Samples are manufactured using the compression moulding technique. The relationship of this performance to processing parameters such as mould temperature, moulding time, moulding pressure and the sequence of the layers during moulding is being investigated. The results of the bending tests are discussed in the light of the moulding conditions, and conclusions are given regarding the optimum set of processing conditions for the compression moulding route.

Keywords: twintex, flexural properties, automobile composites, sandwich structures

Procedia PDF Downloads 422
11539 Circular Tool and Dynamic Approach to Grow the Entrepreneurship of Macroeconomic Metabolism

Authors: Maria Areias, Diogo Simões, Ana Figueiredo, Anishur Rahman, Filipa Figueiredo, João Nunes

Abstract:

It is expected that close to 7 billion people will live in urban areas by 2050. In order to improve the sustainability of territories and their transition towards a circular economy, it is necessary to understand their metabolism and to promote and guide the entrepreneurial response. The study of a macroeconomic metabolism involves the quantification of the inputs, outputs and storage of energy, water, materials and wastes for an urban region. This quantification and analysis represent an opportunity for the promotion of green entrepreneurship. There are several methods to assess the environmental impacts of an urban territory, such as human and environmental risk assessment (HERA), life cycle assessment (LCA), ecological footprint assessment (EF), material flow analysis (MFA), physical input-output tables (PIOT), ecological network analysis (ENA) and multicriteria decision analysis (MCDA), among others. However, no consensus exists about which of these assessment methods is best for analyzing the sustainability of such complex systems. Taking into account the weaknesses and needs identified, the CiiM - Circular Innovation Inter-Municipality project aims to define a uniform and globally accepted methodology, through the integration of various methodologies and dynamic approaches, to increase the efficiency of macroeconomic metabolisms and promote entrepreneurship in a circular economy. The pilot territory considered in the CiiM project has a total area of 969,428 ha and a total of 897,256 inhabitants (about 41% of the population of the Center Region). The main economic activities in the pilot territory, which contribute to a gross domestic product of 14.4 billion euros, are: social support activities for the elderly; construction of buildings; road transport of goods; retailing in supermarkets and hypermarkets; mass production of other garments; inpatient health facilities; and the manufacture of other components and accessories for motor vehicles. The region's business network consists mostly of micro and small companies (similar to the Central Region of Portugal), with a total of 53,708 companies identified in the CIM Region of Coimbra (39 large companies), 28,146 in the CIM Viseu Dão Lafões (22 large companies) and 24,953 in the CIM Beiras and Serra da Estrela (13 large companies). The construction of the database took into account data available from the National Institute of Statistics (INE), the General Directorate of Energy and Geology (DGEG), Eurostat, Pordata, the Strategy and Planning Office (GEP), the Portuguese Environment Agency (APA), the Commission for Coordination and Regional Development (CCDR) and the Inter-municipal Communities (CIM), as well as dedicated databases. In addition to the collection of statistical data, it was necessary to identify and characterize the different stakeholder groups in the pilot territory that are relevant to the different metabolism components under analysis. The CiiM project also adds the potential of a Geographic Information System (GIS), so that it is possible to obtain geospatial results for the territorial metabolisms (rural and urban) of the pilot region. This platform will be a powerful tool for visualizing the flows of products/services that occur within the region and will support the stakeholders, improving their circular performance and identifying new business ideas and symbiotic partnerships.

Keywords: circular economy tools, life cycle assessment macroeconomic metabolism, multicriteria decision analysis, decision support tools, circular entrepreneurship, industrial and regional symbiosis

Procedia PDF Downloads 90
11538 Signal Processing of Barkhausen Noise Signal for Assessment of Increasing Down Feed in Surface Ground Components with Poor Micro-Magnetic Response

Authors: Tanmaya Kumar Dash, Tarun Karamshetty, Soumitra Paul

Abstract:

The Barkhausen Noise Analysis (BNA) technique has been utilized to assess the surface integrity of steels. However, the BNA technique is not very successful in evaluating the surface integrity of ground steels that exhibit a poor micro-magnetic response. A new approach has been proposed for processing the BN signal with the Fast Fourier Transform, while the wavelet transform has been used to remove noise from the BN signal, with a judicious choice of the 'threshold' value, when the micro-magnetic response of the work material is poor. In the present study, the effect of the down feed induced during conventional plunge surface grinding of hardened bearing steel has been investigated, along with an ultrasonically cleaned, wet-polished sample and a sample ground with the spark-out technique for benchmarking. Moreover, the FFT analysis has been carried out at different sets of applied voltage and applied frequency, and the pattern of the BN signal in the frequency domain has been analyzed. The study also applies the wavelet transform technique with different levels of decomposition and different mother wavelets to reduce the noise in the BN signal of materials with a poor micro-magnetic response, in order to standardize the procedure for all BN signals depending on the frequency of the applied voltage.
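
For illustration, a PyWavelets sketch of soft-threshold wavelet denoising (using the common universal threshold rule) followed by an FFT of the cleaned signal; the mother wavelet, decomposition level, sampling rate and synthetic signal are assumptions rather than the study's settings.

```python
import numpy as np
import pywt

def denoise_and_spectrum(signal, fs, wavelet="db6", level=5):
    """Wavelet soft-threshold denoising (universal threshold) followed by an FFT of the
    cleaned signal; the study tunes the threshold choice rather than fixing this rule."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745          # noise scale from finest details
    thresh = sigma * np.sqrt(2 * np.log(len(signal)))
    coeffs = [coeffs[0]] + [pywt.threshold(c, thresh, mode="soft") for c in coeffs[1:]]
    clean = pywt.waverec(coeffs, wavelet)[:len(signal)]
    spectrum = np.abs(np.fft.rfft(clean))
    freqs = np.fft.rfftfreq(len(clean), d=1.0 / fs)
    return clean, freqs, spectrum

fs = 100_000                                                # assumed sampling rate (Hz)
t = np.arange(0, 0.05, 1 / fs)
raw = (0.3 * np.sin(2 * np.pi * 1_200 * t)                  # weak periodic BN-like component
       + 0.5 * np.random.default_rng(8).standard_normal(t.size))

clean, freqs, spectrum = denoise_and_spectrum(raw, fs)
print("dominant frequency (Hz):", freqs[np.argmax(spectrum[1:]) + 1])
```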

Keywords: barkhausen noise analysis, grinding, magnetic properties, signal processing, micro-magnetic response

Procedia PDF Downloads 661
11537 The Threshold Values of Soil Water Index for Landslides on Country Road No.89

Authors: Ji-Yuan Lin, Yu-Ming Liou, Yi-Ting Chen, Chen-Syuan Lin

Abstract:

The soil water index obtained from the tank model is now commonly used in soil and sand disaster alarm systems in Japan. Compared with the rainfall-triggering indices used in Taiwan, the tank model makes it easier to predict the slope water content for large-scale landslides. Therefore, this study aims to estimate the threshold values for large-scale landslides using the soil water index. Sixteen typhoon and heavy rainfall events were selected to establish the relationship between landslide events and the soil water index. Finally, the proposed threshold values for landslides on country road No. 89 are suggested in this study. The results show that 95% of landslide cases occurred when the soil water index was more than 125 mm, and 30% of the more serious slope failures occurred when the soil water index was greater than 250 mm. Besides, this study speculates that when the soil water index is more than 250 mm and the difference between the second and third tanks is less than -25 mm, a large-scale landslide is more probable.
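
A minimal Python sketch of a serial three-tank model of the kind referred to above, returning the soil water index (total storage) and the second-minus-third tank difference used in the proposed rule; all tank coefficients and the synthetic hyetograph are assumed, not Japan's operational parameters.

```python
import numpy as np

def soil_water_index(rain_mm_per_hr,
                     a=(0.10, 0.05, 0.01),    # side-outlet coefficients (assumed values)
                     b=(0.12, 0.05, 0.01),    # infiltration coefficients (assumed values)
                     h=(15.0, 60.0, 15.0)):   # outlet heights in mm (assumed values)
    """Serial three-tank model: the soil water index is the total storage of the three
    tanks; the second-minus-third tank difference is also returned for the proposed rule."""
    s = np.zeros(3)
    swi, diff23 = [], []
    for r in rain_mm_per_hr:
        s[0] += r
        for i in range(3):
            runoff = a[i] * max(s[i] - h[i], 0.0)   # lateral discharge above the outlet height
            infil = b[i] * s[i]                      # percolation to the tank below
            s[i] -= runoff + infil
            if i < 2:
                s[i + 1] += infil
        swi.append(s.sum())
        diff23.append(s[1] - s[2])
    return np.array(swi), np.array(diff23)

# A synthetic 72-hour typhoon hyetograph (mm/h).
rain = np.concatenate([np.zeros(12), np.full(24, 20.0), np.full(12, 5.0), np.zeros(24)])
swi, diff23 = soil_water_index(rain)
alarm = (swi > 250) & (diff23 < -25)                 # thresholds proposed in the abstract
print("peak SWI:", swi.max().round(1), "| hours in alarm:", int(alarm.sum()))
```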

Keywords: soil water index, tank model, landslide, threshold values

Procedia PDF Downloads 378
11536 Characterization of Shiga Toxin Escherichia coli Recovered from a Beef Processing Facility within Southern Ontario and Comparative Performance of Molecular Diagnostic Platforms

Authors: Jessica C. Bannon, Cleso M. Jordao Jr., Mohammad Melebari, Carlos Leon-Velarde, Roger Johnson, Keith Warriner

Abstract:

There has been an increased incidence of non-O157 Shiga toxin Escherichia coli (STEC), with six serotypes (the Top 6) being implicated in causing haemolytic uremic syndrome (HUS). Beef has been suggested to be a significant vehicle for non-O157 STEC, although conclusive evidence has yet to be obtained. The following study aimed to determine the prevalence of the Top 6 non-O157 STEC in beef processing using three different diagnostic platforms and then characterize the recovered isolates. Hide, carcass and environmental swab samples (n = 60) were collected from a beef processing facility over a 12-month period. Enriched samples were screened using the Biocontrol GDS, BAX or PALLgene molecular diagnostic tests. Presumptive non-O157 STEC positive samples were confirmed using conventional PCR and serology. STEC was detected by GDS (55% positive), BAX (85% positive), and PALLgene (93% positive). However, during confirmation testing, only 8 of the 60 samples (13%) were found to harbour STEC. Interestingly, the presence of virulence factors in the recovered isolates was unstable and readily lost during subsequent sub-culturing. There is a low prevalence of Top 6 non-O157 STEC associated with beef, although other serotypes are encountered. Yet the instability of the virulence factors in the recovered strains calls their clinical relevance into question.

Keywords: beef, food microbiology, shiga toxin, STEC

Procedia PDF Downloads 455