Search results for: face processing
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 6034

5824 Embedded System of Signal Processing on FPGA: Underwater Application Architecture

Authors: Abdelkader Elhanaoui, Mhamed Hadji, Rachid Skouri, Said Agounad

Abstract:

The purpose of this paper is to study the phenomenon of acoustic scattering using a new method. Signal processing (the Fast Fourier Transform FFT, the inverse Fast Fourier Transform iFFT, and Bessel functions) is widely applied to obtain information with high precision. Signal processing is most often implemented on general-purpose processors, which are not efficient for this task. Our interest therefore focused on FPGAs (Field-Programmable Gate Arrays) in order to reduce the computational burden of a single-processor architecture, accelerate the processing on the FPGA, and meet real-time and energy-efficiency requirements. We implemented the acoustic backscattered signal processing model on the Altera DE1-SoC board and compared it to the Odroid XU4. By comparison, the computing latency of the Odroid XU4 and the FPGA is 60 seconds and 3 seconds, respectively. The detailed SoC FPGA-based system computes acoustic spectra up to 20 times faster than the Odroid XU4 implementation. The FPGA-based implementation of the processing algorithms achieves an absolute error of about 10⁻³. This study underlines the increasing importance of embedded systems in underwater acoustics, especially in non-destructive testing, where information related to the detection and characterization of submerged shells can be obtained. We thus achieved good experimental results in terms of real-time performance and energy efficiency.
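The FFT-based spectral analysis described above can be sketched in a few lines. The snippet below is an illustrative Python model, not the authors' FPGA implementation: the echo signal, sampling rate, and resonance frequencies are invented, and a naive DFT stands in for the hardware FFT core.

```python
import cmath, math

def dft(x):
    """Naive discrete Fourier transform (stand-in for the FPGA FFT core)."""
    N = len(x)
    return [sum(x[n] * cmath.exp(-2j * math.pi * k * n / N) for n in range(N))
            for k in range(N)]

# Synthetic backscattered echo: two damped sinusoids (hypothetical resonances)
N = 128
fs = 1000.0  # sampling rate in Hz (illustrative)
t = [n / fs for n in range(N)]
echo = [math.exp(-20 * ti) * (math.sin(2 * math.pi * 100 * ti)
        + 0.5 * math.sin(2 * math.pi * 250 * ti)) for ti in t]

# A crude "form function": echo spectrum normalised by the incident spectrum
incident = [math.sin(2 * math.pi * 100 * ti) for ti in t]
E, I = dft(echo), dft(incident)
form = [abs(E[k]) / max(abs(I[k]), 1e-9) for k in range(N // 2)]
peak_bin = max(range(N // 2), key=lambda k: abs(E[k]))
print(peak_bin * fs / N)  # dominant resonance frequency, close to 100 Hz
```

A real pipeline would use a radix-2 FFT (what the FPGA accelerates) rather than this O(N²) DFT, but the spectral content extracted is the same.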

Keywords: DE1 FPGA, acoustic scattering, form function, signal processing, non-destructive testing

Procedia PDF Downloads 49
5823 Image Processing-Based Maize Disease Detection Using Mobile Application

Authors: Nathenal Thomas

Abstract:

Corn, also known as maize (scientific name Zea mays subsp.), is a widely produced agricultural product in the food chain and in many other agricultural sectors. Corn is highly adaptable: it comes in many different types, is employed in many different industrial processes, and tolerates a wide range of agro-climatic conditions. In Ethiopia, maize is among the most widely grown crops, and small-scale corn farming may be a household's only source of food. These facts show that the country's requirement for this crop is very high while, conversely, its productivity is very low for a variety of reasons. The most damaging factor contributing to this imbalance between supply and demand is corn disease, and the failure to diagnose diseases in maize plants until it is too late is one of the most important factors limiting crop output in Ethiopia. This study aids the early detection of such diseases and supports farmers during cultivation, directly affecting the amount of maize produced. Diseases of maize plants, such as northern leaf blight and cercospora leaf spot, have distinct, visible symptoms. This study aims to detect the most frequent and damaging maize diseases using deep learning, a widely used subset of machine learning, applied to image processing. Deep learning uses networks that can be trained from unlabeled data without supervision (unsupervised learning), simulating the processes the human brain goes through when digesting data. Its applications include speech recognition, language translation, object classification, and decision-making. The Convolutional Neural Network (CNN), also known as a ConvNet, is a deep learning architecture widely used for image classification, object detection, face recognition, and related problems. This study uses a CNN to detect maize diseases from photographs of maize leaves taken with a mobile phone.
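The convolution-activation-pooling pipeline at the heart of a CNN can be illustrated without any deep learning framework. The toy example below (a hypothetical "leaf patch" and a hand-picked edge kernel, both invented for illustration) sketches one such layer in plain Python.

```python
def conv2d(img, kernel):
    """Valid 2-D convolution: the core operation of a CNN layer."""
    kh, kw = len(kernel), len(kernel[0])
    h, w = len(img) - kh + 1, len(img[0]) - kw + 1
    return [[sum(img[i + a][j + b] * kernel[a][b]
                 for a in range(kh) for b in range(kw))
             for j in range(w)] for i in range(h)]

def relu(fmap):
    """Rectified linear activation applied element-wise."""
    return [[max(0.0, v) for v in row] for row in fmap]

def max_pool(fmap, s=2):
    """Non-overlapping s x s max pooling."""
    return [[max(fmap[i + a][j + b] for a in range(s) for b in range(s))
             for j in range(0, len(fmap[0]) - s + 1, s)]
            for i in range(0, len(fmap) - s + 1, s)]

# Toy 6x6 "leaf patch" with a bright vertical lesion in column 3
patch = [[1.0 if j == 3 else 0.0 for j in range(6)] for i in range(6)]
edge_kernel = [[-1, 0, 1], [-1, 0, 1], [-1, 0, 1]]  # vertical-edge detector
fmap = max_pool(relu(conv2d(patch, edge_kernel)))
print(fmap)  # the lesion edge activates the left half of the pooled map
```

A trained CNN stacks many such layers and learns the kernels from labeled leaf images instead of hand-picking them.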

Keywords: CNN, zea mays subsp, leaf blight, cercospora leaf spot

Procedia PDF Downloads 44
5822 General Status of Fruit Growing in the Sivasli District of Usak Province

Authors: Ayşen Melda Çolak, Volkan Okatan, Ercan Yıldız

Abstract:

In our country, fruit production was 17.2 million tons in 2011 according to official data. Turkey ranks first in the world in fig, apricot, cherry, and quince production. Although fruit is grown in almost all regions of the country, 54% of total fruit production occurs in the Mediterranean and Aegean regions. However, most fruit production is consumed in the domestic market, and export rates are often very low. In this study, a questionnaire was administered to 100 farmers in face-to-face interviews. According to the survey, 40% of the fruit growers farm small holdings of about 7 decares. Thirty percent of the producers have their soil tested and 20% have their water tested. The proportion of producers who fertilize deliberately is only 10%.

Keywords: fruit, generation, potential, Sivasli survey

Procedia PDF Downloads 228
5821 Teaching Tools for Web Processing Services

Authors: Rashid Javed, Hardy Lehmkuehler, Franz Josef-Behr

Abstract:

Web Processing Services (WPS) are of growing interest in geoinformation research. However, teaching about them is difficult because of the generally complex circumstances of their use, which limit the possibilities for hands-on exercises on Web Processing Services. To support understanding, a Training Tools Collection was therefore developed at the University of Applied Sciences Stuttgart (HFT). It is limited in scope to geostatistical interpolation of sample point data, for which different algorithms can be used, such as IDW and Nearest Neighbor. The Tools Collection aims to support understanding of the scope, definition, and deployment of Web Processing Services. For example, it is necessary to characterize the input of an interpolation by the data set, the parameters of the algorithm, and the interpolation results (here a grid of interpolated values is assumed). This paper reports on first experiences using a pilot installation, which was intended to find suitable software interfaces for later full implementations and to draw conclusions about potential user interface characteristics. Experiences were gained with the Deegree software, one of several service suites. Strictly programmed in Java, Deegree offers several OGC-compliant service implementations that also promise to benefit the project. The mentioned parameters of a WPS were formalized following the paradigm that any meaningful component is defined in terms of suitable standards; e.g., the data output can be defined as a GML file. However, the choice of meaningful information pieces and user interactions is not free but is partially determined by the selected WPS processing suite.
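The IDW algorithm mentioned above is simple enough to sketch directly. The following is a minimal Python illustration with invented sample points; a real WPS would wrap such a routine behind a standardized Execute request and return the grid as GML.

```python
def idw(points, q, power=2.0):
    """Inverse Distance Weighting: estimate the value at query point q
    from (x, y, value) samples, weighting by 1 / distance**power."""
    num = den = 0.0
    for (x, y, v) in points:
        d2 = (x - q[0]) ** 2 + (y - q[1]) ** 2
        if d2 == 0:
            return v  # query coincides with a sample exactly
        w = 1.0 / d2 ** (power / 2)
        num += w * v
        den += w
    return num / den

# Hypothetical sample points (x, y, measured value)
samples = [(0, 0, 10.0), (10, 0, 20.0), (0, 10, 30.0), (10, 10, 40.0)]

# Interpolate onto a coarse 3x3 grid of query points
grid = [[idw(samples, (x, y)) for x in range(0, 11, 5)] for y in range(0, 11, 5)]
print(grid[1][1])  # grid centre is equidistant from all samples: 25.0
```

The `power` parameter is exactly the kind of algorithm parameter the Tools Collection asks students to expose as a WPS input.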

Keywords: deegree, interpolation, IDW, web processing service (WPS)

Procedia PDF Downloads 329
5820 Recognition of Objects in a Maritime Environment Using a Combination of Pre- and Post-Processing of the Polynomial Fit Method

Authors: R. R. Hordijk, O. J. G. Somsen

Abstract:

Traditionally, radar systems are the eyes and ears of a ship. However, these systems have their drawbacks, and nowadays they are extended with systems that work with video and photos. Processing the data from these videos and photos is, however, very labour-intensive, and efforts are being made to automate this process. A major problem when trying to recognize objects in water is that the 'background' is not homogeneous, so traditional image recognition techniques do not work well. The main question is whether a method can be developed that automates this recognition process. A large number of parameters are involved in facilitating the identification of objects in such images. One is varying the resolution. In this research, the resolution of some images was reduced to the extreme value of 1% of the original to reduce clutter before the polynomial fit (pre-processing). It turned out that the searched object was clearly recognizable, as its grey value was well above the average. Another approach is to take two images of the same scene shortly after each other and compare the results. Because the water (waves) fluctuates much faster than an object floating in it, one can expect the object to be the only stable item in the two images. Both methods (pre-processing and comparing two images of the same scene) delivered useful results. Though it is too early to conclude that these methods can solve all image problems, they are certainly worthwhile for further research.
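The two-images-of-the-same-scene idea can be sketched as follows. The "frames" here are random noise standing in for fluctuating waves, with a constant bright patch standing in for the floating object; all values are invented for illustration.

```python
import random
random.seed(0)  # deterministic toy example

W = H = 32

def sea_frame(obj=None):
    """Random 'wave' background; optionally paint a stable 3x3 bright object."""
    f = [[random.random() for _ in range(W)] for _ in range(H)]
    if obj:
        r, c = obj
        for i in range(r, r + 3):
            for j in range(c, c + 3):
                f[i][j] = 1.0  # floating object: constant between frames
    return f

# Two frames of the same scene taken shortly after each other
a, b = sea_frame(obj=(10, 10)), sea_frame(obj=(10, 10))
diff = [[abs(a[i][j] - b[i][j]) for j in range(W)] for i in range(H)]

# Stable pixels (the object) have near-zero difference; the waves do not
stable = [(i, j) for i in range(H) for j in range(W) if diff[i][j] < 1e-6]
print(len(stable))
```

With real footage one would threshold the difference image and allow for small object drift between frames, but the principle is the same: the object is the stable region.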

Keywords: image processing, image recognition, polynomial fit, water

Procedia PDF Downloads 506
5819 Effective Teaching of Thermofluid Practical Courses during COVID-19

Authors: Opeyemi Fadipe, Masud Salimian

Abstract:

The COVID-19 pandemic has introduced a new normal into the world; online teaching is now the most widely used method of teaching, ahead of face-to-face meetings. With the emergence of this mode of teaching, online teaching has improved over time as more technological tools have been introduced. Practical courses are more demanding to teach because they require the physical presence of the student as well as a demonstration of the equipment. In this study, the thermofluid practical course at Lagos State University was the case under study. A survey was administered to a sample of students. The results showed that a blended approach is better for teaching practical courses. Software simulation of the equipment used to conduct the practicals should be encouraged in the future.

Keywords: COVID-19, online teaching, t-distribution, thermofluid

Procedia PDF Downloads 135
5818 Cold Flow Investigation of Silicon Carbide Cylindrical Filter Element

Authors: Mohammad Alhajeri

Abstract:

This paper reports a computational fluid dynamics (CFD) investigation of a cylindrical filter. Silicon carbide cylindrical filter elements have proven to be an effective means of removing particulates to levels exceeding the new source performance standard. The CFD code is used here to understand the deposition process and the factors that affect the particle distribution over the filter element surface. Different ratios of approach cross-flow velocity to filter face velocity and different face velocities (ranging from 2 to 5 cm/s) are used in this study. Particles in the diameter range of 1 to 100 microns are tracked through the domain. The radius of convergence (or the critical trajectory) is compared and plotted as a function of several parameters.
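Particle tracking of the kind described can be sketched with a toy Lagrangian model. The snippet below integrates one particle under Stokes drag in a uniform cross-flow plus face-suction field, using an exact exponential velocity update so small particles stay numerically stable; every constant is illustrative and none is taken from the paper.

```python
import math

def track(diameter_um, steps=5000, dt=1e-4):
    """Toy 2-D Lagrangian tracker: cross-flow along x, suction toward the
    filter face in -y. Stokes drag only; all constants are illustrative."""
    rho_p, mu = 2500.0, 1.8e-5           # particle density, gas viscosity (SI)
    d = diameter_um * 1e-6
    tau = rho_p * d * d / (18.0 * mu)    # particle relaxation time
    ug = (0.10, -0.03)                   # gas: 10 cm/s cross-flow, 3 cm/s face velocity
    x, y = 0.0, 0.0
    vx, vy = ug[0], 0.0                  # particle enters with the cross-flow
    decay = math.exp(-dt / tau)          # exact relaxation over one step
    for _ in range(steps):
        vx = ug[0] + (vx - ug[0]) * decay
        vy = ug[1] + (vy - ug[1]) * decay
        x += dt * vx
        y += dt * vy
        if y <= -0.01:                   # filter face located 1 cm below
            return x                     # deposition position along the face
    return None                          # left the domain without depositing

small, large = track(1.0), track(50.0)
print(small, large)  # inertia delays the 50-micron particle's turn to the face
```

Varying the cross-flow-to-face-velocity ratio in such a model is how one would map out a critical trajectory numerically.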

Keywords: filtration, CFD, CCF, hot gas filtration

Procedia PDF Downloads 437
5817 Resume Ranking Using Custom Word2vec and Rule-Based Natural Language Processing Techniques

Authors: Subodh Chandra Shakya, Rajendra Sapkota, Aakash Tamang, Shushant Pudasaini, Sujan Adhikari, Sajjan Adhikari

Abstract:

Many efforts have been made to measure the semantic similarity between text corpora in documents, and techniques have evolved to measure the similarity of two documents. One state-of-the-art technique in the field of Natural Language Processing (NLP) is the word-to-vector model, which converts words into word embeddings and measures the similarity between the vectors. We found this to be quite useful for the task of resume ranking. This research paper therefore implements the word2vec model, along with other Natural Language Processing techniques, to rank resumes against a particular job description so as to automate the process of hiring. The paper proposes the system and presents the findings made while building it.
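The core ranking step, comparing averaged word embeddings with cosine similarity, can be sketched as below. The word vectors are invented three-dimensional toys; a real system would train word2vec on a resume corpus and obtain vectors of a few hundred dimensions.

```python
import math

# Toy word vectors (hypothetical; stand-ins for trained word2vec embeddings)
vec = {
    "python":    [0.9, 0.1, 0.0],
    "django":    [0.8, 0.2, 0.1],
    "sales":     [0.0, 0.9, 0.2],
    "marketing": [0.1, 0.8, 0.3],
}

def doc_vector(words):
    """Average the embeddings of a document (resume or job description)."""
    dims = len(next(iter(vec.values())))
    known = [w for w in words if w in vec]
    out = [0.0] * dims
    for w in known:
        for d in range(dims):
            out[d] += vec[w][d] / len(known)
    return out

def cosine(a, b):
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

job = doc_vector(["python", "django"])
resumes = {"dev": ["python", "django"], "marketer": ["sales", "marketing"]}
ranked = sorted(resumes, key=lambda r: cosine(doc_vector(resumes[r]), job),
                reverse=True)
print(ranked)  # the developer resume ranks above the marketing one
```

Rule-based extraction (chunking out skills, experience, education) would feed the word lists into this similarity step.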

Keywords: chunking, document similarity, information extraction, natural language processing, word2vec, word embedding

Procedia PDF Downloads 128
5816 Crop Classification Using Unmanned Aerial Vehicle Images

Authors: Iqra Yaseen

Abstract:

Image processing, a well-known area of computer science and engineering in the context of computer vision, has been essential to automation. In remote sensing, medical science, and many other fields, it has made it easier to uncover previously undiscovered facts. Grading of diverse items is now possible because of neural network algorithms, categorization, and digital image processing. Its use in the classification of agricultural products, particularly in the grading of seeds or grains and their cultivars, is widely recognized. A grading and sorting system enables the preservation of time, consistency, and uniformity. Global population growth has led to an increase in demand for food staples, biofuel, and other agricultural products. To meet this demand, available resources must be used and managed more effectively. Image processing is growing rapidly in the field of agriculture, where many applications have been developed for crop identification and classification, land and disease detection, and for measuring other crop parameters. Vegetation localization, which identifies the areas where crops are present, is the basis for performing these tasks. The productivity of the agriculture industry can be increased via image processing based on Unmanned Aerial Vehicle (UAV) and satellite photography. In this paper, we apply machine learning techniques, including Convolutional Neural Networks, deep learning, image processing, classification, and You Only Look Once (YOLO), to a UAV imaging dataset to divide the crops into distinct groups and choose the best way to use them.
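The vegetation-localization step mentioned above is often done with a spectral index before any classifier runs. The sketch below thresholds a toy NDVI map (invented near-infrared and red reflectance values) to mask crop pixels; real UAV tiles would be thousands of pixels wide.

```python
def ndvi(nir, red):
    """Normalised Difference Vegetation Index for one pixel."""
    return (nir - red) / (nir + red) if (nir + red) else 0.0

# Toy 4x4 UAV tile of (NIR, red) reflectances; crop pixels have high NIR
tile = [
    [(0.6, 0.1), (0.6, 0.1), (0.2, 0.3), (0.2, 0.3)],
    [(0.7, 0.1), (0.6, 0.2), (0.2, 0.3), (0.1, 0.3)],
    [(0.2, 0.3), (0.2, 0.3), (0.6, 0.1), (0.7, 0.1)],
    [(0.1, 0.3), (0.2, 0.3), (0.6, 0.2), (0.6, 0.1)],
]

# Threshold NDVI to localise vegetation; 0.3 is a common rule-of-thumb cutoff
mask = [[ndvi(n, r) > 0.3 for (n, r) in row] for row in tile]
veg_pixels = sum(v for row in mask for v in row)
print(veg_pixels)  # number of pixels flagged as vegetation
```

A CNN or YOLO detector would then classify only the masked regions, saving computation on bare soil and water.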

Keywords: image processing, UAV, YOLO, CNN, deep learning, classification

Procedia PDF Downloads 64
5815 Optimizing Multimodal Teaching Strategies for Enhanced Engagement and Performance

Authors: Victor Milanes, Martha Hubertz

Abstract:

In the wake of COVID-19, all aspects of life were disrupted, and humanity was forced to shift toward a more technologically integrated mode of operation. Healthcare, business, and public policy are a few notable industries that initially depended on face-to-face modalities but have completely reimagined their style of operation. Education was particularly strained: academics, teachers, and professors alike were obligated to shift their curricula online over the course of a few weeks while also being expected to educate their students to a level similar to that accomplished pre-pandemic. This is notable because research indicates two key findings: students prefer face-to-face modalities, and the disruption in academic continuity and style had a negative impact on students' overall education and performance. With these two principles in mind, this study asks which online strategies teachers can best employ to educate their students, and which strategies can be adopted in a multimodal setting if deemed necessary by the instructor or by outside complicating factors (such as COVID-19, or a personal matter that demands the teacher's attention away from the classroom). Strategies and methods will be cross-analyzed via a ranking system derived from various recognized teaching assessments, in which engagement, retention, flexibility, interest, and performance are specifically accounted for. We expect to see an emphasis on positive social pressure as a dominant factor in an improved propensity for education, as well as a preference for visual aids across platforms, as research indicates most individuals are visual learners.

Keywords: technological integration, multimodal teaching, education, student engagement

Procedia PDF Downloads 31
5814 The Provision of a Safe Face-to-Face Teaching Program for Final Year Medical Students during the COVID-19 Pandemic

Authors: Rachel Byrne

Abstract:

Background: Due to patient and student safety concerns, combined with clinical teachers being redeployed to clinical practice, COVID-19 resulted in a reduction in face-to-face teaching sessions for medical students. Traditionally, such sessions are particularly important for final year medical students, especially in preparing for their final practical exams. A reduced student presence on the wards also resulted in fewer opportunities for junior doctors to provide teaching sessions. This has implications for junior doctors achieving their own curriculum outcomes for teaching, as well as potentially hindering the development of a future interest in medical education. Aims: The aims of the study are 1) To create a safe face-to-face teaching environment during COVID-19 focused on exam preparation for final year medical students, 2) To provide a platform for doctors to gain teaching experience, 3) To enable doctors to gain feedback or assessments on their teaching, and 4) To create a beginner's guide to designing a new teaching program for future junior doctors. Methods: We created a program of timed clinical stations consisting of four sessions every five weeks during the students' medicine attachment. Each session could be attended by 6 students and consisted of 6 stations run by junior doctors, with each station following social distancing and personal protective equipment requirements. Junior doctors were asked to design their own stations. The sessions ran out-of-hours on weekday evenings and were optional for the students. Results: 95/95 students and 20/40 doctors involved in the programme completed feedback. 100% (n=95) of students strongly agreed/agreed that sessions were aimed at an appropriate level and provided constructive feedback. 100% (n=95) of students stated they felt more confident in their abilities and would recommend the sessions to peers. 90% (n=18) of the teachers strongly agreed/agreed that they felt more confident in their teaching abilities and that the sessions had improved their own medical knowledge. 85% (n=17) of doctors had a teaching assessment completed, and 83% (n=16) said the program had made them consider a career in medical education. The difficulties of creating such a program were noted throughout, and a beginner's guide was created in the hope of helping future doctors who are interested in teaching to address the common obstacles.

Keywords: COVID-19, education, safety, medical

Procedia PDF Downloads 160
5813 Giant Achievements in Food Processing

Authors: Farnaz Amidi Fazli

Abstract:

After a long period of human experience with food processing, from eating food raw to the canning of the last century, it is now time to use novel technologies that are sometimes completely different from conventional ones. Food can be decontaminated without heat, or stored without a cold chain. Pulsed electric field (PEF) processing is a non-thermal method of food preservation that uses short bursts of electricity; PEF can be used for processing liquid and semi-liquid food products. PEF processing offers high-quality, fresh-like liquid foods with excellent flavor, nutritional value, and shelf-life. High pressure processing (HPP) technology has the potential to fulfill both consumer and scientific requirements. HPP has found applications in non-food industries for over 50 years; for food applications, 'high pressure' can generally be considered to be up to 600 MPa for most food products. Freezing, too, retains high potential for food preservation thanks to new quick-freezing methods. Foods prepared with this technology have greater acceptability and higher quality than those prepared with old-fashioned slow freezing. Quick freezing has therefore been adopted as a widespread commercial method for the long-term preservation of perishable foods, improving both health and convenience in the industrialised countries. These results are achieved by fluidised-bed freezing systems, immersion freezing, and hydrofluidisation, while new thawing methods, such as high-pressure, microwave, ohmic, and acoustic thawing, play a key role in the quality and adaptability of the final product.

Keywords: quick freezing, thawing, high pressure, pulsed electric field, hydrofluidisation

Procedia PDF Downloads 293
5812 Person-Centered Approaches in Face-to-Face Interventions to Support Enrolment in Cardiac Rehabilitation: A Scoping Review Study

Authors: Birgit Rasmussen, Thomas Maribo, Bente S. Toft

Abstract:

BACKGROUND: Cardiac rehabilitation is the standard treatment for ischemic heart disease. Cardiac rehabilitation improves quality of life, reduces mortality and the risk of readmission, and provides patients with valuable knowledge and encouragement from peers and staff. Still, less than half of eligible patients enroll. Face-to-face interventions have the potential to support patients' decision-making and increase enrolment in cardiac rehabilitation. However, we lack knowledge of the content and characteristics of such interventions. AIM: The aim was to outline and evaluate the content and characteristics of studies that have reported on face-to-face interventions to encourage enrolment in cardiac rehabilitation in patients with ischemic heart disease. METHOD: This scoping review followed the Joanna Briggs Institute methodology. Based on an a priori protocol that defined the systematic search criteria, six databases were searched for studies published between 2001 and 2023. Two reviewers independently screened and selected studies. All authors discussed the summarized data prior to the narrative presentation. RESULTS: After screening and full-text review of 5583 records, 20 studies of heterogeneous design and content were included. Four studies described the key contents of face-to-face interventions as education, support of autonomy, addressing reasons for change, and emotional and cognitive support while showing understanding. Two studies used motivational interviewing to target patients' experiences and address worries and anticipated difficulties. Four quantitative studies found associations between enrolment and intention to attend, cardiac rehabilitation barriers, exercise self-efficacy, and perceived control. When patients asked questions, enrolment rates were higher, while providing reassurance and optimism could lead to non-attendance if patients had a high degree of worry.
In qualitative studies, support to overcome barriers and knowledge about health benefits from participation in cardiac rehabilitation facilitated enrolment. Feeling reassured that the cardiac condition was good could lead to non-attendance. DISCUSSION AND CONCLUSION: To support patients' enrolment in cardiac rehabilitation, it is recommended that interventions integrate a person-centered dialogue. Individual worries and barriers to cardiac rehabilitation should be jointly explored. When talking with patients for whom worries predominate, the recommendation is to focus on the patients' perspectives and avoid too much focus on reassurance and problem-solving. The patients' perspectives, the mechanisms of change, and the process evaluation of the intervention including person-centeredness are relevant to include in future studies.

Keywords: ischemic heart disease, cardiac rehabilitation, enrolment, person-centered, in-hospital interventions

Procedia PDF Downloads 32
5811 Classifying Facial Expressions Based on a Motion Local Appearance Approach

Authors: Fabiola M. Villalobos-Castaldi, Nicolás C. Kemper, Esther Rojas-Krugger, Laura G. Ramírez-Sánchez

Abstract:

This paper presents classification results from exploring the combination of a motion-based approach with a local appearance method to describe the facial motion caused by the muscle contractions and expansions present in facial expressions. The proposed feature extraction method takes advantage of knowledge about which parts of the face reflect the greatest deformations, so we selected 4 specific facial regions to which the appearance descriptor was applied. The most commonly used approaches for feature extraction are holistic and local strategies. In this work, we present the results of using a local appearance approach, estimating the correlation coefficient between the 4 landmark-localized facial templates of the expression face and the corresponding templates of the neutral face. The results demonstrate how the proposed motion estimation scheme, based on local appearance correlation computation, can simply and intuitively measure the motion parameters for some of the most relevant facial regions, and how these parameters can be used to recognize facial expressions automatically.
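The region-wise correlation coefficient between expression and neutral templates can be computed as below. The intensity profiles are invented 1-D stand-ins for the landmark-localized facial-region templates; a real system would correlate 2-D patches.

```python
import math

def pearson(a, b):
    """Correlation coefficient between two flattened region templates."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = math.sqrt(sum((x - ma) ** 2 for x in a))
    sb = math.sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (sa * sb)

# Toy 1-D intensity profiles of a mouth region (hypothetical values)
neutral = [10, 12, 30, 30, 12, 10]
smile   = [10, 25, 14, 14, 25, 10]   # deformed region: low correlation
same    = [11, 13, 31, 31, 13, 11]   # nearly unchanged: high correlation

print(pearson(neutral, same), pearson(neutral, smile))
```

A low correlation in a region flags strong muscle deformation there, and the per-region pattern of correlations is the feature vector for expression classification.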

Keywords: facial expression recognition system, feature extraction, local-appearance method, motion-based approach

Procedia PDF Downloads 386
5810 Corpus-Based Neural Machine Translation: Empirical Study Multilingual Corpus for Machine Translation of Opaque Idioms - Cloud AutoML Platform

Authors: Khadija Refouh

Abstract:

Culture-bound expressions have been a bottleneck for Natural Language Processing (NLP) and comprehension, especially in the case of machine translation (MT). In the last decade, the field of machine translation has greatly advanced. Neural machine translation (NMT), which applies artificial intelligence and deep neural networks to language processing, has recently achieved considerable improvement in translation quality and outperforms previous traditional translation systems in many language pairs. Despite this development, serious challenges remain when NMT translates culture-bound expressions, especially for low-resource language pairs such as Arabic-English and Arabic-French, which is not the case with well-established language pairs such as English-French. Machine translation of opaque idioms from English into French is likely to be more accurate than translating them from English into Arabic. For example, Google Translate rendered the sentence "What a bad weather! It rains cats and dogs." into Arabic as "يا له من طقس سيء! تمطر القطط والكلاب", an inaccurate literal translation. The translation of the same sentence into French was "Quel mauvais temps! Il pleut des cordes.", where Google Translate used the accurate corresponding French idiom. This paper aims to perform NMT experiments toward better translation of opaque idioms using a high-quality, clean multilingual corpus collected analytically from human-generated idiom translations. AutoML Translation, a Google neural machine translation platform, is used as a custom translation model to improve the translation of opaque idioms. The automatic evaluation of the custom model is compared to Google NMT using the Bilingual Evaluation Understudy (BLEU) score. BLEU is an algorithm for evaluating the quality of text that has been machine-translated from one natural language to another. Human evaluation is integrated to test the reliability of the BLEU score. The researcher will examine syntactical, lexical, and semantic features using Halliday's functional theory.
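The BLEU metric used for automatic evaluation can be sketched as follows. This is a simplified single-reference version (clipped n-gram precisions up to bigrams, geometric mean, brevity penalty, no smoothing), not the exact variant of any particular toolkit; the French sentences are the idiom example from the abstract.

```python
import math
from collections import Counter

def bleu(candidate, reference, max_n=2):
    """Simplified BLEU: geometric mean of clipped n-gram precisions
    times a brevity penalty (single reference, no smoothing)."""
    precisions = []
    for n in range(1, max_n + 1):
        cand = Counter(tuple(candidate[i:i + n])
                       for i in range(len(candidate) - n + 1))
        ref = Counter(tuple(reference[i:i + n])
                      for i in range(len(reference) - n + 1))
        overlap = sum(min(c, ref[g]) for g, c in cand.items())  # clipped counts
        precisions.append(overlap / max(sum(cand.values()), 1))
    if min(precisions) == 0:
        return 0.0
    bp = 1.0 if len(candidate) > len(reference) else \
         math.exp(1 - len(reference) / len(candidate))
    return bp * math.exp(sum(math.log(p) for p in precisions) / max_n)

ref = "il pleut des cordes".split()           # idiomatic reference
good = "il pleut des cordes".split()          # matches the reference
literal = "il pleut des chats et des chiens".split()  # literal translation
print(bleu(good, ref), bleu(literal, ref))    # literal translation scores lower
```

Production evaluations use corpus-level BLEU with 4-gram precisions and smoothing, but the clipped-precision idea is the same.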

Keywords: multilingual corpora, natural language processing (NLP), neural machine translation (NMT), opaque idioms

Procedia PDF Downloads 106
5809 Developing a Quality Mentor Program: Creating Positive Change for Students in Enabling Programs

Authors: Bianca Price, Jennifer Stokes

Abstract:

Academic and social support systems are critical for students in enabling education; these support systems have the potential to enhance the student experience whilst also serving a vital role in student retention. In the context of international moves toward widening university participation, Australia has developed enabling programs designed to support underrepresented students in accessing higher education. The purpose of this study is to examine the effectiveness of a mentor program based within an enabling course. This study evaluates how the mentor program supports new students to develop social networks, improve retention, and increase satisfaction with the student experience. Guided by Social Learning Theory (SLT), this study highlights the benefits that can be achieved when students engage in peer-to-peer mentoring for both social and learning support. Whilst traditional peer mentoring programs are heavily based on face-to-face contact, the present study explores the difference between mentors who provide face-to-face mentoring and mentoring that takes place in a virtual space, specifically a virtual community in the shape of a Facebook group. This paper explores the differences between these two methods of mentoring within an enabling program. The first method involves traditional face-to-face mentoring provided by alumni students who willingly return to the learning community to give social support and guidance to new students. The second method requires alumni mentors to voluntarily join a Facebook group that is specifically designed for enabling students. Using this virtual space, alumni students provide advice, support, and social commentary on how to be successful within an enabling program. Whilst vastly different methods, both of these mentoring approaches provide students with the support tools needed to enhance their student experience and improve transition into university.
To evaluate the impact of each mode, this study uses mixed methods including a focus group with mentors, in-depth interviews, as well as engaging in netnography of the Facebook group ‘Wall’. Netnography is an innovative qualitative research method used to interpret information that is available online to better understand and identify the needs and influences that affect the users of the online space. Through examining the data, this research will reflect upon best practice for engaging students in enabling programs. Findings support the applicability of having both face-to-face and online mentoring available for students to assist enabling students to make a positive transition into University undergraduate studies.

Keywords: enabling education, mentoring, netnography, social learning theory

Procedia PDF Downloads 96
5808 Decision Making, Reward Processing and Response Selection

Authors: Benmansour Nassima, Benmansour Souheyla

Abstract:

The appropriate integration of reward processing and decision making provided by the environment is vital for behavioural success and individuals' well-being in everyday life. Functional neurological investigation has already provided an inclusive picture of affective and emotional (motivational) processing in the healthy human brain and has recently also focused on the assessment of brain function in anxious and depressed individuals. This article offers an overview of the theoretical approaches that relate emotion to decision-making and spotlights investigations of anxious or depressed individuals to reveal how emotions can interfere with decision-making. This research aims to incorporate the emotional structure, based on response and stimulation, into a Bayesian approach to decision-making in terms of probability and value processing. It seeks to show how studies of individuals with emotional dysfunctions bear out that alterations of decision-making can be considered in terms of altered probability and value subtraction. The ultimate objective is to determine critically whether the probabilistic representation of belief could be a useful approach for scrutinizing alterations in probability and value representation in subjects with anxiety and depression, and to outline the general implications of this approach.
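The "probability and value processing" view of decision-making can be made concrete with a toy expected-value computation. The gambles and the value_bias parameter below are purely illustrative, a crude stand-in for the altered value representation discussed above, not a model from the article.

```python
def expected_value(option, value_bias=1.0):
    """Sum of probability x subjective value over an option's outcomes.
    value_bias < 1 shrinks gains, crudely mimicking a blunted value
    representation (illustrative assumption, not an empirical model)."""
    return sum(p * (v * value_bias if v > 0 else v) for p, v in option)

# Two hypothetical gambles as (probability, value) outcome lists
safe  = [(1.0, 40.0)]                 # certain moderate reward
risky = [(0.5, 100.0), (0.5, -10.0)]  # gamble with higher expected value

def choose(options, value_bias=1.0):
    """Pick the option maximising (possibly biased) expected value."""
    return max(options, key=lambda k: expected_value(options[k], value_bias))

opts = {"safe": safe, "risky": risky}
print(choose(opts), choose(opts, value_bias=0.4))  # unbiased vs blunted gains
```

With unbiased values the gamble wins (45 vs 40); shrinking gains flips the preference to the safe option, the kind of shift the probability-and-value framework attributes to altered value processing.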

Keywords: decision-making, motivation, alteration, reward processing, response selection

Procedia PDF Downloads 443
5807 A Review of Travel Data Collection Methods

Authors: Muhammad Awais Shafique, Eiji Hato

Abstract:

Household trip data is of crucial importance for managing present transportation infrastructure as well as for planning and designing future facilities. It also provides the basis for new policies implemented under Transportation Demand Management. The methods used for household trip data collection have changed with the passage of time, starting with conventional face-to-face or paper-and-pencil interviews and arriving at the recent approach of employing smartphones. This study summarizes the step-wise evolution of travel data collection methods. It provides a comprehensive review of the topic for readers interested in the changing trends in the data collection field.

Keywords: computer, smartphone, telephone, travel survey

Procedia PDF Downloads 282
5806 Contour Defects of Face with Hyperpigmentation

Authors: Afzaal Bashir, Sunaina Afzaal

Abstract:

Background: Facial contour deformities associated with pigmentary changes are of major concern for plastic surgeons, being both important and difficult to treat. No single ideal treatment option is available to address the contour defect and the related pigmentation simultaneously. Objectives: The aim of the current study is to compare the long-term effects of conventional adipose tissue grafting and ex-vivo expanded mesenchymal stem cell-enriched adipose tissue grafting for the treatment of contour deformities with pigmentary changes on the face. Material and Methods: In this study, eighty (80) patients with contour deformities of the face with hyperpigmentation were recruited after informed consent. Two techniques, conventional fat grafting (C-FG) and fat grafts enriched with expanded adipose stem cells (FG-ASCs), were used to address the pigmentation. Both techniques were explained to patients, and enrolled patients were divided into two groups, C-FG and FG-ASCs, according to the patients’ choice and satisfaction. Patients in the FG-ASCs group were treated with fat grafts enriched with expanded adipose stem cells, while patients in the C-FG group were treated with conventional fat grafting. Patients were followed for 12 months, and improvement in face pigmentation was assessed clinically as well as measured objectively. Patient satisfaction was also documented as highly satisfied, satisfied, or unsatisfied. Results: The mean age of patients was 24.42 (±4.49) years, and 66 patients were female. The forehead was involved in 61.20% of cases, the cheek in 21.20%, the chin in 11.20%, and the nose in 6.20%. In the FG-ASCs group, the integrated color density (ICD) was decreased (1.08×10⁶±4.64×10⁵) as compared to the C-FG group (2.80×10⁵±1.69×10⁵). Patients treated with fat grafts enriched with expanded adipose stem cells were significantly more satisfied than patients treated with conventional fat grafting only. Conclusion: Mesenchymal stem cell-enriched autologous fat grafting is a preferred option for improving contour deformities related to increased pigmentation of the facial skin.

Keywords: hyperpigmentation, color density, enriched adipose tissue graft, fat grafting, contour deformities, Image J

Procedia PDF Downloads 69
5805 The Use of Image Processing Tools Applied to Analysing the Bouguer Gravity Anomaly Map (Tangier-Tetuan Area, Morocco)

Authors: Saad Bakkali

Abstract:

Image processing is a powerful tool for the enhancement of edges in images used in the interpretation of geophysical potential field data. Aerial and terrestrial gravimetric surveys were carried out in the region of Tangier-Tetuan. From the observed and measured gravity data, a Bouguer gravity anomaly map was prepared. This paper reports the results and interpretations of transformed Bouguer gravity anomaly maps of the Tangier-Tetuan area obtained using image processing. Filtering analysis based on classical image processing was applied, using operators such as logarithmic enhancement and gamma correction. The paper also presents the results of this analysis for edge enhancement of the Bouguer gravity anomaly map of the Tangier-Tetuan zone.
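As a rough illustration of the two operators named above (a sketch, not the authors' implementation), logarithmic enhancement and gamma correction can be applied to a gridded anomaly map as follows; the grid values are invented stand-ins:

```python
import numpy as np

def log_enhance(img):
    """Logarithmic enhancement: expands low-amplitude detail and
    compresses high amplitudes; output scaled to [0, 1]."""
    img = img.astype(float) - img.min()        # shift so minimum is 0
    return np.log1p(img) / np.log1p(img.max())

def gamma_correct(img, gamma=0.5):
    """Gamma correction on a normalized image: gamma < 1 brightens
    mid-tones (bringing out subtle edges), gamma > 1 darkens them."""
    img = img.astype(float)
    norm = (img - img.min()) / (img.max() - img.min())
    return norm ** gamma

# Invented stand-in for a gridded Bouguer anomaly map (values in mGal)
grid = np.linspace(-40.0, 60.0, 64).reshape(8, 8)
enhanced = gamma_correct(grid, gamma=0.5)
```

Both transforms are monotonic, so they reshape the intensity histogram to make gradients (edges) more visible without altering the spatial position of anomalies.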

Keywords: Bouguer, Tangier, filtering, gamma correction, logarithmic edge enhancement

Procedia PDF Downloads 397
5804 A Novel Method for Face Detection

Authors: H. Abas Nejad, A. R. Teymoori

Abstract:

Facial expression recognition is one of the open problems in computer vision. Robust neutral face recognition in real time is a major challenge for various supervised learning-based facial expression recognition methods. This is due to the fact that supervised methods cannot accommodate all appearance variability across faces with respect to race, pose, lighting, facial biases, etc., in the limited amount of training data. Moreover, processing each and every frame to classify emotions is not required, as the user stays neutral for the majority of the time in usual applications like video chat or photo album/web browsing. Detecting the neutral state at an early stage, and thereby bypassing those frames for emotion classification, saves computational power. In this work, we propose a lightweight neutral-vs-emotion classification engine, which acts as a preprocessor to traditional supervised emotion classification approaches. It dynamically learns neutral appearance at Key Emotion (KE) points using a textural statistical model constructed from a set of reference neutral frames for each user. The proposed method is made robust to various types of user head motion by accounting for affine distortions based on the textural statistical model. Robustness to dynamic shifts of KE points is achieved by evaluating similarities on a subset of neighborhood patches around each KE point, using prior information regarding the directionality of the specific facial action units acting on the respective KE point. As a result, the proposed method improves emotion recognition (ER) accuracy and simultaneously reduces the computational complexity of the ER system, as validated on multiple databases.

Keywords: neutral vs. emotion classification, Constrained Local Model, procrustes analysis, Local Binary Pattern Histogram, statistical model

Procedia PDF Downloads 318
5803 Theory about the Gebel El-Arak Knife: Egyptian Knife with Canaanite Relief

Authors: Doaa El-Shereef

Abstract:

This paper will focus on proving the theory that the Gebel el-Arak Knife is an Egyptian knife with a Canaanite relief. It will discuss the nature of the knife and the civilization to which it belongs, the relationship of the Canaanite deity with Mount Abydos in Egypt, and the mutual influence between the ancient Egyptian and Canaanite religions. Finally, the paper will discuss in full detail the engraving on the two faces of the knife handle, analyzing the register on the front face, which depicts two lions with an old man between them symbolizing the deity, and comparing it with similar depictions in Egyptian civilization. The paper will also discuss the registers on the back face, which depict a battle, soldiers, uniforms, and boats, and how the back face describes a water battle between three Egyptian boats and two foreign ships. In addition, it will argue that those foreign ships were not Mesopotamian, because in the period to which the Gebel el-Arak knife belongs, 3300-3100 BC, there were no battles or trade exchanges between Egypt and Mesopotamia by sea routes. However, there was already strong land and sea trade between Canaan and Egypt during the Chalcolithic Age (4500-3500 BC), as described in many primary sources.

Keywords: Canaan, Egypt, Gebel el-Arak Knife, Louvre

Procedia PDF Downloads 144
5802 Value Chain Analysis and Enhancement Added Value in Palm Oil Supply Chain

Authors: Juliza Hidayati, Sawarni Hasibuan

Abstract:

PT. XYZ is a manufacturing company that produces Crude Palm Oil (CPO). Competition in global markets is fierce, not only between companies but also between supply chains. This research aims to analyze the supply chain and value chain of Crude Palm Oil (CPO) in the company. The data analysis methods used are qualitative and quantitative analysis. The qualitative analysis describes the supply chain and value chain, while the quantitative analysis is used to determine the value added and how the value chain is established. Based on the analysis, the supply chain of crude palm oil (CPO) in the company consists of four main actors: suppliers of raw materials, processing, distributors, and customers. The value chain analysis covers two actors: the palm oil plantation and the palm oil processing plant. The palm oil plantation activities include nurseries, planting, plant maintenance, harvesting, and shipping. The palm oil processing plant activities include reception, sterilizing, threshing, pressing, and oil clarification. The value added of the palm oil plantation was 72.42% and that of the palm oil processing plant was 10.13%.

Keywords: palm oil, value chain, value added, supply chain

Procedia PDF Downloads 334
5801 One Step Further: Pull-Process-Push Data Processing

Authors: Romeo Botes, Imelda Smit

Abstract:

In today’s modern age of technology, vast amounts of data need to be processed in real time to keep users satisfied. This data comes from various sources and in many formats, including electronic and mobile devices such as GPRS modems and GPS devices, which use different protocols, including TCP, UDP, and HTTP/S, for data communication to web servers and eventually to users. The data obtained from these devices may provide valuable information to users, but is mostly in an unreadable format that needs to be processed to provide information and business intelligence. This data is not always current; it is mostly historical, and it is not subject to the consistency and redundancy measures that most other data usually is. Most important to the users is that the data be pre-processed into a readable format when it is entered into the database. To accomplish this, programmers build processing programs and scripts to decode and process the information stored in databases. Programmers use various techniques in such programs, but sometimes neglect the effect some of these techniques may have on database performance. One technique generally used is to pull data from the database server, process it, and push it back to the database server in one single step. Since the processing of the data usually takes some time, this keeps the database busy and locked for the period that the processing takes place, decreasing the overall performance of the database server and therefore of the system. This paper follows on a paper discussing the performance increase that may be achieved by utilizing array lists along with a pull-process-push data processing technique split into three steps. The purpose of this paper is to expand the number of clients when comparing the two techniques, to establish the impact this may have on CPU performance, storage, and processing time.
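The three-step split can be sketched as follows. This is a hypothetical illustration (the fetch/store helpers are invented stand-ins, not the paper's code): in the single-step variant, the decode work would run while the database connection and its locks are held, whereas here the database is touched only during the pull and the push:

```python
def pull(db_rows):
    """Step 1: copy raw records out of the database into an array list.
    The database lock can be released as soon as this returns."""
    return list(db_rows)

def process(raw_records):
    """Step 2: decode/transform entirely in memory; no database locks held."""
    return [r.strip().upper() for r in raw_records]

def push(processed, sink):
    """Step 3: write the readable records back in a single batch."""
    sink.extend(processed)
    return sink

raw = ["  gps:fix ", " gprs:ok "]   # stand-in for undecoded device data
out = push(process(pull(raw)), [])
# out == ["GPS:FIX", "GPRS:OK"]
```

The design choice is that the expensive middle step no longer blocks other clients of the database, at the cost of the memory needed to hold the array list between steps.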

Keywords: performance measures, algorithm techniques, data processing, push data, process data, array list

Procedia PDF Downloads 215
5800 Mechanism of Sinkhole Development on Water-Bearing Soft Ground Tunneling

Authors: H. J. Kim, K. H. Kim, N. H. Park, K. T. Nam, Y. H. Jung, T. H. Kim, J. H. Shin

Abstract:

Underground excavations in an urban area can cause various geotechnical problems such as ground loss and lowering of the groundwater level. When the ground loss becomes uncontrollably large, sinkholes can develop up to the ground surface. A sinkhole is commonly known as a natural phenomenon associated with limestone areas; however, sinkholes in urban areas due to pressurized sewers and/or tunneling are also frequently reported. In this study, the mechanism of a sinkhole that developed at site ‘A’, where tunneling work was underway, is investigated. The sinkhole occurred in sand strata with a high groundwater level during excavation of a tunnel 3.6 m in diameter. The sinkhole progressed in two steps. The first step began with a local failure around the tunnel face followed by a large inflow of groundwater, and the second step was triggered by the opening of the TBM (Tunnel Boring Machine) chamber, which led to progressive general failure. The possibility of the sinkhole was evaluated using the Limit Equilibrium Method (LEM), and the critical height was evaluated with an empirical stability chart. It is found that the lowering of the face pressure and the inflow of groundwater into the tunnel face were the main reasons for the sinkhole.

Keywords: limit equilibrium method, sinkhole, stability chart, tunneling

Procedia PDF Downloads 213
5799 Retrieving Iconometric Proportions of South Indian Sculptures Based on Statistical Analysis

Authors: M. Bagavandas

Abstract:

Introduction: South Indian stone sculptures are known for their elegance and history. They are available in large numbers in monuments situated in different parts of South India. These art pieces have been studied using iconographic details, but this pioneering study introduces a method known as iconometry, a quantitative approach that deals with measurements of different parts of icons to answer important open questions. The main aim of this paper is to compare iconometric measurements of the sculptures with canonical proportions to determine whether the sculptors of the past followed any of the canonical proportions prescribed in the ancient texts. If not, this study recovers the proportions used for carving sculptures, which are not otherwise available to us. It will also be interesting to see how the sculptural proportions of different monuments belonging to different dynasties differ from one another. Methods and Materials: As Indian sculptures are depicted in different postures, one way of making measurements independent of size is to decide on a suitable reference measurement and convert the other measurements into proportions with respect to it. Since in all canonical texts of Indian art the different measurements are given in terms of face length, face length was chosen as the reference for standardizing the measurements. In order to compare these facial measurements with those prescribed in Indian canons of iconography, the ten facial measurements (face length, morphological face length, nose length, nose-to-chin length, eye length, lip length, face breadth, nose breadth, eye breadth, and lip breadth) were standardized using the face length, reducing the number of measurements to nine. Each measurement was divided by the corresponding face length and multiplied by twelve, expressing it in the angula unit used in the canonical texts; the reason for multiplying by twelve is that the face length is given as twelve angulas in the canonical texts for all figures. Clustering techniques were used to determine whether the sculptors of the past followed any of the proportions prescribed in the canonical texts and to compare the proportions of sculptures of different monuments. One hundred and twenty-seven stone sculptures from four monuments belonging to the Pallava, Chola, Pandya, and Vijayanagar dynasties were taken up for this study. These art pieces belong to a period ranging from the eighth to the sixteenth century A.D., and all of them adorn monuments situated in different parts of Tamil Nadu State, South India. Anthropometric instruments were used for taking the measurements, and the author himself measured all the sample pieces of this study. Result: Statistical analysis of sculptures from different centers of art and different dynasties shows considerable differences in facial proportions, and many of these proportions differ widely from the canonical proportions. The retrieved facial proportions indicate that the definition of beauty has changed from period to period and region to region.
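The standardization step described in the abstract (each facial measurement divided by face length and scaled to the canonical twelve-angula face) can be sketched as follows; the sample measurements are invented for illustration:

```python
def to_angulas(measurements_cm, face_length_cm):
    """Express each measurement as a proportion of face length, scaled
    so that face length itself equals 12 angulas (the canonical unit)."""
    return {name: 12.0 * value / face_length_cm
            for name, value in measurements_cm.items()}

# Invented facial measurements (cm) for one hypothetical sculpture
sample = {"nose_length": 4.5, "eye_length": 3.0, "lip_length": 4.0}
angulas = to_angulas(sample, face_length_cm=18.0)
# e.g. nose_length -> 12 * 4.5 / 18 = 3.0 angulas
```

Because every value is divided by the same face length, sculptures of very different sizes become directly comparable, both with each other and with the canonical tables.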

Keywords: iconometry, proportions, sculptures, statistics

Procedia PDF Downloads 133
5798 Quantitative Analysis of Multiprocessor Architectures for Radar Signal Processing

Authors: Deepak Kumar, Debasish Deb, Reena Mamgain

Abstract:

Radar signal processing requires high number-crunching capability, which is most often achieved using a multiprocessor platform. Though a multiprocessor platform provides the capability to meet real-time computational challenges, the architecture, along with the mapping of the algorithm onto it, plays a vital role in using the platform efficiently. Towards this, along with standard performance metrics, a few additional metrics are defined that help in evaluating the multiprocessor platform together with the algorithm mapping. A generic multiprocessor architecture cannot suit all processing requirements; depending on the system requirements and the type of algorithms used, the most suitable architecture for the given problem is decided. In this paper, we study different architectures and quantify the different performance metrics, which enables comparison of the architectures on their merits. We also carried out case studies of different architectures and their efficiency, depending on whether parallelism is exploited in the algorithm, in the data, or in both.
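The standard metrics involved in such an evaluation, alongside the load-imbalance figure named in the keywords, can be computed from per-processor timings. A sketch with invented timings (the exact metric definitions used by the authors may differ):

```python
def speedup(t_serial, t_parallel):
    """How many times faster the parallel run is than the serial run."""
    return t_serial / t_parallel

def efficiency(t_serial, t_parallel, n_procs):
    """Speedup normalized by processor count; 1.0 is ideal scaling."""
    return speedup(t_serial, t_parallel) / n_procs

def load_imbalance(per_proc_times):
    """Ratio of the slowest processor's busy time to the mean busy time;
    1.0 means a perfectly balanced mapping."""
    mean = sum(per_proc_times) / len(per_proc_times)
    return max(per_proc_times) / mean

# Invented per-processor busy times (ms) for one algorithm mapping
times = [9.0, 10.0, 11.0, 10.0]
imbalance = load_imbalance(times)   # 11 / 10 = 1.1
```

Comparing such numbers across candidate architectures (pipeline, parallel, hybrid, cluster of processors) for the same algorithm is what makes the quantitative comparison possible.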

Keywords: radar signal processing, multiprocessor architecture, efficiency, load imbalance, buffer requirement, pipeline, parallel, hybrid, cluster of processors (COPs)

Procedia PDF Downloads 378
5797 Toward Cloud E-learning System Based on Smart Tools

Authors: Mohsen Maraoui

Abstract:

In the face of the growth in the quantity of data produced, several methods and techniques have emerged to remedy the problems of processing and analyzing large amounts of information, particularly in the field of teaching. In this paper, we propose an intelligent cloud-based teaching system for e-learning content services. This system simplifies the manipulation of various forms of educational content, including text, images, videos, three-dimensional objects, and virtual and augmented reality scenes. We discuss the integration of institutional and external services to provide personalized assistance to university members in their daily activities. The proposed system provides an intelligent solution for media services that can be accessed from smart devices in a cloud-based intelligent service environment within a fully integrated system.

Keywords: cloud computing, e-learning, indexation, IoT, learning in Arabic language, smart tools

Procedia PDF Downloads 102
5796 The Influence of Concreteness on English Compound Noun Processing: Modulation of Constituent Transparency

Authors: Turgut Coskun

Abstract:

The 'concreteness effect' refers to faster processing of concrete words, and 'compound facilitation' refers to faster responses to compounds. In this study, our main goal was to investigate the interaction between compound facilitation and the concreteness effect; the latter might modulate compound processing depending on the constituents’ transparency patterns. To evaluate this, we created lists of compound and monomorphemic words, sub-categorized them into concrete and abstract words, and further sub-categorized them by their transparency. The transparency conditions were opaque-opaque (OO), transparent-opaque (TO), and transparent-transparent (TT). We used reaction time (RT) data from the English Lexicon Project (ELP) for our comparisons. The results showed the importance of the concreteness factor (facilitation) in both compound and monomorphemic processing. Important for our present concern, separate analyses of concrete and abstract compounds revealed different patterns for OO, TO, and TT compounds. The concrete TT and TO conditions were processed faster than the concrete OO, abstract OO, and abstract TT conditions; however, they were not processed faster than abstract TO compounds. These results may reflect different representation patterns for concrete and abstract compounds.

Keywords: abstract word, compound representation, concrete word, constituent transparency, processing speed

Procedia PDF Downloads 162
5795 Arousal, Encoding, and Intrusive Memories

Authors: Hannah Gutmann, Rick Richardson, Richard Bryant

Abstract:

Intrusive memories following a traumatic event are not uncommon. However, in some individuals, these memories become maladaptive and lead to prolonged stress reactions. A seminal model of PTSD holds that aberrant processing during trauma may lead to prolonged stress reactions and intrusive memories: elevated arousal at the time of the trauma promotes data-driven processing, leading to fragmented and intrusive memories. This study investigated the role of elevated arousal in the development of intrusive memories. We measured salivary markers of arousal and investigated the impact this had on data-driven processing, memory fragmentation, and subsequently the development of intrusive memories. We assessed 100 healthy participants to understand their processing style, arousal, and experience of intrusive memories. Participants were randomized to a control or experimental condition, the latter of which was designed to increase their arousal. Based on current theory, participants in the experimental condition were expected to engage in more data-driven processing and to experience more intrusive memories than participants in the control condition. This research aims to shed light on the mechanisms underlying the development of intrusive memories, to illustrate ways in which therapeutic approaches for PTSD may be augmented for greater efficacy.

Keywords: stress, cortisol, SAA, PTSD, intrusive memories

Procedia PDF Downloads 162