Search results for: image manipulation
1614 Clinical Application of Measurement of Eyeball Movement for Diagnose of Autism
Authors: Ippei Torii, Kaoruko Ohtani, Takahito Niwa, Naohiro Ishii
Abstract:
This paper presents the development of an objective index based on the measurement of subtle eyeball movement for the diagnosis of autism. Assessments of developmental disabilities vary, and diagnosis depends on the subjective judgment of professionals. Therefore, a supplementary inspection method that enables anyone to obtain the same quantitative judgment is needed. In conventional autism studies, diagnoses are made by comparing the time spent gazing at an object, but the results are inconsistent. First, we divided the pupil into four parts from the center using measurements of subtle eyeball movement, comparing the number of pixels in the overlapping parts against an afterimage. Then we developed an objective evaluation indicator that distinguishes autistic from non-autistic people more clearly than conventional methods by analyzing the differences in subtle eyeball movements between the right and left eyes. Even when a person gazes at one point and the eyeballs stay fixed on that point, the eyes perform subtle fixational movements (i.e., tremors, drifts, microsaccades) to keep the retinal image clear. In particular, microsaccades are linked with nerves and reflect the mechanism by which the brain processes sight. We converted the differences between these movements into numbers. The conversion process is as follows: 1) Select the pixels representing the subject's pupil from the captured frame images. 2) Set up a reference image, called an afterimage, from those pupil pixels. 3) Divide the subject's pupil into four parts from the center in the acquired frame image. 4) In each divided part, count the number of pixels that overlap with the present pixels, based on the afterimage. 5) Process the images at 24-30 fps from a camera and convert the amount of change in the pixels of the subtle movements of the right and left eyeballs into numbers.
The difference in the area of change is obtained by measuring the difference between the afterimage of consecutive frames and the present frame. We take this amount of change as the quantity of subtle eyeball movement. This method makes it possible to express changes in eyeball vibration as numerical values. By comparing the values of the right and left eyes, we found a difference in how much they move. We compared this difference in movement between non-autistic and autistic people and analyzed the results. Our research subjects consisted of 8 children and 10 adults with autism, and 6 children and 18 adults with no disability. We measured the values during pursuit movements and fixations. We converted the difference in subtle movements between the right and left eyes into a graph and defined it as a multidimensional measure. We then set the identification border using the density function of the distribution, the cumulative frequency function, and the ROC curve. With this, we established an objective index to classify subjects as autistic, normal, false positive, or false negative.
Keywords: subtle eyeball movement, autism, microsaccade, pursuit eye movements, ROC curve
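As an illustration of steps 1-5 above, the per-quadrant pixel-difference count against an afterimage can be sketched as follows; the threshold value, frame size, and synthetic pupil images are illustrative assumptions, not the authors' actual parameters:

```python
import numpy as np

def quadrant_change_counts(reference, frame, threshold=30):
    """Count changed pixels per pupil quadrant between a reference
    image (the 'afterimage') and the current frame.

    reference, frame: 2-D grayscale arrays cropped to the pupil region.
    Returns four counts (top-left, top-right, bottom-left, bottom-right),
    with the quadrants taken from the image center.
    """
    diff = np.abs(frame.astype(int) - reference.astype(int)) > threshold
    h, w = diff.shape
    cy, cx = h // 2, w // 2
    quadrants = [diff[:cy, :cx], diff[:cy, cx:], diff[cy:, :cx], diff[cy:, cx:]]
    return [int(q.sum()) for q in quadrants]

# Synthetic example: a dark pupil that shifts one pixel to the right
ref = np.full((8, 8), 200, dtype=np.uint8)
ref[2:6, 2:6] = 20                  # pupil in the reference frame
cur = np.full((8, 8), 200, dtype=np.uint8)
cur[2:6, 3:7] = 20                  # pupil shifted right in the current frame
counts = quadrant_change_counts(ref, cur)
```

Repeating this per frame for the left and right eyes yields the two time series whose difference the paper converts into the multidimensional measure.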
Procedia PDF Downloads 280
1613 Platform Integration for High-Throughput Functional Screening Applications
Authors: Karolis Leonavičius, Dalius Kučiauskas, Dangiras Lukošius, Arnoldas Jasiūnas, Kostas Zdanys, Rokas Stanislovas, Emilis Gegevičius, Žana Kapustina, Juozas Nainys
Abstract:
Screening throughput is a common bottleneck in many research areas, including functional genomics, drug discovery, and directed evolution. High-throughput screening techniques can be classified into two main categories: (i) affinity-based screening and (ii) functional screening. The first relies on binding assays that provide information about the affinity of a test molecule for a target binding site. Binding assays are relatively easy to establish; however, they reveal no functional activity. In contrast, functional assays show an effect triggered by the interaction of a ligand at a target binding site. Functional assays might be based on a broad range of readouts, such as cell proliferation, reporter gene expression, downstream signaling, and other effects that are a consequence of ligand binding. Screening of large cell or gene libraries based on direct activity rather than binding affinity is now a preferred strategy in many areas of research, as functional assays more closely resemble the context where entities of interest are anticipated to act. Droplet sorting is the basis of high-throughput functional biological screening, yet its applicability is limited due to the technical complexity of integrating high-performance droplet analysis and manipulation systems. As a solution, the Droplet Genomics Styx platform enables custom droplet sorting workflows, which are necessary for the development of early-stage or complex biological therapeutics or industrially important biocatalysts. The poster will focus on the technical design considerations of Styx in the context of its application spectrum.
Keywords: functional screening, droplet microfluidics, droplet sorting, dielectrophoresis
Procedia PDF Downloads 137
1612 Field Emission Scanning Microscope Image Analysis for Porosity Characterization of Autoclaved Aerated Concrete
Authors: Venuka Kuruwita Arachchige Don, Mohamed Shaheen, Chris Goodier
Abstract:
Autoclaved aerated concrete (AAC) is known for its light weight, easy handling, high thermal insulation, and extremely porous structure. Investigation of pore behavior in AAC is crucial for characterizing the material, standardizing design and production techniques, enhancing mechanical, durability, and thermal performance, studying the effectiveness of protective measures, and analyzing the effects of weather conditions. The significant details of pores are difficult to observe with acceptable accuracy. High-resolution Field Emission Scanning Electron Microscope (FESEM) image analysis is a promising technique for investigating the pore behavior and density of AAC, and it is adopted in this study. A mercury intrusion porosimeter and a gas pycnometer were employed to characterize the porosity distribution and density parameters. The analysis considered three different densities of AAC blocks and three layers in the altitude direction within each block. A methodology is presented to extract and analyze the details of pore shape, pore size, pore connectivity, and pore percentages from FESEM images of AAC, and average pore behavior outcomes per unit area are reported. Comparison of porosity distribution and density parameters revealed significant variations. FESEM imaging offered unparalleled insights into porosity behavior, surpassing the capabilities of the other techniques. The multi-stage analysis provides the porosity percentage occupied by various pore categories, total porosity, variation of pore distribution with AAC density and layer, the number of two-dimensional and three-dimensional pores, variation of apparent and matrix densities with respect to pore behavior, variation of pore behavior with respect to aluminum content, and the relationship among shape, diameter, connectivity, and percentage in each pore classification.
Keywords: autoclaved aerated concrete, density, imaging technique, microstructure, porosity behavior
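The core pore-percentage step of such an image analysis can be sketched as a simple threshold-and-count over a grayscale micrograph, where pores appear as dark voids; the threshold value and the synthetic image below are illustrative assumptions, not the authors' calibration:

```python
import numpy as np

def porosity_percentage(gray, pore_threshold=60):
    """Estimate 2-D porosity as the percentage of pixels darker than a
    threshold (pores image as dark voids against the brighter matrix)."""
    pores = gray < pore_threshold
    return 100.0 * pores.sum() / pores.size

# Synthetic 10x10 micrograph: a 4x4 dark pore in a bright matrix
img = np.full((10, 10), 180, dtype=np.uint8)
img[3:7, 3:7] = 30
p = porosity_percentage(img)   # 16 pore pixels out of 100 -> 16.0 %
```

A full pipeline would additionally label connected pore regions to recover per-pore shape, diameter, and connectivity statistics.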
Procedia PDF Downloads 69
1611 New Approach for Constructing a Secure Biometric Database
Authors: A. Kebbeb, M. Mostefai, F. Benmerzoug, Y. Chahir
Abstract:
Multimodal biometric identification is the combination of several biometric systems. The challenge of this combination is to reduce some limitations of systems based on a single modality while significantly improving performance. In this paper, we propose a new approach to the construction and protection of a multimodal biometric database dedicated to an identification system. We use topological watermarking to hide the relation between the face image and the registered descriptors extracted from other modalities of the same person, for more secure user identification.
Keywords: biometric databases, multimodal biometrics, security authentication, digital watermarking
Procedia PDF Downloads 391
1610 Antecedents and Consequences of Organizational Intelligence in an R and D Organization
Authors: Akriti Srivastava, Soumi Awasthy
Abstract:
One of the disciplines that has provoked increased interest in the importance of intelligence is the management and organization development literature. Organizational intelligence (OI) is a key enabling force underlying many vital activities and processes that dominate organizational life. Hence, the factors that lead to organizational intelligence, and the results that follow from it, are important to understand alongside OI itself. The focus of this research was to uncover potential antecedents and consequences of organizational intelligence; thus, a non-experimental explanatory survey research design was used. A non-experimental design is one in which variables are not manipulated and samples are not randomized. The data were collected by questionnaire from 321 scientists from different laboratories of an R & D organization, of which 304 responses were found suitable for analysis. There were 194 males (age, M = 35.03, SD = 7.63) and 110 females (age, M = 34.34, SD = 8.44). This study tested a conceptual model linking antecedent variables (leadership and organizational culture) to organizational intelligence, followed by organizational innovation capability and organizational performance. Structural equation modeling techniques were used to analyze the hypothesized model. Before that, a confirmatory factor analysis of the organizational intelligence scale was conducted, which resulted in an insignificant model. An exploratory factor analysis was then performed, which yielded six factors for the organizational intelligence scale; this structure was used throughout the study. The final analysis revealed a relatively good fit of the data to the hypothesized model with certain modifications. Leadership and organizational culture emerged as significant antecedents of organizational intelligence.
Organizational innovation capability and organizational performance emerged as the consequent factors of organizational intelligence. However, organizational intelligence did not predict organizational performance via organizational innovation capability; instead, an additional significant pathway emerged between leadership and organizational performance. The model offers a fresh and comprehensive view of organizational intelligence. In this study, prior studies in the related literature were reviewed to offer a basic framework of organizational intelligence. The study proved to be beneficial for organizational intelligence scholarship, given its importance in the competitive environment.
Keywords: leadership, organizational culture, organizational intelligence, organizational innovational capability
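The factor-retention step of an exploratory factor analysis like the one described can be sketched with the Kaiser criterion (retain factors whose correlation-matrix eigenvalues exceed one); the abstract does not state which retention rule the authors used, so this rule and the simulated six-item data are illustrative assumptions:

```python
import numpy as np

def n_factors_kaiser(data):
    """Number of factors to retain under the Kaiser criterion:
    eigenvalues of the item correlation matrix greater than 1."""
    corr = np.corrcoef(data, rowvar=False)
    eigvals = np.linalg.eigvalsh(corr)
    return int((eigvals > 1.0).sum())

rng = np.random.default_rng(0)
# Two latent factors driving six observed items, plus noise
latent = rng.normal(size=(300, 2))
loadings = np.array([[0.9, 0.0], [0.8, 0.1], [0.9, 0.0],
                     [0.0, 0.9], [0.1, 0.8], [0.0, 0.9]])
items = latent @ loadings.T + 0.3 * rng.normal(size=(300, 6))
k = n_factors_kaiser(items)    # recovers the two simulated factors
```

In practice a scree plot or parallel analysis would be inspected alongside this count before fixing the factor structure.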
Procedia PDF Downloads 344
1609 Ill-Posed Inverse Problems in Molecular Imaging
Authors: Ranadhir Roy
Abstract:
Inverse problems arise in medical (molecular) imaging. These problems are characterized by their large scale in three dimensions and by the diffusion equation, which models the physical phenomena within the media. The inverse problem is posed as a nonlinear optimization in which the unknown parameters are found by minimizing the difference between the predicted data and the measured data. To obtain a unique and stable solution to an ill-posed inverse problem, a priori information must be used. Mathematical conditions for obtaining stable solutions are established in Tikhonov's regularization method, where the a priori information is introduced via a stabilizing functional, which may be designed to incorporate relevant information about the inverse problem. Effective determination of the Tikhonov regularization parameter requires knowledge of the true solution or, in the case of optical imaging, the true image. Yet in clinically based imaging, the true image is not known. To alleviate these difficulties, we have applied the penalty/modified barrier function (PMBF) method instead of the Tikhonov regularization technique to make the inverse problem well-posed. Unlike the Tikhonov regularization method, a constrained optimization technique based on simple bounds on the optical properties of the tissue can easily be implemented in the PMBF method. Imposing the constraints on the optical properties of the tissue explicitly restricts the solution set and can restore uniqueness. Like the Tikhonov regularization method, the PMBF method limits the size of the condition number of the Hessian matrix of the given objective function. The accuracy and rapid convergence of the PMBF method require a good initial guess of the Lagrange multipliers. To obtain this initial guess, we solve a least-squares unconstrained minimization problem.
Three-dimensional images of fluorescence absorption coefficients and lifetimes were reconstructed from contact and noncontact experimentally measured data.
Keywords: constrained minimization, ill-conditioned inverse problems, Tikhonov regularization method, penalty modified barrier function method
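The Tikhonov baseline that the PMBF method is compared against can be sketched on a tiny linear problem: the regularized normal equations remain solvable even when the unregularized system is nearly singular. The forward model and noise level below are illustrative assumptions:

```python
import numpy as np

def tikhonov_solve(A, b, lam):
    """Solve min ||Ax - b||^2 + lam * ||x||^2 via the normal equations
    (A^T A + lam I) x = A^T b, which are well-conditioned for lam > 0."""
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ b)

# An ill-conditioned forward model: nearly collinear columns
A = np.array([[1.0, 1.0], [1.0, 1.0001], [1.0, 0.9999]])
x_true = np.array([1.0, 2.0])
b = A @ x_true + 1e-4 * np.array([1.0, -1.0, 0.5])   # noisy data
x_reg = tikhonov_solve(A, b, lam=1e-3)
```

The regularized solution is finite and fits the data well, though the penalty biases it toward small norm; the PMBF method replaces this norm penalty with explicit bound constraints on the parameters.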
Procedia PDF Downloads 271
1608 Image-Based UAV Vertical Distance and Velocity Estimation Algorithm during the Vertical Landing Phase Using Low-Resolution Images
Authors: Seyed-Yaser Nabavi-Chashmi, Davood Asadi, Karim Ahmadi, Eren Demir
Abstract:
The landing phase of a UAV is very critical, as there are many uncertainties in this phase that can easily entail a hard landing or even a crash. In this paper, the estimation of relative distance and velocity to the ground, one of the most important processes during the landing phase, is studied. Using accurate measurement sensors as an alternative approach can be very expensive, for sensors like LIDAR, or limited in operational range, for sensors like ultrasonic sensors. Additionally, absolute positioning systems like GPS or IMU cannot provide the distance to the ground independently. The focus of this paper is to determine whether the relative distance and velocity between the UAV and the ground can be measured in the landing phase using only low-resolution images taken by a monocular camera. The Lucas-Kanade feature detection technique is employed to extract the most suitable features in a series of images taken during the UAV landing. Two different approaches based on Extended Kalman Filters (EKF) have been proposed, and their performance in estimating the relative distance and velocity is compared. The first approach uses the kinematics of the UAV as the process model and the calculated optical flow as the measurement; the second approach uses the feature's projection on the camera plane (pixel position) as the measurement, while employing both the kinematics of the UAV and the dynamics of variation of the projected point as the process model, to estimate both relative distance and relative velocity. To verify the results, a sequence of low-quality images taken by a camera moving on a specifically developed testbed has been used to compare the performance of the proposed algorithms. The case studies show that the quality of the images results in considerable noise, which reduces the performance of the first approach.
On the other hand, using the projected feature position is much less sensitive to the noise and estimates the distance and velocity with relatively high accuracy. This approach can also be used to predict the future projected feature position, which can drastically decrease the computational workload, an important criterion for real-time applications.
Keywords: altitude estimation, drone, image processing, trajectory planning
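The state-estimation core of such an approach can be illustrated with a simplified linear constant-velocity Kalman filter on scalar distance measurements; the paper's actual method is an EKF driven by projected feature positions, so this is only a reduced sketch with assumed noise parameters and simulated data:

```python
import numpy as np

def kalman_track(zs, dt=1.0, q=1e-3, r=0.25):
    """Constant-velocity Kalman filter: estimate distance and vertical
    velocity from a sequence of noisy scalar distance measurements."""
    F = np.array([[1.0, dt], [0.0, 1.0]])   # state transition
    H = np.array([[1.0, 0.0]])              # only distance is measured
    Q = q * np.eye(2)
    R = np.array([[r]])
    x = np.array([zs[0], 0.0])              # initial [distance, velocity]
    P = np.eye(2)
    for z in zs[1:]:
        x = F @ x                           # predict
        P = F @ P @ F.T + Q
        S = H @ P @ H.T + R                 # innovation covariance
        K = P @ H.T @ np.linalg.inv(S)      # Kalman gain
        x = x + K @ (np.array([z]) - H @ x) # update
        P = (np.eye(2) - K @ H) @ P
    return x

rng = np.random.default_rng(1)
true_d = 50.0 - 0.8 * np.arange(60)         # descending at 0.8 m/s
zs = true_d + 0.5 * rng.normal(size=60)     # noisy measurements
d_est, v_est = kalman_track(zs)
```

Even though velocity is never measured directly, the filter recovers it from the trend of the distance measurements, mirroring how the EKF extracts relative velocity from feature motion.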
Procedia PDF Downloads 113
1607 Metabolic Manipulation as a Strategy for Optimization of Biomass Productivity and Oil Content in the Microalgae Desmodesmus Sp.
Authors: Ivan A. Sandoval Salazar, Silvia F. Valderrama
Abstract:
Microalgae oil emerges as a promising source of raw material for many industrial applications. Thus, this study focused on cultivating the microalgae species Desmodesmus sp. at laboratory scale with a view to maximizing biomass production and the triglyceride content of the lipid fraction. Initially, culture conditions were selected to optimize biomass production, after which the culture was subjected to nutritional stress by varying nitrate and phosphate concentrations in order to increase the content and productivity of fatty acids. The culture medium BOLD 3N, nitrate and phosphate concentrations, light intensities of 250, 500, and 1000 μmol photons·m⁻²·s⁻¹, and a 12:12 photoperiod were evaluated. Under the best test conditions, a maximum cell division rate of 1.13 div·day⁻¹ was obtained on the sixth day of culture, at the beginning of the exponential phase, and a maximum concentration of 8.42×10⁷ cells·mL⁻¹ and dry biomass of 3.49 g·L⁻¹ on the 20th day, in the stationary phase. The lipid content in the first stage of culture was approximately 8% after 12 days and, at the end of the culture in the stationary phase (20 days), ranged from 12% to 16%. In the microalgae grown at 250 μmol photons·m⁻²·s⁻¹, the fatty acid profile was mostly polyunsaturated (52%). The total unsaturated fatty acids identified in this microalga species reached values between 70 and 75%, qualifying the oil for use in the food and pharmaceutical industries. In addition, this study showed that the cultivation conditions mainly influenced the production of polyunsaturated fatty acids, with a predominance of γ-linolenic acid.
However, in the cultures submitted to the highest light intensity (1000 μmol photons·m⁻²·s⁻¹) and low concentrations of nitrate and phosphate, mainly saturated and monounsaturated fatty acids (60 to 70%), which present greater oxidative stability, were identified, qualifying the oil for the production of biodiesel and for oleochemistry.
Keywords: microalgae, Desmodesmus sp, fatty acids, biodiesel
Procedia PDF Downloads 150
1606 Hand Gesture Detection via EmguCV Canny Pruning
Authors: N. N. Mosola, S. J. Molete, L. S. Masoebe, M. Letsae
Abstract:
Hand gesture recognition is a technique used to locate, detect, and recognize a hand gesture. Detection and recognition are concepts of Artificial Intelligence (AI). AI concepts are applicable in Human-Computer Interaction (HCI), Expert Systems (ES), etc. Hand gesture recognition can be used in sign language interpretation. Sign language is a visual communication tool used mostly by deaf communities and people with speech disorders. Communication barriers exist when people with speech disorders interact with others. This research aims to build a hand recognition system for interpretation between Lesotho's Sesotho and English. The system will help to bridge the communication problems encountered by these communities. The system has various processing modules: a hand detection engine, an image processing engine, feature extraction, and sign recognition. Detection is the process of identifying an object. The proposed system uses the Canny pruning Haar and Haar cascade detection algorithms. Canny pruning implements Canny edge detection, an optimal image processing algorithm used to detect the edges of an object. The system also employs a skin detection algorithm, which performs background subtraction and computes the convex hull and centroid to assist in the detection process. Recognition is the process of gesture classification; template matching classifies each hand gesture in real time. The system was tested in various experiments. The results show that time, distance, and light are factors that affect the rate of detection and, ultimately, recognition. The detection rate is directly proportional to the distance of the hand from the camera. Different lighting conditions were considered: the higher the light intensity, the faster the detection rate.
Based on the results obtained from this research, the applied methodologies are efficient and provide a plausible solution towards a lightweight, inexpensive system that can be used for sign language interpretation.
Keywords: canny pruning, hand recognition, machine learning, skin tracking
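The gradient stage that Canny edge detection (and hence Canny pruning) builds on can be sketched with a Sobel operator; the full Canny pipeline adds Gaussian smoothing, non-maximum suppression, and hysteresis thresholding, which are omitted here, and the threshold and test image are illustrative assumptions:

```python
import numpy as np

def sobel_edges(gray, threshold=100):
    """Gradient-magnitude edge map: the gradient-computation step that
    Canny edge detection is built on (smoothing, non-maximum suppression,
    and hysteresis are omitted for brevity)."""
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
    ky = kx.T
    h, w = gray.shape
    g = gray.astype(float)
    mag = np.zeros((h, w))
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            win = g[i - 1:i + 2, j - 1:j + 2]
            mag[i, j] = np.hypot((win * kx).sum(), (win * ky).sum())
    return mag > threshold

# A vertical dark/bright boundary: edges appear along the middle columns
img = np.zeros((8, 8), dtype=np.uint8)
img[:, 4:] = 255
edges = sobel_edges(img)
```

In the actual system, such an edge map prunes the regions the Haar cascade must scan, which is where the speedup comes from.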
Procedia PDF Downloads 185
1605 Remote Wireless Patient Monitoring System
Authors: Sagar R. Patil, Dinesh R. Gawade, Sudhir N. Divekar
Abstract:
One of the medical devices found in a hospital care unit is the patient monitoring system. This device informs doctors and nurses about the patient's physiological signals. However, it does not have a remote monitoring capability, which necessitates constant on-site attendance by support personnel (doctors and nurses). Thus, we have developed a Remote Wireless Patient Monitoring System, a portable patient monitor built with biomedical sensors and the Android OS. The device monitors the biomedical signals of patients in real time and sends them to remote stations (doctors' and nurses' Android smartphones and the web) for display, with alerts when necessary. The Wireless Patient Monitoring System differs from the conventional patient monitoring system in two respects: first, its wireless communication capability allows physiological signals to be monitored remotely; second, it is portable, so patients can move while their biomedical signals are being monitored. The system is also notable for its implementation. We integrated sensors for pulse oximetry (SpO2), temperature, respiration, blood pressure (BP), heart rate, and electrocardiogram (ECG), and the monitoring and communication applications are implemented on the Android OS using threads, which facilitate the stable and timely manipulation of signals and the appropriate sharing of resources. The biomedical data are displayed on the Android smartphone as well as on the web. Using a web server and database system, these physiological signals can be shared with medical personnel at remote locations anywhere in the world.
We verified that the multitasking implementation used in the system is suitable for patient monitoring and for other healthcare applications.
Keywords: patient monitoring, wireless patient monitoring, bio-medical signals, physiological signals, embedded system, Android OS, healthcare, pulse oximeter (SPO2), thermometer, respiration, blood pressure (BP), heart rate, electrocardiogram (ECG)
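The thread-per-sensor pattern described above can be sketched in Python with a producer/consumer queue; the actual system runs on Android, and the sensor names, sample values, and SpO2 alert threshold below are illustrative assumptions, not the system's clinical settings:

```python
import queue
import threading
import time

def sensor_reader(name, samples, out_q):
    """Producer thread: push (sensor, value) readings onto a shared queue."""
    for v in samples:
        out_q.put((name, v))
        time.sleep(0.001)              # stand-in for the sampling interval

def monitor(out_q, n_expected, alerts, spo2_floor=90):
    """Consumer thread: collect readings and record an alert whenever a
    vital sign crosses a bound (here, an assumed SpO2 floor)."""
    for _ in range(n_expected):
        sensor, value = out_q.get()
        if sensor == "spo2" and value < spo2_floor:
            alerts.append((sensor, value))

q = queue.Queue()
alerts = []
readings = {"spo2": [98, 97, 88, 99], "heart_rate": [72, 75, 74, 73]}
producers = [threading.Thread(target=sensor_reader, args=(n, s, q))
             for n, s in readings.items()]
consumer = threading.Thread(target=monitor, args=(q, 8, alerts))
for t in producers + [consumer]:
    t.start()
for t in producers + [consumer]:
    t.join()
```

The thread-safe queue is what gives each sensor thread timely, contention-free access to the shared data path, the resource-sharing property the abstract attributes to its threaded design.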
Procedia PDF Downloads 572
1604 Normal and Peaberry Coffee Beans Classification from Green Coffee Bean Images Using Convolutional Neural Networks and Support Vector Machine
Authors: Hira Lal Gope, Hidekazu Fukai
Abstract:
The aim of this study is to develop a system that can identify and sort peaberries automatically at low cost for coffee producers in developing countries. In this paper, the focus is on the classification of peaberries and normal coffee beans using image processing and machine learning techniques. The peaberry is not a defective bean, but neither is it a normal bean: instead of the usual flat-sided pair of beans, it forms as a single, relatively round seed in the coffee cherry, and it has a different value and flavor. To improve the taste of the coffee, it is necessary to separate the peaberries from the normal beans before roasting the green coffee beans; otherwise, the tastes of the beans will be mixed and degraded. During roasting, all beans should be uniform in shape, size, and weight; otherwise, larger beans take more time to roast through. A peaberry has a different size and shape even when it has the same weight as a normal bean, and it roasts more slowly than normal beans. Therefore, neither size nor weight sorting provides a good option for selecting peaberries. Defective beans, e.g., sour, broken, black, and faded beans, are easy to check and pick out manually by hand. On the other hand, picking out peaberries is very difficult even for trained specialists, because the shape and color of the peaberry are similar to those of normal beans. In this study, we use image processing and machine learning techniques to discriminate between normal and peaberry beans as part of the sorting system. As a first step, we applied deep Convolutional Neural Networks (CNN) and a Support Vector Machine (SVM) as machine learning techniques to discriminate between peaberry and normal beans. Better performance was obtained with the CNN than with the SVM for discriminating peaberries. The artificial neural network, trained on a high-performance CPU and GPU in this work, will then be installed on an inexpensive, computationally limited Raspberry Pi system.
We assume that this system will be used in developing countries. The study evaluates and compares the feasibility of the methods in terms of classification accuracy and processing speed.
Keywords: convolutional neural networks, coffee bean, peaberry, sorting, support vector machine
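The SVM baseline in such a comparison can be sketched on hypothetical shape features; roundness and aspect ratio are assumed features for illustration, not necessarily the representation the authors fed their classifiers, and the feature distributions are simulated:

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

rng = np.random.default_rng(42)
n = 200
# Hypothetical shape features (roundness, aspect ratio):
# peaberries are rounder, normal beans are flatter and more elongated.
peaberry = rng.normal([0.9, 1.0], 0.05, size=(n, 2))
normal = rng.normal([0.6, 1.6], 0.05, size=(n, 2))
X = np.vstack([peaberry, normal])
y = np.array([1] * n + [0] * n)          # 1 = peaberry, 0 = normal

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = SVC(kernel="rbf").fit(X_tr, y_tr)
accuracy = clf.score(X_te, y_te)
```

A CNN, by contrast, learns its features directly from the bean images, which is one plausible reason it outperformed the hand-featured SVM in the authors' experiments.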
Procedia PDF Downloads 145
1603 An Overview of the SIAFIM Connected Resources
Authors: Tiberiu Boros, Angela Ionita, Maria Visan
Abstract:
Wildfires are one of the frequent and uncontrollable phenomena that currently affect large areas of the world where the climate and the geographic and social conditions make it impossible to prevent and control such events. In this paper, we introduce the core concepts that lie behind the SIAFIM (Satellite Image Analysis for Fire Monitoring) project in order to create a context, and we introduce a set of newly created tools that are external to the project but inherently useful for interventions and complex decision making based on geospatial information and spatial data infrastructures.
Keywords: wildfire, forest fire, natural language processing, mobile applications, communication, GPS
Procedia PDF Downloads 582
1602 Rheological Properties of Dough and Sensory Quality of Crackers with Dietary Fibers
Authors: Ljubica Dokić, Ivana Nikolić, Dragana Šoronja–Simović, Zita Šereš, Biljana Pajin, Nils Juul, Nikola Maravić
Abstract:
The possibility of applying dietary fibers in the production of crackers was investigated in this work, along with their influence on the rheological and textural properties of cracker dough and on the sensory properties of the resulting crackers. Three different dietary fibers, from oat, potato, and pea, each replaced 10% of the wheat flour. A long fermentation process and the baking test method were used for cracker production. Changes in the cracker dough were observed by rheological determination of its viscoelastic properties and by textural measurements. The sensory quality of the crackers was described using quantitative descriptive analysis (QDA) by trained members of a descriptive panel, and additional analysis of the cracker surface was performed with a videometer. Based on the rheological determinations, the viscoelastic properties of the cracker dough were reduced by the addition of dietary fibers. The dough with 10% potato fiber could not be handled, so the recipe was modified by increasing the water content to 35%. Dough compliance under constant stress decreased for the samples with dietary fibers, owing to a more rigid and stiffer dough consistency compared to the control sample; the hardness of the dough for these samples also increased, and dough extensibility decreased. The sensory properties of the final crackers were reduced compared to the control sample: the dietary fibers mostly affected the hardness, structure, and crispness of the crackers, and the crackers received low marks for flavor and taste due to the specific aroma of the fibers. The sample with 10% potato fiber and increased water content was the most adaptable to the applied stresses and to the production process; it was also closest to the control sample without dietary fibers in the evaluation of sensory properties and in the videometer results.
Keywords: crackers, dietary fibers, rheology, sensory properties
Procedia PDF Downloads 323
1601 Integration of Polarization States and Color Multiplexing through a Singular Metasurface
Authors: Tarik Sipahi
Abstract:
Photonics research continues to push the boundaries of optical science, and the development of metasurface technology has emerged as a transformative force in this domain. The work presents the intricacies of a unified metasurface design tailored for efficient polarization and color control in optical systems. The proposed unified metasurface serves as a singular, nanoengineered optical element capable of simultaneous polarization modulation and color encoding. Leveraging principles from metamaterials and nanophotonics, this design allows for unprecedented control over the behavior of light at the subwavelength scale. The metasurface's spatially varying architecture enables seamless manipulation of both polarization states and color wavelengths, paving the way for a paradigm shift in optical system design. The advantages of this unified metasurface are diverse and impactful. By consolidating functions that traditionally require multiple optical components, the design streamlines optical systems, reducing complexity and enhancing overall efficiency. This approach is particularly promising for applications where compactness, weight considerations, and multifunctionality are crucial. Furthermore, the proposed unified metasurface design not only enhances multifunctionality but also addresses key challenges in optical system design, offering a versatile solution for applications demanding compactness and lightweight structures. The metasurface's capability to simultaneously manipulate polarization and color opens new possibilities in diverse technological fields. The research contributes to the evolution of optical science by showcasing the transformative potential of metasurface technology, emphasizing its role in reshaping the landscape of optical system architectures. 
This work represents a significant step forward in the ongoing pursuit of pushing the boundaries of photonics, providing a foundation for future innovations in compact and efficient optical devices.
Keywords: metasurface, nanophotonics, optical system design, polarization control
Procedia PDF Downloads 54
1600 Effects of Non-Diagnostic Haptic Information on Consumers' Product Judgments and Decisions
Authors: Eun Young Park, Jongwon Park
Abstract:
A physical touch of a product can provide ample diagnostic information about the product attributes and quality. However, consumers’ product judgments and purchases can be erroneously influenced by non-diagnostic haptic information. For example, consumers’ evaluations of the coffee they drink could be affected by the heaviness of a cup that is used for just serving the coffee. This important issue has received little attention in prior research. The present research contributes to the literature by identifying when and how non-diagnostic haptic information can have an influence and why such influence occurs. Specifically, five studies experimentally varied the content of non-diagnostic haptic information, such as the weight of a cup (heavy vs. light) and the texture of a cup holder (smooth vs. rough), and then assessed the impact of the manipulation on product judgments and decisions. Results show that non-diagnostic haptic information has a biasing impact on consumer judgments. For example, the heavy (vs. light) cup increases consumers’ perception of the richness of coffee in it, and the rough (vs. smooth) texture of a cup holder increases the perception of the healthfulness of fruit juice in it, which in turn increases consumers’ purchase intentions of the product. When consumers are cognitively distracted during the touch experience, the impact of the content of haptic information is no longer evident, but the valence (positive vs. negative) of the haptic experience influences product judgments. However, consumers are able to avoid the impact of non-diagnostic haptic information, if and only if they are both knowledgeable about the product category and undistracted from processing the touch experience. In sum, the nature of the influence by non-diagnostic haptic information (i.e., assimilation effect vs. contrast effect vs. 
null effect) is determined by the content and valence of haptic information, the relative impact of which depends on whether consumers can identify the content and source of the haptic information. Theoretically, to our best knowledge, this research is the first to document the empirical evidence of the interplay between cognitive and affective processes that determines the impact of non-diagnostic haptic information. Managerial implications are discussed.
Keywords: consumer behavior, haptic information, product judgments, touch effect
Procedia PDF Downloads 176
1599 Emotion Recognition in Video and Images in the Wild
Authors: Faizan Tariq, Moayid Ali Zaidi
Abstract:
Facial emotion recognition algorithms are expanding rapidly nowadays. Researchers use different algorithms in different combinations to generate the best results. Six basic emotions are commonly studied in this area. The authors attempted to recognize facial expressions using object detection algorithms instead of traditional algorithms. Two object detection algorithms were chosen: Faster R-CNN and YOLO. For pre-processing, image rotation and batch normalization were used. The dataset chosen for the experiments is Static Facial Expressions in the Wild (SFEW). The approach worked well, but there is still considerable room for improvement, which will be a future direction.
Keywords: face recognition, emotion recognition, deep learning, CNN
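The batch-normalization pre-processing step mentioned above can be sketched in outline. The following is a minimal, illustrative pass in pure Python; the study's actual pipeline is not shown, the pixel values are hypothetical, and the learnable scale/shift parameters of full batch normalization are omitted.

```python
import math

def batch_normalize(values, eps=1e-5):
    """Normalize a batch of feature values to zero mean and unit variance.
    Real batch normalization also applies learnable scale (gamma) and
    shift (beta) parameters, omitted here for brevity."""
    mean = sum(values) / len(values)
    var = sum((x - mean) ** 2 for x in values) / len(values)
    return [(x - mean) / math.sqrt(var + eps) for x in values]

# Hypothetical pixel intensities from one mini-batch:
print([round(v, 3) for v in batch_normalize([10.0, 20.0, 30.0, 40.0])])
```

In a real pipeline this would run per channel over a mini-batch of images rather than over a flat list of scalars.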
Procedia PDF Downloads 188
1598 Diapause Incidence in Zygogramma bicolorata Pallister Coleoptera: Chrysomelidae
Authors: Fazil Hasan, M. Shafiq Ansari, Mohammad Muslim
Abstract:
Zygogramma bicolorata Pallister (Coleoptera: Chrysomelidae) is an exotic insect and an effective biocontrol agent of Parthenium hysterophorus L. (Asteraceae). Our study aimed to determine the induction and termination of diapause in response to abiotic factors (temperature and moisture) and biotic factors (age and reproductive status), and the effect of diapause on adult longevity and female fecundity. The adults burrowed into the soil, about 1–6 cm below the surface, for diapause at any time from July to December, with a peak of 70% in the 2nd week of December in the Aligarh region, India. Termination of diapause took place in May and June with the commencement of the monsoon rains. Non-diapausing adults were also capable of breeding during winter under laboratory conditions. There was a significant increase in the percentage of diapausing adults in subsequent generations, i.e., 4% in the F1 generation and 90% in the F7 generation. The percentage of diapause also increased significantly with the age of the adults. Diapause had a positive effect on female fecundity compared with fecundity during the pre-diapause period. Experiments proved that soil moisture played an important role in providing the conditions for the initiation and termination of diapause. Adults that underwent diapause in January and February were exposed to 35°, 40°, and 45 °C continuously for one week, and to daily doses of 10 and 8 hours for 6 and 5 days, respectively, resulting in termination of diapause. This method may be used to initiate mass multiplication for carrying out releases early in the season. Exposure of adults to extremely low temperatures, i.e., 5° and 10 °C, induced 94.3% and 92.5% diapause, respectively, with no adult mortality. Therefore, low temperatures can also be used as a medium for the long-term storage of mass-reared beetles without a negative effect on their longevity and fecundity. Thus, our findings are of great utility in the biological suppression of P. hysterophorus, as they will enhance the effectiveness of this beetle through manipulation of diapause.
Keywords: Zygogramma bicolorata, environmental factors, age, sex, diapause, Parthenium hysterophorus, biocontrol
Procedia PDF Downloads 306
1597 Yacht DB Construction Based on Five Essentials of Sailing
Authors: Jae-Neung Lee, Myung-Won Lee, Jung-Su Han, Keun-Chang Kwak
Abstract:
This paper describes the construction of a database (DB) based on the five essentials of sailing in a real yachting environment. It captures the yacht condition (tilt, speed, and course), the surrounding circumstances (wind direction and speed), and user motion. A GoPro camera was used for image processing to recognize user motion, and a tilt sensor was employed to monitor the yacht balance. In addition, GPS was used for the course, a wind speed and direction sensor for the surroundings, and a marked suit for the user motion.
Keywords: DB construction, yacht, five essentials of sailing, marker, GPS
Procedia PDF Downloads 462
1596 Study of a Lean Premixed Combustor: A Thermo Acoustic Analysis
Authors: Minoo Ghasemzadeh, Rouzbeh Riazi, Shidvash Vakilipour, Alireza Ramezani
Abstract:
In this study, the thermo-acoustic oscillations of a lean premixed combustor have been investigated, and a one-dimensional code was developed for this purpose. The linearized equations of motion are solved for perturbations with time dependence e^(iωt). Two flame models are considered in this paper, and the effects of mean flow and boundary conditions are also investigated. After manipulation of the flame heat-release equation together with the equations of flow perturbation within the main components of the combustor model (i.e., plenum, premix duct, and combustion chamber), and by imposing proper boundary conditions between the components of the model, a system of eight homogeneous equations is obtained. This simplification of the main components of the combustor model is convenient, since low-frequency acoustic waves are not affected by bends; moreover, some elements in the combustor are smaller than the wavelength of the propagated acoustic perturbations. A convection time is also assumed to characterize the time required for the acoustic velocity fluctuations to travel from the point of injection to the location of the flame front in the combustion chamber. The influence of an extended flame model on the acoustic frequencies of the combustor was also investigated, assuming the effect of flame speed, as a function of the equivalence-ratio perturbation, on the rate of flame heat release. The abovementioned system of equations has an associated eigenvalue equation with complex roots: the sign of the imaginary part of these roots determines whether the disturbances grow or decay, and the real part gives the frequency of the modes. The results show reasonable agreement between the dominant frequencies predicted by the present model and those calculated in previous related studies.
Keywords: combustion instability, dominant frequencies, flame speed, premixed combustor
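The growth/decay criterion described above can be illustrated numerically. The sketch below solves a toy quadratic characteristic equation whose coefficients are hypothetical stand-ins for the determinant of the eight-equation system, then classifies each complex root: with e^(iωt) time dependence, a root with negative imaginary part corresponds to a growing (unstable) mode.

```python
import cmath

# Toy quadratic standing in for the characteristic equation det(A(w)) = 0
# of the combustor model (coefficients are hypothetical, chosen to place
# a mode near 120 Hz):  w**2 + 0.4j*w - (2*pi*120)**2 = 0
a, b, c = 1.0, 0.4j, -(2 * cmath.pi * 120.0) ** 2
disc = cmath.sqrt(b * b - 4 * a * c)
roots = [(-b + disc) / (2 * a), (-b - disc) / (2 * a)]

for w in roots:
    freq_hz = abs(w.real) / (2 * cmath.pi)
    # With e^{i*w*t} time dependence, amplitude ~ e^{-Im(w)*t}:
    # Im(w) < 0 means the disturbance grows (unstable mode),
    # Im(w) > 0 means it decays (stable mode).
    status = "growing" if w.imag < 0 else "decaying"
    print(f"mode near {freq_hz:.1f} Hz: {status}")
```

In the actual model the roots would come from the full 8x8 determinant rather than a quadratic, but the classification step is the same.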
Procedia PDF Downloads 379
1595 Towards Visual Personality Questionnaires Based on Deep Learning and Social Media
Authors: Pau Rodriguez, Jordi Gonzalez, Josep M. Gonfaus, Xavier Roca
Abstract:
Image sharing in social networks has increased exponentially in the past years. Officially, there are 600 million Instagrammers uploading around 100 million photos and videos per day. Consequently, there is a need for developing new tools to understand the content expressed in shared images, which will greatly benefit social media communication and will enable broad and promising applications in education, advertisement, entertainment, and also psychology. Following these trends, our work aims to take advantage of the existing relationship between text and personality, already demonstrated by multiple researchers, so that we can prove that there exists a relationship between images and personality as well. To achieve this goal, we consider that images posted on social networks are typically conditioned on specific words, or hashtags, therefore any relationship between text and personality can also be observed with those posted images. Our proposal makes use of the most recent image understanding models based on neural networks to process the vast amount of data generated by social users to determine those images most correlated with personality traits. The final aim is to train a weakly-supervised image-based model for personality assessment that can be used even when textual data is not available, which is an increasing trend. The procedure is described next: we explore the images directly publicly shared by users based on those accompanying texts or hashtags most strongly related to personality traits as described by the OCEAN model. These images will be used for personality prediction since they have the potential to convey more complex ideas, concepts, and emotions. As a result, the use of images in personality questionnaires will provide a deeper understanding of respondents than through words alone. 
In other words, from the images posted with specific tags, we train a deep learning model based on neural networks that learns to extract a personality representation from a picture and uses it to automatically find the personality that best explains that picture. Subsequently, a deep neural network model is learned from thousands of images associated with hashtags correlated to OCEAN traits. We then analyze the network activations to identify those pictures that maximally activate the neurons: the most characteristic visual features per personality trait thus emerge, since the filters of the convolutional layers of the neural model learn to be optimally activated depending on each personality trait. For example, among the pictures that maximally activate the high Openness trait, we can see pictures of books, the moon, and the sky. For high Conscientiousness, most of the images are photographs of food, especially healthy food. The high Extraversion output is mostly activated by pictures of large groups of people. In high Agreeableness images, we mostly see flower pictures. Lastly, for the Neuroticism trait, we observe that the high score is maximally activated by pet animals such as cats or dogs. In summary, despite the huge intra-class and inter-class variability of the images associated with each OCEAN trait, we found consistencies between the visual patterns of those images whose hashtags are most correlated to each trait.
Keywords: emotions and effects of mood, social impact theory in social psychology, social influence, social structure and social networks
Procedia PDF Downloads 198
1594 Interactive IoT-Blockchain System for Big Data Processing
Authors: Abdallah Al-ZoubI, Mamoun Dmour
Abstract:
The spectrum of IoT devices is becoming widely diversified, entering almost all possible fields and finding applications in industry, health, finance, logistics, and education, to name a few. Active IoT endpoint sensors and devices exceeded the 12 billion mark in 2021 and are expected to reach 27 billion in 2025, with over $34 billion in total market value. This sheer rise in the number and use of IoT devices brings with it considerable concerns regarding data storage, analysis, manipulation, and protection. IoT blockchain-based systems have recently been proposed as a decentralized solution for large-scale data storage and protection. COVID-19 has actually accelerated the desire to utilize IoT devices, as it impacted both demand and supply and significantly affected several regions due to logistic reasons such as supply chain interruptions, shortage of shipping containers, and port congestion. An IoT-blockchain system is proposed to handle big data generated by a distributed network of sensors and controllers in an interactive manner. The system is designed using the Ethereum platform, which utilizes smart contracts, programmed in Solidity, to execute and manage data generated by IoT sensors and devices such as the Raspberry Pi 4 running Raspbian, with add-on hardware security modules. The proposed system will run a number of applications hosted by a local machine used to validate transactions. It then sends data to the rest of the network through the InterPlanetary File System (IPFS) and Ethereum Swarm, forming a closed IoT ecosystem run by blockchain, where a number of distributed IoT devices can communicate and interact, thus forming a closed, controlled environment. A prototype has been deployed with three IoT handling units distributed over a wide geographical space in order to examine its feasibility, performance, and costs.
Initial results indicated that big IoT data retrieval and storage is feasible and that interactivity is possible, provided that certain conditions of cost, speed, and throughput are met.
Keywords: IoT devices, blockchain, Ethereum, big data
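The data-protection idea behind such a system can be illustrated with a minimal, dependency-free sketch of a tamper-evident hash chain of sensor readings. This is a stand-in illustration of the hash linking that underlies blockchain storage, not the authors' Ethereum/IPFS stack; all names and values are hypothetical.

```python
import hashlib
import json

def block_hash(body: dict) -> str:
    """Deterministic SHA-256 over the block body's canonical JSON form."""
    return hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()

def append_reading(chain: list, reading: dict) -> None:
    """Append a sensor reading, linking it to the previous block's hash."""
    prev = chain[-1]["hash"] if chain else "0" * 64
    block = {"index": len(chain), "prev_hash": prev, "data": reading}
    block["hash"] = block_hash({k: v for k, v in block.items() if k != "hash"})
    chain.append(block)

def chain_is_valid(chain: list) -> bool:
    """Re-derive every hash and check each back-link."""
    for i, block in enumerate(chain):
        body = {k: v for k, v in block.items() if k != "hash"}
        if block["hash"] != block_hash(body):
            return False
        if i > 0 and block["prev_hash"] != chain[i - 1]["hash"]:
            return False
    return True

chain = []
append_reading(chain, {"sensor": "temp-01", "value": 22.5})
append_reading(chain, {"sensor": "temp-01", "value": 22.7})
print(chain_is_valid(chain))      # True
chain[0]["data"]["value"] = 99.9  # tamper with a stored reading
print(chain_is_valid(chain))      # False
```

On a real blockchain the validation is done by distributed consensus rather than a single verifier, but the tamper-evidence comes from exactly this kind of hash linking.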
Procedia PDF Downloads 150
1593 Research Trends in Using Virtual Reality for the Analysis and Treatment of Lower-Limb Musculoskeletal Injury of Athletes: A Literature Review
Authors: Hannah K. M. Tang, Muhammad Ateeq, Mark J. Lake, Badr Abdullah, Frederic A. Bezombes
Abstract:
There is little research applying virtual reality (VR) to the treatment of musculoskeletal injury in athletes. This is despite the prevalence of such injuries and their implications for physical and psychological health. Nevertheless, developments in wireless VR headsets better facilitate dynamic movement in VR environments (VREs), and more research is expected in this emerging field. This systematic review identified publications that used VR interventions for the analysis or treatment of lower-limb musculoskeletal injury of athletes. It established a search protocol and, through narrative discussion, identified existing trends. Database searches encompassed four term sets: 1) VR systems; 2) musculoskeletal injuries; 3) sporting population; 4) movement outcome analysis. Overall, a total of 126 publications were identified through database searching, and twelve were included in the final analysis and discussion. Many of the studies were pilot and proof-of-concept work. Seven of the twelve publications were observational studies. However, this may provide preliminary data from which clinical trials will branch. If specified, the focus of the literature was very narrow, with very similar population demographics and injuries. The trends in the literature findings emphasised the role of VR and attentional focus, the strategic manipulation of movement outcomes, and the transfer of skill to the real world. Causal inferences may have been undermined by flaws, as most studies were limited by the practicality of conducting a two-factor clinical-VR-based study. In conclusion, by assessing the exploratory studies, and combining this with the use of numerous developments, techniques, and tools, a novel application could be established to utilise VR with dynamic movement for the effective treatment of specific musculoskeletal injuries of athletes.
Keywords: athletes, lower-limb musculoskeletal injury, rehabilitation, return-to-sport, virtual reality
Procedia PDF Downloads 234
1592 Micropropagation and in vitro Conservation via Slow Growth Techniques of Prunus webbii (Spach) Vierh: An Endangered Plant Species in Albania
Authors: Valbona Sota, Efigjeni Kongjika
Abstract:
Wild almond is a woody species that is difficult to propagate either generatively by seed or by vegetative methods (grafting or cuttings), and it is considered Endangered (EN) in Albania based on IUCN criteria. As a wild relative of cultivated fruit trees, this species represents a source of genetic variability and can be very important in breeding programs and cultivation. For this reason, it would be of interest to use an effective method of in vitro mid-term conservation, which involves strategies to slow plant growth through physicochemical alterations of the in vitro growth conditions. Multiplication of wild almond was carried out using zygotic embryos as primary explants, with the aim of developing a successful propagation protocol. Results showed that zygotic embryos can proliferate through direct or indirect organogenesis. During the subculture stage, a great number of new plantlets identical to the mother plants derived from the zygotic embryos were obtained. All in vitro plantlets obtained from the subcultures underwent in vitro conservation by minimal growth at low temperature (4 °C) in darkness. The efficiency of this technique was evaluated for conservation periods of 3, 6, and 10 months. Maintenance under these conditions reduced microcutting growth. Survival and regeneration rates for each period were evaluated; the results showed that the maximal conservation time without subculture at 4 °C was 10 months, but survival and regeneration rates were significantly reduced, to 15.6% and 7.6%, respectively. An optimal period of conservation under these conditions can be considered to be 5-6 months of storage, which can lead to survival and regeneration rates of 50-60%. This protocol may be beneficial for mass propagation, mid-term conservation, and genetic manipulation of wild almond.
Keywords: micropropagation, minimal growth, storage, wild almond
Procedia PDF Downloads 128
1591 Femoropatellar Groove: An Anatomical Study
Authors: Mamatha Hosapatna, Anne D. Souza, Vrinda Hari Ankolekar, Antony Sylvan D. Souza
Abstract:
Introduction: The lower extremity of the femur is characterized by an anterior groove in which the patella is held during motion. This groove separates the two lips of the trochlea (medial and lateral), prolongations of the two condyles. In humans, the lateral trochlear lip is more developed than the medial one, creating an asymmetric groove that is also specific to the human body. Because of femoral obliquity, contraction of the quadriceps exerts a lateral dislocation stress on the patella, and the more elevated lateral side of the patellar groove helps the patella stay in its correct place, acting as a wall against lateral dislocation. This specific shape fits an oblique femur. It is known that femoral obliquity is not genetically determined but comes with orthostatism and bipedal walking. Material and Methodology: The various dimensions of the femoropatellar groove (FPG) and the femoral condyles were measured using a digital image analyser. 37 dried adult femora (22 right, 15 left) were used for the study. End-on images of the lower end of each femur were taken, and the dimensions of the femoropatellar groove and the FP angle were measured using ImageJ software. Results were analyzed statistically. Results: The maximum altitude of the medial condyle was 4.98 ± 0.35 cm for the right femur and 5.20 ± 0.16 cm for the left. The maximum altitude of the lateral condyle was 5.44 ± 0.4 cm and 5.50 ± 0.14 cm on the right and left sides, respectively. The medial length of the groove was 1.30 ± 0.38 cm on the right side and 1.88 ± 0.16 cm on the left. The lateral length of the groove was 1.90 ± 0.16 cm on the right side and 1.88 ± 0.16 cm on the left. The femoropatellar angle was 136.38° ± 2.59° on the right side and 142.38° ± 7.0° on the left. The angle and the dimensions of the femoropatellar groove on the medial and lateral sides were measured, and asymmetry in the patellar groove was observed. The lateral lip was found to be wider and bigger, which correlates with previous studies. An asymmetrical patellar groove with a protruding lateral side, associated with an oblique femur, is a specific mark of bipedal locomotion. Conclusion: The dimensions of the FPG are important in maintaining the stability of the patella and also in knee replacement surgeries. The implants used to replace the patellofemoral compartment consist of a metal groove fitted to the femoral end and a plastic disc that attaches to the undersurface of the patella. The location and configuration of the patellofemoral groove of the distal femur are clinically significant in the mechanics and pathomechanics of the patellofemoral articulation.
Keywords: femoropatellar groove, femoropatellar angle, lateral condyle, medial condyle
Procedia PDF Downloads 404
1590 Dosimetry in Interventional Radiology Examinations for Occupational Exposure Monitoring
Authors: Ava Zarif Sanayei, Sedigheh Sina
Abstract:
Interventional radiology (IR) uses imaging guidance, including X-rays and CT scans, to deliver therapy precisely. Most IR procedures are performed under local anesthesia and start with a small needle being inserted through the skin, which is why they may be called pinhole surgery or image-guided surgery. There is increasing concern about radiation exposure during interventional radiology procedures due to procedure complexity. The basic aim of optimizing radiation protection, as outlined in ICRP 139, is to strike a balance between image quality and radiation dose while maximizing benefits, ensuring that diagnostic interpretation remains satisfactory. This study aims to estimate the equivalent doses to the main trunk of the body for the interventional radiologist and the superintendent using LiF:Mg,Ti (TLD-100) chips at the IR department of a hospital in Shiraz, Iran. In the initial stage, the dosimeters were calibrated with the use of various phantoms. Afterward, a group of dosimeters was prepared and then used for three months. To measure the personal equivalent dose to the body, three TLD chips were put in a tissue-equivalent badge and worn under a protective lead apron. At the end of this period, the TLDs were read out by a TLD reader. The results revealed that these individuals received equivalent doses of 387.39 and 145.11 µSv, respectively. The findings of this investigation revealed that the total radiation exposure to the staff was less than the annual limit for occupational exposure. However, it is imperative to implement appropriate radiation protection measures. The somewhat noticeable dose received by the interventional radiologist may be due to the use of conventional equipment with over-couch X-ray tubes for interventional procedures. It is therefore important to use dedicated equipment and protective means, such as glasses and screens, whenever they are compatible with the intervention, either using them when available or having them fitted to the equipment if they are not present. Based on the results, staff positioning also contributed to the increase in the dose to the radiologist. Manufacturing and installing movable lead curtains with a thickness of 0.25 millimeters can effectively minimize the radiation dose to the body. Providing adequate training on radiation safety principles, particularly for technologists, can be an optimal approach to further decreasing exposure.
Keywords: interventional radiology, personal monitoring, radiation protection, thermoluminescence dosimetry
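As a rough plausibility check of the reported readings against the occupational limit, the sketch below linearly extrapolates the three-month doses to a full year and compares them with the ICRP occupational limit of 20 mSv/year (averaged over five years). The linear extrapolation is an illustrative assumption, not part of the study's methodology.

```python
# Illustrative check: extrapolate three-month TLD readings to a year and
# compare with the ICRP occupational limit (20 mSv/year, 5-year average).
ANNUAL_LIMIT_USV = 20_000  # 20 mSv expressed in microsieverts

quarterly_doses_usv = {"radiologist": 387.39, "superintendent": 145.11}
annual_usv = {role: dose * 4 for role, dose in quarterly_doses_usv.items()}

for role, est in annual_usv.items():
    print(f"{role}: ~{est:.0f} uSv/yr "
          f"({est / ANNUAL_LIMIT_USV:.1%} of the annual limit)")
```

Even the higher annualized estimate stays well below the limit, consistent with the abstract's conclusion, though this does not remove the need for the protection measures discussed.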
Procedia PDF Downloads 62
1589 Star Images Constructed Based on Kramer vs. Kramer
Authors: Huailei Wen
Abstract:
Kramer vs. Kramer (1979) is a film that comprehensively examines the role and status of women under the traditional secular vision, in which women have become subordinate to the patriarchal society and family. Through the construction of the protagonist Joanna's dissatisfaction with the social and ethical status quo, her struggle to subvert the existing status of women, and her return to her own self, the story comprehensively reflects the difficult journey of women, represented by Joanna, to subvert stereotypes and return to their own selves in the specific historical context of the time, revealing the significance of the Joanna phenomenon for the self-worth of modern women.
Keywords: star image, feminism, Kramer vs. Kramer, Hollywood
Procedia PDF Downloads 109
1588 Digital Image Correlation Based Mechanical Response Characterization of Thin-Walled Composite Cylindrical Shells
Authors: Sthanu Mahadev, Wen Chan, Melanie Lim
Abstract:
Anisotropy-dominated continuous-fiber composite materials have garnered attention in numerous mechanical and aerospace structural applications. The tailored mechanical properties of advanced composites can exhibit superiority in terms of stiffness-to-weight ratio, strength-to-weight ratio, and low-density characteristics, coupled with significant improvements in fatigue resistance, as opposed to their metal-structure counterparts. Extensive research has demonstrated their core potential as more than just lightweight substitutes for conventional materials. Prior work by Mahadev and Chan focused on formulating a modified composite shell theory based prognosis methodology for investigating the structural response of thin-walled circular cylindrical shell type composite configurations under in-plane mechanical loads. The prime motivation to develop this theory stemmed from its capability to generate simple yet accurate closed-form analytical results that can efficiently characterize circular composite shell construction. It showcased the development of a novel mathematical framework to analytically identify the location of the centroid for thin-walled, open cross-section, curved composite shells characterized by circumferential arc angle, thickness-to-mean-radius ratio, and total laminate thickness. Ply stress variations for curved cylindrical shells were analytically examined under the application of centric tensile and bending loading. This work presents a cost-effective, small-platform experimental methodology that takes advantage of the full-field measurement capability of digital image correlation (DIC) for an accurate assessment of key mechanical parameters such as in-plane mechanical stresses and strains, and centroid location. Mechanical property measurement of advanced composite materials can become challenging due to their anisotropy and complex failure mechanisms. Full-field displacement measurements are well suited for characterizing the mechanical properties of composite materials because of the complexity of their deformation. This work encompasses the fabrication of a set of curved cylindrical shell coupons, the design and development of a novel test fixture, and an innovative experimental methodology that demonstrates the capability to very accurately predict the location of the centroid in such curved composite cylindrical strips via a DIC-based strain measurement technique. The percentage error between the experimental centroid measurements and the previously estimated analytical centroid results shows that the two are in good agreement. The developed analytical modified shell theory provides the capability to understand the fundamental behavior of thin-walled cylindrical shells and offers the potential to generate novel avenues to understand the physics of such structures at the laminate level.
Keywords: anisotropy, composites, curved cylindrical shells, digital image correlation
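For readers unfamiliar with centroid estimation in laminated sections, the generic modulus-weighted centroid computation underlying such analyses can be sketched as follows. This is a textbook formula, not the paper's closed-form shell-theory expression, and the ply values are hypothetical.

```python
def weighted_centroid(strips):
    """Modulus-weighted centroid of a discretized laminate cross-section.
    strips: iterable of (E, area, y) tuples -> centroid height y_bar,
    computed as sum(E*A*y) / sum(E*A)."""
    num = sum(E * A * y for E, A, y in strips)
    den = sum(E * A for E, A, _ in strips)
    return num / den

# Hypothetical two-ply strip: the stiffer ply at y = 0.01 m pulls the
# effective centroid toward itself.
plies = [(70e9, 1e-4, 0.0), (140e9, 1e-4, 0.01)]
print(round(weighted_centroid(plies), 5))  # 0.00667
```

For a curved shell the strips would be generated along the arc, but the weighted-average structure of the computation is the same.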
Procedia PDF Downloads 318
1587 Study on the Self-Location Estimate by the Evolutional Triangle Similarity Matching Using Artificial Bee Colony Algorithm
Authors: Yuji Kageyama, Shin Nagata, Tatsuya Takino, Izuru Nomura, Hiroyuki Kamata
Abstract:
In a previous study, a technique to estimate self-location using a lunar image was proposed. In this paper, we consider improving the conventional method with FPGA implementation in mind. Specifically, we introduce the Artificial Bee Colony algorithm to reduce the search time. In addition, we use fixed-point arithmetic to enable high-speed operation on the FPGA.
Keywords: SLIM, Artificial Bee Colony algorithm, location estimate, evolutional triangle similarity
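As an illustration of how the Artificial Bee Colony algorithm searches a solution space, here is a compact, simplified sketch (the onlooker roulette-selection phase is folded into a single greedy neighbourhood phase for brevity, and the objective is a toy stand-in for the triangle-similarity matching score, not the paper's actual cost function).

```python
import random

def abc_minimize(f, bounds, n_bees=20, limit=10, iters=200, seed=0):
    """Simplified Artificial Bee Colony: greedy neighbourhood moves plus
    scout restarts for stale food sources; returns the best solution found."""
    rng = random.Random(seed)
    dim = len(bounds)

    def rand_sol():
        return [rng.uniform(lo, hi) for lo, hi in bounds]

    foods = [rand_sol() for _ in range(n_bees)]
    costs = [f(x) for x in foods]
    trials = [0] * n_bees
    best_x, best_c = None, float("inf")

    for _ in range(iters):
        for i in range(n_bees):
            # Neighbourhood move relative to a random partner k != i.
            k = rng.randrange(n_bees - 1)
            if k >= i:
                k += 1
            j = rng.randrange(dim)
            cand = foods[i][:]
            cand[j] += rng.uniform(-1, 1) * (foods[i][j] - foods[k][j])
            lo, hi = bounds[j]
            cand[j] = min(max(cand[j], lo), hi)
            c = f(cand)
            if c < costs[i]:                      # greedy acceptance
                foods[i], costs[i], trials[i] = cand, c, 0
            else:
                trials[i] += 1
        for i in range(n_bees):                   # memorize the global best
            if costs[i] < best_c:
                best_x, best_c = foods[i][:], costs[i]
        for i in range(n_bees):                   # scouts abandon stale sources
            if trials[i] > limit:
                foods[i] = rand_sol()
                costs[i] = f(foods[i])
                trials[i] = 0
    return best_x, best_c

# Toy objective standing in for the triangle-similarity matching score:
sphere = lambda x: sum(v * v for v in x)
sol, cost = abc_minimize(sphere, [(-5.0, 5.0), (-5.0, 5.0)])
print(f"best cost: {cost:.3e}")
```

An FPGA port of this loop would replace the floating-point arithmetic with fixed-point operations, as the abstract notes.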
Procedia PDF Downloads 519
1586 Cinema and the Documentation of Mass Killings in Third World Countries: A Study of Selected African Films
Authors: Chijindu D. Mgbemere
Abstract:
Mass killing, also known as genocide, is the systematic killing of people from a national, ethnic, or religious group, or an attempt to do so. The act existed before 1948, when it was officially recognized for what it is. Since then, the world has continued to witness genocide in diverse forms, negating the various measures taken by the United Nations and its agencies to curb it. So far, studies and documentation of this subject have been biased in favor of radio and print. This paper therefore extended the interrogation of genocide, highlighting its devastating effects, using the film medium, and in doing so devised an innovative and pragmatic approach to genocide scholarship. It further centered attention on the factors and impacts of genocide, with a view to determining how effective film can be in such a study. The study is anchored on Bateson's framing theory. Four films (Hotel Rwanda, Half of a Yellow Sun, Attack on Darfur, and Sarafina!) were analyzed via content analysis, based on the background, factors/causes, impacts, and development of genocide. The study discovered the following: as other continents strive towards peace, acts of genocide are on the increase in Africa; bloodletting stereotypes give Africa a negative image in global society; difficult political frameworks and the trauma of the postcolonial state, aggravated by ethnic and religious intolerance and limited access to resources, are responsible for the high incidence of genocide in Africa; the media, international communities, and peace agencies often abet rather than prevent genocide or mass killings in Africa; high human casualties and displacement, child soldiering, looting, hunger, rape, sex slavery and abuse, and mental and psychosomatic stress disorders are among the impacts of genocide; genocidaires are either condemned or killed; and grievances can be vented using civil resistance, negotiation, adjudication, arbitration, and mediation. The cinema is an effective means of studying and documenting genocide. Africans must take the image laundering of their continent into consideration. Punishing genocidaires without an attempt to de-radicalize them is counterproductive.
Keywords: African film, genocide, framing theory, mass murder
Procedia PDF Downloads 118
1585 Using the Dokeos Platform for Industrial E-Learning Solution
Authors: Kherafa Abdennasser
Abstract:
The application of Information and Communication Technologies (ICT) to the training field has led to the creation of a new reality called e-learning, which can be described as the marriage of multimedia (sound, image, and text) and the Internet (online delivery, interactivity). Distance learning has become an important modality for training, and it relies in particular on the setup of a distance learning platform. In this work, we use an open-source platform named Dokeos for the management of a distance training course on GPS, called e-GPS. The learner is followed throughout his or her training; in this system, trainers and learners communicate individually or in groups, while the administrator sets up the system and ensures its maintenance.
Keywords: ICT, e-learning, learning platform, Dokeos, GPS
Procedia PDF Downloads 478