Search results for: image processing techniques
10378 Investigating the Effect of VR, Time Study and Ergonomics on the Design of Industrial Workstations
Authors: Aydin Azizi, Poorya Ghafoorpoor Yazdi
Abstract:
This paper presents a review of studies on ergonomics, virtual reality, and work measurement (time study) at industrial workstations, since each of these three techniques can be used to improve the design of workstations and task positions. The objective is to provide an overall literature review of whether any relation exists between these three techniques; reviewing the scientific studies is therefore important for finding a more effective way to improve workstation design. Manufacturers have also found that, instead of using one approach alone, combining these individual techniques is more effective at reducing cost and production time.
Keywords: ergonomics, time study, virtual reality, workplace
Procedia PDF Downloads 119
10377 View Synthesis of Kinetic Depth Imagery for 3D Security X-Ray Imaging
Authors: O. Abusaeeda, J. P. O. Evans, D. Downes
Abstract:
We demonstrate the synthesis of intermediary views within a sequence of X-ray images that exhibit depth from motion, or the kinetic depth effect, in a visual display. Each synthetic image replaces the requirement for a linear X-ray detector array during the image acquisition process. The scale-invariant feature transform (SIFT), in combination with epipolar morphing, is employed to produce the synthetic imagery. A comparison between synthetic and ground-truth images is reported to quantify the performance of the approach. Our work is a key aspect in the development of a 3D imaging modality for the screening of luggage at airport checkpoints. This programme of research is in collaboration with the UK Home Office and the US Dept. of Homeland Security.
Keywords: X-ray, kinetic depth, KDE, view synthesis
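As an illustration of the matching stage such a pipeline relies on, the sketch below pairs SIFT features across two views and estimates the epipolar geometry; it is a generic OpenCV example under assumed file names, not the authors' implementation.

```python
# Minimal sketch (not the authors' pipeline): SIFT matching between two
# frames of an X-ray sequence, followed by fundamental-matrix estimation,
# the usual starting point for epipolar-constrained view synthesis.
# File names are placeholders.
import cv2
import numpy as np

img1 = cv2.imread("xray_view_a.png", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("xray_view_b.png", cv2.IMREAD_GRAYSCALE)

sift = cv2.SIFT_create()
kp1, des1 = sift.detectAndCompute(img1, None)
kp2, des2 = sift.detectAndCompute(img2, None)

# Ratio-test matching (Lowe's criterion) keeps distinctive correspondences.
matcher = cv2.BFMatcher()
matches = matcher.knnMatch(des1, des2, k=2)
good = [m for m, n in matches if m.distance < 0.75 * n.distance]

pts1 = np.float32([kp1[m.queryIdx].pt for m in good])
pts2 = np.float32([kp2[m.trainIdx].pt for m in good])

# Epipolar geometry linking the two views; intermediate views can then be
# synthesized by morphing matched points along their epipolar lines.
F, mask = cv2.findFundamentalMat(pts1, pts2, cv2.FM_RANSAC)
print(f"{int(mask.sum())} inlier correspondences")
```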
Procedia PDF Downloads 265
10376 Analysis of Two Phase Hydrodynamics in a Column Flotation by Particle Image Velocimetry
Authors: Balraju Vadlakonda, Narasimha Mangadoddy
Abstract:
The hydrodynamic behavior in a laboratory flotation column was analyzed using particle image velocimetry. For complete characterization of column flotation, it is necessary to determine the flow velocity induced by bubbles in the liquid phase, the bubble velocity, and the bubble characteristics: diameter, shape, and bubble size distribution. An experimental procedure for analyzing simultaneous, phase-separated velocity measurements in two-phase flows was introduced. The non-invasive PIV technique was used to quantify the instantaneous flow field, as well as the time-averaged flow patterns, in selected planes of the column. A novel PIV approach combining fluorescent tracer particles, shadowgraphy, and digital phase separation with a masking technique was used to measure the bubble velocity as well as the Reynolds stresses in the column. Axial and radial mean velocities, as well as the fluctuating components, were determined for both phases by averaging a sufficient number of image pairs. The bubble size distribution was cross-validated with a high-speed video camera, and the average turbulent kinetic energy of the bubbles was analyzed. Different air flow rates were considered in the experiments.
Keywords: particle image velocimetry (PIV), bubble velocity, bubble diameter, turbulent kinetic energy
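For readers unfamiliar with the core PIV operation, the sketch below estimates a local displacement vector by FFT cross-correlation of one interrogation window between two frames; it is a textbook illustration in NumPy, not the authors' phase-separation pipeline.

```python
# Minimal sketch of the basic PIV step: FFT-based cross-correlation of one
# interrogation window between two consecutive frames yields the local
# displacement (the correlation peak offset from the window center).
import numpy as np

def piv_displacement(win_a: np.ndarray, win_b: np.ndarray):
    """Return (dy, dx) displacement maximizing the cross-correlation."""
    a = win_a - win_a.mean()
    b = win_b - win_b.mean()
    corr = np.fft.ifft2(np.fft.fft2(a).conj() * np.fft.fft2(b)).real
    corr = np.fft.fftshift(corr)
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    center = np.array(corr.shape) // 2
    return peak[0] - center[0], peak[1] - center[1]

# Synthetic check: shift a random texture by (3, -2) pixels.
rng = np.random.default_rng(0)
frame = rng.random((64, 64))
shifted = np.roll(frame, shift=(3, -2), axis=(0, 1))
print(piv_displacement(frame, shifted))  # expected ~(3, -2)
```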
Procedia PDF Downloads 510
10375 Combined Synchrotron Radiography and Diffraction for in Situ Study of Reactive Infiltration of Aluminum into Iron Porous Preform
Authors: S. Djaziri, F. Sket, A. Hynowska, S. Milenkovic
Abstract:
The use of Fe-Al based intermetallics as an alternative to Cr/Ni based stainless steels is very promising for industrial applications that use critical-raw-material parts under extreme conditions. However, the development of advanced Fe-Al based intermetallics with appropriate mechanical properties presents several challenges involving appropriate processing and microstructure control. A processing strategy is being developed which aims at producing a net-shape porous Fe-based preform that is infiltrated with molten Al or an Al alloy. In the present work, porous Fe-based preforms produced by two different methods (selective laser melting (SLM) and the Kochanek process (KE)) are studied during infiltration with molten aluminum. To elucidate the mechanisms underlying the formation of Fe-Al intermetallic phases during infiltration, an in-house furnace has been designed for in situ observation of infiltration at synchrotron facilities, combining x-ray radiography (XR) and x-ray diffraction (XRD) techniques. The feasibility of this approach has been demonstrated, and information about the propagation of the melt flow front has been obtained. In addition, reactive infiltration has been achieved, where a bi-phased intermetallic layer was identified forming between the solid Fe and the liquid Al. In particular, a tongue-like Fe₂Al₅ phase adhering to the Fe and a needle-like Fe₄Al₁₃ phase adhering to the Al were observed. The growth of the intermetallic compound was found to depend on the temperature gradient along the preform as well as on the reaction time, which is discussed in view of the different results obtained.
Keywords: combined synchrotron radiography and diffraction, Fe-Al intermetallic compounds, in-situ molten Al infiltration, porous solid Fe preforms
Procedia PDF Downloads 226
10374 The Influence of Different Technologies on the Infiltration Properties and Soil Surface Crusting Processing in the North Bohemia Region
Authors: Miroslav Dumbrovsky, Lucie Larisova
Abstract:
The infiltration characteristic of the soil surface is one of the major factors determining the potential risk of soil degradation. The physical, chemical, and biological characteristics of soil are changed by tillage, and the infiltration ability of soil plays an important role in soil and water conservation. The subject of this contribution is an evaluation of the influence of conventional tillage and reduced tillage technology on soil surface crusting and the infiltration properties of soil in the North Bohemia region. Field experimental work was carried out in the years 2013-2016 on a Cambisol, a medium-heavy clayey soil. The research was conducted on sloping, erosion-endangered blocks of compacted arable land. Each year the areas were chosen so that one experimental area was managed with conventional tillage technology and the other with reduced tillage technology. Intact soil samples were taken into Kopecký's cylinders at three landscape positions, at depths of 10 cm (representing topsoil) and 30 cm (representing subsoil). Cumulative infiltration was measured using a mini-disc infiltrometer near the sampling points. The Zhang method (1997), which provides an estimate of the unsaturated hydraulic conductivity K(h), was used to evaluate the infiltration tests of the mini-disc infiltrometer. The soil profile processed by conventional tillage showed a higher degree of compaction and soil crusting: bulk density was 1.10–1.67 g.cm⁻³, compared to 0.80–1.29 g.cm⁻³ on land processed by reduced tillage. Unsaturated hydraulic conductivity values were about one-third higher under reduced tillage.
Keywords: soil crusting, unsaturated hydraulic conductivity, cumulative infiltration, bulk density, porosity
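As a worked illustration of the Zhang (1997) evaluation, the sketch below fits cumulative infiltration as I(t) = C1·sqrt(t) + C2·t and derives K = C2/A; the measurements and the A parameter are illustrative placeholders, not the study's data.

```python
# Minimal sketch of the Zhang (1997) evaluation for mini-disc infiltrometer
# data: cumulative infiltration is fitted as I(t) = C1*sqrt(t) + C2*t, and
# the unsaturated hydraulic conductivity follows as K = C2 / A, where A
# depends on soil type, suction, and disc radius (taken from published
# tables; the value below is an illustrative placeholder).
import numpy as np

t = np.array([30, 60, 120, 180, 240, 300], dtype=float)   # s (illustrative)
I = np.array([0.21, 0.35, 0.58, 0.77, 0.94, 1.10])        # cm, cumulative

# Least-squares fit of I against [sqrt(t), t] with no intercept.
X = np.column_stack([np.sqrt(t), t])
(C1, C2), *_ = np.linalg.lstsq(X, I, rcond=None)

A = 2.4  # placeholder van Genuchten-based parameter for the soil/suction
K = C2 / A
print(f"C1={C1:.4f} cm/s^0.5  C2={C2:.6f} cm/s  K={K:.2e} cm/s")
```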
Procedia PDF Downloads 247
10373 Training a Neural Network to Segment, Detect and Recognize Numbers
Authors: Abhisek Dash
Abstract:
This study used three neural networks, one for number segmentation, one for number detection, and one for number recognition, all coupled to one another. All networks were convolutional and were trained on the MNIST dataset, and it was assumed that the images had a lighter background and a darker foreground. The segmentation network took 28x28 images as input and had sixteen outputs. Segmentation training starts when a dark pixel is encountered: taking a 7x7 window over that pixel as the focus, the eight neighborhoods of the focus were checked for further dark pixels, and the network was trained to move in those directions which contained them. To this end, its sixteen outputs were arranged as "go east", "don't go east", "go south east", "don't go south east", "go south", "don't go south", and so on with respect to the focus window. The focus window was resized into a 28x28 image, and the network was trained to consider those neighborhoods which had dark pixels. These neighborhoods were pushed into a queue in a particular order, then popped one at a time and stitched to the existing partial image of the number, and the network was trained on which neighborhoods to consider when the new partial image was presented. This process was repeated until the image was fully covered by the 7x7 neighborhoods and there were no more uncovered dark pixels. During testing, the network scans for the first dark pixel; from there it predicts which neighborhoods to consider and segments the image. After this step, the group of neighborhoods is passed to the detection network, which took 28x28 images as input and had two outputs denoting whether a number was detected or not. Since the ground-truth bounds of a number were known during training, the detection network was trained to output "number not found" until the bounds were met, and "found" once they were. The recognition network was a standard CNN that also took 28x28 images and had 10 outputs for recognizing the digits 0 to 9; it was activated only when the detection network voted in favor of a detected number. This methodology could segment connected and overlapping numbers, and invoking the recognition unit only upon detection minimized false positives. It also eliminated the need for rules of thumb, since segmentation is learned, and the strategy can be extended to other characters as well.
Keywords: convolutional neural networks, OCR, text detection, text segmentation
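A minimal sketch of the recognition stage alone is given below: a standard CNN with ten outputs for the digits 0-9, written in PyTorch as an assumed framework. The segmentation and detection networks described above are not reproduced.

```python
# Minimal sketch of the recognition network only (a standard CNN with ten
# outputs for MNIST digits), assuming PyTorch; layer sizes are illustrative.
import torch
import torch.nn as nn

class DigitCNN(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),  # 28->14
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2), # 14->7
        )
        self.classifier = nn.Linear(32 * 7 * 7, 10)  # ten digit classes

    def forward(self, x):                # x: (batch, 1, 28, 28)
        x = self.features(x)
        return self.classifier(x.flatten(1))

model = DigitCNN()
logits = model(torch.randn(4, 1, 28, 28))
print(logits.shape)  # torch.Size([4, 10])
```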
Procedia PDF Downloads 161
10372 Dynamic Foot Pressure Measurement System Using Optical Sensors
Authors: Tanapon Keatsamarn, Chuchart Pintavirooj
Abstract:
Foot pressure measurement provides necessary information for diagnosing diseases, designing foot insoles, preventing disorders, and other applications. In this paper, a dynamic foot pressure measurement system is presented for measuring pressure with high resolution and accuracy. The system consists of a hardware and a software subsystem. The hardware uses a transparent acrylic plate on a steel base. Glossy white paper is placed on top of the transparent acrylic plate, and the system is covered with black acrylic to block external light, while light from an LED strip enters around the edges of the plate. The optical sensors, digital cameras, are positioned underneath the acrylic plate facing upwards. They are connected to the software subsystem, which processes the images and records foot pressure video to an AVI file. The software is developed in Visual Studio 2017 using the OpenCV library.
Keywords: foot, foot pressure, image processing, optical sensors
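The sketch below illustrates a generic version of such a processing loop in Python/OpenCV, assuming that brighter regions of the imaged paper correspond to higher contact pressure; the file name is a placeholder, and the authors' actual C++ code is not reproduced.

```python
# Minimal sketch of the image-processing step: each video frame is converted
# to an intensity map and pseudo-colored for display, under the assumption
# that brightness correlates with contact pressure.
import cv2

cap = cv2.VideoCapture("foot_pressure.avi")  # placeholder file name
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    gray = cv2.GaussianBlur(gray, (5, 5), 0)          # suppress sensor noise
    norm = cv2.normalize(gray, None, 0, 255, cv2.NORM_MINMAX)
    pressure_map = cv2.applyColorMap(norm, cv2.COLORMAP_JET)
    cv2.imshow("pressure", pressure_map)
    if cv2.waitKey(30) & 0xFF == 27:                  # Esc to quit
        break
cap.release()
cv2.destroyAllWindows()
```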
Procedia PDF Downloads 248
10371 A Comparative Study of Optimization Techniques and Models for Forecasting Dengue Fever
Abstract:
Dengue is a serious public health issue that causes significant annual economic and welfare burdens on nations. However, enhanced optimization techniques and quantitative modeling approaches can predict the incidence of dengue, and by advocating a data-driven approach, public health officials can make informed decisions, improving the overall effectiveness of outbreak control efforts. This study uses environmental data from two U.S. federal agencies, the National Oceanic and Atmospheric Administration and the Centers for Disease Control and Prevention. Based on environmental data describing changes in temperature, precipitation, vegetation, and other factors known to affect dengue incidence, several predictive models are constructed that use different machine learning methods to estimate weekly dengue cases. The first step is data preparation, which includes handling outliers and missing values so the data are ready for subsequent processing and the creation of an accurate forecasting model. In the second phase, multiple feature selection procedures are applied using various machine learning models and optimization techniques. In the third phase, machine learning models such as the Huber Regressor, Support Vector Machine, Gradient Boosting Regressor (GBR), and Support Vector Regressor (SVR) are compared with several optimization techniques for feature selection, such as Harmony Search and the Genetic Algorithm. In the fourth stage, model performance is evaluated using Mean Square Error (MSE), Mean Absolute Error (MAE), and Root Mean Square Error (RMSE) as metrics. The goal is to select the optimization strategy with the fewest errors, lowest cost, greatest productivity, or maximum potential results; optimization is widely employed across industries, including engineering, science, management, mathematics, finance, and medicine. An effective optimization method based on harmony search and an integrated genetic algorithm is introduced for input feature selection, and it shows an important improvement in the model's predictive accuracy. The predictive models built on the Huber Regressor perform best for both optimization and prediction.
Keywords: deep learning model, dengue fever, prediction, optimization
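The sketch below shows the shape of the third- and fourth-stage comparison using scikit-learn, with the named regressors scored by MSE, MAE, and RMSE; the feature matrix is synthetic stand-in data, not the NOAA/CDC dataset, and no feature-selection optimization is included.

```python
# Minimal sketch comparing the regressors named above with the stated error
# metrics, on synthetic data standing in for weekly dengue counts.
import numpy as np
from sklearn.linear_model import HuberRegressor
from sklearn.svm import SVR
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error, mean_absolute_error

rng = np.random.default_rng(42)
X = rng.random((300, 6))                 # temperature, precipitation, ...
y = 40 * X[:, 0] + 25 * X[:, 1] + rng.normal(0, 3, 300)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
models = {
    "Huber": HuberRegressor(),
    "SVR": SVR(),
    "GBR": GradientBoostingRegressor(random_state=0),
}
for name, model in models.items():
    pred = model.fit(X_tr, y_tr).predict(X_te)
    mse = mean_squared_error(y_te, pred)
    mae = mean_absolute_error(y_te, pred)
    print(f"{name}: MSE={mse:.2f}  MAE={mae:.2f}  RMSE={np.sqrt(mse):.2f}")
```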
Procedia PDF Downloads 65
10370 Depth Estimation in DNN Using Stereo Thermal Image Pairs
Authors: Ahmet Faruk Akyuz, Hasan Sakir Bilge
Abstract:
Depth estimation using stereo images is a challenging problem in computer vision, and many studies have been carried out to solve it. With advances in machine learning, the problem is often tackled with neural network-based solutions, where the images used are mostly in the visible spectrum. However, the need to use the infrared (IR) spectrum for depth estimation has emerged because it gives better results than the visible spectrum in some conditions. We therefore recommend using thermal-thermal (IR) image pairs for depth estimation. In this study, we used two well-known networks (PSMNet, FADNet) with minor modifications to demonstrate the viability of this idea.
Keywords: thermal stereo matching, deep neural networks, CNN, depth estimation
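For context, the sketch below computes disparity on a rectified thermal pair with classical semi-global matching, a common non-learned baseline against which networks such as PSMNet and FADNet are judged; it is not the paper's method, and the file names and parameters are placeholders.

```python
# Minimal non-DNN baseline sketch: OpenCV semi-global block matching on a
# rectified stereo pair of thermal images.
import cv2

left = cv2.imread("thermal_left.png", cv2.IMREAD_GRAYSCALE)
right = cv2.imread("thermal_right.png", cv2.IMREAD_GRAYSCALE)

matcher = cv2.StereoSGBM_create(
    minDisparity=0,
    numDisparities=64,      # must be divisible by 16
    blockSize=7,
    P1=8 * 7 * 7,           # smoothness penalties for small/large
    P2=32 * 7 * 7,          # disparity changes
)
disparity = matcher.compute(left, right).astype("float32") / 16.0  # fixed-point
cv2.imwrite("disparity.png", cv2.normalize(disparity, None, 0, 255,
                                           cv2.NORM_MINMAX).astype("uint8"))
```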
Procedia PDF Downloads 279
10369 Edge Detection Using Multi-Agent System: Evaluation on Synthetic and Medical MR Images
Authors: A. Nachour, L. Ouzizi, Y. Aoura
Abstract:
Recent developments in multi-agent systems have opened a new research field in image processing, where several algorithms are used simultaneously and improved for different applications while new methods are investigated. This paper presents a new automatic method for edge detection using several agents and many different actions. The proposed multi-agent system is based on parallel agents that locally perceive their environment, that is to say, pixels and additional environmental information. This environment is built using Vector Field Convolution, which attracts free agents to the edges. Problems of partial or hidden edges and of edge linking are solved through cooperation between agents. The presented method was implemented and evaluated on several synthetic and medical images. The experimental results obtained confirm the efficiency and accuracy of the detected edges.
Keywords: edge detection, medical MR images, multi-agent systems, vector field convolution
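To make the agent idea concrete, the sketch below implements a much-simplified variant in which agents walk a Sobel gradient field and mark strong-gradient pixels as edges; the authors' system instead builds the environment with Vector Field Convolution and cooperative behaviors, which are not reproduced here.

```python
# Simplified agent-based edge marking (illustrative only): agents perceive
# the gradient magnitude of their 8-neighborhood, mark strong pixels as
# edges, and move toward the strongest unmarked neighbor.
import numpy as np
from scipy import ndimage

def agent_edges(img, n_agents=50, steps=500, thresh=30.0, seed=0):
    gx = ndimage.sobel(img.astype(float), axis=1)
    gy = ndimage.sobel(img.astype(float), axis=0)
    grad = np.hypot(gx, gy)
    h, w = grad.shape
    rng = np.random.default_rng(seed)
    edges = np.zeros((h, w), dtype=bool)
    agents = [tuple(p) for p in rng.integers(0, [h, w], size=(n_agents, 2))]
    for _ in range(steps):
        for i, (r, c) in enumerate(agents):
            if grad[r, c] > thresh:
                edges[r, c] = True
            # Inspect the 8-neighborhood, prefer the strongest unmarked pixel.
            best, best_val = (r, c), -1.0
            for dr in (-1, 0, 1):
                for dc in (-1, 0, 1):
                    rr, cc = r + dr, c + dc
                    if 0 <= rr < h and 0 <= cc < w and not edges[rr, cc]:
                        if grad[rr, cc] > best_val:
                            best, best_val = (rr, cc), grad[rr, cc]
            agents[i] = best
    return edges
```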
Procedia PDF Downloads 391
10368 Electrodeposition and Selenization of CuIn Alloys for the Synthesis of Photoactive Cu2In1-xGaxSe2 (CIGS) Thin Films
Authors: Mohamed Benaicha, Mahdi Allam
Abstract:
A new two-stage electrochemical process is studied as a safe, large-area, and low-cost technique for the production of semiconducting CuInSe2 (CIS) thin films. CuIn precursors were first potentiostatically electrodeposited onto molybdenum substrates from an acidic thiocyanate electrolyte. In a second stage, the prepared metallic CuIn layers were used as substrates for electrochemical selenium deposition and then subjected to thermal treatment in a vacuum atmosphere, so that the binary Cu2-xSe and InxSey selenides react and form the CuInSe2 thin film rather than remaining as separate phases. Electrochemical selenization from an aqueous electrolyte is introduced as an alternative to the toxic and hazardous H2Se or Se vapor-phase selenization used in physical techniques. In this study, the influence of film deposition parameters such as bath composition, temperature, and potential on film properties was examined. The electrochemical, morphological, structural, and compositional properties of the electrodeposited thin films were characterized using various techniques. Results of cyclic and stripping-cyclic voltammetry (CV, SCV), scanning electron microscopy (SEM), and energy-dispersive X-ray microanalysis (EDX) revealed good reproducibility and homogeneity of the film composition. Optimal technological parameters for the electrochemical production of CuIn and Se as precursors for CuInSe2 thin layers are thereby determined.
Keywords: photovoltaic, CIGS, copper alloys, electrodeposition, thin films
Procedia PDF Downloads 464
10367 Studying the Value-Added Chain for the Fish Distribution Process at Quang Binh Fishing Port in Vietnam
Authors: Van Chung Nguyen
Abstract:
The purpose of this study is to examine the current status of the value chain for fish distribution at Quang Binh Fishing Port, using 360 research samples in which the subjects are fishermen, traders, retailers, and businesses. The research applies the value chain framework of Kaplinsky and Morris to quantify and describe the market channels and the actors participating in the value chain, and to analyze the value-adding process of these actors by market channel. The analysis shows that fishermen catch fish with high economic efficiency, but processing enterprises and especially retailers are the agents who capture the higher added value. The role of processing enterprises is not really clear due to outdated processing technology; retailers, in contrast, obtain the highest added value. This shows that the added value of the fish supply chain at Quang Binh fishing port is still limited, leading to low output quality. The selling price of fish is therefore still high relative to the abundant fish resources, which lowers consumption and limits exports owing to the quality of the processing enterprises; this in turn reduces demand and fishing capacity, leaving productivity below its potential. To improve the fish value chain at the port, it is necessary to focus on improving product quality, strengthening linkages between actors, building brands and consumption markets, and improving the capacity of export processing enterprises.
Keywords: Quang Binh fishing port, value chain, market, distribution channels
Procedia PDF Downloads 73
10366 Multi-Criteria Decision Making Approaches for Facility Planning Problem Evaluation: A Survey
Authors: Ahmed M. El-Araby, Ibrahim Sabry, Ahmed El-Assal
Abstract:
The relationships between industrial facilities, the capacity available to them, and the costs involved are the main factors in the correct selection of a facility layout; in general, the facility layout issue is considered an unstructured decision-making problem. The objective of this work is to provide a survey describing the techniques by which a facility planning problem can be solved, and the effect of these techniques on the efficiency of the layout. According to previous research, multi-criteria decision making (MCDM) techniques can be classified into three categories: the use of a single MCDM method, the combination of two or more MCDM methods, and the integration of MCDM with another technique such as genetic algorithms (GA). This paper reviews the different MCDM techniques proposed in the literature for picking the most suitable layout design. These methods are particularly suited to complex situations with various criteria and conflicting goals that need to be optimized simultaneously.
Keywords: facility layout, MCDM, GA, literature review
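As a toy illustration of the first category (a single MCDM method), the sketch below ranks hypothetical layout alternatives with a simple weighted sum; the criteria, weights, and scores are invented for illustration.

```python
# Minimal weighted-sum MCDM example for facility layout selection.
import numpy as np

criteria = ["material handling cost", "flexibility", "space utilization"]
weights = np.array([0.5, 0.3, 0.2])          # must sum to 1

# Rows = layout alternatives, columns = criteria; scores are normalized to
# [0, 1], with cost pre-inverted so that higher is better in every column.
scores = np.array([
    [0.8, 0.6, 0.7],   # layout A
    [0.6, 0.9, 0.5],   # layout B
    [0.7, 0.7, 0.9],   # layout C
])

totals = scores @ weights
for name, total in zip("ABC", totals):
    print(f"layout {name}: {total:.3f}")
print(f"recommended: layout {'ABC'[np.argmax(totals)]}")
```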
Procedia PDF Downloads 205
10365 Recent Development on Application of Microwave Energy on Process Metallurgy
Authors: Mamdouh Omran, Timo Fabritius
Abstract:
A growing interest in microwave heating has emerged recently, and many researchers have begun to pay attention to microwave energy as an alternative technique for processing various primary and secondary raw materials. Compared to conventional methods, microwave processing offers several advantages, such as selective, rapid, and volumetric heating. The present study summarizes our recent work on the use of microwave energy for the recovery of valuable metals from primary and secondary raw materials. The research focuses mainly on two applications: the recovery and recycling of metals from the wastes of different metallurgical industries (i.e., electric arc furnace (EAF) dust and blast furnace (BF) and basic oxygen furnace (BOF) sludges), and the upgrading and recovery of valuable metals from primary raw materials (i.e., iron ore). The results indicate that microwave heating is a promising and effective technique for processing primary and secondary steelmaking wastes. After microwave treatment of iron ore for 60 s at 900 W, grindability increased by about 28.30%, and wet high-intensity magnetic separation (WHIMS) showed that magnetic separation increased from 34% to 98% after treatment for 90 s at 900 W. In the case of EAF dust, microwave processing at 1100 W for 20 min raised zinc removal from 64% to about 97%, depending on the mixture ratio and treatment time.
Keywords: dielectric properties, microwave heating, raw materials, secondary raw materials
Procedia PDF Downloads 96
10364 Visual Search Based Indoor Localization in Low Light via RGB-D Camera
Authors: Yali Zheng, Peipei Luo, Shinan Chen, Jiasheng Hao, Hong Cheng
Abstract:
Most traditional visual indoor navigation algorithms and methods consider only localization in ordinary daytime, whereas this paper focuses on indoor re-localization in low light. Since RGB images are degraded in low light, the less discriminative infrared and depth image pairs captured by an RGB-D camera are taken as input, and the most similar candidates are retrieved as output from a database built in the bag-of-words framework. Epipolar constraints can then be used to re-localize the query infrared and depth image sequence. We evaluate our method on two datasets captured with a Kinect2. The results demonstrate very promising re-localization performance for indoor navigation systems in low-light environments.
Keywords: indoor navigation, low light, RGB-D camera, vision based
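A minimal sketch of the retrieval side of such a system is shown below: ORB descriptors (chosen here for illustration, not necessarily the authors' features) are quantized into a small k-means vocabulary, images become word histograms, and the nearest database entry is returned. File names and parameters are placeholders.

```python
# Bag-of-words image retrieval sketch: ORB descriptors -> k-means visual
# vocabulary -> per-image word histograms -> nearest-neighbor lookup.
import cv2
import numpy as np

orb = cv2.ORB_create(nfeatures=500)

def descriptors(path):
    img = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    return orb.detectAndCompute(img, None)[1].astype(np.float32)

db_paths = ["room_a.png", "room_b.png", "room_c.png"]   # placeholder database
all_desc = np.vstack([descriptors(p) for p in db_paths])

k = 32                                                   # visual words
criteria = (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 20, 1.0)
_, _, vocab = cv2.kmeans(all_desc, k, None, criteria, 3, cv2.KMEANS_PP_CENTERS)

def bow_hist(desc):
    # Assign each descriptor to its nearest visual word, then normalize.
    d = np.linalg.norm(desc[:, None, :] - vocab[None, :, :], axis=2)
    hist = np.bincount(d.argmin(axis=1), minlength=k).astype(np.float32)
    return hist / hist.sum()

db_hists = [bow_hist(descriptors(p)) for p in db_paths]
query = bow_hist(descriptors("query_lowlight.png"))
best = int(np.argmin([np.linalg.norm(query - h) for h in db_hists]))
print("most similar candidate:", db_paths[best])
```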
Procedia PDF Downloads 461
10363 Tumor Detection of Cerebral MRI by Multifractal Analysis
Authors: S. Oudjemia, F. Alim, S. Seddiki
Abstract:
This paper shows the application of multifractal analysis as an additional aid in cancer diagnosis. Medical image processing is a very important discipline in which many existing methods seek solutions to real problems in medicine. In this work, we present results of multifractal analysis of brain MRI images, the purpose being to separate healthy from cancerous brain tissue. A nonlinear method based on the multifractal detrended moving average (MFDMA), a generalization of detrended fluctuation analysis (DFA), is used to detect abnormalities in these images. The proposed method separated the two types of brain tissue successfully. It is important to note that this nonlinear method was chosen because of the complexity and irregularity of tumor tissue, which linear and classical nonlinear methods seem unable to characterize completely. To show the performance of the method, we compared its results with those of the conventional box-counting method.
Keywords: irregularity, nonlinearity, MRI brain images, multifractal analysis, brain tumor
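For orientation, the sketch below implements plain monofractal DFA, the baseline that MFDMA generalizes, on a 1D signal; extending it to moving-average detrending, multiple moments, and 2D images as in the paper is not shown.

```python
# Minimal DFA sketch: a scaling exponent near 0.5 indicates uncorrelated
# noise; larger values indicate long-range correlation.
import numpy as np

def dfa(signal, scales):
    profile = np.cumsum(signal - np.mean(signal))     # integrated signal
    flucts = []
    for s in scales:
        n_seg = len(profile) // s
        rms = []
        for i in range(n_seg):
            seg = profile[i * s:(i + 1) * s]
            t = np.arange(s)
            trend = np.polyval(np.polyfit(t, seg, 1), t)  # local detrending
            rms.append(np.sqrt(np.mean((seg - trend) ** 2)))
        flucts.append(np.mean(rms))
    # Slope of log F(s) vs log s is the scaling exponent alpha.
    return np.polyfit(np.log(scales), np.log(flucts), 1)[0]

rng = np.random.default_rng(1)
print(dfa(rng.normal(size=4096), scales=[16, 32, 64, 128, 256]))  # ~0.5
```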
Procedia PDF Downloads 443
10362 A Multi-Criteria Model for Scheduling of Stochastic Single Machine Problem with Outsourcing and Solving It through Application of Chance Constrained
Authors: Homa Ghave, Parmis Shahmaleki
Abstract:
This paper presents a new multi-criteria stochastic mathematical model for single machine scheduling with outsourcing allowed. Multiple jobs are processed in batches, and for each batch all of the jobs, or some quantity of them, can be outsourced. The jobs have stochastic processing times and lead times, carry deterministic due dates, and arrive randomly. Because of the stochastic nature of the processing and lead times, chance-constrained programming is used to model the problem: it is first formulated as a stochastic program and then converted into a deterministic mixed-integer linear program. The objectives in the model are to minimize the maximum tardiness and the outsourcing cost simultaneously. Several procedures have been developed to deal with the multi-criteria problem; in this paper, we utilize the concept of satisfaction functions to incorporate the manager's preferences. The proposed approach is tested on instances where the random variables are normally distributed.
Keywords: single machine scheduling, multi-criteria mathematical model, outsourcing strategy, uncertain lead times and processing times, chance constrained programming, satisfaction function
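The sketch below illustrates the standard chance-constraint conversion under the normality assumption the paper tests: a probabilistic due-date constraint P(C_j <= d_j) >= alpha becomes the deterministic inequality mu_j + z_alpha * sigma_j <= d_j. The numbers are illustrative, and this is the generic textbook transformation rather than the paper's full model.

```python
# Deterministic equivalent of a chance constraint with a normally distributed
# completion time: P(C <= d) >= alpha  <=>  mu + z_alpha * sigma <= d.
from scipy.stats import norm

alpha = 0.95
mu, sigma = 40.0, 5.0       # mean and std. dev. of completion time (hours)
due = 50.0                  # deterministic due date

z = norm.ppf(alpha)         # ~1.645 for alpha = 0.95
feasible = mu + z * sigma <= due
print(f"deterministic equivalent: {mu:.1f} + {z:.3f}*{sigma:.1f} "
      f"= {mu + z * sigma:.2f} <= {due:.1f} -> {feasible}")
```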
Procedia PDF Downloads 264
10361 Meta-Analysis Comparing the Femoral Tunnel Length, Femoral Tunnel Position and Graft Bending Angle of Transtibial, Anteromedial and Outside-In Techniques for Single-Bundle Anterior Cruciate Ligament Reconstruction
Authors: Andrew Tan Hwee Chye, Yeo Zhen Ning
Abstract:
This study aims to meta-analyse clinical studies comparing the femoral tunnel position (FTP), femoral tunnel length (FTL) and graft bending angle (GBA) of single-bundle anterior cruciate ligament (ACL) reconstruction using the transtibial (TT), anteromedial (AM) and outside-in (OI) techniques. A meta-analysis comparing the FTP, FTL and GBA of single-bundle ACL reconstruction utilising the TT, AM and OI techniques was performed, including prospective comparative studies (PCS) and retrospective comparative studies (RCS) from PubMed, the Cochrane Library, and Embase. A total of 17 studies were included. TT had the longest FTL when compared to AM (mean difference = 7.38, 95% CI: 3.76 to 11.00, P < 0.001) and OI (mean difference = 9.47, 95% CI: 4.89 to 14.05, P < 0.001). In the deep-to-shallow direction, OI resulted in a significantly deeper femoral tunnel than TT (mean difference = 4.36, 95% CI: 1.39 to 7.33, P = 0.004). The AM technique also gave a significantly lower tunnel position than the OI technique (mean difference = 2.34, 95% CI: 0.76 to 3.92, P = 0.004). There were no significant differences in graft bending angle between the TT, AM and OI techniques. AM and OI provide a more anatomical position than TT, and although the FTL with TT is longer than with AM and OI, all three techniques exceed the critical length of 25 mm.
Keywords: femoral tunnel position, femoral tunnel length, anterior cruciate ligament, transtibial, graft bending angle, anteromedial, outside-in
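The sketch below shows the inverse-variance, fixed-effect pooling that produces mean differences with 95% CIs like those reported above; the per-study numbers are invented for illustration and are not the included studies' data.

```python
# Fixed-effect meta-analysis sketch: pool per-study mean differences, with
# standard errors recovered from the reported 95% confidence intervals.
import numpy as np

# (mean difference, CI lower, CI upper) per hypothetical study
studies = [(6.9, 2.1, 11.7), (8.0, 3.5, 12.5), (7.2, 1.8, 12.6)]

md = np.array([s[0] for s in studies])
se = np.array([(s[2] - s[1]) / (2 * 1.96) for s in studies])  # SE from CI
w = 1.0 / se**2                                               # inverse variance

pooled = np.sum(w * md) / np.sum(w)
pooled_se = np.sqrt(1.0 / np.sum(w))
lo, hi = pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se
print(f"pooled MD = {pooled:.2f} (95% CI {lo:.2f} to {hi:.2f})")
```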
Procedia PDF Downloads 126
10360 Solving 94-Bit ECDLP with 70 Computers in Parallel
Authors: Shunsuke Miyoshi, Yasuyuki Nogami, Takuya Kusaka, Nariyoshi Yamai
Abstract:
The elliptic curve discrete logarithm problem (ECDLP) is one of the problems on which the security of pairing-based cryptography rests. This paper considers Pollard's rho method to evaluate the security of the ECDLP on a Barreto-Naehrig (BN) curve, an efficient pairing-friendly curve, and proposes techniques to make the rho method efficient. In particular, the group structure of the BN curve, the distinguished-point method, and the Montgomery trick are well-known techniques; this paper applies them and shows their optimization. According to the experimental results, for which a large-scale parallel system with MySQL was used, a 94-bit ECDLP was solved in about 28 hours by parallelizing 71 computers.
Keywords: Pollard's rho method, BN curve, Montgomery multiplication
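To show the shape of the algorithm, the sketch below runs Pollard's rho with Floyd cycle detection on a toy multiplicative group modulo a small prime; the BN elliptic-curve arithmetic, distinguished points, and Montgomery trick from the paper are all omitted, and the parameters are illustrative.

```python
# Pollard's rho for g^x = h in a toy group: the subgroup of prime order 53
# generated by g = 4 in Z_107^*. Floyd cycle detection finds a collision
# g^a * h^b = g^A * h^B, from which the discrete log is recovered.
p, n = 107, 53               # prime modulus, prime order of the subgroup
g, h = 4, pow(4, 10, p)      # h = g^10, so the expected answer is 10

def step(x, a, b):
    # Pseudo-random walk over three partitions of the group.
    if x % 3 == 0:
        return x * x % p, 2 * a % n, 2 * b % n   # square
    if x % 3 == 1:
        return x * g % p, (a + 1) % n, b         # multiply by g
    return x * h % p, a, (b + 1) % n             # multiply by h

x, a, b = 1, 0, 0            # tortoise
X, A, B = x, a, b            # hare (moves two steps per iteration)
while True:
    x, a, b = step(x, a, b)
    X, A, B = step(*step(X, A, B))
    if x == X:
        break

# From g^a h^b = g^A h^B:  (b - B) * log = (A - a)  (mod n).
r = (b - B) % n
if r:                        # r is invertible since n is prime
    log_x = (A - a) * pow(r, -1, n) % n
    print("log =", log_x, "check:", pow(g, log_x, p) == h)
```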
Procedia PDF Downloads 272
10359 The Residual Effects of Special Merchandising Sections on Consumers' Shopping Behavior
Authors: Shih-Ching Wang, Mark Lang
Abstract:
This paper examines the secondary effects and consequences of special displays on subsequent shopping behavior, studying special displays as a prominent form of in-store or shopper marketing activity. Two experiments were performed using special value-oriented and special quality-oriented displays in an online simulated store environment. The impact of exposure to special displays on mindsets and the resulting product choices is tested in a shopping task, and the impact on store image is also tested. The experiments find that special displays do trigger shopping mindsets that affect product choices and shopping basket composition and value, with both intended and unintended, positive and negative effects. Special value displays improve store price image but trigger a price-sensitive shopping mindset that causes more lower-priced items to be purchased, lowering total basket dollar value. Special natural-food displays improve store quality image and trigger a quality-oriented mindset that causes fewer lower-priced items to be purchased, increasing total basket dollar value. These findings extend the theories of product categorization, mindsets, and price sensitivity found in communication research into the retail store environment. They also warn retailers to consider the total effects and consequences of special displays when designing and executing in-store or shopper marketing activity.
Keywords: special displays, mindset, shopping behavior, price consciousness, product categorization, store image
Procedia PDF Downloads 283
10358 Extraction of Essential Oil From Orange Peels
Authors: Aayush Bhisikar, Neha Rajas, Aditya Bhingare, Samarth Bhandare, Amruta Amrutkar
Abstract:
Orange peels are currently thrown away as garbage in India once the edible parts of the fruit are consumed, yet the nation depends on essential oils for use by companies that produce goods including food, beverages, cosmetics, and medicines. This study was conducted to show how orange peel can be used effectively in the creation of essential oil by means of various extraction techniques. Steam distillation, water distillation, and solvent extraction were the techniques considered in this paper. Due to its relative prevalence among the extraction techniques, Design Expert 7.0 was used to plan an experimental run for solvent extraction. After extraction, the oil was examined to ascertain its physical and chemical characteristics.
Keywords: orange peels, extraction, essential oil, distillation
Procedia PDF Downloads 87
10356 Translation Directionality: An Eye Tracking Study
Authors: Elahe Kamari
Abstract:
Research on the translation process has been conducted for more than 20 years, investigating various issues and using different research methodologies. Most recently, researchers have started to use eye tracking to study translation processes, on the premise that the observable, measurable data gained from eye tracking are indicators of the unobservable cognitive processes happening in the translator's mind during translation tasks. The aim of this study was to investigate directionality in translation processes through eye tracking. The following hypotheses were tested: 1) processing the target text requires more cognitive effort than processing the source text, in both directions of translation; 2) L2 translation tasks on the whole require more cognitive effort than L1 tasks; 3) more cognitive resources are allocated to processing the source text in L1 translation than in L2 translation; 4) more cognitive resources are allocated to processing the target text in L2 translation than in L1 translation; and 5) in both directions, non-professional translators invest more cognitive effort in translation tasks than professional translators do. The performance of a group of 30 male professional translators was compared with that of a group of 30 male non-professional translators. All participants translated two comparable texts, one into their L1 (Persian) and the other into their L2 (English). The eye tracker measured gaze time, average fixation duration, total task length, and pupil dilation, variables assumed to measure the cognitive effort allocated to the translation task. The eye-tracking data confirmed only the first hypothesis, which was supported by all the relevant indicators: gaze time, average fixation duration, and pupil dilation. The second hypothesis, that L2 translation tasks require more cognitive resources than L1 tasks, was not confirmed by all four indicators. The third hypothesis, that source-text processing requires more cognitive resources in L1 translation than in L2 translation, and the fourth, that target-text processing requires more cognitive effort in L2 translation than in L1 translation, were not confirmed; it seems that source-text processing in L2 translation can be just as demanding as in L1 translation. The final hypothesis, that non-professional translators allocate more cognitive resources to the same translation tasks than professionals do, was partially confirmed; one of the indicators, average fixation duration, indicated higher cognitive-effort-related values for professionals.
Keywords: translation processes, eye tracking, cognitive resources, directionality
Procedia PDF Downloads 464
10355 Video Processing of a Football Game: Detecting Features of a Football Match for Automated Calculation of Statistics
Authors: Rishabh Beri, Sahil Shah
Abstract:
We applied a range of filters and processing steps to extract various features of a football game, such as the field lines of the football pitch. Another important aspect was detecting the players on the field and tagging them according to their teams, distinguished by jersey colour. This extracted information about the players and the field was combined to create a virtual field consisting of the playing surface and the players mapped to their locations on it.
Keywords: detect, football, players, virtual
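The sketch below shows one plausible OpenCV pipeline for the two steps described: Hough-transform line detection for the field markings and an HSV colour mask for one team's jerseys. The colour ranges, thresholds, and file name are assumptions, not the authors' settings.

```python
# Field-line and player detection sketch for one broadcast-style frame.
import cv2
import numpy as np

frame = cv2.imread("match_frame.png")                  # placeholder frame
hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)

# 1) Field lines: Canny edges, then probabilistic Hough line segments.
edges = cv2.Canny(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY), 50, 150)
lines = cv2.HoughLinesP(edges, 1, np.pi / 180, threshold=80,
                        minLineLength=60, maxLineGap=10)

# 2) Players of one team: mask on an assumed red jersey range, then contours.
mask = cv2.inRange(hsv, (0, 120, 70), (10, 255, 255))
contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                               cv2.CHAIN_APPROX_SIMPLE)
players = [cv2.boundingRect(c) for c in contours if cv2.contourArea(c) > 150]

print(f"{0 if lines is None else len(lines)} line segments, "
      f"{len(players)} red-team candidates")
```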
Procedia PDF Downloads 331
10354 Constraint-Directed Techniques for Transport Scheduling with Capacity Restrictions of Automotive Manufacturing Components
Authors: Martha Ndeley, John Ikome
Abstract:
In this paper, we expand the scope of constraint-directed techniques to deal with transportation scheduling under capacity restrictions, where the scheduling problem includes alternative activities. That is, the scheduling problem consists not only of determining when an activity is to be executed, but also of determining which set of alternative activities is to be executed, at every level of transportation from input to output. Such problems encompass both alternative resource problems and alternative process plan problems. We formulate a constraint-based representation of alternative activities to model problems containing such choices, and then extend existing constraint-directed scheduling heuristic commitment techniques and propagators to reason directly about the fact that an activity does not necessarily have to appear in the final transportation schedule. Tentative results show that an algorithm using a novel texture-based heuristic commitment technique and propagators achieves the best overall performance among the techniques tested.
Keywords: production, transportation, scheduling, integrated
Procedia PDF Downloads 362
10353 Combining Diffusion Maps and Diffusion Models for Enhanced Data Analysis
Authors: Meng Su
Abstract:
High-dimensional data analysis often presents challenges in capturing the complex, nonlinear relationships and manifold structures inherent to the data. This article presents a novel approach that leverages the strengths of two powerful techniques, Diffusion Maps and Diffusion Probabilistic Models (DPMs), to address these challenges. By integrating the dimensionality reduction capability of Diffusion Maps with the data modeling ability of DPMs, the proposed method aims to provide a comprehensive solution for analyzing and generating high-dimensional data. The Diffusion Map technique preserves the nonlinear relationships and manifold structure of the data by mapping it to a lower-dimensional space using the eigenvectors of the graph Laplacian matrix. Meanwhile, DPMs capture the dependencies within the data, enabling effective modeling and generation of new data points in the low-dimensional space. The generated data points can then be mapped back to the original high-dimensional space, ensuring consistency with the underlying manifold structure. Through a detailed example implementation, the article demonstrates the potential of the proposed hybrid approach to achieve more accurate and effective modeling and generation of complex, high-dimensional data. Furthermore, it discusses possible applications in various domains, such as image synthesis, time-series forecasting, and anomaly detection, and outlines future research directions for enhancing the scalability, performance, and integration with other machine learning techniques. By combining the strengths of Diffusion Maps and DPMs, this work paves the way for more advanced and robust data analysis methods.
Keywords: diffusion maps, diffusion probabilistic models (DPMs), manifold learning, high-dimensional data analysis
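A minimal sketch of the diffusion-map half of the proposal appears below: pairwise Gaussian affinities are row-normalized into a Markov transition matrix whose leading non-trivial eigenvectors give the low-dimensional coordinates. The DPM half and the mapping back to the original space are not reproduced, and the kernel bandwidth is illustrative.

```python
# Diffusion-map embedding sketch: Gaussian kernel -> Markov matrix -> top
# non-trivial eigenvectors as diffusion coordinates.
import numpy as np

def diffusion_map(X, n_components=2, epsilon=1.0):
    # Pairwise squared distances -> Gaussian kernel affinities.
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    K = np.exp(-d2 / epsilon)
    P = K / K.sum(axis=1, keepdims=True)     # row-stochastic transition matrix
    evals, evecs = np.linalg.eig(P)
    order = np.argsort(-evals.real)
    idx = order[1:n_components + 1]          # skip trivial eigenvalue 1
    return evecs.real[:, idx] * evals.real[idx]

rng = np.random.default_rng(0)
theta = rng.uniform(0, 2 * np.pi, 200)       # noisy circle embedded in 3-D
X = np.column_stack([np.cos(theta), np.sin(theta),
                     0.1 * rng.normal(size=200)])
print(diffusion_map(X).shape)                # (200, 2)
```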
Procedia PDF Downloads 108
10352 Integration of Quality Function Deployment and Modular Function Deployment in Product Development
Authors: Naga Velamakuri, Jyothi K. Reddy
Abstract:
"Quality must be designed into a product, not inspected into it" has become the main motto of companies globally. With technology advancing rapidly over the past few decades, consumer demands have become more sophisticated, and to sustain this global revolution of innovation in production systems, companies have to take steps to accommodate this growth. In the process of understanding customers' expectations, firms globally take steps to deliver a perfect output, and most of the relevant techniques concentrate on the consistent development and optimization of the product to exceed those expectations. Quality Function Deployment (QFD) and Modular Function Deployment (MFD) are such techniques: they rely on the voice of the customer and help deliver on the identified needs. In this paper, the QFD and MFD techniques, which help convert qualitative descriptions into quantitative outcomes, are discussed. The area of interest is to understand the scope of each technique and its range of application in product development when the two are applied together to a problem. The research question is mainly aimed at comprehending the limitations of using modularity in product development.
Keywords: quality function deployment, modular function deployment, house of quality, methodology
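As a small worked example of the QFD conversion just mentioned, the sketch below multiplies customer-importance weights through a house-of-quality relationship matrix to rank technical characteristics; all needs, weights, and scores are invented for illustration.

```python
# House-of-quality core computation: customer importance weighted through
# the relationship matrix yields technical-characteristic importance.
import numpy as np

needs = ["easy to assemble", "durable", "lightweight"]
importance = np.array([5, 4, 3])                # customer importance (1-5)

tech = ["part count", "material strength", "mass"]
# relationships[i, j]: strength of link between need i and characteristic j,
# using the conventional 0/1/3/9 scale.
relationships = np.array([
    [9, 1, 3],
    [1, 9, 3],
    [3, 1, 9],
])

tech_importance = importance @ relationships
for name, score in sorted(zip(tech, tech_importance), key=lambda p: -p[1]):
    print(f"{name}: {score}")
```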
Procedia PDF Downloads 328
10351 Changing Body Ideals of Ethnically Diverse Gay and Heterosexual Men and the Proliferation of Social and Entertainment Media
Authors: Cristina Azocar, Ivana Markova
Abstract:
A survey of 565 male undergraduates examined the effects of exposure to social networking sites and entertainment media on young men's body image. Exposure to social and entertainment media was found to have negative effects on men's body satisfaction, social comparison, and thin-ideal internalization, with significant differences between men who were more exposed to social and entertainment media and those who were not as exposed. Consistent with past studies, gay men were found to be more dissatisfied with their bodies than straight men; gay men compared themselves to better-looking individuals and internalized ideal body types seen in media significantly more than their straight counterparts. Surprisingly, straight men seem to care as much about their physical attractiveness and appearance as gay men do, but only in public settings such as at the beach, at athletic events (including gyms), and at social events. Although on average the ethnic groups were more similar than different, small but significant differences occurred, with Asian men indicating significantly higher body dissatisfaction than their White/European and Middle Eastern/Arab counterparts. The study increases our knowledge about SNS and entertainment media use and their associated effects on body image and body satisfaction among low-income ethnic minority men.
Keywords: body dissatisfaction, body image, entertainment media, gay men, race and ethnicity, social economic status, social comparison, social media
Procedia PDF Downloads 133
10350 DeepNIC: A Method to Transform Each Tabular Variable into an Independent Image Analyzable by Basic CNNs
Authors: Nguyen J. M., Lucas G., Ruan S., Digonnet H., Antonioli D.
Abstract:
Introduction: Deep learning (DL) is a very powerful tool for analyzing image data, but for tabular data it cannot compete with machine learning methods like XGBoost. The research question becomes: can tabular data be transformed into images that can be analyzed by simple CNNs (convolutional neural networks)? Will DL be the absolute tool for data classification? Current solutions consist of repositioning the variables in a 2D matrix using their correlation proximity, obtaining an image whose pixels are the variables. We implement a technology, DeepNIC, that instead offers the possibility of obtaining an image for each variable, which can be analyzed by simple CNNs. Material and method: The 'ROP' (Regression OPtimized) model is a binary, atypical decision tree whose nodes are managed by a new artificial neuron, the Neurop. By positioning an artificial neuron in each node of the decision trees, it is possible to make an adjustment on a theoretically infinite number of variables at each node. From this new decision tree whose nodes are artificial neurons, we created the concept of a 'Random Forest of Perfect Trees' (RFPT), which departs from Breiman's concepts by assembling very large numbers of small trees with no classification errors. From the results of the RFPT, we developed a family of 10 statistical information criteria, the Nguyen Information Criteria (NICs), which evaluate the predictive quality of a variable in three dimensions: performance, complexity, and multiplicity of solutions. A NIC is a probability that can be transformed into a grey level. The value of a NIC depends essentially on two super-parameters used in the Neurops; by varying these two super-parameters, we obtain a 2D matrix of probabilities for each NIC. We can combine the 10 NICs with the functions AND, OR, and XOR, giving a total number of combinations greater than 100,000. In total, we obtain for each variable an image of at least 1166x1167 pixels, in which the intensity of a pixel is proportional to the probability of the associated NIC and the color depends on the associated NIC. This image contains considerable information about the ability of the variable to predict Y, depending on the presence or absence of other variables. A basic CNN model was trained for supervised classification. Results: The first results are impressive. Using the public GSE22513 data (an omic data set of markers of taxane sensitivity in breast cancer), DeepNIC outperformed other statistical methods, including XGBoost. We still need to generalize the comparison across several databases. Conclusion: The ability to transform any tabular variable into an image offers the possibility of merging image and tabular information in the same format, which opens up great perspectives in the analysis of metadata.
Keywords: tabular data, CNNs, NICs, DeepNICs, random forest of perfect trees, classification
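Only the final rendering step lends itself to a generic sketch: assuming each NIC value is a probability in [0, 1] mapped to a grey level, the snippet below writes a 2D grid of such values out as a grayscale image that a basic CNN could consume. The grid here is random stand-in data; nothing of the ROP/RFPT/NIC computation itself is reproduced.

```python
# Render a grid of probabilities as a grayscale image (illustrative only).
import numpy as np
from PIL import Image

rng = np.random.default_rng(0)
nic_grid = rng.random((1166, 1167))          # placeholder NIC probabilities

pixels = (nic_grid * 255).astype(np.uint8)   # probability -> grey level
Image.fromarray(pixels, mode="L").save("variable_as_image.png")
```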
Procedia PDF Downloads 125
10349 Relationship among Teams' Information Processing Capacity and Performance in Information System Projects: The Effects of Uncertainty and Equivocality
Authors: Ouafa Sakka, Henri Barki, Louise Cote
Abstract:
Uncertainty and equivocality are defined in the information processing literature as two task characteristics that require different information processing responses from managers. As uncertainty often stems from a lack of information, addressing it is thought to require the collection of additional data; as equivocality stems from ambiguity and a lack of understanding of the task at hand, addressing it is thought to require rich communication between those involved. Past research has provided weak to moderate empirical support for these hypotheses. The present study contributes to this literature by defining uncertainty and equivocality at the project level and investigating their moderating effects on the association between several project information processing constructs and project performance. The information processing constructs considered are the amount of information collected by the project team and the richness and frequency of the formal communications among team members when discussing the project's follow-up reports. Data on 93 information system development (ISD) project managers were collected in a questionnaire survey and analyzed via the Fisher test for correlation differences. The results indicate that the highest project performance levels were observed in projects characterized by high uncertainty and low equivocality, in which project managers were provided with detailed and updated information on project costs and schedules. In addition, our findings show that information about user needs and the technical aspects of the project is less useful for managing projects where uncertainty and equivocality are high. Further, while the strongest positive effect of interactive use of follow-up reports on performance occurred in projects where both uncertainty and equivocality were high, its weakest effect occurred when both were low.
Keywords: uncertainty, equivocality, information processing model, management control systems, project control, interactive use, diagnostic use, information system development
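The sketch below shows the standard Fisher z-test for the difference between two independent correlations, the analysis named above; the correlation values and sample sizes are illustrative, not the study's data.

```python
# Fisher test for the difference between two independent correlations.
import math
from scipy.stats import norm

def fisher_z_test(r1, n1, r2, n2):
    z1, z2 = math.atanh(r1), math.atanh(r2)          # Fisher z-transform
    se = math.sqrt(1 / (n1 - 3) + 1 / (n2 - 3))
    z = (z1 - z2) / se
    p = 2 * (1 - norm.cdf(abs(z)))                   # two-tailed p-value
    return z, p

# e.g. correlation of information amount with performance, compared between
# high- and low-uncertainty projects (illustrative numbers).
z, p = fisher_z_test(r1=0.55, n1=45, r2=0.20, n2=48)
print(f"z = {z:.2f}, p = {p:.3f}")
```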
Procedia PDF Downloads 294