Search results for: point of maximum convergence
8799 Explicit Iterative Scheme for Approximating a Common Solution of Generalized Mixed Equilibrium Problem and Fixed Point Problem for a Nonexpansive Semigroup in Hilbert Space
Authors: Mohammad Farid
Abstract:
In this paper, we introduce and study an explicit iterative method based on the hybrid extragradient method to approximate a common solution of a generalized mixed equilibrium problem and a fixed point problem for a nonexpansive semigroup in Hilbert space. Further, we prove that the sequence generated by the proposed iterative scheme converges strongly to the common solution of the generalized mixed equilibrium problem and the fixed point problem for a nonexpansive semigroup. This common solution is the unique solution of a variational inequality problem and is the optimality condition for a minimization problem. The results presented in this paper supplement, extend and generalize the previously known results in this area.
Keywords: generalized mixed equilibrium problem, fixed-point problem, nonexpansive semigroup, variational inequality problem, iterative algorithms, hybrid extragradient method
Procedia PDF Downloads 475
8798 Path Planning for Collision Detection between Two Polyhedra
Authors: M. Khouil, N. Saber, M. Mestari
Abstract:
This study proposes a different path-planning architecture using the NECMOP, in which several nonlinear objective functions must be optimized in a conflicting situation. The ability to detect and avoid collisions is very important for mobile intelligent machines. However, many artificial vision systems are not yet able to quickly and cheaply extract this wealth of information. This network, which has been particularly reviewed, has enabled us to solve the problem of collision detection between two convex polyhedra in fixed (O(1)) time with a new approach. We used two types of neurons, linear and threshold logic, which simplified the actual implementation of all the proposed networks. This article presents a comprehensive algorithm that determines, through the AMAXNET network, a measure (a mini-maximum point) in fixed time, which allows us to detect the presence of a potential collision.
Keywords: path planning, collision detection, convex polyhedron, neural network
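The core feasibility test, deciding whether two convex bodies overlap, can be illustrated with a minimal 2-D separating-axis sketch in Python (an illustration only; the paper's AMAXNET neural network performs the 3-D test in fixed O(1) time, which this sequential code does not reproduce):

```python
def edge_normals(poly):
    """Normals of each polygon edge (orientation does not matter for SAT)."""
    n = len(poly)
    normals = []
    for i in range(n):
        (x1, y1), (x2, y2) = poly[i], poly[(i + 1) % n]
        normals.append((y2 - y1, x1 - x2))  # perpendicular to the edge
    return normals

def project(poly, axis):
    """Min/max of the polygon's vertices projected onto an axis."""
    dots = [vx * axis[0] + vy * axis[1] for vx, vy in poly]
    return min(dots), max(dots)

def convex_collide(p, q):
    """True if convex polygons p and q overlap (separating-axis theorem)."""
    for axis in edge_normals(p) + edge_normals(q):
        pmin, pmax = project(p, axis)
        qmin, qmax = project(q, axis)
        if pmax < qmin or qmax < pmin:  # found a separating axis: no collision
            return False
    return True
```

For convex shapes, the absence of a separating axis among the edge normals is both necessary and sufficient for overlap, which is what makes the test parallelizable in fixed time.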
Procedia PDF Downloads 438
8797 Hybrid Intelligent Optimization Methods for Optimal Design of Horizontal-Axis Wind Turbine Blades
Authors: E. Tandis, E. Assareh
Abstract:
Designing the optimal shape of MW wind turbine blades is addressed in a number of cases through evolutionary algorithms combined with mathematical modeling (Blade Element Momentum Theory). Among optimization methods, evolutionary algorithms enjoy many advantages, particularly stability. However, they usually require a large number of function evaluations. Since there are a large number of local extremes, the optimization method has to find the global extreme accurately. The present paper introduces a new population-based hybrid algorithm called the Genetic-Based Bees Algorithm (GBBA), meant to design the optimal shape of MW wind turbine blades. The method employs crossover and neighborhood-searching operators taken from the Genetic Algorithm (GA) and the Bees Algorithm (BA), respectively, to provide good accuracy and convergence speed. Twenty-one different blade designs were considered based on chord length, twist angle and tip speed ratio using GA results. They were compared with BA and GBBA optimum design results targeting the power coefficient and solidity. The results suggest that the final shape obtained by the proposed hybrid algorithm performs better than either BA or GA. Furthermore, accuracy and convergence speed increase when the GBBA is employed.
Keywords: blade design, optimization, genetic algorithm, bees algorithm, genetic-based bees algorithm, large wind turbine
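The GA-crossover-plus-BA-neighborhood idea can be sketched on a toy 1-D maximization problem (a hedged illustration with assumed parameter values, not the authors' blade-design implementation):

```python
import random

def hybrid_ga_bees(fitness, bounds, pop_size=20, elite=4, generations=60,
                   neighborhood=0.2, seed=0):
    """Toy hybrid of GA crossover and Bees-style neighborhood search (1-D)."""
    rng = random.Random(seed)
    lo, hi = bounds
    pop = [rng.uniform(lo, hi) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        elites = pop[:elite]
        new_pop = list(elites)
        # Bees-style local search: sample around each elite site
        for e in elites:
            for _ in range(3):
                cand = min(hi, max(lo, e + rng.uniform(-neighborhood, neighborhood)))
                new_pop.append(cand)
        # GA-style arithmetic crossover to refill the population
        while len(new_pop) < pop_size:
            a, b = rng.sample(elites, 2)
            w = rng.random()
            new_pop.append(w * a + (1 - w) * b)
        pop = new_pop
    return max(pop, key=fitness)
```

On a smooth unimodal test function such as f(x) = -(x - 3)², the hybrid converges to the optimum quickly; the real design problem replaces the fitness with a BEM power-coefficient evaluation over chord, twist and tip speed ratio.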
Procedia PDF Downloads 316
8796 Convergence of Media in New Era
Authors: Mohamad Reza Asariha
Abstract:
The development and expansion of modern communication technologies at an extraordinary speed has caused crucial changes in all economic, social, cultural and political areas of the world. The improvement of broadcast and cable technologies, in addition to expanding the production and distribution of worldwide programming, made its economic justification more appealing. The shift from an industrial economy to an information and service economy in developed countries brought unprecedented developments in the norms of world trade; as a result, it caused the expansion of media organizations into foreign markets. The growth of economic investment in many Asian nations, together with worldwide demand for media goods, created new markets, and the media, both in the domestic scene of nations and in the international field, are of great significance, with an effective and compelling presence in the equation of gaining, maintaining and expanding economic power and wealth in the world. Moreover, technological advances and technological integration are critical components of media structural change. This structural change took place under the influence of digitalization, the process that broke the boundaries between electronic media services. Until now, the regulation of mass media was totally dependent on the particular styles of data transmission that were generally used. Digitization made it possible for any content to be easily transmitted through different electronic transmission styles, and this media convergence has had clear effects on media policies and the way mass media are controlled.
Keywords: media, digital era, digital ages, media convergence
Procedia PDF Downloads 74
8795 Investigation on Performance of Change Point Algorithm in Time Series Dynamical Regimes and Effect of Data Characteristics
Authors: Farhad Asadi, Mohammad Javad Mollakazemi
Abstract:
In this paper, Bayesian online inference in models of data series is constructed by a change-point algorithm, which separates the observed time series into independent segments and studies the change and variation of the regime of the data with the related statistical characteristics. Variation in the statistical characteristics of time series data often represents separate phenomena in a dynamical system, like a change in brain state reflected in EEG signal measurements or a change in an important regime of data in many dynamical systems. In this paper, a prediction algorithm for studying change-point locations in time series data is simulated. It is verified that the pattern of the proposed data distribution is an important factor in a simpler and smoother fluctuation of the hazard-rate parameter and also in better identification of change-point locations. Finally, the conditions of how the time series distribution affects the factors in this approach are explained and validated with different time series databases for some dynamical systems.
Keywords: time series, fluctuation in statistical characteristics, optimal learning, change-point algorithm
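A minimal Bayesian online change-point sketch in the Adams-MacKay style, with a constant hazard rate and Gaussian observations of known variance, illustrates the role of the hazard-rate parameter discussed above (a simplified stand-in with assumed priors, not the authors' algorithm):

```python
import math, random

def bocpd_map_runlength(data, hazard=0.02, mu0=0.0, var0=10.0, var_x=1.0):
    """Minimal Bayesian online change-point detection: constant hazard rate,
    Gaussian data with known variance, conjugate Gaussian prior on the mean.
    Returns the MAP run length after each observation."""
    # each run-length hypothesis: (probability, posterior mean, posterior variance)
    runs = [(1.0, mu0, var0)]
    map_runs = []
    for x in data:
        grown, cp_mass = [], 0.0
        for p, m, v in runs:
            pv = v + var_x  # predictive variance for this run length
            like = math.exp(-(x - m) ** 2 / (2 * pv)) / math.sqrt(2 * math.pi * pv)
            grown.append((p * like * (1 - hazard), m, v))
            cp_mass += p * like * hazard  # mass flowing into "change just happened"
        # conjugate posterior update for the surviving run lengths
        updated = []
        for p, m, v in grown:
            k = v / (v + var_x)
            updated.append((p, m + k * (x - m), v * var_x / (v + var_x)))
        runs = [(cp_mass, mu0, var0)] + updated  # run length 0 restarts the prior
        z = sum(p for p, _, _ in runs)
        runs = [(p / z, m, v) for p, m, v in runs]
        map_runs.append(max(range(len(runs)), key=lambda i: runs[i][0]))
    return map_runs
```

The MAP run length grows steadily while the regime is stable and collapses toward zero right after a change, which is how the change-point location is read off.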
Procedia PDF Downloads 427
8794 Assessment of the Correlation of Rice Yield Traits by Simulation and Modelling Methods
Authors: Davood Barari Tari
Abstract:
In order to investigate the correlation of rice traits under different nitrogen management methods by model programming, an experiment was laid out in a rice paddy field at an experimental site in the Caspian coastal region from 2013 to 2014. The variety used was Shiroudi, a high-yielding variety. Nitrogen management followed two methods: amount of nitrogen at four levels (30, 60, 90, and 120 kg N ha-1 and control) and nitrogen splitting at four levels (T1: 50% basal + 50% at maximum tillering stage; T2: 33.33% basal + 33.33% at maximum tillering stage + 33.33% at panicle initiation stage; T3: 25% basal + 37.5% at maximum tillering stage + 37.5% at panicle initiation stage; T4: 25% basal + 25% at maximum tillering stage + 50% at panicle initiation stage). Results showed that nitrogen traits, total grain number, filled spikelets, and panicle number per m2 had a significant correlation with grain yield. Results related to the calibration and validation of the rice model indicated that the correlation between rice yield and yield components was reproduced accurately. The correlation between panicle length and grain yield was minimal. Physiological indices were simulated with low accuracy. According to the results, investigating the correlation between rice traits (physiological, morphological and phenological characters) and yield by modeling and simulation methods is very useful.
Keywords: rice, physiology, modelling, simulation, yield traits
Procedia PDF Downloads 343
8793 Finding the Optimal Meeting Point Based on Travel Plans in Road Networks
Authors: Mohammad H. Ahmadi, Vahid Haghighatdoost
Abstract:
Given a set of source locations for a group of friends, a trip plan for each group member as a sequence of Categories-of-Interest (COIs) (e.g., restaurant), and a specific COI as the common destination where all group members will gather, the goal of Meeting Point Based on Trip Plans (MPTP) queries is to find a Point-of-Interest (POI) from the different COIs such that the aggregate travel distance for the group is minimized. In this work, we considered two cases for the aggregate function: Sum and Max. For solving this query, we propose an efficient pruning technique for shrinking the search space. Our approach contains three steps. In the first step, it prunes the search space around the source locations. In the second step, it prunes the search space around the centroid of the source locations. Finally, we compute the intersection of all pruned areas as the final refined search space. We prove that the POIs beyond the refined area cannot be part of the optimal answer set. The paper also covers an extensive performance study of the proposed technique.
Keywords: meeting point, trip plans, road networks, spatial databases
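As a baseline for the pruning technique, the query itself can be stated in a few lines: scan every candidate POI and keep the one minimizing the Sum or Max aggregate distance (a brute-force sketch with Euclidean distance standing in for road-network distance):

```python
import math

def optimal_meeting_point(sources, pois, agg="sum"):
    """Exhaustive baseline for the MPTP query: pick the POI minimizing the
    aggregate (sum or max) travel distance from all sources.
    Euclidean distance stands in for road-network distance here."""
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])
    combine = sum if agg == "sum" else max
    return min(pois, key=lambda p: combine(dist(s, p) for s in sources))
```

The paper's contribution is exactly avoiding this full scan: the three pruning steps discard POIs that provably cannot beat the best candidate, so only the refined region needs to be examined.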
Procedia PDF Downloads 185
8792 Numerical and Experimental Investigation of Mixed-Mode Fracture of Cement Paste and Interface Under Three-Point Bending Test
Authors: S. Al Dandachli, F. Perales, Y. Monerie, F. Jamin, M. S. El Youssoufi, C. Pelissou
Abstract:
The goal of this research is to study the fracture process and mechanical behavior of concrete under mixed-mode I–II stress, which is essential for ensuring the safety of concrete structures. For this purpose, two-dimensional simulations of three-point bending tests under variable load and geometry on notched cement paste samples and composite samples (cement paste/siliceous aggregate) are modeled by employing Cohesive Zone Models (CZMs). Validated against the corresponding experiments, the CZM demonstrates its capacity to predict fracture propagation at the local scale.
Keywords: cement paste, interface, cohesive zone model, fracture, three-point bending test
Procedia PDF Downloads 150
8791 A Hybrid Heuristic for the Team Orienteering Problem
Authors: Adel Bouchakhchoukha, Hakim Akeb
Abstract:
In this work, we propose a hybrid heuristic in order to solve the Team Orienteering Problem (TOP). Given a set of points (or customers), each with an associated score (profit or benefit), and a team with a fixed number of members, the problem is to visit a subset of points in order to maximize the total collected score. Each member performs a tour starting at the start point and terminating at the arrival point, visiting distinct customers along the way. In addition, each point is visited at most once, and the total time of each tour cannot be greater than a given value. The proposed heuristic combines beam search and a local optimization strategy. The algorithm was tested on several sets of instances and encouraging results were obtained.
Keywords: team orienteering problem, vehicle routing, beam search, local search
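For a single team member, the beam-search core can be sketched as follows (a hedged illustration with Euclidean travel times and an assumed beam width, not the authors' full heuristic, which adds local optimization and multiple tours):

```python
import math

def beam_search_tour(start, end, points, budget, beam_width=10):
    """Beam-search sketch for a one-member orienteering tour: maximize the
    collected score subject to a travel-time budget (Euclidean travel times).
    points: list of (x, y, score) tuples."""
    def d(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])
    # state: (score, position, elapsed time, frozenset of visited indices)
    beam = [(0.0, start, 0.0, frozenset())]
    best = 0.0
    while beam:
        candidates = []
        for score, pos, t, visited in beam:
            for i, (x, y, s) in enumerate(points):
                if i in visited:
                    continue
                t2 = t + d(pos, (x, y))
                # keep only extensions that can still reach the arrival point
                if t2 + d((x, y), end) <= budget:
                    st = (score + s, (x, y), t2, visited | {i})
                    candidates.append(st)
                    best = max(best, st[0])
        candidates.sort(key=lambda s: s[0], reverse=True)
        beam = candidates[:beam_width]  # prune to the most promising states
    return best
```

Keeping only the top-scoring partial tours at each depth is what distinguishes beam search from exhaustive enumeration; the width parameter trades solution quality for speed.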
Procedia PDF Downloads 418
8790 Comparison of EMG Normalization Techniques Recommended for Back Muscles Used in Ergonomics Research
Authors: Saif Al-Qaisi, Alif Saba
Abstract:
Normalization of electromyography (EMG) data in ergonomics research is a prerequisite for interpreting the data. Normalizing accounts for variability in the data due to differences in participants’ physical characteristics, electrode placement protocols, time of day, and other nuisance factors. Typically, normalized data are reported as a percentage of the muscle’s isometric maximum voluntary contraction (%MVC). Various MVC techniques have been recommended in the literature for normalizing the EMG activity of back muscles. This research tests and compares the MVC techniques recommended in the literature for three back muscles commonly used in ergonomics research: the lumbar erector spinae (LES), latissimus dorsi (LD), and thoracic erector spinae (TES). Six healthy males from a university population participated in this research. Five different MVC exercises were compared for each muscle using the Trigno wireless EMG system (Delsys Inc.). Since the LES and TES share similar functions in controlling trunk movements, their MVC exercises were the same: trunk extension at -60°, trunk extension at 0°, trunk extension while standing, hip extension, and the arch test. The MVC exercises identified in the literature for the LD were chest-supported shoulder extension, prone shoulder extension, lat pull-down, internal shoulder rotation, and abducted shoulder flexion. The maximum EMG signal was recorded during each MVC trial, and then the averages were computed across participants. A one-way analysis of variance (ANOVA) was utilized to determine the effect of MVC technique on muscle activity. Post-hoc analyses were performed using the Tukey test. The MVC technique effect was statistically significant for each of the muscles (p < 0.05); however, a larger sample of participants would have been needed to detect significant differences in the Tukey tests.
The arch test was associated with the highest EMG average at the LES, and it also resulted in the maximum EMG activity more often than the other techniques (three out of six participants). For the TES, trunk extension at 0° was associated with the largest EMG average, and it resulted in the maximum EMG activity the most often (three out of six participants). For the LD, participants obtained their maximum EMG either from chest-supported shoulder extension (three out of six participants) or prone shoulder extension (three out of six participants). Chest-supported shoulder extension, however, had a larger average than prone shoulder extension (0.263 and 0.240, respectively). Although these techniques had the highest averages, they did not always elicit the maximum EMG activity. If an accurate estimate of the true MVC is desired, more than one technique may have to be performed. This research provides additional MVC techniques for each muscle that may elicit the maximum EMG activity.
Keywords: electromyography, maximum voluntary contraction, normalization, physical ergonomics
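The normalization step itself is straightforward once the MVC trials are recorded; a minimal sketch of %MVC normalization that takes the maximum peak across several MVC exercises, as the conclusion above recommends:

```python
def normalize_to_percent_mvc(emg_trial, mvc_trials):
    """Normalize an EMG recording to %MVC: divide by the single highest peak
    observed across all MVC exercises recorded for that muscle."""
    mvc = max(max(abs(v) for v in trial) for trial in mvc_trials)
    return [100.0 * abs(v) / mvc for v in emg_trial]
```

Pooling the peak over multiple MVC exercises guards against any one technique underestimating the true maximum, which is exactly the failure mode the study documents.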
Procedia PDF Downloads 193
8789 The Grinding Influence on the Strength of Fan-Out Wafer-Level Packages
Authors: Z. W. Zhong, C. Xu, W. K. Choi
Abstract:
To build a thin fan-out wafer-level package, the package has to be ground to a thin level. In this work, the influence of the grinding processes on the strength of fan-out wafer-level packages was investigated. After different grinding processes, all specimens were placed on a three-point bending fixture installed on a universal tester, and the strength of the fan-out wafer-level packages was measured. The experiments revealed that the average flexural strength increased with decreasing surface roughness height of the tested packages. The grinding processes thus had a significant influence on the strength of the fan-out wafer-level packages investigated.
Keywords: FOWLP strength, surface roughness, three-point bending, grinding
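In a three-point bending test on a rectangular specimen, the flexural strength follows from the standard beam formula sigma = 3FL/(2bd^2); a small sketch (the specimen values in the test are illustrative, not the paper's data):

```python
def flexural_strength_3pb(force_n, span_m, width_m, thickness_m):
    """Flexural strength from a three-point bending test on a rectangular
    specimen: sigma = 3 * F * L / (2 * b * d^2), returned in pascals.
    F: failure load (N), L: support span (m), b: width (m), d: thickness (m)."""
    return 3.0 * force_n * span_m / (2.0 * width_m * thickness_m ** 2)
```

Because thickness enters squared in the denominator, the strength reported for ground (thinned) packages is very sensitive to accurate thickness measurement.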
Procedia PDF Downloads 278
8788 The Image of Victim and Criminal in Love Crimes on Social Media in Egypt: Facebook Discourse Analysis
Authors: Sherehan Hamdalla
Abstract:
Egypt has experienced a series of terrifying love crimes in the last few months. This ‘trend’ of love crimes started with a young man caught on video slaughtering his ex-girlfriend in the street in the city of El Mansoura. The crime shocked Egyptian citizens at all levels; unfortunately, no fewer than three similar crimes took place in other Egyptian cities with the same killing trigger. Its characteristics and its ease of access and reach are why social media is one of the most crucial online communication channels; users utilize social media platforms for sharing and exchanging ideas, news, and many other activities. They can freely share posts that reflect their mindset or personal views regarding any issue, and these posts go viral across social media accounts through reposts and shares that support, or even attack, the content included. The repeated sharing of certain posts can mobilize other supporters with the same point of view, especially when that crowd’s online participation confronts the consequences of a public-opinion case. The death of that young woman was followed by similar crimes in other cities, such as El Sharkia and Port Said. These love crimes provoked a massive wave of contention among all social classes in Egypt. Strangely, some supported the criminal and defended his side for several reasons, which the study will uncover. Facebook, the most popular social media platform among Egyptians, reflects the debate between supporters of the victim and supporters of the criminal. Facebook pages were created specifically to disseminate certain viewpoints online, for example, asking for the maximum penalty to be given to the criminals. These pages aimed to mobilize the maximum number of supporters and to affect the outcome of the trials.
Keywords: love crimes, victim, criminal, social media
Procedia PDF Downloads 76
8787 A Spatial Point Pattern Analysis to Recognize Fail Bit Patterns in Semiconductor Manufacturing
Authors: Youngji Yoo, Seung Hwan Park, Daewoong An, Sung-Shick Kim, Jun-Geol Baek
Abstract:
The yield management system is very important for producing high-quality semiconductor chips in the semiconductor manufacturing process. In order to improve the quality of semiconductors, various tests are conducted in the post-fabrication (FAB) process. During the test process, a large amount of data is collected, and the data includes a lot of information about defects. In general, defects on the wafer are the main cause of yield loss. Therefore, analyzing the defect data is necessary to improve the performance of yield prediction. The wafer bin map (WBM) is one type of data collected in the test process and includes defect information such as fail bit patterns. Fail bits have the characteristics of spatial point patterns. Therefore, this paper proposes a feature extraction method using spatial point pattern analysis. Actual data obtained from the semiconductor process are used for experiments, and the experimental results show that the proposed method recognizes the fail bit patterns more accurately.
Keywords: semiconductor, wafer bin map, feature extraction, spatial point patterns, contour map
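One simple spatial point pattern feature of the kind such a method could extract is the mean nearest-neighbour distance of the fail-bit coordinates (a generic sketch, not the paper's exact feature set): clustered fail bits score low, regularly spread ones score high.

```python
import math

def mean_nn_distance(points):
    """Mean nearest-neighbour distance of fail-bit coordinates: a basic
    spatial point pattern feature distinguishing clustered from dispersed
    defect maps. points: list of (x, y) tuples, at least two of them."""
    dists = []
    for i, p in enumerate(points):
        nn = min(math.dist(p, q) for j, q in enumerate(points) if j != i)
        dists.append(nn)
    return sum(dists) / len(dists)
```

Comparing this statistic against its expected value under complete spatial randomness is the usual first step in deciding whether a fail-bit map carries a systematic pattern worth classifying.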
Procedia PDF Downloads 384
8786 Influence of Loudness Compression on Hearing with Bone Anchored Hearing Implants
Authors: Anja Kurz, Marc Flynn, Tobias Good, Marco Caversaccio, Martin Kompis
Abstract:
Bone Anchored Hearing Implants (BAHI) are routinely used in patients with conductive or mixed hearing loss, e.g. if conventional air conduction hearing aids cannot be used. New sound processors and new fitting software now allow parameters such as loudness compression ratios or maximum power output to be adjusted separately. Today it is unclear how the choice of these parameters influences aided speech understanding in BAHI users. In this prospective experimental study, the effects of varying the compression ratio and lowering the maximum power output in a BAHI were investigated. Twelve experienced adult subjects with a mixed hearing loss participated in this study. Four different compression ratios (1.0; 1.3; 1.6; 2.0) were tested along with two different maximum power output settings, resulting in a total of eight different programs. Each participant tested each program for two weeks. A blinded Latin square design was used to minimize bias. For each of the eight programs, speech understanding in quiet and in noise was assessed. For speech in quiet, the Freiburg number test and the Freiburg monosyllabic word test at 50, 65, and 80 dB SPL were used. For speech in noise, the Oldenburg sentence test was administered. Speech understanding in quiet and in noise was improved significantly in the aided condition with any program, when compared to the unaided condition. However, no significant differences were found between any of the eight programs. In contrast, on a subjective level there was a significant preference for medium compression ratios of 1.3 to 1.6 and for the higher maximum power output.
Keywords: Bone Anchored Hearing Implant, Baha, compression, maximum power output, speech understanding
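The interaction of compression ratio and maximum power output can be sketched with a static input/output rule: linear gain below a knee point, compressed growth above it, and a hard cap at the MPO (the dB values and the single-knee form are illustrative assumptions, not the fitting software's actual rule):

```python
def compress_output_db(input_db, gain_db=20.0, knee_db=60.0, ratio=1.6, mpo_db=100.0):
    """Static compression sketch, all levels in dB SPL: linear gain below the
    knee, output growth divided by the compression ratio above it, and a hard
    limit at the maximum power output (MPO)."""
    out = input_db + gain_db
    knee_out = knee_db + gain_db  # output level at which compression engages
    if out > knee_out:
        out = knee_out + (out - knee_out) / ratio
    return min(out, mpo_db)
```

With ratio 1.0 the rule is purely linear up to the MPO; raising the ratio toward 2.0 flattens loud inputs before the limiter ever engages, which is the trade-off the study's eight programs explore.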
Procedia PDF Downloads 387
8785 The Impact of Vertical Velocity Parameter Conditions and Its Relationship with Weather Parameters in the Hail Event
Authors: Nadine Ayasha
Abstract:
Hail fell in Sukabumi (August 23, 2020), Sekadau (August 22, 2020), and Bogor (September 23, 2020), in each case during the dry season. This study uses ERA5 reanalysis data to examine the impact of vertical velocity on hail occurrence in the dry season, as well as its relation to other weather parameters such as relative humidity, streamlines, and wind velocity. Moreover, HCAI satellite product data are used as supporting data for the convective cloud development analysis. Based on graphs, contours, and Hovmöller vertical cross-sections from the ERA5 model, the vertical velocity values in the 925–300 mb layer over Sukabumi, Sekadau, and Bogor before the hail events ranged between -1.2 and -0.2, -1.5 and -0.2, and -1 and 0 Pa/s, respectively. A negative value indicates upward motion of the air mass, which triggers the growth of the convective clouds that produce hail. This is evidenced by the presence of Cumulonimbus cloud in the HCAI product when the hail fell. Therefore, vertical velocity has a significant effect on the hail events. In addition, the relative humidity in the 850–700 mb layer is quite wet, ranging from 80-90%. Meanwhile, the streamlines and wind velocity in the three regions show convergence with slowing wind velocities ranging from 2-4 knots. These results show that the upward motion indicated by the vertical velocity is sufficient, together with the moist atmosphere and the convergence, for the growth of the convective clouds that produce hail in the dry season.
Keywords: hail, extreme weather, vertical velocity, relative humidity, streamline
Procedia PDF Downloads 159
8784 Curvature-Based Methods for Automatic Coarse and Fine Registration in Dimensional Metrology
Authors: Rindra Rantoson, Hichem Nouira, Nabil Anwer, Charyar Mehdi-Souzani
Abstract:
Multiple measurements by means of various data acquisition systems are generally required to measure the shape of freeform workpieces for accuracy, reliability and holisticity. The obtained data are aligned and fused into a common coordinate system by a registration technique involving coarse and fine registration. Standardized iterative methods have been established for fine registration, such as Iterative Closest Point (ICP) and its variants. For coarse registration, no conventional method has been adopted yet, despite the significant number of techniques developed in the literature to supply an automatic rough matching between data sets. Two main issues are addressed in this paper: coarse registration and fine registration. For coarse registration, two novel automated methods based on the exploitation of discrete curvatures are presented: an enhanced Hough Transformation (HT) and an improved RANSAC Transformation. The use of curvature features in both methods aims to reduce computational cost. For fine registration, a new variant of the ICP method is proposed in order to reduce registration error using curvature parameters. A specific distance considering curvature similarity has been combined with the Euclidean distance to define the distance criterion used for correspondence searching. Additionally, the objective function has been improved by combining the point-to-point (P-P) minimization and the point-to-plane (P-Pl) minimization with automatic weights. These weights are determined from the curvature features calculated beforehand at each point of the workpiece surface. The algorithms are applied to simulated and real data obtained with a computer tomography (CT) system.
The obtained results reveal the benefit of the proposed novel curvature-based registration methods.
Keywords: discrete curvature, RANSAC transformation, Hough transformation, coarse registration, ICP variant, point-to-point and point-to-plane minimization combination, computer tomography
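The combined objective can be sketched per correspondence: a curvature-similarity weight blends the point-to-point and point-to-plane residuals (a sketch of the idea with an assumed linear weighting form, not the paper's exact objective function):

```python
def combined_icp_residual(src, dst, dst_normal, curvature_weight):
    """Per-correspondence residual mixing the point-to-point and
    point-to-plane terms of an ICP objective. curvature_weight is a
    precomputed curvature-similarity weight in [0, 1]; points and the
    (unit) destination normal are 3-tuples."""
    dx = [s - d for s, d in zip(src, dst)]
    pp = sum(c * c for c in dx) ** 0.5                     # point-to-point distance
    ppl = abs(sum(c * n for c, n in zip(dx, dst_normal)))  # point-to-plane distance
    w = curvature_weight
    return w * ppl + (1 - w) * pp
```

At w = 1 the residual reduces to pure point-to-plane minimization (fast convergence on smooth regions); at w = 0 it reduces to point-to-point (robust on high-curvature features), so curvature-driven weights let each surface region use the better-suited term.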
Procedia PDF Downloads 424
8783 Generalized Additive Model Approach for the Chilean Hake Population in a Bio-Economic Context
Authors: Selin Guney, Andres Riquelme
Abstract:
The traditional bio-economic method for fisheries modeling uses estimates of the growth parameters and the system carrying capacity from a biological model of the population dynamics (usually a logistic population growth model), which is then analyzed as a traditional production function. The stock dynamics are transformed into a revenue function and then compared with the extraction costs to estimate the maximum economic yield. In this paper, the logistic population growth model is combined with a forecast of the abundance and location of the stock using a generalized additive model approach. The paper focuses on the Chilean hake population. This method allows for the incorporation of climatic variables and the interaction with other marine species, which in turn increases the reliability of the estimates and generates better extraction paths for different conservation objectives, such as the maximum biological yield or the maximum economic yield.
Keywords: bio-economic, fisheries, GAM, production
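The logistic building block is compact: surplus production r·B·(1 − B/K) peaks at B = K/2, giving a maximum sustainable yield of rK/4, the biological benchmark that the GAM-augmented forecasts then refine (a textbook sketch; the r and K values in the test are illustrative):

```python
def surplus_production(biomass, r, k):
    """Logistic surplus production: the growth available for harvest at a
    given stock size B, i.e. r * B * (1 - B / K)."""
    return r * biomass * (1.0 - biomass / k)

def maximum_sustainable_yield(r, k):
    """MSY of the logistic model, attained at biomass K/2: MSY = r * K / 4."""
    return r * k / 4.0
```

Harvesting above this yield draws the stock below K/2 and shrinks future surplus production, which is why the extraction paths in the paper are constrained by it.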
Procedia PDF Downloads 252
8782 Efficacy of Deep Learning for Below-Canopy Reconstruction of Satellite and Aerial Sensing Point Clouds through Fractal Tree Symmetry
Authors: Dhanuj M. Gandikota
Abstract:
Sensor-derived three-dimensional (3D) point clouds of trees are invaluable in remote sensing analysis for the accurate measurement of key structural metrics, bio-inventory values, spatial planning/visualization, and ecological modeling. Machine learning (ML) holds potential for addressing the restrictive tradeoffs in cost, spatial coverage, resolution, and information gain that exist in current point cloud sensing methods. Terrestrial laser scanning (TLS) remains the highest-fidelity source of both canopy and below-canopy structural features, but its usage is limited in both coverage and cost, requiring manual deployment to map out large forested areas. While aerial laser scanning (ALS) remains a reliable avenue of active LIDAR remote sensing, it is also cost-restrictive in its deployment methods. Space-borne photogrammetry from high-resolution satellite constellations is an avenue of passive remote sensing with promising viability for the accurate construction of 3-D vegetation point clouds. It provides both the lowest comparative cost and the largest spatial coverage across remote sensing methods. However, both space-borne photogrammetry and ALS demonstrate technical limitations in the capture of valuable below-canopy point cloud data. Looking to minimize these tradeoffs, we explored a class of powerful ML algorithms called deep learning (DL) that shows promise in recent research on 3-D point cloud reconstruction and interpolation. Our research details the efficacy of applying these DL techniques to reconstruct accurate below-canopy point clouds from space-borne and aerial remote sensing through learned patterns of tree species’ fractal symmetry properties and the supplementation of locally sourced bio-inventory metrics. From our dataset, consisting of tree point clouds obtained by TLS, we deconstructed the point clouds of each tree into those that would be obtained through ALS and satellite photogrammetry of varying resolutions.
We fed this ALS/satellite point cloud dataset, along with the simulated local bio-inventory metrics, into the DL point cloud reconstruction architectures to generate full 3-D tree point clouds (the ground truth being the full TLS tree point clouds containing the below-canopy information). Point cloud reconstruction accuracy was validated both through the measurement of error against the original TLS point clouds and through the error in extracting key structural metrics, such as crown base height, diameter above root crown, and leaf/wood volume. The results additionally demonstrate the supplemental performance gain of using minimal locally sourced bio-inventory metric information as an input to ML systems to reach specified accuracy thresholds of tree point cloud reconstruction. This research provides insight into methods for the rapid, cost-effective, and accurate construction of below-canopy tree 3-D point clouds, as well as the potential of ML and DL to learn complex, unmodeled patterns of fractal tree growth symmetry.
Keywords: deep learning, machine learning, satellite, photogrammetry, aerial laser scanning, terrestrial laser scanning, point cloud, fractal symmetry
Procedia PDF Downloads 103
8781 Re-Engineering of Traditional Indian Wadi into Ready-to-Use High Protein Quality and Fibre Rich Chunk
Authors: Radhika Jain, Sangeeta Goomer
Abstract:
In the present study an attempt has been made to re-engineer traditional wadi into wholesome ready-to-use cereal-pulse-based chunks rich in protein quality and fibre content. Chunks were made using extrusion-dehydration combination. Two formulations i.e., whole green gram dhal with instant oats and washed green gram dhal with whole oats were formulated. These chunks are versatile in nature as they can be easily incorporated in day-to-day home-made preparations such as pulao, potato curry and kadhi. Cereal-pulse ratio was calculated using NDpCal%. Limiting amino acids such as lysine, tryptophan, methionine, cysteine and threonine were calculated for maximum amino acid profile in cereal-pulse combination. Time-temperature combination for extrusion at 130oC and dehydration at 65oC for 7 hours and 15 minutes were standardized to obtain maximum protein and fibre content. Proximate analysis such as moisture, fat and ash content were analyzed. Protein content of formulation was 62.10% and 68.50% respectively. Fibre content of formulations was 2.99% and 2.45%, respectively. Using a 5-point hedonic scale, consumer preference trials of 102 consumers were conducted and analyzed. Evaluation of chunks prepared in potato curry, kadi and pulao showed preferences for colour 82%, 87%, 86%, texture and consistency 80%, 81%, 88%, flavour and aroma 74%, 82%, 86%, after taste 70%, 75%, 86% and overall acceptability 77%, 75%, 88% respectively. High temperature inactivates antinutritional compounds such as trypsin inhibitors, lectins, saponins etc. Hence, availability of protein content was increased. Developed products were palatable and easy to prepare.Keywords: extrusion, NDpCal%, protein quality, wadi
Procedia PDF Downloads 224
8780 Energy and Exergy Analyses of Thin-Layer Drying of Pineapple Slices
Authors: Apolinar Picado, Steve Alfaro, Rafael Gamero
Abstract:
Energy and exergy analyses of thin-layer drying of pineapple slices (Ananas comosus L.) were conducted in a laboratory tunnel dryer. Drying experiments were carried out at three temperatures (100, 115 and 130 °C) and an air velocity of 1.45 m/s. The effects of the drying variables on energy utilisation, energy utilisation ratio, exergy loss and exergy efficiency were studied. The enthalpy difference of the gas increased as the inlet gas temperature increased. At 75 minutes into the drying process, the outlet gas enthalpy reached a maximum value very close to the inlet value and remained constant until the end of drying. This behaviour is due to the reduction of the total enthalpy within the system, or in other words, the reduction of the effective heat transfer from the hot gas flow to the vegetable being dried. Further, the outlet entropy exhibited a significant increase, due not only to the temperature variation but also to the increase of the water vapour contained in the hot gas flow. The maximum of the exergy efficiency curve corresponds to the maximum observed in the drying rate curves; it represents the stage at which the available energy is used most efficiently in removing moisture from the solid. As the drying rate decreases, the available energy is employed less and less. The exergetic efficiency was directly dependent on the evaporation flux, and since convective drying is less efficient than other types of dryers, the exergetic efficiency is likely to take relatively low values.
Keywords: efficiency, energy, exergy, thin-layer drying
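The specific flow exergy of the drying air is commonly evaluated from its temperature relative to the dead state, ex = cp[(T − T0) − T0 ln(T/T0)], with exergy efficiency taken as an outlet-to-inlet exergy ratio. The sketch below illustrates that textbook relation with hypothetical temperatures; the values of cp, T0 and the inlet/outlet temperatures are illustrative assumptions, not the paper's measured data:

```python
import math

CP_AIR = 1.005   # kJ/(kg K), specific heat of dry air (assumed constant)

def flow_exergy(t_celsius, t0_celsius=25.0):
    """Specific flow exergy of an air stream, kJ/kg:
    ex = cp * [(T - T0) - T0 * ln(T / T0)], temperatures in kelvin."""
    t, t0 = t_celsius + 273.15, t0_celsius + 273.15
    return CP_AIR * ((t - t0) - t0 * math.log(t / t0))

# Hypothetical inlet/outlet drying-air temperatures.
ex_in = flow_exergy(100.0)   # inlet at 100 °C
ex_out = flow_exergy(60.0)   # outlet at 60 °C
efficiency = ex_out / ex_in  # exergy efficiency (one common definition)
print(round(ex_in, 2), round(efficiency, 3))
```

At the dead-state temperature the flow exergy is zero, which is a quick sanity check on the formula.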
Procedia PDF Downloads 255
8779 Automatic Multi-Label Image Annotation System Guided by Firefly Algorithm and Bayesian Method
Authors: Saad M. Darwish, Mohamed A. El-Iskandarani, Guitar M. Shawkat
Abstract:
Nowadays, the amount of available multimedia data is continuously on the rise, and finding a required image is a challenging task for an ordinary user. Content-based image retrieval (CBIR) computes relevance based on the visual similarity of low-level image features such as color and texture. However, there is a gap between these low-level visual features and the semantic meanings required by applications. The typical way of bridging this semantic gap is automatic image annotation (AIA), which extracts semantic features using machine learning techniques. In this paper, a multi-label image annotation system guided by the Firefly algorithm and a Bayesian method is proposed. First, images are segmented using a maximum-variance criterion and the Firefly algorithm, a swarm-based approach with high convergence speed and low computational cost that searches for the optimal multiple thresholds. Feature extraction techniques based on color features and region properties are then applied to obtain representative features. After that, the images are annotated using a translation model based on the Net Bayes system, which is efficient for multi-label learning, with high precision and low complexity. Experiments were performed using the Corel database. The results show that the proposed system outperforms traditional ones for automatic image annotation and retrieval.
Keywords: feature extraction, feature selection, image annotation, classification
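To illustrate the segmentation step, the sketch below runs a toy firefly search for a single Otsu-style threshold (maximum between-class variance) on synthetic bimodal pixel data. The hyper-parameter values and the single-threshold simplification are assumptions for illustration, not the paper's multi-threshold setup:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic bimodal "image": dark pixels near 50, bright pixels near 200.
pixels = np.concatenate([rng.normal(50, 10, 500), rng.normal(200, 10, 500)])

def between_class_variance(t):
    """Otsu objective w0*w1*(mu0 - mu1)^2 for a candidate threshold t."""
    lo, hi = pixels[pixels <= t], pixels[pixels > t]
    if len(lo) == 0 or len(hi) == 0:
        return 0.0
    w0, w1 = len(lo) / len(pixels), len(hi) / len(pixels)
    return w0 * w1 * (lo.mean() - hi.mean()) ** 2

# Minimal firefly search over t in [0, 255] (assumed hyper-parameters).
n, iters, beta0, gamma, alpha = 10, 40, 1.0, 1e-3, 5.0
pos = rng.uniform(0, 255, n)
for _ in range(iters):
    light = np.array([between_class_variance(t) for t in pos])
    for i in range(n):
        for j in range(n):
            if light[j] > light[i]:  # move firefly i toward brighter j
                r2 = (pos[i] - pos[j]) ** 2
                pos[i] += beta0 * np.exp(-gamma * r2) * (pos[j] - pos[i]) \
                          + alpha * rng.normal()
    pos = np.clip(pos, 0, 255)

best = pos[np.argmax([between_class_variance(t) for t in pos])]
print(round(best, 1))  # lands between the two modes
```

Multi-level segmentation extends the same idea by letting each firefly carry a vector of thresholds instead of a scalar.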
Procedia PDF Downloads 586
8778 Temporally Coherent 3D Animation Reconstruction from RGB-D Video Data
Authors: Salam Khalifa, Naveed Ahmed
Abstract:
We present a new method to reconstruct a temporally coherent 3D animation from single- or multi-view RGB-D video data using unbiased feature point sampling. Given RGB-D video data in the form of a 3D point cloud sequence, our method first extracts feature points using both color and depth information. In subsequent steps, these feature points are used to match two 3D point clouds in consecutive frames, independent of their resolution. Our new motion-vector-based dynamic alignment method then fully reconstructs a spatio-temporally coherent 3D animation. We perform extensive quantitative validation using novel error functions to analyze the results, and show that despite the limiting factors of temporal and spatial noise associated with RGB-D data, it is possible to exploit temporal coherence to faithfully reconstruct a temporally coherent 3D animation from RGB-D video data.
Keywords: 3D video, 3D animation, RGB-D video, temporally coherent 3D animation
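The frame-to-frame matching step can be pictured as nearest-neighbour association of feature points followed by motion-vector computation; the sketch below is a minimal brute-force illustration of that idea, not the paper's resolution-independent matcher:

```python
import numpy as np

def motion_vectors(frame_a, frame_b):
    """For each feature point in frame_a, find its nearest neighbour in
    frame_b and return the displacement (motion) vectors, shape (N, 3)."""
    d = np.linalg.norm(frame_a[:, None, :] - frame_b[None, :, :], axis=-1)
    nearest = d.argmin(axis=1)
    return frame_b[nearest] - frame_a

# A regularly spaced cloud rigidly translated between consecutive frames.
g = np.arange(5, dtype=float)
frame_a = np.stack(np.meshgrid(g, g, g), axis=-1).reshape(-1, 3)
frame_b = frame_a + np.array([0.1, 0.0, 0.0])
vecs = motion_vectors(frame_a, frame_b)  # every vector = the translation
print(np.round(vecs.mean(axis=0), 3))
```

With real, noisy RGB-D data the match would also weight color similarity, but the recovered vectors play the same role in the dynamic alignment.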
Procedia PDF Downloads 373
8777 Estimating 3D-Position of a Stationary Random Acoustic Source Using Bispectral Analysis of 4-Point Detected Signals
Authors: Katsumi Hirata
Abstract:
To develop a useful acoustic environment recognition system, we propose a method for estimating the 3D position of a stationary random acoustic source using bispectral analysis of signals detected at four points. The method uses information about amplitude attenuation and propagation delay, extracted from the amplitude ratios and angles of the auto- and cross-bispectra of the detected signals. Bispectral analysis is expected to be less influenced by Gaussian noise than conventional power spectral analysis. In this paper, the basic principle of the method is presented first, and its validity and features are then assessed from the results of fundamental experiments under assumed ideal conditions.
Keywords: 4-point detection, a stationary random acoustic source, auto- and cross-bispectra, estimation of 3D-position
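A direct (FFT-based) bispectrum estimate averages triple products of Fourier coefficients over segments, B(f1, f2) = E[X(f1) X(f2) X*(f1+f2)]; Gaussian noise tends to average out of this statistic, which is the property the method relies on. The sketch below illustrates the auto-bispectrum with a synthetic quadratically phase-coupled signal; the segment length, frequencies, and segment count are illustrative assumptions:

```python
import numpy as np

def bispectrum(segments):
    """Direct auto-bispectrum estimate: average X(f1)X(f2)conj(X(f1+f2))
    over segments. segments: (n_seg, n) real array. Returns (n, n) complex."""
    X = np.fft.fft(segments, axis=1)
    n = segments.shape[1]
    f = np.arange(n)
    sum_idx = (f[:, None] + f[None, :]) % n        # bin index of f1 + f2
    # For each segment: X[f1] * X[f2] * conj(X[f1+f2]), then average.
    return (X[:, :, None] * X[:, None, :] * np.conj(X[:, sum_idx])).mean(axis=0)

rng = np.random.default_rng(0)
n, n_seg = 64, 200
t = np.arange(n)
segs = np.empty((n_seg, n))
for k in range(n_seg):
    p1, p2 = rng.uniform(0, 2 * np.pi, 2)
    # Phase-coupled triplet: the 20-bin component's phase is p1 + p2.
    segs[k] = (np.cos(2 * np.pi * 8 * t / n + p1)
               + np.cos(2 * np.pi * 12 * t / n + p2)
               + np.cos(2 * np.pi * 20 * t / n + p1 + p2))
B = bispectrum(segs)
print(abs(B[8, 12]) > 10 * abs(B[8, 11]))  # coupling peaks at (8, 12)
```

The phase coupling makes the triple product at (8, 12) add coherently across segments, while uncoupled bifrequencies average toward zero.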
Procedia PDF Downloads 359
8776 Minimizing Vehicular Traffic via Integrated Land Use Development: A Heuristic Optimization Approach
Authors: Babu Veeregowda, Rongfang Liu
Abstract:
The current traffic impact assessment methodology and environmental quality review process for the approval of land development projects are conventional, stagnant, and one-dimensional. Environmental review policy and procedure lack direction for regulating or seeking alternative land uses and sizes that exploit the existing or surrounding elements of the built environment (the '4 Ds' of development: density, diversity, design, and distance to transit) or smart growth principles, which influence travel behavior and have a significant effect in reducing vehicular traffic. Additionally, environmental review policy gives no direction on how to incorporate urban planning into development, for example by including non-motorized roadway elements such as sidewalks and bus shelters, and access to community facilities. This research developed a methodology that optimizes the mix of land uses and sizes through a heuristic optimization process, minimizing auto-dependent development while meeting the interests of key stakeholders. A case study of the Willets Point Mixed Use Development in Queens, New York, was used to assess the benefits of the methodology; the approved Willets Point project was based on the maximum envelope of size and land use type allowed by current conventional urban renewal plans. This paper also evaluates parking accumulation for various land uses to explore the potential of shared parking to further optimize the mix of land uses and sizes. This research is timely and useful to the many stakeholders interested in understanding the benefits of integrated land use development.
Keywords: traffic impact, mixed use, optimization, trip generation
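A heuristic search over the land-use mix can be sketched as hill climbing that shifts floor area between uses and accepts only moves that reduce total trip generation. The trip rates, envelope size, and minimum-share constraint below are hypothetical stand-ins (not ITE rates or the study's values):

```python
import random

# Hypothetical peak-hour trip-generation rates per 1,000 sq ft.
RATES = {"office": 1.5, "retail": 3.7, "residential": 0.6}
TOTAL_SQFT = 1000.0          # fixed development envelope, in 1,000 sq ft
MIN_SHARE = 0.15             # each use keeps at least 15% of the envelope

def trips(mix):
    return sum(RATES[u] * a for u, a in mix.items())

def hill_climb(steps=5000, seed=7):
    """Shift floor area between two uses at random; keep the move only
    if total trip generation drops and the minimum share is respected."""
    rng = random.Random(seed)
    uses = list(RATES)
    mix = {u: TOTAL_SQFT / len(uses) for u in uses}   # start with even split
    for _ in range(steps):
        a, b = rng.sample(uses, 2)
        delta = rng.uniform(0, 10)
        if mix[a] - delta < MIN_SHARE * TOTAL_SQFT:
            continue
        cand = dict(mix)
        cand[a] -= delta
        cand[b] += delta
        if trips(cand) < trips(mix):                  # accept improving moves
            mix = cand
    return mix

best = hill_climb()
print({u: round(a, 1) for u, a in best.items()}, round(trips(best), 1))
```

In practice the objective would be multi-criteria (trips, revenue, shared parking), but the accept-if-better loop is the core of the heuristic.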
Procedia PDF Downloads 214
8775 Stress Concentration around Countersunk Hole in Isotropic Plate under Transverse Loading
Authors: Parveen K. Saini, Tarun Agarwal
Abstract:
An investigation into the effects of countersunk depth, plate thickness, countersink angle and plate width on the stress concentration around a countersunk hole is carried out with the help of finite element analysis. The variation of the stress concentration with respect to these parameters is studied for three types of loading: uniformly distributed load, uniformly varying load and functionally distributed load. The results of the finite element analysis are interpreted and conclusions are drawn. The distribution of the stress concentration around a countersunk hole in an isotropic plate simply supported at all edges is found to be similar for, and independent of, the loading; the maximum stress concentration also occurs at the same point irrespective of the loading conditions.
Keywords: stress concentration factor, countersunk hole, finite element, ANSYS
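The stress concentration factor itself is the ratio of the peak stress at the hole to the nominal (far-field) stress, Kt = σ_max / σ_nom. The sketch below shows that post-processing step on hypothetical nodal stresses; the values are illustrative, not the study's FE results:

```python
def stress_concentration_factor(nodal_stresses, nominal_stress):
    """Kt = maximum stress around the hole / nominal applied stress."""
    return max(nodal_stresses) / nominal_stress

# Hypothetical von Mises stresses (MPa) sampled around the hole edge.
edge_stresses = [212.0, 245.0, 298.5, 310.2, 287.9, 233.1]
kt = stress_concentration_factor(edge_stresses, nominal_stress=100.0)
print(round(kt, 3))  # → 3.102
```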
Procedia PDF Downloads 448
8774 A 3D Quantum Numerical Simulation Study of Performance in the HEMT
Authors: A. Boursali, A. Guen-Bouazza
Abstract:
We present a simulation of a HEMT (high electron mobility transistor) structure with and without a field plate. We extract the device characteristics through analysis of the DC, AC and high-frequency regimes, as shown in this paper. This work identifies the optimal device, with a gate length of 15 nm, an InAlN/GaN heterostructure and a field-plate structure, making it superior to modern HEMTs when compared with otherwise equivalent devices; the field plate improves the ability to bear the current density passing through the channel. We demonstrate an excellent current density as high as 2.05 A/m, a peak extrinsic transconductance of 0.59 S/m at VDS = 2 V, cutoff frequencies of 638 GHz for the first HEMT and 463 GHz for the field-plate HEMT, a maximum frequency of 1.7 THz, a maximum efficiency of 73%, a maximum breakdown voltage of 400 V, a leakage current density I_leak = 1 × 10⁻²⁶ A, DIBL of 33.52 mV/V, and an ON/OFF current ratio higher than 1 × 10¹⁰. These values were determined through simulations employing genetic and Monte Carlo algorithms to optimize the design and the future of this technology.
Keywords: HEMT, Silvaco, field plate, genetic algorithm, quantum
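As a back-of-the-envelope check on such figures, the unity-current-gain cutoff frequency of a FET is commonly estimated as fT ≈ gm / (2π (Cgs + Cgd)). The sketch below uses hypothetical normalized values chosen only for illustration, not the simulated device's extracted parameters:

```python
import math

def cutoff_frequency(gm, cgs, cgd):
    """fT = gm / (2 * pi * (Cgs + Cgd)), the unity current-gain frequency."""
    return gm / (2.0 * math.pi * (cgs + cgd))

# Hypothetical normalized values: gm in S/mm, capacitances in F/mm.
gm = 1.2                      # S/mm
cgs, cgd = 250e-15, 50e-15    # 250 fF/mm and 50 fF/mm
ft = cutoff_frequency(gm, cgs, cgd)
print(round(ft / 1e9))        # fT in GHz
```

Shrinking the gate (lower capacitance) or raising transconductance pushes fT up, which is why a 15 nm gate can reach hundreds of gigahertz.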
Procedia PDF Downloads 349
8773 Studies on the Physicochemical Properties of Biolubricants Obtained from Vegetable Oils and Their Oxidative Stability
Authors: Expedito J. S. Parente Jr., Italo C. Rios, Joao Paulo C. Marques, Rosana M. A. Saboya, F. Murilo T. Luna, Célio L. Cavalcante Jr.
Abstract:
Increasing constraints from environmental regulation around the world have led to higher demand for biodegradable products. Vegetable oils present some properties that favor their use as biolubricants; however, others, such as resistance to oxidation and pour point, limit possible commercial applications. In this study, the physicochemical properties of biolubricants synthesized from different vegetable oils were evaluated and compared with a petroleum-based lubricant and the pure vegetable oils. Chemical modifications applied to the original vegetable oils improved their oxidative stability and pour point significantly. The addition of commercial antioxidants to the bio-based lubricants was also evaluated, yielding oxidative stability values close to those of mineral basestock oil.
Keywords: biolubricant, vegetable oil, oxidative stability, pour point, antioxidants
Procedia PDF Downloads 312
8772 Novel Inference Algorithm for Gaussian Process Classification Model with Multiclass and Its Application to Human Action Classification
Authors: Wanhyun Cho, Soonja Kang, Sangkyoon Kim, Soonyoung Park
Abstract:
In this paper, we propose a novel inference algorithm for the multi-class Gaussian process classification model that can be used in the field of human behavior recognition. The algorithm simultaneously derives both the posterior distribution of the latent function and estimators of the hyper-parameters of the model. It is based on the Laplace approximation (LA) technique and the variational EM framework, and proceeds in two steps, called the expectation and maximization steps. First, in the expectation step, using Bayes' rule and the LA technique, we derive an approximate posterior distribution of the latent function, which indicates the possibility that each observation belongs to a certain class. Second, in the maximization step, using the derived posterior distribution of the latent function, we compute the maximum likelihood estimators of the hyper-parameters of the covariance matrix that defines the prior distribution of the latent function. These two steps repeat iteratively until a convergence condition is satisfied. Moreover, we apply the proposed algorithm to a human action classification problem using a public database, namely the KTH human action data set. Experimental results reveal that the proposed algorithm performs well on this data set.
Keywords: Bayesian rule, Gaussian process classification model with multiclass, Gaussian process prior, human action classification, Laplace approximation, variational EM algorithm
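For intuition about the expectation step, the sketch below finds the Laplace-approximation mode of the latent function for the simpler binary case with a logistic likelihood (the multi-class version replaces the sigmoid with a softmax); this is the textbook Newton iteration, not the paper's implementation:

```python
import numpy as np

def laplace_mode(K, y, iters=20):
    """Newton iteration for the mode of p(f | y) in binary GP
    classification: f <- K (I + W K)^{-1} (W f + grad log p(y|f)).
    K: (n, n) kernel matrix, y: labels in {-1, +1}."""
    n = len(y)
    f = np.zeros(n)
    for _ in range(iters):
        pi = 1.0 / (1.0 + np.exp(-f))        # sigmoid(f)
        grad = (y + 1) / 2 - pi              # d/df log p(y | f)
        W = np.diag(pi * (1.0 - pi))         # negative Hessian (diagonal)
        f = K @ np.linalg.solve(np.eye(n) + W @ K, W @ f + grad)
    return f

# Toy kernel: two similar points per class, classes dissimilar.
K = np.array([[1.0, 0.8, 0.1, 0.1],
              [0.8, 1.0, 0.1, 0.1],
              [0.1, 0.1, 1.0, 0.8],
              [0.1, 0.1, 0.8, 1.0]])
y = np.array([1, 1, -1, -1])
f_hat = laplace_mode(K, y)
print(np.sign(f_hat))  # mode agrees with the labels
```

The Gaussian centred at this mode, with covariance (K⁻¹ + W)⁻¹, is the approximate posterior used when maximizing over the kernel hyper-parameters.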
Procedia PDF Downloads 334
8771 Surface Sterilization of Aquatic Plant, Cryptocoryne affinis by Using Clorox and Mercury Chloride
Authors: Sridevi Devadas
Abstract:
This study aimed to examine the efficiency of combining Clorox (5.25% sodium hypochlorite) and mercury chloride (HgCl2) as reagents for the surface sterilization of the aquatic plant Cryptocoryne affinis (C. affinis). The treatments applied 10% Clorox and 0.1 ppm mercury chloride, with maximum exposure times of 10 min for Clorox and 60 sec for mercury chloride. After exposure to the treatment protocols (T1-T15), the explants were transferred to a culture room under a controlled temperature of 25°C ± 2°C and subjected to 16 hours of fluorescent light (2000 lumens) for 30 days. Neither sterilizing agent was applied to the control specimens. Upon analysis, all treatment protocols produced sterile explants, ranging from a minimum of 1.5 ± 0.7 (30%) to a maximum of 5.0 ± 0.0 (100%); meanwhile, a maximum of 1.0 ± 0.7 leaves and 1.4 ± 0.6 roots were produced. The optimized exposure times were 0 to 15 min for Clorox and 30 sec for HgCl2, conditions at which 90% to 100% sterilization was achieved.
Keywords: Cryptocoryne affinis, surface sterilization, tissue culture, Clorox, mercury chloride
Procedia PDF Downloads 600
8770 Fault Prognostic and Prediction Based on the Importance Degree of Test Point
Authors: Junfeng Yan, Wenkui Hou
Abstract:
Prognostics and Health Management (PHM) is a technology for monitoring equipment status and predicting impending faults. It predicts potential faults, provides fault information, and tracks trends of system degradation by capturing characteristic signals, so detecting characteristic signals is very important, and the selection of test points plays a key role in that detection. Traditionally, a dependency model is used to select the test points containing the most detection information; however, for large, complicated systems the dependency model is sometimes not easily built, and the greater trouble is calculating its matrix. On this premise, this paper provides a highly effective method for selecting test points without a dependency model, based on the signal flow model: a diagnosis model built on failure modes, which focuses on the system's failure modes and the dependency relationships between test points and faults. In the signal flow model, fault information can flow from the beginning to the end, and the location and structure information of every test point and module can be determined. We break the signal flow model into serial and parallel parts to obtain the final relationship function between the system's testability or prediction metrics and the test points. Further, through partial derivative operations, we obtain every test point's importance degree in determining testability metrics such as the undetected rate, false alarm rate, and untrusted rate. This supports installing test points according to actual requirements and provides a solid foundation for Prognostics and Health Management. Practical engineering applications show that the method is very efficient.
Keywords: false alarm rate, importance degree, signal flow model, undetected rate, untrusted rate
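As a toy illustration of the partial-derivative idea, suppose a fault passes several test points in series, each detecting it independently with probability p_i. The undetected rate is then UD = prod(1 - p_i), and the sensitivity |dUD/dp_j| = prod over i != j of (1 - p_i) can serve as an importance degree. The sketch below assumes this simplified serial model, not the paper's full serial/parallel decomposition:

```python
from math import prod

def undetected_rate(p):
    """UD = product of (1 - p_i) over the serial test points."""
    return prod(1.0 - pi for pi in p)

def importance_degrees(p):
    """|dUD/dp_j| = product of (1 - p_i) for i != j; a larger value means
    improving that test point reduces the undetected rate faster."""
    return [prod(1.0 - pi for i, pi in enumerate(p) if i != j)
            for j in range(len(p))]

p = [0.9, 0.6, 0.3]          # detection probabilities of three test points
print(round(undetected_rate(p), 3))           # → 0.028
print([round(d, 2) for d in importance_degrees(p)])
```

Parallel branches would combine by multiplying the branch undetected rates before differentiating, following the same decomposition idea.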
Procedia PDF Downloads 377