Search results for: graph-based optimization algorithm

1982 Case Study of Spacecraft Instruments in Structural Modelling with Nastran-Patran

Authors: Francisco Borja de Lara, Ali Ravanbakhsh, Robert F. Wimmer-Schweingruber, Lars Seimetz, Fermín Navarro

Abstract:

The intense structural loads during the launch of a spacecraft represent a challenge for space structure designers, because enough resistance has to be achieved while keeping the mass and volume within the allowable margins of the mission requirements and within the limits of the project budget. In this work, we present the structural analysis of the Lunar Lander Neutron Dosimetry (LND) experiment on the Chang'E4 mission, the first probe to land on the Moon's far side, included in the Chinese Moon Exploration Program of the Chinese National Space Administration. For this purpose, the Nastran/Patran software has been used: a structural model was built in Patran and the structural analysis was carried out in Nastran. The results obtained are then used both for the optimization of the spacecraft structure and as input parameters for the structural test campaign of the model. In this way, the feasibility of the lunar instrument structure is demonstrated in terms of vibration modes, stresses, and random vibration response, and our results provide a better understanding of the structural test design.

Keywords: Chang’E4, Chinese national space administration, lunar lander neutron dosimetry, nastran-patran, structural analysis

Procedia PDF Downloads 504
1981 Formulation Design and Optimization of Orodispersible Tablets of Diphenhydramine Hydrochloride Having Adequate Mechanical Strength

Authors: Jiwan P. Lavande, A. V. Chandewar

Abstract:

In the present study, orodispersible tablets of diphenhydramine hydrochloride were prepared using croscarmellose sodium, crospovidone, and camphor and menthol (as subliming agents) in different ratios, and the ODTs prepared with superdisintegrants were compared with the ODTs prepared with camphor and menthol (subliming agents) in terms of in vitro disintegration time, dispersion time, wetting time, hardness, and water absorption ratio. Results revealed that the tablets of all formulations have acceptable physical parameters. Drug-excipient compatibility was evaluated using the FTIR technique, and no incompatibility was detected. The in vitro release of drug from the DC6 formulation was quick compared to the other formulations. A stability study was carried out as per ICH guidelines for three months, and the results revealed no significant change in the disintegration time of the tablets upon storage. Microscopic study of the sublimed formulations showed the formation of pores in the tablets prepared by the sublimation method. Thus, it can be concluded that stable orodispersible tablets of diphenhydramine hydrochloride can be developed for rapid drug release.

Keywords: orodispersible tablet, subliming agent, super disintegrants, diphenhydramine hydrochloride

Procedia PDF Downloads 219
1980 Review of Urban Vitality in China: Exploring the Theoretical Framework, Characteristics, and Assessment Systems

Authors: Dong Wei, Wu Jinxiu

Abstract:

As China's urban construction enters a new phase of 'stock optimization', the focus of urban development has shifted to the development and reuse of existing public space. However, cities still face a series of challenges, such as a shortage of space and insufficient space quality, which indirectly affect urban vitality. A review of the vitality of urban public space will significantly contribute to optimizing the quality of the urban built environment. This paper first analyses the research hotspots of urban vitality in China and abroad, based on a semi-systematic literature review. It then summarizes the theoretical definitions of the vitality of urban public space and sorts out the influencing factors from the perspectives of society, environment, and users. Lastly, the paper concludes with the mainstream quantification and evaluation methods, such as linear evaluation and integrated evaluation. This paper offers a multi-theoretical perspective for understanding the characteristics and evaluation systems of the vitality of public space, which helps to recognize the dynamic relationship between users, the urban environment, and vitality. It also aims to provide design strategies for constructing vibrant public spaces in future cities.

Keywords: public space, quantification of vitality, spatial vitality, urban vitality

Procedia PDF Downloads 86
1979 Optimal Closed-Loop Input Shaping Control Scheme for a 3D Gantry Crane

Authors: Mohammad Javad Maghsoudi, Z. Mohamed, A. R. Husain

Abstract:

Input shaping has been utilized for vibration reduction of many oscillatory systems. This paper presents an optimal closed-loop input shaping scheme for control of a three-dimensional (3D) gantry crane system. The scheme comprises a PID controller and a Zero Vibration (ZV) shaper, which consider two control objectives concurrently: minimum sway of the payload and fast, accurate positioning of the trolley. A complete mathematical model of a lab-scaled 3D gantry crane is simulated in Simulink. Moreover, by utilizing a particle swarm optimization (PSO) algorithm and the proposed scheme, the controller is designed to cater for both control objectives concurrently. Simulation studies on a 3D gantry crane show that the proposed optimal controller has an acceptable performance: it provides a good position response with satisfactory payload sway in both the rail and trolley responses.
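
The abstract does not give the PSO cost function or the crane parameters, so the sketch below only illustrates the general idea of PSO-tuned PID gains against a simulated plant; the double-integrator trolley with a linearised payload pendulum, the gain bounds, and the sway-penalty weight are hypothetical stand-ins rather than the authors' setup.

```python
import numpy as np

def simulate(gains, T=10.0, dt=0.01):
    """Euler simulation of a toy trolley (double integrator) with a linearised
    payload pendulum; returns a cost mixing position error and sway angle."""
    kp, ki, kd = gains
    g, L = 9.81, 1.0                            # gravity, hypothetical cable length [m]
    x = v = theta = omega = ie = 0.0
    target, cost = 1.0, 0.0
    for _ in range(int(T / dt)):
        e = target - x
        ie += e * dt
        u = kp * e + ki * ie - kd * v           # PID force on the trolley
        a = u                                   # unit trolley mass (assumption)
        alpha = -(g / L) * theta - a / L        # linearised sway dynamics
        v += a * dt;      x += v * dt
        omega += alpha * dt; theta += omega * dt
        cost += (abs(e) + 5.0 * abs(theta)) * dt   # weighted two-objective cost
    return cost

def pso(n_particles=20, iters=60, bounds=(0.0, 50.0)):
    rng = np.random.default_rng(0)
    lo, hi = bounds
    pos = rng.uniform(lo, hi, size=(n_particles, 3))
    vel = np.zeros_like(pos)
    pbest, pbest_cost = pos.copy(), np.array([simulate(p) for p in pos])
    gbest = pbest[pbest_cost.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
        vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, lo, hi)
        cost = np.array([simulate(p) for p in pos])
        improved = cost < pbest_cost
        pbest[improved], pbest_cost[improved] = pos[improved], cost[improved]
        gbest = pbest[pbest_cost.argmin()].copy()
    return gbest

print("PSO-tuned (Kp, Ki, Kd):", pso())
```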

Keywords: 3D gantry crane, input shaping, closed-loop control, optimal scheme, PID

Procedia PDF Downloads 396
1978 Operator Optimization Based on Hardware Architecture Alignment Requirements

Authors: Qingqing Gai, Junxing Shen, Yu Luo

Abstract:

Due to hardware architecture characteristics, some operators tend to achieve better performance if the input/output tensor dimensions are aligned to a certain minimum granularity, such as the convolution and deconvolution commonly used in deep learning. If the requirements are not met, the general strategy is to pad with zeros to satisfy them, potentially leading to under-utilization of the hardware resources. Therefore, for convolutions and deconvolutions whose input and output channels do not meet the minimum granularity alignment, we propose to transfer the W-dimensional data to the C-dimension for computation (W2C), so that the C-dimension meets the hardware requirements. This scheme also reduces the number of computations in the W-dimension. Although the scheme substantially increases computation, the operator's speed can improve significantly. It achieves remarkable speedups on multiple hardware accelerators, including NVIDIA Tensor Cores, Qualcomm digital signal processors (DSPs), and Huawei neural processing units (NPUs). All that is needed is to modify the network structure and rearrange the operator weights offline, without retraining. At the same time, for some operators, such as ReduceMax, we observe that transferring the C-dimensional data to the W-dimension (C2W) and replacing the ReduceMax with a MaxPool can achieve acceleration under certain circumstances.
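
The exact W2C data layout is not described in the abstract; the following NumPy sketch shows one plausible rearrangement in which a factor of the W dimension is folded into C until the channel count meets an assumed alignment granularity. The NCHW layout, the granularity of 32, and the choice of folding factor are illustrative assumptions, and a real deployment would also rearrange the convolution weights accordingly.

```python
import numpy as np

def w2c(x, granularity=32):
    """Fold a factor of the W dimension into C so that the channel count
    meets a hardware alignment granularity.  Layout assumed NCHW; the
    folding factor and granularity are illustrative assumptions."""
    n, c, h, w = x.shape
    if c % granularity == 0:
        return x                      # already aligned, nothing to do
    # smallest factor of W that pushes C*f to a multiple of the granularity
    for f in range(2, w + 1):
        if w % f == 0 and (c * f) % granularity == 0:
            break
    else:
        return x                      # no suitable factor found
    # N, C, H, W -> N, C*f, H, W/f : adjacent width positions become channels
    x = x.reshape(n, c, h, w // f, f)         # split W
    x = x.transpose(0, 1, 4, 2, 3)            # move the factor next to C
    return x.reshape(n, c * f, h, w // f)

x = np.random.rand(1, 24, 8, 16).astype(np.float32)   # 24 channels, not 32-aligned
y = w2c(x)
print(x.shape, "->", y.shape)          # (1, 24, 8, 16) -> (1, 96, 8, 4)
```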

Keywords: convolution, deconvolution, W2C, C2W, alignment, hardware accelerator

Procedia PDF Downloads 84
1977 Improving the Frequency Response of a Circular Dual-Mode Resonator with a Reconfigurable Bandwidth

Authors: Muhammad Haitham Albahnassi, Adnan Malki, Shokri Almekdad

Abstract:

In this paper, a method for reconfiguring the bandwidth of a circular dual-mode resonator is presented. The method concerns the optimized geometry of a structure that may be used to host the tuning elements, which are typically RF (radio frequency) switches. The tuning elements themselves, and their performance during tuning, are not the focus of this paper. The designed resonator is able to reconfigure its fractional bandwidth by adjusting the inter-coupling level between the degenerate modes, while at the same time improving its response by adjusting the external-coupling level and keeping the center frequency fixed. The inter-coupling level has been adjusted by changing the dimensions of the perturbation element, while the external-coupling level has been adjusted by changing one of the feeder dimensions. The design was arrived at via optimization. The simulation and measurement results of the designed and implemented filters are in good agreement and show clear improvements in the return loss values and in the stability of the center frequency.

Keywords: dual-mode resonators, perturbation theory, reconfigurable filters, software defined radio, cognitive radio

Procedia PDF Downloads 146
1976 Research on Ultrafine Particles Classification Using Hydrocyclone with Annular Rinse Water

Authors: Tao Youjun, Zhao Younan

Abstract:

The separation effect of fine coal can be improved by pre-desliming; it was significantly enhanced when the fine coal was processed using a Falcon concentrator after removal of the -45 μm coal slime. Ultrafine classification tests using a Krebs classification cyclone with annular rinse water showed that increasing the feeding pressure can effectively avoid heavy particles passing into the overflow and light particles slipping into the underflow. Increasing the rinse water pressure reduced the content of fine-grained particles while increasing the classification size. An increase in feeding concentration had a negative effect on the classification efficiency and also increased the classification size, due to the enhanced hindered settling caused by the high underflow concentration. Optimization experiments based on an orthogonal design in Design-Expert software, with classification efficiency as the response indicator, showed that the optimal classification efficiency reached 91.32% at a feeding pressure of 0.03 MPa, a rinse water pressure of 0.02 MPa, and a feeding concentration of 12.5%. Meanwhile, the classification size was 49.99 μm, in good agreement with the predicted value.

Keywords: hydrocyclone, ultrafine classification, slime, classification efficiency, classification size

Procedia PDF Downloads 146
1975 A Grid Synchronization Phase Locked Loop Method for Grid-Connected Inverter Systems

Authors: Naima Ikken, Abdelhadi Bouknadel, Nour-eddine Tariba, Ahmed Haddou, Hafsa El Omari

Abstract:

The operation of grid-connected inverters requires accurate and fast estimation of the grid phase angle; to this end, a single-phase phase-locked loop (PLL) is proposed in this article. The article presents an improved phase-locked loop method. The novelty is a grid synchronization PLL with a notch filter based on adaptive fuzzy logic for grid-connected inverter systems. The performance of the proposed method was tested under normal and abnormal operating conditions (amplitude, frequency, and phase shift variations). In addition, simulation results obtained with the ISPM software are presented to verify the effectiveness of the proposed strategy. Finally, experimental tests are used to extract the results and discuss the validity of the proposed algorithm.
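
The adaptive fuzzy notch filter itself is the contribution of the paper and is not reproduced here; the sketch below is only a minimal single-phase PLL with a product-type phase detector, a plain first-order low-pass in place of the notch filter, and a PI loop filter, with all gains and the 50 Hz grid assumption chosen for illustration.

```python
import numpy as np

fs, f_grid = 10_000.0, 50.0                  # sample rate and true grid frequency (assumed)
dt = 1.0 / fs
t = np.arange(0, 0.5, dt)
v = np.cos(2 * np.pi * f_grid * t + 0.7)     # grid voltage with an unknown phase offset

theta = 0.0                                   # estimated phase
w = 2 * np.pi * 50.0                          # frequency estimate [rad/s]
kp, ki = 100.0, 2000.0                        # PI gains (illustrative)
integ, lpf = 0.0, 0.0
alpha = 2 * np.pi * 20.0 * dt                 # first-order low-pass coefficient

for vk in v:
    err = vk * -np.sin(theta)                 # product phase detector: ~0.5*sin(phase error)
    lpf += alpha * (err - lpf)                # crude low-pass in place of the paper's notch filter
    integ += ki * lpf * dt                    # integral part of the PI loop filter
    w = 2 * np.pi * 50.0 + kp * lpf + integ   # nominal frequency + PI correction
    theta = (theta + w * dt) % (2 * np.pi)    # integrate frequency to phase

print("final frequency estimate [Hz]:", w / (2 * np.pi))
```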

Keywords: phase locked loop, PLL, notch filter, fuzzy logic control, grid connected inverters

Procedia PDF Downloads 130
1974 Modeling Heat-Related Mortality Based on Greenhouse Emissions in OECD Countries

Authors: Anderson Ngowa Chembe, John Olukuru

Abstract:

Greenhouse emissions from human activities are known to irreversibly increase global temperatures through the greenhouse effect. This study proposes a mortality model with sensitivity to heat-change effects as one of its underlying parameters. To that end, the study sought to establish the relationship between greenhouse emissions and mortality indices in five OECD countries (USA, UK, Japan, Canada, and Germany). Upon establishing the relationship using correlation analysis, an additional parameter accounting for the sensitivity of mortality rates to heat changes was incorporated into the Lee-Carter model. Based on the proposed model, new parameter estimates were calculated using iterative optimization algorithms. Finally, the goodness of fit of the original Lee-Carter model and the proposed model was compared using deviance comparison. The proposed model provides a better fit to mortality rates, especially in the USA, UK, and Germany, where the mortality indices have a strong positive correlation with the level of greenhouse emissions. The results of this study are of particular importance to actuaries, demographers, and climate-risk experts who seek better mortality-modelling techniques in the wake of heat effects caused by increased greenhouse emissions.
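
The heat-sensitivity extension is specific to this study, so the sketch below fits only the classical Lee-Carter model log m(x,t) = a_x + b_x k_t via SVD on synthetic log-rates; the emissions-linked parameter would enter as an additional term, and the data used here are not real OECD data.

```python
import numpy as np

def fit_lee_carter(log_m):
    """Classical Lee-Carter fit via SVD: log m(x,t) = a_x + b_x * k_t,
    where log_m is an (ages x years) matrix of log central death rates."""
    a = log_m.mean(axis=1)                        # a_x: average age profile
    U, S, Vt = np.linalg.svd(log_m - a[:, None], full_matrices=False)
    b, k = U[:, 0], S[0] * Vt[0, :]
    b_sum = b.sum()                               # normalise: sum(b)=1, sum(k)=0
    b, k = b / b_sum, k * b_sum
    a, k = a + b * k.mean(), k - k.mean()
    return a, b, k

# synthetic log-rates, only to exercise the fit (not real OECD data)
rng = np.random.default_rng(1)
ages, years = 10, 30
true_a = np.linspace(-7.0, -2.0, ages)
true_b = np.linspace(0.05, 0.15, ages); true_b /= true_b.sum()
true_k = np.linspace(5.0, -5.0, years)
log_m = true_a[:, None] + np.outer(true_b, true_k) \
        + 0.01 * rng.standard_normal((ages, years))

a, b, k = fit_lee_carter(log_m)
print("k_t (mortality index) from", round(k.min(), 2), "to", round(k.max(), 2))
```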

Keywords: climate risk, greenhouse emissions, Lee-Carter model, OECD

Procedia PDF Downloads 321
1973 A Deterministic Large Deviation Model Based on Complex N-Body Systems

Authors: David C. Ni

Abstract:

In previous efforts, we constructed N-body systems by an extended Blaschke product (EBP), which represents a non-temporal and nonlinear extension of the Lorentz transformation. In this construction, we rely on only two parameters, the nonlinear degree and the relative momentum, to characterize the systems. We further explored root computation via iteration with an algorithm extended from the Jenkins-Traub method. The solution sets take the form σ + i[-t, t], where σ and t are real numbers and [-t, t] shows various canonical distributions. In this paper, we correlate the convergent sets in the original domain with the solution sets, which demonstrate large-deviation distributions in the codomain. We then compare our approach with established formulations and principles, such as the Donsker-Varadhan and Wentzell-Freidlin theories. The deterministic model based on this construction allows us to explore applications in the areas of finance and statistical mechanics.

Keywords: nonlinear Lorentz transformation, Blaschke equation, iteration solutions, root computation, large deviation distribution, deterministic model

Procedia PDF Downloads 377
1972 Vibration Control of Two Adjacent Structures Using a Non-Linear Damping System

Authors: Soltani Amir, Wang Xuan

Abstract:

The advantage of using a non-linear passive damping system for vibration control of two adjacent structures is investigated under base excitation, taken as the El Centro earthquake acceleration record. The damping system is an optimal and effective non-linear viscous damper connected between the two adjacent structures. A Matlab program is developed to produce the stiffness and damping matrices and to carry out a time history analysis of the dynamic motion of the system. One structure is assumed to be flexible, while the other acts as a laterally supporting structure with rigid frames. The response of the structure has been calculated, and the non-linear damping coefficient is determined using the optimal LQR algorithm in an optimal vibration control system. The non-linear parameter of the damping system is estimated, and the results show a significant advantage in applying this device for vibration control of two adjacent tall buildings.
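
The structural matrices and the damper model of the study are not given in the abstract; the sketch below only shows the generic step of computing an LQR feedback gain for a toy two-storey shear model, with hypothetical mass, stiffness, and damping values and illustrative LQR weights.

```python
import numpy as np
from scipy.linalg import solve_continuous_are

# Toy two-storey shear model (hypothetical masses, stiffnesses, damping)
m1, m2 = 1.0e5, 1.0e5            # kg
k1, k2 = 2.0e7, 2.0e7            # N/m
c1, c2 = 1.0e5, 1.0e5            # N.s/m

M = np.diag([m1, m2])
K = np.array([[k1 + k2, -k2], [-k2, k2]])
C = np.array([[c1 + c2, -c2], [-c2, c2]])

# State x = [displacements; velocities], single control force acting on floor 1
A = np.block([[np.zeros((2, 2)), np.eye(2)],
              [-np.linalg.solve(M, K), -np.linalg.solve(M, C)]])
B = np.vstack([np.zeros((2, 1)), np.linalg.solve(M, np.array([[1.0], [0.0]]))])

Q = np.diag([1e6, 1e6, 1.0, 1.0])   # weight displacements heavily (illustrative)
R = np.array([[1e-6]])

P = solve_continuous_are(A, B, Q, R)      # Riccati solution
G = np.linalg.solve(R, B.T @ P)           # optimal feedback gain, u = -G x
print("LQR gain matrix:\n", G)
```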

Keywords: active control, passive control, viscous dampers, structural control, vibration control, tall building

Procedia PDF Downloads 490
1971 Study of the Tribological Behavior of Coated Cutting Tools

Authors: A. Achour, L. Chekour, A. Mekroud

Abstract:

Tribology, the science of lubrication, friction, and wear, plays an important role as a 'crossroads' science driven by recent developments in industry. Its multidisciplinary nature reinforces its scientific interest. It covers all the sciences that deal with the contact between two loaded solids in relative motion and thus lies at the intersection of more clearly established disciplines such as solid and fluid mechanics, rheology, thermal science, materials science, and chemistry. Its experimental approach is based on physical measurements and the processing of signals and images. The optimization of cutting tool operating conditions must contribute significantly to the development and productivity of advanced automated machining techniques, because their implementation requires sufficient knowledge of the process and, in particular, of the evolution of tool wear. In addition, technological advances have expanded the use of very hard, refractory, difficult-to-machine materials, which require highly resistant tool materials. In this study, we present the wear behavior of a machining tool during the roughing operation as a function of the cutting parameters. The interpretation of the experimental results is based mainly on observations and analyses of the tool cutting edges using the latest techniques: scanning electron microscopy (SEM) and laser-beam optical roughness measurement.

Keywords: friction, wear, tool, cutting

Procedia PDF Downloads 313
1970 Optimization of Machine Learning Regression Results: An Application on Health Expenditures

Authors: Songul Cinaroglu

Abstract:

Machine learning regression methods are recommended as an alternative to classical regression methods in the presence of variables that are difficult to model. Health expenditure data are typically non-normal and heavily skewed. This study aims to compare machine learning regression methods through hyperparameter tuning to predict health expenditure per capita. A multiple regression model was conducted, and the performance of Lasso Regression, Random Forest Regression, and Support Vector Machine Regression was recorded when different hyperparameters were assigned. The lambda (λ) value for Lasso Regression, the number of trees for Random Forest Regression, and the epsilon (ε) value for Support Vector Regression were chosen as hyperparameters. Results obtained using k-fold cross-validation, with k varied from 5 to 50, indicate that the differences between the machine learning regression results in terms of R², RMSE, and MAE values are statistically significant (p < 0.001). The results reveal that Random Forest Regression (R² ˃ 0.7500, RMSE ≤ 0.6000, and MAE ≤ 0.4000) outperforms the other machine learning regression methods. It is highly advisable to use machine learning regression methods for modelling health expenditures.
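
The health expenditure dataset and the exact hyperparameter grids are not reproduced here; the sketch below only mirrors the comparison protocol (three regressors, k-fold cross-validation with varying k, R² scoring) on synthetic skewed data. Note that scikit-learn names the Lasso λ parameter "alpha".

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import Lasso
from sklearn.ensemble import RandomForestRegressor
from sklearn.svm import SVR
from sklearn.model_selection import cross_val_score

# Synthetic, right-skewed target as a stand-in for per-capita health expenditure
X, y = make_regression(n_samples=400, n_features=8, noise=10.0, random_state=0)
y = np.exp(y / np.abs(y).max() * 3)          # skew the target (illustrative only)

models = {
    "Lasso (lambda=0.1)": Lasso(alpha=0.1),
    "Random forest (200 trees)": RandomForestRegressor(n_estimators=200, random_state=0),
    "SVR (epsilon=0.1)": SVR(epsilon=0.1),
}

for k in (5, 10, 20):                         # vary the number of CV folds
    print(f"--- {k}-fold cross-validation ---")
    for name, model in models.items():
        r2 = cross_val_score(model, X, y, cv=k, scoring="r2")
        print(f"{name:28s} mean R^2 = {r2.mean():.3f}")
```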

Keywords: machine learning, lasso regression, random forest regression, support vector regression, hyperparameter tuning, health expenditure

Procedia PDF Downloads 204
1969 Modeling Sustainable Truck Rental Operations Using Closed-Loop Supply Chain Network

Authors: Khaled S. Abdallah, Abdel-Aziz M. Mohamed

Abstract:

Moving industries consume numerous resources and dispose of masses of used packaging materials. Proper sorting, recycling, and disposal of the packaging materials are necessary to avoid a severe pollution problem. This research paper presents a conceptual model for sustainable truck rental operations in place of the conventional one. An optimization model was developed to select the locations of truck rental centers, collection sites, and maintenance and repair sites, and to identify the rental fees to be charged for all routes so as to maximize the total closed-loop supply chain profit. Fixed costs of vehicle purchasing, costs of constructing collection centers and repair centers, as well as the fixed costs paid to use disposal and recycling centers, are considered. Operating costs, including truck maintenance and repair costs, the cost of recycling and disposing of the packaging materials, and the cost of relocating the trucks, are represented in the model. A mixed-integer model is developed, followed by a simulation model to examine the factors affecting its operation.

Keywords: modeling, truck rental, supply chain management

Procedia PDF Downloads 211
1968 Optimization of Fin Type and Fin per Inch on Heat Transfer and Pressure Drop of an Air Cooler

Authors: A. Falavand Jozaei, A. Ghafouri

Abstract:

Operation enhancement in an air cooler (heat exchanger) depends on the rate of heat transfer and the pressure drop. In this paper, for a given heat duty, the effects of FPI (fins per inch) and fin type (circular and hexagonal fins) on these two parameters are studied for an air cooler in Iran (Arvand Petrochemical). A program in EES (Engineering Equation Solver), together with the Aspen B-JAC and HTFS+ software, is used to solve the governing equations. First, the simulated results obtained from this program are compared to the experimental data for two FPI cases. The effect of FPI from 3 to 15 on the heat transfer (Q) to pressure drop ratio (Q/Δp) is then examined; this ratio is one of the main parameters in the design, rating, and simulation of heat exchangers. The results show that both heat transfer (Q) and pressure drop increase steadily with increasing FPI, and that the Q/Δp ratio increases up to FPI = 12 (by about 47% for circular fins and about 69% for hexagonal fins) and then decreases gradually up to FPI = 15 (by about 5% for circular fins and about 8% for hexagonal fins), so the Q/Δp ratio is maximum at FPI = 12. An FPI value between 8 and 12 is therefore obtained as the optimum for the heat transfer to pressure drop ratio. In addition, comparing the circular and hexagonal fin results, the Q/Δp ratio of hexagonal fins is higher than that of circular fins for FPI between 8 and 12 (the optimum range).

Keywords: air cooler, circular and hexagonal fins, fin per inch, heat transfer and pressure drop

Procedia PDF Downloads 434
1967 Methodology to Achieve Non-Cooperative Target Identification Using High Resolution Range Profiles

Authors: Olga Hernán-Vega, Patricia López-Rodríguez, David Escot-Bocanegra, Raúl Fernández-Recio, Ignacio Bravo

Abstract:

Non-Cooperative Target Identification has become a key research domain in the defense industry since it provides the ability to recognize targets at long distance and under any weather condition. High Resolution Range Profiles, one-dimensional radar images where the reflectivity of a target is projected onto the radar line of sight, are widely used for identification of flying targets. Accordingly, an approach to Non-Cooperative Target Identification based on applying Singular Value Decomposition to a matrix of range profiles is presented. Target identification based on one-dimensional radar images compares a collection of profiles of a given target, namely the test set, with the profiles included in a pre-loaded database, namely the training set. The classification is improved by using Singular Value Decomposition since it allows each aircraft to be modelled as a subspace and recognition to be accomplished in a transformed domain where the main features are easier to extract, hence reducing unwanted information such as noise. Singular Value Decomposition permits the definition of a signal subspace, which contains the highest percentage of the energy, and a noise subspace, which is discarded. This way, only the valuable information of each target is used in the recognition process. The identification algorithm is based on finding the target that minimizes the angle between subspaces and takes place in a transformed domain. Two metrics based on Singular Value Decomposition, F1 and F2, are used in the identification process. In the case of F2, the angle is weighted, since the top vectors set the importance of the contribution to the formation of a target signal; in contrast, F1 simply uses the evolution of the unweighted angle. In order to have a wide database of radar signatures and evaluate the performance, range profiles are obtained through numerical simulation of seven civil aircraft at defined trajectories taken from an actual measurement. Taking into account the nature of the datasets, the main drawback of using simulated profiles instead of actual measured profiles is that the former implies an ideal identification scenario, since measured profiles suffer from noise, clutter, and other unwanted information, while simulated profiles do not. In this case, the test and training samples have a similar nature and usually a similarly high signal-to-noise ratio, so, to assess the feasibility of the approach, the addition of noise has been considered before the creation of the test set. The identification results obtained with the unweighted and weighted metrics are analysed to demonstrate which algorithm provides the best robustness against noise in a realistic scenario. To confirm the validity of the methodology, identification experiments with profiles coming from electromagnetic simulations are conducted, revealing promising results. Considering the dissimilarities between the test and training sets when noise is added, the recognition performance improves when weighting is applied. Future experiments with larger sets are expected to be conducted with the aim of finally using actual profiles as test sets in a real hostile situation.
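
The F1 and F2 metrics are defined by the authors; the sketch below implements only a generic unweighted subspace classifier in their spirit: each class subspace comes from the SVD of its training profiles, and a test profile is assigned to the class whose signal subspace forms the smallest angle with it. The toy data, the subspace rank, and the dimensions are illustrative.

```python
import numpy as np

def signal_subspace(profiles, rank=5):
    """SVD of a (range cells x n_profiles) matrix of training HRRPs; keep the
    left singular vectors spanning the signal subspace, discard the rest."""
    U, _, _ = np.linalg.svd(profiles, full_matrices=False)
    return U[:, :rank]

def classify(x, subspaces):
    """Assign x to the class whose signal subspace forms the smallest angle
    with it (unweighted criterion, in the spirit of the F1 metric)."""
    angles = {}
    for label, U in subspaces.items():
        proj = np.linalg.norm(U.T @ x)
        angles[label] = np.arccos(np.clip(proj / np.linalg.norm(x), -1.0, 1.0))
    return min(angles, key=angles.get), angles

# toy data: two "aircraft", 128 range cells, 40 training profiles each
rng = np.random.default_rng(2)
basis_a = rng.standard_normal((128, 3))
basis_b = rng.standard_normal((128, 3))
train_a = basis_a @ rng.standard_normal((3, 40)) + 0.05 * rng.standard_normal((128, 40))
train_b = basis_b @ rng.standard_normal((3, 40)) + 0.05 * rng.standard_normal((128, 40))
subspaces = {"A": signal_subspace(train_a), "B": signal_subspace(train_b)}

test = basis_a @ rng.standard_normal(3) + 0.2 * rng.standard_normal(128)  # noisy class-A profile
label, _ = classify(test, subspaces)
print("predicted class:", label)
```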

Keywords: HRRP, NCTI, simulated/synthetic database, SVD

Procedia PDF Downloads 336
1966 Statistical Manufacturing Cell/Process Qualification Sample Size Optimization

Authors: Angad Arora

Abstract:

In production operations/manufacturing, a cell or line is typically a group of similar machines (computer numerical control (CNC) machines, advanced cutting, 3D printing, or special-purpose machines). To qualify a typical manufacturing line/cell/new process, ideally we need a sample of parts that can be run through the process, after which we make a judgment on the health of the line/cell. However, with huge volumes and mass-production scope, as in the mobile phone industry for example, the actual cells or lines can number in the thousands, and qualifying each one of them with statistical confidence means using very large samples, which adds to product/manufacturing cost and creates huge waste if the parts are not intended to be shipped to customers. To solve this, we propose a two-step statistical approach: we start with a small sample size and then objectively evaluate whether the process needs additional samples or not. For example, if a process is producing bad parts and we see those bad samples early, then there is a high chance that the process will not meet the desired yield, and there is no point in adding more samples. We used this hypothesis and came up with a two-step binomial testing approach. Further, we show through results that we can achieve an 18-25% reduction in samples while keeping the same statistical confidence.
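
The exact decision thresholds of the two-step plan are not given in the abstract; the sketch below shows one plausible two-stage binomial scheme built with scipy, in which an early reject is triggered when the first sample is already statistically incompatible with the target yield. The target yield, alpha, and sample sizes are assumptions.

```python
from scipy.stats import binom

def reject_threshold(n, q, alpha):
    """Smallest defect count d whose upper-tail probability under the target
    defect rate q is below alpha, i.e. evidence the process misses the yield."""
    return next(d for d in range(n + 1) if binom.sf(d - 1, n, q) < alpha)

def two_step_plan(p_target=0.95, alpha=0.05, n1=20, n2=30):
    """A plausible two-stage binomial sketch: stop after n1 parts if the
    verdict is already clear, otherwise add n2 more parts and decide."""
    q = 1.0 - p_target
    d1 = reject_threshold(n1, q, alpha)        # early-reject count on sample 1
    d2 = reject_threshold(n1 + n2, q, alpha)   # final reject count on combined sample
    return d1, d2

d1, d2 = two_step_plan()
print(f"stage 1 (n=20): reject early if defects >= {d1}, accept early if defects == 0")
print(f"stage 2 (n=50): reject if total defects >= {d2}, otherwise qualify the cell")
```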

Keywords: statistics, data science, manufacturing process qualification, production planning

Procedia PDF Downloads 75
1965 Effectiveness of Column Geometry in High-Rise Buildings

Authors: Man Singh Meena

Abstract:

Structural engineers face different kinds of challenges due to the innovative and bold ideas of architects, who try to give every structure a unique design. In RCC frame structures, columns of different geometries can be used, and rectangular columns can be placed with different orientations. The analysis and design of structures can be carried out with different software packages, i.e., STAAD Pro, ETABS, and TEKLA. In recent times, high-rise building modelling and analysis are often done in ETABS due to certain features that are superior to other software. The case study in this paper mainly emphasizes the structural behavior of a high-rise building with a rectangular plan for different column shape configurations: circular, square, rectangular, and rectangular with 90-degree rotation. In all these cases the column cross-sectional areas are kept the same, so that the effect of geometry can be studied for the same concrete area. A 20-storey R.C.C. framed building is modelled in ETABS for analysis. After the analysis, the maximum bending moments, shear forces, and maximum longitudinal reinforcement are computed and compared for the different configurations to identify the effectiveness of the column geometry.

Keywords: high-rise building, column geometry, building modelling, ETABS analysis, building design, structural analysis, structural optimization

Procedia PDF Downloads 54
1964 Research on Urban Point of Interest Generalization Method Based on Mapping Presentation

Authors: Chengming Li, Yong Yin, Peipei Guo, Xiaoli Liu

Abstract:

Existing point generalization algorithms focus merely on the overall information of point groups, without taking into account the attribute richness of POI (point of interest) data or the spatial distribution constrained by roads. Against this background, a POI generalization method considering both attribute information and spatial distribution has been proposed. The hierarchical characteristics of urban POI information expression are first analyzed to identify the measurement features of the corresponding hierarchy. On this basis, an urban POI generalization strategy is put forward: POIs in the urban road network are divided into three distribution patterns, and corresponding generalization methods are proposed according to the characteristics of the POI data in each distribution pattern. Experimental results show that the method, by taking into account both the attribute information and the spatial distribution characteristics of POIs, better implements urban POI generalization in the mapping presentation.

Keywords: POI, road network, selection method, spatial information expression, distribution pattern

Procedia PDF Downloads 385
1963 The Impact of Artificial Intelligence on Quality Control and Quality

Authors: Mary Moner Botros Fanawel

Abstract:

Many companies use the statistical tool known as statistical quality control, which can entail a high cost for the companies interested in these statistical tools. The evaluation of the quality of products and services is an important topic, but reducing the cost of implementing statistical quality control also has important benefits for companies. For this reason, it is important to implement an economic design for the various steps included in statistical quality control. In this paper, we describe some relevant aspects related to the economic design of a quality control chart for the proportion of defective items. They are very important because the suggested measures can reduce the cost of implementing such a chart. Note that the main purpose of this chart is to evaluate and control the proportion of defective items in a production process.
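
The economic-design optimization itself is not detailed in the abstract; as a baseline, the sketch below computes the standard Shewhart p-chart centre line and 3-sigma limits for the proportion of defective items, on hypothetical subgroup data. An economic design would then tune the sample size, sampling interval, and limit width against cost.

```python
import numpy as np

def p_chart_limits(defectives, sample_size):
    """Shewhart p-chart: centre line and 3-sigma control limits for the
    proportion of defective items (the economic tuning discussed above,
    e.g. of n and the limit width, is not reproduced here)."""
    p = np.asarray(defectives, dtype=float) / sample_size
    p_bar = p.mean()
    sigma = np.sqrt(p_bar * (1.0 - p_bar) / sample_size)
    ucl = p_bar + 3.0 * sigma
    lcl = max(p_bar - 3.0 * sigma, 0.0)
    return p_bar, lcl, ucl, p

# hypothetical subgroup data: defectives found in 20 samples of 100 items
defectives = [4, 2, 5, 3, 6, 1, 4, 3, 2, 5, 7, 3, 2, 4, 12, 3, 2, 4, 3, 5]
p_bar, lcl, ucl, p = p_chart_limits(defectives, sample_size=100)
print(f"CL={p_bar:.3f}  LCL={lcl:.3f}  UCL={ucl:.3f}")
print("out-of-control samples:", [i for i, pi in enumerate(p) if pi > ucl or pi < lcl])
```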

Keywords: model predictive control, hierarchical control structure, genetic algorithm, water quality with DBPs objectives, proportion, type I error, economic plan, distribution function, bootstrap control limit, p-value method, out-of-control signals, p-value, quality characteristics

Procedia PDF Downloads 37
1962 Optimizing Resource Allocation and Indoor Location Using Bluetooth Low Energy

Authors: Néstor Álvarez-Díaz, Pino Caballero-Gil, Héctor Reboso-Morales, Francisco Martín-Fernández

Abstract:

The "Internet of Things" (IoT) trend has developed over the last few years, leading to the emergence of innovative communication methods among multiple devices. The appearance of Bluetooth Low Energy (BLE) has given IoT a push in relation to smartphones. At this moment, a set of new applications related to several topics, such as entertainment and advertisement, has begun to be developed, but not much has been done so far to take advantage of the potential that these technologies can offer in many business areas and everyday tasks. In the present work, the application of BLE technology and smartphones is proposed for business areas related to the optimization of resource allocation in huge facilities such as airports. An indoor location system has been developed through triangulation methods with the use of BLE beacons. The described system can be used to locate all employees inside the building in such a way that any task can be automatically assigned to a group of employees. It should be noted that this system can not only be used to link needs with employees according to distances, but it also takes into account other factors such as occupation level or category. In addition, it has been endowed with a security system to manage sensitive business and personnel data. The efficiency of communications is another essential characteristic that has been taken into account in this work.
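
The triangulation details are not given in the abstract; the sketch below shows one common way to do it with BLE beacons: convert RSSI to distance with a log-distance path-loss model and solve a linearised least-squares position fix. The transmit power at 1 m, the path-loss exponent, the beacon positions, and the RSSI readings are all illustrative assumptions.

```python
import numpy as np

def rssi_to_distance(rssi, tx_power=-59.0, n=2.0):
    """Log-distance path-loss model; tx_power is the RSSI at 1 m and n the
    path-loss exponent (both environment-dependent assumptions)."""
    return 10.0 ** ((tx_power - rssi) / (10.0 * n))

def trilaterate(beacons, distances):
    """Linearised least-squares position fix from >= 3 beacon positions and
    their estimated distances."""
    (x0, y0), d0 = beacons[0], distances[0]
    A, b = [], []
    for (xi, yi), di in zip(beacons[1:], distances[1:]):
        A.append([2.0 * (xi - x0), 2.0 * (yi - y0)])
        b.append(d0**2 - di**2 + xi**2 - x0**2 + yi**2 - y0**2)
    pos, *_ = np.linalg.lstsq(np.array(A), np.array(b), rcond=None)
    return pos

beacons = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0), (10.0, 10.0)]   # known beacon positions [m]
rssi = [-68.0, -75.0, -72.0, -78.0]                              # measured RSSI per beacon (example)
distances = [rssi_to_distance(r) for r in rssi]
print("estimated position:", trilaterate(beacons, distances))
```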

Keywords: bluetooth low energy, indoor location, resource assignment, smartphones

Procedia PDF Downloads 370
1961 The Laser Line Detection for Autonomous Mapping Based on Color Segmentation

Authors: Pavel Chmelar, Martin Dobrovolny

Abstract:

Laser projection, or laser footprint detection, is today widely used in many fields of robotics, measurement, and electronics. The system accuracy strictly depends on precise detection of the laser footprint on target objects. This article deals with laser line detection based on RGB segmentation and component labeling. The developed optical rangefinder, equipped with vertical sweeping of the laser beam and a high-quality camera, was used as the measurement device. This system was developed mainly for automatic exploration and mapping of unknown spaces. The first section presents the new detection algorithm. The second section presents the measurement results; the measurements were performed under variable light conditions in interiors. The last part of the article presents the achieved results and the differences between day and night measurements.
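
The thresholds and camera specifics of the detection algorithm are not given in the abstract; the sketch below is a minimal OpenCV version of the same idea: segment red-dominant pixels in RGB, clean the mask morphologically, and keep the largest connected component as the laser footprint. The threshold values and the synthetic test frame are illustrative.

```python
import cv2
import numpy as np

def detect_laser_line(bgr, r_min=180, dominance=60):
    """Keep pixels whose red channel is bright and clearly dominant over
    green/blue (illustrative thresholds), then return the pixel coordinates
    of the largest connected component as the laser footprint."""
    b, g, r = cv2.split(bgr)
    mask = ((r > r_min) &
            (r.astype(int) - g.astype(int) > dominance) &
            (r.astype(int) - b.astype(int) > dominance)).astype(np.uint8)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((3, 3), np.uint8))
    n, labels, stats, _ = cv2.connectedComponentsWithStats(mask)
    if n <= 1:
        return None
    biggest = 1 + np.argmax(stats[1:, cv2.CC_STAT_AREA])   # skip background label 0
    ys, xs = np.where(labels == biggest)
    return np.column_stack([xs, ys])

# synthetic frame with a vertical red laser line for a quick self-test
frame = np.full((240, 320, 3), 30, dtype=np.uint8)
frame[:, 159:162] = (40, 40, 255)       # BGR: bright red 3-pixel-wide column
pts = detect_laser_line(frame)
print("detected", 0 if pts is None else len(pts), "laser pixels, mean x =",
      None if pts is None else pts[:, 0].mean())
```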

Keywords: color segmentation, component labelling, laser line detection, automatic mapping, distance measurement, vector map

Procedia PDF Downloads 411
1960 Possibility Theory Based Multi-Attribute Decision-Making: Application in Facility Location-Selection Problem under Uncertain and Extreme Environment

Authors: Bezhan Ghvaberidze

Abstract:

A fuzzy multi-objective facility location-selection problem (FLSP) under uncertain and extreme environments, based on possibility theory, is developed. The model's uncertain parameters are presented as q-rung orthopair fuzzy values and transformed into the Dempster-Shafer belief structure environment. An objective function, the distribution centers' selection ranking index, is constructed as an extension of Dempster's extremal expectations under discrimination q-rung orthopair fuzzy information. Experts evaluate each humanitarian aid distribution center (HADC) against each of the uncertain factors. The HADC location problem is reduced to a bicriteria problem of partitioning the set of customers by the set of centers: (1) minimization of transportation costs; (2) maximization of the centers' selection ranking indexes. Partitioning-type constraints are also constructed. To illustrate the obtained results, a numerical example of the facility location-selection problem is created.

Keywords: FLSP, multi-objective combinatorial optimization problem, evidence theory, HADC, q-rung orthopair fuzzy set, possibility theory

Procedia PDF Downloads 93
1959 The Use of Nano-Crystalline Starch in Probiotic Yogurt and Its Effects on the Physicochemical and Biological Properties

Authors: Ali Seirafi

Abstract:

The purpose of this study was to investigate the effect and application of starch nanocrystals on the physicochemical and microbial properties in the industrial production of probiotic yogurt. In this study, probiotic yoghurt was manufactured by an industrial method with optimization and control of the technological factors affecting the probiotic biomass, using the probiotic bacteria Lactobacillus acidophilus and Bifidobacterium bifidum together with commonly used yogurt starters. Afterwards, the effects of different levels of fat (1.3%, 2.5%, and 4%), as well as of various prebiotic compounds, including starch nanocrystals (0.5%, 1%, and 1.5%), galactooligosaccharide (0.5%, 1%, and 1.5%), and fructooligosaccharide (0.5%, 1%, and 1.5%), were evaluated. In addition, the effect of packaging (polyethylene and glass) was studied, while the pH changes and final acidity were studied at each stage. In this research, all experiments were performed in 3 replications, and the results were analyzed in a completely randomized design with SAS version 9.1 software. The results of this study showed that the addition of starch nanocrystal compounds as well as the use of glass packaging had the most positive effects on the survival of Lactobacillus acidophilus, while the addition of nanocrystals and an increase in the cooling rate of the product had the most positive effects on the survival of Bifidobacterium bifidum.

Keywords: Bifidobacterium bifidum, Lactobacillus acidophilus, prebiotics, probiotic yogurt

Procedia PDF Downloads 142
1958 Decomposition of the Customer-Server Interaction in Grocery Shops

Authors: Andreas Ahrens, Ojaras Purvinis, Jelena Zascerinska

Abstract:

A successful shopping experience, without overcrowded shops and long waiting times, undoubtedly leads to the release of happiness hormones and is generally considered the goal of any optimization. Factors influencing the shopping experience can be divided into internal and external ones. External factors relate, e.g., to the arrival of customers at the shop, whereas internal factors are linked to the service process itself at checkout (waiting in the queue at the cash register, the scanning of the goods, and the payment process itself) or to any other unexpected delay when changing status from visitor to buyer by choosing goods or items. This paper divides the customer-server interaction into five phases, starting with the customer's arrival at the shop, followed by the selection of goods, the buyer waiting in the queue at the cash register, and the payment process, and ending with the customer or buyer's departure. Our simulation results show how the five phases are intertwined and influence the overall shopping experience. Parameters for measuring the shopping experience are estimated based on the burstiness level in each of the five phases of the customer-server interaction.

Keywords: customers’ burstiness, cash register, customers’ waiting time, gap distribution function

Procedia PDF Downloads 127
1957 Bi-Criteria Objective Network Design Model for Multi Period Multi Product Green Supply Chain

Authors: Shahul Hamid Khan, S. Santhosh, Abhinav Kumar Sharma

Abstract:

Environmental performance, along with social performance, is becoming a vital factor for industries seeking to achieve global standards. With a good environmental policy, global industries differentiate themselves from their competitors. This paper concentrates on a multi-stage, multi-product, and multi-period manufacturing network. Bi-objective mathematical models for total cost and total emission of the entire forward supply chain are considered. Five different problems, varying the number of suppliers, manufacturers, and environmental levels, are considered to illustrate the proposed mathematical model. A genetic algorithm (GA) and random search are used for finding the optimal solution. The input parameters of the optimal solution are used to find the trade-off between the initial investment by the industry and the long-term benefit to the environment.

Keywords: closed loop supply chain, genetic algorithm, random search, green supply chain

Procedia PDF Downloads 530
1956 Optimization of Surface Roughness by Taguchi’s Method for Turning Process

Authors: Ashish Ankus Yerunkar, Ravi Terkar

Abstract:

This study aimed at evaluating the best process environment that could simultaneously satisfy the requirements of both quality and productivity, with special emphasis on the reduction of cutting tool flank wear, because a reduction in flank wear ensures an increase in tool life. The predicted optimal setting ensured minimization of surface roughness. The purpose of this paper is the analysis of the optimum cutting conditions to obtain the lowest surface roughness in turning SCM 440 alloy steel by the Taguchi method. The experimental design was done using the Taguchi method: 18 experiments were designed by this process and then conducted. The results are analyzed using the ANOVA method. The Taguchi analysis showed that the depth of cut plays a significant role in producing lower surface roughness, followed by feed, while the cutting speed has a lesser effect on surface roughness in these tests. Machine tool vibrations and tool chatter are other factors that may contribute to poor surface roughness; such factors were ignored in the analysis. The inferences from this method will be useful to other researchers for similar studies and may be vital for further research on tool vibrations, cutting forces, etc.
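
The actual L18 design and measured roughness values are not reproduced here; the sketch below only shows the Taguchi smaller-the-better signal-to-noise ratio that such an analysis relies on, applied to hypothetical Ra readings for one factor. The level with the highest S/N would be preferred.

```python
import numpy as np

def sn_smaller_is_better(values):
    """Taguchi smaller-the-better signal-to-noise ratio, suitable for surface
    roughness Ra: S/N = -10 * log10(mean(y^2))."""
    y = np.asarray(values, dtype=float)
    return -10.0 * np.log10(np.mean(y ** 2))

# hypothetical Ra readings (um) for three levels of depth of cut, two runs each
ra_by_level = {"0.2 mm": [1.82, 1.75], "0.6 mm": [1.44, 1.51], "1.0 mm": [1.12, 1.18]}
for level, ra in ra_by_level.items():
    print(f"depth of cut {level}: mean Ra = {np.mean(ra):.2f} um, "
          f"S/N = {sn_smaller_is_better(ra):.2f} dB")
```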

Keywords: surface roughness (ra), machining, dry turning, taguchi method, turning process, anova method, mahr perthometer

Procedia PDF Downloads 356
1955 A Background Subtraction Based Moving Object Detection Around the Host Vehicle

Authors: Hyojin Lim, Cuong Nguyen Khac, Ho-Youl Jung

Abstract:

In this paper, we propose a moving object detection method that helps the driver safely take his/her car out of a parking lot. When moving objects such as motorbikes, pedestrians, other cars, and obstacles are detected at the rear side of the host vehicle, the proposed algorithm can provide a warning to the driver. We assume that the host vehicle is just before departure. Gaussian Mixture Model (GMM) based background subtraction is applied as the core of the method, with pre-processing such as smoothing and post-processing such as morphological filtering added. We also examine which color space performs better for the detection of moving objects: three color spaces, RGB, YCbCr, and Y, are applied and compared in terms of detection rate. Through simulation, we show that the RGB space is more suitable for moving object detection based on background subtraction.
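
The camera data and parameter choices are not given in the abstract; the sketch below is a minimal OpenCV pipeline in the same spirit: Gaussian smoothing, MOG2 (GMM) background subtraction, morphological opening, and an area test on connected components, run on synthetic frames so it is self-contained. All thresholds are illustrative.

```python
import cv2
import numpy as np

# GMM (MOG2) background subtraction with smoothing and morphological clean-up
subtractor = cv2.createBackgroundSubtractorMOG2(history=100, varThreshold=16, detectShadows=False)
kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))

rng = np.random.default_rng(0)
for i in range(60):
    frame = rng.integers(100, 110, size=(240, 320, 3), dtype=np.uint8)   # static, noisy background
    if i >= 30:                                                           # a "pedestrian" appears
        frame[100:160, 50 + 2 * i:80 + 2 * i] = 230
    frame = cv2.GaussianBlur(frame, (5, 5), 0)                # pre-processing: smoothing
    fg = subtractor.apply(frame)                              # GMM foreground mask
    fg = cv2.morphologyEx(fg, cv2.MORPH_OPEN, kernel)         # post-processing: remove speckle
    n, _, stats, _ = cv2.connectedComponentsWithStats(fg)
    moving = [s for s in stats[1:] if s[cv2.CC_STAT_AREA] > 300]
    if i >= 10 and moving:                                    # first frames left to warm up the model
        print(f"frame {i}: warning, {len(moving)} moving object(s) behind the host vehicle")
```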

Keywords: gaussian mixture model, background subtraction, moving object detection, color space, morphological filtering

Procedia PDF Downloads 595
1954 A Study of Structural Damage Detection for Spacecraft In-Orbit Based on Acoustic Sensor Array

Authors: Lei Qi, Rongxin Yan, Lichen Sun

Abstract:

With the increase in human space activities, the amount of space debris has increased dramatically, and the possibility that spacecraft in orbit are impacted by space debris is growing. A method to detect and assess spacecraft damage in real time, accurately locate gas leaks, and effectively guarantee the life safety of the astronauts is therefore of vital significance. In this paper, an acoustic sensor array is used to detect the acoustic signal emitted by damage to the spacecraft in orbit. Then, we apply time difference of arrival (TDOA) and beamforming algorithms to locate the damage and leakage. Finally, the extent of the spacecraft damage is evaluated using a nonlinear ultrasonic method. The results show that this method can detect debris impacts and structural damage, locate the damage position, and identify the damage degree effectively. This method can meet the needs of structural damage detection for spacecraft in orbit.
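
The array geometry and processing chain of the study are not given in the abstract; the sketch below only shows the basic TDOA step: estimating the arrival-time difference between two sensor channels from the peak of their cross-correlation on a synthetic impact transient (sampling rate, burst shape, and noise level are assumptions). Locating the damage would then combine several such TDOAs with the wave speed and sensor positions.

```python
import numpy as np

def tdoa(sig_a, sig_b, fs):
    """Time difference of arrival between two channels from the peak of their
    cross-correlation; returns t_A - t_B (negative if the event reaches A first)."""
    corr = np.correlate(sig_a, sig_b, mode="full")
    lag = np.argmax(corr) - (len(sig_b) - 1)
    return lag / fs

# synthetic impact transient received by two sensors with a known 0.25 ms offset
fs = 200_000.0                                 # 200 kHz sampling (assumed)
t = np.arange(0, 0.01, 1 / fs)
burst = np.exp(-((t - 0.002) / 0.0002) ** 2) * np.sin(2 * np.pi * 40_000 * t)
delay_samples = 50                             # 50 / fs = 0.25 ms
rng = np.random.default_rng(3)
sensor_a = burst + 0.05 * rng.standard_normal(t.size)
sensor_b = np.roll(burst, delay_samples) + 0.05 * rng.standard_normal(t.size)

print("estimated TDOA [ms]:", 1e3 * tdoa(sensor_a, sensor_b, fs))   # ~ -0.25 ms
```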

Keywords: acoustic sensor array, spacecraft, damage assessment, leakage location

Procedia PDF Downloads 273
1953 An Energy Efficient Clustering Approach for Underwater ‎Wireless Sensor Networks

Authors: Mohammad Reza Taherkhani‎

Abstract:

Wireless sensor networks used to monitor a specific environment are formed from a large number of sensor nodes. The role of these sensors is to sense specific parameters from the surroundings and to communicate them. In these networks, the most important challenge is the management of energy usage, and clustering is one of the methods broadly used to face this challenge. In this paper, a distributed clustering protocol based on learning automata is proposed for underwater wireless sensor networks. The proposed algorithm, called LA-Clustering, forms clusters at the same energy level, based on the energy level of the nodes and the connection radius, regardless of the size and structure of the sensor network. The proposed approach is simulated and compared with some other protocols, considering metrics such as network lifetime, number of alive nodes, and amount of transmitted data. The simulation results demonstrate the efficiency of the proposed approach.

Keywords: underwater sensor networks, clustering, learning automata, energy consumption

Procedia PDF Downloads 340