Search results for: reconstruction optimization
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3815

1145 Optimal Design of Propellant Grain Shape Based on Structural Strength Analysis

Authors: Chen Xiong, Tong Xin, Li Hao, Xu Jin-Sheng

Abstract:

Experimental and simulation studies of the structural integrity of propellant grain with a high volumetric fraction in a solid rocket motor (SRM) were conducted. First, using the Python scripting interface of ABAQUS for parametric SRM modeling, three-dimensional parameterized modeling programs for star-shaped, wheel-shaped, and wing-cylindrical grains were developed. Then, the mechanical properties of the star-shaped grain under different loads were obtained from the automatically generated finite element model in ABAQUS. Next, several optimization algorithms were applied to optimize the star-shaped, wheel-shaped, and wing-cylindrical grains. After meeting the requirements on burning-surface evolution and volumetric fraction, the optimum three-dimensional grain shapes were obtained. Finally, by means of the parametric modeling functions, pressure data from the SRM's cold pressurization test were applied directly to the simulation of the grain's mechanical performance. The results verify the reliability and practicality of the parameterized modeling program.

Keywords: cold pressurization test, parametric modeling, structural integrity, propellant grain, SRM

Procedia PDF Downloads 356
1144 Satellite Imagery Classification Based on Deep Convolution Network

Authors: Zhong Ma, Zhuping Wang, Congxin Liu, Xiangzeng Liu

Abstract:

Satellite imagery classification is a challenging problem with many practical applications. In this paper, we designed a deep convolutional neural network (DCNN) to classify satellite imagery. The contributions of this paper are twofold. First, to cope with the large-scale variance in satellite images, we introduced the inception module, which applies multiple filters of different sizes at the same level, as the building block of our DCNN model. Second, we proposed a genetic-algorithm-based method to efficiently search for the best hyper-parameters of the DCNN in a large search space. The proposed method was evaluated on a benchmark database. The results show that the proposed hyper-parameter search guides the search towards better regions of the parameter space. Based on the hyper-parameters found, we built our DCNN models and evaluated their performance on satellite imagery classification; the results show that the classification accuracy of the proposed models outperforms the state-of-the-art method.
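
The genetic hyper-parameter search described above can be sketched as follows; this is a minimal illustration with an invented surrogate fitness function (a real run would train the DCNN and return validation accuracy), not the authors' implementation:

```python
import random

# Hypothetical search space for two DCNN hyper-parameters; a real run
# would train the network and use validation accuracy as the fitness.
SEARCH_SPACE = {"lr_exp": [-5, -4, -3, -2],    # learning rate = 10**lr_exp
                "n_filters": [16, 32, 64, 128]}

def fitness(ind):
    # Toy surrogate that peaks at lr_exp = -3, n_filters = 64.
    return -abs(ind["lr_exp"] + 3) - abs(ind["n_filters"] - 64) / 64.0

def crossover(a, b, rng):
    # Each gene is inherited from one of the two parents.
    return {k: rng.choice([a[k], b[k]]) for k in SEARCH_SPACE}

def mutate(ind, rng, p=0.2):
    # Resample each gene from the search space with probability p.
    return {k: rng.choice(SEARCH_SPACE[k]) if rng.random() < p else v
            for k, v in ind.items()}

def ga_search(generations=20, pop_size=12, seed=0):
    rng = random.Random(seed)
    pop = [{k: rng.choice(v) for k, v in SEARCH_SPACE.items()}
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        elite = pop[: pop_size // 2]               # keep the best half
        pop = elite + [mutate(crossover(rng.choice(elite),
                                        rng.choice(elite), rng), rng)
                       for _ in range(pop_size - len(elite))]
    return max(pop, key=fitness)

best = ga_search()
```

Elitism keeps the best configurations across generations, so the search never regresses while crossover and mutation explore the rest of the space.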

Keywords: satellite imagery classification, deep convolution network, genetic algorithm, hyper-parameter optimization

Procedia PDF Downloads 295
1143 Optimal Scheduling of Load and Operational Strategy of a Load Aggregator to Maximize Profit with PEVs

Authors: Md. Shafiullah, Ali T. Al-Awami

Abstract:

This project proposes optimal scheduling of the imported power of a load aggregator (LA), utilizing electric vehicles (EVs), to maximize its profit. With the increase of renewable energy resources, the electricity price in a competitive market becomes more uncertain; at the same time, with the penetration of renewable distributed generators in the distribution network, the predicted load of a load aggregator also becomes uncertain in real time. Although both load and price are uncertain, the storage capacity of EVs can make the operation of the load aggregator flexible. The LA submits its offer to the day-ahead market based on predicted loads and the optimized use of its EVs to maximize its profit, and in real-time operation it uses its energy storage capacity in such a way that profit is maximized. In this project, the load aggregator's profit maximization problem is formulated, and the optimization problem is solved with the help of CVX. As the forecasted loads differ from the actual loads in real-time operation, the mismatches are settled in the real-time balancing market. Simulation results compare the profit of a load aggregator with a hypothetical fleet of 1,000 EVs and without EVs.
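
The day-ahead scheduling idea can be illustrated with a toy linear program; the prices, demand, and battery limits below are invented, and scipy's linprog stands in for the CVX formulation used in the paper:

```python
import numpy as np
from scipy.optimize import linprog

# Toy day-ahead problem: buy grid energy g[t] at known prices and use an
# aggregate EV battery (charge c[t], discharge dis[t], state of charge
# s[t]) to shift purchases into cheap hours. All numbers are illustrative.
T = 4
price = np.array([10.0, 50.0, 10.0, 50.0])   # day-ahead price per hour
demand = np.ones(T)                          # predicted load per hour
c_max, dis_max, s_max = 1.0, 1.0, 2.0        # battery power/energy limits

# Decision vector x = [g(T), c(T), dis(T), s(T)]; minimize energy cost.
n = 4 * T
cost = np.concatenate([price, np.zeros(3 * T)])

A_eq = np.zeros((2 * T, n)); b_eq = np.zeros(2 * T)
for t in range(T):
    # Power balance: g[t] + dis[t] - c[t] = demand[t]
    A_eq[t, t] = 1; A_eq[t, 2 * T + t] = 1; A_eq[t, T + t] = -1
    b_eq[t] = demand[t]
    # Storage dynamics: s[t] - s[t-1] - c[t] + dis[t] = 0, with s[-1] = 0
    row = T + t
    A_eq[row, 3 * T + t] = 1; A_eq[row, T + t] = -1; A_eq[row, 2 * T + t] = 1
    if t > 0:
        A_eq[row, 3 * T + t - 1] = -1

bounds = ([(0, None)] * T + [(0, c_max)] * T +
          [(0, dis_max)] * T + [(0, s_max)] * T)
res = linprog(cost, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
```

With these numbers the solver charges during the two cheap hours and discharges during the two expensive ones, so all four units of demand are bought at the low price (total cost 40 instead of 120 without storage).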

Keywords: CVX, electricity market, load aggregator, load and price uncertainties, profit maximization, real time balancing operation

Procedia PDF Downloads 413
1142 A Bivariate Inverse Generalized Exponential Distribution and Its Applications in Dependent Competing Risks Model

Authors: Fatemah A. Alqallaf, Debasis Kundu

Abstract:

The aim of this paper is to introduce a bivariate inverse generalized exponential distribution which has a singular component. The proposed bivariate distribution can be used when the marginals have heavy-tailed distributions and non-monotone hazard functions. Due to the presence of the singular component, it can be used quite effectively when there are ties in the data. Since it has four parameters, it is a very flexible bivariate distribution that can be used for analyzing various bivariate data sets. Several dependency properties and dependency measures have been obtained. The maximum likelihood estimators cannot be obtained in closed form, as they involve solving a four-dimensional optimization problem. To avoid that, we propose an EM algorithm that involves solving only one non-linear equation at each 'E'-step; hence, the implementation of the proposed EM algorithm is very straightforward in practice. Extensive simulation experiments have been performed, and one data set has been analyzed to show that the proposed bivariate inverse generalized exponential distribution can be used effectively for modeling dependent competing risks data.

Keywords: Block and Basu bivariate distributions, competing risks, EM algorithm, Marshall-Olkin bivariate exponential distribution, maximum likelihood estimators

Procedia PDF Downloads 138
1141 Study Case of Spacecraft Instruments in Structural Modelling with Nastran-Patran

Authors: Francisco Borja de Lara, Ali Ravanbakhsh, Robert F. Wimmer-Schweingruber, Lars Seimetz, Fermín Navarro

Abstract:

The intense structural loads during the launch of a spacecraft represent a challenge for space structure designers: sufficient strength has to be achieved while keeping the mass and volume within the allowable margins of the mission requirements and within the limits of the project budget. In this contribution, we present the structural analysis of the Lunar Lander Neutron Dosimetry (LND) experiment on the Chang'E4 mission, the first probe to land on the Moon's far side, part of the Lunar Exploration Program of the China National Space Administration. To this end, the software Nastran/Patran has been used: a structural model was built in Patran and a structural analysis performed with Nastran. The results obtained are then used both in the optimization process of the spacecraft structure and as input parameters for the structural model test campaign. In this way, the feasibility of the lunar instrument structure is demonstrated in terms of modal frequencies, stresses, and random vibration, and our results provide a better understanding of the structural test design.

Keywords: Chang’E4, Chinese national space administration, lunar lander neutron dosimetry, nastran-patran, structural analysis

Procedia PDF Downloads 525
1140 Formulation Design and Optimization of Orodispersible Tablets of Diphenhydramine Hydrochloride Having Adequate Mechanical Strength

Authors: Jiwan P. Lavande, A. V. Chandewar

Abstract:

In the present study, orodispersible tablets (ODTs) of diphenhydramine hydrochloride were prepared using croscarmellose sodium and crospovidone (as superdisintegrants) and camphor and menthol (as subliming agents) in different ratios, and ODTs prepared with superdisintegrants were compared with those prepared with subliming agents in terms of in vitro disintegration time, dispersion time, wetting time, hardness, and water absorption ratio. Results revealed that the tablets of all formulations have acceptable physical parameters. Drug-excipient compatibility was evaluated using the FTIR technique, and no incompatibility was detected. The in vitro release of drug from the DC6 formulation was quicker than from the other formulations. A stability study was carried out as per ICH guidelines for three months, and the results revealed that upon storage the disintegration time of the tablets showed no significant difference. Microscopic study of the sublimed tablets showed the formation of pores in tablets prepared by the sublimation method. Thus, it can be concluded that stable orodispersible tablets of diphenhydramine hydrochloride can be developed for the rapid release of the drug.

Keywords: orodispersible tablet, subliming agent, super disintegrants, diphenhydramine hydrochloride

Procedia PDF Downloads 231
1139 Review of Urban Vitality in China: Exploring the Theoretical Framework, Characteristics, and Assessment Systems

Authors: Dong Wei, Wu Jinxiu

Abstract:

As China's urban construction enters a new phase of 'stock optimization,' the key point of urban development has shifted to the development and reuse of existing public space. However, cities still face a series of challenges, such as a shortage of space and insufficient space quality, which indirectly affect urban vitality. A review of the vitality of urban public space will significantly contribute to optimizing the quality of the urban built environment. This paper first analyses the research hotspots of urban vitality in China and abroad, based on a semi-systematic literature review. It then summarizes the theoretical definitions of the vitality of urban public space and sorts out the influencing factors from the perspectives of society, environment, and users. Lastly, the paper concludes with the mainstream quantitative and evaluation methods, such as linear evaluation and integrated evaluation. This paper provides a multi-theoretical perspective for understanding the characteristics and evaluation systems of the vitality of public space, which helps in acknowledging the dynamic relationship between users, the urban environment, and vitality. It also aims to provide optimal design strategies for constructing vigorous public spaces in future cities.

Keywords: public space, quantification of vitality, spatial vitality, urban vitality

Procedia PDF Downloads 106
1138 Adaptive Anchor Weighting for Improved Localization with Levenberg-Marquardt Optimization

Authors: Basak Can

Abstract:

This paper introduces an iterative, weighted localization method that utilizes a unique cost function formulation to significantly enhance the performance of positioning systems. The system employs locators, such as Gateways (GWs), with known locations determined through calibration, to estimate and track the position of an End Node (EN); performance is evaluated relative to the number of locators. The performance evaluation presented uses low-cost single-antenna Bluetooth Low Energy (BLE) devices. The proposed approach can be applied to alternative Internet of Things (IoT) modulation schemes, as well as Ultra-WideBand (UWB) or millimeter-wave (mmWave) based devices. In non-line-of-sight (NLOS) scenarios, using four or eight locators yields a 95th-percentile localization performance of 2.2 meters and 1.5 meters, respectively, in a 4,305-square-foot indoor area with BLE 5.1 devices. This method outperforms conventional RSSI-based techniques, achieving a 51% improvement with four locators and a 52% improvement with eight locators. Future work involves modeling the impact of interference and implementing data curation across multiple channels to mitigate such effects.
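
A minimal sketch of weighted lateration refined by the Levenberg-Marquardt algorithm (gateway positions, weights, and noise levels are invented for the example; the paper's actual cost function is not reproduced):

```python
import numpy as np
from scipy.optimize import least_squares

# Four gateways at known (calibrated) positions; positions, weights, and
# noise are invented for this sketch.
gw = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0]])
true_pos = np.array([3.0, 4.0])
rng = np.random.default_rng(0)
ranges = np.linalg.norm(gw - true_pos, axis=1) + rng.normal(0.0, 0.05, 4)
weights = np.array([1.0, 1.0, 0.5, 0.5])  # e.g. down-weight suspected NLOS links

def residuals(p):
    # Weighted range residuals for candidate end-node position p.
    return weights * (np.linalg.norm(gw - p, axis=1) - ranges)

# Levenberg-Marquardt refinement from a rough initial guess.
fit = least_squares(residuals, x0=np.array([5.0, 5.0]), method="lm")
```

The per-gateway weights are where an adaptive scheme would plug in, down-weighting links whose range estimates look unreliable.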

Keywords: lateration, least squares, Levenberg-Marquardt algorithm, localization, path-loss, RMS error, RSSI, sensors, shadow fading, weighted localization

Procedia PDF Downloads 19
1137 Fuzzy Population-Based Meta-Heuristic Approaches for Attribute Reduction in Rough Set Theory

Authors: Mafarja Majdi, Salwani Abdullah, Najmeh S. Jaddi

Abstract:

Feature selection is one of the global combinatorial optimization problems in machine learning. It is concerned with removing irrelevant, noisy, and redundant data while keeping the original meaning of the data. Attribute reduction in rough set theory is an important feature selection method. Since attribute reduction is an NP-hard problem, it is necessary to investigate fast and effective approximate algorithms. In this paper, we propose two feature selection mechanisms based on memetic algorithms (MAs), which combine a genetic algorithm with a fuzzy record-to-record travel algorithm and a fuzzy-controlled great deluge algorithm, to identify a good balance between local search and genetic search. To verify the proposed approaches, numerical experiments were carried out on thirteen datasets. The results show that the MA approaches are efficient in solving attribute reduction problems when compared with other meta-heuristic approaches.
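
A compact sketch of the memetic idea, with a toy separable fitness standing in for rough-set attribute reduction and simple hill climbing standing in for the fuzzy record-to-record and great-deluge moves:

```python
import random

# Toy setup: 8 candidate features, of which TRUE_SET are informative;
# the fitness rewards selecting informative features and penalizes size.
N_FEATURES, TRUE_SET = 8, {0, 2, 5}

def fitness(mask):
    hits = sum(1 for i in TRUE_SET if mask[i])
    return hits - 0.1 * sum(mask)          # accuracy proxy minus size penalty

def local_search(mask, rng):
    # Memetic refinement: flip each bit once, keeping improving moves.
    best = list(mask)
    for i in rng.sample(range(N_FEATURES), N_FEATURES):
        cand = list(best); cand[i] = 1 - cand[i]
        if fitness(cand) > fitness(best):
            best = cand
    return best

def memetic(pop_size=10, generations=15, seed=1):
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(N_FEATURES)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]
        children = []
        for _ in range(pop_size - len(parents)):
            a, b = rng.sample(parents, 2)       # uniform crossover
            child = [a[i] if rng.random() < 0.5 else b[i]
                     for i in range(N_FEATURES)]
            children.append(local_search(child, rng))
        pop = parents + children
    return max(pop, key=fitness)

best_mask = memetic()
```

Refining every offspring with a local step is what distinguishes a memetic algorithm from a plain genetic algorithm: the genetic operators explore, the local search exploits.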

Keywords: rough set theory, attribute reduction, fuzzy logic, memetic algorithms, record to record algorithm, great deluge algorithm

Procedia PDF Downloads 450
1136 Operator Optimization Based on Hardware Architecture Alignment Requirements

Authors: Qingqing Gai, Junxing Shen, Yu Luo

Abstract:

Due to hardware architecture characteristics, some operators, such as the convolution and deconvolution commonly used in deep learning, tend to achieve better performance if the input/output tensor dimensions are aligned to a certain minimum granularity. If the requirements are not met, the general strategy is to pad with zeros to satisfy them, potentially leading to under-utilization of the hardware resources. Therefore, for convolutions and deconvolutions whose input and output channels do not meet the minimum granularity alignment, we propose to transfer W-dimensional data to the C-dimension for computation (W2C), enabling the C-dimension to meet the hardware requirements. This scheme also reduces the number of computations in the W-dimension. Although this scheme substantially increases computation, the operator's speed can improve significantly. It achieves remarkable speedups on multiple hardware accelerators, including Nvidia Tensor Cores, Qualcomm digital signal processors (DSPs), and Huawei neural processing units (NPUs). All that is required is to modify the network structure and rearrange the operator weights offline, without retraining. At the same time, for some operators, such as Reducemax, we observe that transferring C-dimensional data to the W-dimension (C2W) and replacing the Reducemax with a Maxpool can achieve acceleration under certain circumstances.
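
The C2W observation for Reducemax can be checked in a few lines of NumPy: a max-reduction over the channel axis of an NCHW tensor equals a max-pool with kernel and stride C after the channels are moved alongside the W axis (shapes and data here are illustrative only):

```python
import numpy as np

# N, C, H, W tensor; data and shapes are illustrative only.
rng = np.random.default_rng(0)
x = rng.standard_normal((2, 8, 4, 4))

reduce_max = x.max(axis=1)                 # plain Reducemax over C: (N, H, W)

# C2W: move the channel axis next to W and flatten, then "max-pool" with
# non-overlapping windows of size C (kernel = stride = C) along that axis.
n, c, h, w = x.shape
x_c2w = np.transpose(x, (0, 2, 3, 1)).reshape(n, h, w * c)
pooled = x_c2w.reshape(n, h, w, c).max(axis=-1)   # pooling windows of size c
```

The rearrangement is pure data movement, which is why the substitution can be done offline on weights and layouts without retraining.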

Keywords: convolution, deconvolution, W2C, C2W, alignment, hardware accelerator

Procedia PDF Downloads 101
1135 Improving the Frequency Response of a Circular Dual-Mode Resonator with a Reconfigurable Bandwidth

Authors: Muhammad Haitham Albahnassi, Adnan Malki, Shokri Almekdad

Abstract:

In this paper, a method for reconfiguring bandwidth in a circular dual-mode resonator is presented. The method concerns the optimized geometry of a structure that may be used to host the tuning elements, which are typically RF (Radio Frequency) switches. The tuning elements themselves, and their performance during tuning, are not the focus of this paper. The designed resonator is able to reconfigure its fractional bandwidth by adjusting the inter-coupling level between the degenerate modes, while at the same time improving its response by adjusting the external-coupling level and keeping the center frequency fixed. The inter-coupling level has been adjusted by changing the dimensions of the perturbation element, while the external-coupling level has been adjusted by changing one of the feeder dimensions. The design was arrived at via optimization. Simulation and measurement results of the designed and implemented filters are in agreement and show good improvement in return loss values and in the stability of the center frequency.

Keywords: dual-mode resonators, perturbation theory, reconfigurable filters, software defined radio, cognitive radio

Procedia PDF Downloads 162
1134 Research on Ultrafine Particles Classification Using Hydrocyclone with Annular Rinse Water

Authors: Tao Youjun, Zhao Younan

Abstract:

The separation of fine coal can be improved by pre-desliming: it was significantly enhanced when fine coal was processed in a Falcon concentrator after removal of the -45 μm slime. Ultrafine classification tests using a Krebs classification cyclone with annular rinse water showed that increasing the feed pressure can effectively prevent heavy particles passing into the overflow and light particles slipping into the underflow. Increasing the rinse water pressure reduced the content of fine-grained particles while increasing the classification size. Increasing the feed concentration had a negative effect on classification efficiency and increased the classification size, due to the enhanced hindered settling caused by high underflow concentration. Optimization experiments based on an orthogonal design in the Design-Expert software, with classification efficiency as the response indicator, showed that the optimal classification efficiency reached 91.32% at a feed pressure of 0.03 MPa, a rinse water pressure of 0.02 MPa, and a feed concentration of 12.5%. The corresponding classification size was 49.99 μm, in good agreement with the predicted value.

Keywords: hydrocyclone, ultrafine classification, slime, classification efficiency, classification size

Procedia PDF Downloads 163
1133 Modeling Heat-Related Mortality Based on Greenhouse Emissions in OECD Countries

Authors: Anderson Ngowa Chembe, John Olukuru

Abstract:

Greenhouse emissions by human activities are known to irreversibly increase global temperatures through the greenhouse effect. This study seeks to propose a mortality model with sensitivity to heat-change effects as one of the underlying parameters in the model. As such, the study sought to establish the relationship between greenhouse emissions and mortality indices in five OECD countries (USA, UK, Japan, Canada & Germany). Upon the establishment of the relationship using correlation analysis, an additional parameter that accounts for the sensitivity of heat-changes to mortality rates was incorporated in the Lee-Carter model. Based on the proposed model, new parameter estimates were calculated using iterative algorithms for optimization. Finally, the goodness of fit for the original Lee-Carter model and the proposed model were compared using deviance comparison. The proposed model provides a better fit to mortality rates especially in USA, UK and Germany where the mortality indices have a strong positive correlation with the level of greenhouse emissions. The results of this study are of particular importance to actuaries, demographers and climate-risk experts who seek to use better mortality-modeling techniques in the wake of heat effects caused by increased greenhouse emissions.
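
For reference, the baseline Lee-Carter fit (log m(x,t) = a_x + b_x·k_t) can be obtained from a rank-1 SVD of centred log mortality rates; the sketch below uses synthetic data and omits the heat-sensitivity parameter proposed in the paper:

```python
import numpy as np

# Synthetic log mortality surface log m(x,t) = a_x + b_x * k_t + noise.
rng = np.random.default_rng(0)
ages, years = 5, 20
log_m = (np.linspace(-6.0, -2.0, ages)[:, None]
         + np.outer(np.linspace(1.0, 0.2, ages), np.linspace(2.0, -2.0, years))
         + rng.normal(0.0, 0.01, (ages, years)))

a_x = log_m.mean(axis=1)                       # age effect
U, S, Vt = np.linalg.svd(log_m - a_x[:, None], full_matrices=False)
b_x, k_t = U[:, 0], S[0] * Vt[0]               # rank-1 age loading / period index

# Usual identification constraints: sum(b_x) = 1 and sum(k_t) = 0.
scale = b_x.sum()
b_x, k_t = b_x / scale, k_t * scale
shift = k_t.mean()
a_x, k_t = a_x + b_x * shift, k_t - shift

recon = a_x[:, None] + np.outer(b_x, k_t)      # fitted log rates
```

In the extended model, the period index k_t would gain a term driven by the emissions covariate, and the parameters would be re-estimated iteratively rather than by a single SVD.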

Keywords: climate risk, greenhouse emissions, Lee-Carter model, OECD

Procedia PDF Downloads 338
1132 Study of Behavior Tribological Cutting Tools Based on Coating

Authors: A. Achour L. Chekour, A. Mekroud

Abstract:

Tribology, the science of lubrication, friction, and wear, plays an important role at the 'crossroads' of sciences opened up by recent developments in industry. Its multidisciplinary nature reinforces its scientific interest. It covers all the sciences dealing with contact between two loaded solids in relative motion, and thus lies at the intersection of well-established disciplines such as solid and fluid mechanics, rheology, heat transfer, materials science, and chemistry. Its experimental approach is based on physics and on signal and image processing. The optimization of cutting-tool operating conditions must contribute significantly to the development and productivity of advanced automated machining, because implementation requires sufficient knowledge of how the process behaves, in particular the evolution of tool wear. In addition, technological advances have led to the use of very hard, refractory materials of poor machinability, which require highly wear-resistant tool materials. In this study, we present the wear behavior of a machining tool during a roughing operation as a function of the cutting parameters. The interpretation of the experimental results is based mainly on observations and analyses of the tool cutting edges using up-to-date techniques: scanning electron microscopy (SEM) and laser-beam optical roughness measurement.

Keywords: friction, wear, tool, cutting

Procedia PDF Downloads 329
1131 Optimization of Machine Learning Regression Results: An Application on Health Expenditures

Authors: Songul Cinaroglu

Abstract:

Machine learning regression methods are recommended as an alternative to classical regression methods when variables are difficult to model. Health expenditure data are typically non-normal and heavily skewed. This study aims to compare machine learning regression methods, with hyperparameter tuning, for predicting health expenditure per capita. A multiple regression model was constructed, and the performance of Lasso Regression, Random Forest Regression, and Support Vector Machine Regression was recorded for different hyperparameter assignments. The lambda (λ) value for Lasso Regression, the number of trees for Random Forest Regression, and the epsilon (ε) value for Support Vector Regression were treated as hyperparameters. Results obtained using k-fold cross validation, with k ranging from 5 to 50, indicate statistically significant differences between the machine learning regression results in terms of R², RMSE, and MAE values (p < 0.001). The results reveal that Random Forest Regression (R² > 0.7500, RMSE ≤ 0.6000, and MAE ≤ 0.4000) outperforms the other machine learning regression methods. It is highly advisable to use machine learning regression methods for modelling health expenditures.
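
The k-fold tuning loop can be sketched as below; ridge regression (closed form) stands in for the Lasso/Random Forest/SVR models compared in the study, and the data are synthetic:

```python
import numpy as np

# Synthetic regression data; ridge regression is a closed-form stand-in
# for the Lasso / Random Forest / SVR models compared in the study.
rng = np.random.default_rng(0)
n, p = 200, 5
X = rng.standard_normal((n, p))
w_true = np.array([1.0, -2.0, 0.0, 0.5, 0.0])
y = X @ w_true + rng.normal(0.0, 0.1, n)

def ridge_fit(Xtr, ytr, lam):
    # Closed-form ridge solution (X'X + lam I)^-1 X'y.
    return np.linalg.solve(Xtr.T @ Xtr + lam * np.eye(Xtr.shape[1]),
                           Xtr.T @ ytr)

def kfold_rmse(lam, k=5):
    folds = np.array_split(np.arange(n), k)
    errs = []
    for fold in folds:
        train = np.setdiff1d(np.arange(n), fold)
        w = ridge_fit(X[train], y[train], lam)
        errs.append(np.sqrt(np.mean((y[fold] - X[fold] @ w) ** 2)))
    return float(np.mean(errs))

# Sweep the regularization hyper-parameter and keep the best CV score.
scores = {lam: kfold_rmse(lam) for lam in (0.01, 1.0, 100.0)}
best_lam = min(scores, key=scores.get)
```

The same loop applies to any model/hyper-parameter pair: only `ridge_fit` and the swept values change.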

Keywords: machine learning, lasso regression, random forest regression, support vector regression, hyperparameter tuning, health expenditure

Procedia PDF Downloads 220
1130 Moral Rights: Judicial Evidence Insufficiency in the Determination of the Truth and Reasoning in Brazilian Morally Charged Cases

Authors: Rainner Roweder

Abstract:

Theme: The present paper aims to analyze the specificity of judicial evidence linked to dignity and personality rights, otherwise known as moral rights, in the determination of the truth and the formation of judicial reasoning in cases concerning these areas. This research is about the way courts in Brazilian domestic law search for truth and handle evidence in cases involving moral rights, which are abundant and important in Brazil. The main object of the paper is to analyze the effectiveness of evidence in the formation of judicial conviction in matters related to morally controverted rights, based on the Brazilian and, by comparison, other Latin American legal systems. In short, the rights of dignity and personality are moral. However, the evidential legal system expects a rational demonstration of moral rights that generates judicial conviction or persuasion. Morality, in turn, tends to be difficult or impossible to demonstrate in court, generating the problem considered in this paper: the study of the problem of demonstrating morality as proof in court. In this sense, the more a right is linked to morality, the more difficult it is to demonstrate in court, expanding the field of judicial discretion and generating legal uncertainty. More specifically, the new personality rights, such as gender, and the possibility of altering them, further amplify the problem, being essentially intimate matters that do not fit the objective, rational evidential system as it normally applies to other categories, such as contracts. Therefore, evidencing this legal category in court, with the level of security required by law, is a herculean task. It becomes virtually impossible to use the same evidentiary system when judging the rights researched here; this generates the need for a new design of the evidential task regarding personality rights, a central effort of the present paper.
Methodology: The inductive method was used in the investigation phase, together with the comparative law method; the inductive method was also used in the data treatment phase. Doctrinal, legislative, and jurisprudential comparison was the research technique used. Results: In addition to the peculiar characteristics of personality rights not found in other rights, part of them are essentially linked to morality and are not objectively verifiable by design, and it is necessary to use specific argumentative theories for their secure confirmation, such as interdisciplinary support. The traditional pragmatic theory of proof, having an obviously objective character, aggravates decisionism and generates legal insecurity when applied to rights linked to morality; its reconstruction is therefore necessary for morally charged cases, possibly using 'predictive theory' (and predictive facts) through algorithms for data collection and treatment.

Keywords: moral rights, proof, pragmatic proof theory, insufficiency, Brazil

Procedia PDF Downloads 107
1129 Improved Multi-Channel Separation Algorithm for Satellite-Based Automatic Identification System Signals Based on Artificial Bee Colony and Adaptive Moment Estimation

Authors: Peng Li, Luan Wang, Haifeng Fei, Renhong Xie, Yibin Rui, Shanhong Guo

Abstract:

The applications of the satellite-based automatic identification system (S-AIS) pave the way for wide-range maritime traffic monitoring and management. However, the satellite's field of view covers multiple AIS self-organizing networks, which leads to collisions between AIS signals from different cells. The contribution of this work is an improved multi-channel blind source separation algorithm, based on the Artificial Bee Colony (ABC) method and advanced stochastic optimization, to separate the mixed AIS signals. The proposed approach adopts a modified ABC algorithm to obtain an optimized initial separating matrix, which expedites the correction of the initialization bias, and utilizes Adaptive Moment Estimation (Adam) to update the separating matrix by dynamically adjusting the learning rate for each parameter. Simulation results show that the algorithm speeds up convergence and leads to better separation accuracy.
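
The Adam update used in the refinement stage is standard; a minimal sketch (applied here to a simple quadratic rather than a separating matrix) is:

```python
import numpy as np

# Generic Adam optimizer loop; here it minimizes f(x) = ||x - c||^2 so
# the behaviour of the update rule is easy to verify.
def adam_minimize(grad, x0, lr=0.05, beta1=0.9, beta2=0.999,
                  eps=1e-8, steps=1000):
    x = np.asarray(x0, dtype=float).copy()
    m = np.zeros_like(x)            # first-moment (mean) estimate
    v = np.zeros_like(x)            # second-moment estimate
    for t in range(1, steps + 1):
        g = grad(x)
        m = beta1 * m + (1 - beta1) * g
        v = beta2 * v + (1 - beta2) * g ** 2
        m_hat = m / (1 - beta1 ** t)            # bias-corrected moments
        v_hat = v / (1 - beta2 ** t)
        x -= lr * m_hat / (np.sqrt(v_hat) + eps)
    return x

target = np.array([3.0, -1.0])
x_min = adam_minimize(lambda x: 2.0 * (x - target), np.zeros(2))
```

In the paper's setting, `grad` would be the gradient of the separation cost with respect to the separating matrix, and the per-parameter adaptive step is what lets Adam cope with differently scaled matrix entries.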

Keywords: satellite-based automatic identification system, blind source separation, artificial bee colony, adaptive moment estimation

Procedia PDF Downloads 183
1128 Modeling Sustainable Truck Rental Operations Using Closed-Loop Supply Chain Network

Authors: Khaled S. Abdallah, Abdel-Aziz M. Mohamed

Abstract:

Moving industries consume numerous resources and dispose of masses of used packaging materials. Proper sorting, recycling, and disposal of the packaging materials is necessary to avoid a severe pollution disaster. This paper presents a conceptual model for sustainable truck rental operations in place of the regular ones. An optimization model was developed to select the locations of truck rental centers, collection sites, and maintenance and repair sites, and to identify the rental fees to be charged on all routes so as to maximize total closed-loop supply chain profits. Fixed costs of vehicle purchasing, costs of constructing collection and repair centers, and the fixed costs paid to use disposal and recycling centers are considered. Operating costs include truck maintenance and repair costs, the cost of recycling and disposing of the packing materials, and the costs of relocating the trucks. A mixed-integer model is developed, followed by a simulation model to examine the factors affecting its operation.

Keywords: modeling, truck rental, supply chain management

Procedia PDF Downloads 223
1127 Optimization of Fin Type and Fin per Inch on Heat Transfer and Pressure Drop of an Air Cooler

Authors: A. Falavand Jozaei, A. Ghafouri

Abstract:

Operational enhancement in an air cooler (heat exchanger) depends on the rate of heat transfer and the pressure drop. In this paper, for a given heat duty, the effects of FPI (fins per inch) and fin type (circular and hexagonal fins) on these two parameters are studied for an air cooler at Arvand Petrochemical in Iran. A program written in the EES (Engineering Equation Solver) software, together with the Aspen B-JAC and HTFS+ software, is used to solve the governing equations. First, the simulated results obtained from this program are compared to experimental data for two cases of FPI. The effect of FPI from 3 to 15 on the ratio of heat transfer (Q) to pressure drop (Q/Δp) is then examined; this ratio is one of the main parameters in the design, rating, and simulation of heat exchangers. The results show that heat transfer (Q) and pressure drop increase steadily with increasing FPI, while the Q/Δp ratio increases up to FPI = 12 (by about 47% for circular fins and about 69% for hexagonal fins) and then decreases gradually to FPI = 15 (by about 5% for circular fins and about 8% for hexagonal fins); the Q/Δp ratio is thus maximum at FPI = 12. An FPI value between 8 and 12 is obtained as the optimum for the heat transfer to pressure drop ratio. Comparing circular and hexagonal fins, the Q/Δp ratio of hexagonal fins exceeds that of circular fins for FPI between 8 and 12 (the optimum FPI range).

Keywords: air cooler, circular and hexagonal fins, fin per inch, heat transfer and pressure drop

Procedia PDF Downloads 448
1126 Statistical Manufacturing Cell/Process Qualification Sample Size Optimization

Authors: Angad Arora

Abstract:

In production and manufacturing operations, a cell or line is typically a group of similar machines (computer numerical control (CNC) machines, advanced cutting, 3D printing, or special-purpose machines). To qualify a typical manufacturing line, cell, or new process, we ideally need a sample of parts that can be flown through the process, after which we make a judgment on the health of the line or cell. However, with huge volumes and mass-production scope, as in the mobile phone industry, for example, the actual cells or lines can number in the thousands, and qualifying each of them with statistical confidence means using very large samples, which adds to product and manufacturing cost plus considerable waste if the parts are not intended to be shipped to customers. To solve this, we propose a two-step statistical approach. We start with a small sample size and then objectively evaluate whether the process needs additional samples or not. For example, if a process is producing bad parts and we see them in the early samples, there is a high chance that the process will not meet the desired yield, and there is no point in adding more samples. We used this hypothesis to develop a two-step binomial testing approach. Further, we show through results that we can achieve an 18-25% reduction in samples while keeping the same statistical confidence.
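
One hedged reconstruction of the two-step binomial idea: stop after the first small sample if the observed defect count is already statistically implausible under the target yield; the thresholds and sample sizes below are illustrative, not the paper's actual plan:

```python
from scipy.stats import binom

TARGET_YIELD = 0.95   # yield the process must demonstrate (illustrative)
ALPHA = 0.05          # evidence threshold (illustrative)

def defects_implausible(n, defects):
    # P(seeing this many or more defects | true yield == TARGET_YIELD)
    p_tail = 1.0 - binom.cdf(defects - 1, n, 1.0 - TARGET_YIELD)
    return p_tail < ALPHA

def qualify(first_n, first_defects, second_n, second_defects):
    """Return (passed, samples_used) for the two-step plan."""
    if defects_implausible(first_n, first_defects):
        return False, first_n          # stop early, saving second_n parts
    total_n = first_n + second_n
    total_defects = first_defects + second_defects
    return not defects_implausible(total_n, total_defects), total_n

early = qualify(30, 6, 70, 0)    # bad process caught at step one
full = qualify(30, 1, 70, 2)     # healthy process uses the full sample
```

The saving comes from the early-stop branch: a clearly bad cell is rejected after the small first sample instead of consuming the full qualification lot.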

Keywords: statistics, data science, manufacturing process qualification, production planning

Procedia PDF Downloads 90
1125 Effectiveness of Column Geometry in High-Rise Buildings

Authors: Man Singh Meena

Abstract:

Structural engineers are facing new kinds of challenges due to the innovative and bold ideas of architects who aim to make every structure unique. In RCC frame structures, columns of different geometries can be used in the design, and rectangular columns can be placed in different orientations. The analysis and design of structures can be carried out with several available software packages, i.e., STAAD Pro, ETABS, and TEKLA. In recent times, high-rise building modelling and analysis is often done in ETABS due to features superior to those of other software. The case study in this paper mainly emphasizes the structural behavior of a high-rise building for different column shape configurations (circular, square, rectangular, and rectangular with 90-degree rotation) on a rectangular plan. The cross-sectional area of the columns is kept the same for all shapes so that the effect of geometry alone, at equal concrete area, can be studied. A 20-storey R.C.C. framed building is modelled in ETABS for analysis. After analysis of the structure, the maximum bending moments, shear forces, and maximum longitudinal reinforcement are computed and compared for three different storey structures to identify the effectiveness of the column geometry.
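The effect of shape and orientation at equal concrete area can be illustrated with a back-of-envelope second-moment-of-area comparison (a sketch of the underlying mechanics, not the ETABS analysis in the study; the area value is hypothetical):

```python
# Illustrative check: for equal cross-sectional area, different column
# shapes and orientations give different second moments of area I, which
# drives the bending stiffness compared in the study.
import math

A = 0.36  # m^2, hypothetical column area (e.g. a 600 x 600 mm square)

# Circular: A = pi*d^2/4  ->  d = sqrt(4A/pi), I = pi*d^4/64
d = math.sqrt(4 * A / math.pi)
I_circle = math.pi * d**4 / 64

# Square: side a = sqrt(A), I = a^4/12
a = math.sqrt(A)
I_square = a**4 / 12

# Rectangular with h = 2b: A = 2b^2; rotating the same section by 90
# degrees swaps the strong and weak bending axes.
b = math.sqrt(A / 2)
I_rect_strong = b * (2 * b)**3 / 12   # bending about the strong axis
I_rect_weak = (2 * b) * b**3 / 12     # same column rotated 90 degrees

for name, I in [("circle", I_circle), ("square", I_square),
                ("rect strong", I_rect_strong), ("rect weak", I_rect_weak)]:
    print(f"{name:12s} I = {I:.4f} m^4")
```
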

Keywords: high-rise building, column geometry, building modelling, ETABS analysis, building design, structural analysis, structural optimization

Procedia PDF Downloads 76
1124 Suggestion of Methodology to Detect Building Damage Level Collectively with Flood Depth Utilizing Geographic Information System at Flood Disaster in Japan

Authors: Munenari Inoguchi, Keiko Tamura

Abstract:

In 2019, Japan suffered earthquake, typhoon, and flood disasters. In particular, 38 of the 47 prefectures were affected by Typhoon #1919, which occurred in October 2019. In this disaster, 99 people died, three went missing, and 484 were injured; furthermore, 3,081 buildings totally collapsed and 24,998 were half-collapsed. Once a disaster occurs, local responders have to inspect the damage level of each building in order to certify the building damage for survivors, who need the certificates to start their life reconstruction process. In this disaster, the total number of buildings to be inspected was very high. Given this situation, the Cabinet Office of Japan approved an efficient way to detect building damage levels, namely collective detection. However, it provided only a guideline, and local responders had to establish a concrete and reliable method by themselves. To address this issue, we decided to establish an effective and efficient methodology to detect building damage levels collectively from flood depth. Since flood depth depends on land elevation, we decided to utilize a GIS (Geographic Information System) to analyze the elevation spatially. We focused on spatial interpolation, an analysis tool usually used to survey groundwater levels. In establishing the methodology, we considered four key points: 1) how to satisfy the conditions defined in the guideline approved by the Cabinet Office for detecting building damage levels; 2) how to make the resulting damage levels acceptable to survivors; 3) how to maintain equity and fairness, because the detection of building damage levels is executed by a public institution; 4) how to reduce the cost in time and human resources, because responders do not have enough of either for disaster response. We then proposed a five-step methodology for collectively detecting building damage levels from flood depth utilizing GIS.
The first step is to obtain the boundary of the flooded area. The second is to collect actual flood depths as samples over the flooded area. The third is to apply spatial interpolation to the sampled flood depths to derive a two-dimensional flood-depth surface. The fourth is to divide the area into blocks of four flood-depth categories (non-flooded, above floor level to 100 cm, 100 cm to 180 cm, and over 180 cm), following road lines so that the result is acceptable to survivors. The fifth is to assign a flood-depth level to each building. In Koriyama City, Fukushima Prefecture, we proposed the methodology of collective detection of building damage levels as described above, and local responders decided to adopt it for Typhoon #1919 in 2019. Together with the local responders, we then collectively detected the damage level of over 1,000 buildings. We received positive feedback that the methodology was simple and reduced the cost in time and human resources.
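Steps three to five can be sketched with inverse-distance weighting (IDW), a common spatial-interpolation choice; the abstract does not name its exact interpolation method, and the sample points below are invented for illustration.

```python
# Sketch of steps 3-5 using inverse-distance weighting (IDW), a common
# spatial-interpolation choice; the paper does not name its exact method,
# and the sample points below are invented for illustration.

def idw_depth(x, y, samples, power=2):
    """Interpolate flood depth at (x, y) from sampled (sx, sy, depth) points."""
    num = den = 0.0
    for sx, sy, depth in samples:
        d2 = (x - sx) ** 2 + (y - sy) ** 2
        if d2 == 0:
            return depth                 # exactly on a sample point
        w = 1.0 / d2 ** (power / 2)
        num += w * depth
        den += w
    return num / den

def damage_category(depth_cm):
    """Four categories from the thresholds given in the abstract."""
    if depth_cm <= 0:
        return "non-flooded"
    if depth_cm < 100:
        return "over the floor to 100 cm"
    if depth_cm < 180:
        return "100 cm to 180 cm"
    return "over 180 cm"

# Hypothetical field samples: (x, y, measured depth in cm)
samples = [(0, 0, 0), (100, 0, 120), (0, 100, 60), (100, 100, 200)]
depth = idw_depth(50, 50, samples)      # building at the center
print(damage_category(depth))
```
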

Keywords: building damage inspection, flood, geographic information system, spatial interpolation

Procedia PDF Downloads 121
1123 Optimizing Resource Allocation and Indoor Location Using Bluetooth Low Energy

Authors: Néstor Álvarez-Díaz, Pino Caballero-Gil, Héctor Reboso-Morales, Francisco Martín-Fernández

Abstract:

The "Internet of Things" (IoT) trend has developed rapidly in recent years, causing the emergence of innovative communication methods among multiple devices. The appearance of Bluetooth Low Energy (BLE) has given IoT a push in relation to smartphones. At the moment, a set of new applications related to topics such as entertainment and advertising has begun to be developed, but little has been done so far to take advantage of the potential that these technologies can offer in many business areas and in everyday tasks. In the present work, the application of BLE technology and smartphones is proposed for business areas related to the optimization of resource allocation in large facilities such as airports. An indoor location system has been developed through triangulation methods with the use of BLE beacons. The described system can be used to locate all employees inside the building so that any task can be automatically assigned to a group of employees. It should be noted that this system can not only link needs with employees according to distance but also takes into account other factors such as occupation level or category. In addition, it has been endowed with a security system to manage sensitive business and personnel data. The efficiency of communications is another essential characteristic that has been taken into account in this work.
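One common realization of such beacon-based positioning is trilateration: RSSI is converted to distance with a log-distance path-loss model, and the position is solved from three distance circles. This is a sketch of that standard technique, not necessarily the authors' algorithm; the beacon positions, TX power, and path-loss exponent are illustrative assumptions.

```python
# Minimal sketch of beacon-based indoor positioning: RSSI -> distance via a
# log-distance path-loss model, then 2D trilateration from three fixed
# beacons. All parameter values are illustrative assumptions.
import math

def rssi_to_distance(rssi, tx_power=-59, n=2.0):
    """Log-distance path loss: rssi = tx_power - 10*n*log10(d)."""
    return 10 ** ((tx_power - rssi) / (10 * n))

def trilaterate(p1, d1, p2, d2, p3, d3):
    """Intersect three distance circles by solving the linearized system."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = d1**2 - d2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x2), 2 * (y3 - y2)
    c2 = d2**2 - d3**2 + x3**2 - x2**2 + y3**2 - y2**2
    det = a1 * b2 - a2 * b1
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)

beacons = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]  # assumed beacon layout (m)
true_pos = (3.0, 4.0)
dists = [math.dist(true_pos, b) for b in beacons]
print(trilaterate(beacons[0], dists[0], beacons[1], dists[1],
                  beacons[2], dists[2]))  # recovers approximately (3.0, 4.0)
```
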

Keywords: bluetooth low energy, indoor location, resource assignment, smartphones

Procedia PDF Downloads 388
1122 Possibility Theory Based Multi-Attribute Decision-Making: Application in Facility Location-Selection Problem under Uncertain and Extreme Environment

Authors: Bezhan Ghvaberidze

Abstract:

A fuzzy multi-objective facility location-selection problem (FLSP) under uncertain and extreme environments is developed based on possibility theory. The model's uncertain parameters are presented as q-rung orthopair fuzzy values and transformed into the Dempster-Shafer belief structure environment. An objective function, a distribution centers' selection ranking index, is constructed as an extension of Dempster's extremal expectations under discriminative q-rung orthopair fuzzy information. Experts evaluate each humanitarian aid distribution center (HADC) against each of the uncertain factors. The HADC location problem is reduced to a bicriteria problem of partitioning the set of customers by the set of centers: (1) minimization of transportation costs; (2) maximization of the centers' selection ranking indexes. Partitioning-type constraints are also constructed. A numerical example of the facility location-selection problem illustrates the obtained results.
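The q-rung orthopair representation can be illustrated with a minimal sketch; the score function below is the standard one from the q-rung literature, not the paper's ranking-index construction.

```python
# A small sketch of q-rung orthopair fuzzy values: a pair (mu, nu) of
# membership and non-membership degrees is admissible when
# mu**q + nu**q <= 1, and a common score function ranks alternatives by
# mu**q - nu**q. This illustrates only the representation, not the paper's
# ranking index.

def is_q_rung_orthopair(mu, nu, q):
    """Admissibility condition for a q-rung orthopair fuzzy value."""
    return 0 <= mu <= 1 and 0 <= nu <= 1 and mu**q + nu**q <= 1

def score(mu, nu, q):
    """A standard score function; a larger score means a better alternative."""
    return mu**q - nu**q

# (0.9, 0.8) is not admissible as an intuitionistic pair (q = 1), but it is
# admissible for q = 5; this is why q-rung sets can model extreme opinions.
print(is_q_rung_orthopair(0.9, 0.8, 1))  # False: 0.9 + 0.8 > 1
print(is_q_rung_orthopair(0.9, 0.8, 5))  # True: 0.9**5 + 0.8**5 < 1
```
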

Keywords: FLSP, multi-objective combinatorial optimization problem, evidence theory, HADC, q-rung orthopair fuzzy set, possibility theory

Procedia PDF Downloads 116
1121 The Use of Nano-Crystalline Starch in Probiotic Yogurt and Its Effects on the Physicochemical and Biological Properties

Authors: Ali Seirafi

Abstract:

The purpose of this study was to investigate the effect and application of starch nanocrystals on the physicochemical and microbial properties of industrially produced probiotic yogurt. Probiotic yogurt was manufactured by an industrial method with optimization and control of the technological factors affecting the probiotic biomass, using the probiotic bacteria Lactobacillus acidophilus and Bifidobacterium bifidum together with commonly used yogurt starter cultures. Afterwards, the effects of different fat levels (1.3%, 2.5%, and 4%) were evaluated, as well as the effects of various prebiotic compounds, including starch nanocrystals (0.5%, 1%, and 1.5%), galacto-oligosaccharide (0.5%, 1%, and 1.5%), and fructooligosaccharide (0.5%, 1%, and 1.5%). In addition, the effect of packaging (polyethylene and glass) was studied, while pH changes and final acidity were measured at each stage. All experiments were performed in triplicate, and the results were analyzed in a completely randomized design with SAS version 9.1. The results showed that the addition of starch nanocrystals and the use of glass packaging had the most positive effects on the survival of Lactobacillus acidophilus, while the addition of nanocrystals and an increased product cooling rate had the most positive effects on the survival of Bifidobacterium bifidum.

Keywords: Bifidobacterium bifidum, Lactobacillus acidophilus, prebiotics, probiotic yogurt

Procedia PDF Downloads 156
1120 Decomposition of the Customer-Server Interaction in Grocery Shops

Authors: Andreas Ahrens, Ojaras Purvinis, Jelena Zascerinska

Abstract:

A successful shopping experience without overcrowded shops and long waiting times undoubtedly leads to the release of happiness hormones and is generally considered the goal of any optimization. Factors influencing the shopping experience can be divided into internal and external ones. External factors relate, e.g., to the arrival of customers at the shop, whereas internal ones are linked with the checkout process itself (waiting in the queue at the cash register, the scanning of the goods, and the payment process) or with any other unexpected delay when a visitor becomes a buyer by choosing goods or items. This paper divides the customer-server interaction into five phases, starting with the customer's arrival at the shop, followed by the selection of goods, waiting in the queue at the cash register, and the payment process, and ending with the customer's departure. Our simulation results show how the five phases are intertwined and influence the overall shopping experience. Parameters for measuring the shopping experience are estimated based on the burstiness level in each of the five phases of the customer-server interaction.
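The five phases can be sketched as a minimal single-register simulation with exponential phase durations; the means are invented, and the model is far simpler than the paper's burstiness analysis.

```python
# Minimal sketch of the five-phase customer flow with one cash register.
# Phase durations are exponential draws with invented means; this only
# illustrates how queueing couples the phases.
import random

def simulate(n_customers, mean_interarrival=2.0, mean_selection=5.0,
             mean_service=1.5, seed=42):
    """Return the mean waiting time at the register (phase 3)."""
    rng = random.Random(seed)
    # Phases 1-2: arrival at the shop, then goods selection.
    t, reach_queue = 0.0, []
    for _ in range(n_customers):
        t += rng.expovariate(1.0 / mean_interarrival)
        reach_queue.append(t + rng.expovariate(1.0 / mean_selection))
    # Phases 3-4: FIFO queue at a single register, then scanning/payment.
    reach_queue.sort()
    free_at, waits = 0.0, []
    for rq in reach_queue:
        start = max(rq, free_at)
        waits.append(start - rq)
        free_at = start + rng.expovariate(1.0 / mean_service)  # phase 5 follows
    return sum(waits) / len(waits)

print(f"mean wait at register: {simulate(10_000):.2f} time units")
```
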

Keywords: customers’ burstiness, cash register, customers’ waiting time, gap distribution function

Procedia PDF Downloads 146
1119 Additive Manufacturing with Ceramic Filler

Authors: Irsa Wolfram, Boruch Lorenz

Abstract:

Innovative solutions in additive manufacturing applying material extrusion for functional parts necessitate innovative filaments of consistent quality. Uniform homogeneity and a consistent dispersion of the particles embedded in filaments generally require multiple extrusion cycles or well-prepared primal matter produced by injection molding, kneader machines, or mixing equipment. These technologies depend on dedicated equipment that is rarely at the disposal of production laboratories unfamiliar with research in polymer materials. This stands in contrast to laboratories that investigate complex material topics and technology science to leverage the potential of 3-D printing. Consequently, scientific studies in such labs are often constrained to the filler compositions and concentrations offered on the market. Therefore, we introduce a prototypal laboratory methodology, scalable to tailored primal matter, for extruding ceramic composite filaments with fused filament fabrication (FFF) technology. A desktop single-screw extruder serves as the core device for the experiments. Custom-made filaments encapsulate the ceramic fillers in polylactide (PLA), a thermoplastic polyester, which serves as the primal matter and is processed in the melting zone of the extruder, preserving the defined filler concentration. Validated results demonstrate that this approach enables continuously produced and uniform composite filaments with consistent homogeneity. The filament is 3-D printable with controllable dimensions, which is a prerequisite for any scalable application. Additionally, digital microscopy confirms the steady dispersion of the ceramic particles in the composite filament, which permits a 2D reconstruction of the planar distribution of the embedded ceramic particles in the PLA matrices. The innovation of the introduced method lies in the smart simplicity of preparing the composite primal matter.
It circumvents the inconvenience of numerous extrusion operations and expensive laboratory equipment. Nevertheless, it delivers consistent filaments of controlled, predictable, and reproducible filler concentration, which is the prerequisite for any industrial application. The introduced prototypal laboratory methodology appears applicable to other polymer matrices and suitable for further functional particle types beyond ceramic fillers. This opens a roadmap for further laboratory development of special-purpose composite filaments, providing value for industries and societies. This low-threshold entry to the sophisticated preparation of composite filaments, enabling businesses to create their own dedicated filaments, will support the mutual effort of establishing 3D printing for new functional devices.

Keywords: additive manufacturing, ceramic composites, complex filament, industrial application

Procedia PDF Downloads 103
1118 Optimization of Surface Roughness by Taguchi’s Method for Turning Process

Authors: Ashish Ankus Yerunkar, Ravi Terkar

Abstract:

This study aimed at evaluating the best process environment that can simultaneously satisfy the requirements of both quality and productivity, with special emphasis on the reduction of cutting-tool flank wear, because a reduction in flank wear ensures an increase in tool life. The predicted optimal setting ensured minimization of surface roughness. This paper focuses on the analysis of the optimum cutting conditions to obtain the lowest surface roughness in turning SCM 440 alloy steel by the Taguchi method. The experiment was designed using the Taguchi method; 18 experimental runs were designed and conducted. The results were analyzed using the ANOVA method. The Taguchi analysis showed that the depth of cut plays the most significant role in producing lower surface roughness, followed by feed; the cutting speed plays a lesser role. Machine-tool vibration and tool chatter are other factors that may contribute to poor surface roughness; such factors were ignored in the analysis. The inferences of this method will be useful to other researchers in similar studies and may be vital for further research on tool vibration, cutting forces, etc.
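For a response such as surface roughness that is to be minimized, Taguchi analysis typically uses the "smaller-the-better" signal-to-noise ratio. A small sketch with invented Ra readings (not the paper's measurements):

```python
# Taguchi "smaller-the-better" signal-to-noise ratio, used when the response
# (here surface roughness Ra) should be minimized. Ra values are invented.
import math

def sn_smaller_the_better(values):
    """S/N = -10 * log10(mean of squared responses); higher S/N is better."""
    return -10 * math.log10(sum(v * v for v in values) / len(values))

# Hypothetical Ra readings (micrometres) for two parameter settings:
setting_a = [1.8, 2.0, 1.9]   # larger roughness -> lower S/N
setting_b = [0.9, 1.1, 1.0]   # smaller roughness -> higher S/N
print(sn_smaller_the_better(setting_a))
print(sn_smaller_the_better(setting_b))  # setting_b wins (higher S/N)
```
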

Keywords: surface roughness (ra), machining, dry turning, taguchi method, turning process, anova method, mahr perthometer

Procedia PDF Downloads 365
1117 Operational Excellence Performance in Pharmaceutical Quality Control Labs: An Empirical Investigation of the Effectiveness and Efficiency Relation

Authors: Stephan Koehler, Thomas Friedli

Abstract:

Performance measurement has evolved over time from a unidimensional, short-term, efficiency-focused approach into a balanced multidimensional approach. Today, integrated performance measurement frameworks are often used to avoid local optimization and to encourage continuous improvement of an organization. In the literature, the multidimensional characteristic of performance measurement is often described by competitive priorities. At the same time, at the highest abstraction level, an effectiveness and an efficiency dimension of performance measurement can be distinguished. This paper aims at a better understanding of the composition of effectiveness and efficiency and their relation in pharmaceutical quality control labs. The research comprises a lab-specific operationalization of effectiveness and efficiency and examines how the two dimensions are interlinked. The basis for the analysis is a database of the University of St. Gallen covering a diverse set of 40 pharmaceutical quality control labs. The research provides empirical evidence that labs with high effectiveness also exhibit high efficiency: lab effectiveness explains 29.5% of the variance in lab efficiency. In addition, labs with an above-median operational excellence performance have statistically significantly higher lab effectiveness and lab efficiency compared to the below-median performing labs.
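The reported 29.5% of explained variance corresponds to the R² of a simple regression of lab efficiency on lab effectiveness. A minimal sketch with invented lab scores (not the study's data, whose R² was 0.295):

```python
# R^2 of a simple least-squares regression of efficiency on effectiveness.
# The lab scores below are invented placeholders, not the study's data.
def r_squared(x, y):
    """Coefficient of determination for a simple linear fit of y on x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    intercept = my - slope * mx
    ss_res = sum((yi - (slope * xi + intercept)) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    return 1 - ss_res / ss_tot

effectiveness = [60, 72, 55, 80, 68, 75]   # hypothetical lab scores
efficiency    = [58, 70, 60, 78, 65, 74]
print(f"R^2 = {r_squared(effectiveness, efficiency):.3f}")
```
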

Keywords: empirical study, operational excellence, performance measurement, pharmaceutical quality control lab

Procedia PDF Downloads 155
1116 Multi-Agent System for Irrigation Using Fuzzy Logic Algorithm and Open Platform Communication Data Access

Authors: T. Wanyama, B. Far

Abstract:

Automatic irrigation systems conveniently protect landscape investments. While conventional irrigation systems are known to be inefficient, automated ones have the potential to optimize water usage. In fact, there is a new generation of irrigation systems that are smart in the sense that they monitor the weather, soil conditions, evaporation, and plant water use, and automatically adjust the irrigation schedule. In this paper, we present an agent-based smart irrigation system. The agents are built using a mix of commercial off-the-shelf software, including MATLAB, Microsoft Excel, and the KEPServerEX 5 OPC server, together with custom-written code. The Irrigation Scheduler Agent uses fuzzy logic to integrate the information that affects the irrigation schedule. In addition, the multi-agent system uses Open Platform Communications (OPC) technology to share data. OPC technology enables the Irrigation Scheduler Agent to communicate over the Internet, making the system scalable to a municipal or regional agent-based water monitoring, management, and optimization system. Finally, this paper presents simulation and pilot-installation test results that show the operational effectiveness of our system.
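The fuzzy-logic scheduling idea can be sketched with a toy rule base; the membership functions, rules, and output minutes below are illustrative assumptions, not the authors' MATLAB implementation.

```python
# Toy sketch of fuzzy irrigation scheduling: soil moisture and temperature
# are fuzzified with triangular sets, and a zero-order Sugeno rule base
# yields an irrigation duration. All set boundaries and rule outputs are
# illustrative assumptions, not the authors' rule base.

def tri(x, a, b, c):
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def irrigation_minutes(moisture_pct, temp_c):
    # Fuzzify the inputs.
    dry, wet = tri(moisture_pct, -1, 0, 50), tri(moisture_pct, 30, 100, 101)
    cool, hot = tri(temp_c, -1, 0, 25), tri(temp_c, 15, 40, 41)
    # Rules: (firing strength, crisp output in minutes).
    rules = [
        (min(dry, hot), 45.0),   # dry AND hot  -> irrigate long
        (min(dry, cool), 25.0),  # dry AND cool -> irrigate moderately
        (min(wet, hot), 10.0),   # wet AND hot  -> irrigate briefly
        (min(wet, cool), 0.0),   # wet AND cool -> skip irrigation
    ]
    # Weighted-average defuzzification.
    total = sum(w for w, _ in rules)
    return sum(w * out for w, out in rules) / total if total else 0.0

print(irrigation_minutes(10, 35))  # dry, hot day -> long watering
print(irrigation_minutes(80, 18))  # wet, cool day -> little watering
```
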

Keywords: community water usage, fuzzy logic, irrigation, multi-agent system

Procedia PDF Downloads 292