Search results for: Nina Distribution Company branches
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 2483

2033 Sustainability Reporting and Performances of the Companies in the Istanbul Stock Exchange Sustainability Index

Authors: Zeynep Şahin, Züleyha Yılmaz, Fikret Çankaya

Abstract:

In today's business world, where survival is difficult, the economic life of products, services and knowledge is considerably reduced, and competitors can produce similar or extra-featured products almost instantly. In this environment, a company's contribution to its social and economic environment has become a criterion that consumers weigh alongside products or services, so consumers need more detailed information about companies. This drastic change in the market also encourages companies to become sustainable. A sustainable business puts back what it consumes. Corporate sustainability corresponds to sustainability at the level of the company and gives equal importance to company growth and profitability on the one hand and environmental and social issues on the other. The BIST Sustainability Index has been calculated by the Istanbul Stock Exchange (BIST) since 2014 to evaluate the sustainability performance of companies in Turkey. The main objective of this study is to present the importance of sustainability reports in Turkey. To this aim, the performances of 15 companies in the BIST Sustainability Index were compared for the periods before and after entering the index. A further objective is to encourage sustainability reporting practices and to keep the issue on the agenda so that more studies are carried out. To achieve these objectives, the financial data of the companies for the periods before and after entering the BIST Sustainability Index were analyzed using a t-test in the Statistical Package for the Social Sciences (SPSS). The results showed no significant difference between the performances of the companies in terms of net profit margin, return on assets and return on equity between these periods. It can therefore be said that insufficient importance is given to sustainability issues in Turkey, possibly because of a lack of awareness due to the recent introduction and calculation of the index. It is expected that the awareness of firms and investors about sustainability will increase and that they will give the issue the necessary attention over time.

Keywords: BIST sustainability index, firm performance, sustainability, sustainability reporting.
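As a rough illustration of the before-and-after comparison described in the abstract, the sketch below runs a paired t-test on hypothetical per-company profitability ratios with SciPy rather than SPSS; the company figures and the 5% significance level are invented for the example, not taken from the study.

```python
import numpy as np
from scipy import stats

# Hypothetical net profit margins (%) for the same companies
# before and after entering a sustainability index.
before = np.array([5.2, 7.1, 3.8, 6.4, 4.9, 8.0, 5.5, 6.2])
after  = np.array([5.6, 6.8, 4.1, 6.9, 4.7, 8.3, 5.4, 6.5])

# Paired t-test: were the two periods significantly different?
t_stat, p_value = stats.ttest_rel(after, before)
print(f"t = {t_stat:.3f}, p = {p_value:.3f}")
if p_value > 0.05:
    print("No significant difference at the 5% level.")
```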

2032 Accurate Crosstalk Analysis for RLC On-Chip VLSI Interconnect

Authors: Susmita Sahoo, Madhumanti Datta, Rajib Kar

Abstract:

This work proposes an accurate crosstalk noise estimation method in the presence of multiple RLC lines for use in design automation tools. The method correctly models the loading effects of non-switching aggressors and aggressor tree branches using the resistive shielding effect and realistic exponential input waveforms. Expressions for the noise peak and width have been derived. The results are in good agreement with SPICE results: the average error is 4.7% for the noise peak and 6.15% for the width, while allowing a very fast analysis.

Keywords: Crosstalk, distributed RLC segments, On-Chip interconnect, output response, VLSI, noise peak, noise width.

2031 Estimation of Bayesian Sample Size for Binomial Proportions Using Areas P-tolerance with Lowest Posterior Loss

Authors: H. Bevrani, N. Najafi

Abstract:

This paper uses p-tolerance with the lowest posterior loss, a quadratic loss function, the average length criterion, the average coverage criterion, and the worst outcome criterion to compute the sample size required to estimate a proportion in the binomial probability model with a Beta prior distribution. The proposed methodology is examined, and its effectiveness is shown.

Keywords: Bayesian inference, Beta-binomial distribution, LPL criteria, quadratic loss function.
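A minimal sketch of one of the criteria named above, the average length criterion, for a binomial proportion with a Beta prior: the smallest n is sought whose posterior credible interval, averaged over the Beta-Binomial prior predictive, is shorter than a target length. The Beta(1, 1) prior and the 0.10 target length are arbitrary illustrative choices, not values from the paper.

```python
import numpy as np
from scipy.stats import beta, betabinom

def avg_interval_length(n, a, b, level=0.95):
    """Average length of the equal-tailed posterior credible interval,
    averaged over the prior predictive (Beta-Binomial) distribution of x."""
    xs = np.arange(n + 1)
    weights = betabinom.pmf(xs, n, a, b)          # prior predictive P(X = x)
    lo = beta.ppf((1 - level) / 2, a + xs, b + n - xs)
    hi = beta.ppf(1 - (1 - level) / 2, a + xs, b + n - xs)
    return np.sum(weights * (hi - lo))

def sample_size_alc(a, b, target_len, level=0.95, n_max=5000):
    """Smallest n whose average credible-interval length is <= target_len."""
    for n in range(1, n_max + 1):
        if avg_interval_length(n, a, b, level) <= target_len:
            return n
    return None

# Example: Beta(1, 1) prior, target average interval length of 0.10.
print(sample_size_alc(a=1.0, b=1.0, target_len=0.10))
```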

2030 Key Exchange Protocol over Insecure Channel

Authors: Alaa Fahmy

Abstract:

Key management is a major and the most sensitive part of cryptographic systems. It includes key generation, key distribution, key storage, and key deletion, and is also considered the hardest part of cryptography. Designing secure cryptographic algorithms is hard, and keeping the keys secret is much harder. Cryptanalysts usually attack both symmetric and public-key cryptosystems through their key management. We introduce a protocol to exchange cipher keys over an insecure communication channel. The protocol is based on a public-key cryptosystem, in particular an elliptic curve cryptosystem. It also tests the cipher keys, selecting only the good keys and rejecting the weak ones.

Keywords: Key management and key distribution.
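The abstract does not reproduce the protocol itself; as a generic point of reference only, the sketch below performs a plain elliptic-curve Diffie-Hellman exchange with the third-party cryptography package and derives a symmetric cipher key from the shared secret. It is not the authors' protocol and it omits their key-quality testing step.

```python
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

# Each party generates an ephemeral key pair on the same curve.
alice_priv = ec.generate_private_key(ec.SECP256R1())
bob_priv = ec.generate_private_key(ec.SECP256R1())

# Only the public keys need to cross the insecure channel.
alice_shared = alice_priv.exchange(ec.ECDH(), bob_priv.public_key())
bob_shared = bob_priv.exchange(ec.ECDH(), alice_priv.public_key())
assert alice_shared == bob_shared

# Derive a 256-bit cipher key from the shared secret.
cipher_key = HKDF(algorithm=hashes.SHA256(), length=32,
                  salt=None, info=b"key-exchange-demo").derive(alice_shared)
print(cipher_key.hex())
```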

2029 An Evaluation Method of Accelerated Storage Life Test for Typical Mechanical and Electronic Products

Authors: Jinyong Yao, Hongzhi Li, Chao Du, Jiao Li

Abstract:

The reliability of long-term storage products affects the availability of the whole system, so evaluating their storage life is essential. These products are usually highly reliable, and little failure information can be collected. In this paper, an analytical method based on data from an accelerated storage life test is proposed to evaluate the reliability index of long-term storage products. First, singularities are eliminated by data normalization and residual analysis. Second, with the preprocessed data, a degradation path model is built to obtain pseudo life values. Then, under a hypothesized life distribution, the parameters at the high stress levels are estimated and failure mechanism consistency is verified. Finally, the life distribution under the normal stress level is extrapolated via the acceleration model, allowing the actual average life to be evaluated. An application example with a camera stabilization device is provided to illustrate the proposed methodology.

Keywords: Accelerated storage life test, failure mechanism consistency, life distribution, reliability.
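To give a feel for the workflow described above (degradation path fitting, pseudo life values, acceleration-model extrapolation), here is a minimal sketch with entirely hypothetical drift data, a linear degradation path, a lognormal life assumption and an Arrhenius acceleration model; the paper's actual models and data may differ.

```python
import numpy as np
from scipy import stats

# Hypothetical degradation measurements: drift vs. time (hours) for several
# units at two elevated temperatures; failure when drift reaches 1.0.
THRESHOLD = 1.0
tests = {
    # temperature (K): list of (times, drift values) per unit
    353.0: [(np.array([0, 200, 400, 600]), np.array([0.0, 0.20, 0.43, 0.61])),
            (np.array([0, 200, 400, 600]), np.array([0.0, 0.25, 0.48, 0.70]))],
    373.0: [(np.array([0, 100, 200, 300]), np.array([0.0, 0.30, 0.58, 0.92])),
            (np.array([0, 100, 200, 300]), np.array([0.0, 0.35, 0.66, 1.01]))],
}

def pseudo_life(t, y):
    """Fit a linear degradation path and extrapolate to the failure threshold."""
    slope, intercept, *_ = stats.linregress(t, y)
    return (THRESHOLD - intercept) / slope

# Mean log pseudo-life at each stress level (lognormal life assumed).
temps, mean_log_life = [], []
for T, units in tests.items():
    lives = [pseudo_life(t, y) for t, y in units]
    temps.append(T)
    mean_log_life.append(np.mean(np.log(lives)))

# Arrhenius model: log(life) = a + b / T; extrapolate to normal storage at 293 K.
b, a, *_ = stats.linregress(1.0 / np.array(temps), mean_log_life)
print("estimated mean life at 293 K:", np.exp(a + b / 293.0), "hours")
```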

2028 Genetic Mining: Using Genetic Algorithm for Topic based on Concept Distribution

Authors: S. M. Khalessizadeh, R. Zaefarian, S.H. Nasseri, E. Ardil

Abstract:

Today, genetic algorithms are used to solve a wide range of optimization problems. Several studies have applied genetic algorithms to text classification, summarization and information retrieval in the text mining process, and report better performance due to the nature of the genetic algorithm. In this paper, a new algorithm that uses a genetic algorithm for concept weighting and topic identification, based on the standard deviation of concepts, is explored.

Keywords: Genetic Algorithm, Text Mining, Term Weighting, Concept Extraction, Concept Distribution.

2027 Analysis of Different Combining Schemes of Two Amplify-Forward Relay Branches with Individual Links Experiencing Nakagami Fading

Authors: Babu Sena Paul, Ratnajit Bhattacharjee

Abstract:

Relay-based communication has gained considerable importance in recent years. In this paper, we find the end-to-end statistics of a two-hop non-regenerative relay branch, with each hop undergoing Nakagami-m fading. Closed-form expressions for the probability density functions of the signal envelope at the output of a selection combiner and a maximal ratio combiner at the destination node are also derived, and the analytical formulations are verified through computer simulation. These density functions are useful in evaluating the system performance in terms of bit error rate and outage probability.

Keywords: co-operative diversity, diversity combining, maximal ratio combining, selection combining.
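A Monte Carlo sketch of the setting described above: two amplify-and-forward branches whose hops are Nakagami-m faded, combined at the destination by selection combining and maximal ratio combining. It uses the common γ1γ2/(γ1+γ2+1) end-to-end SNR approximation and invented parameters, and only reproduces the scenario numerically rather than the paper's closed-form densities.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 200_000          # Monte Carlo samples
m, omega = 2.0, 1.0  # Nakagami-m shape and spread for every hop
snr0 = 10.0          # average per-hop SNR (linear)

def hop_snr(size):
    # The squared Nakagami-m envelope (power gain) is Gamma(m, omega/m).
    return snr0 * rng.gamma(shape=m, scale=omega / m, size=size)

def af_branch_snr(size):
    # Standard tight approximation for a two-hop amplify-and-forward branch.
    g1, g2 = hop_snr(size), hop_snr(size)
    return g1 * g2 / (g1 + g2 + 1.0)

branch1, branch2 = af_branch_snr(N), af_branch_snr(N)
sc = np.maximum(branch1, branch2)    # selection combining
mrc = branch1 + branch2              # maximal ratio combining

threshold = 3.0                      # outage threshold (linear SNR)
print("SC  outage:", np.mean(sc < threshold))
print("MRC outage:", np.mean(mrc < threshold))
```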

2026 A Visual Control Flow Language and Its Termination Properties

Authors: László Lengyel, Tihamér Levendovszky, Hassan Charaf

Abstract:

This paper presents the visual control flow support of Visual Modeling and Transformation System (VMTS), which facilitates composing complex model transformations out of simple transformation steps and executing them. The VMTS Visual Control Flow Language (VCFL) uses stereotyped activity diagrams to specify control flow structures and OCL constraints to choose between different control flow branches. This work discusses the termination properties of VCFL and provides an algorithm to support the termination analysis of VCFL transformations.

Keywords: Control Flow, Metamodel-Based Visual Model Transformation, OCL, Termination Properties, UML.

2025 Implementation of ADETRAN Language Using Message Passing Interface

Authors: Akiyoshi Wakatani

Abstract:

This paper describes the Message Passing Interface (MPI) implementation of the ADETRAN language and its evaluation on SX-ACE supercomputers. The ADETRAN language includes the pdo statement, which specifies data distribution and parallel computations, and the pass statement, which specifies the redistribution of arrays. Two methods for implementing the pass statement are discussed, and a performance evaluation using the Splitting-Up CG method is presented. The effectiveness of the parallelization is evaluated, and the advantage of one-dimensional distribution is confirmed empirically by the experimental results.

Keywords: Iterative methods, array redistribution, translator, distributed memory.

2024 Lean Environmental Management Integration System (LEMIS) Framework Development

Authors: Puvanasvaran, A. P., Suresh V., N. Norazlin

Abstract:

The Lean Environmental Management Integration System (LEMIS) framework integrates the core elements of lean with ISO 14001. Curiosity about the relationship between continuous improvement and the sustainability of lean implementation motivated this study of LEMIS. Characteristics of the ISO 14001 standard clauses and the core elements of lean principles are drawn from past studies and literature reviews. A survey was carried out among ISO 14001 certified companies to examine continual improvement achieved by implementing the ISO 14001 standard. The study found a significant and positive relationship between the lean principles (value, value stream, flow, pull and perfection) and the ISO 14001 requirements. LEMIS is therefore significant in supporting continuous improvement and sustainability. The integrated system can be implemented in any manufacturing company. It raises awareness of why organizations need to sustain their environmental management systems, while lean principles can be adapted to streamline the daily activities of the company. The study also shows that there is no sacrifice or trade-off between lean principles and ISO 14001 requirements. The framework developed in this study can be further simplified in the future, especially the method of crossing each sub-requirement of the ISO 14001 standard with the core elements of lean principles.

Keywords: LEMIS, ISO 14001, integration, framework.

2023 On Optimum Stratification

Authors: M. G. M. Khan, V. D. Prasad, D. K. Rao

Abstract:

In this manuscript, we discuss the problem of determining the optimum stratification of a study (or main) variable based on an auxiliary variable that follows a uniform distribution. If the stratification of the survey variable is made using the auxiliary variable, it may lead to substantial gains in the precision of the estimates. The problem is formulated as a Nonlinear Programming Problem (NLPP), which turns out to be a multistage decision problem and is solved using a dynamic programming technique.

Keywords: Auxiliary variable, Dynamic programming technique, Nonlinear programming problem, Optimum stratification, Uniform distribution.
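As an illustration of how such a multistage decision problem can be handled by dynamic programming, the sketch below restricts the stratum boundaries of a uniform auxiliary variable to a fine grid and minimizes the sum of W_h S_h over boundary placements; the grid size, range and number of strata are arbitrary choices, not the paper's formulation.

```python
import numpy as np

def optimum_strata(a, b, L, grid_points=200):
    """Split the uniform auxiliary variable on [a, b] into L strata so that
    sum_h W_h * S_h is minimal, using dynamic programming on a boundary grid."""
    grid = np.linspace(a, b, grid_points + 1)

    def cost(lo, hi):
        w = (hi - lo) / (b - a)            # stratum weight W_h
        s = (hi - lo) / np.sqrt(12.0)      # std. dev. of a uniform segment
        return w * s

    INF = float("inf")
    # best[k][j]: minimal cost of covering [a, grid[j]] with k strata
    best = [[INF] * (grid_points + 1) for _ in range(L + 1)]
    back = [[0] * (grid_points + 1) for _ in range(L + 1)]
    best[0][0] = 0.0
    for k in range(1, L + 1):
        for j in range(k, grid_points + 1):
            for i in range(k - 1, j):
                c = best[k - 1][i] + cost(grid[i], grid[j])
                if c < best[k][j]:
                    best[k][j], back[k][j] = c, i
    # Recover the boundaries by backtracking.
    bounds, j = [b], grid_points
    for k in range(L, 0, -1):
        j = back[k][j]
        bounds.append(grid[j])
    return sorted(bounds)

print(optimum_strata(0.0, 1.0, L=4))   # equal-width strata are optimal here
```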

2022 A Study of the Alumina Distribution in the Lab-Scale Cell during Aluminum Electrolysis

Authors: Olga Tkacheva, Pavel Arkhipov, Alexey Rudenko, Yurii Zaikov

Abstract:

The aluminum electrolysis process in a conventional cryolite-alumina electrolyte with a cryolite ratio of 2.7 was carried out at an initial temperature of 970 °C and an anode current density of 0.5 A/cm2 in a 15 A lab-scale cell in order to study the formation of the side ledge during electrolysis and the alumina distribution between the electrolyte and the side ledge. The alumina contained 35.97% α-phase and 64.03% γ-phase with particle sizes in the range of 10-120 μm. The cryolite ratio and the alumina concentration were determined in the molten electrolyte during electrolysis and in the frozen bath after electrolysis. The side ledge in the electrolysis cell formed only by the 13th hour of electrolysis. With a slight temperature decrease, a significant increase in the side ledge thickness was observed. The basic components of the side ledge obtained by XRD phase analysis were Na3AlF6, Na5Al3F14, Al2O3, and NaF.5CaF2.AlF3. As in industrial cells, an increased alumina concentration was found in the side ledge formed on the cell walls and at the ledge-electrolyte-aluminum three-phase boundary during aluminum electrolysis in the lab cell (FTP No 05.604.21.0239, IN RFMEFI60419X0239).

Keywords: Alumina, alumina distribution, aluminum electrolyzer, cryolite-alumina electrolyte, side ledge.

2021 Analytical Slope Stability Analysis Based on the Statistical Characterization of Soil Shear Strength

Authors: Bernardo C. P. Albuquerque, Darym J. F. Campos

Abstract:

Increasing our ability to solve complex engineering problems is directly related to the processing capacity of computers, which allow numerical algorithms to be run quickly and accurately. Besides the increasing interest in numerical simulations, probabilistic approaches are also of great importance, and statistical tools have shown their relevance to the modelling of practical engineering problems. In general, statistical approaches to such problems assume that the random variables involved follow a normal distribution. This assumption tends to provide incorrect results when skewed data are present, since normal distributions are symmetric about their means. Thus, in order to visualize and quantify this aspect, nine statistical distributions (symmetric and skewed) have been considered to model a hypothetical slope stability problem. The data modeled are the friction angle of a superficial soil in Brasilia, Brazil. Despite its apparent universality, the normal distribution did not qualify as the best fit. In the present effort, data obtained in consolidated-drained triaxial tests and saturated direct shear tests have been modeled and used to analytically derive the probability density function (PDF) of the safety factor of a hypothetical slope based on the Mohr-Coulomb rupture criterion. Based on this analysis, it is possible to explicitly derive the failure probability considering the friction angle as a random variable. Furthermore, it is possible to compare the stability analysis when the friction angle is modelled as a Dagum distribution (the distribution that presented the best fit to the histogram) and as a normal distribution. This comparison leads to relevant differences when analyzed in light of risk management.

Keywords: Statistical slope stability analysis, Skew distributions, Probability of failure, Functions of random variables.
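A Monte Carlo sketch in the spirit of the comparison above: the failure probability of an infinite planar slope with FS = tan φ / tan β is evaluated with the friction angle drawn from a symmetric (normal) model and from a skewed stand-in. SciPy has no distribution named Dagum, so a lognormal is used here purely to illustrate the effect of skewness; all numbers are invented.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
N = 1_000_000
beta_deg = 28.0                      # slope inclination (degrees)
tan_beta = np.tan(np.radians(beta_deg))

# Friction angle phi (degrees): symmetric vs. skewed probabilistic models.
phi_normal = stats.norm(loc=32.0, scale=3.0).rvs(N, random_state=rng)
phi_skewed = stats.lognorm(s=0.09, scale=32.0).rvs(N, random_state=rng)

def failure_probability(phi_deg):
    fs = np.tan(np.radians(phi_deg)) / tan_beta   # factor of safety
    return np.mean(fs < 1.0)

print("P(failure), normal model :", failure_probability(phi_normal))
print("P(failure), skewed model :", failure_probability(phi_skewed))
```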

2020 Entropic Measures of a Probability Sample Space and Exponential Type (α, β) Entropy

Authors: Rajkumar Verma, Bhu Dev Sharma

Abstract:

Entropy is a key measure in studies related to information theory and its many applications. Campbell was the first to recognize that the exponential of Shannon's entropy is just the size of the sample space when the distribution is uniform. The idea here is to study exponentials of Shannon's entropy and of other entropy generalizations that involve a logarithmic function, for a general probability distribution. In this paper, we introduce a measure of a sample space, called the 'entropic measure of a sample space', with respect to the underlying distribution. It is shown in both the discrete and continuous cases that this new measure depends on the parameters of the distribution on the sample space: the same sample space has different 'entropic measures' depending on the distributions defined on it. It is noted that Campbell's idea also applies to Rényi's parametric entropy of a given order. Since parameters play a role in providing suitable choices and extended applications, the paper studies parametric entropic measures of sample spaces as well. Exponential entropies related to Shannon's entropy and to those generalizations that involve logarithmic functions, i.e., that are additive, have been studied for wider understanding and applications. We propose and study exponential entropies corresponding to non-additive entropies of type (α, β), which include the Havrda and Charvát entropy as a special case.

Keywords: Sample space, probability distributions, Shannon's entropy, Rényi's entropy, non-additive entropies.
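Campbell's observation mentioned in the abstract is easy to verify numerically: the exponential of Shannon's entropy equals the number of outcomes for a uniform distribution and shrinks to an 'effective' sample-space size otherwise, and the exponential of a Rényi entropy behaves analogously. The two example distributions below are arbitrary.

```python
import numpy as np

def shannon_entropy(p):
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log(p))          # natural logarithm (nats)

def renyi_entropy(p, alpha):
    p = np.asarray(p, dtype=float)
    return np.log(np.sum(p ** alpha)) / (1.0 - alpha)   # alpha != 1

uniform = np.full(8, 1 / 8)
skewed = np.array([0.5, 0.2, 0.1, 0.1, 0.05, 0.03, 0.01, 0.01])

print(np.exp(shannon_entropy(uniform)))     # 8.0: full sample-space size
print(np.exp(shannon_entropy(skewed)))      # < 8: effective size under skew
print(np.exp(renyi_entropy(skewed, 2.0)))   # Renyi-based effective size
```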

2019 Voltage Stability Investigation of Grid Connected Wind Farm

Authors: Trinh Trong Chuong

Abstract:

At present, it is very common to find renewable energy resources, especially wind power, connected to distribution systems. The impact of this wind power on voltage levels in distribution systems has been addressed in the literature. The majority of these works deal with determining the maximum active and reactive power that can be connected at a system load bus before the voltage at that bus reaches the voltage collapse point, which is done by the traditional PV-curve methods reported in many references. A theoretical expression for the maximum power transfer through a grid, as limited by voltage stability, is formulated using an exact representation of the distribution line with ABCD parameters. The expression is used to plot PV curves at various power factors of a radial system, from which limiting values of reactive power can be obtained. This paper presents a method to study the relationship between the active power and the voltage (PV) at the load bus in order to identify the voltage stability limit. It provides a foundation for building a permitted operating region that complies with the voltage stability limit at the point of common coupling (PCC) of the connected wind farm.

Keywords: Wind generator, Voltage stability, grid connected
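To convey the flavour of the PV-curve construction, the sketch below sweeps the load on a simple two-bus radial feeder at a fixed power factor and solves the receiving-end voltage from the standard biquadratic equation; the line data are invented and the exact ABCD-parameter representation used in the paper is reduced here to a plain series impedance.

```python
import numpy as np

E = 1.0                 # sending-end voltage (p.u.)
R, X = 0.05, 0.15       # series line impedance (p.u.)
pf = 0.95               # lagging power factor of the load
tan_phi = np.tan(np.arccos(pf))

def receiving_voltage(P):
    """Stable-branch receiving-end voltage for load P (p.u.) at fixed pf."""
    Q = P * tan_phi
    b = 2.0 * (P * R + Q * X) - E ** 2
    c = (P ** 2 + Q ** 2) * (R ** 2 + X ** 2)
    disc = b ** 2 - 4.0 * c
    if disc < 0.0:
        return None                      # beyond the voltage-collapse (nose) point
    return np.sqrt((-b + np.sqrt(disc)) / 2.0)

for P in np.arange(0.0, 3.01, 0.25):
    V = receiving_voltage(P)
    label = "collapse" if V is None else f"{V:.3f}"
    print(f"P = {P:4.2f} p.u. -> V = {label}")
```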

2018 Effect of Fine-Ground Ceramic Admixture on Early Age Properties of Cement Paste

Authors: Z. Pavlík, M. Pavlíková, P. Volfová, M. Keppert, R. Černý

Abstract:

Properties of cement pastes with fine-ground ceramics used as an alternative binder replacing Portland cement by up to 20% of its mass are investigated. First, the particle size distributions of the cement and the fine-ground ceramics are measured using a laser analyser. Then, the material properties are studied during the early hardening period up to 28 days. The hydration process of the studied materials is monitored by electrical conductivity measurement using TDR sensors. The changes in the materials' structures during hardening are observed using pore size distribution measurements. Compressive strength measurements are performed as well. The experimental results show that replacing Portland cement with fine-ground ceramics by up to 20% of mass is an acceptable solution from the mechanical point of view. One can also assume that the physical properties of the designed materials are similar to those of the reference material with only Portland cement as a binder.

Keywords: Fine-ground ceramics, cement pastes, early age properties, mechanical properties, pore size distribution, electrical conductivity measurement.

2017 The Application of Queuing Theory in Multi-Stage Production Lines

Authors: Hani Shafeek, Muhammed Marsudi

Abstract:

The purpose of this work is to examine a multi-product, multi-stage battery production line and to improve the performance of the assembly line by determining the efficiency of each workstation. Data were collected from every workstation, comprising the throughput rate, the number of operators, and the number of parts that arrive and leave during processing. At least ten samples of the arrival and departure counts are collected so that the data can be analyzed with the chi-squared goodness-of-fit test and queuing theory. The measures of this model serve as a comparison with the standard data available in the company, and the task time values are validated by comparing them with the task times in the company database. Several performance factors for the multi-product, multi-stage battery production line are presented, along with the efficiency of each workstation. The total time to produce each part can be determined by adding the total task times across workstations. Based on the analysis, improvements should be made to reduce the queuing time and increase efficiency; one possible action is to increase the number of operators who manually operate the workstation.

Keywords: Production line, manufacturing, performance measurement, queuing theory.
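As a pointer to the kind of queuing calculation involved, this sketch evaluates a single workstation as an M/M/c queue (Poisson arrivals, exponential service, c parallel operators) with the Erlang C formula; the arrival and service rates are hypothetical, and a real study would first validate the distributional assumptions with the chi-squared test mentioned above.

```python
from math import factorial

def mmc_metrics(lam, mu, c):
    """Utilization, expected queue length Lq and waiting time Wq for M/M/c."""
    rho = lam / (c * mu)                 # per-server utilization, must be < 1
    a = lam / mu                         # offered load in Erlangs
    p0 = 1.0 / (sum(a ** k / factorial(k) for k in range(c))
                + a ** c / (factorial(c) * (1.0 - rho)))
    erlang_c = (a ** c / (factorial(c) * (1.0 - rho))) * p0   # P(wait > 0)
    lq = erlang_c * rho / (1.0 - rho)    # mean number of parts waiting in queue
    wq = lq / lam                        # mean waiting time (Little's law)
    return rho, lq, wq

# Hypothetical workstation: 55 parts/hour arriving, each operator does 20/hour.
rho, lq, wq = mmc_metrics(lam=55.0, mu=20.0, c=3)
print(f"utilization={rho:.2f}, Lq={lq:.2f} parts, Wq={wq * 60:.1f} minutes")
```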

2016 Optimal Placement and Sizing of Energy Storage System in Distribution Network with Photovoltaic Based Distributed Generation Using Improved Firefly Algorithms

Authors: Ling Ai Wong, Hussain Shareef, Azah Mohamed, Ahmad Asrul Ibrahim

Abstract:

The installation of photovoltaic based distributed generation (PVDG) in an active distribution system can lead to voltage fluctuation due to the intermittent and unpredictable PVDG output power. This paper presents a method for mitigating the voltage rise by optimally locating and sizing a battery energy storage system (BESS) in a PVDG-integrated distribution network. An improved firefly algorithm is used to perform the optimal placement and sizing. Three objective functions are presented, considering the voltage deviation and the BESS off-time, with the state of charge as the constraint. The performance of the proposed method is compared with other optimization methods, namely the original firefly algorithm and the gravitational search algorithm. Simulation results show that the proposed optimum BESS location and size improve the voltage stability.

Keywords: BESS, PVDG, firefly algorithm, voltage fluctuation.
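For readers unfamiliar with the optimizer named above, this is a bare-bones firefly algorithm (without the paper's improvements) minimizing a stand-in objective; in the actual application the decision vector would encode BESS location and size and the objective would come from a power-flow simulation, which is beyond this sketch.

```python
import numpy as np

rng = np.random.default_rng(42)

def objective(x):
    # Stand-in for the voltage-deviation objective (simple sphere function).
    return np.sum(x ** 2)

def firefly(n_fireflies=25, dim=2, bounds=(-5.0, 5.0), iters=100,
            beta0=1.0, gamma=1.0, alpha=0.2):
    lo, hi = bounds
    x = rng.uniform(lo, hi, size=(n_fireflies, dim))
    f = np.array([objective(xi) for xi in x])
    for _ in range(iters):
        for i in range(n_fireflies):
            for j in range(n_fireflies):
                if f[j] < f[i]:                  # j is brighter: move i toward j
                    r2 = np.sum((x[i] - x[j]) ** 2)
                    beta = beta0 * np.exp(-gamma * r2)
                    x[i] += beta * (x[j] - x[i]) + alpha * (rng.random(dim) - 0.5)
                    x[i] = np.clip(x[i], lo, hi)
                    f[i] = objective(x[i])
    best = np.argmin(f)
    return x[best], f[best]

print(firefly())
```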

2015 Analysis of Temperature Change under Global Warming Impact using Empirical Mode Decomposition

Authors: Md. Khademul Islam Molla, Akimasa Sumi, M. Sayedur Rahman

Abstract:

The empirical mode decomposition (EMD) represents any time series as a finite set of basis functions. The bases are termed intrinsic mode functions (IMFs); they are mutually orthogonal and contain a minimum amount of cross-information. The EMD successively extracts the IMFs with the highest local frequencies in a recursive way, which effectively yields a set of low-pass filters based entirely on the properties exhibited by the data. In this paper, EMD is applied to explore the properties of multi-year air temperature and to observe its effects on climate change under global warming. The method decomposes the original time series into intrinsic time scales and is capable of analyzing nonlinear, non-stationary climatic time series that cause problems for many linear statistical methods and their users. The analysis results show that the EMD modes present seasonal variability, that most of the IMFs have a normal distribution, and that the energy density distribution of the IMFs satisfies a chi-square distribution. The IMFs are effective in isolating physical processes at various time scales and are also statistically significant. The results also show that the EMD method does a good job of revealing many characteristics of inter-annual climate. They suggest that climate fluctuations of every single element, such as temperature, are the result of variations in the global atmospheric circulation.

Keywords: Empirical mode decomposition, instantaneous frequency, Hilbert spectrum, Chi-square distribution, anthropogenic impact.
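A small sketch of the decompose-then-test workflow on a synthetic series, assuming the third-party PyEMD package (published on PyPI as EMD-signal) for the decomposition and SciPy's normality test for the IMFs; the data and the package choice are illustrative assumptions, not the paper's.

```python
import numpy as np
from scipy import stats
from PyEMD import EMD   # third-party package, published on PyPI as "EMD-signal"

# Synthetic monthly "temperature" series: trend + seasonal cycle + noise.
rng = np.random.default_rng(0)
t = np.arange(480)                       # 40 years of monthly values
series = 0.002 * t + 2.0 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 0.5, t.size)

imfs = EMD().emd(series)                 # intrinsic mode functions + residue
for k, imf in enumerate(imfs):
    zero_crossings = np.count_nonzero(np.diff(np.sign(imf)))
    mean_period = 2.0 * imf.size / max(zero_crossings, 1)
    stat, p = stats.normaltest(imf)      # D'Agostino-Pearson normality test
    print(f"IMF {k + 1}: mean period ~ {mean_period:.1f} months, normality p = {p:.3f}")
```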

2014 Distances over Incomplete Diabetes and Breast Cancer Data Based on Bhattacharyya Distance

Authors: Loai AbdAllah, Mahmoud Kaiyal

Abstract:

Missing values in real-world datasets are a common problem. Many algorithms have been developed to deal with this problem, most of which replace the missing values with a fixed value computed from the observed values. In our work, we use a distance function based on the Bhattacharyya distance, which measures the similarity of two probability distributions, to measure the distance between objects with missing values. The proposed distance distinguishes between known and unknown values: the distance between two known values is the Mahalanobis distance, while, when one of them is missing, the distance is computed from the distribution of the known values for the coordinate that contains the missing value. This method was integrated with Wikaya, a digital health company developing a platform that helps to improve prevention of chronic diseases such as diabetes and cancer. In order for Wikaya's recommendation system to work, distances between users need to be measured, and since there are missing values in the collected data, a distance function for incomplete user profiles has to be developed. To evaluate the accuracy of the proposed distance function in reflecting the actual similarity between objects when some of them contain missing values, we integrated it within the framework of the k-nearest neighbors (kNN) classifier, since its computation is based only on the similarity between objects. To validate this, we ran the algorithm over the diabetes and breast cancer datasets, standard benchmark datasets from the UCI repository. Our experiments show that the kNN classifier using the proposed distance function outperforms kNN using other existing methods.

Keywords: Missing values, distance metric, Bhattacharyya distance.
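The idea above can be made concrete with a simplified, diagonal-covariance version: when both coordinates are observed the term is a variance-scaled squared difference, and when one is missing the term is replaced by its expectation under the empirical distribution of that coordinate. This is a loose sketch of the principle, not the authors' exact Bhattacharyya-based formulation, and the tiny kNN loop is only for demonstration.

```python
import numpy as np

def fit_column_stats(X):
    """Per-coordinate mean and variance of the observed (non-NaN) values."""
    mean = np.nanmean(X, axis=0)
    var = np.nanvar(X, axis=0) + 1e-9
    return mean, var

def distance(u, v, mean, var):
    """Squared distance that handles missing (NaN) coordinates."""
    d = 0.0
    for j in range(u.size):
        if not np.isnan(u[j]) and not np.isnan(v[j]):
            d += (u[j] - v[j]) ** 2 / var[j]          # scaled (Mahalanobis-like) term
        elif np.isnan(u[j]) and np.isnan(v[j]):
            d += 2.0                                   # both unknown: expected scaled gap
        else:
            known = v[j] if np.isnan(u[j]) else u[j]
            # E[(X - known)^2] under the empirical distribution of coordinate j
            d += (var[j] + (mean[j] - known) ** 2) / var[j]
    return d

def knn_predict(X_train, y_train, x, k=5):
    mean, var = fit_column_stats(X_train)
    dists = np.array([distance(x, row, mean, var) for row in X_train])
    nearest = y_train[np.argsort(dists)[:k]]
    return np.bincount(nearest).argmax()

# Toy usage with missing values encoded as NaN.
X = np.array([[1.0, 2.0], [1.2, np.nan], [5.0, 6.1], [np.nan, 5.9]])
y = np.array([0, 0, 1, 1])
print(knn_predict(X, y, np.array([np.nan, 6.0]), k=3))
```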

2013 Variational EM Inference Algorithm for Gaussian Process Classification Model with Multiclass and Its Application to Human Action Classification

Authors: Wanhyun Cho, Soonja Kang, Sangkyoon Kim, Soonyoung Park

Abstract:

In this paper, we propose a variational EM inference algorithm for the multi-class Gaussian process classification model that can be used in the field of human behavior recognition. The algorithm can simultaneously derive both the posterior distribution of a latent function and estimators of the hyper-parameters in a multi-class Gaussian process classification model. It is based on the Laplace approximation (LA) technique and the variational EM framework and is performed in two steps, called the expectation and maximization steps. First, in the expectation step, using Bayes' formula and the LA technique, we derive an approximate posterior distribution of the latent function indicating the probability that each observation belongs to a certain class in the Gaussian process classification model. Second, in the maximization step, using the derived posterior distribution of the latent function, we compute the maximum likelihood estimator for the hyper-parameters of the covariance matrix needed to define the prior distribution of the latent function. These two steps are repeated iteratively until a convergence condition is satisfied. Moreover, we apply the proposed algorithm to a human action classification problem using a public database, namely the KTH human action data set. Experimental results reveal that the proposed algorithm shows good performance on this data set.

Keywords: Bayesian rule, Gaussian process classification model with multiclass, Gaussian process prior, human action classification, laplace approximation, variational EM algorithm.

2012 Finite Element Simulation of Multi-Stage Deep Drawing Processes and Comparison with Experimental Results

Authors: A. Pourkamali Anaraki, M. Shahabizadeh, B. Babaee

Abstract:

The plastic forming of sheet metal plays an important role in metal forming. The traditional tool design techniques for sheet forming operations used in industry are experimental and expensive. Predicting the forming results and determining the punch force, blank holder forces and thickness distribution of the sheet metal will decrease the production cost and time. In this paper, a multi-stage deep drawing simulation of an industrial part is presented using the finite element method. All production steps, together with additional operations such as intermediate annealing and springback, were simulated with ABAQUS software under axisymmetric conditions. Simulation results such as the sheet thickness distribution, punch force and residual stresses were extracted at every stage, and the sheet thickness distribution was compared with experimental results. The comparison shows that the FE model is in close agreement with the experiment.

Keywords: Deep drawing, Finite element method, Simulation.

2011 Reverse Logistics in Clothing Recycling: A Case Study in Chengdu

Authors: Guo Yan

Abstract:

Clothing recycling bins are a traditional way to collect textile waste in many areas. In the clothing recycling business, transportation normally accounts for over 50% of total costs. This case shows a good way to reduce transportation cost through a reverse logistics system. In this system, offline strategic alliance partners, such as transport firms, convenience stores, laundries, and post offices, are integrated through a mobile app. The offline strategic alliance partners collect textile waste and transport it in their otherwise vacant vehicles on return journeys from convenience stores, laundries and post offices to sorting centers. The results of the case study provide the strategic alliance with a valuable, light-asset business model that uses the logistics of offline memberships, so that the company in this case can focus only on textile waste sorting, reuse, recycling, etc. The research method is a case study of a clothing recycling company in Chengdu based on field research and interviews; the analysis builds on the theory of reverse logistics systems.

Keywords: Closed-loop recycles system, clothing recycling, end-of-life clothing, sharing economy, strategic alliance, reverse logistics.

2010 Stochastic Estimation of Wireless Traffic Parameters

Authors: Somenath Mukherjee, Raj Kumar Samanta, Gautam Sanyal

Abstract:

Different services based on different switching techniques in wireless networks lead to drastic changes in the properties of network traffic. Because of this diversity of services, network traffic is expected to undergo qualitative and quantitative variations; hence, assumptions about traffic characteristics and the prediction of network events become more complex for wireless networks. In this paper, the traffic characteristics have been studied by collecting traces from the mobile switching centre (MSC). The traces include initiation and termination times, originating node, home station ID and foreign station ID. Traffic parameters, namely call inter-arrival and holding times, were estimated statistically. The results show that call inter-arrival times in this wireless network are heavy-tailed and follow gamma distributions, and that they are asymptotically long-range dependent. It is also found that the call holding times are best fitted with a lognormal distribution. Based on these observations, an analytical model for performance estimation is also proposed.

Keywords: Wireless networks, traffic analysis, long-range dependence, heavy-tailed distribution.
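An illustrative version of the fitting step described above, using synthetic stand-ins for the MSC traces: a gamma distribution is fitted to inter-arrival times and a lognormal to holding times, each checked with a Kolmogorov-Smirnov test.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)

# Synthetic stand-ins for traces collected at a mobile switching centre.
inter_arrival = rng.gamma(shape=0.8, scale=12.0, size=5000)   # seconds
holding_time = rng.lognormal(mean=4.0, sigma=0.9, size=5000)  # seconds

# Fit a gamma distribution to inter-arrival times (location fixed at 0).
a, loc, scale = stats.gamma.fit(inter_arrival, floc=0)
ks_g = stats.kstest(inter_arrival, "gamma", args=(a, loc, scale))
print("gamma fit:", a, scale, "KS p =", ks_g.pvalue)

# Fit a lognormal distribution to call holding times.
s, loc2, scale2 = stats.lognorm.fit(holding_time, floc=0)
ks_l = stats.kstest(holding_time, "lognorm", args=(s, loc2, scale2))
print("lognormal fit:", s, scale2, "KS p =", ks_l.pvalue)
```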

2009 Probability Distribution of Rainfall Depth at Hourly Time-Scale

Authors: S. Dan'azumi, S. Shamsudin, A. A. Rahman

Abstract:

Rainfall data at fine resolution and knowledge of their characteristics play a major role in the efficient design and operation of agricultural, telecommunication, runoff and erosion control, and water quality control systems. This paper studies the statistical distribution of hourly rainfall depth at 12 representative stations spread across Peninsular Malaysia. Hourly rainfall data covering periods of 10 to 22 years were collected and their statistical characteristics were estimated. Three probability distributions, namely the Generalized Pareto, Exponential and Gamma distributions, were proposed to model the hourly rainfall depth, and three goodness-of-fit tests, namely the Kolmogorov-Smirnov, Anderson-Darling and Chi-squared tests, were used to evaluate their fit. The results indicate that the east coast of the Peninsula receives a higher depth of rainfall than the west coast; however, the rainfall frequency is found to be irregular. The goodness-of-fit tests also show that all three models fit the rainfall data at the 1% level of significance. However, the Generalized Pareto distribution fits better than the Exponential and Gamma distributions and is therefore recommended as the best fit.

Keywords: Goodness-of-fit test, Hourly rainfall, Malaysia, Probability distribution.
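A compact sketch of the distribution-comparison step, fitting the three candidate distributions to synthetic non-zero hourly depths and ranking them with a Kolmogorov-Smirnov test; the real study also used Anderson-Darling and chi-squared tests and, of course, gauge data rather than simulated values.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
# Synthetic non-zero hourly rainfall depths (mm); real data would come from gauges.
depth = rng.gamma(shape=0.6, scale=5.0, size=3000)

candidates = {
    "Generalized Pareto": stats.genpareto,
    "Exponential": stats.expon,
    "Gamma": stats.gamma,
}
for name, dist in candidates.items():
    params = dist.fit(depth, floc=0)                     # fix location at zero
    ks = stats.kstest(depth, dist.name, args=params)
    print(f"{name:18s} KS statistic = {ks.statistic:.4f}, p = {ks.pvalue:.3f}")
```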

2008 Further Investigation of Elastic Scattering of 16O on 12C at Different Energies

Authors: Sh. Hamada, N. Burtebayev, N. Amangeldi, A. Amar

Abstract:

The aim of this work is to study the elastic transfer phenomenon that takes place in the elastic scattering of 16O on 12C at energies near the Coulomb barrier, where the angular distribution decreases steadily with increasing scattering angle and the cross section then increases at backward angles due to the α-transfer process. This reaction was also studied at different energies in order to track the nuclear rainbow phenomenon. The experimental angular distribution data at these energies were compared with the calculated predictions. Optical potential codes such as SPIVAL and the Distorted Wave Born Approximation code DWUCK5 were used in the analysis.

Keywords: Transfer reaction, DWBA, Elastic Scattering, Optical Potential Codes.

2007 Distributional Effects of Tax and Benefit Reforms in the Czech Republic

Authors: L. Vítek

Abstract:

The Czech Republic has carried out two waves of tax and benefit reforms over the past decade. The first took place in 2005–2006 under the left-wing government, and the second was carried out in 2008 by the right-wing government. Using EU-SILC data for selected types of households, the paper assesses changes in the distribution of gross incomes and the effects of the changes in taxes and benefits on the distribution of incomes after taxes and the provision of social benefits. The analysis covers four types of households, with and without children, and is performed using Lorenz curves and Gini coefficients. The results show that the tax system changes the distribution of incomes less significantly than benefits. The 2006 reform reduced the differential between the Gini coefficient for gross income and the Gini coefficient after taxes and benefits for households with active parents and one child. The 2008 reform supported families with children and reduced the differential between gross income and income after taxes and benefits for different types of families.

Keywords: Czech Republic, redistribution, tax reforms.
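For reference, the Gini coefficient used in the analysis can be computed directly from a vector of household incomes; the sketch below applies the standard sorted-index formula to invented incomes before and after a purely hypothetical flat tax plus uniform benefit.

```python
import numpy as np

def gini(incomes):
    """Gini coefficient via the sorted-index (mean absolute difference) formula."""
    x = np.sort(np.asarray(incomes, dtype=float))
    n = x.size
    # G = sum_i (2i - n - 1) x_i / (n * sum x), with i = 1..n on sorted data
    i = np.arange(1, n + 1)
    return np.sum((2 * i - n - 1) * x) / (n * np.sum(x))

gross = np.array([12_000, 18_500, 24_000, 31_000, 45_000, 70_000, 120_000])
net = gross - 0.22 * gross + 2_000   # flat tax plus a uniform benefit (illustrative)

print("Gini, gross incomes:", round(gini(gross), 3))
print("Gini, after taxes and benefits:", round(gini(net), 3))
```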

2006 Direct Measurements of Wind Data over 100 Meters above the Ground in the Site of Lendinara, Italy

Authors: A. Dal Monte, M. Raciti Castelli, G. B. Bellato, L. Stevanato, E. Benini

Abstract:

The wind resource at the Italian site of Lendinara (RO) is analyzed through a systematic anemometric campaign performed on top of the bell tower, at an altitude of over 100 m above the ground. Both the average wind speed and the Weibull distribution are computed. The resulting average wind velocity is in accordance with the numerical predictions of the Italian Wind Atlas, confirming the accuracy of the extrapolation of wind data adopted for evaluating the wind potential at altitudes higher than those of commonly placed measurement stations.

Keywords: Anemometric campaign, wind resource, Weibull distribution, wind atlas
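A sketch of the Weibull fitting step on synthetic wind-speed samples (the campaign data themselves are not reproduced here): a two-parameter Weibull is fitted with the location fixed at zero, as is usual in wind resource work, and the mean speed is recovered from the fitted shape and scale.

```python
import numpy as np
from scipy import stats
from scipy.special import gamma as gamma_fn

rng = np.random.default_rng(11)
# Synthetic 10-minute mean wind speeds (m/s), standing in for the measured data.
wind = stats.weibull_min(c=2.1, scale=6.5).rvs(5000, random_state=rng)

# Two-parameter Weibull fit (location fixed at zero).
k, loc, A = stats.weibull_min.fit(wind, floc=0)
mean_speed = A * gamma_fn(1 + 1 / k)   # mean of a Weibull(k, A) distribution
print(f"shape k = {k:.2f}, scale A = {A:.2f} m/s, mean speed = {mean_speed:.2f} m/s")
```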

2005 Measuring Relative Efficiency of Korean Construction Company using DEA/Window

Authors: Jung-Lo Park, Sung-Sik Kim, Sun-Young Choi, Ju-Hyung Kim, Jae-Jun Kim

Abstract:

The sub-prime mortgage crisis that began in the US is regarded as the worst economic crisis since the Great Depression of the early 20th century. In particular, hidden problems in the efficient operation of businesses were disclosed all at once, and many financial institutions went bankrupt or filed for court receivership. The collapse of the physical market led to the bankruptcy of manufacturing and construction businesses. This study analyzes the dynamic efficiency of construction businesses during the five years around the global financial crisis. By discovering the trend and stability of construction businesses' efficiency, the study aims to improve management efficiency in the ever-changing construction market. Variables were selected by analyzing corporate information on the top 20 construction businesses in Korea, and static efficiency in 2008 and dynamic efficiency between 2006 and 2010 were analyzed. Unlike other studies, this study deduces the efficiency trend and stability of construction businesses over five years by using the DEA/Window model. Using the analysis results, efficient and inefficient companies could be identified. In addition, relative efficiency among DMUs was measured by comparing the relationship between the input and output variables of the construction businesses. This study can serve as a reference for improving the management efficiency of companies with low efficiency, based on the efficiency analysis of construction businesses.

Keywords: Construction Company, DEA, DEA/Window, Efficiency Analysis

2004 Identification of Outliers in Flood Frequency Analysis: Comparison of Original and Multiple Grubbs-Beck Test

Authors: Ayesha S. Rahman, Khaled Haddad, Ataur Rahman

Abstract:

At-site flood frequency analysis is used to estimate flood quantiles when the at-site record length is reasonably long. In Australia, the FLIKE software has been introduced for at-site flood frequency analysis. The advantage of FLIKE is that, for a given application, the user can compare a number of the most commonly adopted probability distributions and parameter estimation methods relatively quickly using a Windows interface. The new version of FLIKE incorporates the multiple Grubbs and Beck test, which can identify multiple potentially influential low flows. This paper presents a case study of six catchments in eastern Australia which compares two outlier identification tests (the original Grubbs and Beck test and the multiple Grubbs and Beck test) and two commonly applied probability distributions (Generalized Extreme Value (GEV) and Log Pearson type 3 (LP3)) using the FLIKE software. It has been found that the multiple Grubbs and Beck test, when used with the LP3 distribution, provides more accurate flood quantile estimates than the LP3 distribution with the original Grubbs and Beck test. Between these two methods, the differences in flood quantile estimates have been found to be up to 61% for the six study catchments. It has also been found that the GEV distribution (with L moments) and the LP3 distribution with the multiple Grubbs and Beck test provide quite similar results in most cases; however, a difference of up to 38% has been noted in the flood quantiles for an annual exceedance probability (AEP) of 1 in 100 for one catchment. This finding needs to be confirmed with a greater number of stations across other Australian states.

Keywords: Floods, FLIKE, probability distributions, flood frequency, outlier.
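For orientation, the original (single-threshold) Grubbs-Beck low-outlier screen can be sketched as below, working on log10 flows; the critical value K_N uses a commonly cited Bulletin 17B approximation and should be checked against the published tables, and the flood series is invented.

```python
import numpy as np

def grubbs_beck_low_outliers(flows):
    """Original (single-threshold) Grubbs-Beck low-outlier screen on log10 flows."""
    q = np.log10(np.asarray(flows, dtype=float))
    n = q.size
    # Approximate 10%-significance critical value; verify against published tables.
    k_n = -0.9043 + 3.345 * np.sqrt(np.log10(n)) - 0.4046 * np.log10(n)
    threshold = 10 ** (q.mean() - k_n * q.std(ddof=1))
    return threshold, [f for f in flows if f < threshold]

# Hypothetical annual maximum flood series (m^3/s).
flows = [512, 470, 630, 820, 45, 390, 560, 710, 30, 495, 605, 580, 760, 455, 540]
threshold, low = grubbs_beck_low_outliers(flows)
print(f"low-outlier threshold = {threshold:.1f} m^3/s; flagged: {low}")
```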
