Search results for: time-cost trade-off
45 Multitasking Incentives and Employee Performance: Evidence from Call Center Field Experiments and Laboratory Experiments
Authors: Sung Ham, Chanho Song, Jiabin Wu
Abstract:
Employees are commonly incentivized on both quantity and quality performance, and much of the extant literature focuses on demonstrating that multitasking incentives lead to tradeoffs. Alternatively, we consider potential solutions to the tradeoff problem from both a theoretical and an experimental perspective. Across two field experiments from a call center, we find that tradeoffs can be mitigated when incentives are jointly enhanced across tasks, whereas previous research has suggested that incentives be reduced instead of enhanced. In addition, we propose and test, in a laboratory setting, the implications of revising the metric used to assess quality. Our results indicate that metrics can be adjusted to align quality and quantity more efficiently. Thus, this alignment has the potential to thwart the classic tradeoff problem. Finally, we validate our findings with an economic experiment that verifies that effort is largely consistent with our theoretical predictions.
Keywords: incentives, multitasking, field experiment, experimental economics
44 Linking Market Performance to Exploration and Exploitation in the Pharmaceutical Industry
Authors: Johann Valentowitsch, Wolfgang Burr
Abstract:
In organizational research, strategies of exploration and exploitation are often considered contradictory. Building on the tradeoff argument, many authors have assumed that a company's market performance should depend positively on its strategic balance between exploration and exploitation over time. In this study, we apply this reasoning to the pharmaceutical industry. Using exploratory regression analysis, we show that the long-term market performance of a pharmaceutical company is linked to both its ability to carry out exploratory projects and its ability to develop exploitative competencies. In particular, our findings demonstrate that, on average, a company's annual sales performance is higher when exploration and exploitation are more evenly balanced. The contribution of our research is twofold. On the one hand, we provide empirical evidence for the initial tradeoff hypothesis and thus support the theoretical position of those who understand exploration and exploitation as strategic substitutes. On the other hand, our findings show that a balanced relationship between exploration and exploitation is also important in research-intensive industries, which naturally tend to place more emphasis on exploration.
Keywords: exploitation, exploration, market performance, pharmaceutical industry, strategy
43 Choosing an Optimal Epsilon for Differentially Private Arrhythmia Analysis
Authors: Arin Ghazarian, Cyril Rakovski
Abstract:
Differential privacy has become the leading technique to protect the privacy of individuals in a database while allowing useful analysis to be done and the results to be shared. It puts a guarantee on the amount of privacy loss in the worst-case scenario. Differential privacy is not a toggle between full privacy and zero privacy. It controls the tradeoff between the accuracy of the results and the privacy loss using a single key parameter called epsilon.
Keywords: arrhythmia, cardiology, differential privacy, ECG, epsilon, medical data, privacy preserving analytics, statistical databases
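For readers unfamiliar with the role of this parameter, the following Python sketch (a standard Laplace-mechanism illustration, not the paper's method) shows how a smaller epsilon buys stronger privacy at the cost of noisier answers to a hypothetical arrhythmia count query:

```python
import numpy as np

def laplace_count(true_count, epsilon, sensitivity=1.0, rng=None):
    """Release a differentially private count via the Laplace mechanism.

    A counting query changes by at most 1 when one record is added or
    removed, so its sensitivity is 1. Noise scale = sensitivity / epsilon.
    """
    rng = rng or np.random.default_rng()
    return true_count + rng.laplace(loc=0.0, scale=sensitivity / epsilon)

rng = np.random.default_rng(0)
true_count = 120  # hypothetical number of arrhythmia cases in the database
for eps in (0.1, 1.0, 10.0):
    noisy = np.array([laplace_count(true_count, eps, rng=rng) for _ in range(1000)])
    print(f"epsilon={eps:5.1f}  mean abs error={np.mean(np.abs(noisy - true_count)):.2f}")
```

Running this shows the average error shrinking as epsilon grows, which is exactly the accuracy/privacy tradeoff the paper tunes.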
42 Quantitative Analysis of Three Sustainability Pillars for Water Tradeoff Projects in Amazon
Authors: Taha Anjamrooz, Sareh Rajabi, Hasan Mahmmud, Ghassan Abulebdeh
Abstract:
Water availability, as well as water demand, is not uniformly distributed in time and space. Numerous extra-large water diversion projects have been launched in Amazon to alleviate water scarcities. This research utilizes statistical analysis to examine the temporal and spatial features of 40 extra-large water diversion projects in Amazon. Using a network analysis method, the correlation between seven major basins is measured, while the impact analysis method is employed to explore the associated economic, environmental, and social impacts. The study finds that the development of water diversion in Amazon has passed through four stages, from an initial period to a phase of rapid development. The length of water diversion channels and the quantity of water transferred have increased significantly in the past five decades. As of 2015, more than 75 billion m³ of water was transferred in Amazon through more than 12,000 km of channels. These projects extend over half of the Amazon Area. River Basin E is currently the most significant source of transferred water. Through inter-basin water diversions, Amazon gains the opportunity to enhance its Gross Domestic Product (GDP) by 5%. Nevertheless, the construction costs exceed 70 billion US dollars, higher than anywhere else. The average cost of transferred water per unit has increased with time and scale but decreases from western to eastern Amazon. Additionally, annual total energy consumption for pumping exceeded 40 billion kilowatt-hours, while the associated greenhouse gas emissions are assessed to be 35 million tons. Notably, ecological problems initiated by water diversion affect River Basin B and River Basin D. Due to water diversion, more than 350 thousand individuals have been relocated away from their homes. In order to enhance water diversion sustainability, four categories of innovative measures are provided for decision-makers: development of water tradeoff project strategies, improvement of integrated water resource management, creation of water-saving incentives and pricing approaches, and application of ex-post assessment.
Keywords: sustainability, water trade-off projects, environment, Amazon
41 3D Receiver Operator Characteristic Histogram
Authors: Xiaoli Zhang, Xiongfei Li, Yuncong Feng
Abstract:
ROC curves, a widely used evaluation tool in the machine learning field, plot the tradeoff between the true positive rate and the false positive rate. However, they are criticized for ignoring some vital information in the evaluation process, such as the amount of information about the target that each instance carries and the predicted score given by each classification model to each instance. Hence, in this paper, a new classification performance evaluation method is proposed by extending Receiver Operator Characteristic (ROC) curves to 3D space, denoted as the 3D ROC Histogram. In the histogram, the …
Keywords: classification, performance evaluation, receiver operating characteristic histogram, hardness prediction
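For reference, the classic 2D ROC baseline that the paper extends can be computed as follows (a standard scikit-learn illustration on synthetic scores, not the authors' data or their 3D method):

```python
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

rng = np.random.default_rng(1)
# Toy classifier scores: positives score higher on average.
y_true = np.concatenate([np.ones(500), np.zeros(500)])
scores = np.concatenate([rng.normal(1.0, 1.0, 500), rng.normal(0.0, 1.0, 500)])

fpr, tpr, thresholds = roc_curve(y_true, scores)
print("AUC:", roc_auc_score(y_true, scores))
# Each (fpr, tpr) pair is one operating point. The 3D histogram proposed
# in the paper adds a third axis (per-instance information such as the
# predicted score), which this classic curve collapses away.
```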
40 Fast Accurate Detection of Frequency Jumps Using Kalman Filter with Non Linear Improvements
Authors: Mahmoud E. Mohamed, Ahmed F. Shalash, Hanan A. Kamal
Abstract:
In communication systems, frequency jumps are a serious problem caused by the oscillators used. Kalman filters are used to detect such jumps, despite the tradeoff between the noise level and the speed of detection. In this paper, an improvement to the Kalman filter is introduced through a nonlinear change in the bandwidth of the filter. Simulation results show a considerable improvement in filter speed with a very low noise level. Additionally, the effect on the response to false alarms is presented, and the false-alarm rate shows improvement.
Keywords: Kalman filter, innovation, false detection, improvement
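A minimal Python sketch of the general idea (our reading; the authors' filter may differ): a scalar random-walk Kalman filter whose bandwidth is widened nonlinearly whenever the normalized innovation is large, so a jump is acquired quickly while the steady-state noise level stays low.

```python
import numpy as np

def adaptive_kalman(z, q0=1e-4, r=1e-2, gamma=3.0, boost=50.0):
    """Scalar Kalman filter with a nonlinear bandwidth boost.

    When the normalized innovation exceeds `gamma` sigma, the process
    noise is inflated by `boost`, widening the filter bandwidth so a
    frequency jump is tracked quickly; otherwise it stays small for a
    low steady-state noise level.
    """
    x, p = z[0], 1.0
    estimates, detections = [], []
    for k, zk in enumerate(z):
        p = p + q0                      # predict (random-walk model)
        innov = zk - x
        s = p + r                       # innovation variance
        if innov**2 > (gamma**2) * s:   # nonlinear bandwidth change
            p = p + boost * q0
            s = p + r
            detections.append(k)
        kgain = p / s
        x = x + kgain * innov           # update
        p = (1.0 - kgain) * p
        estimates.append(x)
    return np.array(estimates), detections

rng = np.random.default_rng(2)
f = np.concatenate([np.zeros(200), 0.5 * np.ones(200)])  # frequency jump at k=200
z = f + rng.normal(0, 0.1, f.size)
est, detections = adaptive_kalman(z)
print("first detection near sample:", detections[:1])
```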
39 A Discussion on Electrically Small Antenna Property
Authors: Riki H. Patel, Arpan Desia, Trushit Upadhayay
Abstract:
The demand for compact antennas has been ever increasing since the inception of wireless communication devices. In the age of wireless communication, the requirement for miniaturized antennas is quite high. Quite often, antenna dimensions are decided based on application requirements rather than practical antenna constraints. The tradeoff of efficiency and other antenna parameters against antenna size is always a debatable issue. This article presents a detailed review of the fundamentals of electrically small antennas and their potential applications. In addition, the constraints and challenges of electrically small antennas are also presented.
Keywords: bandwidth, communication, electrically small antenna, communication engineering
38 Error Analysis of Wavelet-Based Image Steganography Scheme
Authors: Geeta Kasana, Kulbir Singh, Satvinder Singh
Abstract:
In this paper, a steganographic scheme for digital images using the Integer Wavelet Transform (IWT) is proposed. The cover image is decomposed into wavelet subbands using IWT. Each subband is divided into blocks of equal size, and secret data is embedded into the largest and smallest pixel values of each block of the subband. The visual quality of the stego images is acceptable, as the PSNR between the cover image and the stego image is above 40 dB, so imperceptibility is maintained. Experimental results show a better tradeoff between capacity and visual perceptibility compared to existing algorithms. The maximum possible error is evaluated for each of the wavelet subbands of an image.
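As an illustration of the embedding idea (our simplified reading of the scheme, not the authors' exact code), the Python sketch below applies one integer Haar lifting step and writes secret bits into the least significant bits of the largest and smallest coefficients of each block; the paper's full 2-D IWT and block rules may differ.

```python
import numpy as np

def int_haar_rows(a):
    """One lifting step of the integer Haar transform along rows."""
    s = (a[:, 0::2] + a[:, 1::2]) // 2   # approximation (integer average)
    d = a[:, 0::2] - a[:, 1::2]          # detail (integer difference)
    return s, d

def embed_bits_in_blocks(subband, bits, block=4):
    """Embed bits into the LSBs of the largest and smallest coefficients
    of each block of a subband (simplified reading of the scheme)."""
    sb = subband.copy()
    h, w = sb.shape
    it = iter(bits)
    for i in range(0, h - h % block, block):
        for j in range(0, w - w % block, block):
            blk = sb[i:i+block, j:j+block]   # view: edits write through to sb
            for idx in (np.argmax(blk), np.argmin(blk)):
                try:
                    b = next(it)
                except StopIteration:
                    return sb
                r, c = np.unravel_index(idx, blk.shape)
                blk[r, c] = (blk[r, c] & ~1) | b   # set LSB to the secret bit
    return sb

rng = np.random.default_rng(3)
cover = rng.integers(0, 256, (64, 64), dtype=np.int64)
s, d = int_haar_rows(cover)                  # one 1-D decomposition level
stego_detail = embed_bits_in_blocks(d, bits=[1, 0, 1, 1, 0])
```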
37 K-Means Based Matching Algorithm for Multi-Resolution Feature Descriptors
Authors: Shao-Tzu Huang, Chen-Chien Hsu, Wei-Yen Wang
Abstract:
Matching high-dimensional features between images is computationally expensive for exhaustive search approaches in computer vision. Although the dimension of the feature can be reduced by exploiting prior knowledge of the homography, matching accuracy may degrade as a tradeoff. In this paper, we present a feature matching method based on the k-means algorithm that reduces the matching cost and matches the features between images without relying on a simplified geometric assumption. Experimental results show that the proposed method outperforms previous linear exhaustive search approaches in terms of the inlier ratio of matched pairs.
Keywords: feature matching, k-means clustering, SIFT, RANSAC
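A minimal Python sketch of the general idea (ours, not the authors' implementation): quantize the target image's descriptors with k-means, then search each query descriptor only within its nearest cluster instead of exhaustively.

```python
import numpy as np
from sklearn.cluster import KMeans

def kmeans_match(desc_a, desc_b, k=16):
    """Match descriptors from image A to image B, restricting the search
    to B-descriptors in the same k-means cluster (coarse quantization)."""
    km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(desc_b)
    labels_b = km.labels_
    matches = []
    for i, d in enumerate(desc_a):
        c = km.predict(d[None, :])[0]          # cluster of the query
        cand = np.where(labels_b == c)[0]      # candidates in that cluster only
        if cand.size == 0:
            continue
        dists = np.linalg.norm(desc_b[cand] - d, axis=1)
        matches.append((i, cand[np.argmin(dists)]))
    return matches

rng = np.random.default_rng(4)
A = rng.random((200, 128)).astype(np.float32)   # stand-ins for SIFT descriptors
B = rng.random((200, 128)).astype(np.float32)
print(len(kmeans_match(A, B)), "putative matches")
```

The design tradeoff is the one named in the abstract: searching only one cluster cuts the matching cost by roughly a factor of k, at the risk of missing a true match that fell into a neighboring cluster.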
36 DAG Design and Tradeoff for Full Live Virtual Machine Migration over XIA Network
Authors: Dalu Zhang, Xiang Jin, Dejiang Zhou, Jianpeng Wang, Haiying Jiang
Abstract:
The traditional TCP/IP network is showing many shortcomings, and research on future networks is becoming a hotspot. FIA (Future Internet Architecture) and FIA-NP (Next Phase) are supported by the US NSF for future Internet design. Moreover, virtual machine migration is a significant technique in cloud computing. As a network application, it should also be supported in XIA (eXpressive Internet Architecture), which is part of both the FIA and FIA-NP projects. This paper is an experimental study aimed at verifying the feasibility of VM migration over XIA. We present three ways to maintain VM connectivity and communication states concerning DAG design and routing table modification. VM migration experiments are conducted intra-AD and inter-AD with KVM instances. The procedure is achieved by a migration control protocol suited to the characteristics of XIA. Evaluation results show that our solutions can well support full live VM migration over the XIA network, keeping services seamless.
Keywords: DAG, downtime, virtual machine migration, XIA
35 Quantitative, Preservative Methodology for Review of Interview Transcripts Using Natural Language Processing
Authors: Rowan P. Martnishn
Abstract:
During the execution of a National Endowment for the Arts grant, approximately 55 interviews were collected from professionals across various fields. These interviews were used to create deliverables: historical connections for creations that began as art and evolved entirely into computing technology. With dozens of hours’ worth of transcripts to be analyzed by qualitative coders, a quantitative methodology was created to sift through the documents. The initial step was to both clean and format all the data. First, a basic spelling and grammar check was applied, as well as a Python script for normalized formatting which used an open-source grammatical formatter to make the data as coherent as possible. Ten documents were randomly selected for manual review, in which words frequently mistranscribed were recorded and replaced throughout all other documents. Then, to remove all banter and side comments, the transcripts were spliced into paragraphs (separated by change in speaker) and all paragraphs with fewer than 300 characters were removed. Secondly, a keyword extractor, a form of natural language processing in which significant words in a document are selected, was run on each paragraph for all interviews. Every proper noun was put into a data structure corresponding to that respective interview. From there, a Bidirectional and Auto-Regressive Transformer (B.A.R.T.) summary model was applied to each paragraph that included any of the proper nouns selected from the interview. At this stage, the information to review had been reduced from about 60 hours’ worth of data to 20. The data was further processed through light, manual observation: any summaries which proved to fit the criteria of the proposed deliverable were selected, as well as their locations within the documents. This narrowed the data down to about 5 hours’ worth of processing. The qualitative researchers were then able to find 8 more connections in addition to our previous 4, exceeding our minimum quota of 3 to satisfy the grant. Major findings of the study and subsequent curation of this methodology raised a conceptual finding crucial to working with qualitative data of this magnitude. In the use of artificial intelligence, there is a general trade-off in a model between breadth of knowledge and specificity. If the model has too much knowledge, the user risks leaving out important data (too general). If the tool is too specific, it has not seen enough data to be useful. Thus, this methodology proposes a solution to this tradeoff. The data is never altered outside of grammatical and spelling checks. Instead, the important information is marked, creating an indicator of where the significant data is without compromising its purity. Secondly, the data is chunked into smaller paragraphs, giving specificity, and then cross-referenced with the keywords (allowing generalization over the whole document). This way, no data is harmed, and qualitative experts can go over the raw data instead of using highly manipulated results. Given the success in deliverable creation as well as the circumvention of this tradeoff, this methodology should stand as a model for synthesizing qualitative data while maintaining its original form.
Keywords: B.A.R.T. model, keyword extractor, natural language processing, qualitative coding
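A condensed Python sketch of the pipeline as described (the split_paragraphs and proper_nouns helpers and the facebook/bart-large-cnn checkpoint are our illustrative stand-ins; the abstract does not name the exact tools used):

```python
import re
from transformers import pipeline  # assumed available; any BART summarizer works

def split_paragraphs(transcript, min_chars=300):
    """Split on speaker changes (lines like 'NAME:') and drop short banter,
    mirroring the paper's 300-character filter."""
    paras = re.split(r"\n(?=[A-Z][A-Za-z .]*:)", transcript)
    return [p.strip() for p in paras if len(p.strip()) >= min_chars]

def proper_nouns(text):
    """Crude stand-in for the keyword extractor: mid-sentence capitalized words."""
    return set(re.findall(r"(?<![.!?]\s)\b[A-Z][a-z]+\b", text))

summarizer = pipeline("summarization", model="facebook/bart-large-cnn")  # assumed checkpoint

def flag_paragraphs(transcript, nouns_of_interest):
    """Summarize only paragraphs mentioning a noun of interest, marking
    candidate material without altering the underlying transcript."""
    flagged = []
    for p in split_paragraphs(transcript):
        if proper_nouns(p) & nouns_of_interest:
            s = summarizer(p, max_length=60, min_length=15, do_sample=False)
            flagged.append((p, s[0]["summary_text"]))
    return flagged
```

Note how the sketch mirrors the paper's stated design choice: the raw paragraphs are kept alongside the summaries, so coders review original text, not model output alone.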
34 Implementation of a Virtual Testbed for Secure IoT Firmware Update Using Blockchain
Authors: Tarun Chand, Michael Jurczyk
Abstract:
With the increasing need for and popularity of IoT devices, and how integrated they are becoming in our daily lives and industries, these devices make a very lucrative target for malicious actors. And since these devices have such limited resources, the implementation of robust security features comes at a tradeoff with the actual functionality the device was intended for. This makes them an easy target with high returns. Several frameworks for the secure firmware update of these devices have recently been proposed in the literature. They focus on methods such as blockchains and distributed file systems to secure firmware updates but do not go into the details of the actual implementation of these frameworks and the lower-level interactions among the methods used. This work integrates some of these security measures into one overall framework and details the lower-level implementation of this framework in a virtual dockerized testbed running on AWS.
Keywords: blockchain, Ethereum, Geth, IPFS, secure IoT-firmware update, virtual testbed development
33 Construction Time-Cost Trade-Off Analysis Using Fuzzy Set Theory
Authors: V. S. S. Kumar, B. Vikram, G. C. S. Reddy
Abstract:
Time and cost are the two critical objectives of construction project management; they are not independent but intricately related. The trade-off between project duration and cost is extensively discussed during project scheduling because of its practical relevance. Generally, when the project duration is compressed, the project calls for more labor and more productive equipment, which increases the cost. Thus, construction time-cost optimization is defined as a process of identifying suitable construction activities for speeding up to attain the best possible savings in both time and cost. As there is a hidden tradeoff relationship between project time and cost, it might be difficult to predict whether the total cost will increase or decrease as a result of compressing the schedule. Different combinations of duration and cost for the activities associated with the project determine the best set in the time-cost optimization. Therefore, contractors need to select the best combination of time and cost for each activity, all of which will ultimately determine the project duration and cost. In this paper, fuzzy set theory is used to model the uncertainties in the project environment for time-cost trade-off analysis.
Keywords: fuzzy sets, uncertainty, qualitative factors, decision making
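As a worked toy example of how fuzzy sets can enter this analysis (our own illustration, not the paper's model), the Python sketch below represents activity durations as triangular fuzzy numbers, defuzzifies them by centroid, and computes the classic crash-cost slope used to decide which activities to compress first.

```python
def centroid(tfn):
    """Defuzzify a triangular fuzzy number (a, m, b) by its centroid."""
    a, m, b = tfn
    return (a + m + b) / 3.0

# Activity: (normal_time TFN, normal_cost, crash_time TFN, crash_cost)
# All figures are hypothetical.
activities = {
    "excavation": ((8, 10, 13), 4000, (6, 7, 9), 5500),
    "foundation": ((12, 14, 17), 9000, (9, 10, 12), 12000),
}

for name, (nt, nc, ct, cc) in activities.items():
    t_normal, t_crash = centroid(nt), centroid(ct)
    slope = (cc - nc) / (t_normal - t_crash)   # extra cost per day saved
    print(f"{name}: crash cost slope = {slope:.0f} per day")
# Activities with the smallest slope are crashed first when compressing
# the schedule, until the target duration or the crash limits are reached.
```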
32 Bi-Criteria Objective Network Design Model for Multi-Period Multi-Product Green Supply Chain
Authors: Shahul Hamid Khan, S. Santhosh, Abhinav Kumar Sharma
Abstract:
Environmental performance, along with social performance, is becoming a vital factor for industries seeking to achieve global standards. With a good environmental policy, global industries differentiate themselves from their competitors. This paper concentrates on a multi-stage, multi-product, and multi-period manufacturing network. Bi-objective mathematical models for the total cost and total emissions of the entire forward supply chain are considered. Five different problems, obtained by varying the number of suppliers, manufacturers, and environmental levels, are used to illustrate the mathematical model. A genetic algorithm (GA) and random search are used for finding the optimal solution. The input parameters of the optimal solution are used to find the tradeoff between the initial investment by the industry and the long-term benefit to the environment.
Keywords: closed loop supply chain, genetic algorithm, random search, green supply chain
31 Analysis of Joint Source Channel LDPC Coding for Correlated Sources Transmission over Noisy Channels
Authors: Marwa Ben Abdessalem, Amin Zribi, Ammar Bouallègue
Abstract:
In this paper, a joint source-channel (JSC) coding scheme based on LDPC codes is investigated. We consider two concatenated LDPC codes: one to compress a correlated source, and the second to protect it against channel degradations. The original information can be reconstructed at the receiver by a joint decoder, where the source decoder and the channel decoder run in parallel, transferring extrinsic information between them. We investigate the performance of the JSC LDPC code in terms of Bit Error Rate (BER) for transmission over an Additive White Gaussian Noise (AWGN) channel and for different source and channel rate parameters. We emphasize how JSC LDPC presents a performance tradeoff depending on the channel state and on the source correlation. We show that JSC LDPC is an efficient solution for relatively low Signal-to-Noise Ratio (SNR) channels, especially with highly correlated sources. Finally, a source-channel rate optimization has to be applied to guarantee the best JSC LDPC system performance for a given channel.
Keywords: AWGN channel, belief propagation, joint source channel coding, LDPC codes
30 Performance Comparison of ADTree and Naive Bayes Algorithms for Spam Filtering
Authors: Thanh Nguyen, Andrei Doncescu, Pierre Siegel
Abstract:
Classification is an important data mining technique that can be used for data filtering in artificial intelligence. The broad applicability of classification to all kinds of data has led to its use in nearly every field of modern life. Classification helps us group items according to the features judged interesting and useful. In this paper, we compare two classification methods, Naïve Bayes and ADTree, used to detect spam e-mail. This choice is motivated by the fact that the Naive Bayes algorithm is based on probability calculus, while the ADTree algorithm is based on decision trees. The parameter settings of the above classifiers aim at maximizing the true positive rate and minimizing the false positive rate. The experimental results present classification accuracy and cost analysis with a view to choosing the optimal classifier for spam detection. We point out the number of attributes needed to obtain a tradeoff between the attribute count and the classification accuracy.
Keywords: classification, data mining, spam filtering, naive bayes, decision tree
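A minimal sketch of the Naive Bayes side of such a comparison (illustrative toy data, not the paper's corpus or settings), reporting the true/false positive rates the authors use as tuning criteria:

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.metrics import confusion_matrix

emails = ["win a free prize now", "meeting agenda attached",
          "free money click now", "lunch tomorrow?",
          "claim your prize money", "project status report"]
labels = [1, 0, 1, 0, 1, 0]  # 1 = spam

X = CountVectorizer().fit_transform(emails)   # word counts as attributes
clf = MultinomialNB().fit(X, labels)
pred = clf.predict(X)                         # toy: evaluated on training data
tn, fp, fn, tp = confusion_matrix(labels, pred).ravel()
print(f"TPR = {tp/(tp+fn):.2f}, FPR = {fp/(fp+tn):.2f}")
```

Varying how many word features CountVectorizer keeps (its max_features parameter) is one simple way to observe the attribute-count versus accuracy tradeoff the abstract mentions.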
29 Security Over OFDM Fading Channels with Friendly Jammer
Authors: Munnujahan Ara
Abstract:
In this paper, we investigate the effect of friendly jamming power allocation strategies on the achievable average secrecy rate over a bank of parallel fading wiretap channels. We investigate the achievable average secrecy rate in parallel fading wiretap channels subject to Rayleigh and Rician fading. The achievable average secrecy rate, in the presence of a line-of-sight component in the jammer channel, is also evaluated. Moreover, we study the detrimental effect of correlation across the parallel sub-channels and evaluate the corresponding decrease in the achievable average secrecy rate for the various fading configurations. We also investigate the tradeoff between the transmission power and the jamming power for a fixed total power budget. Our results, which are applicable to current orthogonal frequency division multiplexing (OFDM) communications systems, shed further light on the achievable average secrecy rates over a bank of parallel fading channels in the presence of friendly jammers.
Keywords: fading parallel channels, wire-tap channel, OFDM, secrecy capacity, power allocation
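For reference, the achievable average secrecy rate over K parallel sub-channels with friendly jamming takes the following textbook form (our notation; the paper's exact formulation may differ):

```latex
\[
\bar{C}_s \;=\; \mathbb{E}\!\left[\,\sum_{k=1}^{K}
\left[\log_2\!\big(1+\mathrm{SNR}_{M,k}\big)-\log_2\!\big(1+\mathrm{SNR}_{E,k}\big)\right]^{+}\right],
\qquad
\mathrm{SNR}_{E,k}=\frac{P_k\,|h_{E,k}|^{2}}{\sigma^{2} + J_k\,|g_{E,k}|^{2}},
\]
```

where $P_k$ is the transmit power and $J_k$ the friendly-jamming power on sub-channel $k$ (subject to $\sum_k (P_k+J_k) \le P_{\mathrm{tot}}$ in the fixed-budget tradeoff studied here), $h$ and $g$ are the source and jammer channel gains toward the eavesdropper $E$, $M$ denotes the legitimate receiver, and $[x]^{+}=\max(x,0)$.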
28 Photocatalytic Packed-Bed Flow Reactor for Continuous Room-Temperature Hydrogen Release from Liquid Organic Carriers
Authors: Malek Y. S. Ibrahim, Jeffrey A. Bennett, Milad Abolhasani
Abstract:
Despite the potential of hydrogen (H2) storage in liquid organic carriers to help achieve carbon neutrality, the energy required for H2 release and the cost of catalyst recycling have hindered its large-scale adoption. In response, a photo flow reactor packed with a rhodium (Rh)/titania (TiO2) photocatalyst is reported for the continuous and selective acceptorless dehydrogenation of 1,2,3,4-tetrahydroquinoline to H2 gas and quinoline under visible light irradiation at room temperature. The tradeoff between the reactor pressure drop and its photocatalytic surface area was resolved by selective in-situ photodeposition of Rh in the photo flow reactor, after packing, on the outer surface of the TiO2 microparticles exposed to the photon flux, thereby reducing the optimal Rh loading by 10 times compared to a batch reactor while facilitating catalyst reuse and regeneration. An example of using quinoline as a hydrogen acceptor to lower the energy of the hydrogen production step is demonstrated via the water-gas shift reaction.
Keywords: hydrogen storage, flow chemistry, photocatalysis, solar hydrogen
27 Achievable Average Secrecy Rates over Bank of Parallel Independent Fading Channels with Friendly Jamming
Authors: Munnujahan Ara
Abstract:
In this paper, we investigate the effect of friendly jamming power allocation strategies on the achievable average secrecy rate over a bank of parallel fading wiretap channels. We investigate the achievable average secrecy rate in parallel fading wiretap channels subject to Rayleigh and Rician fading. The achievable average secrecy rate, in the presence of a line-of-sight component in the jammer channel, is also evaluated. Moreover, we study the detrimental effect of correlation across the parallel sub-channels and evaluate the corresponding decrease in the achievable average secrecy rate for the various fading configurations. We also investigate the tradeoff between the transmission power and the jamming power for a fixed total power budget. Our results, which are applicable to current orthogonal frequency division multiplexing (OFDM) communications systems, shed further light on the achievable average secrecy rates over a bank of parallel fading channels in the presence of friendly jammers.
Keywords: fading parallel channels, wire-tap channel, OFDM, secrecy capacity, power allocation
26 Spare Part Inventory Optimization Policy: A Literature Study
Authors: Zukhrof Romadhon, Nani Kurniati
Abstract:
The availability of spare parts is critical to support maintenance tasks and the production system. Managing spare part inventory involves several parameters and objective functions, as well as the tradeoff between inventory costs and spare part availability. Several mathematical models and methods have been developed to optimize spare part policies, and the many optimization models proposed by researchers need to be surveyed to identify other potential models. This work presents a review of several pertinent publications on spare part inventory optimization and analyzes the gaps for future research. An initial investigation of scholarly journal databases under specific keywords related to spare parts found about 17K papers. Filtering was conducted based on five main aspects, i.e., replenishment policy, objective function, echelon network, lead time, and model solving, plus the additional aspect of part classification. Future topics could be identified based on the number of papers that have not addressed specific aspects, including the joint optimization of spare part inventory and maintenance.
Keywords: spare part, spare part inventory, inventory model, optimization, maintenance
25 Role of Cryptocurrency in Portfolio Diversification
Authors: Onur Arugaslan, Ajay Samant, Devrim Yaman
Abstract:
Financial advisors and investors seek new assets which could potentially increase portfolio returns and decrease portfolio risk. Cryptocurrencies represent a relatively new asset class which could serve in both these roles. There has been very little research done on the risk/return tradeoff in a portfolio consisting of fixed income assets, stocks, and cryptocurrency. The objective of this study is a rigorous examination of this issue. The data used in the study are the monthly returns on 4-week US Treasury Bills, the S&P Investment Grade Corporate Bond Index, Bitcoin, and the S&P 500 Stock Index. The methodology used in the study is the application of Modern Portfolio Theory to evaluate the risk-adjusted returns of portfolios with varying combinations of these assets, using the Sharpe, Treynor, and Jensen indexes, as well as the Sortino and Modigliani measures. The results of the study would include the ranking of various investment portfolios based on their risk/return characteristics. The conclusions of the study would include objective empirical inference for investors who are interested in including cryptocurrency in their asset portfolios but are unsure of the risk/return implications.
Keywords: financial economics, portfolio diversification, fixed income securities, cryptocurrency, stock indexes
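A short Python sketch (synthetic monthly returns, illustrative only, not the study's data) of three of the classical risk-adjusted measures named above:

```python
import numpy as np

def sharpe(r, rf):
    """Excess return per unit of total risk (standard deviation)."""
    return (r.mean() - rf.mean()) / r.std(ddof=1)

def beta(r, rm):
    """Portfolio beta against the market proxy."""
    return np.cov(r, rm, ddof=1)[0, 1] / rm.var(ddof=1)

def treynor(r, rf, rm):
    """Excess return per unit of systematic risk (beta)."""
    return (r.mean() - rf.mean()) / beta(r, rm)

def jensen_alpha(r, rf, rm):
    """Return above what CAPM predicts for the portfolio's beta."""
    return (r.mean() - rf.mean()) - beta(r, rm) * (rm.mean() - rf.mean())

rng = np.random.default_rng(5)
rf = np.full(60, 0.002)                              # monthly T-bill proxy
rm = rng.normal(0.007, 0.04, 60)                     # market (S&P 500 proxy)
port = 0.5 * rm + 0.5 * rng.normal(0.015, 0.20, 60)  # 50% stocks, 50% crypto proxy
print(f"Sharpe={sharpe(port, rf):.3f}  Treynor={treynor(port, rf, rm):.4f}  "
      f"alpha={jensen_alpha(port, rf, rm):.4f}")
```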
24 Impact of Profitability, Slack Resources and Natural Disasters on China's Corporate Philanthropic Practices
Authors: Nabeel Safdar, Qian Aimin
Abstract:
Corporate philanthropy is important, as donations have come to be considered a way to improve the image of a business entity in the modern era of intense competition. We used annual data from 2000 to 2014 for 1,248 firms listed on the Shanghai and Shenzhen stock exchanges. Results for giving firms reveal a curvilinear relation between profitability and corporate philanthropy: profitable firms utilize cash efficiently and hold smaller amounts of slack resources, and the tradeoff between stakeholder interests and agency costs makes giving more justifiable. We found that higher profitability does not mean that cash flows are freely available; in fact, well-performing firms are also good at cash management. Cash is utilized effectively by profitable firms, which hold a smaller extent of slack resources, generating the curvilinear relationship between profitability and corporate philanthropy. We also found that the trend of corporate philanthropy is affected by natural disasters. Analysis of innovation, slack resources, and director salary revealed positive, significant relationships. A firm need not be profitable to engage in philanthropy; rather, it should have abundant slack resources to donate.
Keywords: corporate philanthropy, free cash flows, natural disasters, profitability
23 Quantum Decision Making with Small Sample for Network Monitoring and Control
Authors: Tatsuya Otoshi, Masayuki Murata
Abstract:
With the development and diversification of applications on the Internet, applications that require high responsiveness, such as video streaming, are becoming mainstream. Application responsiveness is not only a matter of communication delay but also of the time required to grasp changes in network conditions. The tradeoff between accuracy and measurement time is a challenge in network control. People make countless decisions all the time, and our decisions seem to resolve tradeoffs between time and accuracy. When making decisions, people are known to make appropriate choices based on relatively small samples. Although there have been various studies on models of human decision-making, a model that integrates various cognitive biases, called "quantum decision-making," has recently attracted much attention. However, the modeling of small samples has not been examined much so far. In this paper, we extend the model of quantum decision-making to model decision-making with a small sample. In the proposed model, the state is updated by value-based probability amplitude amplification. By analytically obtaining a lower bound on the number of samples required for decision-making, we show that decision-making with a small number of samples is feasible.
Keywords: quantum decision making, small sample, MPEG-DASH, Grover's algorithm
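For intuition, here is a small Python sketch of plain Grover-style amplitude amplification, the standard building block behind such models (the paper's value-based update is more elaborate; this is our illustration, not the proposed model):

```python
import numpy as np

def amplify(n_options, target, iters):
    """Grover-style amplitude amplification: the probability of the marked
    option grows with each oracle + diffusion step."""
    amp = np.full(n_options, 1.0 / np.sqrt(n_options))  # uniform superposition
    probs = []
    for _ in range(iters):
        amp[target] *= -1.0              # oracle: flip sign of marked option
        amp = 2.0 * amp.mean() - amp     # diffusion: reflect about the mean
        probs.append(amp[target] ** 2)
    return probs

# With N = 8 options, the marked option's probability peaks after about
# (pi/4) * sqrt(N) ~ 2 iterations; very few update steps suffice, which is
# the kind of small-sample behavior the paper builds on.
print([f"{p:.2f}" for p in amplify(8, target=3, iters=3)])
```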
22 Solid-Liquid-Polymer Mixed Matrix Membrane Using Liquid Additive Adsorbed on Activated Carbon Dispersed in Polymeric Membrane for CO2/CH4 Separation
Authors: P. Chultheera, T. Rirksomboon, S. Kulprathipanja, C. Liu, W. Chinsirikul, N. Kerddonfag
Abstract:
Gas separation by selective transport through polymeric membranes is one of the most rapidly growing branches of membrane technology. However, the tradeoff between permeability and selectivity is one of the critical challenges encountered by pure polymer membranes, which in turn limits their large-scale application. To enhance gas separation performance, mixed matrix membranes (MMMs) have been developed. In this study, MMMs were prepared by a solution-coating method and tested for CO2/CH4 separation, in terms of permeability and selectivity, using a membrane testing unit at room temperature and a pressure of 100 psig. The fabricated MMMs were composed of silicone rubber dispersed with activated carbon individually adsorbed with polyethylene glycol (PEG) as a liquid additive. PEG-emulsified silicone rubber MMMs on a cellulose acetate support showed superior gas separation, with both higher permeability and higher selectivity than the silicone rubber membrane and the support membrane alone. However, the MMMs showed limited stability resulting from undesirable PEG leakage. To stabilize the MMMs, PEG was then incorporated into the activated carbon by adsorption. It was found that this incorporation of solid and liquid additives was effective in improving the separation performance of the MMMs.
Keywords: mixed matrix membrane, membrane, CO₂/CH₄ separation, activated carbon
21 Strategic Mine Planning: A SWOT Analysis Applied to KOV Open Pit Mine in the Democratic Republic of Congo
Authors: Patrick May Mukonki
Abstract:
The KOV pit (Kamoto Oliveira Virgule) is located 10 km from Kolwezi, one of the mineral-rich towns in the Lualaba province of the Democratic Republic of Congo. The KOV pit currently operates under Katanga Mining Limited (KML), a joint venture between Glencore and Gecamines (a state-owned company). Recently, the mine optimization process provided a life of mine of approximately 10 years with nine pushbacks using the Datamine NPV Scheduler software. In previous KOV pit studies, we outlined the impact of the accuracy of geological information on a long-term mine plan for a big copper mine such as KOV pit. The approach taken discussed three main scenarios and outlined some weaknesses on the geological information side; in this paper, we highlight, as an overview, those weaknesses, strengths, and opportunities in a global SWOT analysis. The approach we take here is essentially descriptive in terms of the steps taken to optimize KOV pit, and at every step we categorized the challenges we faced to achieve a better tradeoff between what we called strengths and what we called weaknesses. The same logic is applied to the opportunities and threats. The SWOT analysis conducted in this paper demonstrates that, despite a generally poor ore body definition and very difficult groundwater conditions, there is room for improvement for such a high-grade ore body.
Keywords: mine planning, mine optimization, mine scheduling, SWOT analysis
20 A Study of Using Multiple Subproblems in Dantzig-Wolfe Decomposition of Linear Programming
Authors: William Chung
Abstract:
This paper studies the use of multiple subproblems in Dantzig-Wolfe decomposition of linear programming (DW-LP). Traditionally, the decomposed LP consists of one LP master problem and one LP subproblem. The master problem and the subproblem are solved alternately, exchanging the dual prices of the master problem and the proposals of the subproblem until the LP is solved. It is well known that convergence is slow, with a long tail of near-optimal solutions (asymptotic convergence). Hence, the performance of DW-LP highly depends upon the number of decomposition steps; if the decomposition steps can be greatly reduced, the performance of DW-LP can be improved significantly. One way to reduce the number of decomposition steps is to increase the number of proposals passed from the subproblem to the master problem. To do so, we propose adding a quadratic approximation function to the LP subproblem in order to develop a set of approximate-LP subproblems (multiple subproblems). Consequently, in each decomposition step, multiple subproblems are solved to provide multiple proposals to the master problem, and the number of decomposition steps can be reduced greatly. Note that each approximate-LP subproblem is a nonlinear program, and solving the single LP subproblem is faster than solving the nonlinear multiple subproblems. Hence, using multiple subproblems in DW-LP is a tradeoff between the number of approximate-LP subproblems formed and the number of decomposition steps. In this paper, we derive the corresponding algorithms and provide some simple computational results. Some properties of the resulting algorithms are also given.
Keywords: approximate subproblem, Dantzig-Wolfe decomposition, large-scale models, multiple subproblems
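A sketch of the construction as we read it (our notation, not necessarily the authors' exact formulation): the restricted master problem (RMP) prices out columns as usual, while the single pricing subproblem min (c − λᵀA)ᵀx over X is replaced by a family of regularized subproblems, each with a quadratic term around a different anchor point, yielding several distinct proposals per iteration.

```latex
\[
\text{RMP:}\quad \min_{\mu \ge 0}\; \sum_{j} \big(c^{\top} x^{(j)}\big)\,\mu_j
\quad \text{s.t.}\quad \sum_{j} \big(A x^{(j)}\big)\,\mu_j = b,\qquad \sum_j \mu_j = 1,
\]
\[
\text{Subproblem } i:\quad \min_{x \in X}\;\; \big(c - \lambda^{\top}\!A\big)^{\top} x \;+\; \tfrac{\rho_i}{2}\,\lVert x - \hat{x}_i \rVert^{2},
\qquad i = 1,\dots,m,
\]
```

Here $\lambda$ denotes the RMP duals, and distinct anchors $\hat{x}_i$ (or weights $\rho_i$) make each quadratic subproblem return a different proposal $x^{(j)}$, so one decomposition step adds $m$ columns instead of one, at the cost of solving $m$ quadratic programs rather than a single LP.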
19 A Fast Calculation Approach for Position Identification in a Distance Space
Authors: Dawei Cai, Yuya Tokuda
Abstract:
The market for location-based services (LBS) is expanding. The acquisition of physical location is the fundamental basis for LBS. GPS, the de facto standard for outdoor localization, does not work well in indoor environments due to the blocking of signals by walls and ceilings. To achieve highly accurate localization in indoor environments, many techniques have been developed. The triangulation approach is often used to identify the location, but heavy and complex computation is necessary to calculate the location from the distances between the object and several source points. This computation is also time- and power-consuming, and is not favorable for a mobile device that needs a long battery life. To provide a low-power-consumption approach for mobile devices, this paper presents a fast calculation approach to identify the location of an object without solving simultaneous quadratic equations online. In our approach, we divide location identification into two parts, one offline and the other online. In offline mode, we run a mapping process that maps the location area to a distance space and find a simple formula that can be used online to identify the location of the object with very light computation. The characteristic of the approach is a good tradeoff between accuracy and computational amount; therefore, this approach can be used in smartphones and other mobile devices that need long working times. To show the performance, some simulation results are also provided in the paper.
Keywords: indoor localization, location based service, triangulation, fast calculation, mobile device
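One common way to obtain such a lightweight online formula, shown here as an illustration consistent with, though not necessarily identical to, the paper's mapping: linearize the range equations by subtracting the first one from the others, precompute the pseudo-inverse offline, and reduce the online step to a single small matrix-vector product.

```python
import numpy as np

# Offline: anchor (signal source) positions are surveyed once.
anchors = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0]])
A = 2.0 * (anchors[1:] - anchors[0])          # (K-1) x 2 linear system matrix
A_pinv = np.linalg.pinv(A)                    # precomputed once, offline
norms = np.sum(anchors**2, axis=1)

def locate(d):
    """Online step: distances d (length K) to each anchor -> position via
    one small matrix-vector product; no simultaneous quadratic equations
    are solved at run time."""
    b = norms[1:] - norms[0] + d[0]**2 - d[1:]**2
    return A_pinv @ b

true = np.array([3.0, 4.0])
d = np.linalg.norm(anchors - true, axis=1)    # simulated noiseless ranges
print(locate(d))                              # ~ [3. 4.]
```

The online cost is a handful of multiply-adds, which matches the paper's goal of a long battery life on mobile devices; accuracy degrades gracefully with range noise because the pseudo-inverse gives the least-squares solution.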
18 The Interaction of Country-of-Manufacturing with Country-of-Design within Different Consumption Context
Authors: Ebru Genc, Shih-Ching Wang
Abstract:
In today’s globalized world, while companies move their production centers to developing countries in order to gain cost advantages, they receive negative responses from consumers because of the weak image of those countries. In this study, we looked at this tradeoff faced by multinational companies. Some companies headquartered in developed countries have devised a strategy of manipulating country-of-origin (COO) information by introducing the concept of country of design (COD). We analyzed the impact of country-of-manufacturing (COM) information on consumers’ product evaluation and purchase intention in the presence of different levels of COD information, namely developed versus developing countries. First, we found that it is not advantageous for a firm to publish a design location with a strong image if the firm is producing in a country that has a weak image. On the other hand, revealing COD information reinforces consumers’ product evaluation and purchase intention if the firm is producing in a country with a strong image. Second, we studied the impact of consumption context (public or private use) on this relationship and found that for products typically used in public, COM has significantly higher importance for product evaluation and purchase intention than for products typically used in private. However, our results show that consumption context has no effect on the impact of COD information.
Keywords: consumption context, country of design, country of manufacturing, country of origin
17 Life Cycle Assessment of Almond Processing: Off-ground Harvesting Scenarios
Authors: Jessica Bain, Greg Thoma, Marty Matlock, Jeyam Subbiah, Ebenezer Kwofie
Abstract:
The environmental impact and particulate matter (PM) emissions associated with the production and packaging of 1 kg of almonds were evaluated using life cycle assessment (LCA). The assessment began at the point of ready-to-harvest, with a cradle-to-gate system boundary ending at almond packaging in California. The assessment included three general scenarios of off-ground harvesting of almonds, with variations: the harvested almonds solar-dried on a paper tarp in the orchard, the harvested almonds solar-dried on the floor of a separate lot, and the harvested almonds dried mechanically. The life cycle inventory (LCI) data for almond production were based on previously published literature and data provided by the Almond Board of California (ABC). The ReCiPe 2016 method was used to calculate the midpoint impacts. Using a consequential LCA model, the global warming potential (GWP) for the three harvesting scenarios is 2.90, 2.86, and 3.09 kg CO2 eq/kg of packaged almonds for scenarios 1, 2a, and 3a, respectively; the GWP for the conventional harvesting method is 2.89 kg CO2 eq/kg of packaged almonds. The particulate matter emissions per hectare are 77.14, 9.56, 66.86, and 8.75 for conventional harvesting and scenarios 1, 2, and 3, respectively. The most significant contribution to the overall emissions came from almond production: farm-gate almond production had a GWP of 2.12 kg CO2 eq/kg of packaged almonds, approximately 73% of the overall emissions. Based on comparisons between the GWP and PM emissions, scenario 2a offered the best tradeoff between GHG and PM production.
Keywords: life cycle assessment, low moisture foods, sustainability, LCA
16 Design of Effective Decoupling Point in Build-To-Order Systems: Focusing on Trade-Off Relation between Order-To-Delivery Lead Time and Work in Progress
Authors: Zhiyong Li, Hiroshi Katayama
Abstract:
Since the 1990s, e-commerce and internet business have grown gradually around the world, and customers tend to express their demand attributes in terms of specification requirements on parts, components, product structure, etc. This paper deals with designing effective decoupling points for build-to-order (BTO) systems in an e-commerce environment, which can be realized through a trade-off analysis between two major criteria: customer order lead time and the value of work in progress. These KPIs are critical for a successful BTO business: time-based service effectiveness in coping with customer requirements for the first, and cost effectiveness with risk-averse operations for the second. The approach of this paper consists of an investigation of successful businesses operating under the BTO scheme, development of a manufacturing model for this scheme, quantitative evaluation of the proposed models by calculating the two KPI values under various decoupling point distributions, and discussion of the results brought about by the pattern of decoupling point distribution, where some cases provide Pareto-optimal performance. To extract the relevant trade-off relation between the considered KPIs in the two-dimensional resultant performance, useful logic developed by former research work, i.e., Katayama and Fonseca, is applied. The obtained characteristics are evaluated as useful information for managing BTO manufacturing businesses.
Keywords: build-to-order (BTO), decoupling point, e-commerce, order-to-delivery lead time (ODLT), work in progress (WIP)
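A toy Python sketch of the two-KPI evaluation described above (our own illustration, not the authors' model): stages downstream of the decoupling point (DP) run to order and add to the order-to-delivery lead time, while stages upstream run to stock and their output is held as WIP.

```python
# Five sequential stages: (lead time in days, value added in $); figures are hypothetical.
stages = [(2, 50), (3, 80), (1, 40), (4, 120), (2, 60)]

for dp in range(len(stages) + 1):            # dp = number of stages built to stock
    odlt = sum(t for t, _ in stages[dp:])    # customer waits for downstream stages
    wip = sum(v for _, v in stages[:dp])     # value of stock held at the DP
    print(f"DP after stage {dp}: ODLT = {odlt:2d} days, WIP value = ${wip}")

# Moving the DP downstream monotonically shortens ODLT and raises WIP value,
# so every position here lies on the Pareto frontier; that trade-off curve is
# what the paper evaluates under various decoupling point distributions.
```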