Search results for: efficiently computable endomorphism
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1019

989 Concept to Enhance the Project Success and Promote the Implementation of Success Factors in Infrastructure Projects

Authors: A. Elbaz, K. Spang

Abstract:

Infrastructure projects are often subject to delays and cost overruns and are mistakenly described as unsuccessful projects. These projects have many peculiarities, such as public attention, impact on the environment, and exposure to special regulations. They also deal with several stakeholders with different motivations and face unique risks. With this in mind, we need to reconsider our approach to managing them, defining their success factors and implementing these success factors. Infrastructure projects lack not only a unified meaning of project success and a definition of success factors, but also a clear method to implement these factors. This paper investigates this gap and introduces a concept to implement success factors in an efficient way, taking into consideration the specific characteristics of infrastructure projects. This concept consists of six enablers: project organization, project team, project management workflow, contract management, communication and knowledge transfer, and project documentation. These enablers allow other success factors to be efficiently implemented in projects. In conclusion, this paper provides project managers as well as company managers with a tool to define and implement success factors efficiently in their projects, along with upgrading their assets for coming projects. This tool consists of processes and validated checklists to ensure the best use of company resources and knowledge. Due to the special features of infrastructure projects, this tool will be tested in the German infrastructure market. However, it is meant to be adaptable to other markets and industries.

Keywords: infrastructure projects, operative success factors, project success, success factors, transportation projects

Procedia PDF Downloads 88
988 Climate Change Adaptation in Agriculture: A General Equilibrium Analysis of Land Re-Allocation in Nepal

Authors: Sudarshan Chalise, Athula Naranpanawa

Abstract:

This paper attempts to investigate the viability of cropland re-allocation as an adaptation strategy to minimise the economy-wide costs of climate change on agriculture. Nepal makes an interesting case study as it is one of the most vulnerable agricultural economies within South Asia. This paper develops a comparative static multi-household Computable General Equilibrium (CGE) model for Nepal with a nested set of Constant Elasticity of Transformation (CET) functional forms to model the allocation of land within different agricultural sectors. Land transformation elasticities in these CET functions are allowed to reflect the ease of switching from one crop to another based on their agronomic characteristics. The results suggest that, in the long run, farmers in Nepal tend to allocate land to crops that are comparatively less impacted by climate change, such as paddy, thereby minimizing the economy-wide impacts of climate change. Furthermore, the results reveal that land re-allocation tends to reduce the income disparity among different household groups by significantly moderating the income losses of rural marginal farmers. Therefore, it is suggested that policy makers in Nepal should prioritise schemes such as providing climate-smart paddy varieties (i.e., those that are resistant to heat, drought and floods) to farmers, subsidising fertilizers, improving agronomic practices, and educating farmers to switch from crops that are highly impacted by climate change to those that are not, such as paddy.
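
For readers unfamiliar with the mechanism, a stylized single-nest CET land supply block of the kind used in such CGE models is sketched below; the notation (share parameters δ, transformation elasticity σ, crop prices P) is illustrative and not taken from the paper.

```latex
% Stylized single-nest CET land allocation (illustrative notation, not the paper's own)
\begin{align}
  \bar{L} &= A\left[\delta_1 L_{paddy}^{\rho} + \delta_2 L_{other}^{\rho}\right]^{1/\rho},
      \qquad \rho = \frac{\sigma + 1}{\sigma},\ \ \sigma > 0, \\
  \frac{L_{paddy}}{L_{other}} &=
      \left(\frac{\delta_2}{\delta_1}\,\frac{P_{paddy}}{P_{other}}\right)^{\sigma}.
\end{align}
```

The second equation is the revenue-maximizing first-order condition: the larger σ is for a pair of crops with similar agronomic characteristics, the more easily land shifts toward the crop whose relative price (or relative climate resilience) improves.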

Keywords: climate change, general equilibrium, land re-allocation, Nepalese agriculture

Procedia PDF Downloads 306
987 Comprehensive Study of Data Science

Authors: Asifa Amara, Prachi Singh, Kanishka, Debargho Pathak, Akshat Kumar, Jayakumar Eravelly

Abstract:

Today's generation is totally dependent on technology that uses data as its fuel. The present study is about innovations and developments in data science and gives an idea of how to use the available data efficiently. This study will help to understand the core concepts of data science. The concept of artificial intelligence, introduced by Alan Turing, rests on the principle of creating an artificial system that can run independently of human-given programs and can function by analyzing data to understand the requirements of the users. Data science comprises business understanding, analyzing data, ethical concerns, understanding programming languages, various fields and sources of data, skills, etc. The usage of data science has evolved over the years. In this review article, we have covered a part of data science, i.e., machine learning. Machine learning uses data science for its work. Machines learn through their experience, which helps them to do any work more efficiently. This article includes a comparative illustration of human understanding versus machine understanding, along with the advantages, applications, and real-time examples of machine learning. Data science is an important game changer in the life of human beings. Since the advent of data science, we have found its benefits, how it leads to a better understanding of people, and how it serves individual needs. It has improved business strategies, the services provided by businesses, forecasting, the ability to attain sustainable development, etc. This study also focuses on a better understanding of data science, which will help us to create a better world.

Keywords: data science, machine learning, data analytics, artificial intelligence

Procedia PDF Downloads 45
986 An Efficient Traceability Mechanism in the Audited Cloud Data Storage

Authors: Ramya P, Lino Abraham Varghese, S. Bose

Abstract:

With cloud storage services, data can be stored in the cloud and shared across multiple users. However, unexpected hardware/software failures and human errors can easily cause data stored in the cloud to be lost or corrupted, which affects the integrity of data in the cloud. Some mechanisms have been designed to allow both data owners and public verifiers to efficiently audit cloud data integrity without retrieving the entire data from the cloud server. However, public auditing of the integrity of shared data with the existing mechanisms unavoidably reveals confidential information, such as the identity of the signer, to public verifiers. Here, a privacy-preserving mechanism is proposed to support public auditing on shared data stored in the cloud. It uses group signatures to compute the verification metadata needed to audit the correctness of shared data. The identity of the signer of each block in the shared data is kept confidential from public verifiers, who can easily verify shared data integrity without retrieving the entire file. On demand, however, the signer of each block is revealed to the owner alone. In a static group, the group private key is generated once by the owner, whereas in a dynamic group, the group private key changes when users are revoked from the group. When users leave the group, the blocks they have already signed are re-signed by the cloud service provider instead of the owner, which is handled efficiently by a proxy re-signature scheme.

Keywords: data integrity, dynamic group, group signature, public auditing

Procedia PDF Downloads 365
985 Portfolio Optimization with Reward-Risk Ratio Measure Based on the Mean Absolute Deviation

Authors: Wlodzimierz Ogryczak, Michal Przyluski, Tomasz Sliwinski

Abstract:

In problems of portfolio selection, the reward-risk ratio criterion is optimized to search for a risky portfolio with the maximum increase of the mean return in proportion to the risk measure increase when compared to risk-free investments. In the classical model, following Markowitz, the risk is measured by the variance, thus representing Sharpe ratio optimization and leading to quadratic optimization problems. Several Linear Programming (LP) computable risk measures have been introduced and applied in portfolio optimization. In particular, the Mean Absolute Deviation (MAD) measure has been widely recognized. The reward-risk ratio optimization with the MAD measure can be transformed into an LP formulation with the number of constraints proportional to the number of scenarios and the number of variables proportional to the total of the number of scenarios and the number of instruments. This may lead to LP models with a huge number of variables and constraints in the case of real-life financial decisions based on several thousand scenarios, thus decreasing their computational efficiency and making them hardly solvable by general LP tools. We show that the computational efficiency can then be dramatically improved by an alternative model based on the inverse risk-reward ratio minimization and by taking advantage of LP duality. In the introduced LP model, the number of structural constraints is proportional to the number of instruments, so the simplex method efficiency is not seriously affected by the number of scenarios, thereby guaranteeing easy solvability. Moreover, we show that under a natural restriction on the target value, the MAD risk-reward ratio optimization is consistent with the second order stochastic dominance rules.
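
As an illustration of the inverse-ratio idea, a stylized LP obtained by the standard homogenization of the MAD ratio model is sketched below; the notation (scenario returns r_jt, means μ_j, target r_0, scaled weights x̃_j) is illustrative, and the paper's dual-based formulation may differ in detail.

```latex
% Stylized inverse reward-risk (MAD) ratio model after homogenization
% (illustrative notation; not the paper's exact dual formulation)
\begin{align}
  \min_{\tilde{x},\,d,\,z}\ & \frac{1}{T}\sum_{t=1}^{T} d_t \\
  \text{s.t. }\ & d_t \ge \sum_{j=1}^{n}(r_{jt}-\mu_j)\,\tilde{x}_j,\qquad
                 d_t \ge -\sum_{j=1}^{n}(r_{jt}-\mu_j)\,\tilde{x}_j,\qquad t=1,\dots,T, \\
              & \sum_{j=1}^{n}(\mu_j - r_0)\,\tilde{x}_j = 1,\qquad
                 \sum_{j=1}^{n}\tilde{x}_j = z,\qquad \tilde{x}\ge 0,\ z\ge 0.
\end{align}
```

The original portfolio weights are recovered as x_j = x̃_j / z; the efficiency gain described in the abstract comes from minimizing this inverse ratio and passing to the LP dual, whose structural constraints scale with the number of instruments rather than the number of scenarios.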

Keywords: portfolio optimization, reward-risk ratio, mean absolute deviation, linear programming

Procedia PDF Downloads 379
984 Evaluating the Performance of 28 EU Member Countries on Health2020: A Data Envelopment Analysis Evaluation of the Successful Implementation of Policies

Authors: Elias K. Maragos, Petros E. Maravelakis, Apostolos I. Linardis

Abstract:

Health2020 is a promising framework of policies provided by the World Health Organization (WHO) and aiming to diminish the health and well-being inequalities among the citizens of the European Union (EU) countries. The major demographic, social and environmental changes, in addition to the recent economic crisis, prevent the unobstructed and successful implementation of this framework. The unemployment rates and the percentage of people at risk of poverty have increased among the citizens of EU countries. At the same time, the adopted fiscal and economic policies do not help governments to serve their social role and mitigate social and health inequalities. In those circumstances, there is strong pressure to organize all health system resources efficiently and wisely. In order to provide a unified and value-based framework of valuation, we propose a valuation framework using data envelopment analysis (DEA) and dynamic DEA. We believe that the adopted methodology provides a robust tool which can capture the degree of success with which policies have been implemented and is capable of determining which of the countries developed the requested policies efficiently and which of the countries have lagged behind. Using the proposed methodology, we evaluated the performance of 28 EU member countries in relation to the Health2020 peripheral targets. We adopted several versions of evaluation, measuring the effectiveness and the efficiency of EU countries from 2011 to 2016. Our results showed stability in technological changes and revealed a group of countries which were benchmarks for the inefficient countries in most of the years.
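
For reference, a minimal input-oriented CCR envelopment model of the kind underlying such DEA efficiency scores is given below; this is the generic textbook form, not necessarily the exact dynamic variant applied in the paper.

```latex
% Generic input-oriented CCR envelopment model for country o
% (textbook form; the paper also uses dynamic DEA and Malmquist indices)
\begin{align}
  \theta_o^{*} = \min_{\theta,\,\lambda}\ & \theta \\
  \text{s.t. }\ & \sum_{k=1}^{K} \lambda_k x_{ik} \le \theta\, x_{io}, && i=1,\dots,m \quad\text{(inputs)} \\
                & \sum_{k=1}^{K} \lambda_k y_{rk} \ge y_{ro}, && r=1,\dots,s \quad\text{(outputs)} \\
                & \lambda_k \ge 0, && k=1,\dots,K.
\end{align}
```

A country with θ*_o = 1 lies on the efficient frontier and can act as a benchmark for the others, while θ*_o < 1 indicates the proportional input contraction needed to reach the frontier.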

Keywords: DEA, Health2020, health inequalities, Malmquist index, policies evaluation, well-being

Procedia PDF Downloads 121
983 Genetic Algorithm Methods for Determination of the Overflow Coefficient of a Medium Throat Length Morning Glory Spillway Equipped with Crest Vortex Breakers

Authors: Roozbeh Aghamajidi

Abstract:

Shaft spillways are circular spillways generally used for discharging unexpected floods at earth and concrete dams. There are different types of shaft spillways: stepped and smooth spillways. Stepped spillways pass higher flow discharges than smooth spillways. Therefore, awareness of the flow behavior of these spillways helps in using them better and more efficiently. Moreover, using a vortex breaker has a great effect on the flow passing through a shaft spillway. In order to use these spillways more efficiently, the risk of the flow pressure decreasing below the fluid vapor pressure, known as cavitation, should be prevented as far as possible. In this research, the behavior of the flow over the spillway with different vortex breaker shapes on the spillway crest has been studied, considering the effects of flow regime changes on the spillway, changes of step dimensions, and changes of the type of discharge. Therefore, two spillway models with three different vortex breakers and three arrangements have been used to assess the hydraulic characteristics of the flow. With regard to the inlet discharge to the spillway, the parameters of pressure and flow velocity on the spillway surface have been measured at several points after each run. Using this kind of information leads us to create better design criteria for the spillway profile. To achieve these purposes, optimization plays an important role, and a genetic algorithm is utilized to study the emptying discharge. As a result, it turned out that the best type of spillway with the maximum discharge coefficient is the smooth spillway with ogee-shaped vortex breakers in an arrangement of three. Besides, it has been concluded that the genetic algorithm can be used to optimize the results.
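
A minimal genetic-algorithm sketch of the kind of search described above is shown below. The decision variables (spillway type, vortex breaker shape, arrangement) and the surrogate fitness function are hypothetical placeholders; the actual study evaluates configurations against measured pressure and velocity data.

```python
import random

# Minimal GA sketch for searching vortex-breaker configurations.
# The gene encoding and the surrogate fitness below are illustrative
# placeholders, not the paper's experimental discharge-coefficient model.
SPILLWAY = ["stepped", "smooth"]
BREAKER = ["ogee", "triangular", "rectangular"]
ARRANGEMENT = [3, 4, 6]

def fitness(ind):
    """Placeholder surrogate for the discharge coefficient (illustrative only)."""
    spill, shape, n = ind
    score = 1.0
    score += 0.3 if spill == "smooth" else 0.0
    score += 0.2 if shape == "ogee" else 0.0
    score += 0.1 if n == 3 else 0.0
    return score + random.uniform(0.0, 0.05)   # noise mimics measurement scatter

def random_individual():
    return (random.choice(SPILLWAY), random.choice(BREAKER), random.choice(ARRANGEMENT))

def mutate(ind):
    spill, shape, n = ind
    gene = random.randrange(3)
    if gene == 0:
        spill = random.choice(SPILLWAY)
    elif gene == 1:
        shape = random.choice(BREAKER)
    else:
        n = random.choice(ARRANGEMENT)
    return (spill, shape, n)

def crossover(a, b):
    return tuple(a[i] if random.random() < 0.5 else b[i] for i in range(3))

def run_ga(pop_size=20, generations=30):
    pop = [random_individual() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        elite = pop[: pop_size // 4]                      # keep the best quarter
        children = [mutate(crossover(random.choice(elite), random.choice(elite)))
                    for _ in range(pop_size - len(elite))]
        pop = elite + children
    return max(pop, key=fitness)

if __name__ == "__main__":
    print(run_ga())
```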

Keywords: shaft spillway, vortex breaker, flow, genetic algorithm

Procedia PDF Downloads 348
982 Automatic and Highly Precise Modeling for System Optimization

Authors: Stephanie Chen, Mitja Echim, Christof Büskens

Abstract:

To describe and propagate the behavior of a system, mathematical models are formulated. Parameter identification is used to adapt the coefficients of the underlying laws of science. For complex systems this approach can be incomplete, and hence imprecise, and moreover too slow to be computed efficiently. Therefore, these models might not be applicable for the numerical optimization of real systems, since these techniques require numerous evaluations of the models. Moreover, not all quantities necessary for the identification might be available, and hence the system must be adapted manually. Therefore, an approach is described that generates models which overcome the aforementioned limitations by focusing not on physical laws, but on measured (sensor) data of real systems. The approach is more general since it generates models for every system, detached from the scientific background. Additionally, this approach can be used in a more general sense, since it is able to automatically identify correlations in the data. The method can be classified as a multivariate data regression analysis. In contrast to many other data regression methods, this variant is also able to identify correlations of products of variables and not only of single variables. This enables a far more precise and better representation of causal correlations. The basis and the explanation of this method come from an analytical background: the series expansion. Another advantage of this technique is the possibility of real-time adaptation of the generated models during operation. Herewith, system changes due to aging, wear or perturbations from the environment can be taken into account, which is indispensable for realistic scenarios. Since these data-driven models can be evaluated very efficiently and with high precision, they can be used in mathematical optimization algorithms that minimize a cost function, e.g. time, energy consumption, operational costs or a mixture of them, subject to additional constraints. The proposed method has successfully been tested in several complex applications and with strong industrial requirements. The generated models were able to simulate the given systems with an error in precision of less than one percent. Moreover, the automatic identification of the correlations was able to discover previously unknown relationships. To summarize, the above-mentioned approach is able to efficiently compute highly precise and real-time-adaptive data-based models in different fields of industry. Combined with an effective mathematical optimization algorithm like WORHP (We Optimize Really Huge Problems), several complex systems can now be represented by a high-precision model to be optimized according to the user's wishes. The proposed methods will be illustrated with different examples.
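
A minimal sketch of the core idea, a regression over candidate terms that includes products of variables in the spirit of a truncated series expansion, is given below; the term construction and plain least-squares fit are generic assumptions, and the paper's method (including its real-time adaptation) is considerably more elaborate.

```python
import numpy as np
from itertools import combinations_with_replacement

# Minimal sketch of a data-driven model that also captures correlations of
# products of variables, in the spirit of a truncated series expansion.
# The candidate-term construction and least-squares fit are generic; the
# actual method described above is more elaborate and adapts in real time.
def build_terms(X, degree=2):
    """Return the design matrix with 1, x_i, and all products up to 'degree'."""
    n_samples, n_vars = X.shape
    columns = [np.ones(n_samples)]
    for d in range(1, degree + 1):
        for idx in combinations_with_replacement(range(n_vars), d):
            columns.append(np.prod(X[:, idx], axis=1))
    return np.column_stack(columns)

def fit(X, y, degree=2):
    A = build_terms(X, degree)
    coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coeffs

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.uniform(-1, 1, size=(200, 2))
    y = 0.5 + 2.0 * X[:, 0] - 1.5 * X[:, 0] * X[:, 1]   # hidden product term
    print(np.round(fit(X, y), 3))   # recovers the constant, linear and product coefficients
```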

Keywords: adaptive modeling, automatic identification of correlations, data based modeling, optimization

Procedia PDF Downloads 376
981 Agile Smartphone Porting and App Integration of Signal Processing Algorithms Obtained through Rapid Development

Authors: Marvin Chibuzo Offiah, Susanne Rosenthal, Markus Borschbach

Abstract:

Certain research projects in Computer Science often involve research on existing signal processing algorithms and the development of improvements to them. Research budgets are usually limited, hence there is limited time for implementing the algorithms from scratch. It is therefore common practice to use implementations provided by other researchers as a template. These are most commonly provided in a rapid development, i.e. 4th generation, programming language, usually Matlab. Rapid development is a common method in Computer Science research for quickly implementing and testing newly developed algorithms, which is also a common task within agile project organization. The growing relevance of mobile devices in the computer market also gives rise to the need to demonstrate the successful executability and performance measurement of these algorithms on a mobile device operating system and processor, particularly on a smartphone. Open mobile systems, such as Android, are most suitable for this task, which is to be performed as efficiently as possible. Furthermore, efficiently implementing an interaction between the algorithm and a graphical user interface (GUI) that runs exclusively on the mobile device is necessary in cases where the project’s goal statement also includes such a task. This paper examines different proposed solutions for porting computer algorithms obtained through rapid development into a GUI-based smartphone Android app and evaluates their feasibility. Accordingly, the feasible methods are tested and a short success report is given for each tested method.

Keywords: SMARTNAVI, Smartphone, App, Programming languages, Rapid Development, MATLAB, Octave, C/C++, Java, Android, NDK, SDK, Linux, Ubuntu, Emulation, GUI

Procedia PDF Downloads 457
980 Improving the Global Competitiveness of SMEs by Logistics Transportation Management: Case Study Chicken Meat Supply Chain

Authors: P. Vanichkobchinda

Abstract:

The logistics transportation technique of Open Vehicle Routing (OVR) is an approach toward transportation cost reduction, especially for long distance pickup and delivery nodes. The outstanding characteristic of OVR is that the route starting node and ending node are not necessarily the same, as they are in typical vehicle routing problems. This advantage enables the routing to flow continuously, and the vehicle does not always return to its home base. This research aims to develop a heuristic for the open vehicle routing problem with pickup and delivery under time window and loading capacity constraints to minimize the total distance. The proposed heuristic is developed based on the insertion method, which is a simple method suitable for rapid calculation that allows insertion of new additional transportation requirements along the original paths. According to the heuristic analysis, cost comparisons between the proposed insertion heuristic and the method companies are currently using, the nearest neighbor method, show that the insertion heuristic gave superior solutions in all types of test problems. The research indicates that the improvement of the new transport calculation and the open vehicle routing with the insertion heuristic represent a better outcome, with 34.3 percent cost savings on average. In conclusion, the proposed heuristic can effectively and efficiently solve the open vehicle routing problem.
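
A minimal sketch of the insertion principle for an open route is given below; it assumes a single vehicle, Euclidean distances, one capacity limit and simple time windows, whereas the paper's heuristic handles paired pickup-and-delivery requests and a fleet.

```python
import math

# Minimal cheapest-insertion sketch for one open route (no return to depot).
# Single vehicle, Euclidean distances, one capacity and simple time windows
# are simplifying assumptions; the heuristic described above is richer.
def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def route_cost(route, coords):
    return sum(dist(coords[route[i]], coords[route[i + 1]]) for i in range(len(route) - 1))

def feasible(route, coords, demand, windows, capacity, speed=1.0):
    if sum(demand[n] for n in route) > capacity:
        return False
    t = 0.0
    for i in range(1, len(route)):
        t += dist(coords[route[i - 1]], coords[route[i]]) / speed
        earliest, latest = windows[route[i]]
        t = max(t, earliest)            # wait if the vehicle arrives early
        if t > latest:
            return False
    return True

def insertion_heuristic(depot, customers, coords, demand, windows, capacity):
    route = [depot]                     # open route: the vehicle never returns
    unrouted = list(customers)
    while unrouted:
        best = None
        for c in unrouted:
            for pos in range(1, len(route) + 1):
                cand = route[:pos] + [c] + route[pos:]
                if feasible(cand, coords, demand, windows, capacity):
                    extra = route_cost(cand, coords) - route_cost(route, coords)
                    if best is None or extra < best[0]:
                        best = (extra, c, cand)
        if best is None:
            break                       # remaining requests would need another vehicle
        _, chosen, route = best
        unrouted.remove(chosen)
    return route

if __name__ == "__main__":
    coords = {0: (0, 0), 1: (2, 1), 2: (4, 0), 3: (5, 3)}
    demand = {0: 0, 1: 2, 2: 3, 3: 1}
    windows = {0: (0, 100), 1: (0, 50), 2: (0, 50), 3: (0, 50)}
    print(insertion_heuristic(0, [1, 2, 3], coords, demand, windows, capacity=10))
```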

Keywords: business competitiveness, cost reduction, SMEs, logistics transportation, VRP

Procedia PDF Downloads 660
979 Earthquake Resistant Sustainable Steel Green Building

Authors: Arup Saha Chaudhuri

Abstract:

Structural steel is a very ductile material with a high strength carrying capacity; thus, it is very useful for making earthquake resistant buildings. It is also a homogeneous material. The member section and the structural system can be made very efficient for economical design. As steel is recyclable and reusable, it is a green material. The embodied energy of an efficiently designed steel structure is less than that of an RC structure. For sustainable green buildings, steel is the best material nowadays. Moreover, pre-engineered and pre-fabricated faster construction methodologies help the development work to be completed within the stipulated time. In this paper, the usefulness of the Eccentric Bracing Frame (EBF) in steel structures over the Moment Resisting Frame (MRF) and the Concentric Bracing Frame (CBF) is shown. Stability of steel structures against horizontal forces, especially under seismic conditions, is efficiently possible with eccentric bracing systems with economic connection details. The EBF is pin-ended, but the beam-column joints are designed either as pin-ended or for full connectivity. The EBF has several desirable features for seismic resistance. In comparison with the CBF system, the EBF system can be designed for appropriate stiffness and drift control. The link beam is supposed to yield in shear or flexure before the initiation of yielding or buckling of the bracing member in tension or compression. The behavior of a 2-D steel frame is observed under seismic loading conditions in the present paper. Ductility and brittleness of the frames are compared with respect to the time period of vibration and the dynamic base shear. It is observed that the EBF system is better than the MRF system when comparing the time period of vibration and the base shear participation.

Keywords: steel building, green and sustainable, earthquake resistant, EBF system

Procedia PDF Downloads 320
978 Uptake of Copper by Dead Biomass of Burkholderia cenocepacia Isolated from a Metal Mine in Pará, Brazil

Authors: Ingrid R. Avanzi, Marcela dos P. G. Baltazar, Louise H. Gracioso, Luciana J. Gimenes, Bruno Karolski, Elen A. Perpetuo, Claudio Auguto Oller do Nascimento

Abstract:

In this study, a natural process using a biological system was developed for the uptake of copper and the possible removal of copper from wastewater by dead biomass of the strain Burkholderia cenocepacia. Dead and live biomass of Burkholderia cenocepacia was used to analyze the equilibrium and kinetics of copper biosorption by this strain as a function of pH. Living biomass exhibited the highest biosorption capacity for copper, 50 mg g−1, which was achieved within 5 hours of contact, at pH 7.0, a temperature of 30°C, and an agitation speed of 150 rpm. The dead biomass of Burkholderia cenocepacia may be considered an efficient bioprocess, being fast and low-cost for the recovery of copper, and also a probable nano-adsorbent of this metal ion in wastewater in bioremediation processes.

Keywords: biosorption, dead biomass, biotechnology, copper recovery

Procedia PDF Downloads 318
977 Bank, Stock Market Efficiency and Economic Growth: Lessons for ASEAN-5

Authors: Tan Swee Liang

Abstract:

This paper estimates the associations of bank and stock market efficiency with real per capita GDP growth by examining panel data across three different regions using the Panel-Corrected Standard Errors (PCSE) regression developed by Beck and Katz (1995). Data from five economies in ASEAN (Singapore, Malaysia, Thailand, Philippines, and Indonesia), five economies in Asia (Japan, China, Hong Kong SAR, South Korea, and India) and seven economies in the OECD (Australia, Canada, Denmark, Norway, Sweden, United Kingdom U.K., and United States U.S.), between 1990 and 2017, are used. The empirical findings suggest, first, that for Asia-5 a high bank net interest margin means greater bank profitability, hence spurring economic growth. Second, for OECD-7, low bank overhead costs (as a share of total assets) may reflect weak competition and weak investment in providing superior banking services, hence dampening economic growth. Third, the stock market turnover ratio has a negative association with OECD-7 economic growth but a positive association with Asia-5, which suggests that the relationship between liquidity and growth is ambiguous. Lastly, for ASEAN-5, high bank overhead costs (as a share of total assets) may suggest that expenses have not been channelled efficiently to income-generating activities. One practical implication of the findings is that policy makers should take the necessary measures toward financial liberalisation policies that boost growth through the efficiency channel, so that funds are efficiently allocated through the financial system between the financial and real sectors.
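
A stylized form of the kind of panel specification estimated with PCSE is shown below; the variable names are illustrative and the paper's exact control set may differ.

```latex
% Stylized growth regression with panel-corrected standard errors
% (illustrative variable names; not the paper's exact specification)
growth_{it} = \alpha + \beta_1\,NIM_{it} + \beta_2\,OverheadCost_{it}
            + \beta_3\,Turnover_{it} + \gamma' Z_{it} + \varepsilon_{it}
```

Here i indexes countries within a regional group, t indexes years, Z_{it} collects control variables, and the error covariance is corrected for panel heteroskedasticity and contemporaneous correlation following Beck and Katz (1995).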

Keywords: financial development, banking system, capital markets, economic growth

Procedia PDF Downloads 112
976 CRM Cloud Computing: An Efficient and Cost Effective Tool to Improve Customer Interactions

Authors: Gaurangi Saxena, Ravindra Saxena

Abstract:

Lately, cloud computing has been used to enhance the ability to attain corporate goals more effectively and efficiently at lower cost. This new computing paradigm, cloud computing, has emerged as a powerful tool for optimum utilization of resources and for gaining competitiveness through cost reduction and achieving business goals with greater flexibility. Realizing the importance of this new technique, most of the well-known companies in the computer industry, like Microsoft, IBM, Google and Apple, are spending millions of dollars on researching cloud computing and investigating the possibility of producing interface hardware for cloud computing systems. It is believed that by using the right middleware, a cloud computing system can execute all the programs a normal computer could run. Potentially, everything from the most simple generic word processing software to highly specialized and customized programs designed for a specific company could work successfully on a cloud computing system. A cloud is a pool of virtualized computer resources. Clouds are not limited to grid environments, but also support “interactive user-facing applications” such as web applications and three-tier architectures. Cloud computing is not a fundamentally new paradigm. It draws on existing technologies and approaches, such as utility computing, software-as-a-service, distributed computing, and centralized data centers. Some companies rent physical space to store servers and databases because they don’t have it available on site. Cloud computing gives these companies the option of storing data on someone else’s hardware, removing the need for physical space on the front end. Prominent service providers like Amazon, Google, SUN, IBM, Oracle, Salesforce etc. are extending computing infrastructures and platforms as a core for providing top-level services for computation, storage, databases and applications. Application services could be email, office applications, finance, video, audio and data processing. By using a cloud computing system, a company can improve its customer relationship management. A CRM cloud computing system may be highly useful in delivering to a sales team a blend of unique functionalities to improve agent/customer interactions. This paper attempts first to define cloud computing as a tool for running business activities more effectively and efficiently at a lower cost, and then it distinguishes cloud computing from grid computing. Based on an exhaustive literature review, the authors discuss the application of cloud computing in different disciplines of management, especially in the field of marketing, with special reference to the use of cloud computing in CRM. The study concludes that a CRM cloud computing platform helps a company track any data, such as orders, discounts, references, competitors and many more. By using CRM cloud computing, companies can improve their customer interactions and, by serving customers more efficiently and at a lower cost, can gain a competitive advantage.

Keywords: cloud computing, competitive advantage, customer relationship management, grid computing

Procedia PDF Downloads 280
975 Improving Search Engine Performance by Removing Indexes to Malicious URLs

Authors: Durga Toshniwal, Lokesh Agrawal

Abstract:

As the web continues to play an increasing role in information exchange and in conducting daily activities, computer users have become the target of miscreants who infect hosts with malware or adware for financial gain. Unfortunately, even a single visit to a compromised web site enables the attacker to detect vulnerabilities in the user’s applications and force the downloading of a multitude of malware binaries. We provide an approach to effectively scan the web for so-called drive-by downloads. Drive-by downloads are the result of URLs that attempt to exploit their visitors and cause malware to be installed and run automatically. To scan the web for malicious pages, the first step is to use a crawler to collect URLs that live on the Internet, and then to apply fast prefiltering techniques to reduce the number of pages that need to be examined by precise, but slower, analysis tools (such as honey clients or antivirus programs). Although the technique is effective, it requires a substantial amount of resources. A main reason is that the crawler encounters many pages on the web that are legitimate and need to be filtered. In this paper, to characterize the nature of this rising threat, we present an implementation of a web crawler in Python and an approach to search the web more efficiently for pages that are likely to be malicious, filtering benign pages and passing the remaining pages to an antivirus program for the detection of malware. Our approach starts from an initial seed of known, malicious web pages. Using these seeds, our system generates search engine queries to identify other malicious pages that are similar to the ones in the initial seed. By doing so, it leverages the crawling infrastructure of search engines to retrieve URLs that are much more likely to be malicious than a random page on the web. The results show that this guided approach is able to identify malicious web pages more efficiently when compared to random crawling-based approaches.
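
A minimal sketch of the guided-crawling pipeline, from seed pages through query generation and prefiltering to a slower deep analyzer, is given below; the search() backend, the suspicious-content patterns and the helper names are illustrative assumptions, not the paper's implementation.

```python
import re
import requests

# Minimal sketch of guided crawling: start from known malicious seed pages,
# derive search queries from their content, and prefilter candidate pages
# before handing them to a slower analyzer (honey client / antivirus).
# The search() backend and the prefilter patterns are illustrative assumptions.
SUSPICIOUS = [r"eval\(unescape\(", r"<iframe[^>]+width=[\"']?0", r"document\.write\(unescape\("]

def fetch(url, timeout=5):
    try:
        return requests.get(url, timeout=timeout).text
    except requests.RequestException:
        return ""

def queries_from_seed(html, max_terms=5):
    """Derive a simple search query from the most frequent terms of a seed page."""
    words = re.findall(r"[a-zA-Z]{6,}", html.lower())
    freq = {}
    for w in words:
        freq[w] = freq.get(w, 0) + 1
    top = sorted(freq, key=freq.get, reverse=True)[:max_terms]
    return [" ".join(top)]

def prefilter(html):
    """Fast check: keep only pages that look suspicious enough for deep analysis."""
    return any(re.search(p, html, re.IGNORECASE) for p in SUSPICIOUS)

def guided_crawl(seed_urls, search, deep_analyzer):
    candidates = set()
    for seed in seed_urls:
        for q in queries_from_seed(fetch(seed)):
            candidates.update(search(q))        # search() is a placeholder backend
    flagged = []
    for url in candidates:
        html = fetch(url)
        if html and prefilter(html) and deep_analyzer(url, html):
            flagged.append(url)
    return flagged
```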

Keywords: web crawler, malwares, seeds, drive-by-downloads, security

Procedia PDF Downloads 209
974 Synthesis and Characterization of Cobalt Oxide and Cu-Doped Cobalt Oxide as Photocatalyst for Model Dye Degradation

Authors: Vrinda P. S. Borker

Abstract:

Major water pollutants are dyes from the effluents of industries. Different methods have been tried to degrade or treat the effluent before it is released into the environment. In order to understand the degradation process and later apply it to effluents, a solar degradation study of methylene blue (MB) and methyl red (MR), the model dyes, was carried out in the presence of photo-catalysts, the oxides of cobalt Co₃O₄ and the copper-doped cobalt oxides (Co₀.₉Cu₀.₁)₃O₄ and (Co₀.₉₅Cu₀.₀₅)₃O₄. They were prepared from the oxalate complex and the hydrazinated oxalate complex of cobalt as well as of the mixed metals copper and cobalt. The complexes were synthesized and characterized by FTIR. The complexes were decomposed to form oxides and were characterized by XRD. They were found to be monophasic. Solar degradation of MR and MB was carried out in the presence of these oxides in acidic and basic media. Degradation was faster in alkaline medium in the presence of Co₃O₄ obtained from hydrazinated oxalate. Doping of nanomaterial oxides modifies their characteristics. Doped cobalt oxides are found to photo-decolourise MR in alkaline media efficiently. In the absence of a photocatalyst, solar degradation of alkaline MR does not occur. In acidic medium, MR is only minimally decolorized, even in the presence of photocatalysts. The industrial textile effluent contains chemicals like NaCl and Na₂CO₃ along with the unabsorbed dye. It is reported that these two chemicals hamper the degradation of the dye, while the chemicals K₂S₂O₈ and H₂O₂ are reported to enhance degradation. The solar degradation study of MB in the presence of the photocatalyst (Co₀.₉Cu₀.₁)₃O₄ and these four chemicals reveals that the presence of K₂S₂O₈ and H₂O₂ enhances degradation. It shows that H₂O₂ generates the hydroxyl ions required for the degradation of the dye, and that the sulphate anion radical, being a strong oxidant, attacks dye molecules, leading to their rapid fragmentation. Thus, the addition of K₂S₂O₈ and H₂O₂ during solar degradation in the presence of (Co₀.₉Cu₀.₁)₃O₄ helps to break the organic moiety efficiently.

Keywords: cobalt oxides, Cu-doped cobalt oxides, H₂O₂ in dye degradation, photo-catalyst, solar dye degradation

Procedia PDF Downloads 149
973 Harmonizing Spatial Plans: A Methodology to Integrate Sustainable Mobility and Energy Plans to Promote Resilient City Planning

Authors: B. Sanchez, D. Zambrana-Vasquez, J. Fresner, C. Krenn, F. Morea, L. Mercatelli

Abstract:

Local administrations are facing established targets on sustainable development from different disciplines at the heart of different city departments. Nevertheless, some of these targets, such as CO2 reduction, relate to two or more disciplines, as is the case with sustainable mobility and energy plans (SUMP & SECAP/SEAP). This opens up the possibility of efficient cooperation among different city departments and of creating and developing harmonized spatial plans by using available resources and together achieving more ambitious goals in cities. The steps of the harmonization process developed result in the identification of areas in which to achieve common strategic objectives. Harmonization, in other words, helps different departments in local authorities to work together and optimize the use of resources by sharing the same vision, involving key stakeholders, and promoting common data assessment to better optimize the resources. A methodology to promote resilient city planning via the harmonization of sustainable mobility and energy plans is presented in this paper. In order to validate the proposed methodology, a representative city engaged in an innovation process in efficient spatial planning is used as a case study. The harmonization process for sustainable mobility and energy plans covers identifying matching targets between the different fields, developing different spatial plans with dual benefit, and establishing common indicators guaranteeing the continuous improvement of the harmonized plans. The proposed methodology supports local administrations in consistent spatial planning, considering both energy efficiency and sustainable mobility. Thus, municipalities can use their human and economic resources efficiently. This guarantees an efficient upgrade of land use plans integrating energy and mobility aspects in order to achieve sustainability targets, as well as to improve the wellbeing of their citizens.

Keywords: integrated multi-sector planning, spatial plans harmonization, sustainable energy and climate action plan, sustainable urban mobility plan

Procedia PDF Downloads 148
972 Integration of LCA and BIM for Sustainable Construction

Authors: Laura Álvarez Antón, Joaquín Díaz

Abstract:

The construction industry is turning towards sustainability. It is a well-known fact that sustainability is based on a balance between environmental, social and economic aspects. In order to achieve sustainability efficiently, these three criteria should be taken into account in the initial project phases, since that is when a project can be influenced most effectively. Thus the aim must be to integrate important tools like BIM and LCA at an early stage in order to make full use of their potential. With the synergies resulting from the integration of BIM and LCA, a wider approach to sustainability becomes possible, covering the three pillars of sustainability.

Keywords: building information modeling (BIM), construction industry, design phase, life cycle assessment (LCA), sustainability

Procedia PDF Downloads 420
971 Development and Evaluation of a Portable Ammonia Gas Detector

Authors: Jaheon Gu, Wooyong Chung, Mijung Koo, Seonbok Lee, Gyoutae Park, Sangguk Ahn, Hiesik Kim, Jungil Park

Abstract:

In this paper, we present a portable ammonia gas detector for performing gas safety management efficiently. The display of the detector is separated from its body. The display module receives the data measured by the detector using ZigBee. The detector has a rechargeable Li-ion battery which can be used for 11–12 hours, and a Bluetooth module for sending the data to a PC or smart devices. The data are sent to the server and can be accessed using a web browser or a mobile application. The detection concentration range is 0–100 ppm.

Keywords: ammonia, detector, gas, portable

Procedia PDF Downloads 387
970 Coordinated Multi-Point Scheme Based on Channel State Information in MIMO-OFDM System

Authors: Su-Hyun Jung, Chang-Bin Ha, Hyoung-Kyu Song

Abstract:

Recently, increasing the quality of experience (QoE) has become an important issue. Since performance degradation at the cell edge severely reduces the QoE, several techniques are defined in the LTE/LTE-A standard to remove inter-cell interference (ICI). However, the conventional techniques have a disadvantage because there is a trade-off between resource allocation and reliable communication. The proposed scheme reduces the ICI more efficiently by using channel state information (CSI) smartly. It is shown that the proposed scheme can reduce the ICI with fewer resources.

Keywords: adaptive beamforming, CoMP, LTE-A, ICI reduction

Procedia PDF Downloads 437
969 A Non-Parametric Analysis of District Disaster Management Authorities in Punjab, Pakistan

Authors: Zahid Hussain

Abstract:

The Provincial Disaster Management Authority (PDMA) Punjab was established under the NDM Act 2010 and, now working under the Senior Member Board of Revenue, deals with the whole spectrum of disasters, including preparedness, mitigation, early warning, response, relief, rescue, recovery and rehabilitation. The District Disaster Management Authorities (DDMAs) act as the implementing arms of the PDMA in the districts to respond to any disaster. The DDMAs' role is very important in disaster mitigation, response and recovery, as they are the first responders and the closest tier to the community. Keeping in view the significant role of the DDMAs, their technical and human resource capacities need to be checked. For calculating the technical efficiencies of the District Disaster Management Authorities (DDMAs) in Punjab, three inputs (number of labourers, number of transportation vehicles and number of equipment items), two outputs (relief assistance and number of rescues) and 25 districts as decision-making units have been selected. For this purpose, 8 years of secondary data, from 2005 to 2012, have been used. The Data Envelopment Analysis technique has been applied. DEA estimates the relative efficiency of peer entities, or entities performing similar tasks. The findings show that all decision-making units (DMUs, i.e. districts) are inefficient on the technological and scale efficiency scales, while technically efficient on the pure and total factor productivity efficiency scales. All DMUs are found technically inefficient only in the year 2006. Labour and equipment were not efficiently used in the years 2005, 2007, 2008, 2009 and 2012. Furthermore, only three years, 2006, 2010 and 2011, show that districts could not efficiently use transportation in a disaster situation. This study suggests that all districts should curtail labour, transportation and equipment to be efficient. Similarly, overall, all districts are not required to increase the number of rescues and relief assistance; these should be reduced.

Keywords: DEA, DMU, PDMA, DDMA

Procedia PDF Downloads 218
968 Transformation of ectA Gene From Halomonas elongata in Tomato Plant

Authors: Narayan Moger, Divya B., Preethi Jambagi, Krishnaveni C. K., Apsana M. R., B. R. Patil, Basvaraj Bagewadi

Abstract:

Salinity is one of the major threats to world food security. Considering the requirement for salt-tolerant crop plants, the present study was undertaken to clone and transfer the salt-tolerant ectA gene from a marine ecosystem into an agricultural crop system to impart salinity tolerance. Ectoine is a compatible solute, accumulating in the cell, which is known to be involved in salt tolerance activity in most halophiles. The present situation calls for the development of salt-tolerant transgenic lines to combat abiotic stress. Against this background, the investigation was conducted to develop transgenic tomato lines by cloning and transferring the ectA gene, whose product is capable of enzymatic action for the production of acetyl-diaminobutyric acid, an ectoine precursor. The ectA gene is involved in maintaining the osmotic balance of plants. The PCR-amplified ectA gene (579 bp) was cloned into the T/A cloning vector (pTZ57R/T). The construct pDBJ26 containing the ectA gene was sequenced using gene-specific forward and reverse primers. The sequence was analyzed using the BLAST algorithm to check the similarity of the ectA gene with other isolates. The highest homology, of 99.66 per cent, was found with ectA gene sequences of Halomonas elongata isolates in the NCBI database. The ectA gene was further sub-cloned into the pRI101-AN plant expression vector and transferred into E. coli DH5α for its maintenance. Further, pDNM27 was mobilized into A. tumefaciens LBA4404 through a tri-parental mating system. The recombinant Agrobacterium containing pDNM27 was transferred into tomato plants through the in planta plant transformation method. Out of 300 co-cultivated seedlings, only twenty-seven plants were able to establish well under greenhouse conditions. Among the twenty-seven transformants, only twelve plants showed amplification with gene-specific primers. Further work must be extended to evaluate the transformants in the T1 and T2 generations for ectoine accumulation, salinity tolerance, plant growth and development, and yield.

Keywords: salinity, compatible solutes, ectA, transgenic, in planta transformation

Procedia PDF Downloads 56
967 Detecting Black Hole Attacks in Body Sensor Networks

Authors: Sara Alshehri, Bayan Alenzi, Atheer Alshehri, Samia Chelloug, Zainab Almry, Hussah Albugmai

Abstract:

This paper concerns body area sensor networks that collect signals around a human body. Black hole attacks are the main security challenge because the data traffic can be dropped at any node. The focus of our proposed solution is to efficiently route data packets while detecting black hole nodes.

Keywords: body sensor networks, security, black hole, routing, broadcasting, OMNeT++

Procedia PDF Downloads 613
966 Performance and Specific Emissions of an SI Engine Using Anhydrous Ethanol–Gasoline Blends in the City of Bogota

Authors: Alexander García Mariaca, Rodrigo Morillo Castaño, Juan Rolón Ríos

Abstract:

The government of Colombia has promoted the use of biofuels in the last 20 years through laws and resolutions, which regulate their use, with the objective of improving atmospheric air quality and promoting the Colombian agricultural industry. However, despite the use of blends of biofuels with fossil fuels, the air quality in large cities has not improved. This deterioration of the air is mainly caused by mobile sources working with spark ignition internal combustion engines (SI-ICE) operating with a mixture, by volume, of 90% gasoline and 10% ethanol, called E10, which in the case of Bogota represent 84% of the fleet. Another problem is that Colombia has big cities located above 2200 masl, and there are no accurate studies on the impact that the E10 mixture could have on the emissions and performance of SI-ICE. This study aims to establish the optimal blend of gasoline and ethanol with which an SI engine operates most efficiently in urban centres located at 2600 masl. The tests were carried out on a four-stroke, single-cylinder, naturally aspirated SI engine with a carburettor for the fuel supply, using blends of gasoline and anhydrous ethanol in different ratios: E10, E15, E20, E40, E60, E85 and E100. These tests were conducted in the city of Bogota, which is located at 2600 masl, with the engine operating at 3600 rpm and at 25, 50, 75 and 100% of load. The results show that performance variables such as engine brake torque, brake power and brake thermal efficiency decrease, while brake specific fuel consumption increases, as the percentage of ethanol in the mixture rises. On the other hand, the specific emissions of CO2 and NOx increase, while the specific emissions of CO and HC decrease, compared to those produced by gasoline. From the tests, it is concluded that the SI-ICE worked most efficiently with the E40 mixture, with which an increase in brake power of 8.81% and a reduction in brake specific fuel consumption of 2.5% were obtained, coupled with a reduction in the specific emissions of CO2, HC and CO of 9.72, 52.88 and 76.66% respectively, compared to the results obtained with the E10 blend. This behaviour occurs because the E40 mixture provides the appropriate amount of oxygen for the combustion process, which leads to better utilization of the available energy in this process, thus generating a power output comparable to the E10 mixture and producing lower CO and HC emissions than the other test blends. Nevertheless, the emission of NOx increases by 106.25%.

Keywords: emissions, ethanol, gasoline, engine, performance

Procedia PDF Downloads 303
965 Series Solutions to Boundary Value Differential Equations

Authors: Armin Ardekani, Mohammad Akbari

Abstract:

We present a method of generating series solutions to large classes of nonlinear differential equations. The method is well suited to being adapted in mathematical software and, unlike the available commercial solvers, it is capable of generating solutions to boundary value ODEs and PDEs. Many of the generated solutions converge to closed form solutions. Our method can also be applied to systems of ODEs or PDEs, providing all the solutions efficiently. As examples, we present results for many difficult differential equations in engineering fields.
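
To illustrate the basic series-matching idea on a small case (not the authors' method, which targets much larger classes including boundary value ODEs and PDEs), a short sketch that generates Taylor coefficients for the initial value problem y' = x + y, y(0) = 1 by coefficient recursion is given below.

```python
from fractions import Fraction

# Minimal sketch of generating a truncated power-series solution by coefficient
# recursion, illustrated on the linear IVP y' = x + y, y(0) = 1. This only
# shows the series-matching idea, not the paper's more general method.
def series_coefficients(n_terms=8):
    a = [Fraction(1)]                       # a_0 = y(0)
    for n in range(n_terms - 1):
        # match coefficients of x^n in  sum (k+1) a_{k+1} x^k = x + sum a_k x^k
        rhs = a[n] + (1 if n == 1 else 0)
        a.append(Fraction(rhs, n + 1))
    return a

if __name__ == "__main__":
    coeffs = series_coefficients()
    print([str(c) for c in coeffs])
    # the exact solution is y = 2*exp(x) - x - 1, whose Taylor coefficients these match
```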

Keywords: computational mathematics, differential equations, engineering, series

Procedia PDF Downloads 312
964 Developing a Smart Card Using Internet of Things-Uni-C

Authors: Enji E. Alzamzami, Kholod A. Almwallad, Rahaf J. Alwafi, Roaa H. Alansari, Shatha S. Alshehri, Aeshah A. Alsiyami

Abstract:

This paper demonstrates a system that helps solve the congestion problem at entrance gates and limits the spread of viruses, such as COVID-19, among people in crowded environments, using the IoT (Internet of Things). This system may assist in organizing the campus entry process efficiently through a smart card application supported by NFC (Near Field Communication) technology, through which users' information is sent to a reader to be shared with the server, allowing the server to perform its tasks and send a confirmation response for the request, either acceptance or rejection.

Keywords: COVID-19, IoT, NFC technology, smart card

Procedia PDF Downloads 87
963 The Resistance of Fish Outside of Water Medium

Authors: Febri Ramadhan

Abstract:

The water medium is a vital necessity for the survival of fish. Fish can survive outside of the water medium for a certain time. By knowing the survival level of fish outside of the water medium, a person can transport fish to another place more efficiently. Transport of live fish from one place to another can be done with wet and dry media systems. In this experiment, the treatment given was the difference in the observed fish species. This experiment aimed to test the degree of resilience of fish out of the water medium. Based on the ANOVA table obtained, it can be concluded that the type of fish affects the level of resilience of fish outside the water (Fhit > Ftab).

Keywords: fish, transport, retention rate, fish resilience

Procedia PDF Downloads 306
962 Knowledge Capital and Manufacturing Firms’ Innovation Management: Exploring the Impact of Transboundary Investment and Assimilative Capacity

Authors: Suleman Bawa, Ayiku Emmanuel Lartey

Abstract:

Purpose - This paper aims to examine the association between knowledge capital and multinational firms’ innovation management. We also explore the impact of transboundary investment and assimilative capacity on the relationship between knowledge capital and multinational firms’ innovation management. The vital position of knowledge capital and multinational firms’ innovation management in today’s increasingly volatile environment, coupled with fierce competition, has been extensively acknowledged by academics and industry. Design/methodology/approach - A theoretical association model and an empirical correlation analysis were constructed based on relevant research, using data collected from 19 multinational firms in Ghana as the subject, and path analysis was conducted using SPSS 22.0 and AMOS 24.0 to test the formulated hypotheses. Findings - Varied conclusions are drawn from the theoretical inferences and empirical tests. For multinational firms, knowledge capital remains positively significant for firms’ innovation management. Multinational firms with advanced knowledge capital are likely to generate greater innovation management. Second, transboundary investment efficiently mediates the association between knowledge physical capital, knowledge interactive capital, and corporations’ innovation management, while this impact is insignificant for the association between knowledge empirical capital and corporations’ innovation management. Lastly, the impact of transboundary investment and assimilative capacity on the association between knowledge capital and corporations’ innovation management is established. We summarize the implications for managers based on our outcomes. Research limitations/implications - Multinational firms must dynamically build knowledge capital to augment corporations’ innovation management. Conversely, knowledge capital motivates multinational firms to implement transboundary investment and cultivate assimilative capacity. Accordingly, multinational firms can efficiently exploit diverse information to augment their corporate innovation management. Practical implications – This paper presents a comprehensive justification of knowledge capital and manufacturing firms’ innovation management by exploring the impact of transboundary investment and assimilative capacity within the manufacturing industry, its sequential progress, and its associated challenges. Originality/value – This paper is among the first to provide empirical results on knowledge capital and manufacturing firms’ innovation management by exploring the impact of transboundary investment and assimilative capacity within the manufacturing industry. Additionally, positioning knowledge as a coordinative instrument is a significant contribution to our understanding in this area.

Keywords: knowledge capital, transboundary investment, innovation management, assimilative capacity

Procedia PDF Downloads 35
961 Production of Human BMP-7 with Recombinant E. coli and B. subtilis

Authors: Jong Il Rhee

Abstract:

The polypeptide representing the mature part of human BMP-7 was cloned and efficiently expressed in Escherichia coli and Bacillus subtilis, which showed a clear band for hBMP-7, a homodimeric protein with an apparent molecular weight of 15.4 kDa. Recombinant E. coli produced 111 pg hBMP-7/mg of protein through IPTG induction. Recombinant B. subtilis also produced 350 pg hBMP-7/ml of culture medium. The hBMP-7 was purified in 2 steps using an FPLC system with an ion exchange column and a gel filtration column. The hBMP-7 produced in this work also stimulated alkaline phosphatase (ALP) activity in a dose-dependent manner, i.e. 2.5- and 8.9-fold at 100 and 300 ng hBMP-7/ml, respectively, and showed intact biological activity.

Keywords: B. subtilis, E. coli, fermentation, hBMP-7

Procedia PDF Downloads 395
960 On the Construction of Lightweight Circulant Maximum Distance Separable Matrices

Authors: Qinyi Mei, Li-Ping Wang

Abstract:

MDS matrices are of great significance in the design of block ciphers and hash functions. In the present paper, we investigate the problem of constructing MDS matrices which are both lightweight and low-latency. We propose a new method of constructing lightweight MDS matrices using circulant matrices which can be implemented efficiently in hardware. Furthermore, we provide circulant MDS matrices with as few bit XOR operations as possible for the classical dimensions 4 × 4 and 8 × 8 over the space of linear transformations over the finite field F₂⁴. In contrast to previous constructions of MDS matrices, our constructions have achieved fewer XORs.
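
A minimal sketch of verifying the MDS property of a circulant candidate over GF(2⁴) is given below; the reduction polynomial x⁴ + x + 1 and the example first row are illustrative assumptions, and the paper's constructions additionally minimize the bit-XOR cost over linear transformations.

```python
from itertools import combinations, product

# Minimal sketch: check whether a 4x4 circulant matrix over GF(2^4) is MDS.
# The modulus x^4 + x + 1 (0x13) and the example first row are illustrative
# assumptions, not matrices or representations taken from the paper.
MOD = 0x13  # x^4 + x + 1

def gf_mul(a, b):
    r = 0
    while b:
        if b & 1:
            r ^= a
        a <<= 1
        if a & 0x10:
            a ^= MOD
        b >>= 1
    return r

def gf_inv(a):
    return next(x for x in range(1, 16) if gf_mul(a, x) == 1)

def det(matrix):
    """Determinant over GF(2^4) by Gaussian elimination (sign is irrelevant in char 2)."""
    m = [row[:] for row in matrix]
    n, d = len(m), 1
    for col in range(n):
        pivot = next((r for r in range(col, n) if m[r][col]), None)
        if pivot is None:
            return 0
        m[col], m[pivot] = m[pivot], m[col]
        d = gf_mul(d, m[col][col])
        inv = gf_inv(m[col][col])
        for r in range(col + 1, n):
            factor = gf_mul(m[r][col], inv)
            for c in range(col, n):
                m[r][c] ^= gf_mul(factor, m[col][c])
    return d

def circulant(first_row):
    n = len(first_row)
    return [[first_row[(c - r) % n] for c in range(n)] for r in range(n)]

def is_mds(matrix):
    """A matrix is MDS iff every square submatrix is nonsingular."""
    n = len(matrix)
    for k in range(1, n + 1):
        for rows, cols in product(combinations(range(n), k), repeat=2):
            sub = [[matrix[r][c] for c in cols] for r in rows]
            if det(sub) == 0:
                return False
    return True

if __name__ == "__main__":
    # circ(1, 1, 2, 4) is an illustrative candidate first row, not one from the paper
    print(is_mds(circulant([0x1, 0x1, 0x2, 0x4])))
```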

Keywords: linear diffusion layer, circulant matrix, lightweight, maximum distance separable (MDS) matrix

Procedia PDF Downloads 380