Search results for: system analysis
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 38934


35544 Kinematic Analysis of Heel Height Effect on Knee Direction Correction in a Patient with Genu Recurvatum: A Case Study

Authors: Parya Salimitari, Farhad Tabatabai Ghomsheh, Siyamak Khorramymehr, Hossein Taghadosi, Mohammad Hossein Dashti

Abstract:

The aim of this study was to evaluate the effect of heel height on knee joint direction in genu recurvatum patients compared to the normal state. The test was performed on a patient with genu recurvatum and a healthy person with similar, matched biomechanical conditions. Subjects were tested during gait under six different shoe conditions with heel heights of 0, 1, 2, 3, 4 and 5 cm after marker placement. Spatial-temporal data obtained from a Vicon motion-capture system (six-camera T10 model, Oxford Metrics Ltd., Oxford, UK) were used to compute and analyze the kinematic results. In this study, we tried to determine the effect of the shoe-heel intervention on knee joint direction correction. The results indicate that the 1 cm heel is optimal and significantly improves the knee flexion and flexion-extension angle, such that the difference in knee flexion-extension angle between the patient and the healthy person reached zero (good posture) at some stages of walking. The 3 cm heel, compared with the 0 cm heel, reduced the knee recurvatum index (KRI) by up to 21.74% in the patient (from 219.233 mm to 47.6714 mm). According to the findings of this study, it can be concluded that increasing heel height is effective in correcting knee joint alignment in genu recurvatum, and the optimal heel height is 1 cm.
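As a rough illustration of the kinematic quantity discussed above, the sketch below computes a knee flexion-extension angle from 3-D marker positions. It is a minimal, assumption-laden example (made-up marker coordinates, no sagittal-plane sign convention), not the Vicon processing pipeline used in the study.

```python
import numpy as np

def knee_flexion_angle(hip, knee, ankle):
    """Unsigned knee flexion-extension angle (degrees) from 3-D marker positions.

    0 deg corresponds to full extension; a signed convention (needed to separate
    flexion from hyperextension/recurvatum) would require projecting the vectors
    onto the sagittal plane, which is omitted in this sketch.
    """
    thigh = np.asarray(hip) - np.asarray(knee)    # knee -> hip vector
    shank = np.asarray(ankle) - np.asarray(knee)  # knee -> ankle vector
    cos_a = np.dot(thigh, shank) / (np.linalg.norm(thigh) * np.linalg.norm(shank))
    included = np.degrees(np.arccos(np.clip(cos_a, -1.0, 1.0)))
    return 180.0 - included

# Compare patient vs. healthy subject at one instant of the gait cycle (toy coordinates)
patient = knee_flexion_angle([0.0, 0.90, 1.00], [0.0, 0.50, 0.52], [0.0, 0.10, 0.08])
healthy = knee_flexion_angle([0.0, 0.90, 1.00], [0.0, 0.50, 0.50], [0.0, 0.10, 0.05])
print(f"flexion-extension difference: {patient - healthy:.1f} deg")
```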

Keywords: joint alignment of knee, gait analysis, genu recurvatum, heel lift, kinematics, motion-analysis

Procedia PDF Downloads 194
35543 Design of Collection and Transportation System of Municipal Solid Waste in Meshkinshahr City

Authors: Ebrahim Fataei, Seyed Ali Hosseini, Zahra Arabi, Habib Farhadi, Mehdi Aalipour Erdi, Seiied Taghi Seiied Safavian

Abstract:

Solid waste production is an integral part of human life, and waste management requires a fully scientific approach and careful planning. Most of the management cost is allocated to collection and transportation, and operational efficiency in this system depends on limiting time consumption; an optimal collection and transportation system is therefore the basis of waste management design. This study was done to optimize the existing collection and transportation system for solid waste in Meshkinshahr city. Based on the analyzed data on municipal solid waste components in seven zones of Meshkinshahr city, GIS software was applied to design storage locations oriented to recycling at the source and to design collection and transport routes. The aim was to present an appropriate model to store, collect and transport municipal solid waste. The results show that GIS can be applied to locate waste containers and to determine waste collection routes in an appropriate way.
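The route-design step can be pictured with a toy shortest-path computation over a weighted road graph, as sketched below. The node names and distances are invented for illustration, and the graph library stands in for the network-analysis functions of a GIS package; this is not the Meshkinshahr dataset.

```python
import networkx as nx

# Toy road network: nodes are the depot and container sites, edge weights are
# travel distances (km). Names and distances are illustrative only.
G = nx.Graph()
G.add_weighted_edges_from([
    ("depot", "zone1", 2.1), ("zone1", "zone2", 1.4), ("zone2", "zone3", 2.7),
    ("depot", "zone3", 5.0), ("zone1", "zone3", 3.3),
])

# Shortest collection route from the depot to each zone (Dijkstra on weights),
# analogous to the route design a GIS network-analysis tool performs.
for zone in ["zone1", "zone2", "zone3"]:
    path = nx.shortest_path(G, "depot", zone, weight="weight")
    dist = nx.shortest_path_length(G, "depot", zone, weight="weight")
    print(zone, path, f"{dist:.1f} km")
```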

Keywords: municipal solid waste management, transportation, optimizing, GIS, Iran

Procedia PDF Downloads 520
35542 A Multi-Criteria Decision Support System for Migrating Legacies into Open Systems

Authors: Nasser Almonawer

Abstract:

Timely reaction to an evolving global business environment and volatile market conditions necessitates system and process flexibility, which in turn demands agile and adaptable architectures and a steady infusion of affordable new technologies. In contrast, a large number of organizations rely on systems characterized by inflexible and obsolete legacy architectures. To respond effectively to dynamic contemporary business environments, such architectures must be migrated to robust and modular open architectures. To this end, this paper proposes an integrated decision support system for a seamless migration to open systems. The proposed decision support system (DSS) integrates three well-established quantitative and qualitative decision-making models, namely the Delphi method, the Analytic Hierarchy Process (AHP) and Goal Programming (GP), to (1) assess risks and establish evaluation criteria; (2) formulate the migration strategy and rank candidate systems; and (3) allocate resources among the selected systems.
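To make the AHP step concrete, a minimal sketch of how priority weights and a consistency check could be derived from one pairwise comparison matrix is given below. The three criteria and the judgement values are invented for illustration; the paper's actual Delphi/AHP/GP integration is not reproduced here.

```python
import numpy as np

# Pairwise comparison matrix on Saaty's 1-9 scale for three illustrative
# migration criteria: cost, risk, interoperability (values are made up).
A = np.array([[1.0, 3.0, 0.5],
              [1/3, 1.0, 0.25],
              [2.0, 4.0, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)                 # principal eigenvalue index
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                                # normalized priority weights

# Saaty consistency check: CI = (lambda_max - n) / (n - 1), CR = CI / RI.
n = A.shape[0]
lambda_max = eigvals.real[k]
ci = (lambda_max - n) / (n - 1)
cr = ci / 0.58                              # random index RI = 0.58 for n = 3
print("weights:", np.round(w, 3), "CR:", round(cr, 3))
```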

Keywords: decision support systems, open systems architecture, analytic hierarchy process (AHP), goal programming (GP), delphi method

Procedia PDF Downloads 22
35541 An Abductive Approach to Policy Analysis: Policy Analysis as Informed Guessing

Authors: Adrian W. Chew

Abstract:

This paper argues that education policy analysis tends to be steered towards empiricist-oriented approaches, which place emphasis on objective and measurable data. It further argues that empiricist-oriented approaches are generally based on inductive and/or deductive reasoning, which are unable to generate new ideas/knowledge. The paper outlines the logical structure of induction, deduction, and abduction, and argues that only abduction provides possibilities for the creation of new ideas/knowledge. It proposes the neologism ‘informed guessing’ as a reformulation of abduction, and also as an approach to education policy analysis. On one side, the signifier ‘informed’ encapsulates the idea that abductive policy analysis needs to be informed by descriptive conceptualization theory to be able to make relations and connections between, and within, observed phenomena and unobservable general structures. On the other side, the signifier ‘guessing’ captures the cyclical and unsystematic process of abduction. The paper ends with a brief example of utilising ‘informed guessing’ for a policy analysis of school choice lotteries in the United States.

Keywords: abductive reasoning, empiricism, informed guessing, policy analysis

Procedia PDF Downloads 342
35540 Learners' Difficulties in Acquiring English: The Case of Native Speakers of Rio de la Plata Spanish, Towards Justifying the Need for Corpora

Authors: Maria Zinnia Bardas Hoffmann

Abstract:

Contrastive Analysis (CA) is the systematic comparison between two languages. It stems from the notion that errors are caused by interference of the L1 system in the process of acquiring an L2. CA represents a useful tool to understand the nature of learning and acquisition. This particular method also promises a path to understand the nature of underlying cognitive processes, even when other factors such as intrinsic motivation and teaching strategies were found to best explain students' problems in acquisition. The study of CA is justified not only by the need to gain a deeper understanding of the nature of SLA, but also as an invaluable source of clues, at a cognitive level, about the general processes involved in rule formation and abstract thought. It is relevant for cross-disciplinary studies and the fields of computational thought, natural language processing, applied linguistics, cognitive linguistics and math theory. That said, this paper also addresses the approach's own set of constraints and limitations. Finally, this paper: (a) aims at identifying some of the difficulties students may find in their learning process due to the nature of their specific variety of L1, Rio de la Plata Spanish (RPS); and (b) represents an attempt to discuss the necessity for specific models to approach CA.

Keywords: second language acquisition, applied linguistics, contrastive analysis, applied contrastive analysis, English language department, meta-linguistic rules, cross-linguistic studies, computational thought, natural language processing

Procedia PDF Downloads 138
35539 Validation of a Reloading Vehicle Design by Finite Element Analysis

Authors: Tuğrul Aksoy, Hüseyin Karabıyık

Abstract:

Reloading vehicles are vehicles that are generally equipped with a crane and used to pick up a load at one point and place it onto the vehicle, or vice versa. In this study, structural analysis of a reloading vehicle was performed via the finite element method under the loads it is predicted to experience in operating conditions. Among the finite element analysis results, the stress and displacement distributions of the vehicle and the contact pressure distributions of the guide rings within the stabilization legs were examined. The vehicle design was improved by strengthening certain parts according to the analysis results. The analyses performed for the final design were verified by experiments involving strain gauge measurements.

Keywords: structural analysis, reloading vehicle, crane, strain gauge

Procedia PDF Downloads 58
35538 Performance Evaluation of Fingerprint, Auto-Pin and Password-Based Security Systems in Cloud Computing Environment

Authors: Emmanuel Ogala

Abstract:

Cloud computing has been envisioned as the next-generation architecture of the Information Technology (IT) enterprise. In contrast to traditional solutions, where IT services are under physical, logical and personnel controls, cloud computing moves the application software and databases to large data centres, where the management of the data and services may not be fully trustworthy. This is because the systems are open to the whole world: as legitimate users access the system, many others constantly attempt to gain unauthorized access. This research contributes to the improvement of cloud computing security for better operation. The work is motivated by two problems. First, the observed ease of access to cloud computing resources and the complexity of attacks on vital cloud computing data systems (NIC) require that dynamic security mechanisms evolve to remain capable of preventing illegitimate access. Second, there is a lack of a good methodology for performance testing and evaluation of biometric security algorithms for securing records in a cloud computing environment. The aim of this research was to evaluate the performance of an integrated security system (ISS) for securing exam records in a cloud computing environment. We designed and implemented an ISS consisting of three security mechanisms (biometric fingerprint, auto-PIN and password) combined into one stream of access control and used for securing examination records at Kogi State University, Anyigba. The system we built overcomes the guessing abilities of attackers who guess passwords or PINs, because the added fingerprint mechanism requires the presence of the user before login access can be granted: the user must place a finger on the fingerprint scanner for capture and verification to confirm authenticity. The study adopted a quantitative design and an object-oriented design methodology. In the analysis and design, PHP, HTML5, CSS, Visual Studio, JavaScript, and Web 2.0 technologies were used to implement the ISS model for the cloud computing environment: PHP, HTML5 and CSS were used in conjunction with Visual Studio front-end design tools, MySQL and Access 7.0 were used for the back end, and JavaScript was used for object arrangement and validation of user input for security checks. Finally, the performance of the developed framework was evaluated by comparison with two other existing security systems (auto-PIN and password) within the university, and the results showed that the developed approach (fingerprint) overcomes the two main weaknesses of the existing systems and will work well if fully implemented.

Keywords: performance evaluation, fingerprint, auto-pin, password-based, security systems, cloud computing environment

Procedia PDF Downloads 128
35537 Implementation of Achterbahn-128 for Images Encryption and Decryption

Authors: Aissa Belmeguenai, Khaled Mansouri

Abstract:

In this work, an efficient implementation of Achterbahn-128 for image encryption and decryption is introduced. The implementation for this simulated project is written in MATLAB 7.5. First, two different original images are used to validate the proposed design. Then our developed program is used to transform the original image data into an image digits file. Finally, we use our implemented program to encrypt and decrypt the image data. Several tests are performed to prove the design performance, including visual tests and security analysis; we discuss the security analysis of the proposed image encryption scheme, including important aspects such as key sensitivity analysis, key space analysis, and statistical attacks.
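The overall stream-cipher workflow and the key-sensitivity idea can be sketched as below. The keystream generator here is a seeded PRNG placeholder, not Achterbahn-128's NLFSR construction, and the image and keys are synthetic; the sketch only illustrates XOR-based encryption and an NPCR-style key-sensitivity check.

```python
import numpy as np

def keystream(key: bytes, n: int) -> np.ndarray:
    """Stand-in keystream generator (NOT Achterbahn-128; for illustration only).

    A real implementation would clock Achterbahn's NLFSRs; here a seeded PRNG
    simply shows how a stream cipher is applied to image data.
    """
    rng = np.random.default_rng(int.from_bytes(key, "big"))
    return rng.integers(0, 256, n, dtype=np.uint8)

def encrypt(image: np.ndarray, key: bytes) -> np.ndarray:
    flat = image.ravel()
    return (flat ^ keystream(key, flat.size)).reshape(image.shape)  # decryption is the same XOR

# Key-sensitivity check: flipping one key character should change ~99.6% of pixels (NPCR).
img = np.random.default_rng(0).integers(0, 256, (64, 64), dtype=np.uint8)
c1 = encrypt(img, b"sixteen byte key")
c2 = encrypt(img, b"sixteen byte kez")   # one character changed
npcr = np.mean(c1 != c2) * 100
print(f"NPCR between the two ciphertexts: {npcr:.2f}%")
```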

Keywords: Achterbahn-128, stream cipher, image encryption, security analysis

Procedia PDF Downloads 521
35536 Automatic and Highly Precise Modeling for System Optimization

Authors: Stephanie Chen, Mitja Echim, Christof Büskens

Abstract:

To describe and propagate the behavior of a system, mathematical models are formulated. Parameter identification is used to adapt the coefficients of the underlying laws of science. For complex systems this approach can be incomplete, and hence imprecise, and moreover too slow to be computed efficiently. Therefore, such models might not be applicable to the numerical optimization of real systems, since these techniques require numerous evaluations of the models. Moreover, not all quantities necessary for the identification might be available, and hence the model must be adapted manually. Therefore, an approach is described that generates models which overcome the aforementioned limitations by focusing not on physical laws, but on measured (sensor) data of real systems. The approach is more general, since it generates models for any system detached from the scientific background. Additionally, this approach can be used in a more general sense, since it is able to automatically identify correlations in the data. The method can be classified as a multivariate data regression analysis. In contrast to many other data regression methods, this variant is also able to identify correlations of products of variables and not only of single variables. This enables a far more precise and better representation of causal correlations. The basis and explanation of this method come from an analytical background: the series expansion. Another advantage of this technique is the possibility of real-time adaptation of the generated models during operation. In this way, system changes due to aging, wear or perturbations from the environment can be taken into account, which is indispensable for realistic scenarios. Since these data-driven models can be evaluated very efficiently and with high precision, they can be used in mathematical optimization algorithms that minimize a cost function, e.g. time, energy consumption, operational costs or a mixture of them, subject to additional constraints. The proposed method has successfully been tested in several complex applications with strong industrial requirements. The generated models were able to simulate the given systems with an error in precision of less than one percent. Moreover, the automatic identification of correlations was able to discover so far unknown relationships. To summarize, the above-mentioned approach is able to efficiently compute highly precise and real-time-adaptive data-based models in different fields of industry. Combined with an effective mathematical optimization algorithm like WORHP (We Optimize Really Huge Problems), several complex systems can now be represented by a high-precision model to be optimized according to the user's wishes. The proposed methods will be illustrated with different examples.
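A minimal sketch of the underlying idea, regression on an expanded basis that includes products of variables, is given below using a generic polynomial-feature expansion. The data, the hidden law and the library choice are illustrative assumptions; the authors' actual series-expansion method and the WORHP coupling are not reproduced.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import PolynomialFeatures

# Synthetic sensor data: two measured inputs and an output governed by a hidden
# law that contains a cross term x1*x2 (data and law are invented).
rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, (500, 2))
y = 2.0 * X[:, 0] - 0.5 * X[:, 1] + 3.0 * X[:, 0] * X[:, 1]
y += 0.01 * rng.normal(size=500)                       # sensor noise

# Expand inputs with squares and the cross term, then fit a linear model; the
# recovered coefficients expose the product-of-variables correlation.
poly = PolynomialFeatures(degree=2, include_bias=False)
model = LinearRegression().fit(poly.fit_transform(X), y)
print(dict(zip(poly.get_feature_names_out(["x1", "x2"]), model.coef_.round(3))))
```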

Keywords: adaptive modeling, automatic identification of correlations, data based modeling, optimization

Procedia PDF Downloads 395
35535 Analysis of CO₂ Capture Products from Carbon Capture and Utilization Plant

Authors: Bongjae Lee, Beom Goo Hwang, Hye Mi Park

Abstract:

CO₂ capture products manufactured through a Carbon Capture and Utilization (CCU) plant that collects CO₂ directly from power plants require accurate measurement of the amount of CO₂ captured. For this purpose, two weight-loss tests were carried out, and one sample was analyzed using a carbon dioxide quantification device. First, the ignition loss analysis was performed by measuring the weight of the sample at 550°C after the first conversion and then confirming the loss when ignited at 950°C. Second, in the thermogravimetric analysis, the measurement was divided into two sections, 40 to 500°C and 500 to 800°C, to confirm the reduction. The results of the ignition loss analysis and the thermogravimetric analysis were confirmed to be almost similar. However, the temperature of the ignition loss method was 950°C, which was 150°C higher than the 800°C used in the thermogravimetric method, so the measured weight loss was 3 to 4% higher with the ignition loss method. In addition, the tendency of the CO₂ content to increase as the reaction time becomes longer was similarly confirmed. Third, the results of the wet titration method using the carbon dioxide quantification device were found to be significantly lower than those of the weight-loss methods. Therefore, based on the results obtained through the above three analysis methods, we will establish a method to analyze the accurate amount of CO₂. Acknowledgements: This work was supported by the Korea Institute of Energy Technology Evaluation and Planning (No. 20152010201850).
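The weight-loss arithmetic itself is straightforward; a small sketch is given below with invented masses (not the study's measurements) to show how the CO₂-attributed loss between the two hold temperatures would be computed.

```python
def loss_on_ignition(mass_550_g: float, mass_950_g: float) -> float:
    """Percentage mass loss between the 550 degC and 950 degC holds.

    For carbonated CCU products this loss is commonly attributed to CO2 released
    by carbonate decomposition; the masses below are illustrative only.
    """
    return (mass_550_g - mass_950_g) / mass_550_g * 100.0

print(f"CO2-attributed weight loss: {loss_on_ignition(10.00, 8.65):.1f} %")
```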

Keywords: carbon capture and utilization, CCU, CO2, CO2 capture products, analysis method

Procedia PDF Downloads 208
35534 Development of the Ontology of Engineering Design Complexity

Authors: Victor E. Lopez, L. Dale Thomas

Abstract:

As engineered systems become more complex, the difficulty associated with predicting, developing, and operating engineered systems also increases, resulting in increased costs, failure rates, and unexpected consequences. Successfully managing the complexity of the system should reduce these negative consequences. The study of complexity in the context of engineering development has suffered due to the ambiguity of the nature of complexity: what makes a system complex and how complexity translates to real-world engineering attributes and consequences. This paper argues that the use of an ontology of engineering design complexity would (i) improve the clarity of the research being performed by allowing researchers to use a common conceptualization of complexity, with more precise terminology, and (ii) elucidate the connections between certain types of complexity and their consequences for system development. The ontology comprises concepts of complexity found in the literature and the different relations that exist between them. The ontology maps different complexity concepts, such as structural complexity, creation complexity, and information entropy, and then relates them to system aspects such as interfaces, development effort, and modularity. The ontology is represented using the Web Ontology Language (OWL). This paper presents the current status of the ontology of engineering design complexity, the main challenges encountered, and the future plans for the ontology.

Keywords: design complexity, ontology, design effort, complexity ontology

Procedia PDF Downloads 174
35533 Development of a French to Yorùbá Machine Translation System

Authors: Benjamen Nathaniel, Eludiora Safiriyu Ijiyemi, Egume Oneme Lucky

Abstract:

A review of machine translation systems shows that considerable computational work has been carried out to translate written or spoken texts from a source language into the Yorùbá language through machine translation systems. However, there is no work on a French-to-Yorùbá machine translation system; hence, the study investigated the processes involved in translating French into its Yorùbá equivalent, with a view to adopting a rule-based MT approach to build a machine translation framework from simple sentences administered through a questionnaire. Articles and relevant textbooks were reviewed, and key speakers of both languages were interviewed to find out the processes involved in translating French simple sentences and their equivalents in the Yorùbá language, using home-domain terminology. To achieve this, a model was formulated using phrase structure grammar, rewrite rules, parse trees and automata-theory-based techniques, and was designed and implemented with the Unified Modeling Language (UML) and the Python programming language, respectively. Analysing the results, the machine translation system performed 18.45% above the experimental subject respondents and 2.7% below the linguistics expert when assessed on word orthography, sentence syntax and semantic correctness of the sentences. When compared with the Google machine translation system, the developed system performed better on the lexicon of the target language.
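A toy sketch of the rule-based transfer idea (parse a covered simple-sentence pattern, then apply a structural rewrite rule and lexical substitution) is shown below. The lexicon entries are placeholders rather than validated French-Yorùbá translations, and the single grammar rule is an assumption made for illustration.

```python
# Minimal rule-based transfer sketch (illustrative only; the lexicon entries are
# placeholders, not a validated French-Yorùbá dictionary, and the grammar covers
# just one simple-sentence pattern).
LEXICON = {"l'enfant": "<YO:child>", "boit": "<YO:drink>", "l'eau": "<YO:water>"}

# Phrase-structure rule for the covered pattern:  S -> NP(subject) V NP(object)
def parse_simple_sentence(tokens):
    if len(tokens) != 3:
        raise ValueError("only S -> NP V NP simple sentences are covered")
    return {"S": {"NP_subj": tokens[0], "V": tokens[1], "NP_obj": tokens[2]}}

# Transfer rewrite rule: both languages use SVO order for this pattern, so the
# structure is copied and each leaf is replaced via the lexicon.
def transfer(tree):
    s = tree["S"]
    return [LEXICON[s["NP_subj"]], LEXICON[s["V"]], LEXICON[s["NP_obj"]]]

print(" ".join(transfer(parse_simple_sentence(["l'enfant", "boit", "l'eau"]))))
```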

Keywords: machine translation (MT), rule-based, French language, Yorùbá language

Procedia PDF Downloads 59
35532 An Improved Robust Algorithm Based on Cubature Kalman Filter for Single-Frequency Global Navigation Satellite System/Inertial Navigation Tightly Coupled System

Authors: Hao Wang, Shuguo Pan

Abstract:

The Global Navigation Satellite System (GNSS) signal received by a dynamic vehicle in a harsh environment is frequently interfered with and blocked, which generates gross errors affecting the positioning accuracy of GNSS/Inertial Navigation System (INS) integrated navigation. Therefore, this paper puts forward an improved robust Cubature Kalman Filter (CKF) algorithm for ambiguity resolution in a single-frequency GNSS/INS tightly coupled system. Firstly, the dynamic model and measurement model of a single-frequency GNSS/INS tightly coupled system are established, and the method for INS-aided GNSS integer ambiguity resolution is studied. Then, we analyze the influence of pseudo-range observations with gross errors on GNSS/INS integrated positioning accuracy. To reduce the influence of outliers, this paper improves the CKF algorithm and realizes an intelligent selection of robust strategies by judging the ill-conditioned matrix. Finally, a field navigation test was performed to demonstrate the effectiveness of the proposed algorithm based on the double-differenced solution mode. The experiment proved that the improved robust algorithm can greatly weaken the influence of separate, continuous, and hybrid observation anomalies, enhancing the reliability and accuracy of GNSS/INS tightly coupled navigation solutions.
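For readers unfamiliar with the CKF, the sketch below generates the 2n cubature points of the standard third-degree spherical-radial rule from a state estimate and covariance. The toy three-state example is an assumption for illustration; propagating the points through the GNSS/INS dynamic and pseudo-range models, and the paper's robust strategy selection, are not shown.

```python
import numpy as np

def cubature_points(x_hat, P):
    """Generate the 2n cubature points used by the CKF time/measurement update.

    Points are x_hat + sqrt(n) * S[:, i] and x_hat - sqrt(n) * S[:, i], where S is
    a Cholesky factor of P; each point carries weight 1/(2n). A full GNSS/INS
    filter would propagate these through the system models; this sketch stops here.
    """
    n = len(x_hat)
    S = np.linalg.cholesky(P)                 # P = S S^T
    scale = np.sqrt(n)
    pts = [x_hat + scale * S[:, i] for i in range(n)]
    pts += [x_hat - scale * S[:, i] for i in range(n)]
    return np.array(pts)                      # shape (2n, n)

x_hat = np.zeros(3)                           # toy 3-state example
P = np.diag([1.0, 0.5, 0.1])
print(cubature_points(x_hat, P).round(3))
```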

Keywords: GNSS/INS integrated navigation, ambiguity resolution, Cubature Kalman filter, Robust algorithm

Procedia PDF Downloads 85
35531 Tagging a Corpus of Media Interviews with Diplomats: Challenges and Solutions

Authors: Roberta Facchinetti, Sara Corrizzato, Silvia Cavalieri

Abstract:

Increasing interconnection between data digitalization and linguistic investigation has given rise to unprecedented potentialities and challenges for corpus linguists, who need to master IT tools for data analysis and text processing, as well as to develop techniques for efficient and reliable annotation in specific mark-up languages that encode documents in a format that is both human- and machine-readable. In the present paper, the challenges emerging from the compilation of a linguistic corpus are taken into consideration, focusing on the English language in particular. To do so, the case study of the InterDiplo corpus is illustrated. The corpus, currently under development at the University of Verona (Italy), represents a novelty in terms both of the data included and of the tag set used for its annotation. The corpus covers media interviews and debates with diplomats and international operators conversing in English with journalists who do not share the same lingua-cultural background as their interviewees. To date, this appears to be the first tagged corpus of international institutional spoken discourse and will be an important database not only for linguists interested in corpus analysis but also for experts operating in international relations. In the present paper, special attention is dedicated to the structural mark-up, parts-of-speech annotation, and tagging of discursive traits, which are the innovative parts of the project, being the result of a thorough study to find the solution best suited to the analytical needs of the data. Several aspects are addressed, with special attention to the tagging of the speakers' identity, the communicative events, and anthropophagic. Prominence is given to the annotation of question/answer exchanges to investigate the interlocutors' choices and how such choices impact communication. Indeed, the automated identification of questions, in relation to the expected answers, is functional to understanding how interviewers elicit information as well as how interviewees provide their answers to fulfill their respective communicative aims. A detailed description of the aforementioned elements is given using the InterDiplo-Covid19 pilot corpus. The preliminary analysis of the data highlights the viable solutions found in the construction of the corpus in terms of XML conversion, metadata definition, tagging system, and discursive-pragmatic annotation to be included via Oxygen.
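As a rough picture of what structural mark-up of a question/answer exchange can look like, the sketch below builds a tiny XML fragment with Python's standard library. The element and attribute names are invented stand-ins, not the InterDiplo tag set.

```python
import xml.etree.ElementTree as ET

# Sketch of structural mark-up for one question/answer exchange; tag and
# attribute names are illustrative, not the actual InterDiplo scheme.
interview = ET.Element("interview", id="interdiplo_covid19_001")
speakers = ET.SubElement(interview, "speakers")
ET.SubElement(speakers, "speaker", id="JRN1", role="journalist", l1="Italian")
ET.SubElement(speakers, "speaker", id="DPL1", role="diplomat", l1="French")

exchange = ET.SubElement(interview, "exchange", n="1")
q = ET.SubElement(exchange, "utterance", who="JRN1", type="question")
q.text = "How has the pandemic changed multilateral negotiations?"
a = ET.SubElement(exchange, "utterance", who="DPL1", type="answer")
a.text = "It has moved much of the preparatory work online."

print(ET.tostring(interview, encoding="unicode"))
```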

Keywords: spoken corpus, diplomats’ interviews, tagging system, discursive-pragmatic annotation, english linguistics

Procedia PDF Downloads 174
35530 A Comparative Analysis of Zotero and Mendeley Reference Management Software

Authors: Sujit K. Basak

Abstract:

This paper presents a comparison of the Zotero and Mendeley reference management software; the results were drawn by comparing the two packages. The novelty of this paper is the comparative analysis of the software, which showed that Mendeley can import more information from Google Scholar for researchers. This finding can help researchers decide which reference management software to use.

Keywords: analysis, comparative analysis, Zotero, researchers, Mendeley

Procedia PDF Downloads 605
35529 Implementation of Green Deal Policies and Targets in Energy System Optimization Models: The TEMOA-Europe Case

Authors: Daniele Lerede, Gianvito Colucci, Matteo Nicoli, Laura Savoldi

Abstract:

The European Green Deal is the first internationally agreed set of measures to counter climate change and environmental degradation. Besides the main target of reducing emissions by at least 55% by 2030, it sets the target of accompanying European countries through an energy transition to make the European Union a modern, resource-efficient, and competitive net-zero-emissions economy by 2050, decoupling growth from the use of resources and ensuring a fair adaptation of all social categories to the transformation process. While the general purpose of the Green Deal dates back to 2019, strategies and policies keep being developed to cope with recent circumstances and achievements. General long-term measures like the Circular Economy Action Plan, the proposals to shift from fossil natural gas to renewable and low-carbon gases, in particular biomethane and hydrogen, and the proposal to end the sale of gasoline and diesel cars by 2035 will all have significant effects on the evolution of energy supply and demand across the next decades. The interactions between energy supply and demand over long time frames are usually assessed via energy system models to derive useful insights for policymaking and to address technological choices and research and development. TEMOA-Europe is a newly developed energy system optimization model instance based on the minimization of the total cost of the system under analysis, adopting a technologically integrated, detailed, and explicit formulation and considering the evolution of the system in partial equilibrium in competitive markets with perfect foresight. TEMOA-Europe is developed on the TEMOA platform, an open-source modeling framework implemented entirely in Python, thereby ensuring third-party verification even of large and complex models. TEMOA-Europe is based on a single-region representation of the European Union and EFTA countries on a time scale between 2005 and 2100, relying on a set of assumptions for socio-economic developments based on projections by the International Energy Outlook and a large technological dataset covering 7 sectors: the upstream and power sectors for the production of all energy commodities, and the end-use sectors, including industry, transport, residential, commercial and agriculture. TEMOA-Europe also includes an updated hydrogen module considering its production, storage, transportation, and utilization. Besides, it can rely on a wide set of innovative technologies, ranging from nuclear fusion and electricity plants equipped with CCS in the power sector to electrolysis-based steel production processes in the industrial sector, with a techno-economic characterization based on public literature, to produce insightful energy scenarios and especially to cope with the very long analyzed time scale. The aim of this work is to examine in detail the scheme of measures and policies devised to realize the purposes of the Green Deal and to translate them into a set of constraints and new socio-economic development pathways. Based on them, TEMOA-Europe will be used to produce and comparatively analyze scenarios to assess the consequences of Green Deal-related measures on the future evolution of the energy mix over the whole energy system in an economic optimization environment.
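The core of such a model is a cost-minimizing optimization; a toy linear program in that spirit is sketched below. The two technologies, the cost and emission figures, and the CO2 cap are invented for illustration and have nothing to do with the actual TEMOA-Europe database or the Green Deal constraint set.

```python
from scipy.optimize import linprog

# Toy total-cost minimization: two illustrative technologies must meet a demand
# of 100 units of electricity under a CO2 cap (all figures are made up).
cost = [50.0, 120.0]          # cost per unit generated: fossil, low-carbon
emis = [0.8, 0.0]             # tCO2 per unit generated

# Constraints: generation >= demand  ->  -(x1 + x2) <= -100
#              emissions  <= cap     ->   0.8*x1    <=  30
res = linprog(c=cost,
              A_ub=[[-1.0, -1.0], [emis[0], emis[1]]],
              b_ub=[-100.0, 30.0],
              bounds=[(0, None), (0, None)])
print("generation mix:", res.x.round(1), "total cost:", round(res.fun, 1))
```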

Keywords: European Green Deal, energy system optimization modeling, scenario analysis, TEMOA-Europe

Procedia PDF Downloads 95
35528 Performance Analysis of Arithmetic Units for IoT Applications

Authors: Nithiya C., Komathi B. J., Praveena N. G., Samuda Prathima

Abstract:

At present, the ultimate aim in digital system design, especially at the gate level and lower levels of design abstraction, is power optimization. Adders are a nearly universal component of today's integrated circuits. Most previous research has addressed the design of high-speed adders executing addition based on various adder structures. This paper discusses the ideal path for selecting an arithmetic unit for IoT applications. Based on the analysis of eight types of 16-bit adders, we found that the Carry Look-ahead Adder (CLA) consumes the least power. Additionally, a multiplier and accumulator (MAC) unit is implemented with the Booth multiplier using the low-power adders in order of preference. The design is synthesized and verified using Synopsys Design Compiler and VCS, and then implemented using Cadence Encounter. The total power consumed by the CLA-based Booth multiplier is 0.03527 mW, the total area occupied is 11260 µm², and the delay is 2034 ps.
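To illustrate the carry look-ahead principle named above, a small bit-level model is sketched below in Python. It computes carries from generate/propagate terms rather than rippling partial sums; this is a behavioral illustration only, not the synthesized RTL evaluated in the paper.

```python
def cla_add(a: int, b: int, width: int = 16):
    """Bit-level model of carry look-ahead addition (illustrative, not RTL).

    Generate g_i = a_i & b_i, propagate p_i = a_i ^ b_i, and every carry follows
    the recurrence c_{i+1} = g_i | (p_i & c_i). The recurrence is evaluated
    sequentially here for clarity; hardware expands it into parallel
    sum-of-products logic so each carry depends only on the inputs and c_0.
    """
    g = [((a >> i) & 1) & ((b >> i) & 1) for i in range(width)]
    p = [((a >> i) & 1) ^ ((b >> i) & 1) for i in range(width)]
    c = [0] * (width + 1)               # c[0] is the carry-in
    for i in range(width):
        c[i + 1] = g[i] | (p[i] & c[i])
    s = sum((p[i] ^ c[i]) << i for i in range(width))
    return s, c[width]

print(cla_add(0x1234, 0x0FFF))          # -> (0x2233, 0)
```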

Keywords: carry look-ahead, carry select adder, CSA, internet of things, ripple carry adder, design rule check, power delay product, multiplier and accumulator

Procedia PDF Downloads 110
35527 Hydrothermal Energy Application Technology Using Dam Deep Water

Authors: Yooseo Pang, Jongwoong Choi, Yong Cho, Yongchae Jeong

Abstract:

The climate crisis, including environmental problems related to energy supply, is an emerging issue, so the use of renewable energy is essential to solve these problems, which are mainly addressed by the Paris Agreement, the international treaty on climate change. The government of the Republic of Korea announced that the key long-term goal of its low-carbon strategy is carbon neutrality by 2050. This study focuses on the role of internet data centers (IDCs), in which large amounts of data, such as artificial intelligence (AI) and big data workloads arising from the 4th industrial revolution, are managed. The cooling system market for IDCs was worth about 9 billion US dollars in 2020, and growth of 15.6% a year is expected in Korea. It is important to control the temperature in an IDC with an efficient air-conditioning system, so hydrothermal energy is one of the best options for saving energy in the cooling system. In order to save energy and optimize the operating conditions, the application of a dam deep-water air-conditioning system has been considered. Deep water drawn from a specific level of the dam can supply a constant water temperature year-round. The amount of energy saving will be tested and analyzed with a pilot plant that has a cooling capacity of 100 RT. A target of this project is a PUE (Power Usage Effectiveness) of 1.2, the key parameter for checking the efficiency of the cooling system.
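PUE is a simple ratio, as the sketch below shows; the kW figures are invented to illustrate what the 1.2 target implies, not measurements from the pilot plant.

```python
def pue(total_facility_kw: float, it_load_kw: float) -> float:
    """Power Usage Effectiveness = total facility power / IT equipment power."""
    return total_facility_kw / it_load_kw

# Illustrative figures only: reaching the project target of PUE = 1.2 means the
# cooling and auxiliary systems may draw at most 20% of the IT load.
print(pue(total_facility_kw=1200.0, it_load_kw=1000.0))   # -> 1.2
```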

Keywords: hydrothermal energy, HVAC, internet data center, free-cooling

Procedia PDF Downloads 70
35526 Should Local Governments Expect Benefits from Special Economic Zones: The Case of Poland

Authors: Radosław Pastusiak, Anna Kaźmierska, Magdalena Jasiniak

Abstract:

The impact of Special Economic Zones (SEZs) has been analyzed by researchers for many years. There are many theoretical studies proving the importance of SEZs for regional development; however, there is a lack of empirical studies based on available data, and the existing ones focus mainly on the Chinese market. Theoretical studies indicate the various impacts of enterprises operating within SEZs on the economy. The article proves that, in the case of Poland, locating SEZs in municipalities is an important factor in increasing municipalities' income; therefore, SEZs have a positive impact on regional development. Municipality income is understood as taxes paid by taxpayers whose income depends on the performance of SEZ companies. The analysis includes Corporate Income Tax (CIT), Personal Income Tax (PIT) and real estate tax. The effects of SEZs on regional development were narrowed to a few variables that are most significant for the financial system. The analysis indicates a significant impact of SEZs on the amount of taxes flowing into municipality budgets.

Keywords: special economic zone, local finance, municipal finance, government

Procedia PDF Downloads 327
35525 Preparing Entrepreneurial Women: A Challenge for Indian Education System

Authors: Dinesh Khanduja, Pardeep Kumar Sharma

Abstract:

Education, as the most important resource in any country, has multiplying effects on all facets of development in a society. The new social realities, particularly the interplay between the democratization of education, unprecedented developments in the IT sector, the emergence of the knowledge society, liberalization of the economy, and globalization, have greatly influenced the educational process of all nations. This turbulence requires education to undergo dramatic changes to keep up with the new expectations. Growth of entrepreneurship among Indian women is highly important for empowering them and is essential for the socio-economic development of a society. Unfortunately, in India there is poor acceptance of entrepreneurship among women, as unfounded myths and fears restrain them from being enterprising. To remove these inhibitions, the education system needs to be re-engineered to make entrepreneurship more acceptable. This paper empirically analyses the results of a survey of around 500 female graduates in North India to measure and evaluate the various entrepreneurial traits present in them. A formative model has been devised in this context, which should improve the teaching-learning process in the education system and can lead to sustainable growth of women's entrepreneurship in India.

Keywords: women empowerment, entrepreneurship, education system, women entrepreneurship, sustainable development

Procedia PDF Downloads 342
35524 Optimal Allocation of PHEV Parking Lots to Minimize Distribution System Losses

Authors: Mohsen Mazidi, Ali Abbaspour, Mahmud Fotuhi-Firuzabad, Mohammad Rastegar

Abstract:

To tackle air pollution issues, Plug-in Hybrid Electric Vehicles (PHEVs) have been proposed as an appropriate solution. Charging a large number of PHEV batteries, if not controlled, would have negative impacts on the distribution system. The charging of these vehicles can be controlled centrally in parking lots, which may provide a chance for better coordination than individual charging at home. In this paper, an optimization-based approach is proposed to determine the optimal PHEV parking-lot capacities at candidate nodes of the distribution system. In doing so, a charging and discharging profile for the PHEVs is developed in order to flatten the network load profile. This profile is then used in solving an optimization problem that minimizes the distribution system losses. The outputs of the proposed method are the proper locations for PHEV parking lots and the optimal capacity of each lot. Application of the proposed method to the IEEE 34-node test feeder verifies its effectiveness.
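The load-flattening idea can be illustrated with a simple valley-filling sketch: a fixed amount of charging energy is poured into the hours with the lowest base load. The 24-hour base profile and the energy figure below are invented, and the sketch ignores the network constraints and loss objective solved in the paper.

```python
import numpy as np

# Illustrative valley-filling: spread a fixed amount of PHEV parking-lot charging
# energy over the hours with the lowest base load so the total profile flattens.
base_load = np.array([60, 55, 50, 48, 47, 50, 65, 80, 90, 95, 92, 85,
                      80, 78, 76, 75, 80, 95, 100, 98, 90, 80, 70, 65], float)
energy_to_charge = 120.0   # MWh of charging energy to allocate across the day

lo, hi = base_load.min(), base_load.max() + energy_to_charge
for _ in range(60):        # bisection on the water-filling level
    level = (lo + hi) / 2
    if np.clip(level - base_load, 0, None).sum() > energy_to_charge:
        hi = level
    else:
        lo = level
charging = np.clip(level - base_load, 0, None)
print("charging profile (MWh/h):", charging.round(1))
```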

Keywords: loss, plug-in hybrid electric vehicle (PHEV), PHEV parking lot, V2G

Procedia PDF Downloads 530
35523 Sky Farming: The Alternative Concept of Green Building Using Vertical Landscape Model in Urban Area as an Effort to Achieve Sustainable Development

Authors: Nadiah Yola Putri, Nesia Putri Sharfina, Traviata Prakarti

Abstract:

This paper is a literature review, presented descriptively, of the concept of green building as a way to face the challenges of sustainable development and food supply in urban areas. In this paper, the researchers propose a green building concept based on the sky farming method. Sky farming uses a vertical landscape system in order to realize a food self-sufficient green city. Sky farming relies on planting and irrigation-system efficiency in a building that adopts the principles of green building. The planting system applies hydroponics with the Nutrient Film Technique (NFT), using solar cells as the energy source and grey water from a waste treatment plant. The application of sky farming in urban areas can serve as a recommendation for the design of environmentally friendly construction. By using land and distance efficiently, this futuristic system could serve as a bridge for human civilization into the future.

Keywords: green building, urban area, sky farming, vertical landscape

Procedia PDF Downloads 351
35522 Quantification Model for Capability Evaluation of Optical-Based in-Situ Monitoring System for Laser Powder Bed Fusion (LPBF) Process

Authors: Song Zhang, Hui Wang, Johannes Henrich Schleifenbaum

Abstract:

Due to the increasing demand for quality assurance and reliability in additive manufacturing, the development of an advanced in-situ monitoring system is required to monitor process anomalies as input for further process control. Optical monitoring systems, such as CMOS cameras and NIR cameras, have proved to be effective ways to monitor geometrical distortion and exceptional thermal distribution. Therefore, many studies and applications focus on the availability of optical monitoring systems for detecting various types of defects. However, the capability of the monitoring setup is usually not quantified. In this study, a quantification model is presented to evaluate the capability of monitoring setups for the LPBF machine based on monitoring data acquired for a designed test artifact, and the design of the relevant test artifacts is discussed. The monitoring setup is evaluated based on its hardware properties, the location of its integration, and the light conditions. The data-processing methodology used to quantify the capability for each aspect is discussed. The minimum detectable feature size of the monitoring setup in the application is estimated by quantifying its resolution and accuracy. The quantification model is validated using a CCD camera-based monitoring system for LPBF machines in the laboratory with different setups. The results show that the model quantifies the monitoring system's performance, makes it possible to evaluate monitoring systems with the same concept but different setups for the LPBF process, and provides direction for improving the setups.

Keywords: data processing, in-situ monitoring, LPBF process, optical system, quantification model, test artifact

Procedia PDF Downloads 191
35521 V0 Physics at LHCb: RIVET Analysis Module for Z Boson Decay to Di-Electron

Authors: A. E. Dumitriu

Abstract:

The LHCb experiment is situated at one of the four collision points around CERN's Large Hadron Collider. It is a single-arm forward spectrometer covering 10 mrad to 300 (250) mrad in the bending (non-bending) plane, designed primarily to study particles containing b and c quarks. Each of LHCb's sub-detectors specializes in measuring a different characteristic of the particles produced by colliding protons; its significant detection characteristics include a high-precision tracking system and two ring-imaging Cherenkov detectors for particle identification. The two major topics I am currently concerned with are: the RIVET project (Robust Independent Validation of Experiment and Theory), an efficient and portable C++ class library toolkit that provides a large collection of standard experimental analyses and is useful for the development, validation, tuning and regression testing of Monte Carlo (MC) event generator models in High Energy Physics; and V0 analysis of 2013 LHCb NoBias-type data (trigger on bunch + bunch crossing) at √s = 2.76 TeV.

Keywords: LHCb physics, RIVET plug-in, RIVET, CERN

Procedia PDF Downloads 413
35520 Non-Contact Digital Music Instrument Using Light Sensing Technology

Authors: Aishwarya Ravichandra, Kirtana Kirtivasan, Adithi Mahesh, Ashwini S. Savanth

Abstract:

A Non-Contact Digital Music System has been conceptualized and implemented to create a new era of digital music. This system replaces the strings of a traditional stringed instrument with laser beams to avoid bruising of the user’s hand. The system consists of seven laser modules, detector modules and distance sensors that form the basic hardware blocks of this instrument. Arduino ATmega2560 microcontroller is used as the primary interface between the hardware and the software. MIDI (Musical Instrument Digital Interface) is used as the protocol to establish communication between the instrument and the virtual synthesizer software.

Keywords: Arduino, detector, laser, MIDI, note on, note off, pitch bend, Sharp IR distance sensor

Procedia PDF Downloads 393
35519 Model-Based Field Extraction from Different Class of Administrative Documents

Authors: Jinen Daghrir, Anis Kricha, Karim Kalti

Abstract:

The volume of incoming administrative documents is massive, and manually processing these documents is a costly task, especially in terms of time. This problem has led to a considerable amount of research and development on automatically extracting fields from administrative documents, in order to reduce costs and increase citizen satisfaction with administrations. In this context, we introduce an administrative document understanding system. Given a document in which a user has selected the fields to be retrieved for a document class, a document model is automatically built. A document model is represented by an attributed relational graph (ARG), where nodes represent fields to extract and edges represent the relations between them. Both vertices and edges carry feature vectors. When another document arrives in the system, the layout objects are extracted and an ARG is generated. Field extraction is then translated into the problem of matching two ARGs, which relies mainly on the comparison of the spatial relationships between layout objects. Experimental results yield accuracy rates from 75% to 100%, tested on eight document classes. Our proposed method performs well considering that the document model is constructed using only a single document.
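A highly simplified picture of the matching idea is sketched below: a model ARG stores pairwise spatial relations between fields, and a candidate assignment in a new document is scored by how many of those relations it preserves. Field names, coordinates and the single above/below relation are assumptions for illustration; the actual system uses richer feature vectors on nodes and edges.

```python
# Toy sketch of matching a document-model ARG against the layout objects of an
# incoming document using spatial relations (names and values are illustrative).
model = {
    "nodes": {"invoice_no": (0.70, 0.10), "date": (0.70, 0.18), "total": (0.75, 0.85)},
    "edges": {("invoice_no", "date"): "above", ("date", "total"): "above"},
}

def relation(p, q):
    """Coarse spatial relation between two layout objects given (x, y) centers
    (y grows downward, as in image coordinates)."""
    return "above" if p[1] < q[1] else "below"

def match_score(candidate_nodes):
    """Fraction of model edges whose spatial relation is preserved by a candidate
    assignment {model_field: (x, y) of a layout object in the new document}."""
    ok = sum(relation(candidate_nodes[a], candidate_nodes[b]) == rel
             for (a, b), rel in model["edges"].items())
    return ok / len(model["edges"])

incoming = {"invoice_no": (0.68, 0.12), "date": (0.69, 0.20), "total": (0.74, 0.83)}
print("match score:", match_score(incoming))   # 1.0 -> fields accepted
```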

Keywords: administrative document understanding, logical labelling, logical layout analysis, fields extraction from administrative documents

Procedia PDF Downloads 204
35518 Developing an SOA-Based E-Healthcare System

Authors: Hend Albassam, Nouf Alrumaih

Abstract:

Nowadays we are in the age of technology and communication, and there is no doubt that technologies such as the Internet can offer many advantages to many business fields; the health field is no exception. In fact, using the Internet provides a new path to improve the quality of health care throughout the world. E-healthcare offers many advantages, such as: efficiency, by reducing cost and avoiding duplicate diagnostics; empowerment of patients, by enabling them to access their medical records; enhancing the quality of healthcare; and enabling information exchange and communication between healthcare organizations. Many problems result from using paper as a means of communication, for example paper-based prescriptions. Usually, the doctor writes a prescription and gives it to the patient, who in turn carries it to the pharmacy. After that, the pharmacist fills the prescription and gives it to the patient. Sometimes the pharmacist might find it difficult to read the doctor's handwriting, and the patient could alter or counterfeit the prescription. These existing problems, and many others, heighten the need to improve the quality of healthcare. This project sets out to develop a distributed e-healthcare system that offers some features of e-health and addresses some of the above-mentioned problems. The developed system provides an electronic health record (EHR) and enables communication between separate healthcare organizations such as the clinic, pharmacy and laboratory. To develop this system, Service Oriented Architecture (SOA) is adopted as the design approach, which helps to design several independent modules that communicate using web services. The layering design pattern is used in designing each module, as it provides reusability that allows the business logic layer to be reused by different higher layers, such as the web service or the website in our system. The experimental analysis has shown that the project has successfully achieved its aims of solving the problems related to paper-based healthcare systems, and it enables different health organizations to communicate effectively. It implements four independent modules: healthcare provider, pharmacy, laboratory and medication information provider. Each module provides different functionality and is used by a different type of user. These modules interoperate with each other using a set of web services.

Keywords: e-health, services oriented architecture (SOA), web services, interoperability

Procedia PDF Downloads 300
35517 Fuzzy Analytic Hierarchy Process for Determination of Supply Chain Performance Evaluation Criteria

Authors: Ibrahim Cil, Onur Kurtcu, H. Ibrahim Demir, Furkan Yener, Yusuf. S. Turkan, Muharrem Unver, Ramazan Evren

Abstract:

The fuzzy AHP (Analytic Hierarchy Process) method is a decision-making approach obtained by integrating the classical AHP method with fuzzy set theory. In this study, the processes of the production planning, inventory management and purchasing departments of a system were analysed, and decision-makers were asked to determine the performance criteria of each area. The current work processes were analysed by various decision-makers, and pairwise comparisons of the criteria were completed by assigning scores on a 1-9 scale. The criteria were ranked by their weights using the fuzzy AHP approach, and the top three performance criteria of each department were determined. After that, the performance criteria of the supply chain consisting of the three departments were determined. The processes of each department were compared by the decision-makers when building the supply chain performance system and deriving its performance criteria. According to the results, the criteria of the supply chain performance system were determined using fuzzy AHP and will be used in the supply chain performance system in the future.
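One common way to carry out the fuzzy AHP weighting step is Buckley's fuzzy geometric-mean method, sketched below with triangular fuzzy judgements for three illustrative criteria. The judgement values are invented, and the abstract does not state which fuzzy AHP variant the authors used, so this is an assumption-laden illustration rather than a reproduction of the study's calculation.

```python
import numpy as np

# Triangular fuzzy judgements (l, m, u) replacing the crisp 1-9 scores for three
# illustrative criteria; the matrix is reciprocal and the values are made up.
TFN = [
    [(1, 1, 1), (2, 3, 4), (4, 5, 6)],
    [(1/4, 1/3, 1/2), (1, 1, 1), (1, 2, 3)],
    [(1/6, 1/5, 1/4), (1/3, 1/2, 1), (1, 1, 1)],
]

# Fuzzy geometric mean of each row, computed component-wise over (l, m, u).
geo = np.array([[np.prod([row[j][k] for j in range(3)]) ** (1/3)
                 for k in range(3)] for row in TFN])

# Buckley normalization: l-components divided by the u-total, m by the m-total,
# u by the l-total; then defuzzify each fuzzy weight by its centroid.
totals = geo.sum(axis=0)
fuzzy_w = np.column_stack([geo[:, 0] / totals[2], geo[:, 1] / totals[1], geo[:, 2] / totals[0]])
crisp_w = fuzzy_w.mean(axis=1)
print("criterion weights:", np.round(crisp_w / crisp_w.sum(), 3))
```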

Keywords: AHP, fuzzy, performance evaluation, supply chain

Procedia PDF Downloads 333
35516 The Relevance of PISA Tests in the Decentralization of the Educational System in Romania

Authors: Nitu Marilena Cristina

Abstract:

Decentralization of the education system is an educational policy option that is necessary from the perspective of democratizing internal life and streamlining the administration of public services. The experience of recent years has shown that decisions taken at the central level do not take into account all situations and, especially, all the specific needs and interests of the various institutions and individuals. A democratic society implies that the decision-making process is brought closer to the place of application, allowing citizens to take part in decision-making that affects them directly or indirectly. Essentially, decentralization of pre-university education is the transfer of authority, responsibility and resources, in decision-making, general management and finances, to the educational units and the local community. This creates a framework for effective collaboration between school and community. Modern theories on the leadership of education advocate the adoption of decentralization measures and participatory strategies. Numerous countries confronted with an educational impasse have appealed to these strategies. Reform projects have begun to apply diversified and nuanced decentralization models according to the specific social and educational situation. Analysis of the legal provisions and measures adopted in the framework of the reform process indicates that, at least formally, decentralization is the solution chosen.

Keywords: decentralization, educational, management, reforming

Procedia PDF Downloads 157
35515 Fibers Presence Effects on Air Flow of Attenuator of Spun-Bond Production System

Authors: Nasser Ghassembaglou, Abdullah Bolek, Oktay Yilmaz, Ertan Oznergiz, Hikmet Kocabas, Safak Yilmaz

Abstract:

The production of high-quality air filters using nanofibers as a functional material has frequently been investigated. As it is more environmentally friendly, the melting method has been selected to produce the nanofibers. Spun-bond production systems consist of an extruder, a spin pump, a nozzle package and attenuators. The spin pump makes the flow of molten polymer from the extruder steady. Fibers are formed by the melt passing regularly through nozzle holes under high pressure. The attenuator elongates the fibers to micron size so that they can be collected on a conveyor. Different designs of attenuator systems have been studied in this research; new analyses have been performed on existing designs considering the effect of fibers on the air flow, and it was found that, in the presence of fibers, an air flow arises that agglomerates the fibers, which is a negative effect. Therefore, some new designs were developed and CFD analyses were performed on them. One of these designs was then selected as the most effective, and it is presented in this paper.

Keywords: attenuator, CFD, nanofiber, spun-bond

Procedia PDF Downloads 440