Search results for: technological process
14371 Design of Enhanced Adaptive Filter for Integrated Navigation System of FOG-SINS and Star Tracker
Authors: Nassim Bessaad, Qilian Bao, Zhao Jiangkang
Abstract:
The fiber optic gyroscope in the strap-down inertial navigation system (FOG-SINS) suffers from precision degradation due to random errors. In this work, an enhanced Allan variance (AV) stochastic modeling method, combined with discrete wavelet transform (DWT) signal denoising, is implemented to estimate the random processes in the FOG signal. Furthermore, we devise a measurement-based iterative adaptive Sage-Husa nonlinear filter with augmented states to integrate a star tracker sensor with the SINS. The proposed filter adapts the measurement noise covariance matrix based on the available data. Moreover, the enhanced stochastic modeling scheme is used to tune the process noise covariance matrix and the augmented-state Gauss-Markov process parameters. Finally, the effectiveness of the proposed filter is evaluated using data collected under laboratory conditions. The results show the filter's improved accuracy compared with the conventional Kalman filter (CKF).
Keywords: inertial navigation, adaptive filtering, star tracker, FOG
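As background for this abstract, the plain (non-enhanced) Allan variance computation can be sketched as follows; the sampling interval, cluster sizes, and white-noise test signal are illustrative assumptions, not the authors' method or data:

```python
import numpy as np

def allan_variance(y, tau0, m_list):
    """Non-overlapping Allan variance of a rate signal y sampled every
    tau0 seconds, for each cluster size m (averaging time tau = m * tau0)."""
    result = {}
    for m in m_list:
        n = len(y) // m
        # average the signal over consecutive clusters of m samples
        means = y[: n * m].reshape(n, m).mean(axis=1)
        # Allan variance: half the mean squared difference of successive cluster averages
        result[m * tau0] = 0.5 * np.mean(np.diff(means) ** 2)
    return result

# For white (angle-random-walk-like) noise, the Allan variance falls as 1/tau
rng = np.random.default_rng(0)
av = allan_variance(rng.normal(size=100_000), tau0=0.01, m_list=[1, 10, 100])
```

On a log-log Allan deviation plot, the slopes of such curves identify the dominant random error terms (angle random walk, bias instability, rate random walk), which is what the stochastic modeling step exploits when tuning the filter's noise covariances.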
Procedia PDF Downloads 80
14370 A Digital Twin Approach for Sustainable Territories Planning: A Case Study on District Heating
Authors: Ahmed Amrani, Oussama Allali, Amira Ben Hamida, Felix Defrance, Stephanie Morland, Eva Pineau, Thomas Lacroix
Abstract:
The energy planning process is a very complex task that involves several stakeholders and requires the consideration of several local and global factors and constraints. In order to optimize and simplify this process, we propose a tool-based iterative approach applied to district heating planning. We built our tool in collaboration with a French territory, using actual district data and implementing the European incentives. We set up an iterative process including data visualization and analysis, identification and extraction of information related to the area concerned by the operation, design of sustainable planning scenarios leveraging local renewable and recoverable energy sources, and finally, evaluation of the scenarios. The last step is performed by a dynamic digital twin replica of the city. The territory’s energy experts confirm that the tool provides them with valuable support towards sustainable energy planning.
Keywords: climate change, data management, decision support, digital twin, district heating, energy planning, renewables, smart city
Procedia PDF Downloads 172
14369 Creativity and Innovation in a Military Unit of South America: Decision Making Process, Socio-Emotional Climate, Shared Flow and Leadership
Authors: S. da Costa, D. Páez, E. Martínez, A. Torres, M. Beramendi, D. Hermosilla, M. Muratori
Abstract:
This study examined the association between creative performance, organizational climate and leadership, affectivity, shared flow, and group decision making. The sample consisted of 315 cadets of a military academic unit of South America. Satisfaction with the decision-making process during a creative task was associated with the usefulness and effectiveness of the ideas generated by the teams, with a weighted average correlation of r = .18. Organizational emotional climate and positive and innovation-oriented leadership were associated with this group decision-making process, r = .25, with shared flow, r = .29, and with positive affect felt during the performance of the creative task, r = .12. In a sequential mediational analysis, positive organizational leadership styles were significantly associated with the decision-making process and, through cohesion, with the utility and efficacy of the solution of a creative task. Satisfactory decision-making was related to shared flow during the creative task at the collective or group level, and to positive affect with flow at the individual level.
Keywords: creativity, innovation, military, organization, teams
Procedia PDF Downloads 123
14368 The Essence and Attribution of Intellectual Property Rights Generated in the Digitization of Intangible Cultural Heritage
Authors: Jiarong Zhang
Abstract:
Digitizing intangible cultural heritage is a complex and comprehensive process from which various intellectual property rights may be generated. Digitizing may be a repackaging process of cultural heritage, which creates copyrights; recording folk songs and indigenous performances can create 'related rights'. At the same time, digitizing intangible cultural heritage may infringe the intellectual property rights of others unintentionally. Recording religious rituals of indigenous communities without authorization can violate the moral rights of the ceremony participants of the community; making digital copies of rock paintings may infringe the right of reproduction. In addition, several parties are involved in the digitization process: indigenous peoples, museums, and archives can be holders of cultural heritage; companies and research institutions can be technology providers; internet platforms can be promoters and sellers; the public and the groups above can be beneficiaries. When diverse intellectual property rights meet various parties, problems and disputes can arise easily. What are the types of intellectual property rights generated in the digitization process? What is the essence of these rights? Who should these rights belong to? How can intellectual property be used to protect the digitization of cultural heritage? How can infringing on the intellectual property rights of others be avoided? While digitization has been regarded as an effective approach to preserve intangible cultural heritage, the related intellectual property issues have not received full attention and discussion. Thus, parties involved in the digitization process may face intellectual property infringement lawsuits. The article will explore these problems from the intersecting perspectives of intellectual property law and cultural heritage. Using a comparative approach, the paper will analyze related legal documents and cases, and shed some light on the questions listed above. The findings show that, although there are no intellectual property laws targeting cultural heritage in most countries, the involved stakeholders can seek protection from existing intellectual property rights by following the suggestions of the article. The research will contribute to the digitization of intangible cultural heritage from a legal and policy aspect.
Keywords: copyright, digitization, intangible cultural heritage, intellectual property, Internet platforms
Procedia PDF Downloads 146
14367 Reverse Logistics Information Management Using Ontological Approach
Authors: F. Lhafiane, A. Elbyed, M. Bouchoum
Abstract:
The reverse logistics (RL) process is considered a complex and dynamic network that involves many stakeholders, such as suppliers, manufacturers, warehouses, retailers, and customers; this complexity is inherent to such processes due to the lack of perfect knowledge or to conflicting information. Ontologies, on the other hand, can be considered an approach to overcome the problem of sharing knowledge and communication among the various reverse logistics partners. In this paper, we propose a semantic representation based on a hybrid architecture for building the ontologies in a bottom-up way; this method facilitates the semantic reconciliation between the heterogeneous information systems (ICT) that support reverse logistics processes and product data.
Keywords: reverse logistics, information management, heterogeneity, ontologies, semantic web
Procedia PDF Downloads 492
14366 Polymer Recycling by Biomaterial and Its Application in Grease Formulation
Authors: Amitkumar Barot, Vijaykumar Sinha
Abstract:
There is growing interest in the development of new materials based on recycled polymers from plastic waste, and also in the field of lubricants, where much effort has been spent on substituting petro-based raw materials with natural, renewable ones. This is due to depleting fossil fuels and strict environmental laws. In relevance to this, a new technique for the formulation of grease that combines the chemical recycling of poly(ethylene terephthalate) (PET) with the use of castor oil (CO) has been developed. Compared to the diols used in the chemical recycling of PET, castor oil is renewable, easily available, environmentally friendly, economically cheaper, and hence more sustainable. The process parameters, such as CO concentration and temperature, were altered, and the influences of these parameters have been studied in order to establish a technically and commercially viable process. The depolymerized product thereby formed finds application as a base oil in the formulation of grease. The depolymerized product has been characterized by various chemical and instrumental methods, while the formulated greases have been evaluated for their tribological properties. The grease formulated using this new environmentally friendly approach presents applicative properties similar, and in some cases superior, to those of a commercial grease obtained from non-renewable resources.
Keywords: castor oil, grease formulation, recycling, sustainability
Procedia PDF Downloads 220
14365 Personnel Selection Based on Step-Wise Weight Assessment Ratio Analysis and Multi-Objective Optimization on the Basis of Ratio Analysis Methods
Authors: Emre Ipekci Cetin, Ebru Tarcan Icigen
Abstract:
The personnel selection process is considered one of the most important and most difficult issues in human resources management. At the stage of personnel selection, applicants are evaluated according to certain criteria, and efforts are made to select the most appropriate candidate. However, this process can be complicated for the managers who carry out the staff selection. Candidates should be evaluated according to different criteria, such as work experience, education, foreign language level, etc. It is crucial that a rational selection process is carried out by considering all the criteria in an integrated structure. In this study, the problem of choosing the front office manager of a 5-star accommodation enterprise operating in Antalya is addressed by using multi-criteria decision-making methods. In this context, the SWARA (step-wise weight assessment ratio analysis) and MOORA (multi-objective optimization on the basis of ratio analysis) methods, which have relatively few applications when compared with other methods, have been used together. Firstly, the SWARA method was used to calculate the weights of the criteria and subcriteria that were determined by the business. After the weights of the criteria were obtained, the MOORA method was used to rank the candidates using the ratio system and the reference point approach. Recruitment processes differ from sector to sector and from operation to operation. There are a number of criteria that must be taken into consideration by businesses in accordance with the structure of each sector. It is of utmost importance that all candidates are evaluated objectively within the framework of these criteria, after the criteria have been carefully selected for the recruitment of suitable candidates. In the study, the staff selection process was handled by using the SWARA and MOORA methods together.
Keywords: accommodation establishments, human resource management, multi-objective optimization on the basis of ratio analysis, multi-criteria decision making, step-wise weight assessment ratio analysis
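For illustration, the MOORA ratio-system ranking step described in this abstract can be sketched as follows; the criteria, weights (assumed to have already come out of the SWARA step), and candidate scores are invented for the example:

```python
import numpy as np

def moora_ratio_system(matrix, weights, benefit):
    """Rank alternatives with the MOORA ratio system.
    matrix: rows = candidates, columns = criteria (raw scores).
    weights: criterion weights (e.g., from SWARA).
    benefit: True for benefit criteria, False for cost criteria."""
    x = np.asarray(matrix, dtype=float)
    # vector normalization: each column divided by its Euclidean norm
    norm = x / np.linalg.norm(x, axis=0)
    signs = np.where(benefit, 1.0, -1.0)
    # composite score: weighted benefit ratios minus weighted cost ratios
    return (norm * weights * signs).sum(axis=1)

# Hypothetical candidates scored on experience, education, language (benefit)
# and expected salary (cost); the weights are assumed, not from the study
scores = moora_ratio_system(
    [[8, 7, 9, 5000], [6, 9, 7, 4000], [9, 6, 8, 6000]],
    weights=[0.4, 0.3, 0.2, 0.1],
    benefit=np.array([True, True, True, False]),
)
best = int(np.argmax(scores))
```

Dividing each column by its Euclidean norm makes criteria measured in different units comparable; the candidate with the highest composite score ranks first.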
Procedia PDF Downloads 343
14364 Clubhouse: A Minor Rebellion against the Algorithmic Tyranny of the Majority
Authors: Vahid Asadzadeh, Amin Ataee
Abstract:
Since the advent of social media, there has been a wave of optimism among researchers and civic activists about the influence of virtual networks on the democratization process, which has gradually waned. One of the lesser-known concerns is how to increase the possibility of hearing the voices of different minorities. According to the theory of media logic, the media, using their technological capabilities, act as a structure through which events and ideas are interpreted. Social media, through the use of machine learning and algorithms, has formed a kind of structure in which the voices of minorities and less popular topics are lost amid the commotion of the trends. In fact, the recommender systems and algorithms used in social media are designed to help promote trends and make popular content more popular, while content that belongs to minorities is constantly marginalized. As social networks gradually play a more active role in politics, the possibility of freely participating in the reproduction and reinterpretation of structures in general and political structures in particular (as Laclau and Mouffe had in mind) can be considered a criterion of democracy in action. The point is that the media logic of virtual networks is shaped by the rule, and even the tyranny, of the majority, and this logic does not make it possible to design a self-founding and self-revolutionary model of democracy. In other words, today's social networks, though seemingly full of variety, are governed by the logic of homogeneity, and they do not allow for multiplicity as is the case in immanent radical democracies (influenced by Gilles Deleuze). However, with the emergence and increasing popularity of Clubhouse as a new social medium, there seems to be a shift in the social media space, namely the diminishing role of algorithms and recommender systems as content-delivery interfaces. This has meant that in the Clubhouse, the voices of minorities are better heard, and the diversity of political tendencies manifests itself better. The purpose of this article is to show, first, how social networks serve the elimination of minorities in general, and second, to argue that the media logic of social networks must adapt to new interpretations of democracy that give more space to minorities and human rights. Finally, this article will show how the Clubhouse serves these new interpretations of democracy, at least in a minimal way. To achieve the mentioned goals, this article uses a descriptive-analytical method: first, the relation between media logic and postmodern democracy will be inquired into; then, the political economy of popularity in social media and its conflict with democracy will be discussed. Finally, it will be explored how the Clubhouse provides a new horizon for the concepts embodied in radical democracy, a horizon that more effectively serves the rights of minorities and human rights in general.
Keywords: algorithmic tyranny, Clubhouse, minority rights, radical democracy, social media
Procedia PDF Downloads 145
14363 Optimization of Titanium Leaching Process Using Experimental Design
Authors: Arash Rafiei, Carroll Moore
Abstract:
The leaching process, as the first stage of hydrometallurgy, is a multidisciplinary system involving material properties, chemistry, reactor design, mechanics, and fluid dynamics. Therefore, optimizing a leaching system by purely scientific methods requires much time and expense. In this work, a mixture of two titanium ores and one titanium slag is used for extracting titanium in the leaching stage of the TiO2 pigment production procedure. Optimum titanium extraction can be obtained through the following strategies: i) maximizing titanium extraction without selective digestion; and ii) optimizing selective titanium extraction by balancing maximum titanium extraction against minimum impurity digestion. The main difference between the two strategies lies in the process optimization framework. In the first strategy, the most important stage of the production process is treated as the main stage, and the rest of the stages are adapted with respect to it. The second strategy optimizes the performance of more than one stage at once. The second strategy is more technically complex than the first, but it brings more economic and technical advantages for the leaching system. Obviously, each strategy has its own optimum operational zone, which is not the same as the other's, and the best operational zone is chosen considering the complexity and the economic and practical aspects of the leaching system. Experimental design has been carried out using the Taguchi method. The most important advantages of this methodology are involving different technical aspects of the leaching process; minimizing the number of needed experiments, as well as time and expense; and accounting for parameter interactions, following the principles of multifactor-at-a-time optimization. Leaching tests have been done at batch scale in the laboratory with appropriate temperature control. The leaching tank geometry has been considered an important factor to provide comparable agitation conditions. Data analysis has been done using reactor design and mass balancing principles. Finally, optimum zones for the operational parameters are determined for each leaching strategy and discussed with respect to their economic and practical aspects.
Keywords: titanium leaching, optimization, experimental design, performance analysis
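As a toy illustration of the Taguchi-style analysis this abstract relies on, a main-effects calculation on a small L4 orthogonal array might look like this; the factors, levels, and extraction values are entirely hypothetical, not the paper's data:

```python
import numpy as np

# Taguchi L4(2^3) orthogonal array: 4 runs covering 3 two-level factors
L4 = np.array([[0, 0, 0],
               [0, 1, 1],
               [1, 0, 1],
               [1, 1, 0]])

# Hypothetical titanium extraction (%) for each run, replicated twice
runs = np.array([[72.0, 74.0], [81.0, 79.0], [68.0, 70.0], [77.0, 75.0]])

# "Larger is better" signal-to-noise ratio for each run
sn = -10 * np.log10(np.mean(1.0 / runs**2, axis=1))

# Main effect of each factor: mean S/N at level 1 minus mean S/N at level 0
effects = np.array([sn[L4[:, f] == 1].mean() - sn[L4[:, f] == 0].mean()
                    for f in range(3)])
best_levels = (effects > 0).astype(int)  # pick the level with the higher mean S/N
```

The orthogonal array is what lets four runs estimate three factor effects, which is the "minimizing the number of needed experiments" advantage mentioned above; real leaching studies would use larger arrays (e.g., L9, L16) and more response replicates.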
Procedia PDF Downloads 373
14362 Random Vertical Seismic Vibrations of the Long Span Cantilever Beams
Authors: Sergo Esadze
Abstract:
Seismic resistance norms require the calculation of cantilevers for the vertical components of the base seismic acceleration. Long span cantilevers, as a rule, must be calculated as a separate construction element. According to the architectural-planning solution, functional purposes and environmental conditions of the buildings/structures being designed, long span cantilever constructions may be of very different types: both by main bearing element (beam, truss, slab) and by material (reinforced concrete, steel). The choice among these is always linked with the bearing construction system of the building. Research on the vertical seismic vibration of these constructions requires an individual approach for each (which is not specified in the norms), in correlation with the model of the seismic load. The latter may be given either as a deterministic load or as a random process. A loading model given as a random process is more adequate to this problem. In the presented paper, two types of long span (from 6 m up to 12 m) reinforced concrete cantilever beams have been considered: a) cantilevers whose bearing elements, i.e., the elements in which they are fixed, have cross-sections with large sizes, with the cantilevers made with a haunch; b) cantilever beams with a load-bearing rod element. Calculation models are suggested separately for types a) and b). They are presented as systems with a finite number of degrees of freedom (concentrated masses). The conditions for fixing the ends correspond to the types. Vertical acceleration and the vertical component of the angular acceleration act on the masses. The model is based on the assumption of translational-rotational motion of the building in the vertical plane, caused by vertical seismic acceleration. The seismic accelerations are considered as random processes and represented as the product of a deterministic envelope function and a stationary random process. The problem is solved within the framework of the correlation theory of random processes. Solved numerical examples are given. The method is effective for solving such specific problems.
Keywords: cantilever, random process, seismic load, vertical acceleration
Procedia PDF Downloads 189
14361 Optimization of Monascus Orange Pigments Production Using pH-Controlled Fed-Batch Fermentation
Authors: Young Min Kim, Deokyeong Choe, Chul Soo Shin
Abstract:
Monascus pigments, commonly used as natural colorants in Asia, have many biological activities, such as cholesterol level control, anti-obesity, anti-cancer, and anti-oxidant effects, that have recently been elucidated. In particular, amino acid derivatives of Monascus pigments are receiving much attention because they have higher biological activities than the original Monascus pigments. Previously, there have been two ways to produce amino acid derivatives: one-step production and two-step production. However, one-step production has low purity, and two-step production, consisting of precursor (orange pigment) fermentation and derivative synthesis, has low productivity and a low growth rate during its precursor fermentation step. In this study, it was verified that pH is a key factor that affects the stability of orange pigments and the growth rate of Monascus. With an optimal pH profile obtained by pH-stat fermentation, we designed a precursor (orange pigment) fermentation process as a pH-controlled fed-batch fermentation. The final concentration of orange pigments in this process increased to 5.5 g/L, which is about 30% higher than the concentration produced by the previously used precursor fermentation step.
Keywords: cultivation process, fed-batch fermentation, Monascus pigments, pH stability
Procedia PDF Downloads 299
14360 Energy-Led Sustainability Assessment Approach for Energy-Efficient Manufacturing
Authors: Aldona Kluczek
Abstract:
In recent years, manufacturing processes have interacted with sustainability issues, realized in cost-effective ways that minimize energy use, decrease negative impacts on the environment, and are safe for society. However, the attention has been on separate sustainability assessment methods considering energy and material flow, energy consumption, emission release, or process control. In this paper, an energy-led sustainability assessment approach combining three methods, energy Life Cycle Assessment (LCA) to assess environmental impact, Life Cycle Cost (LCC) to analyze costs, and Social Life Cycle Assessment through an 'energy LCA-based value stream map', is used to assess the energy sustainability of the hardwood lumber manufacturing process in terms of its technologies. The approach, integrating environmental, economic, and social issues, can be visualized for the considered energy-efficient technologies on a map of energy LCA-related (input and output) inventory data. It enables the identification of the most efficient technology of a given process through effective analysis of the energy flow. It is also indicated that interventions in the considered technology should focus on environmental and economic improvements to achieve energy sustainability. The results have indicated that the most intense energy losses are caused by the cogeneration technology. The environmental impact analysis shows that a substantial reduction of 34% can be achieved with its improvement. From the LCC point of view, the result seems to be cost-effective when done at the plant where the improvement is used. Regarding the social dimension, every component of plant labor energy use in the life cycle of lumber production has positive energy benefits. The energy required to install the energy-efficient technology amounts to 30.32 kJ compared to the other components of plant labor energy, and it has the highest value in terms of energy-related social indicators. The paper depicts an example of hardwood lumber production in order to prove the applicability of the sustainability assessment method.
Keywords: energy efficiency, energy life cycle assessment, life cycle cost, social life cycle analysis, manufacturing process, sustainability assessment
Procedia PDF Downloads 247
14359 Scale, Technique and Composition Effects of CO2 Emissions under Trade Liberalization of EGS: A CGE Evaluation for Argentina
Authors: M. Priscila Ramos, Omar O. Chisari, Juan Pablo Vila Martínez
Abstract:
The current literature about trade liberalization of environmental goods and services (EGS) raises doubts about the extent of the triple win-win situation for trade, development, and the environment. However, much of this literature does not consider the possibility that such an agreement carries technological transfers, either through trade or through foreign direct investment. This paper presents a computable general equilibrium (CGE) model calibrated for Argentina, in which there are alternative technologies (one dirty and one clean, according to carbon emissions) to produce the same goods. In this context, the trade liberalization of EGS allows increasing GDP and trade, reducing unemployment, and improving household welfare. However, capital mobility appears to be the key assumption for jointly reaching the environmental target, when the positive scale effect generated by the increase in trade is offset by the change in the composition of production (composition and technique effects through the use of the clean alternative technology) and of consumption (composition effect through substitution of relatively less-polluting imported goods).
Keywords: CGE modeling, CO2 emissions, composition effect, scale effect, technique effect, trade liberalization of EGS
Procedia PDF Downloads 380
14358 Computer-Aided Teaching of Transformers for Undergraduates
Authors: Rajesh Kumar, Roopali Dogra, Puneet Aggarwal
Abstract:
In this era of technological advancement, the use of computer technology has become inevitable. Hence it has become the need of the hour to integrate software methods into the engineering curriculum as a way to boost pedagogical techniques. Simulation software is of great help to graduates of disciplines such as electrical engineering. Since electrical engineering deals with high voltages and heavy instruments, extra care must be taken while operating with them. The viable solution is to have appropriate control, and appropriate control can be well designed only if engineers know the kinds of waveforms associated with the system. Though these waveforms can be plotted manually, doing so consumes a lot of time; hence the aid of simulation helps in understanding the steady state of the system, resulting in better performance. In this paper, computer-aided teaching of transformers is carried out using MATLAB/Simulink. The tests carried out on a transformer include the open-circuit test and the short-circuit test. The parameters of the transformer are then calculated in Simulink using the values obtained from the open-circuit and short-circuit tests.
Keywords: computer aided teaching, open circuit test, short circuit test, Simulink, transformer
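The parameter calculation from these two tests follows the standard equivalent-circuit formulas; a sketch might look like the following, where the test readings are made-up values for a hypothetical small transformer and the Simulink measurement step is replaced by literals:

```python
import math

def oc_test_params(V_oc, I_oc, P_oc):
    """Open-circuit test (other winding open): shunt branch parameters.
    Returns core-loss resistance Rc and magnetizing reactance Xm (ohms)."""
    Rc = V_oc**2 / P_oc                    # core-loss resistance
    I_c = V_oc / Rc                        # core-loss current component
    I_m = math.sqrt(I_oc**2 - I_c**2)      # magnetizing current component
    Xm = V_oc / I_m
    return Rc, Xm

def sc_test_params(V_sc, I_sc, P_sc):
    """Short-circuit test (other winding shorted): series branch parameters.
    Returns equivalent resistance Req and leakage reactance Xeq (ohms)."""
    Req = P_sc / I_sc**2                   # equivalent winding resistance
    Zeq = V_sc / I_sc                      # equivalent impedance magnitude
    Xeq = math.sqrt(Zeq**2 - Req**2)       # equivalent leakage reactance
    return Req, Xeq

# Hypothetical test readings (voltage in V, current in A, power in W)
Rc, Xm = oc_test_params(V_oc=230.0, I_oc=0.45, P_oc=30.0)
Req, Xeq = sc_test_params(V_sc=13.2, I_sc=6.0, P_sc=48.0)
```

The open-circuit test, run at rated voltage, draws only the small excitation current and so exposes the shunt (core) branch; the short-circuit test, run at reduced voltage, exposes the series (winding) branch, which is exactly what students observe from the simulated waveforms.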
Procedia PDF Downloads 374
14357 Re-Engineering Management Process in Iran’s Smart Schools
Authors: M. R. Babaei, S. M. Hosseini, S. Rahmani, L. Moradi
Abstract:
Today, the quality of education and training systems and the effectiveness of education systems are of great concern to the stakeholders and decision-makers of development in every country. In Iran, this concern is doubled for numerous reasons; governments over the past decade have hardly even covered the running costs of education. ICT, it is claimed, has the power to change the structure of a training program, reduce costs, increase quality, make education systems and products consistent with the needs of the community, and take steps towards practical education. One of the areas that the introduction of information technology has fundamentally changed is the field of education. The aim of this research is the re-engineering of the management process in smart schools; field studies have been used to collect data in the form of interviews and a questionnaire survey. The statistical community of this research is the smart schools of Iran under the Ministry of Education. Sampling was targeted. The data collection tool was a questionnaire of 36 questions, each of which designates one factor affecting the management of smart schools. Each question consists of two parts. The first part designates the position of the operation in the management process, i.e., the management function it belongs to (planning, organizing, leading, controlling), according to the classification of Dabryn; in the second part, the factors affecting the management process of smart schools were examined, classified using a Likert scale. The validity of the questions was approved by a group of experts and prominent university professors in the fields of information technology, management, and re-engineering, and their reliability was evaluated and approved using Cronbach's alpha. To analyse the data, descriptive and inferential statistics were used: the factors contributing to the Likert-scale ratings were analysed with descriptive statistics (frequency tables, mean, median, mode), and the data were further analysed using analysis of variance and nonparametric tests, with the assumptions evaluated by the Friedman test. The research conclusions show that the factors influencing the re-engineering of the management process in smart schools affect school performance.
Keywords: re-engineering, management process, smart school, Iran's schools
Procedia PDF Downloads 244
14356 An Automatic Model Transformation Methodology Based on Semantic and Syntactic Comparisons and the Granularity Issue Involved
Authors: Tiexin Wang, Sebastien Truptil, Frederick Benaben
Abstract:
Model transformation, as a pivotal aspect of model-driven engineering, attracts more and more attention from both researchers and practitioners. Many domains (enterprise engineering, software engineering, knowledge engineering, etc.) use model transformation principles and practices to address their domain-specific problems; furthermore, model transformation can also be used to fill the gap between different domains by sharing and exchanging knowledge. Since model transformation has become widely used, new requirements have emerged: defining the transformation process effectively and efficiently, and reducing the manual effort involved. This paper presents an automatic model transformation methodology based on semantic and syntactic comparisons, focusing particularly on the granularity issue that exists in the transformation process. Compared to traditional model transformation methodologies, this methodology serves a general purpose: it is a cross-domain methodology. Semantic and syntactic checking measures are combined into a refined transformation process, which solves the granularity issue. Moreover, the semantic and syntactic comparisons are supported by a software tool, replacing manual effort.
Keywords: automatic model transformation, granularity issue, model-driven engineering, semantic and syntactic comparisons
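A toy sketch of combining syntactic and semantic comparison of model element names follows; the synonym table, weights, and scoring scheme are invented for illustration and are far simpler than the methodology described in the abstract:

```python
from difflib import SequenceMatcher

def syntactic_score(a, b):
    """Syntactic similarity: normalized edit-based ratio of the two names."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def semantic_score(a, b, synonyms):
    """Toy semantic similarity: 1.0 if both names share a synonym set, else 0.0.
    A real methodology would consult a thesaurus or ontology here."""
    for group in synonyms:
        if a.lower() in group and b.lower() in group:
            return 1.0
    return 0.0

def match_score(a, b, synonyms, w_syn=0.5, w_sem=0.5):
    # combine both checks, as hybrid matching approaches do
    return w_syn * syntactic_score(a, b) + w_sem * semantic_score(a, b, synonyms)

# Hypothetical synonym sets shared between a source and a target metamodel
synonyms = [{"client", "customer"}, {"order", "purchase"}]
score = match_score("Customer", "Client", synonyms)
```

Pairs scoring above a threshold would become candidate mapping rules; the granularity issue arises when one source element must map to several target elements (or vice versa), which a flat pairwise score alone cannot express.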
Procedia PDF Downloads 395
14355 Induction of Innovation (Districts) in (Spatial) Planning and Policy
Authors: Meera Prajapati
Abstract:
Technological innovation is important for economic and spatial rejuvenation. Innovation districts that have grown up around university towns over the last decades offer interesting examples. Planning directs the interplay between economic and urban development in these innovation districts, which appear in particular regions, with economic benefits resulting from incentives to attract multinational industries to innovation centres, research parks, universities, bio-incubator assets, etc. The inclination of the OECD towards developing entrepreneurship and innovation to harness a boost in growth requires sustainable living conditions. This research aims to understand how innovation or knowledge centres affected development policies and helped cities become high-tech regions. Therefore, the economic policies of cities are investigated, as well as the location logic of the centres and their intertwining with supporting services (health, education, living environment, etc.). Case studies (Eindhoven (the Netherlands) and Ho Chi Minh City (Viet Nam)) position Pune (India) in terms of the planning components of innovation.
Keywords: innovation districts, high-tech regions, smart cities, urban planning and policies
Procedia PDF Downloads 145
14354 Hydrodynamic Study and Sizing of a Distillation Column by HYSYS Software
Authors: Derrouazin Mohammed Redhouane, Souakri Mohammed Lotfi, Henini Ghania
Abstract:
This work consists, first, of mastering HYSYS, one of the powerful process simulation tools currently used in industrial processes, and second, of simulating a petroleum distillation column. The study is divided into two parts. The first part covers the sizing of the column, initially with a fast approximate method based on state equations and iterative calculations, and then with a precise simulation in HYSYS. The second part is a hydrodynamic study in which the obtained results are used to verify the proper functioning of the plates.
Keywords: industry process engineering, water distillation, environment, HYSYS simulation tool
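The abstract does not detail the approximate method; as a hedged illustration of the kind of iterative vapor-pressure calculation such sizing involves, a sketch that finds the boiling point of water at a given pressure from the Antoine equation by bisection (the coefficients are standard Antoine constants for water, P in mmHg, T in °C; the bisection bounds are assumptions):

```python
# Antoine coefficients for water, valid roughly 1-100 degC (P in mmHg, T in degC).
A, B, C = 8.07131, 1730.63, 233.426

def p_sat(t_c: float) -> float:
    """Saturation pressure of water (mmHg) from the Antoine equation."""
    return 10.0 ** (A - B / (C + t_c))

def boiling_point(p_mmhg: float, lo: float = 1.0, hi: float = 100.0) -> float:
    """Solve p_sat(T) = p_mmhg by bisection; p_sat is monotonic in T,
    so the bracket [lo, hi] shrinks onto the root."""
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if p_sat(mid) < p_mmhg:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Boiling point at atmospheric pressure, close to 100 degC.
print(round(boiling_point(760.0), 2))
```

A column-sizing shortcut method repeats this kind of iteration stage by stage; a full simulator such as HYSYS replaces it with rigorous equation-of-state models.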
Procedia PDF Downloads 129
14353 From E-Government to Cloud-Government Challenges of Jordanian Citizens' Acceptance for Public Services
Authors: Abeer Alkhwaldi, Mumtaz Kamala
Abstract:
Since the inception of the third millennium, there has been ample evidence that cloud technologies have become a strategic trend for many governments, not only in developed countries (e.g., the UK, Japan, and the USA) but also in developing ones (e.g., Malaysia and the Middle East region), which have launched cloud computing initiatives for enhanced standardization of IT resources, cost reduction, and more efficient public services. Cloud-based e-government services are therefore considered one of the high priorities for government agencies in Jordan. Despite their phenomenal evolution, government cloud services still suffer from the adoption challenges of e-government initiatives (e.g., technological, human, social, and financial), which governments contemplating implementation need to consider carefully. This paper presents a pilot study investigating citizens' perception of the extent to which these challenges affect the acceptance and use of cloud computing in the Jordanian public sector. Based on analysis of data collected through an online survey, several important challenges were identified. The results can help guide the successful acceptance of cloud-based e-government services in Jordan.
Keywords: challenges, cloud computing, e-government, acceptance, Jordan
Procedia PDF Downloads 435
14352 Road Maintenance Management Decision System Using Multi-Criteria and Geographical Information System for Takoradi Roads, Ghana
Authors: Eric Mensah, Carlos Mensah
Abstract:
The road maintenance backlogs created by deferred maintenance, especially in developing countries, have caused considerable deterioration of many road assets. This is usually due to difficulties encountered in selecting and prioritising maintainable roads on objective criteria rather than on political or other less relevant ones. To ensure judicious use of limited road maintenance resources, five factors were identified as the most important criteria for road management within the study area, based on the judgements of 40 experts. The results were then used to develop weightings through a Multi-Criteria Decision Process (MCDP) to analyse and select road alternatives according to the maintenance goal. Using Geographical Information Systems (GIS), maintainable roads were grouped with Jenks natural breaks to allow further prioritisation in order of importance for display on a dashboard of maps, charts, and tables. This reduces the problems of subjective maintenance and road selection, thereby reducing wastage of resources and easing the maintenance process through an objectively organised spatial decision support system.
Keywords: decision support, geographical information systems, multi-criteria decision process, weighted sum
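The weighted-sum aggregation named in the keywords can be sketched as follows; the criteria names, weights, and road scores below are illustrative assumptions, not the study's elicited values:

```python
# Weighted-sum scoring of candidate roads: each road gets normalised
# criterion scores in [0, 1], which are aggregated with expert-derived
# weights into a single priority score. All names and numbers are toy data.

WEIGHTS = {  # hypothetical weights, normalised to sum to 1
    "traffic_volume": 0.30,
    "surface_condition": 0.25,
    "economic_importance": 0.20,
    "safety": 0.15,
    "cost": 0.10,
}

def weighted_sum(scores: dict) -> float:
    """Aggregate normalised criterion scores into one priority score."""
    return sum(WEIGHTS[c] * scores[c] for c in WEIGHTS)

roads = {
    "Road A": {"traffic_volume": 0.9, "surface_condition": 0.2,
               "economic_importance": 0.8, "safety": 0.6, "cost": 0.5},
    "Road B": {"traffic_volume": 0.4, "surface_condition": 0.9,
               "economic_importance": 0.3, "safety": 0.7, "cost": 0.8},
}

# Rank roads from highest to lowest priority for maintenance.
ranked = sorted(roads, key=lambda r: weighted_sum(roads[r]), reverse=True)
print(ranked)
```

In the study's workflow the resulting scores would then be grouped with Jenks natural breaks and mapped in GIS.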
Procedia PDF Downloads 376
14351 A Pattern Practise for Awareness Educations on Information Security: Information Security Project
Authors: Fati̇h Apaydin
Abstract:
Education technology is an area that constantly changes and creates innovations. As an inevitable part of these changing circumstances, societies inclined toward improvement keep up with the innovations by using the methods and strategies designed for education technology. At this point, education technology has taken on the responsibility of helping individuals improve themselves and of teaching effective methods by filling the gaps between theoretical information, information security, and practice. Technology, whose importance grows day by day, has come to the core of our lives and reinforced its position in computer-based environments. As a result, principles such as 'being ready for technological innovations, and improving computer-based talent, information, ability and attitude' have to be taught. However, it is quite hard today to ensure the security and reinforcement of this information. Information obtained illegally harms society in every respect, especially education. This study examines how, and to what extent, innovative appliances such as computers should be used in computer-based education, together with the information security of these appliances. As the use of computers becomes ever more prevalent in our country, neither education nor the computer will become outdated, so how computer-based education affects our lives, and the study of information security for this type of education, are important topics.
Keywords: computer, information security, education, technology, development
Procedia PDF Downloads 594
14350 Dynamic Process Model for Designing Smart Spaces Based on Context-Awareness and Computational Methods Principles
Authors: Heba M. Jahin, Ali F. Bakr, Zeyad T. Elsayad
Abstract:
A smart space can be defined as any working environment that integrates embedded computers, information appliances, and multi-modal sensors to remain focused on the interaction among users, their activity, and their behavior in the space. Hence, smart spaces must be aware of their contexts and automatically adapt to changing context by interacting with their physical environment through natural and multimodal interfaces, and by serving information proactively. This paper suggests a dynamic framework for the architectural design process of the space, based on the principles of computational methods and context-awareness, to help create a field of changes and modifications. It generates possibilities and concerns about the physical, structural, and user contexts. The framework comprises five main processes: gathering and analyzing data to generate smart design scenarios, parameters, and attributes; transforming these by coding into four types of models; connecting those models in an interaction model that represents the context-awareness system; transforming that model into a virtual and ambient environment representing the physical and real environments, to act as a linkage between the users and the activities taking place in the smart space; and, finally, a feedback phase from users of that environment to ensure that the design of the smart space fulfills their needs. The resulting design process will help in designing smart spaces that can be adapted and controlled to answer the users' defined goals, needs, and activities.
Keywords: computational methods, context-awareness, design process, smart spaces
Procedia PDF Downloads 331
14349 The Implementation of the European Landscape Convention in Turkey: Opportunities and Constraints
Authors: Tutku Ak, Abdullah Kelkit, Cihad Öztürk
Abstract:
The number of multinational environmental agreements has increased in the past decade, particularly in Europe. Success with implementation, however, varies. While many countries are willing to join these agreements, they do not always fully honor their obligations to put their commitments into practice. One reason is that countries have different legal and administrative systems. One example of an international multilateral environmental agreement is the European Landscape Convention (ELC). The ELC expresses a concern to achieve sustainable development based on a balanced and harmonious relationship between social needs, economic activity, and the environment. Member states are required to implement the convention in accordance with their own administrative structures, respecting subsidiarity. In particular, the convention stresses the importance of cooperation in the protection, management, and planning of landscape resources. This paper gives a broad view of the ELC's implementation process in Turkey and the factors that have influenced it. In this context, the paper focuses on the objectives of the convention in addressing the loss of European landscapes, and on the justification and tools used to accomplish these objectives. The degree to which these objectives have been implemented in Turkey, and the opportunities and constraints faced during this process, are discussed.
Keywords: European landscape convention, implementation, multinational environmental agreements, policy tools
Procedia PDF Downloads 298
14348 Effect of Equal Channel Angular Pressing Process on Impact Property of Pure Copper
Authors: Fahad Al-Mufadi, F. Djavanroodi
Abstract:
Ultrafine-grained (UFG) and nanostructured (NS) materials have developed rapidly during the last decade and made a profound impact on every field of materials science and engineering. The present work was undertaken to develop ultrafine-grained pure copper by a severe plastic deformation method and to examine its impact property with different characterization tools. To this end, an equal channel angular pressing die with a channel angle of 90°, an outer corner angle of 17°, and a channel diameter of 20 mm was designed and manufactured. Commercially pure copper billets were ECAPed for up to four passes by route BC at ambient temperature. The results indicate a great improvement in hardness, yield strength, and ultimate tensile strength after the ECAP process: the hardness reaches 136 HV from 52 HV after the final pass, and enhancements of about 285% and 125% in the YS and UTS values, respectively, are obtained after the fourth pass compared with the as-received condition. On the other hand, the elongation to failure and the impact energy are reduced by the ECAP process and with increasing pass number: a reduction of about 56% in impact energy is attained for the ECAPed samples compared with the annealed specimens.
Keywords: SPD, ECAP, pure Cu, impact property
Procedia PDF Downloads 259
14347 Applying Big Data Analysis to Efficiently Exploit the Vast Unconventional Tight Oil Reserves
Authors: Shengnan Chen, Shuhua Wang
Abstract:
Successful production of hydrocarbons from unconventional tight oil reserves has changed the energy landscape in North America. The oil contained within these reservoirs typically will not flow to the wellbore at economic rates without assistance from advanced horizontal wells and multi-stage hydraulic fracturing. Efficient and economic development of these reserves is a priority for society, government, and industry, especially under the current low oil prices. Meanwhile, society needs technological and process innovations that enhance oil recovery while concurrently reducing environmental impacts. Recently, big data analysis and artificial intelligence have become very popular, developing data-driven insights for better designs and decisions in various engineering disciplines. However, the application of data mining in petroleum engineering is still in its infancy. This research aims to apply intelligent data analysis and data-driven models to exploit unconventional oil reserves both efficiently and economically. More specifically, a comprehensive database including the reservoir geological data, reservoir geophysical data, well completion data, and production data for thousands of wells is first established to discover valuable insights and knowledge related to tight oil reserves development. Several data analysis methods are introduced to analyze such a huge dataset. For example, K-means clustering is used to partition all observations into clusters; principal component analysis is applied to emphasize the variation and bring out strong patterns in the dataset, making the big data easy to explore and visualize; and exploratory factor analysis (EFA) is used to identify the complex interrelationships between well completion data and well production data.
Different data mining techniques, such as artificial neural networks, fuzzy logic, and machine learning techniques, are then summarized, and appropriate ones are selected to analyze the database based on prediction accuracy, model robustness, and reproducibility. The recognized knowledge and patterns are finally integrated into a modified self-adaptive differential evolution optimization workflow to enhance oil recovery and maximize the net present value (NPV) of the unconventional oil resources. This research will advance knowledge of the development of unconventional oil reserves and bridge the gap between big data and performance optimization in these formations. The newly developed data-driven optimization workflow is a powerful approach to guide field operations, leading to better designs, higher oil recovery, and better economic return for future wells in unconventional oil reserves.
Keywords: big data, artificial intelligence, enhance oil recovery, unconventional oil reserves
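The K-means partitioning step mentioned above can be sketched as follows; this is a dependency-free, illustrative implementation on toy one-dimensional per-well production values (the data are hypothetical, not the study's database):

```python
import random

def kmeans(points, k, iters=20, seed=0):
    """Minimal Lloyd's k-means on 1-D data: assign each point to the
    nearest centroid, then recompute centroids as cluster means."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda i: abs(p - centroids[i]))
            clusters[nearest].append(p)
        # Keep the old centroid if a cluster ends up empty.
        centroids = [sum(c) / len(c) if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return centroids, clusters

# Toy example: hypothetical cumulative production per well, forming
# two clear groups (low vs high producers).
prod = [10.0, 12.0, 11.0, 95.0, 98.0, 102.0]
centroids, clusters = kmeans(prod, k=2)
print(sorted(round(c, 1) for c in centroids))
```

Real well data are multi-dimensional (geology, completion, production), so a production implementation would use vector distances, but the assign-and-update loop is the same.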
Procedia PDF Downloads 283
14346 Comparing Xbar Charts: Conventional versus Reweighted Robust Estimation Methods for Univariate Data Sets
Authors: Ece Cigdem Mutlu, Burak Alakent
Abstract:
Maintaining the quality of manufactured products at a desired level depends on the stability of the process dispersion and location parameters and on detecting perturbations in these parameters as promptly as possible. The Shewhart control chart is the most widely used technique in statistical process monitoring for monitoring product quality and controlling the process mean and variability. In the application of Xbar control charts, the sample standard deviation and sample mean are known to be the most efficient conventional estimators of the process dispersion and location parameters, respectively, under the assumption of independent and normally distributed data. On the other hand, there is no guarantee that real-world data are normally distributed. When process parameters are estimated from Phase I data clouded with outliers, the efficiency of the traditional estimators is significantly reduced and the performance of Xbar charts is undesirably low; e.g., occasional outliers in the rational subgroups of the Phase I data set may considerably affect the sample mean and standard deviation, resulting in a serious delay in detecting inferior products in Phase II. For more efficient application of control charts, estimators that are robust against contaminations, which may exist in Phase I, are required. In the current study, we present a simple approach to constructing robust Xbar control charts using the average distance to the median, the Qn estimator of scale, and the M-estimator of scale with logistic psi-function for the process dispersion parameter, and the Harrell-Davis qth quantile estimator, the Hodges-Lehmann estimator, and the M-estimator of location with Huber and logistic psi-functions for the process location parameter.
The Phase I efficiency of the proposed estimators and the Phase II performance of the Xbar charts constructed from them are compared with the conventional mean and standard deviation statistics, both under normality and against diffuse-localized and symmetric-asymmetric contaminations, using 50,000 Monte Carlo simulations in MATLAB. It is found that the robust estimators yield parameter estimates with higher efficiency against all types of contamination, and that Xbar charts constructed using robust estimators have higher power in detecting disturbances than conventional methods. Additionally, utilizing individuals charts to screen outlier subgroups, and employing different combinations of dispersion and location estimators on subgroups and individual observations, are found to improve the performance of the Xbar charts.
Keywords: average run length, M-estimators, quality control, robust estimators
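As an illustration of one of the robust location estimators named above, a minimal sketch of the Hodges-Lehmann estimator on a toy Phase I subgroup contaminated with a single outlier (the data values are illustrative):

```python
from itertools import combinations_with_replacement
from statistics import mean, median

def hodges_lehmann(xs):
    """Hodges-Lehmann location estimate: the median of all pairwise
    (Walsh) averages (x_i + x_j) / 2, including i == j."""
    return median((a + b) / 2 for a, b in combinations_with_replacement(xs, 2))

# Toy rational subgroup with one gross outlier.
subgroup = [10.1, 9.9, 10.0, 10.2, 50.0]

print(mean(subgroup))            # pulled far toward the outlier
print(median(subgroup))          # robust
print(hodges_lehmann(subgroup))  # robust, more efficient than the median
```

The single outlier drags the sample mean to about 18, while both robust estimates stay near the bulk of the data around 10, which is why charts built from such estimators keep their power when Phase I is contaminated.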
Procedia PDF Downloads 190
14345 Nanda Ways of Knowing, Being and Doing: Our Process of Research Engagement and Research Impacts
Authors: Steven Kelly
Abstract:
A fundamental role of the researcher is research engagement, that is, the interaction between researchers and research end-users outside of academia for the mutually beneficial transfer of knowledge, technologies, methods, or resources. Research impact, in turn, is the contribution that research makes to the economy, society, environment, or culture beyond its contribution to academic research. Ironically, traditional impact metrics in the academy are designed to focus on the outputs; they dismiss the important role engagement plays in fostering a collaborative process that leads to meaningful, ethical, and useful impacts. Dr. Kelly, a Nanda (First Nations) man himself, has worked closely with the Nanda community over the past decade, ensuring cultural protocols are upheld and implemented while doing research engagement. The focus was on the process, which was essential to fostering a positive research impact culture. The contributions that flowed from this process were the naming of a new species of squat lobster in the Nanda language, a poster design in collaboration with The University of Melbourne, Museums Victoria and the Bundiyarra - Irra Wangga language centre, media coverage, and the formation of the 'Nanda language, Nanda country' project, a language revitalization project focused on reconnecting Nanda people with language and culture on Nanda Country. Such outcomes are imperative on the eve of the United Nations International Decade of Indigenous Languages. In this paper, Dr. Kelly will discuss how Nanda cultural practices informed research engagement to foster a collaborative process that, in turn, led to meaningful, ethical, and useful impacts within and outside of the academy.
Keywords: community collaboration, indigenous, nanda, research engagement, research impacts
Procedia PDF Downloads 114
14344 The Automatic Transliteration Model of Images of the Book Hamong Tani Using Statistical Approach
Authors: Agustinus Rudatyo Himamunanto, Anastasia Rita Widiarti
Abstract:
Transliteration of Javanese manuscripts is one of the methods for preserving and passing on the wealth of literature of the past to the present generation in Indonesia. The manual transliteration process commonly requires philologists and takes a relatively long time; an automatic transliteration process is expected to shorten that time and thereby support the work of philologists. A preprocessing and segmentation stage is first applied to manage the document images, producing script-image units that are free from noise and similar in thickness, size, and slope. The next stage, feature extraction, finds unique characteristics that distinguish each Javanese script image; one of the features used in this research is the number of black pixels in each image unit. Each Javanese script image in the training data undergoes the same process as the input characters. The system was tested with the book Hamong Tani, selected for its content, age, and number of pages, which were considered sufficient as experimental input for the model. Based on tests of the automatic transliteration of randomly chosen pages, the maximum correctness obtained was 81.53%, achieved with a 32x32-pixel input image size and a 5x5 image window. With regard to these results, it can be concluded that the proposed automatic transliteration model is relatively good.
Keywords: Javanese script, character recognition, statistical, automatic transliteration
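The black-pixel-count feature described above can be sketched as follows; the tiny 4x4 glyph and 2x2 window here are illustrative stand-ins for the 32x32 images and 5x5 windows reported in the study:

```python
# A binarised glyph image is divided into windows, and the number of
# black pixels (1s) in each window forms the feature vector used to
# distinguish script images.

def pixel_count_features(image, win):
    """image: 2-D list of 0/1 (1 = black); win: square window size.
    Returns one black-pixel count per window, row-major order."""
    h, w = len(image), len(image[0])
    feats = []
    for r in range(0, h, win):
        for c in range(0, w, win):
            feats.append(sum(image[i][j]
                             for i in range(r, min(r + win, h))
                             for j in range(c, min(c + win, w))))
    return feats

# Hypothetical 4x4 binarised glyph.
glyph = [
    [1, 1, 0, 0],
    [1, 0, 0, 0],
    [0, 0, 1, 1],
    [0, 0, 1, 0],
]
print(pixel_count_features(glyph, 2))  # one count per 2x2 window
```

Two glyphs can then be compared by the distance between their feature vectors, which is the basis of the statistical recognition step.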
Procedia PDF Downloads 339
14343 Imputation Technique for Feature Selection in Microarray Data Set
Authors: Younies Saeed Hassan Mahmoud, Mai Mabrouk, Elsayed Sallam
Abstract:
Analysing DNA microarray data sets is a great challenge for bioinformaticians due to the complications of applying statistical and machine learning techniques. The challenge is doubled when the microarray data sets contain missing data, which occurs regularly, because these techniques cannot deal with missing values. One of the most important data analysis processes on a microarray data set is feature selection, which finds the most important genes affecting a certain disease. In this paper, we introduce a technique for imputing the missing data in microarray data sets while performing feature selection.
Keywords: DNA microarray, feature selection, missing data, bioinformatics
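The abstract does not specify the imputation method; as a hedged illustration of the general idea, a sketch of simple per-gene mean imputation applied before feature selection (the gene names and expression values are hypothetical):

```python
from statistics import mean

# Each None in a gene's expression vector is replaced by the mean of
# that gene's observed values, so downstream feature-selection methods
# that cannot handle missing data can run on the completed matrix.

def impute_gene(values):
    """Replace None entries with the mean of the observed values."""
    observed = [v for v in values if v is not None]
    fill = mean(observed)
    return [fill if v is None else v for v in values]

# Toy expression matrix: genes x samples, with missing entries.
genes = {
    "GENE_A": [2.0, None, 4.0, 6.0],   # observed mean = 4.0
    "GENE_B": [1.0, 1.5, None, None],  # observed mean = 1.25
}
imputed = {g: impute_gene(vs) for g, vs in genes.items()}
print(imputed["GENE_A"])
```

Mean imputation is only the simplest option; KNN- or regression-based imputation is common for microarray data, and the paper's own technique may differ.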
Procedia PDF Downloads 574
14342 From Biosensors towards Artificial Intelligence: A New Era in Toxoplasmosis Diagnostics and Therapeutics
Authors: Gehan Labib Abuelenain, Azza Fahmi, Salma Awad Mahmoud
Abstract:
Toxoplasmosis is a global parasitic disease caused by the protozoan Toxoplasma gondii (T. gondii), with a high infection rate: it affects one-third of the human population and has severe implications for pregnant women, neonates, and immunocompromised patients. The anti-parasitic treatments and schemes available against toxoplasmosis have barely evolved over the last two decades: the available T. gondii therapeutics cannot completely eradicate the tissue cysts produced by the parasite and are not well tolerated by immunocompromised patients. This work aims to highlight new trends in T. gondii diagnosis by providing a comprehensive overview of the field, summarizing recent findings, and discussing new technological advancements in toxoplasma diagnosis and treatment. Advances in therapeutics that draw on trends in molecular biophysics, such as biosensors, epigenetics, and artificial intelligence (AI), may provide solutions for disease management and prevention. These insights will provide tools to identify research gaps and offer planning options for disease control.
Keywords: toxoplasmosis, diagnosis, therapeutics, biosensors, AI
Procedia PDF Downloads 36