Search results for: graphical user interface (GUI)
2266 Life-Cycle Assessment of Residential Buildings: Addressing the Influence of Commuting
Authors: J. Bastos, P. Marques, S. Batterman, F. Freire
Abstract:
Due to the demands of a growing urban population, it is crucial to manage urban development and its associated environmental impacts. While most environmental analyses have addressed buildings and transportation separately, both the design and the location of a building affect environmental performance, and focusing on one or the other can shift impacts and overlook improvement opportunities for more sustainable urban development. Recently, several life-cycle (LC) studies of residential buildings have integrated user transportation, focusing exclusively on primary energy demand and/or greenhouse gas emissions. Additionally, most papers considered only private transportation (mainly by car). Although private transportation is likely to have the largest share, both in terms of use and associated impacts, exploring the variability associated with mode choice is relevant for comprehensive assessments and, eventually, for supporting decision-makers. This paper presents a life-cycle assessment (LCA) of a residential building in Lisbon (Portugal), addressing building construction, use and user transportation (commuting with private and public transportation). Five environmental indicators or categories are considered: (i) non-renewable primary energy (NRE), (ii) greenhouse gas intensity (GHG), (iii) eutrophication (EUT), (iv) acidification (ACID), and (v) ozone layer depletion (OLD). In the first stage, the analysis addresses the overall life-cycle considering the statistical modal mix for commuting in the residence location. A comparative analysis then addresses the different available transportation modes to assess the influence of mode-choice variability on the results. The results highlight the large contribution of transportation to the overall LC results in all categories.
NRE and GHG show strong correlation, as the three LC phases contribute similar shares to both of them: building construction accounts for 6-9%, building use for 44-45%, and user transportation for 48% of the overall results. However, for other impact categories there is a large variation in the relative contribution of each phase. Transport is the most significant phase in OLD (60%); however, in EUT and ACID building use has the largest contribution to the overall LC (55% and 64%, respectively). In these categories, transportation accounts for 31-38%. A comparative analysis was also performed for four alternative transport modes for household commuting: car, bus, motorcycle, and company/school collective transport. The car has the largest impacts in all categories. When compared to the overall LC with commuting by car, mode choice accounts for a variability of about 35% in NRE, GHG and OLD (the categories where transportation accounted for the largest share of the LC), 24% in EUT and 16% in ACID. NRE and GHG show a strong correlation because all modes have internal combustion engines. The second largest results for NRE, GHG and OLD are associated with commuting by motorcycle; however, for ACID and EUT this mode performs better than bus and company/school transport. No single transportation mode performed best in all impact categories. Integrated assessments of buildings are needed to avoid shifting impacts between life-cycle phases and environmental categories, and ultimately to support decision-makers.
Keywords: environmental impacts, LCA, Lisbon, transport
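The phase-share percentages reported above are simple normalizations of absolute per-phase impacts. As a minimal sketch (the per-phase values below are invented for illustration, not the paper's data):

```python
def phase_shares(impacts):
    """Normalize absolute life-cycle phase impacts to percentage shares."""
    total = sum(impacts.values())
    return {phase: round(100 * value / total, 1) for phase, value in impacts.items()}

# Hypothetical absolute NRE values per phase (GJ), chosen only for illustration:
nre = {"construction": 120.0, "building use": 660.0, "user transportation": 720.0}
shares = phase_shares(nre)
# -> {'construction': 8.0, 'building use': 44.0, 'user transportation': 48.0}
```

The same normalization applies category by category, which is how a phase can dominate one indicator (e.g. transport in OLD) while another phase dominates a different one.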
Procedia PDF Downloads 362
2265 Real-Time Demonstration of Visible Light Communication Based on Frequency-Shift Keying Employing a Smartphone as the Receiver
Authors: Fumin Wang, Jiaqi Yin, Lajun Wang, Nan Chi
Abstract:
In this article, we demonstrate a visible light communication (VLC) system over an 8 m free-space transmission link based on a commercial LED and a receiver connected to the audio interface of a smartphone. The signal is in FSK modulation format. The successful experimental demonstration validates the feasibility of the proposed system for future wireless communication networks.
Keywords: visible light communication, smartphone communication, frequency shift keying, wireless communication
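Binary FSK maps each bit to one of two tone frequencies, which is what makes an audio-interface receiver workable. A sketch of the idea follows; the frequencies, sample rate, and bit duration are illustrative assumptions, not the demonstrated system's parameters, and the zero-crossing demodulator is one simple (non-coherent) detection choice:

```python
import math

def fsk_modulate(bits, f0=1200, f1=2200, fs=44100, bit_dur=0.01):
    """Map each bit to a tone burst: 0 -> f0 Hz, 1 -> f1 Hz (binary FSK)."""
    n = int(fs * bit_dur)
    samples = []
    for bit in bits:
        f = f1 if bit else f0
        samples.extend(math.sin(2 * math.pi * f * i / fs) for i in range(n))
    return samples

def fsk_demodulate(samples, f0=1200, f1=2200, fs=44100, bit_dur=0.01):
    """Recover bits by counting zero crossings in each bit period."""
    n = int(fs * bit_dur)
    # Expected crossings are ~2*f*bit_dur; threshold sits between the two tones.
    threshold = (f0 + f1) * bit_dur
    bits = []
    for k in range(len(samples) // n):
        chunk = samples[k * n:(k + 1) * n]
        crossings = sum(1 for a, b in zip(chunk, chunk[1:]) if a * b < 0)
        bits.append(1 if crossings > threshold else 0)
    return bits
```

With these parameters a 0-bit burst has about 24 zero crossings and a 1-bit burst about 44, so the midpoint threshold separates them cleanly.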
Procedia PDF Downloads 392
2264 Numerical Investigation of Flow Boiling within Micro-Channels in the Slug-Plug Flow Regime
Authors: Anastasios Georgoulas, Manolia Andredaki, Marco Marengo
Abstract:
The present paper investigates the hydrodynamics and heat transfer characteristics of slug-plug flows under saturated flow boiling conditions within circular micro-channels. Numerical simulations are carried out using an enhanced version of the open-source CFD solver 'interFoam' of the OpenFOAM CFD Toolbox. The proposed user-defined solver is based on the Volume of Fluid (VOF) method for interface advection, and the enhancements include the implementation of a smoothing process for spurious current reduction, coupling with heat transfer and phase change, and the incorporation of conjugate heat transfer to account for transient solid conduction. In all of the cases considered in the present paper, a single-phase simulation is initially conducted until a quasi-steady state is reached with respect to the hydrodynamic and thermal boundary layer development. Then, a predefined and constant frequency of successive vapour bubbles is patched upstream at a certain distance from the channel inlet. The proposed numerical simulation set-up can capture the main hydrodynamic and heat transfer characteristics of slug-plug flow regimes within circular micro-channels. In more detail, the present investigation focuses on the interaction between subsequent vapour slugs with respect to their generation frequency, the hydrodynamic characteristics of the liquid film between the generated vapour slugs and the channel wall, and those of the liquid plug between two subsequent vapour slugs. The investigation is carried out for three different working fluids and three different values of applied heat flux in the heated part of the considered microchannel. The post-processing and analysis of the results indicate that the dynamics of the evolving bubbles in each case are influenced by both the upstream and downstream bubbles in the generated sequence. In each case a slip velocity between the vapour bubbles and the liquid slugs is evident.
In most cases, interfacial waves appearing close to the bubble tail significantly reduce the liquid film thickness. Finally, in accordance with previous investigations, vortices identified in the liquid slugs between two subsequent vapour bubbles can significantly enhance the convection heat transfer between the liquid regions and the heated channel walls. The overall results of the present investigation can be used to enhance the present understanding by providing better insight into the complex underlying heat transfer mechanisms of saturated boiling within micro-channels in the slug-plug flow regime.
Keywords: slug-plug flow regime, micro-channels, VOF method, OpenFOAM
Procedia PDF Downloads 267
2263 Towards a Systematic Evaluation of Web Design
Authors: Ivayla Trifonova, Naoum Jamous, Holger Schrödl
Abstract:
A good web design is a prerequisite for a successful business nowadays, especially since the internet is the most common way for people to inform themselves. Web design includes the optical composition, the structure, and the user guidance of websites. The importance of each website leads to the question of whether there is a way to measure its usefulness. The aim of this paper is to suggest a methodology for the evaluation of web design. The desired outcome is an evaluation that focuses on a specific website and its target group.
Keywords: evaluation methodology, factor analysis, target group, web design
Procedia PDF Downloads 636
2262 Computerized Analysis of Phonological Structure of 10,400 Brazilian Sign Language Signs
Authors: Wanessa G. Oliveira, Fernando C. Capovilla
Abstract:
Capovilla and Raphael's Libras Dictionary documents a corpus of 4,200 Brazilian Sign Language (Libras) signs. Duduchi and Capovilla's software SignTracking permits users to retrieve signs even when the corresponding gloss is unknown, and to discover the meaning of all 4,200 signs simply by clicking on graphic menus of the sign characteristics (phonemes). Duduchi and Capovilla have discovered that the ease with which any given sign can be retrieved is an inverse function of the average popularity of its component phonemes. Thus, signs composed of rare (distinct) phonemes are easier to retrieve than those composed of common phonemes. SignTracking offers a means of computing the average popularity of the phonemes that make up each of the 4,200 signs. It provides a precise measure of the degree of ease with which signs can be retrieved and sign meanings can be discovered. Duduchi and Capovilla's logarithmic model proved valid: the degree to which any given sign can be retrieved is an inverse function of the arithmetic mean of the logarithm of the popularity of each component phoneme. Capovilla, Raphael and Mauricio's New Libras Dictionary documents a corpus of 10,400 Libras signs. The present analysis revealed the Libras DNA structure by mapping the incidence of 501 sign phonemes resulting from the layered distribution of five parameters: 163 handshape phonemes (CherEmes-ManusIculi); 34 finger shape phonemes (DactilEmes-DigitumIculi); 55 hand placement phonemes (ArtrotoToposEmes-ArticulatiLocusIculi); 173 movement dimension phonemes (CinesEmes-MotusIculi) pertaining to direction, frequency, and type; and 76 facial expression phonemes (MascarEmes-PersonalIculi).
Keywords: Brazilian sign language, lexical retrieval, libras sign, sign phonology
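The logarithmic model described above can be sketched as a score over a sign's phoneme popularity counts. Negating the mean log-popularity is one plausible reading of "inverse function", and the popularity counts below are invented for illustration:

```python
import math

def retrieval_ease(phoneme_popularity):
    """Sketch of the described logarithmic model: retrieval ease is an
    inverse function of the mean log-popularity of a sign's component
    phonemes. Negation is used here, so rarer phonemes score higher."""
    logs = [math.log(p) for p in phoneme_popularity]
    return -sum(logs) / len(logs)

# A sign built from rare (distinct) phonemes scores higher than one built
# from very common phonemes (hypothetical popularity counts):
rare_sign = retrieval_ease([3, 5, 2])
common_sign = retrieval_ease([900, 750, 800])
```

Any monotonically decreasing transform of the mean log-popularity would preserve the ranking the model predicts; the negation is just the simplest choice.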
Procedia PDF Downloads 345
2261 Quantum Dots Incorporated in Biomembrane Models for Cancer Marker
Authors: Thiago E. Goto, Carla C. Lopes, Helena B. Nader, Anielle C. A. Silva, Noelio O. Dantas, José R. Siqueira Jr., Luciano Caseli
Abstract:
Quantum dots (QD) are semiconductor nanocrystals that can be employed in biological research as a tool for fluorescence imaging, with the potential to expand in vivo and in vitro analysis as cancerous cell biomarkers. Particularly, cadmium selenide (CdSe) magic-sized quantum dots (MSQDs) exhibit stable luminescence that is feasible for biological applications, especially for imaging of tumor cells. Given these facts, it is interesting to know the mechanisms of action by which such QDs mark biological cells. For that, simplified models are a suitable strategy. Among these models, Langmuir films of lipids formed at the air-water interface seem adequate since they can mimic half a membrane. They are monomolecular films that form spontaneously when organic solutions of amphiphilic compounds are spread on a liquid-gas interface. After solvent evaporation, the monomolecular film is formed, and a variety of techniques, including tensiometric, spectroscopic and optical ones, can be applied. When the monolayer is formed by membrane lipids at the air-water interface, a model for half a membrane can be inferred, where the aqueous subphase serves as a model for the external or internal compartment of the cell. These films can be transferred to solid supports, forming the so-called Langmuir-Blodgett (LB) films, and a wider variety of techniques can additionally be used to characterize the film, allowing for the formation of devices and sensors. With these ideas in mind, the objective of this work was to investigate the specific interactions of CdSe MSQDs with tumorigenic and non-tumorigenic cells using Langmuir monolayers and LB films of lipids and specific cell extracts as membrane models for the diagnosis of cancerous cells.
Surface pressure-area isotherms and polarization-modulated infrared reflection-absorption spectroscopy (PM-IRRAS) showed an intrinsic interaction between the quantum dots, inserted in the aqueous subphase, and the Langmuir monolayers, constructed either of selected lipids or of non-tumorigenic and tumorigenic cell extracts. The quantum dots expanded the monolayers and changed the PM-IRRAS spectra of the lipid monolayers. The mixed films were then compressed to high surface pressures and transferred from the floating monolayer to solid supports using the LB technique. Images of the films were then obtained with atomic force microscopy (AFM) and confocal microscopy, which provided information about the morphology of the films. Similarities and differences between films of different compositions representing cell membranes, with or without CdSe MSQDs, were analyzed. The results indicated that the interaction of quantum dots with the bioinspired films is modulated by the lipid composition. The properties of the normal cell monolayer were not significantly altered, whereas the films of the tumorigenic cell monolayer models presented significant alteration. The images therefore exhibited a stronger effect of CdSe MSQDs on the models representing cancerous cells. As an important implication of these findings, one may envisage new bioinspired surfaces based on molecular recognition for biomedical applications.
Keywords: biomembrane, langmuir monolayers, quantum dots, surfaces
Procedia PDF Downloads 196
2260 A Hybrid Multi-Criteria Hotel Recommender System Using Explicit and Implicit Feedbacks
Authors: Ashkan Ebadi, Adam Krzyzak
Abstract:
Recommender systems, also known as recommender engines, have become an important research area and are now being applied in various fields. In addition, the techniques behind recommender systems have improved over time. In general, such systems help users find the products or services they require (e.g. books, music) by analyzing and aggregating other users' activities and behavior, mainly in the form of reviews, and making the best recommendations. The recommendations can facilitate users' decision-making processes. Despite the wide literature on the topic, using multiple data sources of different types as the input has not been widely studied. Recommender systems can benefit from the high availability of digital data to collect input data of different types, which implicitly or explicitly help the system improve its accuracy. Moreover, most of the existing research in this area is based on single rating measures, in which a single rating is used to link users to items. This paper proposes a highly accurate hotel recommender system, implemented in various layers. Using a multi-aspect rating system and benefiting from large-scale data of different types, the recommender system suggests hotels that are personalized and tailored for the given user. The system employs natural language processing and topic modelling techniques to assess the sentiment of the users' reviews and extract implicit features. The entire recommender engine contains multiple sub-systems, namely user clustering, a matrix factorization module, and a hybrid recommender system. Each sub-system contributes to the final composite set of recommendations by covering a specific aspect of the problem. The accuracy of the proposed recommender system has been tested extensively, and the results confirm the high performance of the system.
Keywords: tourism, hotel recommender system, hybrid, implicit features
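One simple way to combine explicit multi-aspect ratings with an implicit sentiment feature is a weighted blend. This is a sketch of the general idea, not the paper's actual model: the aspect names, weights, and blending factor `alpha` are all assumptions for illustration.

```python
def hybrid_score(aspect_ratings, aspect_weights, sentiment, alpha=0.7):
    """Blend an explicit multi-aspect rating (weighted mean of per-aspect
    scores on a 1-5 scale) with an implicit sentiment score in [0, 1].
    alpha weighs explicit vs. implicit evidence."""
    explicit = sum(aspect_ratings[a] * w for a, w in aspect_weights.items())
    explicit /= sum(aspect_weights.values())
    explicit_norm = (explicit - 1) / 4  # map the 1-5 scale onto 0-1
    return alpha * explicit_norm + (1 - alpha) * sentiment

# Hypothetical hotel aspects and a review-derived sentiment score:
ratings = {"cleanliness": 5, "location": 4, "service": 3}
weights = {"cleanliness": 0.5, "location": 0.3, "service": 0.2}
score = hybrid_score(ratings, weights, sentiment=0.8)
```

In the paper's layered architecture this kind of blend would sit at the final composition step, after the clustering and matrix-factorization sub-systems have produced their candidate scores.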
Procedia PDF Downloads 273
2259 Adaptive Certificate-Based Mutual Authentication Protocol for Mobile Grid Infrastructure
Authors: H. Parveen Begam, M. A. Maluk Mohamed
Abstract:
Mobile grid computing is an environment that allows sharing and coordinated use of diverse resources in dynamic, heterogeneous and distributed environments using different types of electronic portable devices. In a grid environment, security issues such as authentication, authorization, message protection and delegation are handled by the GSI (Grid Security Infrastructure). Providing better security between mobile devices and the grid infrastructure is a major issue because of the open nature of wireless networks and the heterogeneous, distributed environment. In a mobile grid environment, the individual computing devices may be resource-limited in isolation; as an aggregated sum, however, they have the potential to play a vital role within the mobile grid environment. Some adaptive methodology or solution is needed to address issues such as authentication of a base station, security of information flowing between a mobile user and a base station, prevention of attacks within a base station, hand-over of authentication information, the communication cost of establishing a session key between a mobile user and a base station, and the computational complexity of achieving authenticity and security. The sharing of the devices' resources can be achieved only through trusted relationships between the mobile hosts (MHs). Before accessing the grid service, the mobile devices should be proven authentic. This paper proposes a dynamic certificate-based mutual authentication protocol between two mobile hosts in a mobile grid environment. The certificate generation process is done by a CA (Certificate Authority) for all authenticated MHs. Security (because of the validity period of the certificate) and dynamicity (transmission time) can be achieved through the secure service certificates.
The authentication protocol is built on communication services to provide cryptographically secured mechanisms for verifying the identity of users and resources.
Keywords: mobile grid computing, certificate authority (CA), SSL/TLS protocol, secured service certificates
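The mutual-authentication idea can be illustrated with a toy challenge-response exchange keyed by a shared session key. This sketch substitutes HMAC over a symmetric key for the paper's actual certificate operations, so it only conveys the "both sides prove possession of a secret" structure, not the proposed protocol itself:

```python
import hashlib
import hmac
import secrets

def respond(key: bytes, challenge: bytes) -> bytes:
    """Prove possession of the (certificate-derived) session key."""
    return hmac.new(key, challenge, hashlib.sha256).digest()

def mutual_authenticate(key_a: bytes, key_b: bytes) -> bool:
    """Both hosts succeed only if they hold the same session key.
    Each side issues a fresh random challenge and checks the peer's response
    against the value it computes itself."""
    challenge_a = secrets.token_bytes(16)
    challenge_b = secrets.token_bytes(16)
    ok_b = hmac.compare_digest(respond(key_b, challenge_a),
                               respond(key_a, challenge_a))
    ok_a = hmac.compare_digest(respond(key_a, challenge_b),
                               respond(key_b, challenge_b))
    return ok_a and ok_b
```

In the certificate-based setting, the response step would instead be a signature verified against the peer's CA-issued certificate, with the certificate's validity period providing the "security" property the abstract mentions.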
Procedia PDF Downloads 306
2258 Assessment and Characterization of Dual-Hardening Adhesion Promoter for Self-Healing Mechanisms in Metal-Plastic Hybrid System
Authors: Anas Hallak, Latifa Seblini, Juergen Wilde
Abstract:
In mechatronics or sensor technology, plastic housings are used to protect sensitive components from harmful environmental influences, such as moisture, media, or reactive substances. Connections, preferably in the form of metallic lead-frame structures, through the housing wall are required for their electrical supply or control. In such systems, the incompatibility between the plastic component, e.g., Polyamide66, and the metal surface, e.g., copper, often leads to an insufficient connection. As a result, leakage paths can occur along the plastic-metal interface. Since adhesive bonding has been established as one of the most important joining processes and its use has expanded significantly, driven by the development of improved high-performance adhesives and bonding techniques, this technology has been applied to metal-plastic hybrid structures. In this study, an epoxy bonding agent from DELO (DUALBOND LT2266) has been used to improve the mechanical and chemical bonding between the metal and the polymer. It is an adhesion promoter with two reaction stages. The first stage provides fixation to the lead frame directly after the coating step, which can be achieved by UV exposure for a few seconds. In the second stage, the material is thermally hardened during injection molding. To analyze the two reaction stages of the primer, dynamic DSC experiments were carried out and correlated with Fourier-transform infrared spectroscopy measurements. Furthermore, the number of crosslinking bonds formed in the system in each reaction stage has also been estimated by rheological characterization. Those investigations have been performed with different UV exposure times, 12 and 96 s, and in an industrially preferred temperature range from -20 to 175°C. The shear viscosity values of the primer have been measured as a function of temperature and exposure time.
For further interpretation, the storage modulus values have been calculated, and the so-called Booij-Palmen plot has been sketched. The next approach in this study concerns the self-healing mechanisms in the hybrid system, in which the primer should flow into micro-damage such as interfacial cracks, inhibit them from growing, and close them. The ability of the primer to flow into and penetrate defined capillaries made in Ultramid was investigated. Holes with a diameter of 0.3 mm were produced in injection-molded A3EG7 plates with 4 mm thickness. A copper substrate coated with the DUALBOND was placed on the A3EG7 plate and pressed with a certain force. Metallographic analyses were carried out to verify the filling grade, which showed an almost 95% filling ratio of the capillaries. Finally, to assess the self-healing mechanism in metal-plastic hybrid systems, characterizations have been carried out on a simple geometry with a metal inlay developed by the Institute of Polymer Technology at Friedrich-Alexander-University. The specimens have been modified with a tungsten wire, which was pulled out after the injection molding to create a micro-hole in the specimen at the interface between the primer and the polymer. The capability of the primer to heal those micro-cracks upon heating, pressing, and thermal aging has been characterized through metallographic analyses.
Keywords: hybrid structures, self-healing, thermoplastic housing, adhesive
Procedia PDF Downloads 193
2257 A POX Controller Module to Collect Web Traffic Statistics in SDN Environment
Authors: Wisam H. Muragaa, Kamaruzzaman Seman, Mohd Fadzli Marhusin
Abstract:
Software Defined Networking (SDN) is a new networking paradigm. It is designed to facilitate the way of managing, measuring, debugging and controlling the network dynamically, and to make it suitable for modern applications. Generally, measurement methods can be divided into two categories: active and passive. An active measurement method injects test packets into the network in order to monitor their behaviour (the ping tool is an example), while a passive measurement method monitors the traffic for the purpose of deriving measurement values. Both active and passive measurement methods are useful for the collection of traffic statistics and the monitoring of network traffic. Although there has been work focusing on measuring traffic statistics in an SDN environment, it was only meant for measuring packet and byte rates for non-web traffic. In this study, a feasible method will be designed to measure the number of packets and bytes in a certain time, and to facilitate obtaining statistics for both web traffic and non-web traffic. Web traffic refers to HTTP requests that use the application layer, while non-web traffic refers to ICMP and TCP requests. Thus, this work is more comprehensive than previous works. With a developed module on the POX OpenFlow controller, information will be collected from each active flow in the OpenFlow switch and presented on the Command Line Interface (CLI) and the Wireshark interface. The statistics displayed on the CLI and Wireshark interfaces include the type of protocol, the number of bytes and the number of packets, among others. Besides, this module will show the number of flows added to the switch whenever traffic is generated from and to hosts in the same statistics list.
In order to carry out this work effectively, our Python module will send a statistics request message to the switch, requesting its current port and flow statistics every five seconds, while the switch will reply with the required information in a message called a statistics reply message. Thus, the POX controller will be notified and updated with any changes that happen in the entire network in a very short time. The aim of this study is therefore to prepare a list of the important statistics elements collected from the whole network, to be used for further research, particularly research dealing with the detection of network attacks that cause a sudden rise in the number of packets and bytes, such as Distributed Denial of Service (DDoS).
Keywords: mininet, OpenFlow, POX controller, SDN
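The per-class tallying such a module performs on each flow-statistics reply can be sketched independently of the POX API. The flow-record field names below are assumptions for illustration, not OpenFlow or POX structures; the web/non-web split follows the abstract's definition (HTTP vs. ICMP and other TCP):

```python
from collections import defaultdict

def classify(flow):
    """Web traffic = HTTP (TCP on port 80) per the abstract's definition;
    everything else (ICMP, other TCP) counts as non-web."""
    if flow["proto"] == "TCP" and 80 in (flow["src_port"], flow["dst_port"]):
        return "web"
    return "non-web"

def aggregate(flow_stats):
    """Tally packet and byte counters per traffic class, as the module
    might after each statistics reply from the switch."""
    totals = defaultdict(lambda: {"packets": 0, "bytes": 0})
    for flow in flow_stats:
        cls = classify(flow)
        totals[cls]["packets"] += flow["packets"]
        totals[cls]["bytes"] += flow["bytes"]
    return dict(totals)

# Hypothetical flow records as might be parsed from a statistics reply:
flows = [
    {"proto": "TCP", "src_port": 34512, "dst_port": 80, "packets": 12, "bytes": 9000},
    {"proto": "ICMP", "src_port": 0, "dst_port": 0, "packets": 4, "bytes": 392},
]
stats = aggregate(flows)
```

Running this aggregation on every five-second reply and differencing successive totals would yield the packet and byte rates the study targets, including the sudden rises characteristic of DDoS traffic.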
Procedia PDF Downloads 235
2256 Harmonizing Cities: Integrating Land Use Diversity and Multimodal Transit for Social Equity
Authors: Zi-Yan Chao
Abstract:
With the rapid development of urbanization and increasing demand for efficient transportation systems, the interaction between land use diversity and transportation resource allocation has become a critical issue in urban planning. Achieving a balance of land use types, such as residential, commercial, and industrial areas, plays a crucial role in ensuring social equity and sustainable urban development. Simultaneously, optimizing multimodal transportation networks, including bus, subway, and car routes, is essential for minimizing total travel time and costs while ensuring fairness for all social groups, particularly in meeting the transportation needs of low-income populations. This study develops a bilevel programming model to address these challenges, with land use diversity as the foundation for measuring equity. The upper-level model maximizes land use diversity for balanced land distribution across regions. The lower-level model optimizes multimodal transportation networks to minimize travel time and costs while maintaining user equilibrium. The model also incorporates constraints to ensure fair resource allocation, such as balancing transportation accessibility and cost differences across various social groups. A solution approach is developed to solve the bilevel optimization problem, ensuring efficient exploration of the solution space for land use and transportation resource allocation. This study maximizes social equity by maximizing land use diversity and achieving user equilibrium with optimal transportation resource distribution. The proposed method provides a robust framework for addressing urban planning challenges, contributing to sustainable and equitable urban development.
Keywords: bilevel programming model, genetic algorithms, land use diversity, multimodal transportation optimization, social equity
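A common way to quantify land-use diversity, which an upper-level objective of this kind could maximize, is Shannon entropy over land-use shares. This is a sketch of one standard diversity index; the paper's actual measure is not specified here:

```python
import math

def land_use_diversity(areas):
    """Shannon entropy of land-use shares: maximal when all use types
    occupy equal area, lower as one type dominates a region."""
    total = sum(areas.values())
    shares = [a / total for a in areas.values() if a > 0]
    return -sum(p * math.log(p) for p in shares)

# A perfectly balanced region vs. a residential-dominated one (toy areas):
balanced = land_use_diversity({"residential": 1, "commercial": 1, "industrial": 1})
skewed = land_use_diversity({"residential": 8, "commercial": 1, "industrial": 1})
```

In a bilevel scheme, the upper level would adjust the area allocations to raise this index subject to the equity constraints, while each candidate allocation is evaluated by solving the lower-level user-equilibrium assignment.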
Procedia PDF Downloads 24
2255 Passive Vibration Isolation Analysis and Optimization for Mechanical Systems
Authors: Ozan Yavuz Baytemir, Ender Cigeroglu, Gokhan Osman Ozgen
Abstract:
Vibration is an important issue in the design of various components of aerospace, marine and vehicular applications. In order not to lose the components' function and operational performance, vibration isolation design, involving the selection of optimum isolator properties and isolator positioning, is a critical study. Given the growing need for vibration isolation system design, this paper presents two types of software capable of implementing modal analysis, response analysis for both random and harmonic excitations, static deflection analysis, and Monte Carlo simulations, in addition to parameter and location optimization for different types of isolation problem scenarios. A review of the literature shows no study that develops a software-based tool capable of implementing all of those analysis, simulation and optimization studies on one platform simultaneously. In this paper, the theoretical system model is generated for a 6-DOF rigid body. The vibration isolation system of any mechanical structure can be optimized using a hybrid method involving both global search and gradient-based methods. After defining the optimization design variables, different types of optimization scenarios are listed in detail. Recognizing the need for a user-friendly vibration isolation problem solver, two types of graphical user interfaces (GUIs) are prepared and verified using a commercial finite element analysis program, Ansys Workbench 14.0. Using the analysis and optimization capabilities of those GUIs, a real application used in an air platform is also presented as a case study at the end of the paper.
Keywords: hybrid optimization, Monte Carlo simulation, multi-degree-of-freedom system, parameter optimization, location optimization, passive vibration isolation analysis
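For a single-DOF viscously damped isolator, the classical transmissibility curve illustrates why isolator stiffness and damping are natural optimization variables. This is the textbook formula for the 1-DOF case, shown as a sketch of the underlying physics rather than the 6-DOF model or GUI internals described above:

```python
import math

def transmissibility(r, zeta):
    """Transmissibility of a viscously damped 1-DOF isolator.
    r = excitation frequency / natural frequency, zeta = damping ratio:
    T = sqrt((1 + (2*zeta*r)^2) / ((1 - r^2)^2 + (2*zeta*r)^2))."""
    num = 1 + (2 * zeta * r) ** 2
    den = (1 - r ** 2) ** 2 + (2 * zeta * r) ** 2
    return math.sqrt(num / den)

# Isolation (T < 1) only occurs above r = sqrt(2); near resonance the
# isolator amplifies the excitation instead:
t_resonant = transmissibility(1.0, 0.1)  # amplification region
t_isolated = transmissibility(3.0, 0.1)  # isolation region
```

An optimizer choosing isolator stiffness effectively chooses the natural frequency, i.e. pushes the operating frequency ratio r well past sqrt(2), which is exactly the trade-off a parameter-optimization GUI automates for the full 6-DOF model.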
Procedia PDF Downloads 565
2254 Forensic Methods Used for the Verification of the Authenticity of Prints
Authors: Olivia Rybak-Karkosz
Abstract:
This paper aims to present the results of scientific research on methods of forging art prints and their elements, such as signatures or provenance, and on the forensic science methods that might be used to verify their authenticity. In recent decades, the art market has seen significant interest in purchasing prints. They are considered an economical alternative to paintings and a considerable investment. However, the authenticity of an art print is difficult to establish, as similar visual effects can be achieved with drawings or xerox copies; the latter are easy to make using a home printer. They are then offered on flea markets or internet auctions as genuine prints. This probable ease of forgery and, at the same time, the difficulty of distinguishing art print techniques were the main reasons this research was undertaken. A lack of scientific methods dedicated to disclosing such forgeries encouraged the author to verify the possibility of using forensic science methods known and used in other fields of expertise. The research methodology consisted of compiling representative forgery samples collected in selected museums based in Poland and a few in Germany and Austria. This allowed the author to present a typology of methods used to forge art prints. Given that banknotes and securities are among the most familiar examples of graphic design, it seems appropriate to propose using methods of detecting counterfeit currency in print verification. These methods include the examination of ink, paper, and watermarks. On prints, signatures and imprints of stamps, among other elements, are additionally forged, so the examination should be complemented with handwriting examination and forensic sphragistics. The paper contains a stipulation to conduct a complex analysis of authenticity with the participation of an art restorer, an art historian, and a forensic expert as the head of this team.
Keywords: art forgery, examination of an artwork, handwriting analysis, prints
Procedia PDF Downloads 129
2253 Re-Constructing the Research Design: Dealing with Problems and Re-Establishing the Method in User-Centered Research
Authors: Kerem Rızvanoğlu, Serhat Güney, Emre Kızılkaya, Betül Aydoğan, Ayşegül Boyalı, Onurcan Güden
Abstract:
This study addresses the re-construction and implementation process of the methodological framework developed to evaluate how locative media applications accompany the urban experiences of international students coming to Istanbul on exchange programs in 2022. The research design was built on a three-stage model. In the first stage, the research team conducted a qualitative questionnaire to gain exploratory data. These data were then used to form three persona groups representing the sample by applying cluster analysis. In the second phase, a semi-structured digital diary study was carried out on a gamified task list with a sample selected from the persona groups. This stage proved to be the most difficult for obtaining valid data from the participant group. The research team re-evaluated the design of this second phase to reach participants who would perform the given tasks while sharing their momentary city experiences, to ensure the daily data flow for two weeks, and to increase the quality of the obtained data. The final stage, which elaborates on the findings, is the "Walk & Talk," completed with face-to-face, in-depth interviews. The multiple methods used in the research process were found to contribute to the depth and data diversity of research conducted in the context of urban experience and locative technologies. In addition, by adapting the research design to the experiences of the users included in the sample, the differences and similarities between the initial research design and the research as applied are shown.
Keywords: digital diary study, gamification, multi-model research, persona analysis, research design for urban experience, user-centered research, "Walk & Talk"
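Forming persona groups by cluster analysis can be sketched with a minimal k-means over a single invented feature; the questionnaire variable (daily app-usage hours), the data, and k=3 are all assumptions for illustration, since real persona analysis would cluster on many questionnaire dimensions:

```python
def kmeans(points, k, iters=20):
    """Minimal 1-D k-means: seed centers from evenly spaced sorted points,
    then alternate assignment and center updates."""
    centers = sorted(points)[::max(1, len(points) // k)][:k]
    for _ in range(iters):
        groups = [[] for _ in centers]
        for p in points:
            j = min(range(len(centers)), key=lambda c: abs(p - centers[c]))
            groups[j].append(p)
        centers = [sum(g) / len(g) if g else c for g, c in zip(groups, centers)]
    return centers, groups

# Hypothetical daily locative-app usage hours from the exploratory questionnaire:
usage = [0.5, 0.7, 1.0, 3.9, 4.2, 4.0, 8.1, 7.9, 8.3]
centers, groups = kmeans(usage, k=3)
```

Each resulting group (light, moderate, and heavy users in this toy example) would then be summarized into a persona and used to select the diary-study sample.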
Procedia PDF Downloads 171
2252 Collaboration between Grower and Research Organisations as a Mechanism to Improve Water Efficiency in Irrigated Agriculture
Authors: Sarah J. C. Slabbert
Abstract:
The uptake of research as part of the diffusion or adoption of innovation by practitioners, whether individuals or organisations, has been a popular topic in agricultural development studies for many decades. In the classical, linear model of innovation theory, the innovation originates from an expert source such as a state-supported research organisation or academic institution. The changing context of agriculture led to the development of the agricultural innovation systems model, which recognizes innovation as a complex interaction between individuals and organisations, which include private industry and collective action organisations. In terms of this model, an innovation can be developed and adopted without any input or intervention from a state or parastatal research organisation. This evolution in the diffusion of agricultural innovation has put forward new challenges for state or parastatal research organisations, which have to demonstrate the impact of their research to the legislature or a regulatory authority: Unless the organisation and the research it produces cross the knowledge paths of the intended audience, there will be no awareness, no uptake and certainly no impact. It is therefore critical for such a research organisation to base its communication strategy on a thorough understanding of the knowledge needs, information sources and knowledge networks of the intended target audience. In 2016, the South African Water Research Commission (WRC) commissioned a study to investigate the knowledge needs, information sources and knowledge networks of Water User Associations and commercial irrigators with the aim of improving uptake of its research on efficient water use in irrigation. The first phase of the study comprised face-to-face interviews with the CEOs and Board Chairs of four Water User Associations along the Orange River in South Africa, and 36 commercial irrigation farmers from the same four irrigation schemes. 
Intermediaries who act as knowledge conduits to the Water User Associations and the irrigators were identified, and 20 of them were subsequently interviewed telephonically. The study found that irrigators interact regularly with grower organisations such as SATI (South African Table Grape Industry) and SAPPA (South African Pecan Nut Association) and that they perceive these organisations as credible, trustworthy and reliable, within their limitations. State and parastatal research institutions, on the other hand, are associated with a range of negative attributes. As a result, the awareness of, and interest in, the WRC and its research on water use efficiency in irrigated agriculture are low. The findings suggest that a communication strategy built on collaboration with these grower organisations would empower the WRC to participate much more efficiently and with greater impact in agricultural innovation networks. The paper will elaborate on the findings and discuss partnering frameworks and opportunities to manage perceptions and uptake.
Keywords: agricultural innovation systems, communication strategy, diffusion of innovation, irrigated agriculture, knowledge paths, research organisations, target audiences, water use efficiency
Procedia PDF Downloads 113
2251 Annealing of the Contact between Graphene and Metal: Electrical and Raman Study
Authors: A. Sakavičius, A. Lukša, V. Nargelienė, V. Bukauskas, G. Astromskas, A. Šetkus
Abstract:
We investigate the influence of annealing on the properties of a contact between graphene and metal (Au and Ni), using the circular transmission line model (CTLM) contact geometry. Kelvin probe force microscopy (KPFM) and Raman spectroscopy are applied to characterize the surface and interface properties. Annealing causes a decrease of the metal-graphene contact resistance for both Ni and Au.
Keywords: Au/Graphene contacts, graphene, Kelvin force probe microscopy, NiC/Graphene contacts, Ni/Graphene contacts, Raman spectroscopy
Procedia PDF Downloads 317
2250 A Web-Based Real Property Updating System for Efficient and Sustainable Urban Development: A Case Study in Ethiopia
Authors: Eyosiyas Aga
Abstract:
The development of information and communication technology has transformed paper-based mapping and land registration into computerized, networked systems. The computerization and networking of real property information play a vital role in good governance and in the sustainable development of emerging countries through cost-effective, easy, and accessible service delivery for the customer. An efficient, transparent, and sustainable real property system is becoming basic infrastructure for urban development, improving data management and service delivery in organizations. In Ethiopia, real property administration is paper-based; as a result, it has confronted problems of data management, illegal transactions, corruption, and poor service delivery. To solve these problems and to facilitate the real property market, the implementation of a web-based real property updating system is crucial. Web-based real property updating is one automation (computerization) method to facilitate data sharing and to reduce the time and cost of service delivery in a real property administration system. In addition, it is useful for integrating data across different information systems and organizations. The system presented here was designed by combining open source software supported by the Open Geospatial Consortium (OGC). OGC standards such as the Web Feature Service (WFS) and Web Map Service (WMS) are the most widely used standards to support and improve web-based real property updating. These services allow the integration of data from different sources and can be used to maintain consistency of data throughout transactions. PostgreSQL and GeoServer are used to manage real property data and connect it to the flex viewer and user interface.
The system is designed for both internal updating (by the municipality), mainly the updating of spatial and textual information, and external use (by the customer), focused on providing information to and interacting with the customer. This research assessed the potential of open source web applications and adopted this technology for a real property updating system in Ethiopia in a simple, cost-effective, and secure way. The system was designed by combining and customizing open source software to enhance efficiency at low cost. The existing workflow for real property updating was analyzed to identify bottlenecks, and a new workflow was designed for the system. Requirements were identified through a questionnaire and a literature review, and the system was prototyped for the study area. The research mainly aimed to integrate human resources with technology in the design of the system to reduce data inconsistency and security problems. In addition, the research reflects on the current situation of real property administration and on the contribution of effective data management to efficient, transparent, and sustainable urban development in Ethiopia.
Keywords: cadaster, real property, sustainable, transparency, web feature service, web map service
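As a rough illustration of how the OGC services named in this abstract are typically invoked, the sketch below builds a WFS 2.0 GetFeature request URL against a GeoServer instance. The endpoint and the layer name (`cadastre:parcels`) are hypothetical placeholders, not the study's actual configuration.

```python
from urllib.parse import urlencode

def wfs_getfeature_url(base_url, layer, srs="EPSG:4326", out="application/json"):
    # Build an OGC WFS 2.0 GetFeature request URL for a vector layer.
    params = {
        "service": "WFS",
        "version": "2.0.0",
        "request": "GetFeature",
        "typeNames": layer,        # e.g. a cadastral parcel layer in GeoServer
        "srsName": srs,
        "outputFormat": out,
    }
    return base_url + "?" + urlencode(params)

# Hypothetical GeoServer endpoint and layer name, for illustration only
url = wfs_getfeature_url("http://localhost:8080/geoserver/wfs", "cadastre:parcels")
print(url)
```

A web client (such as the flex viewer mentioned above) would issue a GET on this URL and receive the parcels as GeoJSON, served by GeoServer from the PostgreSQL/PostGIS store.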
Procedia PDF Downloads 267
2249 A First-Principles Molecular Dynamics Study on Li+ Solvation Structures in THF/MTHF Containing Electrolytes for Lithium Metal Batteries
Authors: Chiu-Neng Su, Santhanamoorthi Nachimuthu, Jyh-Chiang Jiang
Abstract:
In lithium-ion batteries (LIBs), the solid–electrolyte interphase (SEI) layer that forms on the anode surface plays a crucial role in stabilizing battery performance. Over the past two decades, efforts to enhance LIB electrolytes have primarily focused on refining the quality of SEI components. Despite these endeavors, several observed phenomena related to the SEI layer remain inadequately explained. Consequently, there has been a significant surge in research interest in electrolyte solvation structures as a way to elucidate improvements in battery performance. Thus, in this study, we explored the solvation structures of LiPF₆ in a mixture of the organic solvents tetrahydrofuran (THF) and 2-methyltetrahydrofuran (MTHF) using ab initio molecular dynamics (AIMD) simulations. We investigated the solvation structure of electrolytes at different salt concentrations: a low-concentration electrolyte (1.0 M LiPF₆ in a 1:1 v/v mixture of THF and MTHF) and a high-concentration electrolyte (2.0 M LiPF₆ in a 1:1 v/v mixture of THF and MTHF), and compared them with a conventional electrolyte (1.0 M LiPF₆ in a 1:1 v/v mixture of ethylene carbonate (EC) and dimethyl carbonate (DMC)). Furthermore, the reduction stability of the Li⁺ solvation structures in these electrolyte systems is investigated. It is found that the first solvation shell of Li⁺ consists primarily of THF. We also analyzed molecular orbital energy levels to understand the reduction stability of these solvents. Compared with the solvation sheath of the commercial electrolyte, the THF/MTHF-containing electrolytes have a higher lowest unoccupied molecular orbital (LUMO) energy level, resulting in improved reduction and interface stability. It has been shown that Li-Al alloys can significantly improve cycle life and promote the formation of a dense SEI layer. Therefore, this study also constructs the solvation structures obtained from the pure-electrolyte calculations on the surface of an Al-Li alloy.
Additionally, AIMD simulations will be conducted to investigate chemical reactions at the interface, aiming to elucidate the composition of the SEI layer that forms. Furthermore, Bader charges are used to determine the origin and flow of electrons, thereby revealing the sequence of reduction reactions that generate the SEI layer.
Keywords: lithium, aluminum, alloy, battery, solvation structure
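The composition of the Li⁺ first solvation shell reported above is the kind of quantity typically extracted from AIMD trajectories by counting coordinating solvent atoms within a cutoff. The minimal sketch below, with invented coordinates, box size, and cutoff radius (not the study's values), computes a minimum-image coordination number for a single frame.

```python
import numpy as np

def coordination_number(li_pos, solvent_o, box, cutoff=2.5):
    # Count solvent oxygens within `cutoff` angstroms of Li+ in one frame,
    # applying the minimum-image convention for a periodic orthorhombic box.
    d = solvent_o - li_pos
    d -= box * np.round(d / box)          # wrap displacements into the box
    r = np.linalg.norm(d, axis=1)
    return int(np.sum(r < cutoff))

box = np.array([15.0, 15.0, 15.0])        # invented box lengths (angstroms)
li = np.array([7.5, 7.5, 7.5])
oxygens = np.array([[8.5, 7.5, 7.5],      # 1.0 A away  -> in first shell
                    [7.5, 9.5, 7.5],      # 2.0 A away  -> in first shell
                    [1.0, 1.0, 1.0]])     # ~11 A away  -> outside
cn = coordination_number(li, oxygens, box)
```

Averaging this count over all trajectory frames gives the mean coordination number used to characterize the solvation shell.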
Procedia PDF Downloads 23
2248 Issues in Travel Demand Forecasting
Authors: Huey-Kuo Chen
Abstract:
Travel demand forecasting, comprising four travel choices (trip generation, trip distribution, modal split, and traffic assignment), constitutes the core of transportation planning. In current practice, travel demand forecasting is associated with three important issues: interface inconsistencies among the four travel choices, inefficiency of commonly used solution algorithms, and undesirable multiple-path solutions. In this paper, each of the three issues is elaborated extensively. An ideal unified framework for the combined model, consisting of the four travel choices and variable demand functions, is also suggested. A few remarks are provided at the end of the paper.
Keywords: travel choices, B algorithm, entropy maximization, dynamic traffic assignment
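As a small illustration of the trip-distribution step in the four-step model above, the sketch below balances a doubly-constrained gravity model with the Furness (iterative proportional fitting) procedure; the zone totals, cost matrix, and deterrence parameter are invented for the example.

```python
import numpy as np

def furness(productions, attractions, deterrence, iters=50):
    # Doubly-constrained gravity model balanced by the Furness / IPF method:
    # alternately scale rows to trip productions and columns to attractions.
    T = deterrence.astype(float).copy()
    for _ in range(iters):
        T *= (productions / T.sum(axis=1))[:, None]
        T *= (attractions / T.sum(axis=0))[None, :]
    return T

# Invented two-zone example; totals must balance (300 trips each way)
P = np.array([100.0, 200.0])                 # trip productions per zone
A = np.array([150.0, 150.0])                 # trip attractions per zone
cost = np.array([[1.0, 2.0], [2.0, 1.0]])    # generalized travel cost
F = np.exp(-0.5 * cost)                      # exponential deterrence function
T = furness(P, A, F)                         # T[i, j] = trips from i to j
```

The balanced matrix reproduces both the row and column totals, which is exactly the consistency property that interface mismatches between the four steps tend to violate.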
Procedia PDF Downloads 458
2247 Rest API Based System-level Test Automation for Mobile Applications
Authors: Jisoo Song
Abstract:
Today’s mobile applications communicate with servers more and more in order to access external services or information. Also, server-side code changes more frequently than client-side code in a mobile application. These frequent changes lead to an increase in testing cost. To reduce costs, UI-based test automation can be one solution; it is a common automation technique in system-level testing. However, it can be unsuitable for mobile applications. When you automate tests based on UI elements for mobile applications, there are limitations such as the overhead of script maintenance or the difficulty of finding invisible defects that UI elements cannot represent. To overcome these limitations, we present a new automation technique based on REST APIs. You can automate system-level tests through test scripts that call a series of REST APIs in a user’s action sequence. This technique does not require testers to know the internal implementation details, only the inputs and expected outputs of the REST APIs. You can easily modify test cases by modifying REST API input values, and you can also find problems that might not be evident at the UI level by validating output values. For example, when an application receives price information from a payment server and the user cannot see it at the UI level, REST API based scripts can check whether the price information is correct. More than 10 mobile applications at our company are tested automatically with REST API scripts whenever application source code, mostly server source code, is built. We find defects right away by setting the scripts as a build job in the CI server; the build job starts when application code builds are completed. This presentation will also include field cases from our company.
Keywords: case studies at SK Planet, introduction of rest API based test automation, limitations of UI based test automation
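The API-level check described above (validating price data the UI never shows) can be sketched as follows. Since the abstract does not publish its endpoints, the server here is a stand-in dictionary with a hypothetical endpoint and payload; a real script would issue HTTP calls (e.g. with the `requests` library) against the payment server.

```python
# Hypothetical endpoints and payloads, for illustration only
FAKE_SERVER = {
    ("GET", "/v1/products/42/price"): (200, {"price": 9900, "currency": "KRW"}),
}

def call_api(method, path):
    # Stand-in HTTP client: look up the canned response for (method, path).
    return FAKE_SERVER.get((method, path), (404, {}))

def test_price_is_correct():
    # Validate price data at the API level, even though the UI never shows it
    status, body = call_api("GET", "/v1/products/42/price")
    assert status == 200
    assert body["price"] == 9900 and body["currency"] == "KRW"

test_price_is_correct()
print("price check passed")
```

Wired into a CI build job, such scripts run on every server build and fail fast when an output value drifts from the expectation.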
Procedia PDF Downloads 449
2246 Finite Element Analysis for Earing Prediction Incorporating the BBC2003 Material Model with Fully Implicit Integration Method: Derivation and Numerical Algorithm
Authors: Sajjad Izadpanah, Seyed Hadi Ghaderi, Morteza Sayah Irani, Mahdi Gerdooei
Abstract:
In this research work, a sophisticated yield criterion known as BBC2003, capable of describing the planar anisotropic behavior of aluminum alloy sheets, was integrated into the commercial finite element code ABAQUS/Standard via a user subroutine. The complete formulation of the implementation process using a fully implicit integration scheme, i.e., the classic backward Euler method, is presented, and relevant aspects of the yield criterion are introduced. In order to solve the nonlinear differential and algebraic equations, a line-search algorithm was adopted in the user-defined material subroutine (UMAT) to expand the convergence domain of the iterative Newton-Raphson method. The developed subroutine was used to simulate a challenging computational problem with complex stress states, i.e., deep drawing of the anisotropic aluminum alloy AA3105. The accuracy and stability of the developed subroutine were confirmed by comparing the numerically predicted earing and thickness variation profiles with experimental results, which were in excellent agreement. The integration of the BBC2003 yield criterion into ABAQUS/Standard represents a significant contribution to the field of computational mechanics and provides a useful tool for analyzing the mechanical behavior of anisotropic materials subjected to complex loading conditions.
Keywords: BBC2003 yield function, plastic anisotropy, fully implicit integration scheme, line search algorithm, explicit and implicit integration schemes
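A minimal sketch of the Newton-Raphson iteration with backtracking line search, the device used above to widen the convergence domain of the implicit stress update. It is shown here on a toy two-equation residual rather than the actual BBC2003 return-mapping equations.

```python
import numpy as np

def newton_line_search(F, J, x0, tol=1e-10, max_iter=50):
    # Newton-Raphson with backtracking line search on the residual norm:
    # halve the step until it yields a sufficient decrease of ||F||.
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        r = F(x)
        if np.linalg.norm(r) < tol:
            break
        dx = np.linalg.solve(J(x), -r)        # full Newton direction
        t = 1.0
        while (np.linalg.norm(F(x + t * dx)) > (1.0 - 1e-4 * t) * np.linalg.norm(r)
               and t > 1e-8):
            t *= 0.5                           # backtrack
        x = x + t * dx
    return x

# Toy residual with root at (1, 2); stands in for the return-mapping system
F = lambda x: np.array([x[0] ** 2 - 1.0, x[1] - 2.0])
J = lambda x: np.array([[2.0 * x[0], 0.0], [0.0, 1.0]])
root = newton_line_search(F, J, np.array([5.0, 5.0]))
```

The damped steps keep the iteration stable far from the solution, while full Newton steps (t = 1) are recovered near it, preserving quadratic convergence.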
Procedia PDF Downloads 75
2245 Investigation of Doping Effects on Nonradiative Recombination Parameters in Bulk GaAs
Authors: Soufiene Ilahi
Abstract:
We have used photothermal deflection spectroscopy (PTD) to investigate the impact of doping on the electronic properties of bulk GaAs. The parameters are extracted by fitting the theoretical curves to the experimental PTD curves. We found that the electron mobility in p-type C-doped GaAs is about 300 cm²/V·s. Accordingly, the minority-carrier diffusion length is equal to 5 µm (± 7%), 5 µm (± 4.4%), and 1.42 µm (± 7.2%) for the Cr-, C-, and Si-doped GaAs, respectively. The surface recombination velocity varies widely, at around 7942 m/s, 100 m/s, and 153 m/s for the Si-, Cr-, and C-doped GaAs, respectively.
Keywords: nonradiative lifetime, mobility of minority carrier, diffusion length, surface and interface recombination in GaAs
Procedia PDF Downloads 72
2244 A Survey of Field Programmable Gate Array-Based Convolutional Neural Network Accelerators
Authors: Wei Zhang
Abstract:
With the rapid development of deep learning, neural networks and deep learning algorithms play a significant role in various practical applications. Due to their high accuracy and good performance, Convolutional Neural Networks (CNNs) in particular have become a research hotspot in the past few years. However, network sizes are becoming increasingly large to meet the demands of practical applications, which poses a significant challenge for constructing high-performance implementations of deep learning neural networks. Meanwhile, many of these application scenarios also place strict requirements on the performance and power consumption of hardware devices. It is therefore particularly critical to choose a suitable computing platform for hardware acceleration of CNNs. This article surveys recent advances in Field Programmable Gate Array (FPGA)-based acceleration of CNNs. Various designs and implementations of FPGA-based accelerators across different devices and network models are reviewed, and counterparts on Graphics Processing Units (GPUs), Application Specific Integrated Circuits (ASICs), and Digital Signal Processors (DSPs) are compared, together with our own critical analysis and comments. Finally, we discuss these acceleration and optimization methods on FPGA platforms from different perspectives to explore the opportunities and challenges for future research, and we give a prospect for the future development of FPGA-based accelerators.
Keywords: deep learning, field programmable gate array, FPGA, hardware accelerator, convolutional neural networks, CNN
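For reference, the computational kernel these accelerators target is the direct convolution below; FPGA designs unroll and pipeline exactly these nested loops in hardware. The array sizes and weights are toy values for illustration.

```python
import numpy as np

def conv2d(x, w):
    # Direct (naive) 2D convolution, stride 1, no padding: the nested-loop
    # kernel that FPGA accelerators map onto parallel multiply-accumulate units.
    H, W = x.shape
    k, _ = w.shape
    out = np.zeros((H - k + 1, W - k + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(x[i:i + k, j:j + k] * w)
    return out

x = np.arange(16.0).reshape(4, 4)   # toy 4x4 feature map
w = np.ones((3, 3)) / 9.0           # 3x3 mean filter as the weight kernel
y = conv2d(x, w)                    # 2x2 output feature map
```

Counting the multiply-accumulate operations in these loops is how the compute demand of a CNN layer, and hence the required accelerator throughput, is usually estimated.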
Procedia PDF Downloads 128
2243 Hybrid Lubri-Coolants as an Alternatives to Mineral Based Emulsion in Machining Aerospace Alloy Ti-6Al-4V
Authors: Muhammad Jamil, Ning He, Wei Zhao
Abstract:
Ti-6Al-4V has poor thermal conductivity (6.7 W/m·K), so shear and friction heat accumulates at the tool-chip interface zone. To dissipate the generated heat and reduce friction, cryogenic cooling, minimum quantity lubrication (MQL), nanofluids, hybrid cryogenic-MQL, solid lubricants, etc. are frequently applied, and studies underscore their significant effect on improving the machinability of Ti-6Al-4V. Nowadays, hybrid lubri-cooling is attracting attention from researchers exploring its effect on machining Ti-6Al-4V.
Keywords: hybrid lubri-cooling, tool wear, surface roughness, minimum quantity lubrication
Procedia PDF Downloads 144
2242 Replication of Meaningful Gesture Study for N400 Detection Using a Commercial Brain-Computer Interface
Authors: Thomas Ousterhout
Abstract:
In an effort to test the ability of a commercial-grade EEG headset to measure the N400 ERP effectively, a replication study was conducted to see whether results similar to those obtained with a medical-grade EEG could be produced. Pictures of meaningful and meaningless hand postures were borrowed from the original author, and subjects were required to perform a semantic discrimination task. The N400 was detected, indicating semantic processing of the meaningfulness of the hand postures. The results corroborate those of the original author and support the use of some commercial-grade EEG headsets for non-critical research applications.
Keywords: EEG, ERP, N400, semantics, congruency, gestures, emotiv
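The ERP logic behind such a replication can be sketched with synthetic data: averaging over trials suppresses noise, and the N400 appears as a negativity around 400 ms in the incongruent-minus-congruent difference wave. The sampling rate, trial counts, and amplitudes below are invented, not taken from the study.

```python
import numpy as np

rng = np.random.default_rng(0)
fs = 128                                    # Hz; assumed headset sampling rate
t = np.arange(-0.2, 0.8, 1.0 / fs)          # epoch from -200 ms to +800 ms
n400 = -4e-6 * np.exp(-((t - 0.4) ** 2) / (2 * 0.05 ** 2))  # simulated N400

def make_epochs(effect, n_trials=100):
    # Each trial = effect waveform + Gaussian EEG noise (amplitudes in volts)
    return effect + rng.normal(0.0, 3e-6, (n_trials, t.size))

congruent = make_epochs(0.0)                # meaningful postures: no N400
incongruent = make_epochs(n400)             # meaningless postures: N400 present

# Averaging over trials yields the ERPs; subtracting isolates the N400
diff_wave = incongruent.mean(axis=0) - congruent.mean(axis=0)
peak_time = t[np.argmin(diff_wave)]         # should land near 0.4 s
```

The same averaging applies to real headset recordings; the practical question the study answers is whether a consumer device's noise floor still lets the averaged N400 emerge.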
Procedia PDF Downloads 263
2241 CRM Cloud Computing: An Efficient and Cost Effective Tool to Improve Customer Interactions
Authors: Gaurangi Saxena, Ravindra Saxena
Abstract:
Lately, cloud computing has been used to attain corporate goals more effectively and efficiently at lower cost. This new computing paradigm has emerged as a powerful tool for optimum utilization of resources and for gaining competitiveness through cost reduction, achieving business goals with greater flexibility. Realizing the importance of this new technique, most of the well-known companies in the computer industry, such as Microsoft, IBM, Google, and Apple, are spending millions of dollars researching cloud computing and investigating the possibility of producing interface hardware for cloud computing systems. It is believed that, with the right middleware, a cloud computing system can execute all the programs a normal computer could run: potentially, everything from the simplest generic word processing software to highly specialized programs customized for a specific company could work successfully on a cloud computing system. A cloud is a pool of virtualized computer resources. Clouds are not limited to grid environments; they also support “interactive user-facing applications” such as web applications and three-tier architectures. Cloud computing is not a fundamentally new paradigm: it draws on existing technologies and approaches, such as utility computing, software-as-a-service, distributed computing, and centralized data centers. Some companies rent physical space to store servers and databases because they do not have it available on site; cloud computing gives these companies the option of storing data on someone else’s hardware, removing the need for physical space on the front end. Prominent service providers like Amazon, Google, Sun, IBM, Oracle, and Salesforce are extending computing infrastructures and platforms as a core for providing top-level services for computation, storage, databases, and applications. Application services may include email, office applications, finance, video, audio, and data processing.
By using a cloud computing system, a company can improve its customer relationship management. A CRM cloud computing system can be highly useful in delivering to a sales team a blend of unique functionalities that improve agent/customer interactions. This paper first defines cloud computing as a tool for running business activities more effectively and efficiently at a lower cost, and then distinguishes cloud computing from grid computing. Based on an exhaustive literature review, the authors discuss applications of cloud computing in different disciplines of management, especially in the field of marketing, with special reference to the use of cloud computing in CRM. The study concludes that a CRM cloud computing platform helps a company track data such as orders, discounts, references, competitors, and much more. By using CRM cloud computing, companies can improve their customer interactions and, by serving customers more efficiently at a lower cost, gain competitive advantage.
Keywords: cloud computing, competitive advantage, customer relationship management, grid computing
Procedia PDF Downloads 312
2240 Urban Open Source: Synthesis of a Citizen-Centric Framework to Design Densifying Cities
Authors: Shaurya Chauhan, Sagar Gupta
Abstract:
Prominent urbanizing centres across the globe, like Delhi, Dhaka, or Manila, have exhibited that development often faces a challenge in bridging the gap between the top-down collective requirements of the city and the bottom-up individual aspirations of an ever-diversifying population. When this exclusion is intertwined with rapid urbanization and a diversifying urban demography, unplanned sprawl, poor planning, and low-density development emerge as automatic responses. In parallel, new ideas and methods of densification and public participation are being widely adopted as sustainable alternatives for the future of urban development. This research advocates a collaborative design method for future development: one that allows rapid application through its prototypical nature and an inclusive approach that mediates between the 'user' and the 'urban', purely with the use of empirical tools. Building upon the concepts and principles of 'open-sourcing' in design, the research establishes a design framework that serves current user requirements while allowing for future citizen-driven modifications. This is synthesized as a three-tiered model: user needs - design ideology - adaptive details. The research culminates in a context-responsive 'open source project development framework' (hereinafter referred to as OSPDF) that can be used for on-ground field applications. To bring forward specifics, the research looks at a 300-acre redevelopment in the core of a rapidly urbanizing city as a case encompassing extreme physical, demographic, and economic diversity. The suggested measures also integrate the region’s cultural identity and social character with the diverse citizen aspirations, using architecture and urban design tools and references from recognized literature.
This framework, based on a vision - feedback - execution loop, is used for hypothetical development at the five prevalent scales of design: master planning, urban design, architecture, tectonics, and modularity, in chronological order. At each of these scales, the possible approaches and avenues for open-sourcing are identified, validated through trial and error, and subsequently recorded. The research attempts to re-calibrate the architectural design process and make it more responsive and people-centric. Analytical tools such as Space, Event, and Movement by Bernard Tschumi and the Five-Point Mental Map by Kevin Lynch, among others, are deep-rooted in the research process. Beyond the five-part OSPDF, a two-part subsidiary process is also suggested after each cycle of application, for continued appraisal and refinement of the framework and the urban fabric over time. The research is an exploration of the possibilities for an architect to adopt the new role of a 'mediator' in the development of contemporary urbanity.
Keywords: open source, public participation, urbanization, urban development
Procedia PDF Downloads 149
2239 Content Monetization as a Mark of Media Economy Quality
Authors: Bela Lebedeva
Abstract:
Characteristics of the Web as a channel of information dissemination - accessibility and openness, interactivity and multimedia news - are becoming broader and reach the audience quickly, positively affecting the perception of content but blurring the understanding of journalistic work. As a result, audiences and advertisers continue migrating to the Internet. Moreover, online targeting makes it possible to monetize not only the audience (as is customary for traditional media) but also content and traffic, and to do so more accurately. As users identify themselves with the qualitative characteristics of the new market, its actors take shape. A conflict of interests lies at the base of the economy of their relations, the problem of a traffic tax being one example. Meanwhile, content monetization also actualizes the fiscal interest of the state. The balance of supply and demand is often violated due to political risks, particularly where state capitalism, populism, and authoritarian methods of governance bear on social institutions such as the media. A unique example of access to journalistic material limited by content monetization is the television channel Dozhd' (Rain) in the Russian web space. Its liberal-minded audience has a better possibility for discussion; however, the channel could have been much more successful under unlimited free speech. To avoid state pressure and censorship, its management decided to preserve at least its online performance, monetizing all of the content for the core audience. The study methodology was primarily based on analysis of journalistic content and on qualitative and quantitative analysis of the audience. By reconstructing the main events and the relationships among market actors over the last six years, the researcher reached several conclusions. First, under content monetization, the capitalization of content quality will always strive toward the qualitative characteristics of the user, thereby identifying him.
Vice versa, user demand generates high-quality journalism. The second conclusion follows from the first. The growth of technology, information noise, new political challenges, economic volatility, and changing cultural paradigms all shape the content-paying model for the individual user. This model defines the user as a beneficiary of specific knowledge and indicates a constant balance of supply and demand, other conditions being equal. As a result, a new economic quality of information is created; this feature is an indicator of the market as a self-regulating system. Monetized quality information is less popular than that of public broadcasting services, but its audience is able to make decisions. These very users sustain the niche sectors that have more potential for technological development, including new ways of monetizing content. The third point of the study allows it to be developed in the discourse of media-space liberalization. This cultural phenomenon may open opportunities for developing the architecture of social and economic relations, both locally and regionally.
Keywords: content monetization, state capitalism, media liberalization, media economy, information quality
Procedia PDF Downloads 248
2238 Architecture - Performance Relationship in GPU Computing - Composite Process Flow Modeling and Simulations
Authors: Ram Mohan, Richard Haney, Ajit Kelkar
Abstract:
Current developments in computing have shown the advantage of using one or more Graphics Processing Units (GPUs) to boost the performance of many computationally intensive applications, but there are still limits to these GPU-enhanced systems. The major factors that contribute to the limitations of GPUs for High Performance Computing (HPC) can be categorized as hardware- or software-oriented in nature. Understanding how these factors affect performance is essential for developing efficient and robust application codes that employ one or more GPU devices as powerful co-processors for HPC computational modeling. This research and technical presentation focuses on the analysis and understanding of the intrinsic interrelationship of both hardware and software categories on computational performance for single and multiple GPU-enhanced systems, using a computationally intensive application that is representative of a large portion of the challenges confronting modern HPC. The representative application uses unstructured finite element computations for transient composite resin infusion process flow modeling as its computational core; its characteristics and results reflect many other HPC applications via the sparse matrix system used for the solution of the linear system of equations. This work describes these various software and hardware factors and how they interact to affect the performance of computationally intensive applications, enabling more efficient development and porting of high performance computing applications, including current, legacy, and future large-scale computational modeling applications in various engineering and scientific disciplines.
Keywords: graphical processing unit, software development and engineering, performance analysis, system architecture and software performance
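The sparse-matrix core mentioned above reduces to kernels like the CSR (compressed sparse row) matrix-vector product sketched below, shown in plain Python/NumPy for clarity; GPU implementations parallelize the row loop across threads. The 3×3 system is a toy stand-in for an assembled finite element matrix.

```python
import numpy as np

def csr_matvec(data, indices, indptr, x):
    # Sparse matrix-vector product in CSR storage: the kernel that dominates
    # iterative solvers for finite element linear systems on GPUs and CPUs.
    y = np.zeros(len(indptr) - 1)
    for row in range(len(y)):
        start, end = indptr[row], indptr[row + 1]
        y[row] = np.dot(data[start:end], x[indices[start:end]])
    return y

# Toy 3x3 symmetric tridiagonal system [[2,-1,0],[-1,2,-1],[0,-1,2]] in CSR
data = np.array([2.0, -1.0, -1.0, 2.0, -1.0, -1.0, 2.0])
indices = np.array([0, 1, 0, 1, 2, 1, 2])
indptr = np.array([0, 2, 5, 7])
x = np.array([1.0, 1.0, 1.0])
y = csr_matvec(data, indices, indptr, x)   # same result as the dense A @ x
```

The irregular, indirect memory accesses through `indices` are precisely the hardware-side factor (memory bandwidth and latency rather than raw FLOPs) that limits GPU performance for this class of application.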
Procedia PDF Downloads 363
2237 Digital Transformation and Digitalization of Public Administration
Authors: Govind Kumar
Abstract:
The concept of ‘e-governance’, brought about by the new wave of reforms, namely ‘LPG’ (liberalization, privatization, and globalization), in the early 1990s, has been enabling governments across the globe to digitally transform themselves. Digital transformation equips governments with qualitative decisions, optimization in the rational use of resources, facilitation of cost-benefit analyses, and elimination of redundancy and corruption with the help of ICT-based application interfaces. ICT-based applications and technologies have enormous potential for effecting positive change in the social lives of the global citizenry. Supercomputers test and analyze millions of drug molecules for developing candidate vaccines to combat the global pandemic. Further, e-commerce portals help distribute and supply household items and medicines, while videoconferencing tools provide a visual interface between clients and hosts. Besides, crop yields are being maximized with the help of drones and machine learning, whereas satellite data, artificial intelligence, and cloud computing help governments detect illegal mining, tackle deforestation, and manage freshwater resources. Such e-applications have the potential to take governance an extra mile by achieving the five Es of e-governance (effective, efficient, easy, empower, and equity) and the six Rs of sustainable development (reduce, reuse, recycle, recover, redesign, and remanufacture). If such digital transformation gains traction within the government framework, it will replace traditional administration with the digitalization of public administration. On the other hand, it has brought a new set of challenges, like the digital divide, e-illiteracy, and the technological divide, and problems like handling e-waste, technological obsolescence, cyber terrorism, e-fraud, hacking, and phishing, before governments. Therefore, it is essential to bring in the right mixture of technological and humanistic interventions to address these issues.
This is because technology lacks an emotional quotient, and administration does not work like technology; both remain ineffective unless a blend of technology and a humane face is brought into the administration. The paper will empirically analyze the significance of the technological framework of digital transformation within the government setup for the digitalization of public administration, on the basis of a synthesis of two case studies undertaken in two diverse fields of administration, and will present a future framework for the study.
Keywords: digital transformation, electronic governance, public administration, knowledge framework
Procedia PDF Downloads 99