Search results for: quantum information encoding scheme
10429 Usability Testing on Information Design through Single-Lens Wearable Device
Authors: Jae-Hyun Choi, Sung-Soo Bae, Sangyoung Yoon, Hong-Ku Yun, Jiyoung Kwahk
Abstract:
This study was conducted to investigate the effect of ocular dominance on recognition performance using a single-lens smart display designed for cycling. A total of 36 bicycle riders who had been cycling consistently were recruited and participated in the experiment. For safety reasons, the participants were asked to perform the tasks while riding a bicycle on a stationary stand. Independent variables of interest included ocular dominance, bike usage, age group, and information layout. Recognition time (i.e., the time required to identify specific information, measured with an eye-tracker), error rate (i.e., a false answer or failure to identify the information within 5 seconds), and user preference scores were measured, and statistical tests were conducted to identify significant results. Recognition time and error ratio showed significant differences by the ocular dominance factor, while the preference score did not. Recognition time was faster when the single-lens see-through display was worn on the dominant eye (average 1.12 sec) than on the non-dominant eye (average 1.38 sec). The error ratio of the information recognition task was significantly lower when the see-through display was worn on the dominant eye (average 4.86%) than on the non-dominant eye (average 14.04%). The interaction effect of ocular dominance and age group was significant with respect to recognition time and error ratio. The recognition time of users in their 40s was significantly longer than that of the other age groups when the display was placed on the non-dominant eye, while no difference was observed on the dominant eye. The error ratio showed the same pattern. Although no difference was observed for the main effects of ocular dominance and bike usage, the interaction effect between the two variables was significant with respect to preference score.
The preference score of daily bike users was higher when the display was placed on the dominant eye, whereas participants who use bikes for leisure purposes showed the opposite preference pattern. It was found more effective and efficient to wear a see-through display on the dominant eye than on the non-dominant eye, although user preference was not affected by ocular dominance. It is recommended to wear a see-through display on the dominant eye since it is safer, helping the user recognize the presented information faster and more accurately, even if the user may not notice the difference. Keywords: eye tracking, information recognition, ocular dominance, smart headwear, wearable device
Procedia PDF Downloads 273
10428 Analyzing Semantic Feature Using Multiple Information Sources for Reviews Summarization
Authors: Yu Hung Chiang, Hei Chia Wang
Abstract:
Nowadays, tourism has become a part of life. Before reserving hotels, customers need information about them to help them make decisions, and the most important source of such information is online reviews. Due to the dramatic growth of online reviews, it is impossible for tourists to read all reviews manually. Therefore, designing an automatic review analysis system that summarizes reviews is necessary for them. The main purpose of the system is to understand the opinion of the reviews, which may be positive or negative. In other words, the system analyzes whether the customers who visited the hotel liked it or not. Sentiment analysis methods help the system achieve this purpose. In sentiment analysis, the targets of an opinion (here called features) should be recognized to clarify the polarity of the opinion, which may otherwise be ambiguous. Hence, the study proposes an unsupervised method using Part-Of-Speech patterns and multi-lexicon sentiment analysis to summarize all reviews. We expect this method can help customers find the information they want as well as make decisions efficiently. Keywords: text mining, sentiment analysis, product feature extraction, multi-lexicons
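A minimal sketch of the approach this abstract describes, for illustration only. The word lists and the adjacent-word matcher below are toy stand-ins (a crude proxy for an adjective-noun Part-Of-Speech pattern), not the study's actual lexicons or tagger:

```python
# Toy sketch of lexicon-based feature/opinion extraction for hotel
# reviews. POSITIVE/NEGATIVE/FEATURES are invented illustrative word
# sets, not the study's multi-lexicon resources.

POSITIVE = {"clean", "friendly", "comfortable", "great"}
NEGATIVE = {"dirty", "noisy", "rude", "terrible"}
FEATURES = {"room", "staff", "bed", "location"}

def extract_feature_opinions(review):
    """Scan adjacent word pairs for (opinion word, feature word)."""
    tokens = review.lower().replace(".", " ").split()
    results = {}
    for prev, curr in zip(tokens, tokens[1:]):
        if curr in FEATURES and prev in POSITIVE | NEGATIVE:
            results[curr] = results.get(curr, 0) + (1 if prev in POSITIVE else -1)
    return results

def summarize(reviews):
    """Aggregate polarity per feature across all reviews."""
    summary = {}
    for review in reviews:
        for feature, score in extract_feature_opinions(review).items():
            summary[feature] = summary.get(feature, 0) + score
    return summary
```

A real system would use a POS tagger and established sentiment lexicons rather than hand-written word sets.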
Procedia PDF Downloads 331
10427 Flood-Prone Urban Area Mapping Using Machine Learning, a Case Study of M'sila City (Algeria)
Authors: Medjadj Tarek, Ghribi Hayet
Abstract:
This study aims to develop a flood sensitivity assessment tool using machine learning (ML) techniques and a geographic information system (GIS). The importance of this study lies in integrating GIS and ML techniques for mapping flood risks, which helps decision-makers identify the most vulnerable areas and take the necessary precautions against this type of natural disaster. To reach this goal, we study the case of the city of M'sila, which is among the areas most vulnerable to floods. This study drew a map of flood-prone areas based on a methodology in which we compared three machine learning algorithms: XGBoost, Random Forest, and K-Nearest Neighbours, which achieved accuracies of 97.92%, 95%, and 93.75%, respectively. In the process of mapping flood-prone areas, the model with the greatest accuracy (XGBoost) was relied upon. Keywords: geographic information systems (GIS), machine learning (ML), emergency mapping, flood disaster management
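For illustration, the simplest of the three compared algorithms, K-Nearest Neighbours, can be sketched from scratch. The training samples below, with features (elevation in m, distance to river in m) and labels 1 = flood-prone / 0 = safe, are invented toy data, not the study's GIS-derived dataset:

```python
import math

# From-scratch K-Nearest Neighbours sketch on invented toy data.
TRAIN = [((5.0, 50.0), 1), ((8.0, 120.0), 1), ((6.0, 80.0), 1),
         ((40.0, 900.0), 0), ((35.0, 700.0), 0), ((50.0, 1200.0), 0)]

def knn_predict(x, k=3):
    """Vote among the k nearest training samples (Euclidean distance)."""
    nearest = sorted(TRAIN, key=lambda sample: math.dist(x, sample[0]))[:k]
    votes = [label for _, label in nearest]
    return max(set(votes), key=votes.count)

def accuracy(test_set, k=3):
    return sum(knn_predict(x, k) == y for x, y in test_set) / len(test_set)
```

The study's reported accuracies come from library implementations trained on real flood inventories; this sketch only shows the mechanics of the lowest-scoring of the three algorithms.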
Procedia PDF Downloads 95
10426 MASCOT: Design and Development of an Interactive Self-Evaluation Tool for Students’ Thinking Complexity
Abstract:
‘In Dialogue with Humanity’ and ‘In Dialogue with Nature’ are two compulsory General Education Foundation (GEF) courses for all undergraduates at the Chinese University of Hong Kong (CUHK). These courses aim to enrich students’ intellectual pursuits and enhance their thinking capabilities through classic readings. To better understand and evaluate students’ thinking habits and abilities, GEF introduced Narrative Qualitative Analysis (NQA) in 2014 and has continued the study since then. Through the NQA study, a two-way evaluation scheme has been developed, including both student self-evaluation and teacher evaluation. This study will first introduce the theoretical background and research framework of the NQA study and then focus on student self-evaluation. An interactive online application, MASCOT, has been developed to facilitate students’ self-evaluation of their own thinking complexity. In this presentation, the design and development of MASCOT will be explained, and the main results will be reported when applying it in classroom teaching. An obvious discrepancy has been observed between students’ self-evaluations and teachers’ evaluations. Keywords: narrative qualitative analysis, thinking complexity, student self-evaluation, interactive online application
Procedia PDF Downloads 49
10425 Building Information Modeling and Its Application in the State of Kuwait
Authors: Michael Gerges, Ograbe Ahiakwo, Martin Jaeger, Ahmad Asaad
Abstract:
Recent advances in Building Information Modeling (BIM), especially in the Middle East, have increased remarkably. Dubai has been taking a lead on this by making BIM adoption mandatory for all projects that involve complex architectural designs. This is because BIM is a dynamic process that assists all stakeholders in monitoring the project status throughout different project phases with great transparency. It focuses on utilizing information technology to improve collaboration among project participants during the entire life cycle of the project, from the initial design to the supply chain, resource allocation, construction and all productivity requirements. In view of this trend, the paper examines the extent of BIM application in the State of Kuwait by exploring practitioners’ perspectives on BIM, especially their perspectives on the main barriers and main advantages. To this end, structured interviews were carried out based on questionnaires and with a range of different construction professionals. The results revealed that practitioners perceive improved communication and mitigated project risks through encouraged collaboration between project participants. However, it was also observed that the full implementation of BIM in the State of Kuwait requires concerted efforts to encourage clients to demand BIM, counteract resistance to change among construction professionals and offer more training for design team members. This paper forms part of an on-going research effort on BIM and its application in the State of Kuwait, and it is on this basis that further research on the topic is proposed. Keywords: building information modeling, BIM, construction industry, Kuwait
Procedia PDF Downloads 378
10424 Spectroscopic Investigations of Nd³⁺ Doped Lithium Lead Alumino Borate Glasses for 1.06 μm Laser Applications
Authors: Nisha Deopa, A. S. Rao
Abstract:
Neodymium-doped lithium lead alumino borate glasses were synthesized with the molar composition 10Li₂O – 10PbO – (10-x)Al₂O₃ – 70B₂O₃ – xNd₂O₃ (where x = 0.1, 0.5, 1.0, 1.5, 2.0 and 2.5 mol%) via the conventional melt quenching technique to understand their lasing potential. From the absorption spectra, Judd-Ofelt intensity parameters along with various spectroscopic parameters have been estimated. The emission spectra recorded for the as-prepared glasses exhibit two emission transitions, ⁴F₃/₂→⁴I₁₁/₂ (1063 nm) and ⁴F₃/₂→⁴I₉/₂ (1350 nm), for which radiative parameters have been evaluated. The emission intensity increases with Nd³⁺ ion concentration up to 1 mol%, beyond which concentration quenching takes place. The decay profiles show a single-exponential nature for lower Nd³⁺ ion concentrations and a non-exponential nature for higher concentrations. To elucidate the nature of the energy transfer process, the non-exponential decay curves were well fitted to the Inokuti-Hirayama model. The relatively high values of emission cross-section, branching ratio, lifetimes and quantum efficiency suggest that 1.0 mol% of Nd³⁺ in LiPbAlB glasses is aptly suitable to generate lasing action in the NIR region at 1063 nm. Keywords: energy transfer, glasses, J-O parameters, photoluminescence
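The two decay models this abstract refers to can be written out explicitly. This is a hedged sketch: the parameter values used below are illustrative, not the fitted values from the paper:

```python
import math

# Luminescence decay models: single exponential (dilute concentrations)
# and Inokuti-Hirayama (donor-acceptor energy transfer at higher
# concentrations). Parameter values are illustrative only.

def single_exponential(t, i0, tau0):
    """Decay with no energy transfer: I(t) = I0 exp(-t/tau0)."""
    return i0 * math.exp(-t / tau0)

def inokuti_hirayama(t, i0, tau0, q, s=6):
    """Non-exponential decay with energy transfer. s = 6, 8, 10 for
    dipole-dipole, dipole-quadrupole and quadrupole-quadrupole
    interaction; q measures the transfer strength, and q = 0 recovers
    the single exponential."""
    return i0 * math.exp(-t / tau0 - q * (t / tau0) ** (3.0 / s))
```

Fitting the measured non-exponential curves to the second function is what yields the energy-transfer parameters discussed in the abstract.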
Procedia PDF Downloads 192
10423 Flood Disaster Prevention and Mitigation in Nigeria Using Geographic Information System
Authors: Dinebari Akpee, Friday Aabe Gaage, Florence Fred Nwaigwu
Abstract:
Natural disasters like floods affect many parts of the world, including developing countries like Nigeria. As a result, many human lives are lost, properties are damaged and much money is lost through infrastructure damage. These hazards and losses can be mitigated and reduced by providing reliable spatial information about flood risks to the general public through flood inundation maps. Flood inundation maps are crucial for emergency action plans, urban planning, ecological studies and insurance rates. Nigeria experienced its worst flood in its entire history this year. Many cities were submerged and completely under water due to torrential rainfall. Poor city planning and a lack of effective development control, among other factors, contribute to the problem too. A geographic information system (GIS) can be used to visualize the extent of flooding and to analyze flood maps to produce flood damage estimation maps and flood risk maps. In this research, the following steps were taken in the preparation of flood risk maps for the study area: (1) digitization of topographic data and preparation of a digital elevation model using ArcGIS, (2) flood simulation using a hydraulic model, and (3) integration of the first two steps to produce flood risk maps. The results show that GIS can play a crucial role in flood disaster control and mitigation. Keywords: flood disaster, risk maps, geographic information system, hazards
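A minimal sketch of the classification step at the heart of such risk mapping: comparing each digital elevation model (DEM) cell against a simulated flood water level. The grid, the water level and the class bands below are invented for illustration; the actual workflow uses ArcGIS layers and hydraulic-model output:

```python
# Toy DEM (elevations in metres) and a threshold-based risk classifier.
DEM = [[12.0, 10.5, 9.8],
       [9.5, 8.2, 7.9],
       [7.0, 6.4, 6.1]]

def flood_risk_map(dem, water_level):
    """'high' where a cell would be submerged, 'moderate' within 1 m
    above the water level, otherwise 'low'."""
    def classify(z):
        if z <= water_level:
            return "high"
        if z <= water_level + 1.0:
            return "moderate"
        return "low"
    return [[classify(z) for z in row] for row in dem]
```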
Procedia PDF Downloads 229
10422 An Exploratory Study to Appraise the Current Challenges and Limitations Faced in Applying and Integrating the Historic Building Information Modelling Concept for the Management of Historic Buildings
Authors: Oluwatosin Adewale
Abstract:
The sustainability of built heritage has become a relevant issue in recent years due to the social and economic values associated with these buildings. Heritage buildings provide a means for the human perception of culture and represent a legacy of long-existing history; they define the local character of the social world and provide a vital connection to the past, with their associated aesthetic and communal benefits. The identified values of heritage buildings have increased the importance of conservation and the lifecycle management of these buildings. Recent developments in digital design technology in engineering and the built environment have led to the adoption of Building Information Modelling (BIM) by the Architecture, Engineering, Construction, and Operations (AECO) industry. BIM provides a platform for the lifecycle management of a construction project through effective collaboration among stakeholders and the analysis of a digital information model. This growth in digital design technology has also made its way into the field of architectural heritage management in the form of Historic Building Information Modelling (HBIM), a reverse-engineering process for the digital documentation of heritage assets that draws upon similar information management processes as BIM. However, despite the several scientific and technical contributions made to the development of the HBIM process, it remains difficult to integrate at the most practical level of heritage asset management. The main objective identified under the scope of the study is to review the limitations and challenges faced by heritage management professionals in adopting an HBIM-based asset management procedure for historic building projects. This paper uses an exploratory study in the form of semi-structured interviews to investigate the research problem.
A purposive sample of heritage industry experts and professionals was selected to take part in a semi-structured interview to appraise some of the limitations and challenges they have faced with the integration of HBIM into their project workflows. The findings from this study will present the challenges and limitations faced in applying and integrating the HBIM concept for the management of historic buildings. Keywords: building information modelling, built heritage, heritage asset management, historic building information modelling, lifecycle management
Procedia PDF Downloads 103
10421 Integrated Braking and Traction Torque Vectoring Control Based on Vehicle Yaw Rate for Stability Improvement of All-Wheel-Drive Electric Vehicles
Authors: Mahmoud Said Jneid, Péter Harth
Abstract:
EVs with independent wheel drives greatly improve vehicle stability in poor road conditions. Wheel torques can be precisely controlled through electric motors driven using advanced technologies. As a result, various types of advanced chassis assistance systems (ACAS) can be implemented. This paper proposes an integrated torque vectoring control based on wheel slip regulation in both braking and traction modes. For generating the corrective yaw moment, the vehicle yaw rate and sideslip angle are monitored. The corrective yaw moment is distributed into traction and braking torques based on an equal-opposite components approach. The proposed torque vectoring control scheme is validated in simulation, and the results show its superiority when compared to conventional schemes. Keywords: all-wheel-drive, electric vehicle, torque vectoring, regenerative braking, stability control, traction control, yaw rate control
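A sketch of the equal-opposite distribution idea: a corrective yaw moment, produced here by a simple proportional law on the yaw-rate error (a stand-in for the paper's slip-regulating controller), is turned into opposite torque offsets on the left and right wheels. Track width, wheel radius and the gain are assumed values:

```python
# Equal-opposite torque split from a corrective yaw moment (sketch).
TRACK_WIDTH = 1.6    # m, assumed
WHEEL_RADIUS = 0.3   # m, assumed
K_YAW = 800.0        # N*m per rad/s of yaw-rate error, assumed

def corrective_yaw_moment(yaw_rate_ref, yaw_rate_meas):
    return K_YAW * (yaw_rate_ref - yaw_rate_meas)

def distribute_torque(base_torque, yaw_moment):
    """Equal-opposite split: a positive (left-turn) moment adds traction
    torque on the right side and braking torque on the left side."""
    side_force = yaw_moment / TRACK_WIDTH     # N per side
    delta = side_force * WHEEL_RADIUS         # N*m per side
    return base_torque - delta, base_torque + delta   # (left, right)
```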
Procedia PDF Downloads 83
10420 Investigation of the Effect of Inlet Velocity and Carrying Fluid on the Flow inside the Coronary Artery
Authors: Mohammadreza Nezamirad, Nasim Sabetpour, Azadeh Yazdi, Amirmasoud Hamedi
Abstract:
In this study, OpenFOAM 4.4.2 was used to investigate the flow inside the coronary artery of the heart. This is the first step of our future project, which is to include the conjugate heat transfer of the heart with the three main coronary arteries. Three different velocities were used as inlet boundary conditions to see the effect of a velocity increase on the velocity, pressure, and wall shear of the coronary artery. Also, three different fluids, namely the University of Wisconsin solution, gelatin, and blood, were used to investigate the effect of different fluids on the flow inside the coronary artery. A code based on the Reynolds-Averaged Navier-Stokes (RANS) equations was written and implemented with realistic boundary conditions calculated from MRI images. In order to improve the accuracy of the current numerical scheme, a hex-dominant mesh is utilized. When the inlet velocity increases to 0.5 m/s, the velocity, wall shear stress, and pressure increase at the narrower parts. Keywords: CFD, simulation, OpenFOAM, heart
Procedia PDF Downloads 152
10419 The Design Inspired by Phra Maha Chedi of King Rama I-IV at Wat Phra Chetuphon Vimolmangklaram Rajwaramahaviharn
Authors: Taechit Cheuypoung
Abstract:
The research focuses on creating pattern designs inspired by the pagodas, Phra Maha Chedi of King Rama I-IV, located in the temple Wat Phra Chetuphon Vimolmangklararm Rajwaramahaviharn. Different aspects of the temple were studied, including the history, architecture and significance of the temple, and the techniques used to decorate the pagodas. Moreover, the composition of the arts and the form of the pattern designs all led to the outcome of four applied Thai patterns. The four patterns combine traditional Thai design with an international scheme while maintaining the distinctiveness of the glazed mosaic tiles of each Phra Maha Chedi. The patterns consist of rounded and notched petal flowers, leaves and vines, various square shapes, and the original colors, which are updated for modernity. These elements are then grouped and combined with new techniques, resulting in pattern designs with modern aspects that simultaneously reflect the charm and aesthetic of Thai craftsmanship, which are eternally embedded in the designs. Keywords: Chedi, Pagoda, pattern, Wat
Procedia PDF Downloads 389
10418 Extraction of Urban Building Damage Using Spectral, Height and Corner Information
Authors: X. Wang
Abstract:
Timely and accurate information on urban building damage caused by earthquakes is an important basis for disaster assessment and emergency relief. Very high resolution (VHR) remotely sensed imagery containing abundant fine-scale information offers a large quantity of data for detecting and assessing urban building damage in the aftermath of earthquake disasters. However, the accuracy obtained using spectral features alone is comparatively low, since building damage, intact buildings and pavements are spectrally similar. Therefore, it is of great significance to detect urban building damage effectively using multi-source data. Considering that the height or geometric structure of buildings generally changes dramatically in devastated areas, a novel multi-stage urban building damage detection method, using bi-temporal spectral, height and corner information, was proposed in this study. The pre-event height information was generated using stereo VHR images acquired from two different satellites, while the post-event height information was produced from airborne LiDAR data. The corner information was extracted from pre- and post-event panchromatic images. The proposed method can be summarized as follows. To reduce the classification errors caused by spectral similarity and errors in extracting height information, the ground surface, shadows, and vegetation were first extracted using the post-event VHR image and height data and were masked out. Two different types of building damage were then extracted from the remaining areas: the height difference between pre- and post-event data was used for detecting building damage showing significant height change, while the difference in the density of corners between pre- and post-event images was used for extracting building damage showing drastic change in geometric structure. The initial building damage result was generated by combining the above two building damage results.
Finally, a post-processing procedure was adopted to refine the obtained initial result. The proposed method was quantitatively evaluated and compared to two existing methods in Port-au-Prince, Haiti, which was heavily hit by an earthquake in January 2010, using a pre-event GeoEye-1 image, a pre-event WorldView-2 image, a post-event QuickBird image and post-event LiDAR data. The results showed that the method proposed in this study significantly outperformed the two comparative methods in terms of urban building damage extraction accuracy. The proposed method provides a fast and reliable way to detect urban building collapse, which is also applicable to relevant applications. Keywords: building damage, corner, earthquake, height, very high resolution (VHR)
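A minimal sketch of the two damage cues the method combines: a drop in building height between pre- and post-event surface models, and a change in corner density. The grids, counts and thresholds below are invented for illustration:

```python
# Toy versions of the height-difference and corner-density damage cues.

def damaged_cells(pre_height, post_height, drop_threshold=3.0):
    """Flag cells whose height dropped by more than the threshold (m)."""
    return [[(pre - post) > drop_threshold
             for pre, post in zip(pre_row, post_row)]
            for pre_row, post_row in zip(pre_height, post_height)]

def corner_density_change(pre_corners, post_corners, area):
    """Relative change in corner count per unit area; collapse tends to
    change the geometric structure, and hence this density, drastically."""
    return (post_corners - pre_corners) / area
```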
Procedia PDF Downloads 214
10417 Comparison of Various Control Methods for an Industrial Multiproduct Fractionator
Authors: Merve Aygün Esastürk, Deren Ataç Yılmaz, Görkem Oğur, Emre Özgen Kuzu, Sadık Ödemiş
Abstract:
Hydrocracker plants are among the most complicated and most profitable units in the refinery process. They take long-chain paraffinic hydrocarbons as feed and turn them into smaller and more valuable products, mainly kerosene and diesel, under high pressure with an excess amount of hydrogen. Controlling the product qualities well contributes directly to the unit profit. Control of a plant is mainly based on PID and MPC controllers. Controlling the reaction section is important in terms of reaction severity. However, controlling the fractionation section is more crucial since the end products are separated there. In this paper, the importance of a well-configured base-layer control mechanism, composed of PID controllers, is highlighted. For this purpose, two different base-layer control schemes are applied to a hydrocracker fractionator column, and the performances of the schemes, which contribute directly to better product quality, are compared. Keywords: controller, distillation, configuration selection, hydrocracker, model predictive controller, proportional-integral-derivative controller
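A minimal discrete PID controller of the kind that makes up such a base control layer. The gains and timestep below are illustrative, not tuned for any real fractionator loop:

```python
# Minimal discrete PID controller (sketch).
class PID:
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = None

    def update(self, setpoint, measurement):
        error = setpoint - measurement
        self.integral += error * self.dt
        if self.prev_error is None:
            derivative = 0.0
        else:
            derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative
```

In a base-layer configuration, the output of such a loop would drive a valve or serve as the setpoint of an inner cascade loop, with MPC sitting above it.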
Procedia PDF Downloads 440
10416 Financial Management Skills of Supreme Student Government Officers in the Schools Division of Quezon: Basis for Project Financial Literacy Information Program
Authors: Edmond Jaro Malihan
Abstract:
This study aimed to develop and propose a Project Financial Literacy Information Program (FLIP) for the Schools Division of Quezon (SDO-Quezon) to improve the financial management skills of Supreme Student Government (SSG) officers across different school sizes. It employed a descriptive research design covering 424 SSG officers selected from the SDO-Quezon using purposive sampling procedures. A consultation was held with DepEd officials, budget officers, and financial advisors to validate the design of the self-made questionnaires, in which the computed mean was verbally interpreted using a four-point Likert scale. The data gathered were presented and analyzed using the weighted arithmetic mean and the ANOVA test. Based on the findings, SSG officers in the SDO-Quezon generally possess high financial management skills in terms of budget preparation, resource mobilization, and auditing and evaluation. School size shows no significant difference and does not contribute to the financial management skills of SSG officers, which they apply in implementing their mandated programs, projects, and activities (PPAs). The Project Financial Literacy Information Program (FLIP) was developed considering their general level of financial management skills and the PPAs launched by the organization. The project covered the suggested training program vital to conducting the Virtual Division Training on Financial Management Skills of the SSG officers. Keywords: financial management skills, SSG officers, school size, financial literacy information program
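A sketch of the analysis step of reducing four-point Likert tallies to a weighted arithmetic mean with a verbal interpretation. The cut-off bands below are assumed, since the abstract does not state them:

```python
# Weighted mean of Likert tallies and an assumed verbal interpretation.

def weighted_mean(counts):
    """counts[k] = number of respondents choosing scale point k + 1."""
    total = sum(counts)
    return sum((k + 1) * c for k, c in enumerate(counts)) / total

def interpret(mean):
    if mean >= 3.25:
        return "very high"
    if mean >= 2.50:
        return "high"
    if mean >= 1.75:
        return "low"
    return "very low"
```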
Procedia PDF Downloads 75
10415 Impacts of Applying Automated Vehicle Location Systems to Public Bus Transport Management
Authors: Vani Chintapally
Abstract:
The proliferation of inexpensive and miniaturized Global Positioning System (GPS) receivers has led most Automatic Vehicle Location (AVL) systems today to rely solely on satellite-based positioning, as GPS is the most mature implementation of these. This paper presents the characteristics of a proposed system for tracking and analyzing public transport in a typical medium-sized city and contrasts the qualities of such a system with those of general-purpose AVL systems. Particular properties of the routes analyzed by the AVL system used for the study of public transport include cyclic vehicle routes, the need for specific performance reports, and so forth. This paper specifically deals with vehicle movement predictions and the estimation of station arrival times, combined with automatically generated reports on timetable conformance and other performance measures. Another side of the observed problem is the efficient transfer of data from the vehicles to the control center. The prevalence of GSM packet data transfer technologies, combined with reduced data transfer costs, has caused today's AVL systems to rely predominantly on packet data transfer services from mobile operators as the communications channel between vehicles and the control center. This approach raises many security issues in this potentially sensitive application field. Keywords: automatic vehicle location (AVL), estimation of arrival times, AVL security, information services, intelligent transport systems (ITS), map matching
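A sketch of the station arrival-time estimate an AVL back end might compute from a GPS fix: great-circle distance to the stop divided by the recent average speed. The coordinates and speed are invented, and a real system would follow the route geometry rather than a straight line:

```python
import math

# Haversine distance and a naive ETA from a GPS fix (sketch).
EARTH_RADIUS_M = 6371000.0

def haversine(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two WGS84 points."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

def eta_seconds(bus, stop, avg_speed_mps):
    """Time to the stop at the recent average speed (straight line)."""
    return haversine(*bus, *stop) / avg_speed_mps
```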
Procedia PDF Downloads 384
10414 Deep Learning Based on Image Decomposition for Restoration of Intrinsic Representation
Authors: Hyohun Kim, Dongwha Shin, Yeonseok Kim, Ji-Su Ahn, Kensuke Nakamura, Dongeun Choi, Byung-Woo Hong
Abstract:
Artefacts are commonly encountered in the imaging process of clinical computed tomography (CT), where an artefact refers to any systematic discrepancy between the reconstructed observation and the true attenuation coefficient of the object. It is known that CT images are inherently prone to artefacts due to the image formation process, in which a large number of independent detectors are involved and assumed to yield consistent measurements. There are a number of different artefact types, including noise, beam hardening, scatter, pseudo-enhancement, motion, helical, ring, and metal artefacts, which cause serious difficulties in reading images. Thus, it is desirable to remove nuisance factors from the degraded image, leaving the fundamental intrinsic information that can provide a better interpretation of the anatomical and pathological characteristics. However, this is considered a difficult task due to the high dimensionality and variability of the data to be recovered, which naturally motivates the use of machine learning techniques. We propose an image restoration algorithm based on a deep neural network framework in which denoising auto-encoders are stacked to build multiple layers. The denoising auto-encoder is a variant of the classical auto-encoder that takes input data and maps it to a hidden representation through a deterministic mapping using a non-linear activation function. The latent representation is then mapped back into a reconstruction of the same size as the input data. The reconstruction error can be measured by the traditional squared error, assuming the residual follows a normal distribution. In addition to the designed loss function, an effective regularization scheme using residual-driven dropout, determined based on the gradient at each layer, is employed. The optimal weights are computed by the classical stochastic gradient descent algorithm combined with the back-propagation algorithm.
In our algorithm, we initially decompose an input image into its intrinsic representation and the nuisance factors, including artefacts, based on the classical Total Variation problem, which can be efficiently optimized by a convex optimization algorithm such as the primal-dual method. The intrinsic forms of the input images are provided to the deep denoising auto-encoders along with their original forms in the training phase. In the testing phase, a given image is first decomposed into its intrinsic form and then provided to the trained network to obtain its reconstruction. We apply our algorithm to the restoration of CT images corrupted by artefacts. It is shown that our algorithm improves the readability and enhances the anatomical and pathological properties of the object. The quantitative evaluation is performed in terms of the PSNR, and the qualitative evaluation shows significant improvement in reading images despite degrading artefacts. The experimental results indicate the potential of our algorithm as a prior solution to image interpretation tasks in a variety of medical imaging applications. This work was supported by the MISP (Ministry of Science and ICT), Korea, under the National Program for Excellence in SW (20170001000011001) supervised by the IITP (Institute for Information and Communications Technology Promotion). Keywords: auto-encoder neural network, CT image artefact, deep learning, intrinsic image representation, noise reduction, total variation
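A toy 1-D version of the Total Variation (TV) decomposition used to split an input into its intrinsic (piecewise-smooth) part and a nuisance residual. It minimises a smoothed TV energy by plain gradient descent rather than the primal-dual method cited above, and all parameter values are illustrative:

```python
import math

# Smoothed 1-D Rudin-Osher-Fatemi-style denoising by gradient descent.
def tv_denoise(signal, lam=2.0, step=0.01, iters=500, eps=1e-2):
    u = list(signal)
    for _ in range(iters):
        grad = [u[i] - signal[i] for i in range(len(u))]  # data fidelity
        for i in range(len(u) - 1):
            d = u[i + 1] - u[i]
            g = lam * d / math.sqrt(d * d + eps)  # gradient of smoothed |d|
            grad[i] -= g
            grad[i + 1] += g
        u = [ui - step * gi for ui, gi in zip(u, grad)]
    return u
```

Flat regions are flattened further while large jumps (edges) survive, which is the property that separates intrinsic structure from noise-like residuals.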
Procedia PDF Downloads 190
10413 Determination of Tide Height Using Global Navigation Satellite Systems (GNSS)
Authors: Faisal Alsaaq
Abstract:
Hydrographic surveys have traditionally relied on the availability of tide information for the reduction of sounding observations to a common datum. In most cases, tide information is obtained from tide gauge observations and/or tide predictions over space and time using local, regional or global tide models. While the latter often provide a rather crude approximation, the former rely on tide gauge stations that are spatially restricted and often sparsely distributed. A more recent method that is increasingly being used is Global Navigation Satellite System (GNSS) positioning, which can be utilised to monitor the height variations of a vessel or buoy, thus providing information on sea level variations during a hydrographic survey. However, GNSS heights obtained in the dynamic environment of a survey vessel are affected by “non-tidal” processes such as wave activity and the attitude of the vessel (roll, pitch, heave and dynamic draft). This research seeks to examine techniques that separate the tide signal from the other non-tidal signals that may be contained in GNSS heights. This requires an investigation of the processes involved and their temporal, spectral and stochastic properties in order to apply suitable techniques for recovering the tide information. In addition, different post-mission and near-real-time GNSS positioning techniques will be investigated, with a focus on height estimation at sea. Furthermore, the study will investigate the possibility of transferring the chart datums at the locations of tide gauges. Keywords: hydrography, GNSS, datum, tide gauge
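One simple way to separate a slow tide signal from faster wave and heave motion in a series of GNSS heights is a moving-average low-pass filter, sketched below. The window length is an assumed value; the research above considers more rigorous spectral and stochastic techniques:

```python
# Centred moving-average low-pass filter for GNSS height series (sketch).
def moving_average(heights, half_window=5):
    """Average over a centred window; the window shrinks near the ends."""
    out = []
    for i in range(len(heights)):
        lo = max(0, i - half_window)
        hi = min(len(heights), i + half_window + 1)
        out.append(sum(heights[lo:hi]) / (hi - lo))
    return out
```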
Procedia PDF Downloads 266
10412 Leveraging Information for Building Supply Chain Competitiveness
Authors: Deepika Joshi
Abstract:
Operations in the automotive industry rely greatly on information shared between supply chain (SC) partners. This leads to efficient and effective management of SC activity. The automotive sector in India is growing at 14.2 percent per annum and has huge economic importance. We find that no study has been carried out on the role of information sharing in the SC management of Indian automotive manufacturers. Considering this research gap, the present study is planned to establish the significance of information sharing in Indian auto-component supply chain activity. Empirical research was conducted on large-scale auto-component manufacturers from India. Twenty-four supply chain performance indicators (SCPIs) were collected from the existing literature. These elements belong to eight diverse but internally related areas of SC management, viz., demand management, cost, technology, delivery, quality, flexibility, buyer-supplier relationship, and operational factors. A pair-wise comparison and an open-ended questionnaire were designed using these twenty-four SCPIs. The questionnaire was then administered among managerial-level employees of twenty-five auto-component manufacturing firms. The Analytic Network Process (ANP) technique was used to analyze the responses to the pair-wise questionnaire. Finally, twenty-five priority indexes were developed, one for each respondent. These were averaged to generate an industry-specific priority index. The open-ended questions captured strategies related to information sharing between buyers and suppliers and their influence on supply chain performance. Results show that the impact of information sharing on certain performance indicators is relatively greater than on their corresponding variables. For example, flexibility, delivery, demand and cost related elements have a massive impact on information sharing. Technology is relatively less influenced by information sharing, but it immensely influences the quality of the information shared.
Responses obtained from managers reveal that timely and accurate information sharing lowers cost and increases flexibility and on-time delivery of auto parts, thereby enhancing the competitiveness of the Indian automotive industry. Any flaw in the dissemination of information can disturb the cycle time of both parties and thus increase the opportunity cost. Due to the supplier's involvement in decisions related to the design of auto parts, quality conformance is found to improve, leading to a reduction in the rejection rate. Similarly, a mutual commitment to share the right information at the right time between all levels of the SC enhances trust. SC partners share information to perform comprehensive quality planning and ingrain total quality management. This study contributes to the operations management literature, which faces a scarcity of empirical examination of this subject. It views information sharing as a building block that firms can promote and evolve to leverage the operational capability of all SC members. It will provide insights for Indian managers and researchers, as every market is unique and suppliers and buyers are driven by local laws, industry status, and future vision. While the major emphasis in this paper is on SC operations between domestic partners, placing more focus on international SCs could bring distinct results.
Keywords: Indian auto component industry, information sharing, operations management, supply chain performance indicators
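The ANP priority indexes described above are derived from pairwise comparison matrices. A minimal sketch of that eigenvector step (not the authors' implementation; the 3x3 matrix and the three SCPI areas it compares are invented for illustration) is:

```python
# Illustrative sketch: deriving a priority vector from a pairwise comparison
# matrix via power iteration, the eigenvector computation underlying ANP/AHP.
# The matrix entries below (comparing three hypothetical SCPI areas:
# delivery, flexibility, cost) are made up, not from the study.
def priority_vector(matrix, iterations=100):
    n = len(matrix)
    w = [1.0 / n] * n  # start from a uniform vector
    for _ in range(iterations):
        w = [sum(matrix[i][j] * w[j] for j in range(n)) for i in range(n)]
        total = sum(w)
        w = [x / total for x in w]  # renormalise at each step
    return w

comparisons = [
    [1.0,  2.0, 4.0],   # delivery vs. (delivery, flexibility, cost)
    [0.5,  1.0, 2.0],   # flexibility
    [0.25, 0.5, 1.0],   # cost
]
weights = priority_vector(comparisons)
```

Averaging such per-respondent weight vectors is one way to obtain the industry-specific priority index the abstract mentions.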
Procedia PDF Downloads 550
10411 Optimal Design of Concrete Shells by Modified Particle Community Algorithm Using Spinless Curves
Authors: Reza Abbasi, Ahmad Hamidi Benam
Abstract:
Shell structures have many geometrical variables, and modifying some of these parameters can improve the mechanical behavior of the shell. Moreover, the behavior of such structures depends on their geometry rather than on their mass. Optimization techniques are useful for finding the geometrical shape of shell structures that improves mechanical behavior, especially to prevent or reduce bending moments. The overall objective of this research is to optimize the shape of concrete shells using the thickness and height parameters along the reference curve and the overall shape of this curve. To implement the proposed scheme, the geometry of the structure was formulated using nonlinear curves. Shell optimization was performed under equivalent static loading conditions using the modified particle community algorithm. The results of this optimization show that, without disrupting the initial design and with only slight changes in the shell geometry, the structural behavior is significantly improved.
Keywords: concrete shells, shape optimization, spinless curves, modified particle community algorithm
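Assuming the "modified particle community algorithm" is a particle-swarm-style method (the abstract does not spell out the modification), a textbook particle swarm sketch looks like the following; the objective function here is only a stand-in for the shell's structural analysis:

```python
# Minimal particle swarm optimization sketch (the plain textbook form, not
# the authors' modified variant). Each particle is a candidate vector of
# shape parameters; objective() is a placeholder for the structural response
# that the real optimization would evaluate.
import random

random.seed(0)

def objective(x):  # stand-in for the shell analysis; minimum at (3, 3)
    return sum((xi - 3.0) ** 2 for xi in x)

def pso(dim=2, particles=20, iters=100, w=0.7, c1=1.5, c2=1.5):
    pos = [[random.uniform(-10, 10) for _ in range(dim)] for _ in range(particles)]
    vel = [[0.0] * dim for _ in range(particles)]
    pbest = [p[:] for p in pos]                 # per-particle best positions
    pbest_val = [objective(p) for p in pos]
    g = min(range(particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]  # swarm-wide best
    for _ in range(iters):
        for i in range(particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

best, best_val = pso()
```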
Procedia PDF Downloads 234
10410 Pareto Optimal Material Allocation Mechanism
Authors: Peter Egri, Tamas Kis
Abstract:
Scheduling problems have been studied by algorithmic mechanism design research from the beginning. This paper focuses on a practically important but theoretically rather neglected field: the project scheduling problem where jobs connected by precedence constraints compete for various nonrenewable resources, such as materials. Although the centralized problem can be solved in polynomial time by applying the algorithm of Carlier and Rinnooy Kan from the Eighties, obtaining materials in a decentralized environment is usually far from optimal. It can be observed in practical production scheduling situations that project managers tend to hoard the required materials as soon as possible in order to avoid later delays due to material shortages. This greedy practice usually leads both to excess stocks for some projects and materials and, simultaneously, to shortages for others. The aim of this study is to develop a model for the material allocation problem of a production plant, where a central decision maker (the inventory) should assign the resources arriving at different points in time to the jobs. Since the actual due dates are not known by the inventory, the mechanism design approach is applied, with the projects as the self-interested agents. The goal of the mechanism is to elicit the required information and allocate the available materials so as to minimize the maximal tardiness among the projects. It is assumed that, except for the due dates, the inventory is familiar with all other parameters of the problem. A further requirement is that, due to practical considerations, monetary transfer is not allowed. Therefore, a mechanism without money is sought, which excludes some widely applied solutions such as the Vickrey–Clarke–Groves scheme. In this work, a type of Serial Dictatorship Mechanism (SDM) is presented for the studied problem, including a polynomial-time algorithm for computing the material allocation. 
The resulting mechanism is both truthful and Pareto optimal. Thus, randomization over the possible priority orderings of the projects yields a universally truthful and Pareto optimal randomized mechanism. However, it is shown that, in contrast to problems like the many-to-many matching market, not every Pareto optimal solution can be generated with an SDM. In addition, no performance guarantee can be given compared to the optimal solution; therefore, this approximation characteristic is investigated in an experimental study. All in all, the current work studies a practically relevant scheduling problem and presents a novel truthful material allocation mechanism that eliminates the potential benefit of the greedy behavior that negatively influences the outcome. The resulting allocation is also shown to be Pareto optimal, which is the most widely used criterion describing a necessary condition for a reasonable solution.
Keywords: material allocation, mechanism without money, polynomial-time mechanism, project scheduling
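The core idea of a serial dictatorship can be sketched in a few lines: projects are served in a fixed priority order, and each in turn claims the earliest-arriving material units it needs. This is only the skeleton of the mechanism family, with made-up arrival times and demands, not the paper's full SDM:

```python
# Sketch of a serial dictatorship allocation: projects, taken in a fixed
# priority order, each claim the earliest-arriving unassigned material
# units. Arrival times and demands below are invented for illustration.
def serial_dictatorship(priority_order, demand, arrivals):
    """priority_order: project ids, highest priority first.
    demand: project id -> number of material units needed.
    arrivals: sorted arrival times, one per material unit."""
    free = list(arrivals)      # units still unassigned, earliest first
    allocation = {}
    for project in priority_order:
        take = demand[project]
        allocation[project] = free[:take]  # claim the earliest units
        free = free[take:]
    return allocation

alloc = serial_dictatorship(
    priority_order=["A", "B", "C"],
    demand={"A": 2, "B": 1, "C": 2},
    arrivals=[1, 2, 4, 7, 9],
)
```

Randomizing the priority order, as the abstract notes, turns this into a universally truthful randomized mechanism.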
Procedia PDF Downloads 333
10409 Numerical Modelling of a Vacuum Consolidation Project in Vietnam
Authors: Nguyen Trong Nghia, Nguyen Huu Uy Vu, Dang Huu Phuoc, Sanjay Kumar Shukla, Le Gia Lam, Nguyen Van Cuong
Abstract:
This paper introduces a matching scheme for the selection of soil/drain properties in the analytical solution and numerical modelling (axisymmetric and plane-strain conditions) of a ground improvement project using Prefabricated Vertical Drains (PVD) in combination with vacuum and surcharge preloading. In-situ monitoring data from a case history of a road construction project in Vietnam were adopted in the back-analysis. The analytical solution and the axisymmetric analysis approximate the field data well, whereas the horizontal permeability needs to be adjusted in the plane-strain scenario to achieve good agreement. In addition, the influence zone of the ground treatment was examined. The residual settlement was investigated to verify the long-term settlement in compliance with the design code. Moreover, the degree of consolidation of non-PVD sub-layers was also studied by means of two different approaches.
Keywords: numerical modelling, prefabricated vertical drains, vacuum consolidation, soft soil
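Analytical solutions for PVD-improved ground are typically of the Hansbo (1981) radial-consolidation family. A sketch of that formula for the ideal-drain case (smear and well resistance neglected; the paper's actual matching scheme is richer, and the parameter values below are illustrative only):

```python
# Hansbo's (1981) average degree of radial consolidation for a unit cell
# around a vertical drain, ideal-drain case: U_h = 1 - exp(-8*T_h/mu) with
# mu = ln(n) - 3/4, n = De/dw, T_h = ch*t/De^2. Inputs below are invented.
import math

def degree_of_consolidation(ch, t, De, dw):
    """ch: horizontal coefficient of consolidation (m^2/yr); t: time (yr);
    De: unit-cell (drain influence) diameter (m); dw: drain diameter (m)."""
    n = De / dw                  # spacing ratio
    mu = math.log(n) - 0.75      # ideal-drain factor (no smear/resistance)
    Th = ch * t / De ** 2        # horizontal time factor
    return 1.0 - math.exp(-8.0 * Th / mu)

U_3months = degree_of_consolidation(ch=2.0, t=0.25, De=1.2, dw=0.05)
U_1year = degree_of_consolidation(ch=2.0, t=1.0, De=1.2, dw=0.05)
```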
Procedia PDF Downloads 230
10408 Video Foreground Detection Based on Adaptive Mixture Gaussian Model for Video Surveillance Systems
Authors: M. A. Alavianmehr, A. Tashk, A. Sodagaran
Abstract:
Modeling the background and moving objects is a significant technique for video surveillance and other video processing applications. This paper presents a foreground detection algorithm, based on an adaptive Gaussian mixture model (GMM), that is robust against illumination changes and noise, and provides a novel and practical choice for intelligent video surveillance systems using static cameras. In previous methods, the image of still objects (the background image) is not given much significance. On the contrary, this method is based on forming a meticulous background image and exploiting it to separate moving objects from their background. The background image is specified either manually, by taking an image without vehicles, or is estimated in real time by forming a mathematical or exponential average of successive images. The proposed scheme offers low image degradation. The simulation results demonstrate a high degree of performance for the proposed method.
Keywords: image processing, background models, video surveillance, foreground detection, Gaussian mixture model
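The exponential averaging mentioned in the abstract (a simpler cousin of the full per-pixel GMM) can be sketched on toy one-dimensional "frames"; this is an illustration of the general technique, not the paper's algorithm:

```python
# Exponential running-average background model: the background estimate is
# updated as B <- (1 - alpha)*B + alpha*frame, and pixels that deviate from
# B by more than a threshold are flagged as foreground. Frames here are toy
# 1-D intensity rows standing in for real video frames.
def update_background(background, frame, alpha=0.1):
    return [(1 - alpha) * b + alpha * f for b, f in zip(background, frame)]

def foreground_mask(background, frame, threshold=30):
    return [abs(f - b) > threshold for b, f in zip(background, frame)]

# 20 empty frames, then an object appears at pixel 1.
frames = [[10, 10, 10, 10]] * 20 + [[10, 200, 10, 10]]
bg = [float(v) for v in frames[0]]
for frame in frames[:-1]:
    bg = update_background(bg, frame)   # background settles on the scene
mask = foreground_mask(bg, frames[-1])  # detect the newly arrived object
```

A GMM generalizes this by keeping several weighted Gaussians per pixel, which handles multimodal backgrounds (e.g. swaying trees) that a single running average cannot.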
Procedia PDF Downloads 516
10407 Improving Fake News Detection Using K-means and Support Vector Machine Approaches
Authors: Kasra Majbouri Yazdi, Adel Majbouri Yazdi, Saeid Khodayi, Jingyu Hou, Wanlei Zhou, Saeed Saedy
Abstract:
Fake news and false information are big challenges for all types of media, especially social media. There is a lot of false information, fake likes, views, and duplicated accounts, as big social networks such as Facebook and Twitter have admitted. Most information appearing on social media is doubtful and in some cases misleading. It needs to be detected as soon as possible to avoid a negative impact on society. The dimensions of fake news datasets are growing rapidly, so to detect false information with less computation time and complexity, the dimensionality needs to be reduced. One of the best techniques for reducing data size is feature selection. The aim of this technique is to choose a feature subset from the original set that improves classification performance. In this paper, a feature selection method is proposed that integrates K-means clustering and Support Vector Machine (SVM) approaches and works in four steps. First, the similarities between all features are calculated. Then, the features are divided into several clusters. Next, the final feature set is selected from all clusters, and finally, fake news is classified based on the final feature subset using the SVM method. The proposed method was evaluated by comparing its performance with other state-of-the-art methods on several benchmark datasets, and the outcome showed better classification of false information for our work. The detection performance was improved in two aspects: on the one hand, the detection runtime decreased, and on the other hand, the classification accuracy increased because of the elimination of redundant features and the reduction of dataset dimensionality.
Keywords: clustering, fake news detection, feature selection, machine learning, social media, support vector machine
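The clustering step of such a pipeline can be sketched with a plain Lloyd's k-means over feature descriptors, keeping one representative feature per cluster. The toy data, the choice of k, and the "closest-to-centroid" selection rule are assumptions for illustration; the paper's similarity computation and the final SVM classification step are omitted:

```python
# Sketch of cluster-based feature selection: each feature is described by a
# small vector of toy statistics, features are grouped with Lloyd's k-means,
# and the feature nearest each centroid is kept as that cluster's
# representative. All data below is invented.
import random

random.seed(1)

def dist2(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b))

def kmeans(points, k, iters=50):
    centroids = [p[:] for p in random.sample(points, k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:  # assign each point to its nearest centroid
            i = min(range(k), key=lambda c: dist2(p, centroids[c]))
            clusters[i].append(p)
        for i, cl in enumerate(clusters):  # recompute centroids
            if cl:
                centroids[i] = [sum(col) / len(cl) for col in zip(*cl)]
    return centroids

# Each row describes one feature by 3 toy statistics.
features = [[0.10, 0.2, 0.10], [0.15, 0.2, 0.10], [0.90, 0.8, 0.90],
            [0.85, 0.9, 0.80], [0.50, 0.4, 0.50], [0.45, 0.5, 0.55]]
k = 3
centroids = kmeans(features, k)
# Keep the feature closest to each centroid as the reduced feature subset.
selected = [min(features, key=lambda f: dist2(f, c)) for c in centroids]
```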
Procedia PDF Downloads 178
10406 Perceived Influence of Information Communication Technology on Empowerment Amongst the College of Education Physical and Health Education Students in Oyo State
Authors: I. O. Oladipo, Olusegun Adewale Ajayi, Omoniyi Oladipupo Adigun
Abstract:
Information Communication Technology (ICT) has the potential to contribute to different facets of educational development and effective learning: expanding access, promoting efficiency, improving the quality of learning, enhancing the quality of teaching, and providing an important mechanism during economic crises. Despite this, unemployment remains prevalent among higher-institution graduates in the nation, and much seems not to have been achieved in this direction. In view of this, the purpose of this study is to create awareness and enlightenment about ICT as a route to empowerment opportunities after school. A self-developed, modified 4-point Likert-scale questionnaire was used for data collection among College of Education Physical and Health Education students in Oyo State. Inferential statistical analysis using the chi-square test, set at the 0.05 alpha level, was used to test the stated hypotheses. The study concludes that awareness and enlightenment about ICT significantly influence empowerment opportunities and recommends that College of Education students be encouraged in the application of ICT for job opportunities after school.
Keywords: employment, empowerment, information communication technology, physical education
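The chi-square test of independence used for the hypotheses can be computed directly from a contingency table; the 2x2 table below (rows: aware / not aware of ICT; columns: perceive / do not perceive empowerment opportunities) uses invented counts, not the study's data:

```python
# Chi-square statistic for a contingency table: sum over cells of
# (observed - expected)^2 / expected, where expected counts come from the
# row and column marginals. The counts below are hypothetical.
def chi_square(table):
    rows = [sum(r) for r in table]
    cols = [sum(c) for c in zip(*table)]
    total = sum(rows)
    chi2 = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            expected = rows[i] * cols[j] / total
            chi2 += (observed - expected) ** 2 / expected
    return chi2

stat = chi_square([[30, 10],
                   [20, 40]])
# df = (2-1)*(2-1) = 1; 0.05 critical value for df=1 is 3.841
significant = stat > 3.841
```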
Procedia PDF Downloads 390
10405 Understanding Tacit Knowledge and DIKW
Authors: Bahadir Aydin
Abstract:
Today it is difficult to reach accurate knowledge because of the mass of data, which makes the environment more and more chaotic. Data is a main pillar of intelligence, and there is a close tie between knowledge and intelligence. Information gathered from different sources can be modified, interpreted, and classified using a knowledge development process, which is applied in order to attain intelligence. Within this process, the effect of knowledge is crucial. Knowledge is classified as explicit or tacit knowledge. Tacit knowledge can be seen as "only the tip of the iceberg"; it accounts for much more than we guess throughout the intelligence cycle. If the concept of intelligence is scrutinized, it can be seen to contain risks and threats as well as success. The main purpose of every organization is to be successful by eliminating risks and threats. Therefore, there is a need to connect, or fuse, existing information and the processes that can be used to develop it. With the help of this process, the decision-maker can be presented with a clear, holistic understanding as early as possible in the decision-making process. Planning, execution, and assessment are the key functions that connect information to knowledge. Shifting from the current traditional reactive approach to a proactive knowledge development approach would reduce extensive duplication of work in the organization. With this new approach to the process, knowledge can be used more effectively.
Keywords: knowledge, intelligence cycle, tacit knowledge, DIKW
Procedia PDF Downloads 520
10404 Two-Photon-Exchange Effects in the Electromagnetic Production of Pions
Authors: Hui-Yun Cao, Hai-Qing Zhou
Abstract:
High-precision measurements and experiments play increasingly important roles in particle physics and atomic physics. To analyse the precise experimental data sets, correspondingly precise and reliable theoretical calculations are necessary. To date, the form factors of elemental constituents such as the pion and the proton remain attractive issues in current Quantum Chromodynamics (QCD). In this work, the two-photon-exchange (TPE) effects in ep→enπ⁺ at small -t are discussed within a hadronic model. Under the pion dominance approximation and in the limit mₑ→0, the TPE contribution to the amplitude can be described by a scalar function. We calculate the TPE contributions to the amplitude and to the unpolarized differential cross section, with only the elastic intermediate state considered. The results show that the TPE corrections to the unpolarized differential cross section range from about -4% to -20% at Q²=1-1.6 GeV². After applying the TPE corrections to the experimental data sets of the unpolarized differential cross section, we analyze the TPE corrections to the separated cross sections σ(L,T,LT,TT). We find that the TPE corrections (at Q²=1-1.6 GeV²) to σL range from about -10% to -30%, to σT are about 20%, and to σ(LT,TT) are much larger. From these analyses, we conclude that the TPE contributions in ep→enπ⁺ at small -t are important for extracting the separated cross sections σ(L,T,LT,TT) and the electromagnetic form factor of π⁺ in experimental analyses.
Keywords: differential cross section, form factor, hadronic, two-photon
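For context, the separated cross sections σ(L,T,LT,TT) enter through the standard azimuthal decomposition of the unpolarized virtual-photon cross section for pion electroproduction (conventions and overall factors vary between analyses; this is the commonly used form, quoted for orientation rather than from the paper):

```latex
\frac{\mathrm{d}^2\sigma}{\mathrm{d}t\,\mathrm{d}\phi}
  = \frac{1}{2\pi}\left(
      \frac{\mathrm{d}\sigma_{T}}{\mathrm{d}t}
    + \varepsilon\,\frac{\mathrm{d}\sigma_{L}}{\mathrm{d}t}
    + \sqrt{2\varepsilon(1+\varepsilon)}\,
      \frac{\mathrm{d}\sigma_{LT}}{\mathrm{d}t}\cos\phi
    + \varepsilon\,\frac{\mathrm{d}\sigma_{TT}}{\mathrm{d}t}\cos 2\phi
  \right)
```

Here ε is the virtual-photon polarization parameter and φ the azimuthal angle of the pion; a TPE correction that varies with φ therefore feeds differently into each separated term, which is why the σ(LT,TT) extractions are the most sensitive.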
Procedia PDF Downloads 134
10403 Knowledge Management Strategies within a Corporate Environment of Papers
Authors: Daniel J. Glauber
Abstract:
Knowledge transfer between personnel, guided by a strategic approach to knowledge management, could improve an organization's competitive advantage in the marketplace. The lack of information sharing between personnel can create knowledge transfer gaps while restricting decision-making processes; knowledge transfer between personnel can potentially improve information sharing based on an implemented knowledge management strategy (KMS). An organization's capacity to gain more knowledge is aligned with the organization's prior or existing captured knowledge. This case study attempted to understand the overall influence of a KMS within the corporate environment and the knowledge exchange between personnel. The significance of this study was to help understand how organizations can improve the Return on Investment (ROI) of a knowledge management strategy within a knowledge-centric organization. A qualitative descriptive case study was the research design selected for this study. Developing a knowledge management strategy acceptable at all levels of the organization requires cooperation in support of a common organizational goal, working with management and executive members to develop a protocol in which knowledge transfer becomes standard practice across multiple tiers of the organization. The knowledge transfer process becomes measurable when it focuses on specific elements of the organizational process, including personnel transitions, to help reduce the time required to understand the job. The organization studied in this research acknowledged the need for improved knowledge management activities within the organization to help organize, retain, and distribute information throughout the workforce. 
Data produced from the study indicate three main themes identified by the participants: information management, organizational culture, and knowledge sharing within the workforce. These themes indicate a possible connection between an organization's KMS, the organization's culture, knowledge sharing, and knowledge transfer.
Keywords: knowledge transfer, management, knowledge management strategies, organizational learning, codification
Procedia PDF Downloads 443
10402 Systematic and Simple Guidance for Feed Forward Design in Model Predictive Control
Authors: Shukri Dughman, Anthony Rossiter
Abstract:
This paper builds on earlier work which demonstrated that Model Predictive Control (MPC) may give a poor default choice of feed forward compensator. By first demonstrating the impact of future information about target changes on performance, this paper proposes a pragmatic method for identifying the amount of future target information that can be utilised effectively in both finite- and infinite-horizon algorithms. Numerical illustrations in MATLAB give evidence of the efficacy of the proposal.
Keywords: model predictive control, tracking control, advance knowledge, feed forward
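Why advance knowledge of target changes helps can be seen in a toy scalar example (this is an illustration of the general principle, not the paper's MPC algorithm): for the plant x_{k+1} = a x_k + b u_k, a feedforward using the next target r_{k+1} tracks a step change with no lag, while one that only sees the current target is always one step late.

```python
# Toy comparison of tracking with and without one step of target preview
# for the scalar plant x_{k+1} = a*x_k + b*u_k. Plant numbers are arbitrary.
a, b = 0.9, 0.5
target = [0.0] * 5 + [1.0] * 5          # step change in the target at k = 5

def simulate(preview):
    x, err = 0.0, 0.0
    for k in range(len(target) - 1):
        r = target[k + 1] if preview else target[k]
        u = (r - a * x) / b             # exact model-inverting feedforward
        x = a * x + b * u               # plant update: lands on r
        err += abs(target[k + 1] - x)   # tracking error at the next step
    return err

err_preview = simulate(preview=True)     # anticipates the step: no lag
err_no_preview = simulate(preview=False) # reacts one step late
```

In a real MPC, preview enters through the target trajectory inside the prediction horizon; the paper's contribution concerns how much of that future information is actually worth using.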
Procedia PDF Downloads 549
10401 An Investigation on Organisation Cyber Resilience
Authors: Arniyati Ahmad, Christopher Johnson, Timothy Storer
Abstract:
Cyber exercises are used to assess the preparedness of a community against cyber crises, technology failures, and critical information infrastructure (CII) incidents. Cyber exercises, also called cyber crisis exercises or cyber drills, involve partnerships or collaboration between public and private agencies from several sectors. This study investigates the organisation cyber resilience (OCR) of sectors participating in a cyber exercise called X Maya in Malaysia. The study used a principle-based cyber resilience survey, the C-Suite Executive checklist, developed by the World Economic Forum in 2012. To ensure the suitability of the survey for investigating OCR, a reliability test was conducted on the C-Suite Executive checklist items. The research further investigates differences in OCR across the ten Critical National Information Infrastructure (CNII) sectors that participated in the cyber exercise. A one-way ANOVA test showed a statistically significant difference in OCR among the ten CNII sectors.
Keywords: critical information infrastructure, cyber resilience, organisation cyber resilience, reliability test
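The one-way ANOVA comparison across sectors reduces to an F statistic; a sketch of its computation on three invented groups of scores (the study compared ten sectors; these numbers are not from the paper):

```python
# One-way ANOVA F statistic: ratio of between-group to within-group mean
# squares. Groups below are hypothetical OCR scores for three sectors.
def one_way_anova_f(groups):
    all_vals = [v for g in groups for v in g]
    grand = sum(all_vals) / len(all_vals)
    # between-group sum of squares and its degrees of freedom
    ssb = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    dfb = len(groups) - 1
    # within-group sum of squares and its degrees of freedom
    ssw = sum(sum((v - sum(g) / len(g)) ** 2 for v in g) for g in groups)
    dfw = len(all_vals) - len(groups)
    return (ssb / dfb) / (ssw / dfw)

F = one_way_anova_f([[1, 2, 3], [2, 3, 4], [6, 7, 8]])
```

A large F relative to the F-distribution critical value for (dfb, dfw) indicates, as in the study, a statistically significant difference among the group means.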
Procedia PDF Downloads 365
10400 Library on the Cloud: Universalizing Libraries Based on Virtual Space
Authors: S. Vanaja, P. Panneerselvam, S. Santhanakarthikeyan
Abstract:
Cloud computing is a recent trend in libraries. By adopting cloud services, librarians can keep pace with present-day information handling and satisfy the needs of the knowledge society. Libraries are now on a path toward universalizing all their information for users, and they are turning to clouds, which give the easiest access to data and applications. Cloud computing is a highly scalable platform promising quick access to hardware and software over the internet, in addition to easy management and access by non-expert users. In this paper, we discuss the cloud's features and its potential applications in libraries and information centers; this communication illustrates how cloud computing actually works and how it can be implemented. It discusses the needs that motivate a move to the cloud and the process of migration to the cloud. In addition, this paper assesses the practical problems during migration in libraries, the advantages of the migration process, and the measures that libraries should follow during migration to the cloud. The paper highlights the benefits of cloud computing as well as some concerns regarding data ownership and data security.
Keywords: cloud computing, cloud service, cloud-based ILS, cloud providers, discovery service, IaaS, PaaS, SaaS, virtualization, web-scale access
Procedia PDF Downloads 664