Search results for: Network and Information Security
1996 A Way of Converting Color Images to Gray Scale Ones for the Color Blinds -Reducing the Colors for Tokyo Subway Map-
Authors: Katsuhiro Narikiyo, Naoto Kobayakawa
Abstract:
We propose a method for removing noise and reducing the number of colors in a JPEG image. The main purpose of this project is to convert color images to monochrome images for color-blind users. We focus on crisp, schematic color images such as the Tokyo subway map, in which each color carries important information. For color-blind viewers, however, similar colors cannot be distinguished. If those colors are converted to distinct gray values, the viewers can tell them apart.
Keywords: Image processing, Color blind, JPEG
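As an illustration of the color-to-gray mapping idea described above (not the authors' implementation), the following minimal Python sketch quantizes an image to a small palette and assigns each palette color an evenly spaced gray level; the palette size and file name are assumptions.
```python
# Hypothetical sketch: remap the distinct line colors of a subway-style
# image to evenly spaced gray levels so they remain distinguishable.
# Assumes a denoised, palette-like input image.
import numpy as np
from PIL import Image

def palette_to_grays(img: Image.Image, n_colors: int = 12) -> Image.Image:
    """Quantize to n_colors, then give each palette entry its own gray."""
    quantized = img.convert("RGB").quantize(colors=n_colors)  # reduce palette
    index = np.asarray(quantized)                             # palette indices
    # Evenly spaced gray levels keep every pair of colors separable.
    grays = np.linspace(0, 255, n_colors).astype(np.uint8)
    return Image.fromarray(grays[index], mode="L")

# gray_map = palette_to_grays(Image.open("tokyo_subway.jpg"))  # assumed file
```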
1995 A Traditional Settlement in a Modernized City: Yanbu, Saudi Arabia
Authors: Hisham Mortada
Abstract:
Transition in the urban configuration of Arab cities has never been as radical and visible as it has been since the turn of the last century. The emergence of new cities near the historical settlements of Arabia has spawned a series of developments in and around the old city precincts. These new developments are based on advanced technology and conform to globally prevalent standards of city planning, superseding the vernacular arrangements based on the traditional norms that guided so-called 'city planning'. Evidence of this is found in the extant Arab buildings at the urban core of modern cities, which inform us about an intricate spatial organization, one that satisfied multiple norms such as gender segregation and socialization, economic sustainability, and security and environmental coherence within settlement compounds. Several participating factors achieved harmony in such an inclusive city, an organization that was challenged and largely replaced by the new planning order in the face of the growing needs of globalized, economy-centric and high-tech models of development. Communities found it difficult to acclimatize to the new western planning models that were implemented at a very large scale throughout the Kingdom, which later underwent spatial re-structuring to suit users' needs. A closer look at the ancient city of Yanbu, now flanked by such new developments, allows us to differentiate and track the beginnings of this unprecedented transition in settlement formation. This paper elaborates the Arabian context offered to both the 'traditional' and 'modern' planning approaches, in order to understand the challenges and solutions offered by each at different times. In the process it also establishes the inconsistencies and conflicts that arose with the shift in planning paradigm, from traditional 'cultural norms' to modern 'physical planning', in the Arabian context. By distinguishing the two divergent planning philosophies, their impact on Arabian morphology, their relevance to lifestyle and their suitability to the biophysical environment, it concludes with a perspective on sustainability, particularly in the case of Yanbu.
Keywords: Yanbu, traditional architecture, Hijaz, coral building, Saudi Arabia.
1994 Adaptive Skin Segmentation Using Color Distance Map
Authors: Mohammad Shoyaib, M. Abdullah-Al-Wadud, Oksam Chae
Abstract:
In this paper, an effective approach for segmenting human skin regions in images taken in different environments is proposed. The proposed method uses a color distance map that is flexible enough to reliably detect skin regions even if the illumination conditions of the image vary. Local image conditions are also taken into account, which helps the technique adaptively detect differently illuminated skin regions of an image. Moreover, the use of local information also helps the skin detection process avoid picking up noisy pixels.
Keywords: Color distance map, reference skin color, region growing, skin segmentation.
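A hedged sketch of the color-distance idea above (not the authors' exact method): threshold each pixel's distance to a reference skin color in a chrominance space. The reference values and threshold below are illustrative assumptions, and the adaptive/region-growing stages are omitted.
```python
# Minimal color-distance skin detector: distance of (Cr, Cb) to a
# reference skin color; chrominance-only so illumination matters less.
import numpy as np
import cv2

def skin_mask(bgr: np.ndarray, ref=(150.0, 105.0), max_dist=20.0) -> np.ndarray:
    """Return a binary mask of pixels close to a reference skin color."""
    ycrcb = cv2.cvtColor(bgr, cv2.COLOR_BGR2YCrCb).astype(np.float32)
    cr, cb = ycrcb[..., 1], ycrcb[..., 2]
    dist = np.sqrt((cr - ref[0]) ** 2 + (cb - ref[1]) ** 2)  # color distance map
    return (dist < max_dist).astype(np.uint8) * 255

# mask = skin_mask(cv2.imread("photo.jpg"))  # assumed input file
```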
1993 Normalizing Flow to Augmented Posterior: Conditional Density Estimation with Interpretable Dimension Reduction for High Dimensional Data
Authors: Cheng Zeng, George Michailidis, Hitoshi Iyatomi, Leo L Duan
Abstract:
The conditional density characterizes the distribution of a response variable y given a predictor x, and plays a key role in many statistical tasks, including classification and outlier detection. Although there has been abundant work on the problem of Conditional Density Estimation (CDE) for a low-dimensional response in the presence of a high-dimensional predictor, little work has been done for a high-dimensional response such as images. The promising performance of normalizing flow (NF) neural networks in unconditional density estimation serves as a motivating starting point. In this work, we extend NF neural networks to the case where an external x is present. Specifically, we use the NF to parameterize a one-to-one transform between a high-dimensional y and a latent z that comprises two components [zP, zN]. The zP component is a low-dimensional subvector obtained from the posterior distribution of an elementary predictive model for x, such as logistic/linear regression. The zN component is a high-dimensional independent Gaussian vector, which explains the variations in y not, or less, related to x. Unlike existing CDE methods, the proposed approach, coined Augmented Posterior CDE (AP-CDE), only requires a simple modification of the common normalizing flow framework, while significantly improving the interpretation of the latent component, since zP represents a supervised dimension reduction. In image analytics applications, AP-CDE shows good separation of x-related variations, due to factors such as lighting condition and subject id, from the other random variations. Further, the experiments show that an unconditional NF neural network, based on an unsupervised model of z such as a Gaussian mixture, fails to generate interpretable results.
Keywords: Conditional density estimation, image generation, normalizing flow, supervised dimension reduction.
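The following toy sketch is one possible reading of how the augmented latent [zP, zN] could be assembled; it is emphatically not the authors' code, the flow training itself is omitted, and the data, dimensions and choice of logistic regression as the elementary model are all assumptions.
```python
# Toy construction of the augmented latent target [zP, zN]: zP from the
# posterior of a simple supervised model linking x and y, zN independent
# Gaussian for x-unrelated variation. A normalizing flow (not shown)
# would be trained to map y onto this latent.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n, y_dim = 500, 64
x = rng.integers(0, 2, n)                                # toy binary predictor
y = rng.standard_normal((n, y_dim)) + 2.0 * x[:, None]   # toy "images" shifted by x

# zP: low-dimensional posterior summary (log-posterior of x = 1).
clf = LogisticRegression(max_iter=1000).fit(y, x)
zP = clf.predict_log_proba(y)[:, 1:]                     # shape (n, 1)

# zN: high-dimensional independent Gaussian component.
zN = rng.standard_normal((n, y_dim - zP.shape[1]))

z_target = np.hstack([zP, zN])                           # augmented latent, (n, 64)
```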
1992 Real-Time Visualization Using GPU-Accelerated Filtering of LiDAR Data
Authors: Sašo Pečnik, Borut Žalik
Abstract:
This paper presents a technique for real-time visualization and filtering of classified LiDAR point clouds. The visualization is capable of displaying filtered information organized in layers by the classification attribute saved within LiDAR datasets. We explain the data structure and data management used, which enable real-time presentation of layered LiDAR data. Real-time visualization is achieved with level-of-detail (LOD) optimization based on the distance from the observer, without loss of quality. The filtering process is done in two steps, is entirely executed on the GPU, and is implemented using programmable shaders.
Keywords: Filtering, graphics, level-of-detail, LiDAR, real-time visualization.
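As a hedged illustration of classification-based layer filtering, here is a CPU stand-in for the paper's GPU shader passes; the class codes follow the common ASPRS LAS convention, which is an assumption, and the points are synthetic.
```python
# Keep only LiDAR points whose classification belongs to an enabled layer.
import numpy as np

LAYERS = {"ground": 2, "low_veg": 3, "building": 6}  # name -> LAS class code

def filter_layers(points: np.ndarray, classes: np.ndarray, enabled: set) -> np.ndarray:
    """Return the subset of points in the enabled classification layers."""
    codes = [LAYERS[name] for name in enabled]
    return points[np.isin(classes, codes)]

pts = np.random.rand(1000, 3)                    # x, y, z coordinates
cls = np.random.choice([2, 3, 6], size=1000)     # classification attribute
visible = filter_layers(pts, cls, {"ground", "building"})
```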
1991 A New Distribution and Application on the Lifetime Data
Authors: Gamze Ozel, Selen Cakmakyapan
Abstract:
We introduce a new model called the Marshall-Olkin Rayleigh distribution, which extends the Rayleigh distribution using the Marshall-Olkin transformation and has increasing and decreasing shapes for the hazard rate function. Various structural properties of the new distribution are derived, including explicit expressions for the moments, the generating and quantile functions, some entropy measures, and order statistics. The model parameters are estimated by the method of maximum likelihood, and the observed information matrix is determined. The potential of the new model is illustrated by means of a simulation study.
Keywords: Marshall-Olkin distribution, Rayleigh distribution, estimation, maximum likelihood.
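For reference, the Marshall-Olkin transformation named above has a standard form; the sketch below uses a common Rayleigh parameterization, and the paper's own notation and scale parameter may differ.
```latex
% Marshall-Olkin transformation of a baseline survival function \bar F:
\bar G(x) = \frac{\alpha\,\bar F(x)}{1-(1-\alpha)\,\bar F(x)},\qquad \alpha>0 .
% With the Rayleigh baseline \bar F(x) = e^{-\lambda x^{2}},\; x\ge 0,\ \lambda>0:
\bar G(x) = \frac{\alpha\, e^{-\lambda x^{2}}}{1-(1-\alpha)\, e^{-\lambda x^{2}}}.
```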
1990 Behavioral Mapping and Post-Occupancy Evaluation of Meeting-Point Design in an International Airport
Authors: Meng-Cong Zheng, Yu-Sheng Chen
Abstract:
Meeting is a pervasive kind of interaction, one that often occurs between arriving passengers and the people who come to pick them up. However, the meeting point set up at Taoyuan International Airport is too far from the arrival exit, often causing passengers to stop and search near the exit. When the number of people waiting increases at rush hour, this often results in chaos in the waiting area. This study tried to find out the key factors that help passengers and pick-ups find each other quickly. We then implemented several design proposals to improve the meeting behavior of passengers and pick-ups, based on behavioral mapping and post-occupancy evaluation, to enhance their meeting efficiency in unfamiliar environments. The research site is the reception hall of the second terminal of Taoyuan International Airport. Behavioral observation and mapping were carried out on the entry of inbound passengers into the welcome space, including the distribution of the crowd leaning on the separation wall in the waiting area, the meeting behavior, and the interaction between inbound passengers and pick-ups. We then redesigned the space planning and signage based on post-occupancy evaluation to verify the effectiveness of the space plan and signage design. This study found that passengers ignore the existing meeting-point designs, which are placed on distant pillars at both ends. The position of the arrival screen affects the area where pick-ups linger, causing them to block the passengers' path. Pick-ups prefer to wait where it is easy to watch incoming passengers and where it is closest to the mode of transport they will take when leaving. Large groups tend to gather next to landmarks, while smaller groups spread across the wide waiting area of the lobby. The location of the meeting point chosen by pick-ups is related to the sightline of the incoming passenger. Finally, this study proposes an improved design of the meeting point that places transport information within it, so that most passengers see this information on arrival. We also redesigned the pick-up desk to improve the efficiency of passenger meetings.
Keywords: Meeting point design, post-occupancy evaluation, behavioral mapping, international airport.
1989 The Impact of Temporal Impairment on Quality of Experience (QoE) in Video Streaming: A No Reference (NR) Subjective and Objective Study
Authors: Muhammad Arslan Usman, Muhammad Rehan Usman, Soo Young Shin
Abstract:
Live video streaming is one of the most widely used services among end users, yet it is a big challenge for network operators in terms of quality. The only way to provide excellent Quality of Experience (QoE) to end users is continuous monitoring of the live video stream. For this purpose, several objective algorithms are available that monitor the quality of the video in a live stream. Subjective tests play a very important role in fine-tuning the results of objective algorithms: as human perception is considered the most reliable source for assessing the quality of a video stream, subjective tests are conducted in order to develop more reliable objective algorithms. Temporal impairments in a live video stream can have a negative impact on end users. In this paper, we conduct subjective evaluation tests on a set of video sequences containing a temporal impairment known as frame freezing. Frame freezing can result from transmission errors as well as hardware errors, which cause the loss of video frames on the receiving side of a transmission system. In our subjective tests, we evaluate videos that contain a single freezing event as well as videos that contain multiple freezing events. We record the subjective test results for all videos in order to compare the available No Reference (NR) objective algorithms. Finally, we report the performance of the NR algorithms used for objective evaluation and suggest the algorithm that works best. The outcome of this study shows the importance of QoE and its effect on human perception. The subjective evaluation results can also serve to validate objective algorithms.
Keywords: Objective evaluation, subjective evaluation, quality of experience (QoE), video quality assessment (VQA).
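A naive no-reference freeze detector illustrates the impairment studied above; this is a hedged sketch, not any of the NR algorithms evaluated in the paper, and the difference threshold and minimum run length are assumed values.
```python
# Flag runs of near-identical consecutive frames as freeze events.
import numpy as np

def freeze_events(frames: list, thresh=0.5, min_len=3) -> list:
    """Return (start, end) frame-index pairs of detected freeze runs."""
    events, start = [], None
    for i in range(1, len(frames)):
        mad = np.mean(np.abs(frames[i].astype(float) - frames[i - 1].astype(float)))
        if mad < thresh:                       # frame barely changed: frozen
            start = i - 1 if start is None else start
        else:
            if start is not None and i - start >= min_len:
                events.append((start, i - 1))
            start = None
    if start is not None and len(frames) - start >= min_len:
        events.append((start, len(frames) - 1))
    return events
```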
1988 Knowledge Management Model for Managing Knowledge among Related Organizations
Authors: Mahboubeh Molaei
Abstract:
Transferring information developed by other people is an ordinary event that happens during daily conversations. For example, when employees see each other in the organization, have lunch together, or attend a meeting, they tend to talk about their experience, discuss their current projects, and talk about their successes with specific problems. Despite the potential value of leveraging organizational memory and expertise by using OMS and ER, small organizations still have not been able to capitalize on their promised value. Each organization has its internal knowledge management system; in some organizations the system faces a lack of expert people to save their experience in the repository, while in others there are many expert people but the organization does not make maximum use of their knowledge.
Keywords: Knowledge, knowledge management.
1987 Open Source Algorithms for 3D Geo-Representation of Subsurface Formations Properties in the Oil and Gas Industry
Authors: Gabriel Quintero
Abstract:
This paper presents the result of implementing a series of algorithms for representing subsurface formation properties in most 3D geographic software, even Google Earth, by combining 2D charts or 3D plots over a 3D background, allowing everyone to use them regardless of the economic size of the company for which they work. Although complex and expensive specialized software exists for modeling subsurface formations from the same input information, this open source development offers higher and easier usability with good results, limiting the rendered properties and polygons to a basic set of charts and tubes.
Keywords: Chart, earth, formations, subsurface, visualization.
1986 Checklist for Autism Spectrum Disorder as an In-class Observation Tool for Teachers
Authors: W. Król-Gierat
Abstract:
The majority of Special Educational Needs checklists are intended for preliminary screening in the special education disability process. The aim of the present paper is to demonstrate their potential usefulness as in-class observation tools for teachers working with students who have already been diagnosed with a disorder. A checklist may complement and organize information about a given child, which is indispensable to improving his or her condition. The case of a Polish boy with autism serves as an example. Last but not least, alternative uses of checklists are suggested.
Keywords: Autism Spectrum Disorders, case study, checklist, observation tool.
1985 Machine Learning Techniques for COVID-19 Detection: A Comparative Analysis
Authors: Abeer Aljohani
Abstract:
The spread of the COVID-19 virus has produced one of the most extreme pandemics across the globe. The disease is caused by a contagious coronavirus that continuously mutates into numerous variants; currently, the B.1.1.529 variant, labeled Omicron, has been detected in South Africa. The huge spread of COVID-19 has affected many lives and placed exceptional pressure on healthcare systems worldwide, while everyday life and the global economy have been at stake. The large number of COVID-19 cases has placed a huge burden on hospitals as well as health workers. To reduce this burden, this paper predicts COVID-19 disease based on the symptoms and medical history of the patient. As machine learning is a widely accepted area that gives promising results for healthcare, this research presents an architecture for COVID-19 detection using ML techniques integrated with feature dimensionality reduction. The paper uses a standard University of California Irvine (UCI) dataset for predicting COVID-19 disease, comprising the symptoms of 5434 patients, and compares several supervised ML techniques on the presented architecture. The architecture utilizes 10-fold cross-validation for generalization and Principal Component Analysis (PCA) for feature reduction. Standard parameters are used to evaluate the proposed architecture, including F1-score, precision, accuracy, recall, Receiver Operating Characteristic (ROC) and Area Under the Curve (AUC). The results show that decision tree, random forest and neural networks outperform all other state-of-the-art ML techniques. This result can be used to effectively identify COVID-19 infection cases.
Keywords: Supervised machine learning, COVID-19 prediction, healthcare analytics, Random Forest, Neural Network.
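A minimal sketch of the evaluation pipeline described above (PCA feature reduction plus 10-fold cross-validation); the synthetic data, component count and classifier settings are assumptions, not the paper's actual setup.
```python
# PCA + random forest evaluated with 10-fold cross-validation.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(42)
X = rng.integers(0, 2, size=(5434, 20)).astype(float)  # toy symptom flags
y = rng.integers(0, 2, size=5434)                      # toy COVID-19 label

model = make_pipeline(
    StandardScaler(),
    PCA(n_components=10),              # feature dimensionality reduction
    RandomForestClassifier(n_estimators=100, random_state=0),
)
scores = cross_val_score(model, X, y, cv=10, scoring="f1")  # 10-fold CV
print(f"mean F1 over 10 folds: {scores.mean():.3f}")
```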
1984 Genetic Mining: Using Genetic Algorithm for Topic based on Concept Distribution
Authors: S. M. Khalessizadeh, R. Zaefarian, S.H. Nasseri, E. Ardil
Abstract:
Today, the Genetic Algorithm is used to solve a wide range of optimization problems. Some research has applied Genetic Algorithms to text classification, summarization and information retrieval in the text mining process, showing better performance due to the nature of the Genetic Algorithm. In this paper, a new algorithm for using the Genetic Algorithm in concept weighting and topic identification, based on concept standard deviation, is explored.
Keywords: Genetic Algorithm, Text Mining, Term Weighting, Concept Extraction, Concept Distribution.
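A generic GA skeleton follows as a hedged sketch of the approach named above; the fitness function, which rewards weightings that emphasize concepts with high standard deviation across documents, is only a guess at the paper's criterion, and all data are synthetic.
```python
# Generic GA over concept-weight vectors: selection, uniform crossover,
# Gaussian mutation; fitness tied to concept standard deviation.
import numpy as np

rng = np.random.default_rng(1)
doc_concept = rng.random((50, 8))           # toy document x concept matrix
concept_std = doc_concept.std(axis=0)       # spread of each concept's distribution

def fitness(w: np.ndarray) -> float:
    w = w / w.sum()                          # normalize weights
    return float(w @ concept_std)            # reward discriminative concepts

pop = rng.random((30, 8)) + 0.01
for _ in range(100):                         # generations
    scores = np.array([fitness(w) for w in pop])
    parents = pop[np.argsort(scores)[-10:]]              # elitist selection
    pairs = parents[rng.integers(0, 10, (20, 2))]
    mask = rng.random((20, 8)) < 0.5
    children = np.where(mask, pairs[:, 0], pairs[:, 1])  # uniform crossover
    children = np.clip(children + rng.normal(0, 0.05, children.shape), 1e-6, 1)
    pop = np.vstack([parents, children])
best = pop[np.argmax([fitness(w) for w in pop])]
```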
1983 Deployment of Service Quality Characteristics
Authors: Shuki Dror
Abstract:
This work discusses an innovative methodology for the deployment of service quality characteristics. Four groups of organizational features that may influence the quality of services are identified: human resources, technology, planning, and organizational relationships. A House of Service Quality (HOSQ) matrix is built to extract the desired improvement in the service quality characteristics and to translate it into a hierarchy of important organizational features. The Mean Square Error (MSE) criterion enables the pinpointing of the few essential service quality characteristics to be improved, as well as the selection of the vital organizational features. The method was implemented in an engineering supply enterprise and provides useful information on its vital service dimensions.
Keywords: HOQ, organizational features, service quality.
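One plausible reading of the MSE criterion mentioned above is sketched below (not necessarily the paper's exact formula): sort the importance scores and pick the split into "vital few" and "trivial many" that minimizes the pooled within-group squared error.
```python
# MSE-style vital-few selection over ranked importance scores.
import numpy as np

def vital_few(scores: np.ndarray) -> np.ndarray:
    s = np.sort(scores)[::-1]                      # descending importance
    best_k, best_mse = 1, np.inf
    for k in range(1, len(s)):
        vital, trivial = s[:k], s[k:]
        mse = (np.sum((vital - vital.mean()) ** 2) +
               np.sum((trivial - trivial.mean()) ** 2)) / len(s)
        if mse < best_mse:
            best_k, best_mse = k, mse
    return s[:best_k]                              # the vital few scores

print(vital_few(np.array([9.1, 8.7, 8.5, 3.2, 2.9, 2.5, 1.1])))
```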
1982 Assessing Water Quality Using GIS: The Case of Northern Lebanon Miocene Aquifer
Authors: M. Saba, A. Iaaly, E. Carlier, N. Georges
Abstract:
This research focuses on assessing the groundwater quality of Northern Lebanon as affected by saline water intrusion. Chemical, physical and microbiological parameters were collected in various seasons spanning a period of two years. Results were assessed using a Geographic Information System (GIS) due to its visual capabilities in presenting the extent of pollution in the studied region. Future projections of excessive pumping were also simulated using GIS in order to assess the extent of the saline intrusion problem in the near future.
Keywords: GIS, saline water, quality control, drinkable water quality standards, pumping.
1981 MEAL Project: Modifying Eating Attitudes and Actions through Learning
Authors: E. Oliver, A. Cebolla, A. Dominguez, A. Gonzalez-Segura, E. de la Cruz, S. Albertini, L. Ferrini, K. Kronika, T. Nilsen, R. Baños
Abstract:
The main objective of MEAL is to develop a pedagogical tool aimed at helping teachers and nutritionists (students and professionals) acquire, train, promote and deliver to children basic nutritional education and healthy eating behaviour competencies. MEAL focuses on eating behaviours, not only on nutritional literacy, and will use new technologies such as Information and Communication Technologies (ICTs) and serious games (SG) platforms to consolidate nutritional competences and habits.
Keywords: Nutritional Education, Pedagogical ICT Platform, Serious Games, Teachers and Nutritionists, Training Course.
1980 Cost Benefit Analysis: Evaluation among the Millimetre Wavebands and SHF Bands of Small Cell 5G Networks
Authors: Emanuel Teixeira, Anderson Ramos, Marisa Lourenço, Fernando J. Velez, Jon M. Peha
Abstract:
This article discusses cost-benefit analysis aspects of millimetre wavebands (mmWaves) and the Super High Frequency (SHF) band. The decay of the carrier-to-noise-plus-interference ratio with coverage distance is assessed by considering two different path loss models: the two-slope urban micro Line-of-Sight (UMiLoS) model for the SHF band and the modified Friis propagation model for frequencies above 24 GHz. The equivalent supported throughput is estimated at the 5.62, 28, 38, 60 and 73 GHz frequency bands, and the influence of the carrier-to-noise-plus-interference ratio on the radio and network optimization process is explored. Mostly owing to the attenuation behaviour of the two-slope propagation model for the SHF band, the supported throughput at this band is higher than at the millimetre wavebands only for the longest cell lengths. The cost-benefit analysis of these pico-cellular networks was performed for regular cellular topologies, considering unlicensed spectrum. For the shortest distances, an optimum of the revenue in percentage terms occurs at cell lengths R ≈ 10 m for the millimetre wavebands, while for longer distances an optimum of the revenue can be observed at R ≈ 550 m for 5.62 GHz. For the 5.62 GHz band, the profit is slightly lower than for the millimetre wavebands at the shortest values of R, and starts to increase for cell lengths approximately equal to the ratio between the break-point distance and the co-channel reuse factor, achieving a maximum at R approximately equal to 550 m.
Keywords: 5G, millimetre wavebands, super high-frequency band, SINR, signal-to-interference-plus-noise ratio, cost-benefit analysis.
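To make the frequency dependence concrete, here is a sketch using the textbook free-space Friis formula at the bands listed above; it is not the paper's modified Friis or UMiLoS models, and the 100 m distance is an illustrative value.
```python
# Free-space path loss across the carrier frequencies studied above.
import numpy as np

C = 3e8  # speed of light, m/s

def friis_path_loss_db(d_m: float, f_hz: float) -> float:
    """Free-space path loss in dB at distance d_m and frequency f_hz."""
    wavelength = C / f_hz
    return 20 * np.log10(4 * np.pi * d_m / wavelength)

for f_ghz in (5.62, 28, 38, 60, 73):
    print(f"{f_ghz:>5} GHz, R = 100 m: {friis_path_loss_db(100, f_ghz * 1e9):.1f} dB")
```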
1979 Using Automatic Ontology Learning Methods in Human Plausible Reasoning Based Systems
Authors: A. R. Vazifedoost, M. Rahgozar, F. Oroumchian
Abstract:
Knowledge discovery from text and ontology learning are relatively new fields; however, they are used increasingly in many areas, such as Information Retrieval (IR) and its related domains. Human Plausible Reasoning based (HPR) IR systems, for example, need a knowledge base as their underlying system, which is currently built by hand. In this paper we propose an architecture based on ontology learning methods to automatically generate the needed HPR knowledge base.
Keywords: Ontology Learning, Human Plausible Reasoning, knowledge extraction, knowledge representation.
1978 Fast Extraction of Edge Histogram in DCT Domain based on MPEG7
Authors: Minyoung Eom, Yoonsik Choe
Abstract:
Nowadays, multimedia data are transmitted and processed in compressed format. Due to the decoding procedure and the filtering required for edge detection, the feature extraction process of the MPEG-7 Edge Histogram Descriptor is time-consuming as well as computationally expensive. To improve the efficiency of compressed image retrieval, we propose a new edge histogram generation algorithm in the DCT domain. Using the edge information provided by only two AC coefficients of the DCT coefficients, we obtain edge directions and strengths directly in the DCT domain. The experimental results demonstrate that our system performs well in terms of retrieval efficiency and effectiveness.
Keywords: DCT, Descriptor, EHD, MPEG7.
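The sketch below illustrates the idea of reading edge cues from two low-order AC coefficients of a block DCT; it is a hedged toy, not the paper's exact rule or the MPEG-7 descriptor, and the block size and direction mapping are assumptions.
```python
# Per-block edge strength/direction from the two lowest AC DCT terms:
# F(0,1) responds to horizontal intensity change (vertical edges),
# F(1,0) to vertical change (horizontal edges).
import numpy as np
from scipy.fft import dctn

def block_edge(block8x8: np.ndarray) -> tuple:
    """Return (strength, direction in radians) from DCT coefficients."""
    coeffs = dctn(block8x8.astype(float), norm="ortho")
    ac_v, ac_h = coeffs[0, 1], coeffs[1, 0]   # two lowest AC coefficients
    strength = float(np.hypot(ac_v, ac_h))
    direction = float(np.arctan2(ac_h, ac_v))
    return strength, direction

block = np.tile(np.linspace(0, 255, 8), (8, 1))   # horizontal ramp: vertical edge
print(block_edge(block))
```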
1977 Semi-Automatic Approach for Semantic Annotation
Authors: Mohammad Yasrebi, Mehran Mohsenzadeh
Abstract:
The third phase of the web, the semantic web, requires many web pages to be annotated with metadata. A crucial question, then, is where to acquire these metadata. In this paper we propose a semi-automatic method to annotate the text of documents and web pages, which employs a quite comprehensive knowledge base to categorize instances with respect to an ontology. The approach is evaluated against manual annotations and against one of the most popular annotation tools that works in the same way as ours. The approach is implemented in the .NET framework and uses WordNet as its knowledge base, forming an annotation tool for the Semantic Web.
Keywords: Semantic Annotation, Metadata, Information Extraction, Semantic Web, knowledge base.
1976 Deep Learning for Renewable Power Forecasting: An Approach Using LSTM Neural Networks
Authors: Fazıl Gökgöz, Fahrettin Filiz
Abstract:
Load forecasting has become crucial in recent years and is a popular topic in the forecasting area; many different power forecasting models have been tried out for this purpose. Electricity load forecasting is necessary for energy policies and for healthy and reliable grid systems. Effective forecasting of renewable energy load leads decision makers to minimize the costs of electric utilities and power plants, and forecasting tools are required to predict how much renewable energy can be utilized. The purpose of this study is to explore the effectiveness of LSTM-based neural networks for estimating renewable energy loads. We present models for predicting renewable energy loads based on deep neural networks, especially Long Short-Term Memory (LSTM) algorithms. Deep learning allows multiple layers of models to learn representations of data, and LSTM algorithms are able to store information for long periods of time. Deep learning models have recently been used to forecast renewable energy sources, such as predicting wind and solar power. Historical load and weather information are the most important input variables for power forecasting models. The dataset contains power consumption measurements gathered between January 2016 and December 2017 at one-hour resolution, using publicly available data from the Turkish Renewable Energy Resources Support Mechanism. Forecasting studies were carried out on these data via a deep neural network approach, including the LSTM technique, for Turkish electricity markets. 432 different models were created by varying layer cell counts and dropout. The adaptive moment estimation (Adam) algorithm was used for training as a gradient-based optimizer instead of stochastic gradient descent (SGD); Adam performed better than SGD in terms of faster convergence and lower error rates. Model performance is compared according to MAE (Mean Absolute Error) and MSE (Mean Squared Error); the best MAE results out of the 432 tested models are 0.66, 0.74, 0.85 and 1.09. The forecasting performance of the proposed LSTM models gives successful results compared to the literature.
Keywords: Deep learning, long short-term memory, energy, renewable energy load forecasting.
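A minimal LSTM forecaster along the lines described above is sketched below; the layer size, window length, dropout rate and synthetic hourly signal are assumptions, not one of the paper's 432 tuned configurations.
```python
# LSTM regression on a sliding window of an hourly load signal,
# trained with Adam and evaluated with MAE/MSE as in the study.
import numpy as np
from tensorflow import keras

window = 24                                          # hours of history per sample
series = np.sin(np.arange(2000) * 2 * np.pi / 24)    # toy hourly load signal
X = np.stack([series[i:i + window] for i in range(len(series) - window)])
y = series[window:]
X = X[..., None]                                     # (samples, timesteps, features)

model = keras.Sequential([
    keras.Input(shape=(window, 1)),
    keras.layers.LSTM(64),
    keras.layers.Dropout(0.2),
    keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse", metrics=["mae"])
model.fit(X, y, epochs=5, batch_size=32, verbose=0)
```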
1975 Preliminary Evaluation of Decommissioning Wastes for the First Commercial Nuclear Power Reactor in South Korea
Authors: Kyomin Lee, Joohee Kim, Sangho Kang
Abstract:
The first commercial nuclear power reactor in South Korea, Kori Unit 1, a 587 MWe pressurized water reactor that started operation in 1978, was permanently shut down in June 2017 without an additional operating license extension. Kori Unit 1 is scheduled to become the first nuclear power unit in the country to enter the decommissioning phase. In this study, a preliminary evaluation of the decommissioning wastes for Kori Unit 1 was performed through the following series of steps: firstly, the plant inventory was investigated based on various documents (i.e., equipment/component lists, construction records, general arrangement drawings). Secondly, the radiological conditions of systems, structures and components (SSCs) were established to estimate the amount of radioactive waste by waste classification. Thirdly, waste management strategies for Kori Unit 1, including waste packaging, were established. Fourthly, proper decontamination and dismantling (D&D) technologies were selected considering various factors. Finally, the amount of decommissioning waste by classification for Kori 1 was estimated using the DeCAT program, developed by KEPCO-E&C for decommissioning cost estimation. The preliminary evaluation results show that the expected amounts of decommissioning wastes were less than about 2% and 8% of the total waste generated (i.e., the sum of clean waste and radwaste) before and after waste processing, respectively, and that the majority of contaminated material was carbon or alloy steel and stainless steel. In addition, within the range of available information, the evaluation results were compared with data from various decommissioning experiences and international/national decommissioning studies. The comparison shows that the radioactive waste amount from the Kori Unit 1 decommissioning is much less than that from plants decommissioned in the U.S. and comparable to that from plants in Europe. This difference derives from differing disposal costs and clearance criteria (i.e., free release levels) between the U.S. and other countries. The preliminary evaluation performed using the methodology established in this study will provide important information for establishing the decommissioning plan, schedule and waste management strategy, including the transportation, packaging, handling, and disposal of radioactive wastes.
Keywords: Characterization, classification, decommissioning, decontamination and dismantling, Kori 1, radioactive waste.
1974 A Robust Audio Fingerprinting Algorithm in MP3 Compressed Domain
Authors: Ruili Zhou, Yuesheng Zhu
Abstract:
In this paper, a new robust audio fingerprinting algorithm in the MP3 compressed domain is proposed, with high robustness to time scale modification (TSM). Instead of simply employing short-term information from the MP3 stream, the new algorithm extracts long-term features in the MP3 compressed domain by using modulation frequency analysis. Our experiments have demonstrated that the proposed method can achieve a hit rate above 95% in audio retrieval and resist attacks of 20% TSM. It has a lower bit error rate (BER) than the other algorithms. The proposed algorithm can also be used in other compressed domains, such as AAC.
Keywords: Audio Fingerprinting, MP3, Modulation Frequency, TSM.
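A hedged sketch of modulation-frequency analysis on an energy envelope follows; it shows the general long-term-feature idea only, not the paper's MP3-domain implementation, and the frame sizes, signal and binarization rule are assumptions.
```python
# Modulation spectrum of a per-frame energy envelope: the Fourier
# transform of slow energy variations, which survive time-scale
# modification better than short-term spectra.
import numpy as np

sr, frame = 32000, 1024
signal = np.random.randn(sr * 4)               # stand-in for decoded audio
n_frames = len(signal) // frame
energy = np.array([np.sum(signal[i*frame:(i+1)*frame] ** 2)
                   for i in range(n_frames)])  # per-frame energy envelope

mod_spectrum = np.abs(np.fft.rfft(energy - energy.mean()))
fingerprint_bits = (np.diff(mod_spectrum) > 0).astype(int)  # toy binarization
```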
1973 Performance and Availability Analyses of PV Generation Systems in Taiwan
Authors: H. S. Huang, J. C. Jao, K. L. Yen, C. T. Tsai
Abstract:
This article applies the monthly final energy yield and failure data of 202 PV systems installed in Taiwan to analyze PV operational performance and system availability. The data were collected by the Industrial Technology Research Institute through manual records. Bad data detection and failure data estimation approaches are proposed to guarantee the quality of the received information. The performance ratio value and system availability are then calculated and compared with those of other countries. The results indicate that the average performance ratio of Taiwan's PV systems is 0.74 and the availability is 95.7%. These results are similar to those of Germany, Switzerland, Italy and Japan.
Keywords: Availability, performance ratio, PV system, Taiwan.
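For reference, the standard performance ratio calculation is sketched below (the usual IEC 61724-style definition, which the paper is assumed to follow); the example numbers are made up to land on the reported 0.74 average and are not the paper's measurements.
```python
# Performance ratio: final yield over reference yield.
def performance_ratio(e_out_kwh: float, p_rated_kw: float,
                      irradiation_kwh_m2: float, g_stc_kw_m2: float = 1.0) -> float:
    """PR = final yield Yf / reference yield Yr."""
    yf = e_out_kwh / p_rated_kw            # kWh per kW installed
    yr = irradiation_kwh_m2 / g_stc_kw_m2  # equivalent sun-hours
    return yf / yr

# Example: 1110 kWh from a 10 kW array under 150 kWh/m^2 in-plane irradiation.
print(f"PR = {performance_ratio(1110, 10, 150):.2f}")  # -> 0.74
```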
1972 Using the Polynomial Approximation Algorithm in the Algorithm 2 for Manipulator's Control in an Unknown Environment
Authors: Pavel K. Lopatin, Artyom S. Yegorov
Abstract:
Algorithm 2 for the movement of an n-link manipulator amidst arbitrary unknown static obstacles is presented, for the case when a sensor system supplies information about local neighborhoods of different points in the configuration space. Algorithm 2 guarantees reaching a target position in a finite number of steps, and reduces to a finite number of calls of a subroutine for planning a trajectory in the presence of known forbidden states. The polynomial approximation algorithm used as the subroutine is presented, and the results of the Algorithm 2 implementation are given.
Keywords: Manipulator, trajectory planning, unknown obstacles.
1971 A Blind Digital Watermark in Hadamard Domain
Authors: Saeid Saryazdi, Hossein Nezamabadi-pour
Abstract:
A new blind gray-level watermarking scheme is described. In the proposed method, the host image is first divided into 4×4 non-overlapping blocks. For each block, the first two AC coefficients of its Hadamard transform are estimated using the DC coefficients of its neighboring blocks. A gray-level watermark is then added to the estimated values. Since embedding the watermark does not change the DC coefficients, the watermark can be extracted by estimating the AC coefficients again and comparing them with their actual values. Several experiments were made, and the results suggest the robustness of the proposed algorithm.
Keywords: Digital Watermarking, Image Watermarking, Information Hiding, Steganography.
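A toy reading of the embedding step is sketched below; the crude AC estimator from neighbor DC values and the strength parameter are placeholders, not the authors' actual estimator.
```python
# 4x4 Hadamard-domain embedding: offset the first two AC coefficients
# around a value predicted from neighboring blocks' DC coefficients.
import numpy as np
from scipy.linalg import hadamard

H = hadamard(4).astype(float)              # symmetric, H @ H = 4*I

def hadamard_2d(block: np.ndarray) -> np.ndarray:
    """2-D Hadamard transform; applying it twice recovers the block."""
    return H @ block @ H / 4.0

def embed(block: np.ndarray, neighbor_dcs: list, w: float, alpha=2.0) -> np.ndarray:
    coeffs = hadamard_2d(block.astype(float))
    est = float(np.mean(neighbor_dcs))     # crude AC prediction from neighbor DCs
    coeffs[0, 1] = est + alpha * w         # watermark the first two AC slots
    coeffs[1, 0] = est - alpha * w
    return hadamard_2d(coeffs)             # inverse = same transform
```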
1970 Social Media Research and Its Effect on Our Society
Authors: A. T. M Shahjahan, Kutub Uddin Chisty
Abstract:
Social media refers to the means of interaction among people in which they create, share, exchange and comment on content among themselves in virtual communities and networks. Social media, or "social networking", has almost become part of our daily lives, and the term has been tossed around freely over the past few years. It is like any other media, such as newspapers, radio and television, but it is far more than just a means of sharing information and ideas. Social networking tools like Twitter, Facebook, Flickr and blogs have facilitated the creation and exchange of ideas more quickly and widely than the conventional media. This paper examines the choices, communication, feeling of comfort, time saving and effects of social media among people.
Keywords: Media, Choice, Effect.
1969 Optimization of Doubly Fed Induction Generator Equivalent Circuit Parameters by Direct Search Method
Authors: Mamidi Ramakrishna Rao
Abstract:
The doubly-fed induction generator (DFIG) is currently the choice for many wind turbines. These generators, when connected to the grid through a converter, are subjected to varied power system conditions such as voltage variation, frequency variation, and short circuit faults. Further, many countries, such as Canada, Germany, the UK and Scotland, have distinct grid codes relating to wind turbines; accordingly, following network faults, wind turbines have to supply a definite reactive current. To satisfy these requirements, including reactive current capability, an optimum electrical design is mandatory for the DFIG. This paper intends to optimize the equivalent circuit parameters of an electrical design for satisfactory DFIG performance. The direct search method has been used for optimization of the parameters. The variables selected include electromagnetic core dimensions (diameters and stack length), slot dimensions, the radial air gap between stator and rotor, and winding copper cross-section area. Optimization of a 2 MW DFIG has been executed separately for three objective functions: maximum reactive power capability (Case I), maximum efficiency (Case II) and minimum weight (Case III). In the optimization analysis program, voltage variations (10%), leading and lagging power factors (0.95), and speeds corresponding to slips from -0.3 to +0.3 have been considered. The optimum designs obtained for the three objective functions were compared. It can be concluded that the direct search method of optimization helps in determining an optimum electrical design for each objective function, whether efficiency, reactive power capability or weight minimization.
Keywords: Direct search, DFIG, equivalent circuit parameters, optimization.
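A generic compass-style direct search is sketched below to illustrate the family of methods named above; the toy quadratic objective stands in for the DFIG design objectives, which require an electromagnetic model not shown here.
```python
# Compass direct search: probe +/- step along each coordinate,
# accept improvements, halve the step when no neighbor is better.
import numpy as np

def direct_search(f, x0, step=0.5, tol=1e-6, max_iter=10_000):
    """Minimize f without gradients by coordinate probing."""
    x = np.asarray(x0, float)
    fx = f(x)
    for _ in range(max_iter):
        improved = False
        for i in range(len(x)):
            for sign in (+1.0, -1.0):
                trial = x.copy()
                trial[i] += sign * step
                ft = f(trial)
                if ft < fx:
                    x, fx, improved = trial, ft, True
        if not improved:
            step *= 0.5                    # refine the mesh
            if step < tol:
                break
    return x, fx

toy = lambda v: (v[0] - 1.2) ** 2 + (v[1] + 0.7) ** 2   # placeholder objective
print(direct_search(toy, [0.0, 0.0]))
```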
1968 IFC-Based Construction Engineering Domain Ontology Development
Authors: Jin Si, Yanzhong Wang
Abstract:
The essence of the 21st century is the knowledge economy: knowledge has become the key resource of economic growth and social development, and the construction industry is no exception. Because of the complexity of construction projects, a project manager cannot depend only on information management; the only way to improve the level of construction project management is to set up an effective knowledge accumulation mechanism. This paper first introduces the IFC standard and the concept of an ontology, then puts forward a construction method for an architectural engineering domain ontology based on IFC, and finally builds up the concepts, properties and relationships between the concepts of the ontology. The deficiencies of this paper are also pointed out.
Keywords: Construction Engineering, IFC, Ontology
1967 Architecture Exception Governance
Authors: Ondruska Marek
Abstract:
The article presents a complete model of IS/IT architecture exception governance. First, the assumptions of the presented model are set out. Next, a generic governance model is defined to serve as a basis for the architecture exception governance. The architecture exception definition and its attributes follow. The model respects well-known approaches to the area, which are described in the text, but it adopts higher granularity of description and expands the process view with all the further necessary governance components, such as roles, principles, policies and tools, to enable implementation of the model in organizations. The architecture exception process is decomposed into a set of processes related to the architecture exception lifecycle, consisting of a set of phases and architecture exception states. Finally, information about future research related to this area is given.
Keywords: Architecture, dispensation, exception, governance, model.