Search results for: object based classification
28517 The Optimum Mel-Frequency Cepstral Coefficients (MFCCs) Contribution to Iranian Traditional Music Genre Classification by Instrumental Features
Authors: M. Abbasi Layegh, S. Haghipour, K. Athari, R. Khosravi, M. Tafkikialamdari
Abstract:
An approach is proposed to find the optimum mel-frequency cepstral coefficients (MFCCs) for the Radif of Mirzâ Ábdollâh, which is the principal emblem and the heart of Persian music, performed by the most famous Iranian masters on two Iranian stringed instruments, ‘Tar’ and ‘Setar’. While investigating the variance of the MFCCs for each record in the music database of 1500 gushe of the repertoire belonging to 12 modal systems (dastgâh and âvâz), we applied the Fuzzy C-Means clustering algorithm to each of the 12 coefficients and to different combinations of those coefficients. We repeated the same experiment while increasing the number of coefficients, but the clustering accuracy remained the same. Therefore, we can conclude that the first 7 MFCCs (V-7MFCC) are sufficient for classification of the Radif of Mirzâ Ábdollâh. Classical machine learning algorithms such as MLP neural networks, K-Nearest Neighbors (KNN), Gaussian Mixture Model (GMM), Hidden Markov Model (HMM) and Support Vector Machine (SVM) were employed. Finally, SVM showed the best performance in this study.
Keywords: radif of Mirzâ Ábdollâh, gushe, mel-frequency cepstral coefficients, fuzzy c-means clustering algorithm, k-nearest neighbors (KNN), gaussian mixture model (GMM), hidden markov model (HMM), support vector machine (SVM)
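As a sketch of the clustering step described above, the following is a minimal one-dimensional fuzzy c-means implementation; the per-record variance values, cluster count, and iteration budget are invented for illustration and are not taken from the paper:

```python
def fuzzy_c_means(points, c=2, m=2.0, iters=100):
    """Minimal 1-D fuzzy c-means: returns (centers, membership rows)."""
    # Deterministic initialization at the extremes (c=2) or first c sorted points.
    centers = [min(points), max(points)] if c == 2 else sorted(points)[:c]
    u = []
    for _ in range(iters):
        # Membership update: u_ik = 1 / sum_j (d_ik / d_jk)^(2/(m-1))
        u = []
        for x in points:
            d = [abs(x - ck) or 1e-12 for ck in centers]  # avoid zero distance
            u.append([1.0 / sum((d[i] / d[j]) ** (2.0 / (m - 1)) for j in range(c))
                      for i in range(c)])
        # Center update: mean of the points weighted by membership^m
        centers = [sum(u[k][i] ** m * points[k] for k in range(len(points))) /
                   sum(u[k][i] ** m for k in range(len(points)))
                   for i in range(c)]
    return centers, u

# Invented per-record variances of one MFCC for records from two modal systems
variances = [0.9, 1.0, 1.1, 1.2, 4.8, 5.0, 5.1, 5.3]
centers, u = fuzzy_c_means(variances, c=2)
labels = [max(range(2), key=lambda i: row[i]) for row in u]
print(sorted(centers), labels)
```

The hard label of each record is simply the cluster with the largest membership; repeating the run with more coefficients (or more clusters) and comparing accuracies mirrors the paper's selection procedure.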
Procedia PDF Downloads 446
28516 The Risk of Deaths from Viral Hepatitis among the Female Workers in the Beauty Service Industry
Authors: Byeongju Choi, Sanggil Lee, Kyung-Eun Lee
Abstract:
Introduction: In the Republic of Korea, the number of workers in the beauty industry has been increasing. Because the prevalence of hepatitis B carriers in Korea is higher than in other countries, a risk of blood-borne infection, including viral hepatitis B and C, can be expected among beauty salon workers, who use sharp and potentially contaminated instruments during procedures. However, health care policies to protect these workers from blood-borne infection have not been established due to a lack of evidence. Moreover, workers in hair and nail salons are mostly employed at small businesses, where national mandatory systems or policies for workers’ health management do not apply. In this study, the mortality risk of viral hepatitis B and C associated with work experience in hair and nail procedures was assessed. Method: We conducted a retrospective review of the job histories and causes of death of female deaths from 2006-2016. 132,744 female deaths who had one or more job experiences during their lifetime were included in this study. Job histories were assessed using the employment insurance database of the Korea Employment Information Service (KEIS), and the causes of death were taken from the death statistics produced by Statistics Korea. The case group (n=666) comprised deaths with a recorded cause of ‘B15-B19’ based on the Korean Standard Classification of Diseases (KCD); deaths from other causes formed the control group (n=132,078). Workers in the beauty service industry were defined as employees who had ever worked in the industry coded as ‘9611’ based on the Korea Standard Industry Classification (KSIC). In addition to job histories, birth year, marital status, and education level were obtained from the death statistics. Multiple logistic regression analysis was used to assess the risk of deaths from viral hepatitis in the case and control groups.
Result: The number of deaths with job experience at a hair or nail salon was 255. After adjusting for the confounders of age, marital status, and education, the odds ratio (OR) for death from viral hepatitis was markedly elevated in the group with work experience in the beauty service industry, at 3.14 (95% confidence interval (CI) 1.00-9.87). Other factors associated with an increased risk of death from viral hepatitis were a low education level (OR=1.34, 95% CI 1.04-1.73) and being married (OR=1.42, 95% CI 1.02-1.97). Conclusion: The risk of death from viral hepatitis was high among workers in the beauty service industry but not statistically significant, which might be attributed to the small number of workers in the industry. It is likely that the number of workers in the beauty service industry was underestimated due to their temporary job positions. Further studies evaluating the status and incidence of viral infection among these workers, with consideration of vertical transmission, are required.
Keywords: beauty service, viral hepatitis, blood-borne infection, viral infection
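The adjusted estimate in the abstract comes from a multiple logistic regression, but the basic unadjusted odds ratio and its Wald confidence interval can be computed directly from a 2x2 table. The counts below are hypothetical, chosen only so the marginal group sizes match those reported; they are not the study's real data:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """OR and Wald 95% CI from a 2x2 table:
    a=exposed cases, b=exposed non-cases, c=unexposed cases, d=unexposed non-cases."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical split: 3 hepatitis deaths among the 255 beauty-industry workers,
# 663 among the remaining 132,489 deaths (totals match the abstract; cells do not).
or_, lo, hi = odds_ratio_ci(3, 252, 663, 131826)
print(round(or_, 2), round(lo, 2), round(hi, 2))
```

A confidence interval that straddles 1.0, as here, is exactly the "elevated but not statistically significant" pattern the conclusion describes.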
Procedia PDF Downloads 139
28515 Long Short-Term Memory Based Model for Modeling Nicotine Consumption Using an Electronic Cigarette and Internet of Things Devices
Authors: Hamdi Amroun, Yacine Benziani, Mehdi Ammi
Abstract:
In this paper, we want to determine whether an accurate prediction of nicotine concentration can be obtained by using a network of smart objects and an e-cigarette. The approach consists of, first, the recognition of factors influencing smoking cessation, such as physical activity and participants’ behaviors (using both a smartphone and a smartwatch), and then the prediction of the configuration of the e-cigarette (in terms of nicotine concentration, power, and resistance). The study uses a network of commonly connected objects: a smartwatch, a smartphone, and an e-cigarette carried by the participants during an uncontrolled experiment. The data obtained from the sensors in the three devices were used to train a long short-term memory (LSTM) network. Results show that our LSTM-based model allows predicting the configuration of the e-cigarette in terms of nicotine concentration, power, and resistance with a root mean square error percentage of 12.9%, 9.15%, and 11.84%, respectively. This study can help to better control nicotine consumption and offer an intelligent e-cigarette configuration to users.
Keywords: IoT, activity recognition, automatic classification, unconstrained environment
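The root mean square error percentage used to report the model's performance can be illustrated as follows; the "actual" and "predicted" nicotine concentrations are invented values, not the study's data:

```python
import math

def rmse_percent(actual, predicted):
    """Root-mean-square error expressed as a percentage of the mean actual value."""
    rmse = math.sqrt(sum((a - p) ** 2 for a, p in zip(actual, predicted)) / len(actual))
    return 100.0 * rmse / (sum(actual) / len(actual))

# Hypothetical nicotine concentrations (mg/mL) vs model predictions
actual = [6.0, 12.0, 6.0, 18.0, 12.0]
predicted = [6.5, 11.0, 7.0, 16.5, 13.0]
print(round(rmse_percent(actual, predicted), 1))
```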
Procedia PDF Downloads 224
28514 A Machine Learning Based Method to Detect System Failure in Resource Constrained Environment
Authors: Payel Datta, Abhishek Das, Abhishek Roychoudhury, Dhiman Chattopadhyay, Tanushyam Chattopadhyay
Abstract:
Machine learning (ML) and deep learning (DL) are predominantly used in image/video processing, natural language processing (NLP), and audio and speech recognition, but much less in system performance evaluation. In this paper, the authors describe the architecture of an abstraction layer constructed using ML/DL to detect system failure. The proposed system detects system failure by evaluating the performance metrics of an IoT service deployment in a constrained infrastructure environment. The system has been tested on a manually annotated data set containing different system metrics, like the number of threads, throughput, average response time, CPU usage, memory usage, and network input/output, captured in different hardware environments, such as edge (Atom-based gateway) and cloud (AWS EC2). The main challenge in developing such a system is that the classification accuracy should be 100%, as an error in the system degrades service performance and consequently affects the reliability and high availability that are mandatory for an IoT system. The proposed ML/DL classifiers work with 100% accuracy on a data set of nearly 4,000 samples captured within the organization.
Keywords: machine learning, system performance, performance metrics, IoT, edge
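The abstract does not name the classifiers used, so the following is only a generic nearest-neighbor sketch over vectors of system metrics (threads, response time in ms, CPU %, memory %); the annotated samples and failure labels are invented, and a real deployment would first normalize the features so no single metric dominates the distance:

```python
import math

def knn_predict(train, query, k=3):
    """k-nearest-neighbor majority vote over metric vectors."""
    dists = sorted((math.dist(x, query), label) for x, label in train)
    votes = [label for _, label in dists[:k]]
    return max(set(votes), key=votes.count)

# Hypothetical annotated samples: (threads, resp_ms, cpu%, mem%) -> label
train = [((8, 120, 35, 40), "ok"), ((9, 130, 40, 42), "ok"), ((7, 110, 30, 38), "ok"),
         ((25, 900, 95, 88), "fail"), ((30, 1200, 97, 92), "fail"), ((28, 1000, 96, 90), "fail")]
print(knn_predict(train, (27, 950, 94, 89)))
```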
Procedia PDF Downloads 195
28513 Computer Aided Discrimination of Benign and Malignant Thyroid Nodules by Ultrasound Imaging
Authors: Akbar Gharbali, Ali Abbasian Ardekani, Afshin Mohammadi
Abstract:
Introduction: Thyroid nodules have an incidence of 33-68% in the general population. More than 5-15% of these nodules are malignant. Early detection and treatment of thyroid nodules increase the cure rate and provide optimal treatment. Among medical imaging methods, ultrasound is the imaging technique of choice for the assessment of thyroid nodules. Confirming the diagnosis usually demands repeated fine-needle aspiration biopsy (FNAB), so current management carries morbidity and non-zero mortality. Objective: To explore the diagnostic potential of automatic texture analysis (TA) methods in differentiating benign and malignant thyroid nodules by ultrasound imaging, in order to enable reliable diagnosis and monitoring of thyroid nodules in their early stages with no need for biopsy. Material and Methods: The thyroid ultrasound image database consists of 70 patients (26 benign and 44 malignant) reported by a radiologist and proven by biopsy. Two slices per patient were loaded in Mazda software version 4.6 for automatic texture analysis. Regions of interest (ROIs) were defined within the abnormal part of the thyroid nodule ultrasound images. Gray levels within an ROI were normalized according to three normalization schemes: N1: default or original gray levels; N2: +/- 3 sigma, or dynamic intensity limited to µ +/- 3σ; and N3: intensity limited to the 1%-99% range. Up to 270 multiscale texture feature parameters per ROI per normalization scheme were computed with the well-known statistical methods employed in Mazda software. From a statistical point of view, not all calculated texture feature parameters are useful for texture analysis. So, the features were reduced to the 10 best and most effective per normalization scheme, based on the maximum Fisher coefficient and the minimum probability of classification error and average correlation coefficients (POE+ACC).
We analyzed these features under two standardization states (standard (S) and non-standard (NS)) with Principal Component Analysis (PCA), Linear Discriminant Analysis (LDA), and Non-Linear Discriminant Analysis (NDA). A 1NN classifier was used to distinguish between benign and malignant tumors. The confusion matrix and receiver operating characteristic (ROC) curve analysis were used to formulate more reliable criteria for the performance of the employed texture analysis methods. Results: The results demonstrated the influence of the normalization schemes and reduction methods on the effectiveness of the obtained features as descriptors of discrimination power and on classification results. The subset of features selected under 1%-99% normalization, POE+ACC reduction, and NDA texture analysis yielded a high discrimination performance, with an area under the ROC curve (Az) of 0.9722 in distinguishing benign from malignant thyroid nodules, corresponding to a sensitivity of 94.45%, specificity of 100%, and accuracy of 97.14%. Conclusions: Our results indicate that computer-aided diagnosis is a reliable method and can provide useful information to help radiologists in the detection and classification of benign and malignant thyroid nodules.
Keywords: ultrasound imaging, thyroid nodules, computer aided diagnosis, texture analysis, PCA, LDA, NDA
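The reported sensitivity, specificity, and accuracy follow directly from a 2x2 confusion matrix. The sketch below uses hypothetical counts chosen only to roughly match the reported figures; it is not the paper's actual confusion matrix:

```python
def diagnostic_metrics(tp, fn, tn, fp):
    """Sensitivity, specificity, and accuracy from 2x2 confusion-matrix counts."""
    sens = tp / (tp + fn)                      # true positive rate
    spec = tn / (tn + fp)                      # true negative rate
    acc = (tp + tn) / (tp + fn + tn + fp)      # overall fraction correct
    return sens, spec, acc

# Hypothetical counts: 44 malignant (42 detected) and 26 benign (all correct)
sens, spec, acc = diagnostic_metrics(tp=42, fn=2, tn=26, fp=0)
print(round(100 * sens, 2), round(100 * spec, 2), round(100 * acc, 2))
```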
Procedia PDF Downloads 280
28512 The Application of Interactivity of Light in New Media Art
Authors: Yansong Chen
Abstract:
In the age of media convergence, new media technology is constantly impacting, changing, and even reshaping the limits of art. From the technological ontology of new media art, the concept of interaction design has long been dominated by I/O (Input/Output) systems, which ignores the content of those systems and kills the aura of art. Light, as a fusion medium, basically comes from the extension of human feelings and can be the content of the input or the effect of the output. In this paper, firstly, on the basis of a literature review, the interaction characteristics of light were studied. Secondly, starting from the discourse patterns of people and machines, people and people, and people and imagined things, we propose three light modes: object-oriented interaction, immersion interaction, and tele-presence interaction. Finally, this paper explains how to regain the aura of art through light elements in new media art and how to understand multiple levels of ‘interaction design’. In addition, new media art, especially light-based interaction art, enriches language patterns and motivates emerging art forms to become more widespread and popular, which supports its aesthetic growth.
Keywords: new media art, interaction design, light art, immersion
Procedia PDF Downloads 236
28511 Multi-Scale Urban Spatial Evolution Analysis Based on Space Syntax: A Case Study in Modern Yangzhou, China
Authors: Dai Zhimei, Hua Chen
Abstract:
The exploration of urban spatial evolution is an important part of urban development research. Therefore, the evolving urban spatial texture of modern Yangzhou was taken as the research object, and space syntax was used as the main research tool; this paper explored the law of Yangzhou's spatial evolution and its driving factors at the urban street network scale, the district scale, and the street scale. The study concluded that, at the urban scale, Yangzhou's spatial evolution is the result of a variety of causes, including physical and geographical conditions, policy and planning factors, and traffic conditions, and that the evolution of space in turn has an impact on social, economic, environmental, and cultural factors. At the district and street scales, changes in space have a profound influence on the history of the city and the activities of people. At the end of the article, the matters needing attention during the evolution of urban space are summarized.
Keywords: block, space syntax and methodology, street, urban space, Yangzhou
Procedia PDF Downloads 181
28510 Classification of Foliar Nitrogen in Common Bean (Phaseolus Vulgaris L.) Using Deep Learning Models and Images
Authors: Marcos Silva Tavares, Jamile Raquel Regazzo, Edson José de Souza Sardinha, Murilo Mesquita Baesso
Abstract:
Common beans are a widely cultivated and consumed legume globally, serving as a staple food for humans, especially in developing countries, due to their nutritional characteristics. Nitrogen (N) is the most limiting nutrient for productivity, and foliar analysis is crucial to ensure balanced nitrogen fertilization. Excessive N applications can cause, either in isolation or cumulatively, soil and water contamination and plant toxicity, and increase plants' susceptibility to diseases and pests. However, the quantification of N using conventional methods is time-consuming and costly, demanding new technologies to optimize the adequate supply of N to plants. Thus, it becomes necessary to establish constant monitoring of the foliar content of this macronutrient, mainly at the V4 stage, aiming at precision management of nitrogen fertilization. In this work, the objective was to evaluate the performance of a deep learning model, ResNet-50, in the classification of foliar nitrogen in common beans using RGB images. The BRS Estilo cultivar was sown in a greenhouse in a completely randomized design with four nitrogen doses (T1 = 0 kg N ha-1, T2 = 25 kg N ha-1, T3 = 75 kg N ha-1, and T4 = 100 kg N ha-1) and 12 replications. Pots with 5 L capacity were used, with a substrate composed of 43% soil (Neossolo Quartzarênico), 28.5% crushed sugarcane bagasse, and 28.5% cured bovine manure. The plants were supplied with 5 mm of water per day. The application of urea (45% N) and the acquisition of images occurred 14 and 32 days after sowing, respectively. A code developed in Matlab© R2022b was used to cut the original images into smaller blocks, yielding an image bank composed of 4 folders representing the four classes, labeled T1, T2, T3, and T4, each containing 500 images of 224x224 pixels obtained from plants cultivated under the different N doses. The Matlab© R2022b software was used for the implementation and performance analysis of the model.
Efficiency was evaluated using a set of metrics, including accuracy (AC), F1-score (F1), specificity (SP), area under the curve (AUC), and precision (P). The ResNet-50 showed high performance in the classification of foliar N levels in common beans, with an AC value of 85.6%. The F1 for classes T1, T2, T3, and T4 was 76, 72, 74, and 77%, respectively. This study revealed that the use of RGB images combined with deep learning can be a promising alternative to slow laboratory analyses, capable of optimizing the estimation of foliar N. This can allow rapid intervention by the producer to achieve higher productivity and less fertilizer waste. Future work is encouraged to develop mobile devices capable of handling images using deep learning for the classification of the nutritional status of plants in situ.
Keywords: convolutional neural network, residual network 50, nutritional status, artificial intelligence
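The per-class F1 scores can be recovered from a multi-class confusion matrix as follows; the 4x4 matrix below is invented for illustration and does not reproduce the paper's results:

```python
def f1_per_class(confusion):
    """Per-class F1 from a square confusion matrix (rows = true, cols = predicted)."""
    n = len(confusion)
    scores = []
    for i in range(n):
        tp = confusion[i][i]
        fp = sum(confusion[r][i] for r in range(n)) - tp  # column sum minus diagonal
        fn = sum(confusion[i][c] for c in range(n)) - tp  # row sum minus diagonal
        p = tp / (tp + fp) if tp + fp else 0.0
        r = tp / (tp + fn) if tp + fn else 0.0
        scores.append(2 * p * r / (p + r) if p + r else 0.0)
    return scores

# Hypothetical 4-class (T1-T4) confusion matrix on a held-out set of 400 images
cm = [[76, 14,  6,  4],
      [14, 72,  8,  6],
      [ 6,  8, 74, 12],
      [ 4,  6, 12, 78]]
print([round(100 * s) for s in f1_per_class(cm)])
```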
Procedia PDF Downloads 19
28509 Conformance to Spatial Planning between the Kampala Physical Development Plan of 2012 and the Existing Land Use in 2021
Authors: Brendah Nagula, Omolo Fredrick Okalebo, Ronald Ssengendo, Ivan Bamweyana
Abstract:
The Kampala Physical Development Plan (KPDP) was developed in 2012 and projected both long-term and short-term developments within the City. The purpose of the plan was not only to shape the city into a spatially planned area but also to control the urban sprawl that had expanded with pronounced instances of informal settlements. The plan was approved by the National Physical Planning Board and signed by the Minister in 2013. Although the KPDP has been implemented using different approaches, such as detailed planning, development control, subdivision planning, construction inspections, and greening and beautification, there is still limited knowledge of the level of conformance to the plan. Therefore, it is yet to be determined whether the plan has been effective in shaping the City into an ideal spatially planned area. To attain a clear picture of the level of conformance to the KPDP 2012, an evaluation between the planned and the existing land use in Kampala City was performed. Methods such as supervised classification and post-classification change detection were adopted for this evaluation. Scrutiny of the findings revealed that Central Division registered the lowest level of conformance to the planning standards specified in the KPDP 2012, followed by Nakawa, Rubaga, Kawempe, and Makindye. Furthermore, mixed-use development was identified as the land use with the highest level of non-conformity, at 25.11%, while institutional land use registered the highest level of conformance, at 84.45%. The results show that the aspect of location was not carefully considered while allocating uses in the KPDP, whereby areas located near the Central Business District have higher land rents and hence require uses that ensure profit maximization. Also, the prominence of development towards mixed use denotes an increased demand for land for compact development that was not catered for in the plan.
Therefore, in order to transform Kampala City into a spatially planned area, there is a need to carefully develop detailed plans, especially for all the Central Division planning precincts, indicating considerations for land use densification.
Keywords: spatial plan, post classification change detection, Kampala city, land use
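The conformance evaluation described above reduces to a per-class cross-tabulation between a planned land-use raster and a classified existing land-use raster. A minimal sketch, with invented class labels and pixel values:

```python
from collections import Counter

def conformance_by_class(planned, existing):
    """Percent of cells whose existing land use matches the planned use, per planned class."""
    totals, matches = Counter(), Counter()
    for p, e in zip(planned, existing):
        totals[p] += 1
        if p == e:
            matches[p] += 1
    return {cls: 100.0 * matches[cls] / totals[cls] for cls in totals}

# Hypothetical per-pixel labels from a plan raster and a classified 2021 image
planned  = ["res", "res", "res", "mixed", "mixed", "inst", "inst", "inst", "inst", "inst"]
existing = ["res", "res", "com", "mixed", "com",   "inst", "inst", "inst", "inst", "com"]
result = conformance_by_class(planned, existing)
print(result)
```

Non-conformity per class is simply 100 minus the value returned here; aggregating the same tabulation by administrative division gives the division-level ranking reported in the abstract.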
Procedia PDF Downloads 92
28508 International Tourists’ Travel Motivation by Push-Pull Factors and Decision Making for Selecting Thailand as Destination Choice
Authors: Siripen Yiamjanya, Kevin Wongleedee
Abstract:
This research paper aims to identify the travel motivations, by push and pull factors, that affected the decision making of international tourists in selecting Thailand as their destination choice. A total of 200 international tourists who traveled to Thailand during January and February 2014 were used as the sample in this study. A questionnaire, administered in Bangkok, was employed as the data collection tool. The list consisted of 30 attributes representing both psychological factors, as “push-based factors”, and destination factors, as “pull-based factors”. Mean and standard deviation were used to find the top ten travel motives that were important determinants in the respondents’ decision to select Thailand as their destination choice. The findings revealed that the top ten travel motivations influencing international tourists to select Thailand as their destination choice were: [i] getting experience in a foreign land; [ii] Thai food; [iii] learning a new culture; [iv] relaxing in a foreign land; [v] wanting to learn new things; [vi] being interested in Thai culture and traditional markets; [vii] escaping from the same daily life; [viii] enjoying activities; [ix] adventure; and [x] good weather. Classification into push-based and pull-based motives suggested that getting experience in a foreign land was the most important push motive for international tourists to travel, while Thai food was the most significant pull motive. Discussion and suggestions are also made for the tourism industry of Thailand.
Keywords: decision making, destination choice, international tourist, pull factor, push factor, Thailand, travel motivation
Procedia PDF Downloads 393
28507 Electroencephalography-Based Intention Recognition and Consensus Assessment during Emergency Response
Abstract:
After natural and man-made disasters, robots can bypass danger, expedite the search, and acquire unprecedented situational awareness for designing rescue plans. The hands-free requirement of first responders excludes the use of tedious manual control and operation. In unknown, unstructured, and obstructed environments, natural-language-based supervision is not easy for first responders to formulate and is difficult for robots to understand. A brain-computer interface is a promising option to overcome these limitations. This study aims to test the feasibility of using electroencephalography (EEG) signals to decode human intentions and detect the level of consensus on robot-provided information. EEG signals were classified using machine-learning and deep-learning methods to discriminate search intentions and agreement perceptions. The results show that the average classification accuracy for intention recognition and consensus assessment is 67% and 72%, respectively, proving the potential of incorporating recognizable users’ bioelectrical responses into advanced robot-assisted systems for emergency response.
Keywords: consensus assessment, electroencephalogram, emergency response, human-robot collaboration, intention recognition, search and rescue
Procedia PDF Downloads 93
28506 Intuitional Insight in Islamic Mysticism
Authors: Maryam Bakhtyar, Pegah Akrami
Abstract:
Intuitional insight, or mystical cognition, is an insight different from common, concrete, and intellectual insights. This kind of insight is achieved not by visionary contemplation but by the recitation of God, self-purification, and mystical life. In this insight, there is no distance or medium between the subject of cognition and its object; they have a sort of unification, unison, and incorporation. As a result, scholars consider this insight direct, immediate, and personal. The objects of this insight are God, the creatures of the cosmos, and the general inner and hidden aspect of the world, which is nothing but God’s manifestations in the view of mystics. As our common cognitions have diversity and stages, intuitional insight also has diversity and levels. As our senses are divided into the concrete and the rational, mystical discovery is divided into superficial discovery and spiritual discovery. According to Islamic mystics, the preferable way to know God and believe in Him is intuitional insight. There are two important criteria for evaluating mystical intuition, especially for beginner mystics: intellect and revelation. Indeed, the conclusion and a brief evaluation of the Islamic mystics’ viewpoint is the main subject of this paper.
Keywords: intuition, discovery, mystical insight, personal knowledge, superficial discovery, spiritual discovery
Procedia PDF Downloads 94
28505 Brexit and Financial Stability: An Agent-Based Simulation
Authors: Aristeidis Samitas, Stathis Polyzos
Abstract:
As the UK and the EU prepare to start negotiations for Brexit, it is important for both sides to comprehend the full extent of the consequences of this process. In this paper, we employ an object-oriented simulation framework in order to test for the short-term and long-term effects of Brexit on both sides of the Channel. The relative strength of the UK economy and its banking sector vis-à-vis the EU is taken into consideration. Our results confirm predictions in the relevant literature regarding the output cost of Brexit, with particular emphasis on the EU. Furthermore, we show that financial stability is also an important issue on both sides, with the banking system suffering significant losses, particularly over the longer term. Our findings suggest that policymakers should be extremely careful in handling Brexit negotiations, making sure to consider dynamic effects that may be caused by UK bank assets moving to the EU after Brexit. The model results show that, as the UK banking system loses its assets, the end state of the UK economy deteriorates while the end state of the EU economy improves.
Keywords: banking crises, Brexit, financial stability, VBanking
Procedia PDF Downloads 280
28504 Weight Estimation Using the K-Means Method in Steelmaking’s Overhead Cranes in Order to Reduce Swing Error
Authors: Seyedamir Makinejadsanij
Abstract:
One of the most important factors in the production of quality steel is knowing the exact weight of steel in the steelmaking area. In this study, a calculation method is presented to estimate the exact weight of the melt as well as of the objects transported by the overhead crane. Iran Alloy Steel Company's steelmaking area has three 90-ton cranes, which are responsible for transferring the ladles and ladle caps between 34 areas in the melt shop. Each crane is equipped with a Disomat Tersus weighing system that calculates and displays real-time weight. A moving object has a variable measured weight due to swinging, and the weighing system has an error of about ±5%. This means that when an object weighing about 80 tons is moved by a crane, the device (Disomat Tersus system) reads about 4 tons more or 4 tons less, and this is the biggest problem in calculating the real weight. The k-means algorithm, an unsupervised clustering method, was used here. The best result was obtained with 3 centers: compared to the simple average (one center) or two, four, five, and six centers, 3 centers give the best answer, which is logically due to the elimination of noise above and below the real weight. Every day, a standard weight is moved by the working cranes to test and calibrate them. The results show that the accuracy is about 40 kilograms per 60 tons (standard weight). As a result, with this method, the accuracy of the moving weight is calculated as 99.95%. K-means is used to calculate the exact mean of the objects. The stopping criterion of the algorithm is either 1000 repetitions or no points moving between the clusters. As a result of the implementation of this system, the crane operator does not stop while moving objects and continues his activity regardless of weight calculations. Also, production speed increased, and human error decreased.
Keywords: k-means, overhead crane, melt weight, weight estimation, swing problem
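A minimal one-dimensional k-means with three centers, as described above, can denoise swinging weight readings by isolating the middle cluster; the readings below are invented for an approximately 80-ton load, and the stopping criterion mirrors the one stated in the abstract (a fixed iteration budget or no points changing cluster):

```python
def kmeans_1d(values, k=3, iters=1000):
    """Plain 1-D k-means; stops after `iters` rounds or when no point changes cluster."""
    centers = sorted(values)[:: max(1, len(values) // k)][:k]  # spread initial centers
    assign = [0] * len(values)
    for _ in range(iters):
        new_assign = [min(range(k), key=lambda i: abs(v - centers[i])) for v in values]
        if new_assign == assign:
            break  # no points moved between clusters
        assign = new_assign
        for i in range(k):
            members = [v for v, a in zip(values, assign) if a == i]
            if members:
                centers[i] = sum(members) / len(members)
    return centers, assign

# Invented swinging readings (tons) around a true load of about 80 t
readings = [76.2, 79.9, 84.1, 80.1, 80.0, 75.8, 83.9, 80.05, 79.95]
centers, _ = kmeans_1d(readings, k=3)
estimate = sorted(centers)[1]  # middle cluster: swing noise lands above and below it
print(round(estimate, 2))
```

The low and high centers absorb the swing-induced under- and over-readings, so the middle center tracks the true load.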
Procedia PDF Downloads 90
28503 High-Resolution ECG Automated Analysis and Diagnosis
Authors: Ayad Dalloo, Sulaf Dalloo
Abstract:
Electrocardiogram (ECG) recordings are prone, on analysis by physicians, to ambiguity due to noise and artifacts, leading to possible errors of diagnosis. Such drawbacks may be overcome with the advent of high-resolution methods, such as discrete wavelet analysis and digital signal processing (DSP) techniques. This ECG signal analysis is implemented in three stages: ECG preprocessing, feature extraction, and classification, with the aim of realizing high-resolution ECG diagnosis and improved detection of abnormal heart conditions. The preprocessing stage involves removing spurious artifacts (noise) due to such factors as muscle contraction, motion, and respiration. ECG features are extracted by applying DSP and the suggested sloping-method techniques. These measured features represent the peak amplitude values and intervals of the P, Q, R, S, R’, and T waves on the ECG, and other features such as ST elevation, QRS width, heart rate, electrical axis, and QR and QT intervals. The classification is performed using these extracted features and the criteria for cardiovascular diseases. The ECG diagnostic system was successfully applied to 12-lead ECG recordings for 12 cases. The system is provided with information enabling it to diagnose 15 different diseases. The physician’s and the computer’s diagnoses were compared, with 90% agreement with respect to the physician's diagnosis, and the time taken for diagnosis is 2 seconds. All of these operations are programmed in the Matlab environment.
Keywords: ECG diagnostic system, QRS detection, ECG baseline removal, cardiovascular diseases
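As an illustration of the QRS-detection and heart-rate steps, the following is a naive threshold-based R-peak detector on a synthetic signal; it is not the paper's sloping method, and the threshold, sampling rate, and waveform are assumptions made for the example:

```python
def detect_r_peaks(signal, threshold):
    """Indices of local maxima above threshold, assumed to be R peaks."""
    return [i for i in range(1, len(signal) - 1)
            if signal[i] > threshold
            and signal[i] >= signal[i - 1] and signal[i] > signal[i + 1]]

def heart_rate_bpm(peaks, fs):
    """Heart rate from the mean RR interval, given peak indices and sampling rate fs."""
    rr = [(b - a) / fs for a, b in zip(peaks, peaks[1:])]  # RR intervals in seconds
    return 60.0 / (sum(rr) / len(rr))

# Synthetic ECG-like trace: flat baseline with a spike every 0.8 s at fs = 100 Hz
fs = 100
sig = [0.1] * 400
for i in range(40, 400, 80):
    sig[i] = 1.0  # R peak
peaks = detect_r_peaks(sig, threshold=0.5)
print(round(heart_rate_bpm(peaks, fs)))
```

A real detector would of course run after the baseline-removal and denoising stages described above; this sketch only shows how RR intervals turn into a heart-rate feature.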
Procedia PDF Downloads 297
28502 Identification of Vulnerable Zone Due to Cyclone-Induced Storm Surge in the Exposed Coast of Bangladesh
Authors: Mohiuddin Sakib, Fatin Nihal, Rabeya Akter, Anisul Haque, Munsur Rahman, Wasif-E-Elahi
Abstract:
Surge-generating cyclones are among the deadliest natural disasters threatening coastal environments and communities worldwide. Due to its geographic location, low-lying alluvial plain, geomorphologic characteristics, and 710 kilometers of exposed coastline, Bangladesh is considered one of the countries most vulnerable to storm surge flooding. The Bay of Bengal possesses the highest potential for creating storm surge inundation in coastal areas. Bangladesh is the country most exposed to tropical cyclones, with an average of four cyclones striking every year. Frequent cyclone landfall has made the country one of the worst sufferers in the world from cyclone-induced storm surge flooding and casualties. From 1797 to 2009, Bangladesh was hit by 63 severe cyclones of different magnitudes. Though detailed studies have been done focusing on specific cyclones like Sidr or Aila, no study has identified the vulnerable areas of the exposed coast based on the strength of cyclones. This study classifies the vulnerable areas of the exposed coast based on storm surge inundation depth and area due to cyclones of varying strengths. Classification of the exposed coast based on hazard-induced cyclonic vulnerability will help decision makers to adopt appropriate policies for reducing damage and loss.
Keywords: cyclone, landfall, storm surge, exposed coastline, vulnerability
Procedia PDF Downloads 399
28501 Sustainable Urban Regeneration: The New Vocabulary and the Timeless Grammar of the Urban Tissue
Authors: Ruth Shapira
Abstract:
Introduction: The rapid urbanization of the last century confronts planners, regulatory bodies, developers, and most of all the public with seemingly unsolved conflicts regarding the values, capital, and wellbeing of the built and un-built urban space. There has been an uncontrolled change in the scale of the urban form and the rhythm of urban life, which has seen no significant progress in the last 2-3 decades despite the growing urban population. It is the objective of this paper to analyze some of these fundamental issues through the case study of a relatively small town in the center of Israel (Kiryat Ono, 36,000 inhabitants), unfold the deep structure of qualities versus disruptors, present some of the cures that we have developed to bridge over them, and humbly suggest a practice that may bring about a sustainable new urban environment based on timeless values of the past, an approach that can be generic for similar cases. Basic Methodologies: The object, the town of Kiryat Ono, shall be experimented upon in a series of four action processes: de-composition, re-composition, the centering process and, finally, controlled structural disintegration. Each stage will be based on facts, analysis of previous multidisciplinary interventions on various layers, and the inevitable reaction of the object, leading to conclusions based on innovative theoretical and practical methods that we have developed and that we believe are proper for the open-ended network, setting the rules for contemporary urban society to cluster by, and thus a new urban vocabulary based on the old structure of times passed. The Study: Kiryat Ono was founded 70 years ago as an agricultural settlement and rapidly turned into an urban entity. In spite of the massive intensification, the original DNA of the old small town was still deeply embedded, mostly in the quality of the public space and in the sense of clustered communities.
In the past 20 years, the growing demand for housing has been addressed on the national level with recent master plans and urban regeneration policies, mostly encouraging individual economic initiatives. Unfortunately, due to the obsolete existing planning platform, the present urban renewal is characterized by developer pressure, a dramatic change in building scale and widespread disintegration of the existing urban and social tissue. Our office was commissioned to conceptualize two master plans for the two contradictory processes of Kiryat Ono's future: intensification and conservation. Following a comprehensive investigation into the deep structures and qualities of the existing town, we developed a new vocabulary of conservation terms, thus redefining the sense of PLACE. The main challenge was to create master plans that offer a regulatory basis for the accelerated and sporadic development while providing for the public good and preserving the characteristics of the place, consisting of a toolbox of design guidelines with the ability to reorganize space along the time axis in a sustainable way. In conclusion: the system of rules that we have developed can generate endless possible patterns, making sure that at each implementation fragment an event is created and a better place is revealed. It takes time and perseverance, but it seems to be the way to provide a healthy and sustainable framework for the accelerated urbanization of our chaotic present. Keywords: sustainable urban design, intensification, emergent urban patterns, sustainable housing, compact urban neighborhoods, sustainable regeneration, restoration, complexity, uncertainty, need for change, implications of legislation on local planning
Procedia PDF Downloads 38828500 The Use of Classifiers in Image Analysis of Oil Wells Profiling Process and the Automatic Identification of Events
Authors: Jaqueline Maria Ribeiro Vieira
Abstract:
Different strategies and tools are available in the oil and gas industry for detecting and analyzing tension and possible fractures in borehole walls. Most of these techniques are based on manual observation of the captured borehole images. While this strategy may be possible and convenient with small images and few data, it becomes difficult and error-prone when large databases of images must be treated. Moreover, the patterns may differ across the image area, depending on many characteristics (drilling strategy, rock components, rock strength, etc.). Previously, we developed and proposed a novel strategy capable of detecting patterns in borehole images that may point to regions with tension and breakout characteristics, based on segmented images. In this work, we propose the inclusion of data-mining classification strategies in order to create a knowledge database of the segmented curves. These classifiers allow the system, after some time of use in which parts of borehole images corresponding to tension regions and breakout areas are manually marked, to automatically indicate and suggest new candidate regions with higher accuracy. We suggest the use of different classification methods in order to achieve different knowledge data set configurations. Keywords: image segmentation, oil well visualization, classifiers, data-mining, visual computing
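As a concrete illustration of the idea, a data-mining classifier of this kind can be sketched with scikit-learn; the features, data, and model choice below are assumptions made for illustration, not the authors' actual pipeline:

```python
# Hypothetical sketch: train a classifier on features extracted from
# segmented borehole-image curves, then rank new regions by the
# predicted probability of being a tension/breakout area.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Each row stands in for the features of one segmented curve (e.g. mean
# curvature, curve length, orientation -- invented here); labels come from
# regions an analyst has manually marked as tension/breakout (1) or not (0).
X_train = rng.normal(size=(200, 3))
y_train = (X_train[:, 0] + 0.5 * X_train[:, 1] > 0).astype(int)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)

# Suggest candidates: rank unlabeled segments by predicted probability,
# most likely breakout regions first.
X_new = rng.normal(size=(10, 3))
scores = clf.predict_proba(X_new)[:, 1]
candidates = np.argsort(scores)[::-1]
```

As more regions are manually confirmed and added to the training set, the suggestions should improve, which is the incremental knowledge-database behavior the abstract describes.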
Procedia PDF Downloads 30328499 Film Studies: Definition, Current Status, and Future Perspectives for Cuba
Authors: Carlos Guillermo Lloga Sanz, Maria del Carmen Tamayo Asef
Abstract:
As an object of study in Cuban universities, cinema is still in its infancy. This is relevant considering the significance of cinema within the local political culture and its impact on countries of the region. Discussions about the medium have been carried out mainly in the field of film criticism. The objective of this article is to reflect on the divergences between film studies and film criticism, taking into account formal and theoretical features, and to explore the significance of this debate for the intellectual ambiance of the island. Methodologically, the study relies on theoretical elaborations based on a literature review and non-structured interviews with Cuban film critics and scholars. The study finds that the gradation proposed by the Anglo-Saxon tradition, where film studies are considered a "higher stage" compared to criticism and cinephilia, does not apply to the Cuban space. Instead, to assess the state of reflection on cinema in Cuba, it is essential to consider it a starry node traversed by epistemic, institutional, and geopolitical matrices. Keywords: film studies, film criticism, Cuban cinema, Cuban film studies
Procedia PDF Downloads 10228498 A Collaborative Platform for Multilingual Ontology Development
Authors: Ahmed Tawfik, Fausto Giunchiglia, Vincenzo Maltese
Abstract:
Ontologies provide a common understanding of a specific domain of interest that can be communicated between people and used as background knowledge for automated reasoning in a wide range of applications. In this paper, we address the design of multilingual ontologies following well-defined knowledge engineering methodologies with the support of novel collaborative development approaches. In particular, we present a collaborative platform which allows ontologies to be developed incrementally in multiple languages. This is made possible via an appropriate mapping between language-independent concepts and one lexicalization per language (or a lexical gap in case such a lexicalization does not exist). The collaborative platform has been designed to support the development of the Universal Knowledge Core, a multilingual ontology currently in English, Italian, Chinese, Mongolian, Hindi, and Bengali. Its design follows a workflow-based development methodology that models resources as a set of collaborative objects and assigns customizable workflows to build and maintain each collaborative object in a community-driven manner, with extensive support for modern web 2.0 social and collaborative features. Keywords: knowledge diversity, knowledge representation, ontology, development
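The concept-to-lexicalization mapping can be sketched as a small data structure; the class and method names below are hypothetical illustrations, not the platform's actual API:

```python
# Hypothetical sketch: a language-independent concept carries at most one
# lexicalization per language; None records a lexical gap, i.e. the
# concept exists but the language has no word for it.
from dataclasses import dataclass, field

@dataclass
class Concept:
    concept_id: int
    lexicalizations: dict = field(default_factory=dict)  # language -> word or None

    def lexicalize(self, lang, word):
        # word=None explicitly marks a lexical gap for that language.
        self.lexicalizations[lang] = word

    def render(self, lang):
        # Fall back to a gap placeholder when no lexicalization exists.
        word = self.lexicalizations.get(lang)
        return word if word is not None else f"<gap:{self.concept_id}>"

river = Concept(1)
river.lexicalize("en", "river")
river.lexicalize("it", "fiume")
river.lexicalize("hi", None)  # hypothetical gap, for illustration only
```

Keeping the concept identifier language-independent is what lets contributors in different languages work on the same ontology incrementally, as the abstract describes.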
Procedia PDF Downloads 39228497 Text2Time: Transformer-Based Article Time Period Prediction
Authors: Karthick Prasad Gunasekaran, B. Chase Babrich, Saurabh Shirodkar, Hee Hwang
Abstract:
Construction preparation is crucial for the success of a construction project. By involving project participants early in the construction phase, project managers can plan ahead and resolve issues early, resulting in project success and satisfaction. This study uses quantitative data from construction management projects to determine the relationship between the pre-construction phase, the construction schedule, and customer satisfaction. This study examined a total of 65 construction projects and 93 associated clients to (a) identify the relationship between the pre-construction phase and schedule reduction and (b) identify the relationship between the pre-construction phase and customer retention. Based on a quantitative analysis, this study found a negative correlation between pre-construction status and project schedule in the 65 construction projects. This finding means that the more preparatory work done on a particular project, the shorter the total construction time. The Net Promoter Score of the 93 clients from the 65 projects was then used to determine the relationship between construction preparation and client satisfaction. The pre-construction status and the projects were further analyzed, and a positive correlation between them was found. This shows that customers are happier with projects with a higher ready-to-build ratio than with projects with a lower one. Keywords: NLP, BERT, LLM, deep learning, classification
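The two correlations described above can be sketched on synthetic data; the study's project records are not public, so every figure below is invented purely to show the shape of the analysis:

```python
# Hedged sketch of the correlation analysis: 65 projects, a pre-construction
# readiness ratio per project, and two outcomes (schedule and NPS).
import numpy as np

rng = np.random.default_rng(1)
n = 65
prep_ratio = rng.uniform(0.2, 0.9, size=n)  # share of preparatory work done

# Assumed relationships, mirroring the abstract's findings:
# more preparation -> shorter schedule (negative correlation),
# more preparation -> higher Net Promoter Score (positive correlation).
schedule_days = 400 - 150 * prep_ratio + rng.normal(0, 20, size=n)
nps = -30 + 100 * prep_ratio + rng.normal(0, 10, size=n)

r_schedule = np.corrcoef(prep_ratio, schedule_days)[0, 1]
r_nps = np.corrcoef(prep_ratio, nps)[0, 1]
```

On real data, one would of course also report significance and effect size rather than the correlation coefficient alone.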
Procedia PDF Downloads 10428496 The Classification of Parkinson Tremor and Essential Tremor Based on Frequency Alteration of Different Activities
Authors: Chusak Thanawattano, Roongroj Bhidayasiri
Abstract:
This paper proposes a novel feature set for classifying Parkinson's disease (PD) tremor and essential tremor (ET). Ten ET and ten PD subjects were asked to perform kinetic, postural and resting tests. The empirical mode decomposition (EMD) is used to decompose the collected tremor signal into a set of intrinsic mode functions (IMFs). The IMFs are used for reconstructing representative signals. The feature set is composed of the peak frequencies of the IMFs and of the reconstructed signals. Hypothesizing that the dominant frequency components of subjects with PD and ET change in different directions for different tests, the differences between the peak frequencies of the IMFs and reconstructed signals of pairwise tests (kinetic-resting, kinetic-postural and postural-resting) are considered as potential features. Sets of features are used to train and test classifiers, including the quadratic discriminant classifier (QDC) and the support vector machine (SVM). The best accuracy, the best sensitivity and the best specificity are 90%, 87.5%, and 92.86%, respectively. Keywords: tremor, Parkinson, essential tremor, empirical mode decomposition, quadratic discriminant, support vector machine, peak frequency, auto-regressive, spectrum estimation
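A minimal sketch of the feature idea follows, with the EMD step omitted for brevity (it would supply the reconstructed signals); the sampling rate, tremor frequencies, and classifier settings are illustrative assumptions, not the paper's values:

```python
# Sketch: pairwise peak-frequency shift (kinetic vs. resting) as a feature
# for separating two tremor groups with an SVM.
import numpy as np
from sklearn.svm import SVC

fs = 100.0  # sampling rate in Hz (assumed)
t = np.arange(0, 10, 1.0 / fs)
rng = np.random.default_rng(0)

def peak_frequency(signal, fs):
    """Dominant frequency from the magnitude spectrum."""
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    spectrum[0] = 0.0  # ignore the DC component
    return freqs[np.argmax(spectrum)]

def subject(f_kinetic, f_rest):
    """Feature: shift of the dominant frequency between two tests."""
    kinetic = np.sin(2 * np.pi * f_kinetic * t) + 0.1 * rng.normal(size=t.size)
    rest = np.sin(2 * np.pi * f_rest * t) + 0.1 * rng.normal(size=t.size)
    return [peak_frequency(kinetic, fs) - peak_frequency(rest, fs)]

# Toy cohort with invented frequencies: group 1 barely shifts between
# tests, group 0 shifts markedly (illustration only, not clinical values).
X = [subject(5.2, 4.8) for _ in range(10)] + [subject(7.5, 6.0) for _ in range(10)]
y = [1] * 10 + [0] * 10

clf = SVC(kernel="linear").fit(X, y)
```

In the paper the same peak-frequency differences would be computed per IMF as well, giving a richer feature vector than this single-feature toy.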
Procedia PDF Downloads 44328495 Deep Learning Based on Image Decomposition for Restoration of Intrinsic Representation
Authors: Hyohun Kim, Dongwha Shin, Yeonseok Kim, Ji-Su Ahn, Kensuke Nakamura, Dongeun Choi, Byung-Woo Hong
Abstract:
Artefacts are commonly encountered in the imaging process of clinical computed tomography (CT), where an artefact refers to any systematic discrepancy between the reconstructed observation and the true attenuation coefficient of the object. It is known that CT images are inherently more prone to artefacts due to the image formation process, in which a large number of independent detectors are involved and assumed to yield consistent measurements. There are a number of different artefact types, including noise, beam hardening, scatter, pseudo-enhancement, motion, helical, ring, and metal artefacts, which cause serious difficulties in reading images. Thus, it is desirable to remove nuisance factors from the degraded image, leaving the fundamental intrinsic information that can provide a better interpretation of the anatomical and pathological characteristics. However, this is considered a difficult task due to the high dimensionality and variability of the data to be recovered, which naturally motivates the use of machine learning techniques. We propose an image restoration algorithm based on the deep neural network framework, in which denoising auto-encoders are stacked to build multiple layers. The denoising auto-encoder is a variant of the classical auto-encoder that takes input data and maps it to a hidden representation through a deterministic mapping using a non-linear activation function. The latent representation is then mapped back into a reconstruction of the same size as the input data. The reconstruction error can be measured by the traditional squared error, assuming the residual follows a normal distribution. In addition to the designed loss function, an effective regularization scheme is applied, using residual-driven dropout determined based on the gradient at each layer. The optimal weights are computed by the classical stochastic gradient descent algorithm combined with the back-propagation algorithm. 
In our algorithm, we initially decompose an input image into its intrinsic representation and the nuisance factors, including artefacts, based on the classical Total Variation problem, which can be efficiently optimized by a convex optimization algorithm such as the primal-dual method. The intrinsic forms of the input images are provided to the deep denoising auto-encoders together with their original forms in the training phase. In the testing phase, a given image is first decomposed into its intrinsic form and then provided to the trained network to obtain its reconstruction. We apply our algorithm to the restoration of CT images corrupted by artefacts. It is shown that our algorithm improves the readability and enhances the anatomical and pathological properties of the object. The quantitative evaluation is performed in terms of the PSNR, and the qualitative evaluation shows significant improvement in reading images despite degrading artefacts. The experimental results indicate the potential of our algorithm as a prior solution to image interpretation tasks in a variety of medical imaging applications. This work was supported by the MISP (Ministry of Science and ICT), Korea, under the National Program for Excellence in SW (20170001000011001) supervised by the IITP (Institute for Information and Communications Technology Promotion). Keywords: auto-encoder neural network, CT image artefact, deep learning, intrinsic image representation, noise reduction, total variation
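The decomposition step can be illustrated with a simple Total Variation split. The authors use a primal-dual solver; the plain gradient iteration and all parameter values below are assumptions made to keep the sketch short, not their implementation:

```python
# Sketch: split an image into a piecewise-smooth intrinsic part and a
# residual (noise/artefacts) by gradient descent on a smoothed TV energy:
#   minimize 0.5*||u - image||^2 + lam * TV_eps(u)
import numpy as np

def tv_decompose(image, lam=0.1, step=0.1, iters=200, eps=1e-6):
    u = image.copy()
    for _ in range(iters):
        # Forward differences with replicated (Neumann) boundary.
        gx = np.diff(u, axis=1, append=u[:, -1:])
        gy = np.diff(u, axis=0, append=u[-1:, :])
        norm = np.sqrt(gx**2 + gy**2 + eps)
        # Divergence of the normalized gradient (backward differences).
        div = (np.diff(gx / norm, axis=1, prepend=0)
               + np.diff(gy / norm, axis=0, prepend=0))
        u = u - step * ((u - image) - lam * div)
    return u, image - u  # intrinsic part, residual

rng = np.random.default_rng(0)
clean = np.zeros((32, 32))
clean[8:24, 8:24] = 1.0  # piecewise-constant "object"
noisy = clean + 0.2 * rng.normal(size=clean.shape)
intrinsic, residual = tv_decompose(noisy)
```

The intrinsic part keeps edges while flattening small-scale fluctuations, which is exactly what makes it a useful input representation for the denoising auto-encoder stage.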
Procedia PDF Downloads 19028494 The Effectiveness of Intervention Methods for Repetitive Behaviors in Preschool Children with Autism Spectrum Disorder: A Systematic Review
Authors: Akane Uda, Ami Tabata, Mi An, Misa Komaki, Ryotaro Ito, Mayumi Inoue, Takehiro Sasai, Yusuke Kusano, Toshihiro Kato
Abstract:
Early intervention is recommended for children with autism spectrum disorder (ASD), and an increasing number of children have received support and intervention before school age in recent years. In this study, we systematically reviewed preschool interventions focused on repetitive behaviors in children with ASD, which are often observed at younger ages. Inclusion criteria were as follows: (1) child of preschool status (age ≤ 7 years) with a diagnosis of ASD (including autism, Asperger's, and pervasive developmental disorder), or a parent (caregiver) with a preschool child with ASD; (2) physician-confirmed diagnosis of ASD (autism, Asperger's, and pervasive developmental disorder); (3) interventional studies for repetitive behaviors; (4) original articles published within the past 10 years (2012 or later); (5) written in English or Japanese. Exclusion criteria were as follows: (1) systematic reviews or meta-analyses; (2) conference reports or books. We carefully scrutinized databases to remove duplicate references and used a two-step screening process to select papers. The primary screening included close scrutiny of titles and abstracts to exclude articles that did not meet the eligibility criteria. During the secondary screening, we carefully read the complete text to assess eligibility, which was double-checked by six members of the laboratory. Disagreements were resolved through consensus-based discussion. Our search yielded 304 papers, of which nine were included in the study. The level of evidence was as follows: three randomized controlled trials (level 2), four pre-post studies (level 4b), and two case reports (level 5). Seven of the selected articles described the effectiveness of interventions. Interventions for repetitive behaviors in preschool children with ASD were categorized as five interventions that directly involved the child and four educational programs for caregivers and parents. 
Studies that directly intervened with children used early intensive intervention based on applied behavior analysis (the Early Start Denver Model, Early Intensive Behavioral Intervention, and the Picture Exchange Communication System) and individualized education based on sensory integration. Educational interventions for caregivers included two methods: (a) education regarding combined methods and practices of applied behavior analysis, in addition to the classification of and coping methods for repetitive behaviors, and (b) education regarding evaluation methods and practices based on children's developmental milestones in play. With regard to the neurophysiological basis of repetitive behaviors, environmental factors are implicated as possible contributors. We surmise that applied behavior analysis was shown to be effective in reducing repetitive behaviors because the analysis focuses on the interaction between the individual and the environment. Additionally, educational interventions for caregivers were shown to promote behavioral change in children: the caregivers' understanding of the classification of repetitive behaviors and of children's developmental milestones in play, together with adjustment of the person-environment context, led to a reduction in repetitive behaviors. Keywords: autism spectrum disorder, early intervention, repetitive behaviors, systematic review
Procedia PDF Downloads 14028493 Linguistics and Islamic Studies in Historical Perspective: The Case of Interdisciplinary Communication
Authors: Olga Bernikova, Oleg Redkin
Abstract:
Islamic Studies and the Arabic language have been indivisible from each other since the appearance of Islam and the formation of the Classical language. The present paper demonstrates the correlation between linguistics and religion in historical perspective, with regard to the peculiarities of the Arabic language that distinguish it from the other prophetic languages. In historical perspective, the Arabic language has been and remains a tool for the expression of Islamic rhetoric, being a prophetic language. No other language in the world has preserved its stability for more than 14 centuries, and Islam is considered to be one of the most important factors securing this stability. The analysis and study of the text of the Qurʾān are of special importance for those who study Islamic civilization, its role in the destinies of mankind, and its values and virtues. Without understanding the polyphony of this sacred text and the indivisible unity of its form and content, it is impossible to understand social developments in both the present and the past. Since the first years of Islam, the Qurʾān has been at the center of attention of Muslim scholars: theologians, historians, philologists, jurists, and mathematicians. Only quite recently has it become an object of analysis for specialists in computer technologies. In Arabic and Islamic studies, mediaeval texts, i.e. textual documents, are considered the main source of information. Hence, the analysis of the multiplicity of various texts and the finding of interconnections between them help to set scattered fragments of the riddle into a common and eloquent picture of the past, which reflects the state of the society at certain stages of its development. The text of the Qurʾān, like any other phenomenon, is a multifaceted object that should be studied from different points of view. 
As a result, this complex study will allow obtaining a three-dimensional image rather than a flat picture alone. Keywords: Arabic, Islamic studies, linguistics, religion
Procedia PDF Downloads 22328492 The American Theater: Latinos Performing as American Citizens by Supporting Trump's Ideals
Authors: Mariana Anaya Villafana
Abstract:
The shift of a significant percentage of the Latino community in the United States towards a Republican political orientation was reflected in the 2016 presidential election. This moment represented a radical change happening inside the Latino community in the United States; the support given to Trump's campaign demonstrates support for new anti-immigration regulations and conservative values, which are causing a division of ideologies inside the Latino community. One of the main goals of the following research is to understand this phenomenon: why would people join their own oppressor and align themselves with policies that prevent many of their relatives from coming to the United States and that made the assimilation process difficult for their parents? It is important to show that a change in identity has happened through the use of power relations and attachment to the desired object. A group of Hispanics/Latinos have decided to vote for Trump in order to belong to a society that hasn't been able to fully include them, an action that can result in the unintentional harm of the values and aims of the rest of the Latino/Hispanic community. In order to understand their new political beliefs, it is necessary to use the method of discourse analysis to examine the comments and interviews published on websites such as 'Latinos for Trump' and 'GOP Hispanic Division'. Among the results the research has shown, the notion of the 'American Dream' can be considered a determinant object for the construction of a new identity rooted in hard work and legality: one that is proud of the Latino heritage but still wants to maintain the boundaries between legality and illegality in relation to immigrants. In most cases this discourse results in a contradiction, because they mention that their own families came to the U.S. as immigrants; the only difference, they argue, is that they worked hard to obtain legal citizenship. Keywords: populism, identity, Latino community, migration
Procedia PDF Downloads 12828491 Conserving History: Evaluating and Selecting Effective Restoration Methods for a Fragment Mural Painting from Amarna
Authors: Kholod Khairy Salama, Shabban Hassan Thabet
Abstract:
In the present study, a comprehensive investigation has been undertaken into an Egyptian mural painting depicting feet wearing slippers, with the aim of choosing the most successful restoration methods. The mural painting under examination dates back to the Amarna period; it was detached from a wall of an unknown tomb in Egypt, and it is currently displayed in a showcase at the Egyptian Museum, Tahrir Square, Cairo, Egypt. The main objectives of this research were to (a) reveal the pigments used in the mural painting, (b) reveal the medium used with the colours, (c) determine the technique of manufacture, (d) determine the ground support, and (e) reveal the main deterioration aspects. The analytical techniques used for investigation were optical microscopy, Raman spectroscopy, X-ray fluorescence (XRF), X-ray diffraction (XRD), and Fourier transform infrared spectroscopy coupled with attenuated total reflectance (FTIR-ATR). The investigation revealed the vital deterioration factors affecting the object. This research examines and analyzes the mural painting in order to choose a suitable restoration process: (a) define the colours through comparative analysis to choose suitable cleaning materials, (b) define the natural structure of the ground support layer, which appeared as a mud layer, (c) determine the medium used with the colours, (d) diagnose the presence of the whitewash layer, and (e) choose the suitable restoration methods according to the results. Conclusion: this study focused mainly on the physical and chemical properties of the mural painting compound and the main changes that happened to the mural painting material, which caused deterioration and detachment of parts of the painting, so that the best and optimum restoration methods for this object can be found. Keywords: mural paintings, Tal Al-Amarna, digital microscope, Raman, XRF, XRD, FTIR
Procedia PDF Downloads 7628490 A GIS Based Approach in District Peshawar, Pakistan for Groundwater Vulnerability Assessment Using DRASTIC Model
Authors: Syed Adnan, Javed Iqbal
Abstract:
In urban and rural areas, groundwater is the most economical natural source of drinking water. The groundwater resources of Pakistan are degraded due to high population growth and increased industrial development. A study was conducted in district Peshawar to assess groundwater vulnerable zones using the GIS-based DRASTIC model. Six input parameters (groundwater depth, groundwater recharge, aquifer material, soil type, slope and hydraulic conductivity) were used in the DRASTIC model to generate the groundwater vulnerable zones. Each parameter was divided into different ranges or media types, and a subjective rating from 1-10 was assigned to each factor, where 1 represented very low impact on pollution potential and 10 represented very high impact. A weight multiplier from 1-5 was used to balance and enhance the importance of each factor. The DRASTIC model scores obtained varied from 47 to 147. Using a quantile classification scheme, these values were reclassified into three zones, i.e. low, moderate and high vulnerability, and the areas of these zones were calculated. The final result indicated that about 400 km2, 506 km2, and 375 km2 were classified as low, moderate, and high vulnerability areas, respectively. It is recommended that the most vulnerable zones be treated with first priority to safeguard drinking water for the inhabitants. Keywords: DRASTIC model, groundwater vulnerability, GIS in groundwater, drinking sources
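The index computation can be sketched as follows. The weights below follow the standard DRASTIC assignments for the six layers used (depth 5, recharge 4, aquifer media 3, soil media 2, topography/slope 1, hydraulic conductivity 3); the rating rasters are synthetic stand-ins for the study's GIS layers:

```python
# Sketch of a DRASTIC-style vulnerability index: rating (1-10) times
# weight per layer, summed per raster cell, then split into three zones
# at the quantiles, as in the study.
import numpy as np

# Standard weights for the six layers used: depth, recharge, aquifer
# media, soil media, slope, hydraulic conductivity.
weights = np.array([5, 4, 3, 2, 1, 3])

rng = np.random.default_rng(0)
ratings = rng.integers(1, 11, size=(6, 50, 50))  # synthetic rating rasters

# Weighted sum over the layer axis gives the index per cell.
index = np.tensordot(weights, ratings, axes=1)

# Quantile classification into low / moderate / high vulnerability.
q1, q2 = np.quantile(index, [1 / 3, 2 / 3])
zones = np.digitize(index, [q1, q2])  # 0 = low, 1 = moderate, 2 = high
```

With these weights the index ranges from 18 (all ratings 1) to 180 (all ratings 10), consistent with the 47-147 span the study reports.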
Procedia PDF Downloads 45128489 Classification of ECG Signal Based on Mixture of Linear and Non-Linear Features
Authors: Mohammad Karimi Moridani, Mohammad Abdi Zadeh, Zahra Shahiazar Mazraeh
Abstract:
In recent years, the use of intelligent systems in biomedical engineering has increased dramatically, especially in the diagnosis of various diseases. Moreover, due to the relatively simple recording of the electrocardiogram (ECG) signal, this signal is a good tool to show the function of the heart and the diseases associated with it. The aim of this paper is to design an intelligent system for automatically distinguishing a normal electrocardiogram signal from an abnormal one. Using this diagnostic system, it is possible to identify a person's heart condition in a very short time and with high accuracy. The data used in this article are from the PhysioNet database, made available in 2016 for use by researchers seeking the best method for detecting normal signals from abnormal ones. The data are from both genders, and the recording time varies between several seconds and several minutes. All data are also labeled normal or abnormal. Due to the limited recording duration of the ECG signal and the similarity of the signal in some diseases to the normal signal, the heart rate variability (HRV) signal was used. Measuring and analyzing heart rate variability over time to evaluate the activity of the heart and to differentiate types of heart failure from one another is of interest to experts. In the preprocessing stage, after noise cancelation by an adaptive Kalman filter and extraction of the R wave by the Pan-Tompkins algorithm, R-R intervals were extracted and the HRV signal was generated. In the processing stage of this paper, a new idea was presented: in addition to using the statistical characteristics of the signal, a return map was created and nonlinear characteristics of the HRV signal were extracted, owing to the nonlinear nature of the signal. Finally, artificial neural networks, widely used in the field of ECG signal processing, together with these distinctive features, were used to classify the normal signals from the abnormal ones. 
To evaluate the efficiency of the proposed classifiers in this paper, the area under the ROC curve (AUC) was used. The results of the simulation in the MATLAB environment showed that the AUC of the MLP neural network and the SVM was 0.893 and 0.947, respectively. In addition, the results of the proposed algorithm indicated that greater use of nonlinear characteristics in the classification of normal and patient signals yielded better performance. Today, research is aimed at quantitatively analyzing the linear and non-linear, or deterministic and random, nature of the heart rate variability signal, because it has been shown that the amount of these properties can be used to indicate the health status of an individual's heart. The study of the nonlinear behavior and dynamics of the heart's neural control system in the short and long term provides new information on how the cardiovascular system functions and has led to the development of research in this field. Given that the ECG signal contains important information and is one of the common tools used by physicians to diagnose heart disease, but that some information in this signal is hidden from the viewpoint of physicians, the intelligent system proposed in this paper can help physicians diagnose normal and patient individuals with greater speed and accuracy, and it can be used as a complementary system in treatment centers. Keywords: heart rate variability, signal processing, linear and non-linear features, classification methods, ROC curve
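The mixed linear/nonlinear feature vector can be sketched as follows. The specific statistics chosen (SDNN, RMSSD, and the Poincaré return-map descriptors SD1/SD2) are common HRV features and an assumption here, not necessarily the exact set used in the paper:

```python
# Sketch: linear HRV statistics plus nonlinear return-map (Poincare)
# descriptors computed from a series of R-R intervals, the kind of mixed
# feature vector fed to an MLP/SVM classifier.
import numpy as np

def hrv_features(rr):
    """rr: R-R intervals in seconds."""
    diff = np.diff(rr)
    sdnn = np.std(rr)                  # overall variability (linear)
    rmssd = np.sqrt(np.mean(diff**2))  # short-term variability (linear)
    # Return map: plot rr[n+1] against rr[n]. SD1 measures the spread
    # across the identity line, SD2 the spread along it (nonlinear).
    sd1 = np.sqrt(np.var(diff) / 2)
    sd2 = np.sqrt(2 * np.var(rr) - np.var(diff) / 2)
    return np.array([sdnn, rmssd, sd1, sd2])

rng = np.random.default_rng(0)
rr = 0.8 + 0.05 * rng.normal(size=300)  # synthetic normal-rhythm R-R series
features = hrv_features(rr)
```

A low SD1/SD2 ratio indicates that long-term variability dominates short-term variability, which is one way such return-map features carry information that plain statistics miss.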
Procedia PDF Downloads 26228488 Implementation of an Economic – Probabilistic Model to Risk Analysis of ERP Project in Technological Innovation Firms – A Case Study of ICT Industry in Iran
Authors: Reza Heidari, Maryam Amiri
Abstract:
In a technological world, many countries have a tendency to fortify their companies and technological infrastructures. One of the most important requirements for developing technology is innovation, and thus all companies strive to adopt innovation as a basic principle. Since the expansion of a product requires combining different technologies, different innovative projects are run in firms as a basis of technology development. In such an environment, enterprise resource planning (ERP) has special significance for developing and strengthening innovations. In this article, an economic-probabilistic analysis is provided for an ERP implementation project in technological innovation (TI) based firms. The model used in this article assesses risk and economics simultaneously, in view of the probability of each event, jointly combining the economic approach and the risk investigation approach. To provide an economic-probabilistic analysis of the risk of the project, the activities and milestones in the cash flow were extracted, and the probability of occurrence of each of them was assessed. Since resource planning in an innovative firm is the object of this project, we extracted the various risks related to the innovative project and evaluated them in the form of cash flow. This model, by considering the risks affecting the project and the probability of each of them and assigning them to the project's cash flow categories, presents an adjusted cash flow based on Net Present Value (NPV) with a probabilistic simulation approach. Indeed, this model presents a risk-adjusted economic analysis of the project. It then measures the NPV of the project, concentrating on the risks that have the most effect on technological innovation projects, and subsequently measures the probability associated with the NPV for each category. 
As a result of the application of the presented model in the information and communication technology (ICT) industry, an appropriate analysis of the feasibility of the project was provided from the point of view of cash flow, based on the impact of risk on the project. The obtained results can be given to decision makers so that they can have a systematic, risk-moderated economic analysis of the feasibility of the project. Keywords: cash flow categorization, economic evaluation, probabilistic, risk assessment, technological innovation
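The risk-adjusted NPV idea can be sketched with a small Monte Carlo simulation; every cash flow, probability, and the discount rate below is invented purely for illustration and bears no relation to the study's data:

```python
# Sketch: each risk event hits a cash-flow category with some probability,
# so Monte Carlo simulation yields a distribution of NPV rather than a
# single point estimate.
import numpy as np

rng = np.random.default_rng(0)
discount_rate = 0.12
base_cash_flows = np.array([-500.0, 150.0, 200.0, 250.0, 300.0])  # t = 0..4

# Hypothetical risk register: (probability, impact on cash flow, year hit).
risks = [(0.3, -80.0, 1), (0.2, -120.0, 2), (0.4, -50.0, 3)]

def simulate_npv(n_runs=10_000):
    years = np.arange(len(base_cash_flows))
    factors = (1 + discount_rate) ** -years
    npvs = np.empty(n_runs)
    for i in range(n_runs):
        cf = base_cash_flows.copy()
        for p, impact, year in risks:
            if rng.random() < p:  # this risk event occurs in this run
                cf[year] += impact
        npvs[i] = np.sum(cf * factors)
    return npvs

npvs = simulate_npv()
p_positive = np.mean(npvs > 0)  # probability the project stays viable
```

The distribution of `npvs` (its mean, spread, and the probability of a negative outcome) is exactly the kind of risk-adjusted feasibility summary the abstract describes handing to decision makers.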
Procedia PDF Downloads 404