Search results for: logical layout analysis
27813 Financial Analysis of Selected Private Healthcare Organizations with Special Reference to Guwahati City, Assam
Authors: Mrigakshi Das
Abstract:
Private sector investment and the quantum of money required in this sector hinge critically on the financial risk and returns the sector offers to providers of capital. It is therefore important to understand the financial performance of hospitals. Financial analysis is useful for decision makers in a variety of settings. Consider a small proprietary hospital, say a physicians' clinic: the managers of such a clinic need the information that financial statements provide. Attention to the financial statements of healthcare organizations can answer questions such as: How are they doing? What is their rate of profit? What is their solvency and liquidity position? What are their sources and applications of funds? What is their operational efficiency? The researcher has studied the financial statements of five private healthcare organizations in Guwahati City.
Keywords: not-for-profit organizations, financial analysis, ratio analysis, profitability analysis, liquidity analysis, operational efficiency, capital structure analysis
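As an illustration of the ratio analysis the abstract refers to, the short sketch below computes a few common profitability, liquidity and solvency ratios; the hospital figures and the chosen ratios are hypothetical and only indicate how such an analysis can be set up.

```python
# A minimal sketch of ratio analysis using hypothetical figures for one
# hospital; the ratios and the numbers are illustrative only.

def financial_ratios(statement: dict) -> dict:
    """Compute a few common profitability, liquidity and solvency ratios."""
    return {
        "current_ratio": statement["current_assets"] / statement["current_liabilities"],
        "net_profit_margin": statement["net_profit"] / statement["revenue"],
        "return_on_assets": statement["net_profit"] / statement["total_assets"],
        "debt_to_equity": statement["total_debt"] / statement["equity"],
    }

# Hypothetical financial statement of one private hospital (in lakh INR).
hospital = {
    "current_assets": 120.0,
    "current_liabilities": 80.0,
    "net_profit": 35.0,
    "revenue": 450.0,
    "total_assets": 600.0,
    "total_debt": 150.0,
    "equity": 300.0,
}

for name, value in financial_ratios(hospital).items():
    print(f"{name}: {value:.2f}")
```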
Procedia PDF Downloads 550
27812 A Cross-Dialect Statistical Analysis of Final Declarative Intonation in Tuvinian
Authors: D. Beziakina, E. Bulgakova
Abstract:
This study continues the research on Tuvinian intonation and presents a general cross-dialect analysis of intonation of Tuvinian declarative utterances, specifically the character of the tone movement in order to test the hypothesis about the prevalence of level tone in some Tuvinian dialects. The results of the analysis of basic pitch characteristics of Tuvinian speech (in general and in comparison with two other Turkic languages - Uzbek and Azerbaijani) are also given in this paper. The goal of our work was to obtain the ranges of pitch parameter values typical for Tuvinian speech. Such language-specific values can be used in speaker identification systems in order to get more accurate results of ethnic speech analysis. We also present the results of a cross-dialect analysis of declarative intonation in the poorly studied Tuvinian language.
Keywords: speech analysis, statistical analysis, speaker recognition, identification of person
Procedia PDF Downloads 472
27811 An Emergentist Defense of Incompatibility between Morally Significant Freedom and Causal Determinism
Authors: Lubos Rojka
Abstract:
The common perception of morally responsible behavior is that it presupposes freedom of choice, and that free decisions and actions are not determined by natural events, but by a person. In other words, the moral agent has the ability and the possibility of doing otherwise when making morally responsible decisions, and natural causal determinism cannot fully account for morally significant freedom. The incompatibility between a person's morally significant freedom and causal determinism appears to be a natural position. Nevertheless, some of the most influential philosophical theories on moral responsibility are compatibilist or semi-compatibilist, and they exclude the requirement of alternative possibilities, which contradicts the claims of classical incompatibilism. The compatibilists often employ Frankfurt-style thought experiments to prove their theory. The goal of this paper is to examine the role of imaginary Frankfurt-style examples in compatibilist accounts. More specifically, the compatibilist accounts defended by John Martin Fischer and Michael McKenna will be inserted into the broader understanding of a person elaborated by Harry Frankfurt, Robert Kane and Walter Glannon. Deeper analysis reveals that the exclusion of alternative possibilities based on Frankfurt-style examples is problematic and misleading. A more comprehensive account of moral responsibility and morally significant (source) freedom requires higher-order complex theories of human will and consciousness, in which rational and self-creative abilities and a real possibility to choose otherwise, at least on some occasions during a lifetime, are necessary. Theoretical moral reasons and their logical relations seem to require a sort of higher-order agent-causal incompatibilism. The ability for theoretical or abstract moral reasoning requires complex (strongly emergent) mental and conscious properties, among which is an effective free will, together with first- and second-order desires. Such a hierarchical theoretical model unifies reasons-responsiveness, mesh theory and emergentism. It is incompatible with physical causal determinism, because such determinism only allows non-systematic processes that may be hard to predict, but not complex (strongly) emergent systems. An agent's effective will and conscious reflectivity are the starting point of a morally responsible action, which explains why a decision is 'up to the subject'. A free decision does not always have a complete causal history. This kind of emergentist source hyper-incompatibilism seems to be the best direction of the search for an adequate explanation of moral responsibility in the traditional (merit-based) sense. Physical causal determinism as a universal theory would exclude morally significant freedom and responsibility in the traditional sense because it would exclude the emergence of and supervenience by the essential complex properties of human consciousness.
Keywords: consciousness, free will, determinism, emergence, moral responsibility
Procedia PDF Downloads 166
27810 Evaluation of Classification Algorithms for Diagnosis of Asthma in Iranian Patients
Authors: Taha SamadSoltani, Peyman Rezaei Hachesu, Marjan GhaziSaeedi, Maryam Zolnoori
Abstract:
Introduction: Data mining is defined as a process for finding patterns and relationships in the data held in a database in order to build predictive models. The application of data mining has extended into vast sectors such as healthcare services. Medical data mining aims to solve real-world problems in the diagnosis and treatment of diseases. This method applies various techniques and algorithms which have different accuracy and precision. The purpose of this study was to apply knowledge discovery and data mining techniques to the diagnosis of asthma based on patient symptoms and history. Method: Data mining includes several steps and decisions to be made by the user. It starts with creating an understanding of the scope and the application of previous knowledge in the area, and identifying the knowledge discovery process from the point of view of the stakeholders; it finishes by acting on the discovered knowledge, integrating the knowledge with other systems, and documenting and reporting it. In this study, a stepwise methodology was followed to achieve a logical outcome. Results: The sensitivity, specificity and accuracy of the KNN, SVM, Naïve Bayes, NN, classification tree and CN2 algorithms, together with related similar studies, were evaluated, and ROC curves were plotted to show the performance of the system. Conclusion: The results show that we can diagnose asthma accurately, at approximately ninety percent, based on demographic and clinical data. The study also showed that methods based on pattern discovery and data mining have a higher sensitivity compared to expert and knowledge-based systems. On the other hand, medical guidelines and evidence-based medicine should be the basis of diagnostic methods; it is therefore recommended that machine learning algorithms be used in combination with knowledge-based algorithms.
Keywords: asthma, data mining, classification, machine learning
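A minimal sketch of how such a classifier comparison can be set up is given below. It assumes a binary asthma label and uses synthetic data in place of the study's patient records; CN2 has no common scikit-learn implementation and is omitted here.

```python
# Sketch of comparing classifiers on symptom/history data, assuming a binary
# "asthma / no asthma" label; synthetic data stands in for the real records.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_predict
from sklearn.metrics import confusion_matrix
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.naive_bayes import GaussianNB
from sklearn.neural_network import MLPClassifier
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=500, n_features=12, random_state=0)

models = {
    "KNN": KNeighborsClassifier(),
    "SVM": SVC(),
    "Naive Bayes": GaussianNB(),
    "Neural Net": MLPClassifier(max_iter=1000),
    "Classification Tree": DecisionTreeClassifier(),
}

for name, model in models.items():
    pred = cross_val_predict(model, X, y, cv=5)          # 5-fold cross-validation
    tn, fp, fn, tp = confusion_matrix(y, pred).ravel()
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    accuracy = (tp + tn) / (tp + tn + fp + fn)
    print(f"{name}: sens={sensitivity:.2f} spec={specificity:.2f} acc={accuracy:.2f}")
```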
Procedia PDF Downloads 448
27809 A Survey of Sentiment Analysis Based on Deep Learning
Authors: Pingping Lin, Xudong Luo, Yifan Fan
Abstract:
Sentiment analysis is a very active research topic. Every day, Facebook, Twitter, Weibo, and other social media, as well as significant e-commerce websites, generate a massive amount of comments, which can be used to analyse people's opinions or emotions. The existing methods for sentiment analysis are based mainly on sentiment dictionaries, machine learning, and deep learning. The first two kinds of methods rely heavily on sentiment dictionaries or large amounts of labelled data; the third one overcomes these two problems. So, in this paper, we focus on the third one. Specifically, we survey various sentiment analysis methods based on convolutional neural networks, recurrent neural networks, long short-term memory, deep neural networks, deep belief networks, and memory networks. We compare their features, advantages, and disadvantages. Also, we point out the main problems of these methods, which may be worthy of careful study in the future. Finally, we also examine the application of deep learning in multimodal sentiment analysis and aspect-level sentiment analysis.
Keywords: document analysis, deep learning, multimodal sentiment analysis, natural language processing
Procedia PDF Downloads 164
27808 Containment/Penetration Analysis for the Protection of Aircraft Engine External Configuration and Nuclear Power Plant Structures
Authors: Dong Wook Lee, Adrian Mistreanu
Abstract:
The authors have studied a method for analyzing containment and penetration using an explicit nonlinear Finite Element Analysis. This method may be used in the concept design stage for the protection of external configurations or components of aircraft engines and nuclear power plant structures. This paper consists of the modeling method, the results obtained from the method, and the comparison of the results with those calculated from a simple analytical method. It shows that the containment capability obtained by the proposed method matches well with the analytically calculated containment capability.
Keywords: computer aided engineering, containment analysis, finite element analysis, impact analysis, penetration analysis
Procedia PDF Downloads 139
27807 A Case Study on Problems Originated from Critical Path Method Application in a Governmental Construction Project
Authors: Mohammad Lemar Zalmai, Osman Hurol Turkakin, Cemil Akcay, Ekrem Manisali
Abstract:
In public construction projects, determining the contract period in the award phase is one of the most important factors. The contract period establishes the baseline for creating the cash flow curve and progress payment planning in the post-award phase. If overestimated, the project duration causes losses for both the owner and the contractor. Therefore, it is essential to base construction project duration on reliable forecasting. In Turkey, schedules are usually built using the bar chart (Gantt) schedule, especially by governmental construction agencies, and the usage of these schedules is limited to bidding purposes. Although the bar-chart schedule is useful in some cases, it lacks logical connections between activities, and it is harder to identify the activities that have more effect than others on the project's total duration, especially in large complex projects. In this study, a construction schedule is prepared with the Critical Path Method (CPM), which addresses the above-mentioned discrepancies. CPM is a simple and effective method that displays project time and critical paths, showing the results of forward and backward calculations while considering the logical relationships between activities; it is a powerful tool for planning and managing all kinds of construction projects and a very convenient method for the construction industry. CPM provides a much more useful and precise approach than the traditional bar-chart diagrams that form the basis of construction planning and control. CPM has two main application utilities in the construction field. The first is obtaining the project duration, in the form of an as-planned schedule that includes as-planned activity durations with relationships between subsequent activities. The other utility applies during project execution: each activity is tracked and its duration recorded in order to obtain the as-built schedule, which serves as a black box of the project. The latter is more useful for delay analysis and conflict resolution. These features of CPM have been popular around the world; however, it has not yet been extensively used in Turkey. In this study, a real construction project is investigated as a case study; CPM-based scheduling is used for establishing both the as-built and as-planned schedules. Problems that emerged during the construction phase are identified and categorized, and solutions are then suggested. Two scenarios were considered. In the first scenario, CPM was used to track and manage progress based on real-time data. In the second scenario, project progress was supposedly tracked based on the assumption that the Gantt chart was used. The S-curves of the two scenarios are plotted and interpreted. Comparing the results, possible faults of the latter scenario are highlighted, and solutions are suggested. The importance of CPM implementation is emphasized, and it is proposed that CPM-based construction schedules be made mandatory for public construction project contracts.
Keywords: as-built, case-study, critical path method, Turkish government sector projects
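The forward and backward calculations mentioned above can be sketched in a few lines; the activity network below is hypothetical and only illustrates how earliest/latest times and the critical path are obtained.

```python
# Minimal CPM forward/backward pass over a hypothetical activity network.
# Each activity: (duration, list of predecessors).
activities = {
    "A": (3, []), "B": (5, ["A"]), "C": (2, ["A"]),
    "D": (4, ["B", "C"]), "E": (1, ["D"]),
}

# Forward pass: earliest start/finish (dict order matches precedence here).
ES, EF = {}, {}
for act, (dur, preds) in activities.items():
    ES[act] = max((EF[p] for p in preds), default=0)
    EF[act] = ES[act] + dur

project_duration = max(EF.values())

# Backward pass: latest finish/start.
LF, LS = {}, {}
for act in reversed(list(activities)):
    dur, _ = activities[act]
    successors = [a for a, (_, p) in activities.items() if act in p]
    LF[act] = min((LS[s] for s in successors), default=project_duration)
    LS[act] = LF[act] - dur

critical_path = [a for a in activities if ES[a] == LS[a]]   # zero total float
print("Duration:", project_duration, "Critical path:", critical_path)
```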
Procedia PDF Downloads 122
27806 Collision Theory Based Sentiment Detection Using Discourse Analysis in Hadoop
Authors: Anuta Mukherjee, Saswati Mukherjee
Abstract:
Data is growing every day. Social networking sites such as Twitter are becoming an integral part of our daily lives, contributing to a large increase in the growth of data. They are a rich source especially for sentiment detection or mining, since people often express honest opinions through tweets. However, although sentiment analysis is a well-researched topic for text, this analysis using Twitter data poses additional challenges, since these are unstructured data with abbreviations and without strict grammatical correctness. We have employed collision theory to achieve sentiment analysis in Twitter data. We have also incorporated discourse analysis in the collision theory based model to detect accurate sentiment from tweets. We have also used the retweet field to assign weights to certain tweets and obtained the overall weightage of a topic provided in the form of a query. Hadoop has been exploited for speed. Our experiments show effective results.
Keywords: sentiment analysis, twitter, collision theory, discourse analysis
Procedia PDF Downloads 535
27805 The Prevalence of Organized Retail Crime in Riyadh, Saudi Arabia
Authors: Saleh Dabil
Abstract:
This study investigates the extent of organized retail crime in supermarkets in Riyadh, Saudi Arabia. Store managers, security managers and general employees were asked about the types of retail crime that occur in the stores. Three independent variables were related to the report of organized retail theft: (1) the supermarket profile (volume, location, standard and type of the store), (2) the social and physical environment of the store (maintenance, cleanliness and overall organizational cooperation), and (3) the security techniques and loss prevention electronics used. The theoretical framework of this study is based on social disorganization theory. The study concluded that organized retail theft is moderately apparent in Riyadh stores. The general results showed that the store environment has an effect on the prevalence of organized retail theft in relation to the gender of thieves, age groups, working shift and type of stolen items, as well as the number of thieves in one case. Among other reasons, one factor in organized theft is the economic pressure on customers, depending on the location of the store. The handling of theft was also investigated to obtain a clear picture of how stores deal with organized retail theft. The results showed that thieves are mostly sent away without any action and are sometimes given a written warning; very few cases are dealt with by the police. Other factors examined in the study can be found in the text. To address the problem of organized theft, this study suggests, first, distributing duties and responsibilities well among employees, especially for security purposes; second, installing a strong security system and designing the store layout well; and third, training general employees and providing periodic security-skills training. Other suggestions can be found in the text.
Keywords: organized crime, retail, theft, loss prevention, store environment
Procedia PDF Downloads 198
27804 Using SNAP and RADTRAD to Establish the Analysis Model for Maanshan PWR Plant
Authors: J. R. Wang, H. C. Chen, C. Shih, S. W. Chen, J. H. Yang, Y. Chiang
Abstract:
In this study, we focus on the establishment of the analysis model for the Maanshan PWR nuclear power plant (NPP) by using the RADTRAD and SNAP codes with the FSAR, manuals, and other data. In order to evaluate the cumulative dose at the Exclusion Area Boundary (EAB) and the Low Population Zone (LPZ) outer boundary, the Maanshan NPP RADTRAD/SNAP model was used to perform the analysis of the DBA LOCA case. The analysis results of RADTRAD were similar to the FSAR data. These analysis results were lower than the failure criteria of 10 CFR 100.11 (a total radiation dose to the whole body of 250 mSv; a total radiation dose to the thyroid from iodine exposure of 3000 mSv).
Keywords: RADionuclide, transport, removal, and dose estimation (RADTRAD), symbolic nuclear analysis package (SNAP), dose, PWR
Procedia PDF Downloads 465
27803 A Data Envelopment Analysis Model in a Multi-Objective Optimization with Fuzzy Environment
Authors: Michael Gidey Gebru
Abstract:
Most Data Envelopment Analysis models operate in a static environment with input and output parameters chosen from deterministic data. However, due to the ambiguity brought on by shifting market conditions, input and output data are not always precisely gathered in real-world scenarios. Fuzzy numbers can be used to address this kind of ambiguity in input and output data. Therefore, this work aims to expand crisp Data Envelopment Analysis into Data Envelopment Analysis in a fuzzy environment. In this study, the input and output data are regarded as triangular fuzzy numbers. Then, the Data Envelopment Analysis model with a fuzzy environment is solved using a multi-objective method to gauge the Decision Making Units' efficiency. Finally, the developed Data Envelopment Analysis model is illustrated with an application to real data from 50 educational institutions.
Keywords: efficiency, Data Envelopment Analysis, fuzzy, higher education, input, output
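For orientation, the sketch below solves the crisp CCR (input-oriented, multiplier form) efficiency of each Decision Making Unit as a linear program; the fuzzy variant described above would replace each crisp input and output with a triangular fuzzy number, typically solved over several alpha-cuts. The data here are invented.

```python
# Crisp CCR (input-oriented, multiplier form) efficiency of each DMU with SciPy.
# The fuzzy model in the paper would replace each crisp value with a triangular
# fuzzy number evaluated at alpha-cuts; the data below are invented.
import numpy as np
from scipy.optimize import linprog

X = np.array([[20.0, 300.0], [30.0, 200.0], [40.0, 100.0]])  # inputs,  one row per DMU
Y = np.array([[60.0, 1.0],   [80.0, 2.0],   [50.0, 1.0]])    # outputs, one row per DMU

def ccr_efficiency(k: int) -> float:
    n, m = X.shape          # n DMUs, m inputs
    s = Y.shape[1]          # s outputs
    # Decision variables: output weights u (s of them), then input weights v (m).
    c = np.concatenate([-Y[k], np.zeros(m)])                   # maximise u·y_k
    A_ub = np.hstack([Y, -X])                                  # u·y_j - v·x_j <= 0
    b_ub = np.zeros(n)
    A_eq = np.concatenate([np.zeros(s), X[k]]).reshape(1, -1)  # v·x_k = 1
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                  bounds=[(0, None)] * (s + m))
    return -res.fun

for k in range(len(X)):
    print(f"DMU {k}: efficiency = {ccr_efficiency(k):.3f}")
```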
Procedia PDF Downloads 62
27802 Constructivism Learning Management in Mathematics Analysis Courses
Authors: Komon Paisal
Abstract:
The purposes of this research were (1) to create a constructivist learning activity, (2) to study learning achievement in Mathematical Analysis courses, and (3) to study students' attitudes toward the constructivist learning activity. The samples in this study were divided into two parts: three Mathematical Analysis instructors of Suan Sunandha Rajabhat University, who provided basic information and attended the seminar, and 17 Mathematical Analysis students, who were studying in the academic year and engaged in the constructivist learning activity. The research instruments were constructivist lesson plans, a subjective Mathematical Analysis achievement test with a reliability index of 0.8119, and an attitude test concerning the students' attitude toward the constructivist learning activity in Mathematical Analysis courses. The results show that the efficiency of the constructivist learning activity for Mathematical Analysis courses is 73.05/72.16, which exceeds the expected criterion of 70/70. The research additionally finds that the average learning achievement score of students who engaged in the constructivist learning activities is equal to 70%, and that the students' attitude toward the constructivist learning activity is at the medium level.
Keywords: constructivism, learning management, mathematics analysis courses, learning activity
Procedia PDF Downloads 533
27801 Systematic Review of Functional Analysis in Brazil
Authors: Felipe Magalhaes Lemos
Abstract:
Functional behavior analysis is a procedure that has been studied for several decades by behavior analysts. In Brazil, we still have few studies in the area, so it was decided to carry out a systematic review of the articles published in the area by Brazilians. A search was done on the following scientific article registration sites: PsycINFO, ERIC, ISI Web of Science, Virtual Health Library. The research includes (a) peer-reviewed studies that (b) have been carried out in Brazil containing (c) functional assessment as a pre-treatment through (d) experimental procedures, direct or indirect observation and measurement of behavior problems (e) demonstrating a relationship between environmental events and behavior. During the review, 234 papers were found; however, only 9 were included in the final analysis. Of the 9 articles extracted, only 2 presented functional analysis procedures with manipulation of environmental variables, while the other 7 presented different procedures for a descriptive behavior assessment. Only the two studies using "functional analysis" used graphs to demonstrate the prevalent function of the behavior. Other studies described procedures and did not make clear the causal relationship between environment and behavior. There is still confusion in Brazil regarding the terms "functional analysis", "descriptive assessment" and "contingency analysis," which are generally treated in the same way. This study shows that few articles are published with a focus on functional analysis in Brazil.
Keywords: behavior, contingency, descriptive assessment, functional analysis
Procedia PDF Downloads 146
27800 Evaluation of the Performance Measures of Two-Lane Roundabout and Turbo Roundabout with Varying Truck Percentages
Authors: Evangelos Kaisar, Anika Tabassum, Taraneh Ardalan, Majed Al-Ghandour
Abstract:
The economy of any country is dependent on its ability to accommodate the movement and delivery of goods. The demand for goods movement and services increases truck traffic on highways and inside cities. The livability of most cities is directly affected by the congestion and environmental impacts of trucks, which are the backbone of the urban freight system. Better operation of heavy vehicles on highways and arterials could lead to greater network efficiency and reliability. In many cases, roundabouts can respond better than at-grade intersections to enable traffic operations with increased safety for both cars and heavy vehicles. The recently emerged concept of the turbo-roundabout is a viable alternative to the two-lane roundabout aiming to improve traffic efficiency. The primary objective of this study is to evaluate the operation and performance level of an at-grade intersection, a conventional two-lane roundabout, and a basic turbo roundabout for freight movements. To analyze and evaluate the performance of the signalized intersections and the roundabouts, micro-simulation models were developed in PTV VISSIM. The networks chosen for this analysis allow experimenting with and evaluating changes in the performance of vehicle movements under different geometric and flow scenarios. Several scenarios were examined to assess the impacts of various geometric designs on vehicle movements. The overall traffic efficiency depends on the geometric layout of the intersections, which involves the traffic congestion rate, hourly volume, frequency of heavy vehicles, type of road, and the ratio of major-street versus side-street traffic. The traffic performance was determined by evaluating the delay time, number of stops, and queue length of each intersection for varying truck percentages. The results indicate that turbo-roundabouts can replace signalized intersections and two-lane roundabouts only when the traffic demand is low, even with high truck volume. More specifically, two-lane roundabouts are seen to have shorter queue lengths compared to signalized intersections and turbo-roundabouts. For instance, in the scenario where the volume is highest and the truck movement and left-turn movement are at their maximum, the signalized intersection has a queue length 3 times longer, and the turbo-roundabout 5 times longer, than a two-lane roundabout on major roads. Similarly, on minor roads, signalized intersections and turbo-roundabouts have queue lengths 11 times longer than two-lane roundabouts for the same scenario. Across all the developed scenarios, as the traffic demand lowers, the queue lengths of turbo-roundabouts shorten, which shows that turbo-roundabouts perform well for low and medium traffic demand. Finally, this study provides recommendations on the conditions under which different intersections perform better than each other.
Keywords: at-grade intersection, simulation, turbo-roundabout, two-lane roundabout
Procedia PDF Downloads 151
27799 Experiences of Timing Analysis of Parallel Embedded Software
Authors: Muhammad Waqar Aziz, Syed Abdul Baqi Shah
Abstract:
Execution time analysis is fundamental to the successful design and execution of real-time embedded software. In such analysis, the Worst-Case Execution Time (WCET) of a program is a key measure, on the basis of which system tasks are scheduled. The WCET analysis of embedded software is also needed for system understanding and to guarantee its behavior. WCET analysis can be performed statically (without executing the program) or dynamically (through measurement). Traditionally, research on WCET analysis assumes sequential code running on single-core platforms. However, as computation is steadily moving towards using a combination of parallel programs and multi-core hardware, new challenges in WCET analysis need to be addressed. In this article, we report our experiences of performing the WCET analysis of Parallel Embedded Software (PES) running on a multi-core platform. The primary purpose was to investigate how WCET estimates of PES can be computed statically, and how they can be derived dynamically. Our experiences, as reported in this article, include the challenges we faced, possible suggestions for addressing these challenges, and the workarounds that were developed. This article also provides observations on the benefits and drawbacks of deriving the WCET estimates using the said methods and provides useful recommendations for further research in this area.
Keywords: embedded software, worst-case execution-time analysis, static flow analysis, measurement-based analysis, parallel computing
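The measurement-based side of WCET estimation can be illustrated with a few lines of code: run the task repeatedly, keep the high-water mark, and pad it with a safety margin. This is only a sketch; real WCET analysis of parallel software must also bound interference from shared caches, buses and co-running tasks, which a plain timing loop cannot capture.

```python
# Measurement-based WCET estimation sketch: time repeated runs of a task and
# keep the maximum observed execution time, padded with a safety margin.
# Multi-core interference (shared caches, buses) is deliberately ignored here.
import time

def task(n: int = 10_000) -> int:
    return sum(i * i for i in range(n))    # stand-in for the real-time task

def measured_wcet(runs: int = 1000, margin: float = 1.2) -> float:
    worst = 0.0
    for _ in range(runs):
        start = time.perf_counter()
        task()
        worst = max(worst, time.perf_counter() - start)
    return worst * margin                  # pad the observed high-water mark

print(f"Estimated WCET: {measured_wcet() * 1e6:.1f} microseconds")
```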
Procedia PDF Downloads 324
27798 Companies’ Internationalization: Multi-Criteria-Based Prioritization Using Fuzzy Logic
Authors: Jorge Anibal Restrepo Morales, Sonia Martín Gómez
Abstract:
A model based on a logical framework was developed to quantify SMEs' internationalization capacity. To do so, linguistic variables such as human talent, infrastructure, innovation strategies, FTAs, marketing strategies, finance, etc. were integrated. It is argued that a company's management of international markets depends on internal factors, especially the capabilities and resources available. This study considers internal factors as the biggest business challenge because they force companies to develop an adequate set of capabilities. At this stage, importance and strategic relevance have to be defined in order to build competitive advantages. A fuzzy inference system is proposed to model the resources, skills, and capabilities that determine the success of internationalization. Data: 157 linguistic variables were used. These variables were defined by international trade entrepreneurs, experts, consultants, and researchers. Using expert judgment, the variables were condensed into 18 factors that explain SMEs' export capacity. The proposed model is applied by means of a case study of the textile and clothing cluster in Medellin, Colombia. In the model implementation, a general index of 28.2 was obtained for internationalization capabilities. The result confirms that the sector's current capabilities and resources are not sufficient for a successful integration into the international market. The model specifies the factors and variables that need to be worked on in order to improve export capability. In the case of textile companies, the lack of continuous recording of information stands out. Likewise, there are very few studies directed towards developing long-term plans, and there is little consistency in export criteria. This method emerges as an innovative management tool linked to internal organizational spheres and their different abilities.
Keywords: business strategy, exports, internationalization, fuzzy set methods
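The sketch below shows, in simplified form, how linguistic factor ratings could be mapped through triangular membership functions and aggregated into an internationalization index; the factor names, weights and ratings are illustrative and do not correspond to the paper's 18 factors.

```python
# Sketch of a tiny fuzzy scoring step: triangular membership functions map a
# 0-100 rating to "low / medium / high", and a weighted average of the rule
# consequents turns factor ratings into an overall index. Factors, weights and
# ratings are invented; the paper condenses 157 variables into 18 factors.

def tri(x: float, a: float, b: float, c: float) -> float:
    """Triangular membership function with peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

SETS = {"low": (0, 0, 50), "medium": (25, 50, 75), "high": (50, 100, 100)}
LEVEL = {"low": 20.0, "medium": 50.0, "high": 85.0}     # crisp consequents

def fuzzy_score(rating: float) -> float:
    """Defuzzified score of one factor rating (weighted average of fired rules)."""
    weights = {name: tri(rating, *abc) for name, abc in SETS.items()}
    total = sum(weights.values())
    return sum(w * LEVEL[n] for n, w in weights.items()) / total if total else rating

factors = {"human talent": (35, 0.4), "marketing strategy": (60, 0.3),
           "export finance": (20, 0.3)}                 # (rating, weight)

index = sum(fuzzy_score(r) * w for r, w in factors.values())
print(f"Internationalization capability index: {index:.1f}")
```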
Procedia PDF Downloads 296
27797 Vulnerability Assessment of Healthcare Interdependent Critical Infrastructure Coloured Petri Net Model
Authors: N. Nivedita, S. Durbha
Abstract:
Critical Infrastructure (CI) consists of services and technological networks such as healthcare, transport, water supply, electricity supply, information technology, etc. These systems are necessary for well-being and to maintain the effective functioning of society. Critical Infrastructures can be represented as nodes in a network, connected through a set of links depicting the logical relationships among them; these nodes are interdependent and interact with each other at various levels, such that the state of each infrastructure influences, or is correlated to, the state of another. Disruption in the service of one infrastructure node of the network during a disaster would lead to cascading and escalating disruptions across the other infrastructure nodes in the network. Healthcare is one such Critical Infrastructure that depends upon a complex interdependent network of other Critical Infrastructures, and during disasters it is vital for the Healthcare Infrastructure to be protected, accessible and prepared for mass casualties. To reduce the consequences of a disaster on the Critical Infrastructure and to ensure a resilient Critical Health Infrastructure network, knowledge, understanding, modeling, and analysis of the interdependencies between the infrastructures are required. This paper presents interdependencies related to Healthcare Critical Infrastructure based on a Hierarchical Coloured Petri Net modeling approach, given a flood scenario as the disaster that disrupts the infrastructure nodes. The model properties are analyzed for the various state changes which occur when there is a disruption or damage to any of the Critical Infrastructures. The failure probabilities for the failure risk of the interconnected systems are calculated by deriving a reachability graph, which is later mapped to a Markov chain. By analytically solving and analyzing the Markov chain, the overall vulnerability of the Healthcare CI HCPN model is demonstrated. The entire model would be integrated with a geographic information-based decision support system to visualize the dynamic behavior of the interdependency of the Healthcare and related CI network in a geographically based environment.
Keywords: critical infrastructure interdependency, hierarchical coloured Petri net, healthcare critical infrastructure, Petri nets, Markov chain
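The final analytical step, computing failure probabilities from the Markov chain obtained from the reachability graph, can be sketched as an absorbing-chain calculation; the states and transition probabilities below are invented for illustration and are not taken from the paper's model.

```python
# Sketch of failure probabilities from a Markov chain derived from a CPN
# reachability graph. States 0-2 are transient (normal, degraded power,
# degraded water); states 3-4 are absorbing (recovered, healthcare failure).
# All transition probabilities are invented for illustration.
import numpy as np

P = np.array([
    [0.70, 0.15, 0.10, 0.05, 0.00],
    [0.10, 0.60, 0.10, 0.10, 0.10],
    [0.05, 0.10, 0.60, 0.05, 0.20],
    [0.00, 0.00, 0.00, 1.00, 0.00],
    [0.00, 0.00, 0.00, 0.00, 1.00],
])

Q = P[:3, :3]                       # transient-to-transient block
R = P[:3, 3:]                       # transient-to-absorbing block
N = np.linalg.inv(np.eye(3) - Q)    # fundamental matrix: expected visits
B = N @ R                           # absorption probabilities

for i, name in enumerate(["normal", "degraded power", "degraded water"]):
    print(f"From '{name}': P(recovered)={B[i, 0]:.2f}, P(failure)={B[i, 1]:.2f}")
```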
Procedia PDF Downloads 530
27796 Predictive Analysis for Big Data: Extension of Classification and Regression Trees Algorithm
Authors: Ameur Abdelkader, Abed Bouarfa Hafida
Abstract:
Since its inception, predictive analysis has revolutionized the IT industry through its robustness and decision-making facilities. It involves the application of a set of data processing techniques and algorithms in order to create predictive models. Its principle is based on finding relationships between explanatory variables and the predicted variables: past occurrences are exploited to predict and derive the unknown outcome. With the advent of big data, many studies have suggested the use of predictive analytics in order to process and analyze big data. Nevertheless, they have been curbed by the limits of classical methods of predictive analysis in the case of large amounts of data. In fact, because of their volume, their nature (semi-structured or unstructured) and their variety, it is impossible to analyze big data efficiently via classical methods of predictive analysis. The authors attribute this weakness to the fact that predictive analysis algorithms do not allow the parallelization and distribution of calculation. In this paper, we propose to extend the predictive analysis algorithm Classification And Regression Trees (CART) in order to adapt it for big data analysis. The major changes to this algorithm are presented, and then a version of the extended algorithm is defined in order to make it applicable to a huge quantity of data.
Keywords: predictive analysis, big data, predictive analysis algorithms, CART algorithm
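The kind of change such an extension implies can be sketched as follows: the evaluation of candidate splits, the most expensive part of CART, is distributed over workers. The data are synthetic, and a production-scale extension would distribute this over a cluster framework rather than a single machine.

```python
# Sketch of parallelising CART's split search: candidate split evaluations
# (the costly part) are farmed out to worker processes. Synthetic data; a real
# big-data extension would distribute this across a cluster, not one machine.
import numpy as np
from concurrent.futures import ProcessPoolExecutor

rng = np.random.default_rng(0)
X = rng.normal(size=(10_000, 8))
y = (X[:, 0] + 0.5 * X[:, 3] > 0).astype(int)

def gini(labels: np.ndarray) -> float:
    if labels.size == 0:
        return 0.0
    p = np.bincount(labels, minlength=2) / labels.size
    return 1.0 - float(np.sum(p ** 2))

def best_split_for_feature(j: int):
    """Best (impurity, threshold) for feature j over a grid of quantile cut-points."""
    best_score, best_t = float("inf"), None
    for t in np.quantile(X[:, j], np.linspace(0.1, 0.9, 9)):
        left, right = y[X[:, j] <= t], y[X[:, j] > t]
        score = (left.size * gini(left) + right.size * gini(right)) / y.size
        if score < best_score:
            best_score, best_t = score, t
    return j, best_score, best_t

if __name__ == "__main__":
    with ProcessPoolExecutor() as pool:                      # parallel evaluation
        results = list(pool.map(best_split_for_feature, range(X.shape[1])))
    j, score, t = min(results, key=lambda r: r[1])
    print(f"Best root split: feature {j} <= {t:.3f} (weighted Gini {score:.3f})")
```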
Procedia PDF Downloads 142
27795 A Prototype of an Information and Communication Technology Based Intervention Tool for Children with Dyslexia
Authors: Rajlakshmi Guha, Sajjad Ansari, Shazia Nasreen, Hirak Banerjee, Jiaul Paik
Abstract:
Dyslexia is a neurocognitive disorder affecting around fifteen percent of the Indian population. The symptoms include difficulty in reading the alphabet, words, and sentences. The difficulty can occur at the phonemic or recognition level and may further affect lexical structures. Therapeutic intervention for dyslexic children after assessment is generally done by special educators and psychologists through one-on-one interaction. Considering the large number of children affected and the scarcity of experts, access to care is limited in India. Moreover, the unavailability of resources and of timely communication with caregivers adds to the problem of proper intervention. With the development of educational technology and its use in India, access to information and care has improved in such a large and diverse country. In this context, this paper proposes an ICT-enabled home-based intervention program for dyslexic children which would support the child and provide an interactive interface between experts, parents, and students. The paper discusses the details of the database design and system layout of the program. It also highlights the development of the different technical aids required to build personalized Android applications for the Indian dyslexic population. These technical aids include speech database creation for children, an automatic speech recognition system, serious game development, and color-coded fonts. The paper also describes the games developed to assist the dyslexic child in cognitive training, primarily for attention, working memory, and spatial reasoning. In addition, it discusses the specific elements of the interactive intervention tool that make it effective for home-based intervention of dyslexia.
Keywords: Android applications, cognitive training, dyslexia, intervention
Procedia PDF Downloads 292
27794 Millimeter-Wave Silicon Power Amplifiers for 5G Wireless Communications
Authors: Kyoungwoon Kim, Cuong Huynh, Cam Nguyen
Abstract:
Exploding demands for more data, faster data transmission speeds, less interference, more users, more wireless devices, and more reliable service, far exceeding those provided by the current mobile communication networks in the RF spectrum below 6 GHz, have led the wireless communication industry to focus on higher, previously unallocated spectrum. High frequencies in the RF spectrum near or within the millimeter-wave regime (around 28 GHz) are the logical solution to meet these demands. This high-frequency RF spectrum is increasingly important for wireless communications due to its large available bandwidths, which facilitate various applications requiring large-data, high-speed transmission of vast information, reaching up to multi-gigabit per second. It also resolves the traffic congestion problems of signals from many wireless devices operating in the current RF spectrum (below 6 GHz), hence handling more traffic. Consequently, the wireless communication industries are moving towards 5G (fifth generation) for next-generation communications such as mobile phones, autonomous vehicles, virtual reality, and the Internet of Things (IoT). The U.S. Federal Communications Commission (FCC) approved, on 14 July 2016, three frequency bands for 5G around 28, 37 and 39 GHz. We present silicon-based RFIC power amplifiers (PAs) for possible implementation in 5G wireless communications around 28, 37 and 39 GHz. The 16.5-28 GHz PA exhibits a measured gain of more than 34.5 dB and a very flat output power of 19.4±1.2 dBm across 16.5-28 GHz. The 25.5/37-GHz PA exhibits gains of 21.4 and 17 dB, and maximum output powers of 16 and 13 dBm at 25.5 and 37 GHz, respectively, in the single-band mode. In the dual-band mode, the maximum output power is 13 and 9.5 dBm at 25.5 and 37 GHz, respectively. The 10-19/23-29/33-40 GHz PA has maximum output powers of 15, 13.3, and 13.8 dBm at 15, 25, and 35 GHz, respectively, in the single-band mode. When this PA is operated in dual-band mode, it has maximum output powers of 11.4/8.2 dBm at 15/25 GHz, 13.3/3 dBm at 15/35 GHz, and 8.7/6.7 dBm at 25/35 GHz. In the tri-band mode, it exhibits 8.8/5.4/3.8 dBm maximum output power at 15/25/35 GHz. Acknowledgement: This paper was made possible by NPRP grant # 6-241-2-102 from the Qatar National Research Fund (a member of Qatar Foundation). The statements made herein are solely the responsibility of the authors.
Keywords: microwaves, millimeter waves, power amplifier, wireless communications
Procedia PDF Downloads 189
27793 Numerical Solution of Portfolio Selecting Semi-Infinite Problem
Authors: Alina Fedossova, Jose Jorge Sierra Molina
Abstract:
SIP problems are part of non-classical optimization. They are problems in which the number of variables is finite and the number of constraints is infinite; these are semi-infinite programming problems. Most algorithms for semi-infinite programming problems reduce the semi-infinite problem to a finite one and solve it by classical methods of linear or nonlinear programming. Typically, some of the constraints or the objective function are nonlinear, so the problem often involves nonlinear programming. An investment portfolio is a set of instruments used to reach the specific purposes of investors. The risk of the entire portfolio may be less than the risks of the individual investments in the portfolio. For example, we could invest M euros in N shares for a specified period. Let yi > 0 be the return at the end of the period on each unit of money invested in stock i (i = 1, ..., N). The logical goal here is to determine the amounts xi to be invested in stock i, i = 1, ..., N, such that we maximize the end-of-period value yᵀx, where x = (x1, ..., xN) and y = (y1, ..., yN). For us, the optimal portfolio means the best portfolio in terms of the risk-return ratio, i.e., the portfolio that meets the investor's goals and risk preferences. Therefore, investment goals and risk appetite are the factors that influence the choice of an appropriate portfolio of assets. The investment returns are uncertain; thus we have a semi-infinite programming problem. We solve a semi-infinite optimization problem of portfolio selection using outer approximation methods. This approach can be considered a development of the Eaves-Zangwill method, applying the multi-start technique in all of the iterations for the search of the relevant constraints' parameters. The stochastic outer approximation method, successfully applied previously to robotics problems, Chebyshev approximation problems, air pollution and others, is based on the optimality criteria of quasi-optimal functions. As a result, we obtain a mathematical model and the optimal investment portfolio when yields are not known from the beginning. Finally, we apply this algorithm to a specific case of a Colombian bank.
Keywords: outer approximation methods, portfolio problem, semi-infinite programming, numerical solution
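A highly simplified sketch of the outer-approximation idea for a max-min version of this portfolio problem is given below: the master LP maximizes the worst-case end-of-period value over the scenarios collected so far, and violated scenarios are added iteratively. A finite random scenario pool stands in for the continuous index set of the true semi-infinite problem, and all numbers are invented.

```python
# Simplified outer-approximation sketch for a max-min portfolio: maximise the
# worst-case end-of-period value t subject to y·x >= t over return scenarios y,
# adding a scenario only when it violates the current solution. The finite
# scenario pool stands in for the continuous index set of the true SIP.
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(1)
M, N = 1.0, 4                                          # budget and number of stocks
pool = 1.0 + rng.normal(0.05, 0.10, size=(200, N))     # candidate return scenarios

def solve_master(scenarios: np.ndarray):
    # Variables: x_1..x_N, t.  Maximise t  <=>  minimise -t.
    c = np.zeros(N + 1); c[-1] = -1.0
    A_ub = np.hstack([-scenarios, np.ones((len(scenarios), 1))])  # t - y·x <= 0
    b_ub = np.zeros(len(scenarios))
    A_eq = np.append(np.ones(N), 0.0).reshape(1, -1)              # sum(x) = M
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[M],
                  bounds=[(0, None)] * N + [(None, None)])
    return res.x[:N], res.x[-1]

active = pool[:2].copy()
for _ in range(20):
    x, t = solve_master(active)
    worst = pool[np.argmin(pool @ x)]          # most violated scenario in the pool
    if worst @ x >= t - 1e-9:                  # no violated scenario left: stop
        break
    active = np.vstack([active, worst])

print("weights:", np.round(x, 3), " worst-case value:", round(t, 4))
```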
Procedia PDF Downloads 309
27792 Studying the Effect of Heartfulness Meditation on Brain Activity
Authors: Norman Farb, Anirudh Kumar, Abdul Subhan, Pallavi Gupta, Jahnavi Mundluru, Abdul Subhan, Shankar Pathmakanthan
Abstract:
Long-term meditation practice is increasingly recognized for its health benefits. Among a diversity of contemplative traditions, Heartfulness meditation represents a quickly growing set of practices that is largely unstudied. Heartfulness is unique in that it is a meditation practice that focuses on the heart. It helps individuals connect to themselves and find inner peace while meditating. In order to deepen one's meditation on the heart, the element of yogic energy ('pranahuti') is used as an aid during meditation. The purpose of this study was to determine whether consistent EEG effects of Heartfulness meditation can be observed in sixty experienced Heartfulness meditators, each of whom attended 6 testing sessions. In each session, participants performed three conditions: a set of cognitive tasks, Heartfulness guided relaxation, and Heartfulness meditation. To measure EEG, the MUSE EEG headband (a product of Interaxon Inc.) was used. During the cognitive portion, participants were required to answer questions that tested their logical thinking (Cognitive Reflective Test) and creative thinking skills (Random Associative Test). The order of conditions was randomly counterbalanced across the six sessions. It was hypothesized that Heartfulness meditation would bring increased alpha (8-12 Hz) brain activity during meditation and better cognitive task scores in sessions where the tasks followed meditation. Results show that cognitive task scores were higher after meditation in both the CRT and the RAT, suggesting stronger right-brain and left-brain activation. Heartfulness meditation produces a significant decrease in brain activity (as indexed by higher levels of alpha) during the early stages of meditation. As the meditation progressed, a deep meditative state (as indexed by higher levels of delta) was observed until the end of the condition. This led to the conclusion that Heartfulness meditation produces a state that is clearly distinguishable from effortful problem solving.
Keywords: heartfulness meditation, neuroplasticity, brain activity, relaxation response
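A small sketch of how alpha and delta band power can be estimated from an EEG trace with Welch's method is given below; the signal is synthetic and merely illustrates the band-power computation, it is not Muse data from the study.

```python
# Sketch of extracting alpha (8-12 Hz) and delta (1-4 Hz) band power from an
# EEG trace with Welch's method; a synthetic signal stands in for real data.
import numpy as np
from scipy.signal import welch

fs = 256                                     # sampling rate (Hz)
t = np.arange(0, 60, 1 / fs)
eeg = (20 * np.sin(2 * np.pi * 10 * t)       # alpha component
       + 10 * np.sin(2 * np.pi * 2 * t)      # delta component
       + 5 * np.random.default_rng(0).normal(size=t.size))

def band_power(signal: np.ndarray, low: float, high: float) -> float:
    freqs, psd = welch(signal, fs=fs, nperseg=fs * 4)
    mask = (freqs >= low) & (freqs <= high)
    return float(np.trapz(psd[mask], freqs[mask]))   # integrate PSD over the band

print(f"alpha power: {band_power(eeg, 8, 12):.1f}")
print(f"delta power: {band_power(eeg, 1, 4):.1f}")
```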
Procedia PDF Downloads 337
27791 The Optimal Utilization of Centrally Located Land: The Case of the Bloemfontein Show Grounds
Authors: D. F. Coetzee, M. M. Campbell
Abstract:
The urban environment is constantly expanding, and the optimal use of centrally located land is important in terms of sustainable development. Bloemfontein has expanded, and this affects land-use functions. The purpose of the study is to examine the possible shift in location of the Bloemfontein show grounds in order to utilize the space of the grounds more effectively in the context of spatial planning. The research method used is qualitative case study research, with the case study on the Bloemfontein show grounds. The purposive sample consisted of planners who work or consult in the Bloemfontein area and who are registered with the South African Council for Planners (SACPLAN). Interviews consisting of qualitative open-ended questionnaires were used. When considering relocation, the social and economic aspects need to be taken into account. The findings also indicated a majority consensus that the property can be utilized more effectively in terms of mixed land use. The showground development trust compiled a master plan to ensure that the property is used to its full potential without relocating the showground function itself. This master plan can be seen as the next logical step for the showground property itself, and it is indeed an attempt to better utilize the land parcel without relocating the show function. The question arises whether the proposed master plan is a permanent solution or whether it is merely delaying the relocation of the core showground function to another location. For now, it is a sound solution, making the best out of the situation at hand and utilizing the property more effectively. If the show grounds were to be relocated, the researcher recommended a mixed-use development, in terms of an expansion of the commercial business/retail component together with a sport and recreation function. The show grounds in Bloemfontein are well positioned to capitalize on and meet the needs of the changing economy, while complementing the future economic growth strategies of the city, if the right plans are in place.
Keywords: centrally located land, spatial planning, show grounds, central business district
Procedia PDF Downloads 417
27790 Method to Assessing Aspect of Sustainable Development-Walkability
Authors: Amna Ali Nasser Al-Saadi, Riken Homma, Kazuhisa Iki
Abstract:
The need to generate objective communication between researchers, practitioners and policy makers is a top concern of sustainability. Despite the fact that many places have succeeded in achieving some aspects of sustainable urban development, there are no scientific facts to convince policy makers in the rest of the world to apply their guides and manuals. The question is: how to learn the lesson from each case study? How to distinguish between the potential criteria and the negative ones? And how to quantify their effects on future development? Walkability has been found to be a solution for achieving a healthy lifestyle as well as social, environmental and economic sustainability. Moreover, it is as complicated as every aspect of sustainable development. This research stands on a quantitative-comparative methodology in order to assess pedestrian-oriented development. Three Analyzed Areas (AAs) were selected. One site is located in Oman, which is hypothesized to be a motorized-oriented development, while two sites are in Japan, where the development is pedestrian friendly. The study used the Multi-Criteria Evaluation Method (MCEM). Initially, MCEM stands on the Analytic Hierarchy Process (AHP). The latter was structured into a main goal (walkability), objectives (functions and layout) and attributes (the urban form criteria). Secondly, GIS was used to evaluate the attributes in multi-criteria maps. Since each criterion has a different scale of measurement, all results were standardized by z-score and used to measure the correlations among criteria. A different scenario was generated from each AA. After that, MCEM (AHP-OWA) based on GIS measured the walkability score and determined the priority of criteria development in the non-walker-friendly environment. As a result, the z-score comparison of criteria presented a measurably distinguished orientation of development. This result has been used to prove that Oman is a motorized environment while Japan is walkable. It also defined the powerful criteria and the weak criteria regardless of the AA. This result has been used to generalize the priority for walkable development.
Keywords: walkability, sustainable development, multi-criteria evaluation method, GIS
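The standardization and weighted-overlay step described above can be sketched in a few lines: criterion values are converted to z-scores and combined with AHP-derived weights into a walkability score per analysed area. The criteria, values and weights below are invented for illustration.

```python
# Sketch of the MCEM aggregation step: criterion values for each analysed area
# are standardised to z-scores and combined with AHP-derived weights into a
# walkability score. Criteria, values, and weights are invented.
import numpy as np

criteria = ["intersection density", "land-use mix", "sidewalk coverage"]
weights = np.array([0.5, 0.3, 0.2])           # e.g. from an AHP pairwise comparison

# Rows: analysed areas (AA1 Oman, AA2 Japan, AA3 Japan); columns: criteria.
values = np.array([
    [40.0, 0.35, 0.50],
    [95.0, 0.70, 0.90],
    [80.0, 0.65, 0.85],
])

z = (values - values.mean(axis=0)) / values.std(axis=0)   # per-criterion z-score
walkability = z @ weights                                  # weighted overlay

for name, score in zip(["AA1 (Oman)", "AA2 (Japan)", "AA3 (Japan)"], walkability):
    print(f"{name}: walkability z-score = {score:+.2f}")
```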
Procedia PDF Downloads 453
27789 Window Opening Behavior in High-Density Housing Development in Subtropical Climate
Authors: Minjung Maing, Sibei Liu
Abstract:
This research discusses the results of a study of window opening behavior in large housing developments in the high-density megacity of Hong Kong. The methods used for the study involved field observations using photo documentation of the four cardinal elevations (north, south, east, and west) of two large housing developments in a very dense urban area of approximately 46,000 persons per square kilometer within the city of Hong Kong. The targeted housing developments (A and B) are large, lower-income public housing developments with a population of about 13,000 each. However, the mean income level in development A is about 40% higher than in development B, and home ownership is 60% in development A and 0% in development B. The surrounding amenities and the layout of the developments were also mapped to understand the activities available to the residents. The photo documentation of the elevations was taken from November 2016 to February 2018 to gather a full spectrum of seasons, both in the morning and in the afternoon. From the photographs, window opening behavior was measured by counting the number of windows opened as a percentage of all the windows on that façade. For each survey date, weather data (temperature, humidity and wind speed) were recorded from weather stations located in the same region. To further understand the behavior, simulation studies of the microclimate conditions of the housing developments were conducted using the software ENVI-met, a simulation tool widely used by researchers studying urban climate. Four major conclusions can be drawn from the data analysis and simulation results. Firstly, there is little change in the amount of window opening across the different seasons within a temperature range of 10 to 35 degrees Celsius. This means that people who tend to open their windows have consistent window opening behavior throughout the year and a high tolerance of indoor thermal conditions. Secondly, for all four elevations, the lower-income development B opened more windows (almost two times more units) than the higher-income development A, meaning window opening behavior had a strong correlation with income level. Thirdly, there is a lack of correlation between outdoor horizontal wind speed and window opening behavior, as changes in wind speed do not seem to affect the action of opening windows in most conditions. Similarly, vertical wind speed also cannot explain the window opening behavior of occupants. Fourthly, there is a slightly higher average of window opening on the south elevation than on the north elevation, which may be due to the south elevation being well shaded from high-angle sun during the summer while allowing heat into units from lower-angle sun during the winter season. These findings are important for providing insight into how to better design urban environments and indoor thermal environments for a liveable high-density city.
Keywords: high-density housing, subtropical climate, urban behavior, window opening
Procedia PDF Downloads 126
27788 A Study on the Urban Design Path of Historical Block in the Ancient City of Suzhou, China
Abstract:
In recent years, with the gradual change of the Chinese urban development mode from 'incremental development' to 'stock-based renewal', the urban design method of the 'grand scene' of the past can only cope with the planning and construction of incremental spaces such as new towns and new districts, while the problems involved in the renewal of stock land, such as the historic blocks of ancient cities, are more complex. 'Simplified' large-scale demolition and construction may damage the ancient city's texture and overall cultural atmosphere; thus, it is necessary to re-explore the urban design path for historical blocks in the conservation context of the ancient city. Through the study of the cultural context of the ancient city of Suzhou in China and the interpretation of its current characteristics, this paper explores the methods and paths for the renewal of historical and cultural blocks in the ancient city. It takes the No. 12 and No. 13 historical blocks in the ancient city of Suzhou as examples, coordinating the spatial layout and the landscape and shaping the regional characteristics to improve the quality of life in the ancient city. This paper analyses the idea of conservation and regeneration from the aspects of culture, life, business form, and transport. Guided by the planning concept of 'block repair and cultural infiltration', it puts forward the urban design path of 'conservation priority, activation and utilization, organic renewal and strengthening guidance', with a view to continuing the cultural context and stimulating the vitality of the ancient city, so as to realize the integration of history, modernity, space and culture. As a rare piece of research on urban design within the scope of the Suzhou ancient city, the paper aims to explore the concepts and methods of urban design for the historic blocks on the basis of the conservation of history, space, and culture, and provides a reference for other similar types of urban construction.
Keywords: historical block, Suzhou ancient city, stock-based renewal, urban design
Procedia PDF Downloads 146
27787 A Risk-Based Comprehensive Framework for the Assessment of the Security of Multi-Modal Transport Systems
Authors: Mireille Elhajj, Washington Ochieng, Deeph Chana
Abstract:
The challenges of the rapid growth in the demand for transport have traditionally been seen within the context of the problems of congestion, air quality, climate change, safety, and affordability. However, there are increasing threats, including those related to crime such as cyber-attacks, that threaten the security of the transport of people and goods. To the best of the authors' knowledge, this paper presents, for the first time, a comprehensive framework for the assessment of the current and future security issues of multi-modal transport systems. The approach proposed is based on a structured framework starting with a detailed specification of the transport asset map (transport system architecture), followed by the identification of vulnerabilities. The asset map and vulnerabilities are used to identify the various approaches for exploitation of the vulnerabilities, leading to the creation of a set of threat scenarios. The threat scenarios are then transformed into risks and their categories, and include insights for their mitigation. The consideration of the mitigation space is holistic and includes the formulation of appropriate policies and tactics and/or technical interventions. The quality of the framework is ensured through a structured and logical process that identifies the stakeholders, reviews the relevant documents including policies and identifies gaps, incorporates targeted surveys to augment the reviews, and uses subject matter experts for validation. The approach to categorising security risks is an extension of the current methods that are typically employed. Specifically, the partitioning of risks into either physical or cyber categories is too limited for developing mitigation policies and tactics/interventions for transport systems, where an interplay between physical and cyber processes is very often the norm. This interplay is rapidly taking on increasing significance for security as emerging cyber-physical technologies shape the future of all transport modes. Examples include: Connected Autonomous Vehicles (CAVs) in road transport; the European Rail Traffic Management System (ERTMS) in rail transport; the Automatic Identification System (AIS) in maritime transport; advanced Communications, Navigation and Surveillance (CNS) technologies in air transport; and the Internet of Things (IoT). The framework adopts a risk categorisation scheme that considers risks as falling within the following threat→impact relationships: Physical→Physical, Cyber→Cyber, Cyber→Physical, and Physical→Cyber. Thus the framework enables a more complete risk picture to be developed for today's transport systems and, more importantly, is readily extendable to account for emerging trends in the sector that will define future transport systems. The framework facilitates the audit and retro-fitting of mitigations in current transport operations and the analysis of security management options for the next generation of transport, enabling strategic aspirations such as systems with security-by-design and the co-design of safety and security to be achieved. An initial application of the framework to transport systems has shown that intra-modal consideration of security measures is sub-optimal and that a holistic and multi-modal approach, which also addresses the intersections/transition points of such networks, is required, as their vulnerability is high. This is in line with traveler-centric transport service provision, widely accepted as the future of mobility services.
In summary, a risk-based framework is proposed for use by the stakeholders to comprehensively and holistically assess the security of transport systems. It requires a detailed understanding of the transport architecture to enable a detailed vulnerability analysis to be undertaken, creates threat scenarios, and transforms them into risks which form the basis for the formulation of interventions.
Keywords: mitigations, risk, transport, security, vulnerabilities
Procedia PDF Downloads 166
27786 Direct Design of Steel Bridge Using Nonlinear Inelastic Analysis
Authors: Boo-Sung Koh, Seung-Eock Kim
Abstract:
In this paper, a direct design method using nonlinear inelastic analysis is suggested. This paper also compares the load carrying capacity obtained by nonlinear inelastic analysis with experimental results to verify the accuracy of the results. The allowable stress design results of a railroad through-plate-girder bridge and the safety factor of the nonlinear inelastic analysis were compared to examine the safety performance. As a result, the load safety factor for the nonlinear inelastic analysis was twice as high as the required safety factor under the allowable stress design standard specified in the civil engineering structure design standards for urban magnetic levitation railways, which further verified the advantages of the proposed direct design method.
Keywords: direct design, nonlinear inelastic analysis, residual stress, initial geometric imperfection
Procedia PDF Downloads 531
27785 Multidimensional Item Response Theory Models for Practical Application in Large Tests Designed to Measure Multiple Constructs
Authors: Maria Fernanda Ordoñez Martinez, Alvaro Mauricio Montenegro
Abstract:
This work presents a statistical methodology for measuring and establishing constructs in Latent Semantic Analysis. The approach uses the qualities of factor analysis for binary data with interpretations drawn from Item Response Theory. More precisely, we propose initially reducing dimensionality with a specific use of Principal Component Analysis for the linguistic data and then producing axes of groups made from a clustering analysis of the semantic data. This approach allows the user to give meaning to the previous clusters and to establish the real latent structure presented by the data. The methodology is applied to a set of real semantic data, presenting impressive results for coherence, speed and precision.
Keywords: semantic analysis, factorial analysis, dimension reduction, penalized logistic regression
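A minimal sketch of the two-stage pipeline (dimensionality reduction followed by clustering) is shown below on a simulated 0/1 response matrix; the simulation and the choice of two components and two clusters are illustrative only.

```python
# Sketch of the two-stage pipeline: PCA for dimensionality reduction of binary
# item responses, then clustering of the item loadings to form interpretable
# groups (constructs). The 0/1 response matrix is simulated.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
ability = rng.normal(size=(300, 2))                  # two latent traits
loadings = rng.choice([0.0, 1.5], size=(2, 40))      # each item loads on the traits
prob = 1.0 / (1.0 + np.exp(-(ability @ loadings - 0.5)))
responses = rng.binomial(1, prob)                    # 300 examinees x 40 binary items

pca = PCA(n_components=2).fit(responses)             # dimensionality reduction
item_loadings = pca.components_.T                    # one 2-D point per item
clusters = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(item_loadings)

print("items per construct cluster:", np.bincount(clusters))
```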
Procedia PDF Downloads 444
27784 On-Farm Research on Organic Fruits Production in the Eastern Thailand
Authors: Sali Chinsathit, Haruthai Kaenla
Abstract:
Organic agriculture has become a major policy theme for agricultural development in Thailand since October 2005. Organic farming is listed as an important national agenda item to promote safe food and national exports, and many government authorities have initiated projects and activities centered on organic farming promotion. Currently, Thailand has a market share of about US$32 million a year from exporting organic products such as rice, vegetables, tea, fruits and a few medicinal herbs. There is high potential in organic crop production, as the tropical environment promotes crop growth and there are leading farmers in organic farming. However, the organic sector is relatively small (0.2%) compared with the conventional agricultural area, since there are many factors affecting farmers' adoption of and success in organic farming. The objective of this project was to obtain organic production technology for at least 3 organic crops. The treatments and methods complied with the Thai Organic Standard and were mainly concerned with increasing plant biodiversity and improving the soil by using organic fertilizer and bio-extracts from fish, egg, plants and fruits. Biological control, plant extracts, and cultural practices were used to control insect pests and diseases of the 3 crops, including mangosteen (Garcinia mangostana L.), longkong (Aglaia dookoo Griff.) and banana (Musa (AA group)). The experiments were carried out at research centers of the Department of Agriculture and on farmers' farms in Rayong and Chanthaburi provinces from 2009 to 2013. We found that, in both locations, increasing plant biodiversity by intercropping mangosteen or longkong with banana, together with soil improvement with composts and fish bio-extract, could increase yield and farmers' income by 6,835 US$/ha/year. Farmers gained knowledge from these technologies to produce organic crops. The organic products were sold both domestically and internationally. The organic production technologies were also environmentally friendly and could be used as an alternative for farmers in Thailand.
Keywords: banana, longkong, mangosteen, organic farming
Procedia PDF Downloads 360