Search results for: victim-centered approach
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 13431


11991 A Large Dataset Imputation Approach Applied to Country Conflict Prediction Data

Authors: Benjamin Leiby, Darryl Ahner

Abstract:

This study demonstrates an alternative stochastic imputation approach for large datasets when preferred commercial packages struggle to iterate due to numerical problems. A large country conflict dataset motivates the search to impute missing values in data whose missingness is well above the common 20% threshold. The methodology capitalizes on correlation while using model residuals to provide the uncertainty in estimating unknown values. Examination of the methodology provides insight into choosing linear or nonlinear modeling terms. Static tolerances common in most packages are replaced with tailorable tolerances that exploit residuals to fit each data element. The methodology evaluation includes observing computation time, model fit, and the comparison of known values to values replaced through imputation. Overall, the country conflict dataset illustrates promise with modeling first-order interactions while presenting a need for further refinement that mimics predictive mean matching.
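
As a minimal illustration of the core idea (regress each incomplete variable on correlated complete ones and add residual-scaled noise rather than a deterministic fit), the following sketch imputes one column of a toy dataset. The column names and single-predictor setup are illustrative assumptions, not the authors' conflict data or code.

```python
import numpy as np
import pandas as pd

def stochastic_impute(df, target, predictors, rng=None):
    """Impute missing values in `target` by linear regression on `predictors`,
    adding noise drawn from the model residuals to preserve uncertainty."""
    rng = np.random.default_rng(rng)
    known = df.dropna(subset=[target] + predictors)
    X = np.column_stack([np.ones(len(known))] + [known[p] for p in predictors])
    y = known[target].to_numpy()
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)       # least-squares fit
    resid_sd = np.std(y - X @ beta, ddof=X.shape[1])   # residual spread -> imputation noise

    missing = df[target].isna() & df[predictors].notna().all(axis=1)
    Xm = np.column_stack([np.ones(missing.sum())] + [df.loc[missing, p] for p in predictors])
    df.loc[missing, target] = Xm @ beta + rng.normal(0, resid_sd, missing.sum())
    return df

# toy example with ~30% missingness in one column
rng = np.random.default_rng(0)
data = pd.DataFrame({"gdp": rng.normal(0, 1, 200)})
data["conflict_index"] = 0.8 * data["gdp"] + rng.normal(0, 0.5, 200)
data.loc[rng.choice(200, 60, replace=False), "conflict_index"] = np.nan
imputed = stochastic_impute(data.copy(), "conflict_index", ["gdp"], rng=1)
print(imputed["conflict_index"].isna().sum())  # 0 missing values remain
```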

Keywords: correlation, country conflict, imputation, stochastic regression

Procedia PDF Downloads 106
11990 [Keynote Talk]: Water Resources Vulnerability Assessment to Climate Change in a Semi-Arid Basin of South India

Authors: K. Shimola, M. Krishnaveni

Abstract:

This paper examines the vulnerability assessment of water resources in a semi-arid basin using a 4-step approach. The vulnerability assessment framework is developed to study water resources vulnerability and includes the creation of GIS-based vulnerability maps. These maps represent the spatial variability of the vulnerability index. This paper introduces the 4-step approach to assess vulnerability that incorporates a new set of indicators. The approach is demonstrated using a framework composed of precipitation data for the 1975–2010 period, temperature data for the 1965–2010 period, hydrological model outputs and the water resources GIS database. Vulnerability is assessed as a function of three components: exposure, sensitivity and adaptive capacity. The current water resources vulnerability is assessed using GIS-based spatio-temporal information. Rainfall coefficient of variation, monsoon onset and end date, rainy days, seasonality indices and temperature are selected for the criterion ‘exposure’. Water yield, groundwater recharge and evapotranspiration (ET) are selected for the criterion ‘sensitivity’. Type of irrigation and storage structures are selected for the criterion ‘adaptive capacity’. These indicators were mapped and integrated in a GIS environment using overlay analysis. The five sub-basins, namely Arjunanadhi, Kousiganadhi, Sindapalli-Uppodai and Vallampatti Odai, fall under the medium vulnerability profile, which indicates that the basin is under moderate water resources stress. The paper also explores the prioritization of sub-basin-wise adaptation strategies to climate change based on the vulnerability indices.
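
A hedged sketch of the overlay step: each criterion layer is min-max normalized and combined into a composite index in which exposure and sensitivity raise vulnerability while adaptive capacity lowers it. The equal weights, toy 3x3 "rasters" and classification breaks are assumptions for illustration, not the study's actual GIS layers or weighting.

```python
import numpy as np

def normalize(layer):
    """Min-max rescale an indicator raster to the 0-1 range."""
    return (layer - layer.min()) / (layer.max() - layer.min())

def vulnerability_index(exposure, sensitivity, adaptive_capacity, weights=(1/3, 1/3, 1/3)):
    """Composite vulnerability: exposure and sensitivity raise it,
    adaptive capacity lowers it (equal weights assumed here)."""
    w_e, w_s, w_a = weights
    return (w_e * normalize(exposure)
            + w_s * normalize(sensitivity)
            - w_a * normalize(adaptive_capacity))

# toy 3x3 "rasters" standing in for overlaid GIS indicator layers
rng = np.random.default_rng(42)
expo = rng.uniform(0, 1, (3, 3))      # e.g. rainfall coefficient of variation
sens = rng.uniform(0, 1, (3, 3))      # e.g. water yield deficit
adap = rng.uniform(0, 1, (3, 3))      # e.g. storage capacity
vi = vulnerability_index(expo, sens, adap)
print(np.digitize(vi, bins=[0.0, 0.33, 0.66]))  # crude low/medium/high classification
```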

Keywords: adaptive capacity, exposure, overlay analysis, sensitivity, vulnerability

Procedia PDF Downloads 301
11989 Stereo Motion Tracking

Authors: Yudhajit Datta, Hamsi Iyer, Jonathan Bandi, Ankit Sethia

Abstract:

Motion tracking and stereo vision are complicated, albeit well-understood, problems in computer vision. Existing software packages that combine the two approaches to perform stereo motion tracking typically employ complicated and computationally expensive procedures. The purpose of this study is to create a simple and effective solution capable of combining the two approaches. The study explores a strategy that combines two-dimensional motion tracking using a Kalman filter with depth detection of the object using stereo vision. In conventional approaches, objects in the scene of interest are observed using a single camera. For stereo motion tracking, however, the scene of interest is observed using video feeds from two calibrated cameras. Using two simultaneous measurements from the two cameras, the depth of the object from the plane containing the cameras is calculated. The approach attempts to capture the entire three-dimensional spatial information of each object in the scene and represent it through a software estimator object. At discrete intervals, the estimator tracks object motion in the plane parallel to the plane containing the cameras and updates the perpendicular distance of the object from that plane as depth. The ability to efficiently track the motion of objects in three-dimensional space using a simplified approach could prove to be an indispensable tool in a variety of surveillance scenarios. The approach may find application in high-security surveillance scenes such as the premises of bank vaults, prisons or other detention facilities, as well as in low-cost applications in supermarkets and car parking lots.
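
The two building blocks can be sketched compactly: depth from the stereo disparity of a matched point (pin-hole model, Z = f·B/d) and a constant-velocity Kalman filter for image-plane tracking. The focal length, baseline and synthetic measurements below are assumptions for illustration; the paper's MATLAB / Computer Vision System Toolbox implementation is not reproduced.

```python
import numpy as np

def depth_from_disparity(x_left, x_right, focal_px, baseline_m):
    """Perpendicular distance of a point from the camera plane (pin-hole stereo)."""
    disparity = x_left - x_right
    return focal_px * baseline_m / disparity

class ConstantVelocityKF:
    """Minimal 2D constant-velocity Kalman filter for image-plane tracking."""
    def __init__(self, dt=1.0, q=1e-2, r=1.0):
        self.x = np.zeros(4)                      # state [px, py, vx, vy]
        self.P = np.eye(4) * 500.0                # large initial uncertainty
        self.F = np.eye(4); self.F[0, 2] = self.F[1, 3] = dt
        self.H = np.eye(2, 4)                     # we observe position only
        self.Q = np.eye(4) * q
        self.R = np.eye(2) * r

    def step(self, z):
        # predict
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        # update with measurement z = (px, py)
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.x = self.x + K @ (np.asarray(z) - self.H @ self.x)
        self.P = (np.eye(4) - K @ self.H) @ self.P
        return self.x[:2]

kf = ConstantVelocityKF()
for t in range(5):
    u_left, u_right = 320 + 3 * t, 300 + 3 * t    # matched x-coordinates in both views
    tracked = kf.step((u_left, 240))
    depth = depth_from_disparity(u_left, u_right, focal_px=700, baseline_m=0.12)
    print(tracked, f"depth = {depth:.2f} m")
```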

Keywords: kalman filter, stereo vision, motion tracking, matlab, object tracking, camera calibration, computer vision system toolbox

Procedia PDF Downloads 309
11988 Dynamic Process Monitoring of an Ammonia Synthesis Fixed-Bed Reactor

Authors: Bothinah Altaf, Gary Montague, Elaine B. Martin

Abstract:

This study involves the modeling and monitoring of an ammonia synthesis fixed-bed reactor using partial least squares (PLS) and its variants. The process exhibits complex dynamic behavior due to the presence of heat recycling and feed quench. One limitation of a static PLS model in this situation is that it does not take account of the process dynamics, hence dynamic PLS was used. Although it showed superior performance to static PLS in terms of prediction, the monitoring scheme was inappropriate, hence adaptive PLS was considered. A limitation of adaptive PLS is that non-conforming observations also contribute to the model; therefore, a new adaptive approach, robust adaptive dynamic PLS, was developed. This approach updates a dynamic PLS model and is robust to non-representative data. The developed methodology showed a clear improvement over existing approaches in terms of the modeling of the reactor and the detection of faults.
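
A minimal sketch of the adaptive-monitoring idea, assuming synthetic data and a simple squared-prediction-error (SPE) control limit: the PLS model is refit on a sliding window, and observations that exceed the limit are flagged as possible faults and excluded from the update. This is a crude stand-in for the robust adaptation described above, not the authors' algorithm.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

def fit_window(Xw, yw, n_components=3):
    """Fit a PLS model on the current window and derive a 99% SPE control limit."""
    model = PLSRegression(n_components=n_components).fit(Xw, yw)
    resid = np.asarray(yw) - model.predict(Xw).ravel()
    return model, np.percentile(resid ** 2, 99)

rng = np.random.default_rng(0)
n, p, window = 300, 6, 100
X = rng.normal(size=(n, p))
y = X[:, :3].sum(axis=1) + 0.1 * rng.normal(size=n)    # synthetic process data
Xw, yw = X[:window].copy(), y[:window].copy()
model, limit = fit_window(Xw, yw)

for t in range(window, n):
    err = y[t] - model.predict(X[t:t + 1]).item()
    spe = err ** 2
    if spe > limit:
        print(f"t={t}: possible fault (SPE {spe:.3f} > limit {limit:.3f})")
    else:                                              # adapt only with conforming data
        Xw = np.vstack([Xw, X[t]])[-window:]
        yw = np.append(yw, y[t])[-window:]
        model, limit = fit_window(Xw, yw)
```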

Keywords: ammonia synthesis fixed-bed reactor, dynamic partial least squares modeling, recursive partial least squares, robust modeling

Procedia PDF Downloads 375
11987 Goal Orientation, Learning Strategies and Academic Performance in Adult Distance Learning

Authors: Ying Zhou, Jian-Hua Wang

Abstract:

Based upon self-determination theory and self-regulated learning theory, this study examined the predictiveness of goal orientation and self-regulated learning strategies on the academic achievement of adult students in distance learning. The results show a positive relation between goal orientation, the use of self-regulated strategies, and academic achievement. A significant positive indirect effect of mastery goal orientation through self-regulated learning strategies was also found. In addition, the results pointed to a positive indirect impact of performance-approach goal orientation on academic achievement; the effort regulation strategy fully mediated this relation. The theoretical and instructional implications are discussed. Interventions can be made to motivate students’ mastery or performance-approach goal orientation and help them manage their time and effort.
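
The mediation logic (an indirect effect estimated as the product of the goal-to-strategy path a and the strategy-to-achievement path b, with a near-zero direct path c' indicating full mediation) can be illustrated with a small regression sketch on simulated data; the variable names and effect sizes are assumptions, not the study's estimates.

```python
import numpy as np

def ols(X, y):
    """OLS coefficients with an intercept column prepended."""
    X = np.column_stack([np.ones(len(X)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta

rng = np.random.default_rng(1)
n = 400
goal = rng.normal(size=n)                        # performance-approach goal orientation
effort = 0.5 * goal + rng.normal(size=n)         # effort-regulation strategy (mediator)
grade = 0.6 * effort + rng.normal(size=n)        # achievement, fully mediated by construction

a = ols(goal.reshape(-1, 1), effort)[1]          # path a: goal -> mediator
coefs = ols(np.column_stack([effort, goal]), grade)
b, c_prime = coefs[1], coefs[2]                  # path b and direct effect c'
print(f"indirect effect a*b = {a * b:.3f}, direct effect c' = {c_prime:.3f}")
```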

Keywords: goal orientation, self-regulated strategies, achievement, adult distance students

Procedia PDF Downloads 259
11986 Metrology-Inspired Methods to Assess the Biases of Artificial Intelligence Systems

Authors: Belkacem Laimouche

Abstract:

With the field of artificial intelligence (AI) experiencing exponential growth, fueled by technological advancements that pave the way for increasingly innovative and promising applications, there is an escalating need to develop rigorous methods for assessing their performance in pursuit of transparency and equity. This article proposes a metrology-inspired statistical framework for evaluating bias and explainability in AI systems. Drawing from the principles of metrology, we propose a pioneering approach, using a concrete example, to evaluate the accuracy and precision of AI models, as well as to quantify the sources of measurement uncertainty that can lead to bias in their predictions. Furthermore, we explore a statistical approach for evaluating the explainability of AI systems based on their ability to provide interpretable and transparent explanations of their predictions.
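
A hedged sketch of one metrology-style check: treat the model's accuracy as a measurand, report it per group, and attach a bootstrap standard uncertainty to the between-group gap used as a simple bias indicator. The grouping variable, error rates and resampling scheme below are illustrative assumptions, not the framework proposed in the paper.

```python
import numpy as np

def group_bias_with_uncertainty(y_true, y_pred, group, n_boot=2000, seed=0):
    """Per-group accuracy, the gap between groups (a simple bias measure),
    and a bootstrap standard uncertainty for that gap."""
    rng = np.random.default_rng(seed)
    y_true, y_pred, group = map(np.asarray, (y_true, y_pred, group))
    groups = np.unique(group)
    acc = {g: np.mean(y_pred[group == g] == y_true[group == g]) for g in groups}
    gaps = []
    for _ in range(n_boot):                       # resample to quantify uncertainty
        idx = rng.integers(0, len(y_true), len(y_true))
        yt, yp, gr = y_true[idx], y_pred[idx], group[idx]
        accs = [np.mean(yp[gr == g] == yt[gr == g]) for g in groups]
        gaps.append(max(accs) - min(accs))
    return acc, np.mean(gaps), np.std(gaps, ddof=1)

# toy predictions for two demographic groups, with group B deliberately noisier
rng = np.random.default_rng(3)
truth = rng.integers(0, 2, 1000)
grp = rng.choice(["A", "B"], 1000)
pred = np.where((grp == "B") & (rng.random(1000) < 0.15), 1 - truth, truth)
acc, gap, u_gap = group_bias_with_uncertainty(truth, pred, grp)
print(acc, f"accuracy gap = {gap:.3f} +/- {u_gap:.3f}")
```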

Keywords: artificial intelligence, metrology, measurement uncertainty, prediction error, bias, machine learning algorithms, probabilistic models, interlaboratory comparison, data analysis, data reliability, measurement of bias impact on predictions, improvement of model accuracy and reliability

Procedia PDF Downloads 92
11985 Application of Fuzzy TOPSIS in Evaluating Green Transportation Options for Dhaka Megacity

Authors: Md. Moniruzzaman, Thirayoot Limanond

Abstract:

As its most visible indicator, the transport system of a city reflects how developed the city is. Dhaka megacity has a mixed composition of motorized and non-motorized modes of transport, and the number of vehicles is escalating over time. This poses associated environmental costs such as air pollution and noise, which degrade the quality of life in the city. Consequently, sustainable transport, and more specifically green transport from an environmental point of view, has become a prime choice for transport professionals in order to cope with the crisis. Currently, the city authority is planning to implement sustainable transport systems that could serve the pressing demand of the present and meet future needs effectively. This study focuses on the selection and evaluation of green transportation systems among potential alternatives on a priority basis. In this paper, fuzzy TOPSIS, a multi-criteria decision-making method, is presented to find the most prioritized alternative. In the first step, twenty-one individual criteria for sustainability assessment are selected. In the following step, experts provide linguistic ratings of the potential alternatives with respect to the selected criteria. The approach is used to generate aggregate scores for sustainability assessment and selection of the best alternative. In the third step, a sensitivity analysis is performed to understand the influence of criteria weights on the decision-making process. The key strength of the fuzzy TOPSIS approach is its practical applicability, generating good-quality solutions even under uncertainty.
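
A simplified sketch of the ranking core is given below: classic (crisp) TOPSIS applied to already-defuzzified expert ratings. The fuzzy layer of the method (triangular fuzzy numbers for the linguistic ratings and their aggregation) is omitted, and the alternatives, criteria, scores and weights are invented for illustration.

```python
import numpy as np

def topsis(decision_matrix, weights, benefit):
    """Classic TOPSIS ranking; benefit[j] is True if criterion j is better when larger."""
    D = np.asarray(decision_matrix, dtype=float)
    R = D / np.linalg.norm(D, axis=0)             # vector-normalize each criterion
    V = R * weights                               # apply criterion weights
    ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
    anti = np.where(benefit, V.min(axis=0), V.max(axis=0))
    d_pos = np.linalg.norm(V - ideal, axis=1)
    d_neg = np.linalg.norm(V - anti, axis=1)
    return d_neg / (d_pos + d_neg)                # closeness coefficient, higher is better

# three hypothetical options scored on four of the 21 criteria (defuzzified expert ratings)
alternatives = ["BRT", "Metro rail", "Non-motorised corridors"]
scores = [[7, 5, 8, 6],
          [9, 3, 9, 4],
          [6, 9, 5, 9]]
weights = np.array([0.3, 0.2, 0.3, 0.2])
benefit = np.array([True, True, True, True])      # e.g. capacity, affordability, emissions saved, equity
cc = topsis(scores, weights, benefit)
for alt, c in sorted(zip(alternatives, cc), key=lambda t: -t[1]):
    print(f"{alt}: closeness {c:.3f}")
```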

Keywords: green transport, multi-criteria decision approach, urban transportation system, sustainability assessment, fuzzy theory, uncertainty

Procedia PDF Downloads 274
11984 A Comparative Analysis of Islamic Bank Efficiency in the United Kingdom and Indonesia during the Eurozone Crisis Using Data Envelopment Analysis

Authors: Nisful Laila, Fatin Fadhilah Hasib, Puji Sucia Sukmaningrum, Achsania Hendratmi

Abstract:

The purpose of this study is to determine and compare the efficiency levels of Islamic banks in Indonesia and the United Kingdom during the eurozone sovereign debt crisis. This study uses a quantitative non-parametric approach, Data Envelopment Analysis (DEA) under the variable returns to scale (VRS) assumption, together with the Mann-Whitney U-test as a statistical tool. The sample comprises 11 Islamic banks in Indonesia and 4 Islamic banks in the United Kingdom. This research uses the intermediation approach: the input variables are total deposits, assets, and labour costs, and the output variables are financing and profit/loss. This study shows that the efficiency of Islamic banks in Indonesia and the United Kingdom varied and fluctuated during the observation period. There is no significant difference in the efficiency performance of Islamic banks in Indonesia and the United Kingdom.
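
The statistical comparison step can be sketched directly: given DEA (VRS) efficiency scores for the two groups of banks, a Mann-Whitney U-test checks whether their distributions differ. The scores below are made up for illustration; they are not the study's DEA results.

```python
import numpy as np
from scipy.stats import mannwhitneyu

# hypothetical DEA (VRS) efficiency scores per bank; 1.0 means fully efficient
indonesia_eff = np.array([0.81, 0.92, 1.00, 0.77, 0.88, 0.95, 0.70, 0.83, 0.90, 1.00, 0.86])
uk_eff = np.array([0.79, 1.00, 0.85, 0.91])

u_stat, p_value = mannwhitneyu(indonesia_eff, uk_eff, alternative="two-sided")
print(f"U = {u_stat:.1f}, p = {p_value:.3f}")
if p_value >= 0.05:
    print("No significant difference in efficiency between the two groups.")
```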

Keywords: data envelopment analysis, efficiency, eurozone crisis, islamic bank

Procedia PDF Downloads 313
11983 Intelligent Algorithm-Based Tool-Path Planning and Optimization for Additive Manufacturing

Authors: Efrain Rodriguez, Sergio Pertuz, Cristhian Riano

Abstract:

Tool-path generation is an essential step in FFF (Fused Filament Fabrication)-based Additive Manufacturing (AM) process planning. In the manufacture of a mechanical part using additive processes, high resource consumption and prolonged production times are inherent drawbacks, mainly due to non-optimized tool-path generation. In this work, we propose a heuristic-search, intelligent-algorithm-based approach for optimized tool-path generation for FFF-based AM. The main benefit of this approach is a significant reduction of travel moves, i.e., moves the AM machine performs without any material extrusion. The optimization method used reduces the number of travels without extrusion in comparison with commercial software such as Slic3r or CuraEngine, which means a reduction of production time.
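
As a toy illustration of the objective, the sketch below reorders a layer's extrusion segments with a greedy nearest-neighbour heuristic so that non-extrusion travel between segments shrinks. The segment coordinates are invented, and the heuristic is a generic stand-in rather than the intelligent search algorithm proposed in the paper.

```python
import math

def total_travel(segments, order):
    """Non-extrusion travel: distance from the end of one segment to the start of the next."""
    dist = 0.0
    for a, b in zip(order, order[1:]):
        x1, y1 = segments[a][1]   # end point of the previous segment
        x2, y2 = segments[b][0]   # start point of the next segment
        dist += math.hypot(x2 - x1, y2 - y1)
    return dist

def greedy_order(segments):
    """Nearest-neighbour ordering of extrusion segments to shorten travel moves."""
    remaining = set(range(1, len(segments)))
    order, current = [0], segments[0][1]
    while remaining:
        nxt = min(remaining, key=lambda i: math.dist(current, segments[i][0]))
        order.append(nxt)
        current = segments[nxt][1]
        remaining.remove(nxt)
    return order

# each segment is (start, end) of a continuous extrusion path on one layer (mm)
segments = [((0, 0), (10, 0)), ((40, 5), (40, 15)), ((12, 1), (12, 9)), ((11, 10), (30, 10))]
naive = list(range(len(segments)))
opt = greedy_order(segments)
print(f"travel as sliced:  {total_travel(segments, naive):.1f} mm")
print(f"travel reordered:  {total_travel(segments, opt):.1f} mm (order {opt})")
```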

Keywords: additive manufacturing, tool-path optimization, fused filament fabrication, process planning

Procedia PDF Downloads 429
11982 A System Dynamic Based DSS for Ecological Urban Management in Alexandria, Egypt

Authors: Mona M. Salem, Khaled S. Al-Hagla, Hany M. Ayad

Abstract:

The concept of urban metabolism has increasingly been employed in a diverse range of disciplines as a means to analyze and theorize the city. Urban ecology has a particular focus on the implications of applying the metabolism concept to the urban realm. This approach has been developed by a few researchers, though it has rarely, if ever, been used in policy development for city planning. The aim of this research is to use ecologically informed urban planning interventions to increase the sustainability of urban metabolism, with a special focus on land stock as one of the most important city resources, by developing a system dynamics based DSS. This model identifies two critical management strategy variables for the Strategic Urban Plan of Alexandria (SUP 2032). As a result, this comprehensive and precise quantitative approach is needed to monitor, measure, evaluate and observe dynamic urban changes, working as a decision support system (DSS) for policy making.
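
A minimal stock-and-flow sketch conveys the system dynamics mechanics: land stock is depleted by an outflow proportional to consumption and partially replenished, and a policy factor acts as a management-strategy lever for scenario comparison. All rates and values are invented; this is not the Alexandria model.

```python
import numpy as np

def simulate_land_stock(years=20, initial_stock=100.0, consumption_rate=0.04,
                        reclamation=0.5, policy_factor=1.0):
    """Euler-integrated stock-and-flow model: land stock is depleted by urban expansion
    and partially replenished by reclamation; policy_factor scales consumption to
    mimic a management-strategy lever."""
    stock = np.empty(years + 1)
    stock[0] = initial_stock                                     # km^2 of developable land (illustrative)
    for t in range(years):
        outflow = policy_factor * consumption_rate * stock[t]    # land consumed this year
        stock[t + 1] = max(stock[t] - outflow + reclamation, 0.0)
    return stock

business_as_usual = simulate_land_stock(policy_factor=1.0)
densification = simulate_land_stock(policy_factor=0.6)           # stricter land-use policy scenario
print(f"land stock after 20 years: BAU {business_as_usual[-1]:.1f}, policy {densification[-1]:.1f}")
```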

Keywords: ecology, land resource, LULCC, management, metabolism, model, scenarios, system dynamics, urban development

Procedia PDF Downloads 365
11981 Knowledge Representation and Inconsistency Reasoning of Class Diagram Maintenance in Big Data

Authors: Chi-Lun Liu

Abstract:

Requirements modeling and analysis are important to successful information systems maintenance. Unified Modeling Language (UML) class diagrams are useful standards for modeling information systems. To the best of our knowledge, there is a lack of a systems development methodology described by the organism metaphor, whose core concept is adaptation. Using knowledge representation and reasoning approaches and ontologies to accommodate new requirements has emerged in recent years. This paper proposes an organic methodology based on constructivism theory. This methodology is a knowledge representation and reasoning approach to analyze new requirements in class diagram maintenance. The process and rules in the proposed methodology automatically analyze inconsistencies in the class diagram. In the big data era, developing an automatic tool based on the proposed methodology to analyze large amounts of class diagram data is an important future research topic.

Keywords: knowledge representation, reasoning, ontology, class diagram, software engineering

Procedia PDF Downloads 219
11980 A Virtual Reality Cybersecurity Training Knowledge-Based Ontology

Authors: Shaila Rana, Wasim Alhamdani

Abstract:

Effective cybersecurity learning relies on an engaging, interactive, and entertaining activity that fosters positive learning outcomes. VR cybersecurity training may promote these variables. However, to the authors' best knowledge, a methodological approach and framework have not yet been created to allow trainers and educators to employ VR cybersecurity training methods to promote positive learning outcomes. Thus, this paper aims to create an approach that cybersecurity trainers can follow to create a VR cybersecurity training module. This methodology utilizes concepts from other cybersecurity training frameworks, such as NICE and CyTrONE. Other cybersecurity training frameworks do not incorporate the use of VR, and VR training poses unique challenges that cannot be addressed by them. Subsequently, this ontology utilizes concepts unique to developing VR training to create a relevant methodology for creating VR cybersecurity training modules. The outcome of this research is a methodology that is relevant and useful for designing VR cybersecurity training modules.

Keywords: virtual reality cybersecurity training, VR cybersecurity training, traditional cybersecurity training, ontology

Procedia PDF Downloads 269
11979 An Evaluation of Barriers to Implement Reverse Logistics: A Case Study of Indian Fastener Industry

Authors: D. Garg, S. Luthra, A. Haleem

Abstract:

Reverse logistics (RL) is a systematic procedure that helps in reducing environmental hazards and maintaining business sustainability for industries. Industries in India are now opting for the adoption of RL techniques in business. However, RL practices are not popular in Indian industries because of many barriers to their successful implementation. Therefore, the need arises to identify and evaluate the barriers to implementing RL practices from an Indian industry perspective. A literature review approach and a case study approach have been adopted to identify relevant barriers to implementing RL practices. Further, the fuzzy Decision Making Trial and Evaluation Laboratory (DEMATEL) methodology has been used to evaluate causal relationships among the barriers to implementing RL practices. Seven of the ten barriers have been categorized into the cause group and the remaining three into the effect group. This research will help Indian industries to manage these barriers towards effectively implementing RL practices.
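
The DEMATEL core that produces the cause/effect split can be sketched as follows, assuming expert ratings have already been defuzzified into an averaged direct-relation matrix: normalize it, compute the total-relation matrix T = D (I - D)^-1, and classify barriers by the sign of the net influence r - c. The four barriers and ratings shown are invented, not the study's ten barriers.

```python
import numpy as np

def dematel(direct):
    """Crisp DEMATEL core: normalize the direct-relation matrix, compute the
    total-relation matrix T = D (I - D)^-1 and return prominence (r + c) and
    net influence (r - c). A fuzzy front end would defuzzify expert triangular
    ratings into `direct` first."""
    A = np.asarray(direct, dtype=float)
    D = A / A.sum(axis=1).max()                   # linear-scale normalization
    T = D @ np.linalg.inv(np.eye(len(A)) - D)     # total (direct + indirect) influence
    r, c = T.sum(axis=1), T.sum(axis=0)
    return T, r + c, r - c

# illustrative 4x4 averaged influence matrix among RL barriers (0 = none ... 4 = very high)
barriers = ["Lack of top-management support", "High investment cost",
            "Lack of skilled staff", "Customer perception"]
direct = [[0, 3, 3, 2],
          [2, 0, 2, 1],
          [1, 1, 0, 2],
          [1, 1, 1, 0]]
_, prominence, relation = dematel(direct)
for name, p, rel in zip(barriers, prominence, relation):
    group = "cause" if rel > 0 else "effect"
    print(f"{name}: prominence {p:.2f}, net influence {rel:+.2f} -> {group} group")
```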

Keywords: barriers, decision making trial and evaluation laboratory (DEMATEL), fuzzy set theory, Indian industries, reverse logistics (RL)

Procedia PDF Downloads 310
11978 One-Step Synthesis of Titanium Dioxide Porous Microspheres by Picosecond Pulsed Laser Welding

Authors: Huiwu Yu, Xiangyou Li, Xiaoyan Zeng

Abstract:

Porous spheres have been widely used in many fields due to their attractive features. In this work, an approach for fabricating porous spheres from nanoparticles is presented, in which the nanoparticles are welded together into microspheres simply by irradiating them in a liquid medium with a picosecond laser. As an example, anatase titanium dioxide was chosen as a typical material on account of its metastability. The structure and morphologies of the products were characterised by X-ray diffraction (XRD), scanning electron microscopy (SEM), Raman spectroscopy, and high-resolution transmission electron microscopy (HRTEM), respectively. The results showed that anatase titanium dioxide microspheres (2–10 μm) with macropores (10–100 nm) were prepared from anatase titanium dioxide nanoparticles (10–100 nm). The formation process of the polycrystalline anatase titanium dioxide microspheres was investigated for different liquid media and input laser fluences. Thus, this facile laser irradiation approach might provide a way to fabricate porous microspheres without phase transition.

Keywords: titanium dioxide, porous microspheres, picosecond laser, nano-welding

Procedia PDF Downloads 288
11977 Application of a Universal Distortion Correction Method in Stereo-Based Digital Image Correlation Measurement

Authors: Hu Zhenxing, Gao Jianxin

Abstract:

Stereo-based digital image correlation (also referred to as three-dimensional (3D) digital image correlation (DIC)) is a technique for both 3D shape and surface deformation measurement of a component, which has found increasing applications in academia and industry. The accuracy of the reconstructed coordinates depends on many factors such as the configuration of the setup, stereo-matching, distortion, etc. Most of these factors have been investigated in the literature. For instance, the configuration of a binocular vision system determines the systematic errors. The stereo-matching errors depend on the speckle quality and the matching algorithm, which can only be controlled within a limited range. The distortion is non-linear, particularly in a complex image acquisition system, so the distortion correction should be carefully considered. Moreover, the distortion function is difficult to formulate in a complex image acquisition system using conventional models in cases where microscopes and other complex lenses are involved. The errors of the distortion correction propagate to the reconstructed 3D coordinates. To address the problem, an accurate mapping method based on 2D B-spline functions is proposed in this study. The mapping functions are used to convert the distorted coordinates onto an ideal plane without distortions. This approach is suitable for any image acquisition distortion model. It is used as a prior process to convert distorted coordinates to their ideal positions, which enables the camera to conform to the pin-hole model. A procedure of this approach is presented for stereo-based DIC. Using 3D speckle image generation, numerical simulations were carried out to compare the accuracy of both the conventional method and the proposed approach.
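
A hedged sketch of the mapping idea using SciPy's bivariate smoothing splines: fit one 2D B-spline per coordinate from observed (distorted) calibration-target positions to their ideal positions, then evaluate the splines to correct new measurements. The synthetic radial distortion and grid below are assumptions; the paper's own spline formulation and calibration data are not reproduced.

```python
import numpy as np
from scipy.interpolate import SmoothBivariateSpline

# synthetic calibration grid: ideal target points and their radially distorted observations
xi, yi = np.meshgrid(np.linspace(-1, 1, 15), np.linspace(-1, 1, 15))
xi, yi = xi.ravel(), yi.ravel()
r2 = xi**2 + yi**2
xd, yd = xi * (1 + 0.1 * r2), yi * (1 + 0.1 * r2)     # observed (distorted) coordinates

# one 2D B-spline per coordinate maps distorted positions back to the ideal plane
map_x = SmoothBivariateSpline(xd, yd, xi, kx=3, ky=3, s=1e-6)
map_y = SmoothBivariateSpline(xd, yd, yi, kx=3, ky=3, s=1e-6)

# correct a new distorted measurement before stereo matching / triangulation
x_corr, y_corr = map_x.ev(0.55, -0.33), map_y.ev(0.55, -0.33)
residual = np.max(np.hypot(map_x.ev(xd, yd) - xi, map_y.ev(xd, yd) - yi))
print(f"corrected point: ({x_corr:.4f}, {y_corr:.4f}), max fit residual {residual:.2e}")
```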

Keywords: distortion, stereo-based digital image correlation, b-spline, 3D, 2D

Procedia PDF Downloads 484
11976 Coverage Probability Analysis of WiMAX Network under Additive White Gaussian Noise and Predicted Empirical Path Loss Model

Authors: Chaudhuri Manoj Kumar Swain, Susmita Das

Abstract:

This paper explores a detailed procedure for predicting a path loss (PL) model and its application in estimating the coverage probability of a WiMAX network. For this, a hybrid approach is followed in predicting an empirical PL model of a 2.65 GHz WiMAX network deployed in a suburban environment. Data collection, statistical analysis, and regression analysis are the phases of operation incorporated in this approach, and the importance of each phase is discussed. The procedure for collecting data such as the received signal strength indicator (RSSI) through an experimental setup is demonstrated. From the collected data set, empirical PL and RSSI models are predicted with regression techniques. Furthermore, with the aid of the predicted PL model, essential parameters such as the PL exponent and the coverage probability of the network are evaluated. This research work may significantly assist in the deployment and optimisation of cellular networks.
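
The regression step has a standard form: fit the log-distance model PL(d) = PL(d0) + 10·n·log10(d/d0) to the measurements, read the PL exponent n from the slope, and use the residual spread as the shadowing term when computing coverage probability at a given distance. The synthetic measurements, transmit power and receiver sensitivity below are assumptions, not the paper's drive-test data.

```python
import numpy as np
from scipy import stats

# synthetic drive-test data: distances (m) and measured path loss (dB) at 2.65 GHz
rng = np.random.default_rng(7)
d = rng.uniform(50, 2000, 300)
true_n, pl_d0, sigma = 3.4, 46.0, 6.0
pl_meas = pl_d0 + 10 * true_n * np.log10(d / 1.0) + rng.normal(0, sigma, d.size)

# empirical model PL(d) = PL(d0) + 10 n log10(d/d0): linear regression on log10(d)
slope, intercept, *_ = stats.linregress(np.log10(d), pl_meas)
n_hat = slope / 10
resid_sigma = np.std(pl_meas - (intercept + slope * np.log10(d)), ddof=2)

# coverage probability at a chosen distance: P[received power > receiver sensitivity]
tx_power_dbm, sensitivity_dbm, d_edge = 43.0, -100.0, 500.0
mean_rx = tx_power_dbm - (intercept + slope * np.log10(d_edge))
coverage = stats.norm.sf(sensitivity_dbm, loc=mean_rx, scale=resid_sigma)
print(f"PL exponent n = {n_hat:.2f}, shadowing sigma = {resid_sigma:.1f} dB, "
      f"coverage probability at {d_edge:.0f} m = {coverage:.2f}")
```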

Keywords: WiMAX, RSSI, path loss, coverage probability, regression analysis

Procedia PDF Downloads 155
11975 A Parallel Approach for 3D-Variational Data Assimilation on GPUs in Ocean Circulation Models

Authors: Rossella Arcucci, Luisa D'Amore, Simone Celestino, Giuseppe Scotti, Giuliano Laccetti

Abstract:

This work is the first step in a rather wide research activity, in collaboration with the Euro Mediterranean Center for Climate Changes, aimed at introducing scalable approaches in ocean circulation models. We discuss the design and implementation of a parallel algorithm for solving the variational data assimilation (DA) problem on Graphics Processing Units (GPUs). The algorithm is based on the fully scalable 3DVar DA model, previously proposed by the authors, which uses a domain decomposition approach (we refer to this model as the DD-DA model). We proceed with an incremental porting process consisting of three distinct stages: requirements and source code analysis, incremental development of CUDA kernels, and testing and optimization. Experiments confirm the theoretical performance analysis based on the so-called scale-up factor, demonstrating that the DD-DA model can be suitably mapped onto GPU architectures.
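
The kernel being parallelized is the minimization of the 3DVar cost function J(x) = 1/2 (x - xb)' B^-1 (x - xb) + 1/2 (Hx - y)' R^-1 (Hx - y). The CPU-only sketch below sets up a small toy problem and minimizes J with L-BFGS; the domain decomposition and CUDA porting discussed in the paper are not reproduced, and all covariances and operators are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize

# small 3DVar toy problem: a background state and observations of part of the state
n, m = 50, 20
rng = np.random.default_rng(0)
x_true = np.sin(np.linspace(0, 2 * np.pi, n))
B_inv = np.eye(n) / 0.5**2                    # background-error covariance (diagonal here)
R_inv = np.eye(m) / 0.1**2                    # observation-error covariance
H = np.zeros((m, n)); H[np.arange(m), rng.choice(n, m, replace=False)] = 1.0
x_b = x_true + rng.normal(0, 0.5, n)          # background (first guess)
y = H @ x_true + rng.normal(0, 0.1, m)        # observations

def cost_and_grad(x):
    """3DVar cost and its gradient."""
    db, do = x - x_b, H @ x - y
    J = 0.5 * db @ B_inv @ db + 0.5 * do @ R_inv @ do
    grad = B_inv @ db + H.T @ R_inv @ do
    return J, grad

res = minimize(cost_and_grad, x_b, jac=True, method="L-BFGS-B")
print(f"RMSE background: {np.sqrt(np.mean((x_b - x_true)**2)):.3f}, "
      f"RMSE analysis: {np.sqrt(np.mean((res.x - x_true)**2)):.3f}")
```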

Keywords: data assimilation, GPU architectures, ocean models, parallel algorithm

Procedia PDF Downloads 396
11974 DEMs: A Multivariate Comparison Approach

Authors: Juan Francisco Reinoso Gordo, Francisco Javier Ariza-López, José Rodríguez Avi, Domingo Barrera Rosillo

Abstract:

The evaluation of the quality of a data product is based on the comparison of the product with a reference of greater accuracy. In the case of DEM data products, quality assessment usually focuses on positional accuracy, and few studies consider other terrain characteristics, such as slope and orientation. The proposal made here consists of evaluating the similarity of two DEMs (a product and a reference) through the joint analysis of the distribution functions of the variables of interest, for example, elevations, slopes and orientations. This is a multivariate approach that focuses on distribution functions, not on single parameters such as mean values or dispersions (e.g. root mean squared error or variance), and is considered to be a more holistic approach. The use of the Kolmogorov-Smirnov test is proposed due to its non-parametric nature, since the distributions of the variables of interest cannot always be adequately modeled by parametric models (e.g. the normal distribution model). In addition, its application to the multivariate case is carried out jointly by means of a single test on the convolution of the distribution functions of the variables considered, which avoids the use of corrections such as Bonferroni when several statistical hypothesis tests are carried out together. In this work, two DEM products have been considered: DEM02, with a resolution of 2x2 meters, and DEM05, with a resolution of 5x5 meters, both generated by the National Geographic Institute of Spain. DEM02 is considered the reference and DEM05 the product to be evaluated. In addition, the slope and aspect derived models have been calculated by GIS operations on the two DEM datasets. Through sample simulation processes, the adequate behavior of the Kolmogorov-Smirnov statistical test has been verified when the null hypothesis is true, which allows calibrating the value of the statistic for the desired significance level (e.g. 5%). Once the process has been calibrated, the same process can be applied to compare the similarity of different DEM data sets (e.g. DEM05 versus DEM02). In summary, an innovative alternative for the comparison of DEM data sets, based on a multivariate non-parametric perspective, has been proposed by means of a single Kolmogorov-Smirnov test. This new approach could be extended to other DEM features of interest (e.g. curvature) and to more than three variables.
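
For a single variable, the comparison reduces to a two-sample Kolmogorov-Smirnov test on co-registered samples drawn from the product and the reference; the joint single-test construction over the convolved distributions described above is not reproduced here. The gamma-distributed slope samples below are stand-ins, not the DEM02/DEM05 data.

```python
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(0)

# stand-ins for co-registered samples of one variable of interest (e.g. slope in degrees)
slope_reference = rng.gamma(shape=2.0, scale=5.0, size=20000)   # playing the role of DEM02
slope_product = rng.gamma(shape=2.0, scale=5.3, size=20000)     # playing the role of DEM05

stat, p_value = ks_2samp(slope_reference, slope_product)
print(f"KS statistic D = {stat:.4f}, p = {p_value:.3g}")
print("distributions differ at the 5% level" if p_value < 0.05 else "no evidence of difference")
```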

Keywords: data quality, DEM, kolmogorov-smirnov test, multivariate DEM comparison

Procedia PDF Downloads 99
11973 Islam, Tolerance and Anti-Terrorism: A Critical Assessment with Reference to the Royal 'Amman Message'

Authors: Adnan M. Al Assaf

Abstract:

This research project aims to assess methods of enhancing tolerant thinking and behavior in Muslim societies, in addition to spreading an anti-terrorist approach in their communities. A critical assessment of the major Islamic texts in question is the chosen means of persuasion, as Muslims adopt these sources as the authentic references for their lives and cultures. Moreover, this research devotes special attention to the analysis of the royal ‘Amman Message’ as a contemporary Islamic approach to enhancing tolerance and anti-terrorism from an Islamic perspective. The paper includes a study of the related concepts, texts and practical applications, with some reference to the history of Islam in human interaction, acceptance of others, mercy towards minorities and the protection of human rights. Furthermore, it assesses practical methods of enhancing tolerance and minimizing terrorist thinking and behavior in the view of the Amman Message as well.

Keywords: Islam, tolerance, anti-terrorism, coexistence, Amman Message

Procedia PDF Downloads 442
11972 Robust and Dedicated Hybrid Cloud Approach for Secure Authorized Deduplication

Authors: Aishwarya Shekhar, Himanshu Sharma

Abstract:

Data deduplication is one of the important data compression techniques for eliminating duplicate copies of repeating data, and it has been widely used in cloud storage to reduce the amount of storage space and save bandwidth. In this process, duplicate data is expunged, leaving only a single instance of the data to be stored, though indexing of all the data is still maintained. Data deduplication is an approach for minimizing the amount of storage space an organization requires to retain its data. In most companies, the storage systems carry identical copies of numerous pieces of data. Deduplication eliminates these additional copies by saving just one copy of the data and replacing the other copies with pointers that lead back to the primary copy. To avoid this duplication of data and to preserve confidentiality in the cloud, we apply the concept of a hybrid cloud, which is a fusion of at least one public and one private cloud. As a proof of concept, we implement Java code which provides security as well as removing duplicated data from the cloud.
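
The single-instance idea can be sketched as a content-addressed block store: blocks are keyed by their SHA-256 digest, duplicates are stored once, and files keep only lists of digests acting as pointers. This Python sketch is a conceptual stand-in for the authors' Java/hybrid-cloud implementation; authorization checks and convergent encryption are omitted.

```python
import hashlib

class DedupStore:
    """Single-instance storage: each unique block is kept once, duplicates become pointers."""
    def __init__(self):
        self.blocks = {}       # sha256 digest -> block bytes (the single stored instance)
        self.index = {}        # file name -> list of digests (the "pointers")

    def put(self, name, data, block_size=4096):
        digests = []
        for i in range(0, len(data), block_size):
            block = data[i:i + block_size]
            digest = hashlib.sha256(block).hexdigest()
            self.blocks.setdefault(digest, block)   # store the block only if unseen
            digests.append(digest)
        self.index[name] = digests

    def get(self, name):
        return b"".join(self.blocks[d] for d in self.index[name])

store = DedupStore()
payload = b"quarterly report " * 1000
store.put("report_v1.txt", payload)
store.put("report_v2.txt", payload + b"minor edit")   # nearly identical upload
assert store.get("report_v1.txt") == payload
print(f"logical bytes: {2 * len(payload) + 10}, unique blocks stored: {len(store.blocks)}")
```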

Keywords: confidentiality, deduplication, data compression, hybridity of cloud

Procedia PDF Downloads 367
11971 Project Work with Design Thinking and Blended Learning: A Practical Report from Teaching in Higher Education

Authors: C. Vogeler

Abstract:

Change processes such as individualization and digitalization have an impact on higher education. Graduates are expected to cooperate in creative work processes in their professional life, and during their studies they need to be prepared accordingly. This includes modern learning scenarios that integrate the benefits of digital media. Therefore, design thinking and blended learning have been combined in the project-based seminar conception introduced here. The presented seminar conception has been realized and evaluated with students of information sciences since September 2017. Within the seminar, the students learn to work on a project and apply the methods in a problem-based learning scenario. The task of the case study is to organize a conference on the topic of gaming in libraries. In order to collaboratively develop creative possibilities for its realization within the group of students, the design thinking method has been chosen. Design thinking is a method used to create user-centric, problem-solving and need-driven innovation through creative collaboration in multidisciplinary teams. Its central characteristics are the openness of the approach to work results and the visualization of ideas. This approach is now also accepted in the field of higher education. Especially in problem-based learning scenarios, the method offers clearly defined process steps for creative ideas and their realization. The creative process can be supported by digital media, such as search engines and tools for the documentation of brainstorming, creation of mind maps, project management, etc. Because the students have to do two-thirds of the workload in their private study, design thinking has been combined with a blended learning approach. This supports students’ preparation and follow-up of the joint work in workshops (flipped classroom scenario) as well as communication and collaboration during the entire project work phase. For this purpose, learning materials are provided on a Moodle-based learning platform, as well as various tools that support the design thinking process as described above. In this paper, the seminar conception combining design thinking and blended learning is described, and the potentials and limitations of the chosen strategy for the development of a course with a multimedia approach in higher education are reflected upon.

Keywords: blended learning, design thinking, digital media tools and methods, flipped classroom

Procedia PDF Downloads 184
11970 Solid Waste Management through Mushroom Cultivation: An Eco Friendly Approach

Authors: Mary Josephine

Abstract:

Waste from one process can be an input for other sectors, reducing environmental pollution. Today, more and more solid waste is generated, but only a very small amount of it is recycled, so the threat that environmental pressure poses to public health is very serious. The methods usually considered for the treatment of solid waste are biogas tanks or processing into animal feed and fertilizer; however, these have not performed well. An alternative approach is growing mushrooms on waste residues, which is regarded as an environmentally friendly solution with potential economic benefit. Substrate producers do their best to produce quality substrate at low cost. Apart from other methods, this can be achieved by employing biologically degradable wastes as the resource material component of the substrate. Mushroom growing is a significant tool for the restoration, replenishment and remediation of Earth’s overburdened ecosphere. One of the rational methods of waste utilization involves locally available wastes. The present study aims to determine the yield of mushrooms grown on freely available local waste and to conserve our environment by recycling wastes.

Keywords: biodegradable, environment, mushroom, remediation

Procedia PDF Downloads 381
11969 A Relational Case-Based Reasoning Framework for Project Delivery System Selection

Authors: Yang Cui, Yong Qiang Chen

Abstract:

An appropriate project delivery system (PDS) is crucial to the success of a construction project, and case-based reasoning (CBR) is a useful support for PDS selection. However, the traditional CBR approach represents cases as attribute-value vectors without taking relations among attributes into consideration, and it cannot calculate similarity when the structures of the cases are not strictly the same. Therefore, this paper solves this problem by adopting the relational case-based reasoning (RCBR) approach for PDS selection, considering both structural similarity and feature similarity. To develop the feature terms of the construction projects, the criteria and factors governing the PDS selection process are first identified. Then, feature terms for the construction projects are developed. Finally, the mechanism of similarity calculation and a case study indicate how RCBR works for PDS selection. The adoption of RCBR in PDS selection expands the scope of application of the traditional CBR method and improves the accuracy of the PDS selection system.

Keywords: relational cased-based reasoning, case-based reasoning, project delivery system, PDS selection

Procedia PDF Downloads 414
11968 Optimal Hedging of a Portfolio of European Options in an Extended Binomial Model under Proportional Transaction Costs

Authors: Norm Josephy, Lucy Kimball, Victoria Steblovskaya

Abstract:

Hedging of a portfolio of European options under proportional transaction costs is considered. Our discrete-time financial market model extends the binomial market model with transaction costs to the case where the underlying stock price ratios are distributed over a bounded interval rather than over a two-point set. An optimal hedging strategy is chosen from a set of admissible non-self-financing hedging strategies. Our approach to the optimal hedging of a portfolio of options is based on a theoretical foundation that includes the determination of a no-arbitrage option price interval as well as properties of the non-self-financing strategies and their residuals. A computational algorithm for optimizing an investor-relevant criterion over the set of admissible non-self-financing hedging strategies is developed. The applicability of our approach is demonstrated using both simulated data and real market data.

Keywords: extended binomial model, non-self-financing hedging, optimization, proportional transaction costs

Procedia PDF Downloads 239
11967 On an Approach for Rule Generation in Association Rule Mining

Authors: B. Chandra

Abstract:

In association rule mining, much attention has been paid to developing algorithms for large (frequent/closed/maximal) itemsets, but very little attention has been paid to improving the performance of rule generation algorithms, even though rule generation is an important part of association rule mining. In this paper, a novel approach named NARG (Association Rule using Antecedent Support) is proposed for rule generation; it uses a memory-resident data structure named FCET (Frequent Closed Enumeration Tree) to find frequent/closed itemsets. In addition, the computational speed of NARG is enhanced by giving importance to the rules that have lower antecedent support. A comparative performance evaluation of NARG against a fast association rule mining algorithm for rule generation has been carried out on synthetic datasets and real-life datasets (taken from the UCI Machine Learning Repository). The performance analysis shows that NARG is computationally faster than the existing algorithms for rule generation.
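
For context, a generic rule-generation pass over frequent itemsets looks as follows: every non-empty proper subset of an itemset is tried as an antecedent, confidence is supp(A∪B)/supp(A), and here antecedents are simply visited in ascending support order to echo the low-antecedent-support emphasis. This is an illustrative baseline, not the NARG/FCET algorithm itself.

```python
from itertools import combinations

def generate_rules(frequent, min_conf=0.6):
    """Generate rules A -> B from frequent itemsets with confidence supp(A|B)/supp(A),
    visiting antecedents in ascending support order (low antecedent support first)."""
    rules = []
    for itemset, supp in frequent.items():
        if len(itemset) < 2:
            continue
        antecedents = [frozenset(a) for r in range(1, len(itemset))
                       for a in combinations(itemset, r)]
        for a in sorted(antecedents, key=lambda s: frequent[s]):   # low support first
            conf = supp / frequent[a]
            if conf >= min_conf:
                rules.append((set(a), set(itemset - a), conf))
    return rules

# toy frequent itemsets with their supports (fractions of transactions)
frequent = {
    frozenset({"bread"}): 0.6, frozenset({"milk"}): 0.7, frozenset({"butter"}): 0.3,
    frozenset({"bread", "milk"}): 0.45, frozenset({"bread", "butter"}): 0.25,
    frozenset({"milk", "butter"}): 0.22, frozenset({"bread", "milk", "butter"}): 0.2,
}
for a, b, conf in generate_rules(frequent):
    print(f"{sorted(a)} -> {sorted(b)} (conf {conf:.2f})")
```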

Keywords: knowledge discovery, association rule mining, antecedent support, rule generation

Procedia PDF Downloads 307
11966 Detecting Heartbeat Architectural Tactic in Source Code Using Program Analysis

Authors: Ananta Kumar Das, Sujit Kumar Chakrabarti

Abstract:

Architectural tactics such as heartbeat, ping-echo, encapsulate, and encrypt data are techniques that are used to achieve the quality attributes of a system. Detecting architectural tactics has several benefits: it can aid system comprehension (e.g., of legacy systems) and the estimation of quality attributes such as safety, security, maintainability, etc. Architectural tactics are typically spread over the source code and are implicit; for large codebases, manual detection is often not feasible. Therefore, there is a need for automated methods of detecting architectural tactics. This paper presents a formalization of the heartbeat architectural tactic and a program-analytic approach to detect this tactic in source code. The proposed method is evaluated on a set of Java applications. The outcome of the experiment strongly suggests that the method compares well with a manual approach in terms of its sensitivity and specificity, and far surpasses a manual exercise in terms of its scalability.

Keywords: software architecture, architectural tactics, detecting architectural tactics, program analysis, AST, alias analysis

Procedia PDF Downloads 142
11965 Using Emerging Hot Spot Analysis to Analyze Overall Effectiveness of Policing Policy and Strategy in Chicago

Authors: Tyler Gill, Sophia Daniels

Abstract:

The paper examines how accounting for the spatial-temporal constraints of data will help inform policymakers and law enforcement officials. The authors utilize Chicago crime data from 2006-2016 to demonstrate how the Emerging Hot Spot tool is an ideal hot spot clustering approach for analyzing crime data. Traditional approaches include density maps or creating a spatial weights matrix to include the spatial-temporal constraints. This new approach utilizes a space-time implementation of the Getis-Ord Gi* statistic to visualize the data more quickly and make better decisions. The research will help complement socio-cultural research to find key patterns that help frame future policies and evaluate the implementation of prior strategies. Through this analysis, homicide trends and patterns are found more effectively, and recommendations for use by non-traditional users of GIS are offered for real-life implementation.
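
The spatial core of the method is the Getis-Ord Gi* statistic, a z-score per cell computed from a spatial weights matrix that includes the cell itself; emerging hot spot analysis then runs a trend test on these z-scores through time. The sketch below implements only the Gi* step on a toy 1D grid with contiguity weights; the space-time cube and trend classification are not reproduced.

```python
import numpy as np

def getis_ord_gi_star(values, weights):
    """Getis-Ord Gi* statistic for every cell; `weights` is an n x n spatial weights
    matrix whose diagonal is 1 so each cell's own value is included (the * variant)."""
    x = np.asarray(values, dtype=float)
    w = np.asarray(weights, dtype=float)
    n = x.size
    x_bar = x.mean()
    s = np.sqrt((x**2).mean() - x_bar**2)
    wx = w @ x                         # sum_j w_ij x_j
    w_sum = w.sum(axis=1)              # sum_j w_ij
    w_sq = (w**2).sum(axis=1)          # sum_j w_ij^2
    denom = s * np.sqrt((n * w_sq - w_sum**2) / (n - 1))
    return (wx - x_bar * w_sum) / denom

# toy 1D "grid" of weekly incident counts with contiguity weights (neighbours plus self)
counts = np.array([2, 3, 2, 4, 9, 12, 11, 3, 2, 1], dtype=float)
n = counts.size
W = np.eye(n)
for i in range(n - 1):
    W[i, i + 1] = W[i + 1, i] = 1.0
z = getis_ord_gi_star(counts, W)
print(np.round(z, 2))                  # large positive z-scores flag potential hot spots
```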

Keywords: crime mapping, emerging hot spot analysis, Getis-Ord Gi*, spatial-temporal analysis

Procedia PDF Downloads 230
11964 Russian Spatial Impersonal Sentence Models in Translation Perspective

Authors: Marina Fomina

Abstract:

The paper focuses on the category of semantic subject within the framework of a functional approach to linguistics. The semantic subject is related to similar notions such as the grammatical subject and the bearer of predicative feature. It is the multifaceted nature of the category of subject that 1) triggers a number of issues that, syntax-wise, remain to be dealt with (cf. semantic vs. syntactic functions / sentence parts vs. parts of speech issues, etc.); 2) results in a variety of approaches to the category of subject, such as formal grammatical, semantic/syntactic (functional), communicative approaches, etc. Many linguists consider the prototypical approach to the category of subject to be the most instrumental as it reveals the integrity of denotative and linguistic components of the conceptual category. This approach relates to subject as a source of non-passive predicative feature, an element of subject-predicate-object situation that can take on a variety of semantic roles, cf.: 1) an agent (He carefully surveyed the valley stretching before him), 2) an experiencer (I feel very bitter about this), 3) a recipient (I received this book as a gift), 4) a causee (The plane broke into three pieces), 5) a patient (This stove cleans easily), etc. It is believed that the variety of roles stems from the radial (prototypical) structure of the category with some members more central than others. Translation-wise, the most “treacherous” subject types are the peripheral ones. The paper 1) features a peripheral status of spatial impersonal sentence models such as U menia v ukhe zvenit (lit. I-Gen. in ear buzzes) within the category of semantic subject, 2) makes a structural and semantic analysis of the models, 3) focuses on their Russian-English translation patterns, 4) reveals non-prototypical features of subjects in the English equivalents.

Keywords: bearer of predicative feature, grammatical subject, impersonal sentence model, semantic subject

Procedia PDF Downloads 357
11963 An Integrated Approach to Cultural Heritage Management in the Indian Context

Authors: T. Lakshmi Priya

Abstract:

With the widening definition of heritage, the challenges of heritage management have become more complex. Today heritage not only includes significant monuments but also comprises historic areas/sites, historic cities, cultural landscapes, and living heritage sites. There is a need for a comprehensive understanding of the values associated with these heritage resources, which will enable their protection and management. These diverse cultural resources are managed by multiple agencies, each having its own way of operating at the heritage sites. An integrated approach to the management of these cultural resources ensures their sustainability for future generations. This paper outlines the importance of an integrated approach to the management and protection of complex heritage sites in India by examining four case studies. The methodology for this study is based on secondary research and primary surveys conducted during the preparation of the conservation management plans for the various sites. The primary surveys included basic documentation, inventorying, and community surveys. Red Fort, located in the city of Delhi and built in 1639 by the Mughal Emperor Shahjahan, is one of the most significant forts. It is a national icon and stands testimony to various historical events; it is on the ramparts of Red Fort that the national flag was unfurled on 15th August 1947, when India became independent, a practice that continues even today. Management of this complex fort necessitated an integrated approach, wherein the needs of official and non-official stakeholders were addressed. The understanding of the inherent values and significance of this site was arrived at through a systematic methodology of inventorying and mapping of information. Hampi, located in the southern part of India, is a living heritage site inscribed on the World Heritage List in 1986. The site comprises settlements, built heritage structures, traditional water systems, forests, agricultural fields and the remains of the metropolis of the 16th-century Vijayanagar empire. As Hampi is a living heritage site with traditional systems of management and practices, the aim has been to include these practices in the current management so that there is continuity in belief, thought and practice. The existing national, regional and local planning instruments have been examined and local concerns have been addressed. A comprehensive understanding of the site, achieved through an integrated model, is being translated into an action plan which safeguards the inherent values of the site. This paper also examines the cases of the 20th-century heritage building of the National Archives of India, Delhi, and the protection of the 12th-century Tomb of Sultan Ghari located in south Delhi. A comprehensive understanding of the latter site led to the delineation of the Archaeological Park of Sultan Ghari in the current Master Plan for Delhi, for the protection of the tomb and the settlement around it. Through this study, it is concluded that the approach of integrated conservation has enabled decision making that sustains the values of these complex heritage sites in the Indian context.

Keywords: conservation, integrated, management, approach

Procedia PDF Downloads 71
11962 Evolution of Chemistry in the Waters of Superposed Aquifer System Terminal Complex in the Valley of the Oued Righ - Arid Area Algeria

Authors: Asma Bettahar, Imed Eldine Nezli, Sameh Habes

Abstract:

Groundwater resources in the Oued Righ valley, like those in the rest of the eastern basin of the Algerian Sahara, are represented by two major superposed aquifers: the Intercalary Continental (IC) and the Terminal Complex (TC). From a qualitative point of view, various studies have highlighted that the waters of this region show excessive mineralization, including the waters of the Terminal Complex (average EC equal to 5854.61 µS/cm). The present article applies a statistical approach based on two complementary multivariate methods, principal component analysis (PCA, French ACP) and hierarchical cluster analysis (HCA, French CAH), to the analytical data of the multilayered Terminal Complex aquifer waters of the Oued Righ valley. The approach is to establish a correlation between the chemical composition of the water and the lithological nature of the formations of the different aquifer levels, and to predict possible connections between the groundwater layers. The results show that the mineralization of the water is of geological origin and relates to the composition of the layers that make up the Terminal Complex.
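
A minimal sketch of the two statistical steps (PCA followed by Ward hierarchical clustering) on standardized major-ion data is shown below; the ion concentrations are invented stand-ins, not the Oued Righ analyses.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler
from scipy.cluster.hierarchy import linkage, fcluster

# hypothetical major-ion analyses (mg/L) for a handful of Terminal Complex samples
ions = ["Ca", "Mg", "Na", "Cl", "SO4", "HCO3"]
samples = np.array([
    [310, 120, 650, 1100,  900, 140],
    [295, 110, 700, 1150,  880, 150],
    [150,  60, 300,  450,  400, 180],
    [160,  65, 280,  430,  420, 175],
    [400, 150, 900, 1500, 1200, 130],
])

X = StandardScaler().fit_transform(samples)        # centre and scale each ion
pca = PCA(n_components=2).fit(X)
print("explained variance ratio:", np.round(pca.explained_variance_ratio_, 2))

clusters = fcluster(linkage(X, method="ward"), t=2, criterion="maxclust")
print("hierarchical (CAH) groups:", clusters)      # saline vs. fresher facies expected
```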

Keywords: oued righ, complex terminal, infill continental, mineralization

Procedia PDF Downloads 437