Search results for: introductory programming
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 999

489 Leveraging Power BI for Advanced Geotechnical Data Analysis and Visualization in Mining Projects

Authors: Elaheh Talebi, Fariba Yavari, Lucy Philip, Lesley Town

Abstract:

The mining industry generates vast amounts of data, necessitating robust data management systems and advanced analytics tools to support better decision-making in mine production development and safety management. This paper highlights the advantages of Power BI, a powerful business intelligence tool, over traditional Excel-based approaches for effectively managing and harnessing mining data. Power BI enables professionals to connect and integrate multiple data sources, ensuring real-time access to up-to-date information. Its interactive visualizations and dashboards offer an intuitive interface for exploring and analyzing geotechnical data. Advanced analytics is a collection of data analysis techniques for improving decision-making; leveraging some of the most sophisticated techniques in data science, it is used for everything from detecting data errors and ensuring data accuracy to directing the development of future project phases. However, while Power BI is a robust tool, specific visualizations required by geotechnical engineers may have limitations. This paper examines the use of Python or R programming within the Power BI dashboard to enable advanced analytics, additional functionalities, and customized visualizations. The dashboard provides comprehensive tools for analyzing and visualizing key geotechnical data metrics, including spatial representation on maps, field and lab test results, and subsurface rock and soil characteristics. Advanced visualizations such as borehole logs and stereonets were implemented using Python programming within the Power BI dashboard, enhancing the understanding and communication of geotechnical information. Moreover, the dashboard's flexibility allows additional data and visualizations to be incorporated according to the project scope and available data, such as pit design, rockfall analyses, rock mass characterization, and drone data. This further enhances the dashboard's usefulness in future project phases, including operation, development, closure, and rehabilitation, and minimizes the need to use multiple software programs within a project. This geotechnical dashboard in Power BI serves as a user-friendly solution for analyzing, visualizing, and communicating both new and historical geotechnical data, aiding informed decision-making and efficient project management throughout the various project stages. Its ability to generate dynamic reports and share them with clients collaboratively further enhances decision-making and facilitates effective communication within geotechnical projects in the mining industry.
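
As a minimal illustration of the Python-in-Power-BI approach described above, the hedged sketch below shows the kind of script a Power BI Python visual can run to draw a simple borehole-log-style profile. Power BI passes the fields selected for the visual to the script as a pandas DataFrame named dataset; the column names (BoreholeID, Depth_m, SPT_N) are hypothetical and would need to match the actual data model, and the standalone fallback data exist only so the sketch also runs outside Power BI.

import pandas as pd
import matplotlib.pyplot as plt

# Inside Power BI, the host injects the fields selected for the visual as a
# pandas DataFrame named `dataset`; the fallback below exists only so this
# sketch also runs standalone, and the column names are hypothetical.
if "dataset" not in globals():
    dataset = pd.DataFrame({
        "BoreholeID": ["BH-01"] * 4 + ["BH-02"] * 4,
        "Depth_m":    [1.5, 3.0, 4.5, 6.0, 1.5, 3.0, 4.5, 6.0],
        "SPT_N":      [8, 14, 22, 35, 6, 11, 19, 40],
    })

df = dataset.dropna(subset=["Depth_m", "SPT_N"])

fig, ax = plt.subplots(figsize=(4, 8))
for hole_id, hole in df.groupby("BoreholeID"):
    hole = hole.sort_values("Depth_m")
    ax.plot(hole["SPT_N"], hole["Depth_m"], marker="o", label=str(hole_id))

ax.invert_yaxis()                       # depth increases downwards, borehole-log style
ax.set_xlabel("SPT N-value")
ax.set_ylabel("Depth (m)")
ax.legend(title="Borehole", fontsize=8)
plt.tight_layout()
plt.show()                              # Power BI renders the resulting matplotlib figure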

Keywords: geotechnical data analysis, power BI, visualization, decision-making, mining industry

Procedia PDF Downloads 71
488 Electrical Load Estimation Using Estimated Fuzzy Linear Parameters

Authors: Bader Alkandari, Jamal Y. Madouh, Ahmad M. Alkandari, Anwar A. Alnaqi

Abstract:

A new formulation of the fuzzy linear estimation problem is presented. It is formulated as a linear programming problem. The objective is to minimize the spread of the data points, taking into consideration the type of the membership function of the fuzzy parameters, satisfying the constraints at each measurement point, and ensuring that the original membership is included in the estimated membership. Different models are developed for a fuzzy triangular membership. The proposed models are applied to examples from the area of fuzzy linear regression and, finally, to examples of estimating the electrical load on a busbar. It was found that the proposed technique is well suited to electrical load estimation, since the nature of the load is characterized by uncertainty and vagueness.
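
As a hedged sketch of how a fuzzy linear estimation problem can be posed as a linear program, the snippet below implements the classical Tanaka-style possibilistic regression model with symmetric triangular fuzzy coefficients and solves it with scipy.optimize.linprog; the abstract does not give the authors' exact formulation, and the load observations and the degree-of-fit level h are synthetic assumptions.

import numpy as np
from scipy.optimize import linprog

# Synthetic hourly load observations (x = hour index, y = load in MW); purely illustrative.
x = np.array([1, 2, 3, 4, 5, 6, 7, 8], dtype=float)
y = np.array([52, 55, 61, 64, 70, 73, 80, 84], dtype=float)

h = 0.5                                          # degree-of-fit level for the triangular memberships
X = np.column_stack([np.ones_like(x), x])        # design matrix [1, x]
n, p = X.shape

# Decision variables z = [c_0, c_1, s_0, s_1]: centres c and spreads s >= 0
# of symmetric triangular fuzzy coefficients A_i = (c_i, s_i).
# Objective: minimise the total spread of the fuzzy outputs.
cost = np.concatenate([np.zeros(p), np.abs(X).sum(axis=0)])

# Inclusion constraints: each crisp observation must lie inside the
# (1 - h)-cut of the fuzzy output  c^T x_j  +/-  (1 - h) s^T |x_j|.
A_ub = np.vstack([
    np.hstack([ X, -(1 - h) * np.abs(X)]),       #  c^T x - (1-h) s^T|x| <=  y
    np.hstack([-X, -(1 - h) * np.abs(X)]),       # -c^T x - (1-h) s^T|x| <= -y
])
b_ub = np.concatenate([y, -y])

bounds = [(None, None)] * p + [(0, None)] * p    # centres free, spreads non-negative
res = linprog(cost, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
centres, spreads = res.x[:p], res.x[p:]
print("centres:", centres, "spreads:", spreads)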

Keywords: fuzzy regression, load estimation, fuzzy linear parameters, electrical load estimation

Procedia PDF Downloads 520
487 Analyzing the Practicality of Drawing Inferences in Automation of Commonsense Reasoning

Authors: Chandan Hegde, K. Ashwini

Abstract:

Commonsense reasoning is the simulation of the human ability to make decisions in the situations we encounter every day. Several decades have passed since the introduction of this subfield of artificial intelligence, yet it has made barely any significant progress. Modern computing aids have also remained largely ineffective in this regard, owing to the absence of a strong methodology for developing commonsense reasoning. Among the several reasons accountable for the lack of progress, drawing inferences from a commonsense knowledge base stands out. This review paper emphasizes a detailed analysis of the representation of reasoning uncertainties and the feasible prospects of programming aids for drawing inferences. The difficulties in deducing and systematizing commonsense reasoning, as well as the substantial progress made in reasoning that influences this study, are also discussed. Additionally, the paper discusses the possible impacts of an effective inference technique on commonsense reasoning.

Keywords: artificial intelligence, commonsense reasoning, knowledge base, uncertainty in reasoning

Procedia PDF Downloads 168
486 Some Pertinent Issues and Considerations on CBSE

Authors: Anil Kumar Tripathi, Ratneshwer Gupta

Abstract:

Software engineering research and best industry practices aim to provide software products with a high degree of quality and functionality at low cost and in less time. These requirements are addressed by Component Based Software Engineering (CBSE) as well. CBSE, which deals with software construction by assembling components, is a revolutionary extension of software engineering. CBSE must define and describe processes that assure the timely completion of high-quality software systems composed of a variety of pre-built software components. Though these features provide distinct and visible benefits in software design and programming, they also raise some challenging problems. The aim of this work is to summarize the pertinent issues and considerations in CBSE and to form an understanding, in terms of concepts and observations, that may lead to the development of new ways of dealing with the problems and challenges of CBSE.

Keywords: software component, component based software engineering, software process, testing, maintenance

Procedia PDF Downloads 383
485 Mathematical Modeling and Algorithms for the Capacitated Facility Location and Allocation Problem with Emission Restriction

Authors: Sagar Hedaoo, Fazle Baki, Ahmed Azab

Abstract:

In supply chain management, network design for scalable manufacturing facilities is an emerging field of research. Facility location allocation assigns facilities to customers to optimize the overall cost of the supply chain. To further optimize costs, the capacities of these facilities can be changed in accordance with customer demands. A mathematical model is formulated to fully express the problem at hand and to solve small-to-mid-range instances. A dedicated constraint has been developed to restrict emissions in line with the Kyoto protocol. The problem is NP-hard; hence, a simulated annealing metaheuristic has been developed to solve larger instances. A case study of a USA-Canada border crossing is used.
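
The abstract does not reproduce the full mixed-integer model, but the hedged sketch below shows the general shape of a capacitated facility location problem with a single aggregate emission cap, written with the PuLP modelling library; all facility, customer, cost, and emission data are synthetic placeholders rather than the paper's case-study data.

import pulp

# Synthetic data: 3 candidate facilities, 4 customers (illustrative only).
facilities = ["F1", "F2", "F3"]
customers  = ["C1", "C2", "C3", "C4"]
fixed_cost = {"F1": 900, "F2": 1100, "F3": 800}
capacity   = {"F1": 120, "F2": 150, "F3": 100}
demand     = {"C1": 60, "C2": 45, "C3": 70, "C4": 50}
ship_cost  = {("F1", "C1"): 4, ("F1", "C2"): 5, ("F1", "C3"): 6, ("F1", "C4"): 8,
              ("F2", "C1"): 6, ("F2", "C2"): 4, ("F2", "C3"): 3, ("F2", "C4"): 5,
              ("F3", "C1"): 9, ("F3", "C2"): 7, ("F3", "C3"): 4, ("F3", "C4"): 3}
ship_emis  = {k: 1 + (c % 3) for k, c in ship_cost.items()}   # per-unit emissions (made up)
emission_cap = 400                                            # aggregate cap in the spirit of the Kyoto protocol

m = pulp.LpProblem("capacitated_facility_location", pulp.LpMinimize)
y = pulp.LpVariable.dicts("open", facilities, cat="Binary")
x = pulp.LpVariable.dicts("ship", (facilities, customers), lowBound=0)

# Objective: fixed opening costs plus transportation costs.
m += pulp.lpSum(fixed_cost[i] * y[i] for i in facilities) + \
     pulp.lpSum(ship_cost[i, j] * x[i][j] for i in facilities for j in customers)

for j in customers:                      # satisfy every customer's demand
    m += pulp.lpSum(x[i][j] for i in facilities) == demand[j]
for i in facilities:                     # respect capacity of opened facilities only
    m += pulp.lpSum(x[i][j] for j in customers) <= capacity[i] * y[i]

# Dedicated emission constraint: total shipment emissions must not exceed the cap.
m += pulp.lpSum(ship_emis[i, j] * x[i][j] for i in facilities for j in customers) <= emission_cap

m.solve(pulp.PULP_CBC_CMD(msg=False))
print("opened:", [i for i in facilities if y[i].value() > 0.5])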

Keywords: emission, mixed integer linear programming, metaheuristic, simulated annealing

Procedia PDF Downloads 291
484 Intelligent Rescheduling Trains for Air Pollution Management

Authors: Kainat Affrin, P. Reshma, G. Narendra Kumar

Abstract:

Timetable optimization is the need of the day for the real-time rescheduling and routing of trains. Trains are scheduled in parallel with road transport vehicles to the same destination. Because the number of trains is restricted by the single track, customers usually opt for road transport for frequent use. Air pollution increases as the density of vehicles in road transport increases. Using an alternative mode of transport, such as the train, helps reduce air pollution. This paper mainly aims at attracting passengers to train transport through proper rescheduling of trains using a hybrid of the stop-skip algorithm and an iterative convex programming algorithm. Rescheduling of trains bi-directionally is achieved on a single track with dynamic dual time and varying stops. The introduction of more trains attracts customers to use rail transport frequently, thereby decreasing pollution. The results are simulated using Network Simulator (NS-2).

Keywords: air pollution, AODV, re-scheduling, WSNs

Procedia PDF Downloads 341
483 Object-Oriented Program Comprehension by Identification of Software Components and Their Connexions

Authors: Abdelhak-Djamel Seriai, Selim Kebir, Allaoua Chaoui

Abstract:

During the last decades, object-oriented programming has been massively used to build large-scale systems. However, the evolution and maintenance of such systems become a laborious task because object-oriented programming fails to offer a precise view of the functional building blocks of the system. This shortcoming is caused by the fine granularity of classes and objects. In this paper, we use a post-object-oriented technology, namely software components, to propose an approach based on the identification of the functional building blocks of an object-oriented system by analyzing its source code. These functional blocks are specified as software components, and the result is a multi-layer component-based software architecture.

Keywords: software comprehension, software component, object oriented, software architecture, reverse engineering

Procedia PDF Downloads 393
482 Working Effectively with Muslim Communities in the West

Authors: Lisa Tribuzio

Abstract:

This paper explores the complexity of working with Muslim communities in Australia. It draws upon the notions of belonging, social inclusion, and effective community programming to engage Muslim communities in Western environments given the current global political climate. Factors taken into consideration for effective engagement include: family engagement; key practices such as Ramadan, fasting, prayer, and food requirements; gender relations; core values around faith and spirituality; attitudes towards self-disclosure in a counseling setting; and the notion of 'us and them' in the media and in institutional systems and its effect on minority communities. The paper explores recent research in the field from Australian researchers as well as recommendations from the United Nations on working with Muslim communities. It also explores current practice models applied in Australia for engaging effectively with diverse communities and addressing racism and discrimination in innovative ways.

Keywords: Muslim, cultural diversity, social inclusion, racism

Procedia PDF Downloads 405
481 The Effect of Non-Normality on CB-SEM and PLS-SEM Path Estimates

Authors: Z. Jannoo, B. W. Yap, N. Auchoybur, M. A. Lazim

Abstract:

The two common approaches to Structural Equation Modeling (SEM) are Covariance-Based SEM (CB-SEM) and Partial Least Squares SEM (PLS-SEM). There is much debate on the performance of CB-SEM and PLS-SEM for small sample sizes and when distributions are non-normal. This study evaluates the performance of CB-SEM and PLS-SEM under normality and non-normality conditions via simulation. Monte Carlo simulation in the R programming language was employed to generate data based on a theoretical model with one endogenous and four exogenous variables, each latent variable having three indicators. For normal distributions, CB-SEM estimates were found to be inaccurate for small sample sizes, while PLS-SEM could still produce the path estimates. Meanwhile, for larger sample sizes, CB-SEM estimates have lower variability than PLS-SEM. Under non-normality, CB-SEM path estimates were inaccurate for small sample sizes; however, CB-SEM estimates are more accurate than those of PLS-SEM for sample sizes of 50 and above. The PLS-SEM estimates are not accurate unless the sample size is very large.
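
The abstract does not state the population parameters used, so the sketch below is only a hedged illustration of how one Monte Carlo replication of such a design (four exogenous and one endogenous latent variable, three indicators each) could be generated; the path coefficients, loadings, and the skewed-error device used to induce non-normality are assumptions, and the study itself used R rather than the Python shown here.

import numpy as np

rng = np.random.default_rng(1)

def simulate_sem(n, paths=(0.3, 0.3, 0.3, 0.3), loading=0.8, skew=False):
    """One Monte Carlo replication: 4 exogenous + 1 endogenous latent variable,
    three indicators per latent variable (15 observed columns in total)."""
    exo = rng.normal(size=(n, 4))                         # exogenous latent scores
    disturbance = rng.normal(scale=np.sqrt(1 - np.sum(np.square(paths))), size=n)
    endo = exo @ np.array(paths) + disturbance            # structural equation

    latents = np.column_stack([exo, endo])
    indicators = []
    for k in range(latents.shape[1]):
        if skew:
            err = rng.exponential(size=(n, 3)) - 1.0      # crude skewed (non-normal) error
        else:
            err = rng.normal(size=(n, 3))
        block = loading * latents[:, [k]] + np.sqrt(1 - loading ** 2) * err  # measurement model
        indicators.append(block)
    return np.hstack(indicators)

data_small = simulate_sem(n=50)                # small-sample, normal condition
data_large = simulate_sem(n=500, skew=True)    # larger sample, non-normal condition
print(data_small.shape, data_large.shape)      # (50, 15) (500, 15)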

Keywords: CB-SEM, Monte Carlo simulation, normality conditions, non-normality, PLS-SEM

Procedia PDF Downloads 386
480 Cost-Effective Hybrid Cloud Framework for HEIs

Authors: Shah Muhammad Butt, Ahmed Masaud Ansari

Abstract:

The present financial crisis leaves Higher Educational Institutes (HEIs) facing many problems: considerable budget cuts make it difficult to meet ever-growing IT-based research and learning needs, so institutions are rapidly planning and promoting cloud-based approaches for their academic and research requirements. A cost-effective hybrid cloud framework for HEIs will provide educational services for campus or inter-campus communication. The hybrid cloud framework comprises private and public cloud approaches. This paper proposes a framework based on open-source cloud software (OpenNebula for virtualization, Eucalyptus for infrastructure, and Aneka as a programming development environment) combined with cloud service providers' (CSPs') services, which are delivered to the end user via the Internet from public clouds.

Keywords: educational services, hybrid campus cloud, open source, electrical and systems sciences

Procedia PDF Downloads 435
479 Wearable Music: Generation of Costumes from Music and Generative Art and Wearing Them by 3-Way Projectors

Authors: Noriki Amano

Abstract:

The final goal of this study is to create another way for people to enjoy music, through the performance of 'Wearable Music'. Concretely speaking, we generate colorful costumes in real time from music and realize their wearing by projecting them onto a person. For this purpose, we propose three methods in this study: first, a method of giving color to music in a three-dimensional way; second, a method of generating images of costumes from music; and third, a method of wearing the images of music. In particular, this study stands out from related work in that we generate images of unique costumes from music and make it possible to wear them. We use the technique of generative art to generate the images of unique costumes and project the images onto fog generated around a person from three directions using projectors. From this study, we obtain a way to enjoy music as something 'wearable'. Furthermore, it opens the prospect of unconventional entertainment based on the fusion of music and costumes.

Keywords: entertainment computing, costumes, music, generative programming

Procedia PDF Downloads 157
478 Capacitated Multiple Allocation P-Hub Median Problem on a Cluster Based Network under Congestion

Authors: Çağrı Özgün Kibiroğlu, Zeynep Turgut

Abstract:

This paper considers a hub location problem in which the network service area is partitioned into predetermined zones (represented by given node clusters) and the capacity levels of potential hub nodes are determined a priori as a hub selection criterion, in order to investigate the effect of congestion on the network. The objective is to design the hub network by determining all required hub locations within the node clusters and allocating non-hub nodes to hubs such that the total cost, including transportation cost, hub opening cost, and penalty cost for exceeding the capacity level at hubs, is minimized. A mixed-integer linear programming model is developed by introducing additional constraints to the traditional model of the capacitated multiple allocation hub location problem and is tested empirically.

Keywords: hub location problem, p-hub median problem, clustering, congestion

Procedia PDF Downloads 478
477 Coordinated Voltage Control in Radial Distribution System with Distributed Generators Using Sensitivity Analysis

Authors: Anubhav Shrivastava Shivarudraswamy, Bhat Lakshya

Abstract:

Distributed generation has become a major area of interest in recent years. Distributed generation can serve a large number of loads on a power line and hence offers better efficiency than conventional methods. However, there are certain drawbacks associated with it, an increase in voltage being the major one. This paper addresses voltage control at the buses of an IEEE 30-bus system by regulating reactive power. To carry out the analysis, suitable locations for placing distributed generators (DGs) are identified through load flow analysis, observing where the voltage profile dips. MATLAB programming is used to regulate the voltage at all buses to within +/- 5% of the base value even after the introduction of DGs. Three methods for voltage regulation are discussed. A sensitivity-based analysis is then carried out to determine the priority among the various methods listed in the paper.

Keywords: distributed generators, distributed system, reactive power, voltage control, sensitivity analysis

Procedia PDF Downloads 637
476 End To End Process to Automate Batch Application

Authors: Nagmani Lnu

Abstract:

Quality Engineering often refers to testing applications that have either a User Interface (UI) or an Application Programming Interface (API), and mature test practices, standards, and automation exist for UI and API testing. However, another kind of application is present in almost all industries that deal with data in bulk, often handled through what is called a Batch Application. This is primarily an offline application that companies develop to process large data sets, often governed by multiple business rules. The challenge becomes more prominent when we try to automate batch testing. This paper describes the approaches taken to test a batch application from the financial industry for the payment settlement process (a critical use case in all kinds of FinTech companies), resulting in 100% test automation in test creation and test execution. One can follow this approach for other batch use cases to achieve higher efficiency in the testing process.

Keywords: batch testing, batch test automation, batch test strategy, payments testing, payments settlement testing

Procedia PDF Downloads 35
475 The Application of a Hybrid Neural Network for Recognition of a Handwritten Kazakh Text

Authors: Almagul Assainova, Dariya Abykenova, Liudmila Goncharenko, Sergey Sybachin, Saule Rakhimova, Abay Aman

Abstract:

The recognition of handwritten Kazakh text is a relevant objective today for the digitization of materials. The study presents a hybrid neural network model for handwriting recognition, which includes a convolutional neural network and a multi-layer perceptron. Each network includes 1024 input neurons and 42 output neurons. The model is implemented in a program written in the Python programming language using the EMNIST database and the NumPy, Keras, and TensorFlow modules. The neural network was trained on such specific letters of the Kazakh alphabet as ә, ғ, қ, ң, ө, ұ, ү, h, і. The neural network model and the program created on its basis can be used in electronic document management systems to digitize Kazakh text.
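
The abstract does not detail how the two networks are combined, so the sketch below is only a hedged Keras illustration of one plausible hybrid layout: a convolutional branch and a multi-layer perceptron branch over a 32 x 32 (1024-pixel) grayscale input, fused before a 42-class softmax output; the layer sizes and the fusion strategy are assumptions, not the authors' architecture.

import tensorflow as tf
from tensorflow.keras import layers, Model

NUM_CLASSES = 42          # per the abstract: 42 output neurons
IMG = (32, 32, 1)         # 32 x 32 = 1024 input pixels per character image

inputs = layers.Input(shape=IMG)

# Convolutional branch
c = layers.Conv2D(32, 3, activation="relu", padding="same")(inputs)
c = layers.MaxPooling2D()(c)
c = layers.Conv2D(64, 3, activation="relu", padding="same")(c)
c = layers.MaxPooling2D()(c)
c = layers.Flatten()(c)

# Multi-layer perceptron branch on the raw pixels
p = layers.Flatten()(inputs)
p = layers.Dense(512, activation="relu")(p)
p = layers.Dense(256, activation="relu")(p)

# Fuse both branches and classify
merged = layers.concatenate([c, p])
merged = layers.Dense(256, activation="relu")(merged)
outputs = layers.Dense(NUM_CLASSES, activation="softmax")(merged)

model = Model(inputs, outputs)
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
# Training (assuming x_train of shape (N, 32, 32, 1) scaled to [0, 1] and
# integer labels y_train in [0, 41]):
# model.fit(x_train, y_train, validation_split=0.1, epochs=10, batch_size=128)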

Keywords: handwriting recognition system, image recognition, Kazakh font, machine learning, neural networks

Procedia PDF Downloads 242
474 Single-Cell Visualization with Minimum Volume Embedding

Authors: Zhenqiu Liu

Abstract:

Visualizing the heterogeneity within cell populations from single-cell RNA-seq data is crucial for studying the functional diversity of cells. However, because of the high levels of noise, outliers, and dropouts, it is very challenging to measure cell-to-cell similarity (distance) and to visualize and cluster the data in a low dimension. Minimum volume embedding (MVE) projects the data into a lower-dimensional space and is a promising tool for data visualization. However, it is computationally inefficient to solve a semi-definite program (SDP) when the sample size is large; MVE is therefore not applicable to single-cell RNA-seq data with thousands of samples. In this paper, we develop an efficient algorithm with an accelerated proximal gradient method and visualize single-cell RNA-seq data efficiently. We demonstrate that the proposed approach separates known subpopulations more accurately in single-cell data sets than other existing dimension reduction methods.
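
The paper's MVE-specific algorithm is not reproduced in the abstract; as a hedged illustration of the accelerated proximal gradient idea itself, the sketch below implements a generic FISTA-type scheme and applies it to an l1-regularized least-squares problem on synthetic data, which is a stand-in objective rather than the MVE formulation.

import numpy as np

def accelerated_proximal_gradient(grad_f, prox_g, x0, step, n_iter=200):
    """Generic FISTA-type accelerated proximal gradient method for
    minimising f(x) + g(x), where f is smooth and g has an easy prox."""
    x_prev = x0.copy()
    y = x0.copy()
    t = 1.0
    for _ in range(n_iter):
        x = prox_g(y - step * grad_f(y), step)        # proximal gradient step
        t_next = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        y = x + ((t - 1.0) / t_next) * (x - x_prev)   # Nesterov extrapolation
        x_prev, t = x, t_next
    return x_prev

# Illustration on l1-regularised least squares (lasso), not the MVE objective itself.
rng = np.random.default_rng(0)
A = rng.normal(size=(100, 50))
b = A @ (rng.normal(size=50) * (rng.random(50) < 0.2)) + 0.01 * rng.normal(size=100)
lam = 0.1

grad_f = lambda x: A.T @ (A @ x - b)                                      # gradient of 0.5*||Ax - b||^2
prox_g = lambda v, s: np.sign(v) * np.maximum(np.abs(v) - s * lam, 0.0)   # soft-thresholding
step = 1.0 / np.linalg.norm(A, 2) ** 2                                    # 1 / Lipschitz constant

x_hat = accelerated_proximal_gradient(grad_f, prox_g, rng.normal(size=50), step)
print("non-zeros recovered:", int(np.sum(np.abs(x_hat) > 1e-3)))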

Keywords: single-cell RNA-seq, minimum volume embedding, visualization, accelerated proximal gradient method

Procedia PDF Downloads 214
473 Research on Architectural Steel Structure Design Based on BIM

Authors: Tianyu Gao

Abstract:

Digital architectures use computer-aided design, programming, simulation, and imaging to create virtual forms and physical structures. Today's customers want to know more about their buildings: they want an automatic thermostat that learns their behavior and contacts them, and doors and windows they can open with a mobile app. The architectural display form is therefore more closely related to the customer's experience. With building informationization as its purpose, this paper studies steel structure design based on BIM. Taking the Zigan office building in Hangzhou as an example, the work covers the digital design modulus of the steel structure, the node analysis of the steel structure, and the digital production and construction of the steel structure. Through the application of BIM software, the architectural design can be synergized and the building components informationized. Not only can the architectural design receive feedback at an early stage, but the stability of the construction can also be guaranteed. In this way, monitoring of the entire life cycle of the building and the meeting of customer needs can be realized.

Keywords: digital architectures, BIM, steel structure, architectural design

Procedia PDF Downloads 176
472 Hard and Soft Skills in Marketing Education: Using Serious Games to Engage Higher Order Processing

Authors: Ann Devitt, Mairead Brady, Markus Lamest, Stephen Gomez

Abstract:

This study set out to explore the use of an online collaborative serious game for student learning in a postgraduate introductory marketing module. The simulation game aimed to bridge the theory-practice divide in marketing by allowing students to apply theory in a safe, simulated marketplace. This study addresses the following research questions: Does an online marketing simulation game engage students' higher-order cognitive skills? Does the collaborative activity required develop students' 'soft' skills, such as communication and negotiation? What specific affordances of the online simulation promote learning? This qualitative case study took place in 2014 with 40 postgraduate students on a Business Masters Programme. The two-week intensive module combined lectures with collaborative activity on a marketing simulation game, MMX from Pearson. The game requires student teams to compete against other teams in a marketplace and design a marketing plan to maximize key performance indicators. The data for this study comprise essays written by students after the module, reflecting on their learning. A thematic analysis of the essays was conducted using the following a priori theme sets: the six levels of the cognitive domain of Bloom's taxonomy; the five principles of cooperative learning; and affordances of simulation environments, including experiential learning, motivation and engagement, and goal orientation. Preliminary findings strongly suggest that the game helped students identify the value of theory in practice, in particular for future employment; enhanced their understanding of group dynamics and their role within them; and impacted motivation very strongly, both positively and negatively. In particular, the game mechanics of MMX, which hinge on the correct identification of a target consumer group, were identified as a key determinant of extrinsic and intrinsic motivation for learners. The findings also suggest that situating the simulation game within a broader module that required post-game reflection was valuable in identifying key learning of marketing concepts from both the positive and the negative experiences of the game.

Keywords: simulation, marketing, serious game, cooperative learning, bloom's taxonomy

Procedia PDF Downloads 538
471 Evaluation of Quasi-Newton Strategy for Algorithmic Acceleration

Authors: T. Martini, J. M. Martínez

Abstract:

An algorithmic acceleration strategy based on quasi-Newton (or secant) methods is presented to address the practical problem of accelerating the convergence of the Newton-Lagrange method in the case of convergence to critical multipliers. Since the Newton-Lagrange iteration converges locally at a linear rate, it is natural to conjecture that quasi-Newton methods, based on the so-called secant equation and some minimal variation principle, could converge superlinearly, thus restoring the convergence properties of Newton's method. This strategy can also be applied to accelerate the convergence of algorithms applied to fixed-point problems. Computational experience is reported illustrating the efficiency of this strategy for solving fixed-point problems with a linear convergence rate.
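
The authors' secant strategy for the Newton-Lagrange method is not reproduced in the abstract; the hedged sketch below only illustrates the general idea of quasi-Newton acceleration on a fixed-point problem, rewriting x = F(x) as g(x) = F(x) - x = 0 and applying a Broyden rank-one secant update, with a plain (linearly convergent) fixed-point iteration shown for comparison. The map F is an arbitrary contraction chosen for illustration.

import numpy as np

def F(x):
    """A contractive map whose fixed point is sought; plain iteration x <- F(x)
    converges only linearly."""
    return np.array([np.cos(x[1]) / 2.0 + 0.3,
                     np.sin(x[0]) / 2.0 + 0.1])

def broyden_fixed_point(F, x0, tol=1e-12, max_iter=50):
    """Secant (Broyden) method applied to g(x) = F(x) - x = 0."""
    x = x0.copy()
    g = F(x) - x
    B = -np.eye(len(x))                 # initial Jacobian approximation of g
    for k in range(max_iter):
        dx = np.linalg.solve(B, -g)     # quasi-Newton step
        x_new = x + dx
        g_new = F(x_new) - x_new
        if np.linalg.norm(g_new) < tol:
            return x_new, k + 1
        dg = g_new - g
        B += np.outer(dg - B @ dx, dx) / (dx @ dx)   # Broyden rank-one secant update
        x, g = x_new, g_new
    return x, max_iter

x_star, iters = broyden_fixed_point(F, np.zeros(2))
print("fixed point:", x_star, "in", iters, "quasi-Newton iterations")

# Plain fixed-point iteration for comparison (linear convergence)
x = np.zeros(2)
for i in range(200):
    x_next = F(x)
    if np.linalg.norm(x_next - x) < 1e-12:
        break
    x = x_next
print("plain iteration needed", i + 1, "steps")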

Keywords: algorithmic acceleration, fixed-point problems, nonlinear programming, quasi-newton method

Procedia PDF Downloads 473
470 A Study of Quality Assurance and Unit Verification Methods in Safety Critical Environment

Authors: Miklos Taliga

Abstract:

In the present case study, we examined the development and testing methods of systems that contain safety-critical elements in different industrial fields. We first observed the classical object-oriented development and testing environment, as both the medical technology and automotive industries approach the development of safety-critical elements in that way. Subsequently, we examined model-based development. We introduce the quality parameters that define development and testing. Taking modern agile methodology (Scrum) into consideration, we examined whether, and to what extent, the methodologies we found fit into this environment.

Keywords: safety-critical elements, quality management, unit verification, model-based testing, agile methods, scrum, metamodel, object-oriented programming, field-specific modelling, sprint, user story, UML standard

Procedia PDF Downloads 571
469 Designing Emergency Response Network for Rail Hazmat Shipments

Authors: Ali Vaezi, Jyotirmoy Dalal, Manish Verma

Abstract:

The railroad is one of the primary transportation modes for hazardous materials (hazmat) shipments in North America. Installing an emergency response network capable of providing a commensurate response is one of the primary levers to contain (or mitigate) the adverse consequences from rail hazmat incidents. To this end, we propose a two-stage stochastic program to determine the location of and equipment packages to be stockpiled at each response facility. The raw input data collected from publicly available reports were processed, fed into the proposed optimization program, and then tested on a realistic railroad network in Ontario (Canada). From the resulting analyses, we conclude that the decisions based only on empirical datasets would undermine the effectiveness of the resulting network; coverage can be improved by redistributing equipment in the network, purchasing equipment with higher containment capacity, and making use of a disutility multiplier factor.
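
The proposed two-stage stochastic program is not reproduced in the abstract; the hedged sketch below only illustrates the two-stage structure (first-stage facility opening and equipment stockpiling, scenario-dependent second-stage shortfall penalties) as a small deterministic-equivalent model in PuLP, with entirely synthetic sites, scenarios, demands, and costs, and without the routing and coverage details a real model would need.

import pulp

# Synthetic scenario data, purely illustrative of the two-stage structure.
sites = ["Toronto", "Sudbury", "Windsor"]
scenarios = {"S1": 0.5, "S2": 0.3, "S3": 0.2}          # incident scenarios and probabilities
demand = {"S1": {"Toronto": 3, "Sudbury": 0, "Windsor": 1},
          "S2": {"Toronto": 1, "Sudbury": 2, "Windsor": 0},
          "S3": {"Toronto": 0, "Sudbury": 1, "Windsor": 3}}
open_cost, pack_cost, shortfall_penalty = 100, 10, 500

m = pulp.LpProblem("response_network", pulp.LpMinimize)
y = pulp.LpVariable.dicts("open", sites, cat="Binary")                  # first stage: open facility
q = pulp.LpVariable.dicts("packs", sites, lowBound=0, cat="Integer")    # first stage: stockpile size
u = pulp.LpVariable.dicts("short", (list(scenarios), sites), lowBound=0)  # second stage: unmet demand

# First-stage costs plus expected second-stage shortfall penalties.
m += pulp.lpSum(open_cost * y[i] + pack_cost * q[i] for i in sites) + \
     pulp.lpSum(scenarios[s] * shortfall_penalty * u[s][i] for s in scenarios for i in sites)

for i in sites:
    m += q[i] <= 6 * y[i]                       # stockpiling only at opened facilities
for s in scenarios:
    for i in sites:
        m += q[i] + u[s][i] >= demand[s][i]     # each scenario's local demand is met or penalised

m.solve(pulp.PULP_CBC_CMD(msg=False))
print({i: (int(y[i].value()), int(q[i].value())) for i in sites})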

Keywords: hazmat, rail network, stochastic programming, emergency response

Procedia PDF Downloads 163
468 Communicative and Artistic Machines: A Survey of Models and Experiments on Artificial Agents

Authors: Artur Matuck, Guilherme F. Nobre

Abstract:

Machines can be tools, media, or social agents. Advances in technology have been delivering machines capable of autonomous expression, both through communication and through art. This paper deals with models (a theoretical approach) and experiments (an applied approach) related to artificial agents. On the one hand, it traces how social science scholars have worked with topics such as text automatization, man-machine writing cooperation, and communication. On the other hand, it covers how computer science scholars have built communicative and artistic machines, including the programming of creativity. The aim is to present a brief survey of artificially intelligent communicators and artificially creative writers, and to provide a basis for understanding meta-authorship as well as new and further man-machine co-authorship.

Keywords: artificial communication, artificial creativity, artificial writers, meta-authorship, robotic art

Procedia PDF Downloads 274
467 Mathematical Modeling for the Break-Even Point Problem in a Non-homogeneous System

Authors: Filipe Cardoso de Oliveira, Lino Marcos da Silva, Ademar Nogueira do Nascimento, Cristiano Hora de Oliveira Fontes

Abstract:

This article presents a mathematical formulation for the production break-even point problem in a non-homogeneous system. The optimization problem aims to obtain the composition of the best product mix in a non-homogeneous industrial plant, with the lowest cost until the break-even point is reached. The problem constraints represent real limitations of a generic non-homogeneous industrial plant for n different products. The proposed model is able to solve the equilibrium point problem simultaneously for all products, unlike existing approaches that propose a sequential resolution, considering each product in isolation and providing a sub-optimal solution to the problem. The results indicate that the product mix found through the proposed model has economic advantages over the traditional approach.

Keywords: branch and bound, break-even point, non-homogeneous production system, integer linear programming, management accounting

Procedia PDF Downloads 189
466 A Numerical Study of Seismic Effects on Slope Stability Using Node-Based Smooth Finite Element Method

Authors: H. C. Nguyen

Abstract:

This contribution considers seismic effects on the stability of slopes and of footings resting on a slope. The seismic force is simply treated as a static inertial force through values of the acceleration factor. All domains are assumed to undergo plastic deformation and are approximated using the node-based smoothed finite element method (NS-FEM). The failure mechanism and safety factor were then explored using a numerical procedure based on the upper bound approach, in which the optimization problem was formulated as a second-order cone program (SOCP). The data obtained confirm that the upper bound procedure using NS-FEM and SOCP can give stable and rapidly converging results for seismic stability factors.

Keywords: upper bound analysis, safety factor, slope stability, footing resting on slope

Procedia PDF Downloads 98
465 Hybrid Obfuscation Technique for Reverse Engineering Problem

Authors: Asma’a Mahfoud, Abu Bakar Md. Sultan, Abdul Azim Abd, Norhayati Mohd Ali, Novia Admodisastro

Abstract:

Obfuscation is the practice of making something difficult and complicated to understand. Programming code is ordinarily obfuscated to protect intellectual property (IP) and prevent attackers from reverse engineering (RE) a copyrighted software program. Obfuscation may involve encrypting some or all of the code, transforming out potentially revealing data, renaming useful class and variable (identifier) names to meaningless labels, or adding unused or meaningless code to an application binary. Recently, obfuscation techniques have not been performing effectively, as reversing tools are able to break the obfuscated code. In this paper, we propose a hybrid obfuscation technique that combines three renaming approaches. Experimentation was conducted to test the effectiveness of the proposed technique and presented a promising result: the reversing tools were not able to read the code.
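
The three renaming approaches combined in the proposed hybrid technique are not detailed in the abstract; the hedged sketch below only demonstrates the basic idea of a single renaming pass, mapping user-defined function, argument, and variable names to meaningless labels with Python's ast module (the sample source and label scheme are illustrative, and ast.unparse requires Python 3.9 or later).

import ast
import itertools

SOURCE = '''
def average_speed(distance_km, time_hr):
    speed = distance_km / time_hr
    return speed
'''

class RenameObfuscator(ast.NodeTransformer):
    """Toy single-pass renamer: maps user-defined function, argument and
    variable names to meaningless labels (one renaming approach only)."""
    def __init__(self):
        self.mapping = {}
        self.labels = (f"O{i:03d}" for i in itertools.count())

    def _rename(self, name):
        if name not in self.mapping:
            self.mapping[name] = next(self.labels)
        return self.mapping[name]

    def visit_FunctionDef(self, node):
        node.name = self._rename(node.name)
        self.generic_visit(node)
        return node

    def visit_arg(self, node):
        node.arg = self._rename(node.arg)
        return node

    def visit_Name(self, node):
        # Rename assignments and any name already mapped; leave builtins alone.
        if node.id in self.mapping or isinstance(node.ctx, ast.Store):
            node.id = self._rename(node.id)
        return node

tree = RenameObfuscator().visit(ast.parse(SOURCE))
print(ast.unparse(tree))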

Keywords: intellectual property, obfuscation, software security, reverse engineering

Procedia PDF Downloads 135
464 A Survey on the Requirements of University Course Timetabling

Authors: Nurul Liyana Abdul Aziz, Nur Aidya Hanum Aizam

Abstract:

Course timetabling problems occur every semester in a university and involve the allocation of resources (subjects, lecturers, and students) to a number of fixed rooms and timeslots. The assignment is carried out in such a way that there are no conflicts among rooms, students, and lecturers, while fulfilling a range of constraints. The constraints consist of rules and policies set up by the universities as well as lecturers' and students' preferences for courses to be allocated in specific timeslots. This paper focuses specifically on the preference side of the course timetabling problem at one of the public universities in Malaysia. These demands will be incorporated into our existing mathematical model to make it more general and more widely usable. We distributed questionnaires to a number of lecturers and students of the university to investigate their demands and preferences for their desired course timetable. We classify the preferences and convert them in order to construct one mathematical model that can produce such a timetable.

Keywords: university course timetabling problem, integer programming, preferences, constraints

Procedia PDF Downloads 344
463 Timing and Noise Data Mining Algorithm and Software Tool in Very Large Scale Integration (VLSI) Design

Authors: Qing K. Zhu

Abstract:

Very Large Scale Integration (VLSI) design has become very complex due to the continuous integration of millions of gates in one chip, following Moore's law. Designers encounter numerous report files during design iterations using timing and noise analysis tools. This paper presents our work using data mining techniques combined with HTML tables to extract and represent critical timing and noise data. When we apply this data-mining tool in real applications, running speed is important; the software employs table look-up techniques in its implementation to achieve reasonable running speed, based on performance testing results. We added several advanced features for application in an industrial chip design.
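
The actual report formats and tool features are not given in the abstract; the hedged sketch below only illustrates the general workflow of mining path timing/noise figures from a report, keeping them in a dictionary (table look-up) keyed by endpoint for fast repeated queries across design iterations, and emitting an HTML table with pandas. The report excerpt, field names, and thresholds are invented for illustration.

import re
import pandas as pd

# Hypothetical excerpt from a static timing analysis report; the real report
# format and field names depend on the EDA tool in use.
REPORT = """\
path: clk_gen/u1 -> core/reg_42  slack: -0.12  noise: 0.03
path: core/reg_7 -> mem/reg_3    slack:  0.45  noise: 0.18
path: io_pad/u9  -> core/reg_11  slack: -0.31  noise: 0.22
"""

LINE = re.compile(r"path:\s+(\S+)\s+->\s+(\S+)\s+slack:\s+(-?\d+\.\d+)\s+noise:\s+(\d+\.\d+)")

rows = [m.groups() for m in map(LINE.search, REPORT.splitlines()) if m]
df = pd.DataFrame(rows, columns=["startpoint", "endpoint", "slack_ns", "noise_v"])
df[["slack_ns", "noise_v"]] = df[["slack_ns", "noise_v"]].astype(float)

# Dictionary (table look-up) keyed by endpoint, so repeated queries do not
# have to rescan the raw report each time.
by_endpoint = {row.endpoint: row for row in df.itertuples(index=False)}

critical = df[df.slack_ns < 0].sort_values("slack_ns")
critical.to_html("critical_paths.html", index=False)    # HTML table for the report page
print(by_endpoint["core/reg_42"].slack_ns)               # -0.12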

Keywords: VLSI design, data mining, big data, HTML forms, web, VLSI, EDA, timing, noise

Procedia PDF Downloads 235
462 Parallel Evaluation of Sommerfeld Integrals for Multilayer Dyadic Green's Function

Authors: Duygu Kan, Mehmet Cayoren

Abstract:

Sommerfeld integrals (SIs) are commonly encountered in electromagnetics problems involving the analysis of antennas and scatterers embedded in planar multilayered media. Generally speaking, an analytical solution of SIs is unavailable, and it is well known that their numerical evaluation is very time-consuming and computationally expensive due to the highly oscillating and slowly decaying nature of the integrands. Therefore, fast computation of SIs is of paramount importance. In this paper, a parallel code has been developed to speed up the computation of SIs in the framework of calculating the dyadic Green's function in multilayered media. The OpenMP shared-memory approach is used to parallelize the SI algorithm, resulting in significant time savings. Moreover, accelerating the computation of the dyadic Green's function is discussed based on the parallel SI algorithm developed.

Keywords: Sommerfeld-integrals, multilayer dyadic Green’s function, OpenMP, shared memory parallel programming

Procedia PDF Downloads 228
461 Personalized Learning: An Analysis Using Item Response Theory

Authors: A. Yacob, N. Hj. Ali, M. H. Yusoff, M. Y. MohdSaman, W. M. A. F. W. Hamzah

Abstract:

Personalized learning is becoming increasingly popular, as it is not restricted by time, place, or any other barriers. This study proposes an analysis of personalized learning using Item Response Theory (IRT), which considers course material difficulty and learner ability. The study investigates twenty undergraduate students at TATI University College who are taking a programming subject. Using IRT, it was found that finding the most appropriate problem level for each student, with high- and low-level test items included together, is not a problem; thus, student abilities can be assessed more accurately and fairly. Learners who experience more anxiety bear a heavier cognitive load and receive lower test scores. Instructors are encouraged to provide a supportive learning environment to enhance learning effectiveness, because Cognitive Load Theory concerns the limited capacity of the brain to absorb new information.
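
The abstract does not specify which IRT model was used; as a hedged illustration, the sketch below uses the two-parameter logistic (2PL) model to estimate a learner's ability from scored item responses and to pick the item whose difficulty best matches that ability. The item parameters and responses are invented for the example.

import numpy as np
from scipy.optimize import minimize_scalar

def p_correct(theta, a, b):
    """2PL item response function: probability of a correct answer given
    ability theta, discrimination a and difficulty b."""
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

# Assumed item parameters for five programming test items (illustrative only).
a = np.array([1.2, 0.8, 1.5, 1.0, 2.0])    # discrimination
b = np.array([-1.0, -0.3, 0.2, 0.8, 1.5])  # difficulty
responses = np.array([1, 1, 1, 0, 0])      # one student's scored answers

def neg_log_likelihood(theta):
    p = p_correct(theta, a, b)
    return -np.sum(responses * np.log(p) + (1 - responses) * np.log(1 - p))

ability = minimize_scalar(neg_log_likelihood, bounds=(-4, 4), method="bounded").x
print(f"estimated ability: {ability:.2f}")

# Adaptive idea: present the item whose difficulty is closest to the current
# ability estimate, so the test adapts to the individual learner.
next_item = int(np.argmin(np.abs(b - ability)))
print("next item to present:", next_item)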

Keywords: assessment, item response theory, cognitive load theory, learning, motivation, performance

Procedia PDF Downloads 292
460 Economic Evaluation of an Advanced Bioethanol Manufacturing Technology Using Maize as a Feedstock in South Africa

Authors: Ayanda Ndokwana, Stanley Fore

Abstract:

Industrial prosperity and the rapid expansion of the human population in South Africa over the past two decades have increased the use of conventional fossil fuels such as crude oil, coal, and natural gas to meet the country's energy demands. However, the inevitable depletion of fossil fuel reserves, volatile global oil prices, and a large carbon footprint are some of the crucial reasons the South African Government needs to make a considerable investment in the development of the biofuel industry. In South Africa, this industry is still at the introductory stage, with no large-scale manufacturing plant commissioned yet. Bioethanol is a potential replacement for gasoline, a fossil fuel used in motor vehicles. Using bioethanol as a fuel for the transport sector will help the Government save the heavy foreign exchange expenditure incurred in importing oil and create many job opportunities in rural farming. In 2007, the South African Government developed the National Biofuels Industrial Strategy in an effort to provide support for and attract investment in bioethanol production. However, capital investment in large-scale bioethanol production depends on a sound economic assessment of the available manufacturing technologies. The aim of this study is to evaluate the profitability of an advanced bioethanol manufacturing technology that uses maize as a feedstock in South Africa. The fiber (bran) fractionation in this technology gives it a number of merits, such as energy efficiency, low capital expenditure, and profitability, compared to a conventional dry-mill bioethanol technology. Quantitative techniques will be used to collect and analyze numerical data from suitable organisations in South Africa. The dependence of three profitability indicators, the Discounted Payback Period (DPP), Net Present Value (NPV), and Return on Investment (ROI), on plant capacity will be evaluated. Profitability analysis will be done for the following plant capacities: 100 000 ton/year, 150 000 ton/year, and 200 000 ton/year. The plant capacity with the shortest discounted payback period, a positive net present value, and the highest return on investment warrants further consideration for capital investment.
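
As a hedged illustration of how the three indicators are computed, the sketch below derives NPV, ROI, and the discounted payback period from an initial capital outlay and a stream of annual net cash flows; the capital figure, cash flows, and discount rate are purely hypothetical placeholders, not results from the study.

import numpy as np

def profitability_indicators(capital, cash_flows, rate):
    """Compute NPV, ROI and the discounted payback period (DPP) for an
    initial capital outlay and a list of annual net cash flows."""
    years = np.arange(1, len(cash_flows) + 1)
    discounted = np.asarray(cash_flows) / (1.0 + rate) ** years

    npv = discounted.sum() - capital
    roi = (sum(cash_flows) - capital) / capital

    cumulative = np.cumsum(discounted)
    paid_back = np.flatnonzero(cumulative >= capital)
    if paid_back.size == 0:
        dpp = None                       # outlay never recovered within the horizon
    else:
        y = paid_back[0]                 # first year whose cumulative DCF covers the outlay
        prev = cumulative[y - 1] if y > 0 else 0.0
        dpp = y + (capital - prev) / discounted[y]
    return npv, roi, dpp

# Hypothetical 100 000 ton/year plant: outlay and annual net cash flows in million ZAR.
npv, roi, dpp = profitability_indicators(
    capital=850.0,
    cash_flows=[140.0, 160.0, 180.0, 190.0, 200.0, 200.0, 200.0, 200.0],
    rate=0.10,
)
print(f"NPV = {npv:.1f} m ZAR, ROI = {roi:.1%}, DPP = {dpp:.2f} years")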

Keywords: bioethanol, economic evaluation, maize, profitability indicators

Procedia PDF Downloads 209