Search results for: elliptic curve digital signature algorithm
4987 A Study of Effective Stereo Matching Method for Long-Wave Infrared Camera Module
Authors: Hyun-Koo Kim, Yonghun Kim, Yong-Hoon Kim, Ju Hee Lee, Myungho Song
Abstract:
In this paper, we describe an efficient stereo matching method and a pedestrian detection method using a stereo LWIR camera. We compared three stereo matching algorithms: block matching, ELAS, and SGM. For pedestrian detection with the stereo LWIR camera, we used the SGM stereo matching method, a free-space detection method based on u/v-disparity, and HOG-feature-based pedestrian detection. According to the test results, the SGM method performs better than the block matching and ELAS algorithms. The combination of SGM, free-space detection, and pedestrian detection using HOG features with SVM classification can detect pedestrians at a distance of 30 m with a distance error of about 30 cm.
Keywords: advanced driver assistance system, pedestrian detection, stereo matching method, stereo long-wave IR camera
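As a rough illustration of the SGM step the abstract compares against block matching, OpenCV's semi-global block matcher can compute a disparity map from a rectified stereo pair; the file names and all parameter values below are illustrative assumptions, not the authors' settings.

```python
import cv2

# Load a rectified LWIR stereo pair (file names are placeholders).
left = cv2.imread("lwir_left.png", cv2.IMREAD_GRAYSCALE)
right = cv2.imread("lwir_right.png", cv2.IMREAD_GRAYSCALE)

# Semi-global matching (SGM/SGBM); parameters are illustrative only.
sgbm = cv2.StereoSGBM_create(
    minDisparity=0,
    numDisparities=64,        # must be divisible by 16
    blockSize=5,
    P1=8 * 5 * 5,             # smoothness penalty for small disparity changes
    P2=32 * 5 * 5,            # smoothness penalty for large disparity changes
    uniquenessRatio=10,
)
disparity = sgbm.compute(left, right).astype("float32") / 16.0  # fixed-point -> px

# Depth (and hence pedestrian distance) follows from Z = f * B / d,
# with focal length f and baseline B of the stereo rig.
```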
Procedia PDF Downloads 415
4986 The Reliability Analysis of Concrete Chimneys Due to Random Vortex Shedding
Authors: Saba Rahman, Arvind K. Jain, S. D. Bharti, T. K. Datta
Abstract:
Chimneys are generally tall and slender structures with circular cross-sections, which makes them highly prone to wind forces. Wind exerts pressure on the wall of a chimney, producing unwanted forces. Vortex-induced oscillation is one such excitation and can lead to the failure of chimneys. Vortex-induced oscillation of chimneys is therefore of great concern to researchers and practitioners, since many failures of chimneys due to vortex shedding have occurred in the past. As a consequence, extensive research has taken place on the subject over decades. Many laboratory experiments have been performed to verify the theoretical models proposed to predict vortex-induced forces, including aero-elastic effects. Comparatively few prototype measurement data have been recorded to verify these models. For this reason, theoretical models developed with the help of experimental laboratory data are used for analyzing chimneys under vortex-induced forces. This calls for reliability analysis of the predicted responses of chimneys produced by the vortex shedding phenomenon. Although a considerable literature exists on the vortex-induced oscillation of chimneys, including code provisions, reliability analysis of chimneys against failure caused by vortex shedding is scarce. In the present study, a reliability analysis of chimneys against vortex shedding failure is presented, assuming the uncertainty in the vortex shedding phenomenon to be significantly greater than the other uncertainties, which are hence ignored. The vortex shedding is modeled as a stationary random process and is represented by a power spectral density function (PSDF). It is assumed that the vortex shedding forces are perfectly correlated and act over the top one-third height of the chimney. The PSDF of the tip displacement of the chimney is obtained by performing a frequency-domain spectral analysis using a matrix approach. For this purpose, both the chimney and the random wind forces are discretized over a number of points along the height of the chimney. The method of analysis duly accounts for the aero-elastic effects. The double-barrier threshold crossing level, as proposed by Vanmarcke, is used for determining the probability of crossing different threshold levels of the tip displacement of the chimney. Assuming the annual distribution of the mean wind velocity to be a Gumbel type-I distribution, the fragility curve denoting the variation of the annual probability of threshold crossing against different threshold levels of the tip displacement is determined. The reliability estimate is derived from the fragility curve. A 210 m tall concrete chimney with a base diameter of 35 m, a top diameter of 21 m, and a wall thickness of 0.3 m is taken as an illustrative example. The terrain condition is assumed to correspond to a city center. The expression for the PSDF of the vortex shedding force is taken from Vickery and Basu. The results of the study show that the threshold crossing reliability of the tip displacement of the chimney is significantly influenced by the assumed structural damping and the Gumbel distribution parameters. Further, the aero-elastic effect influences the reliability estimate to a great extent for small structural damping.
Keywords: chimney, fragility curve, reliability analysis, vortex-induced vibration
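The final fragility step described above can be sketched numerically: given threshold-crossing probabilities conditioned on the mean wind velocity, the annual probability follows by integrating over a Gumbel type-I distribution of the annual mean wind. The Gumbel parameters and, in particular, the conditional-probability function below are placeholders standing in for the paper's Vanmarcke-based model, not its actual values.

```python
import numpy as np
from scipy.stats import gumbel_r

# Illustrative Gumbel type-I parameters for annual mean wind velocity (m/s).
loc, scale = 25.0, 4.0

def p_cross_given_v(v, threshold):
    """Placeholder for the conditional threshold-crossing probability of the
    tip displacement (the paper uses Vanmarcke's double-barrier formulation)."""
    return 1.0 - np.exp(-np.exp(-(threshold - 0.004 * v**2) / 0.05))

v = np.linspace(5.0, 60.0, 500)
for threshold in (0.3, 0.5, 0.8):  # tip-displacement thresholds, m
    # Annual probability = integral of P(cross | v) * f_Gumbel(v) dv
    p_annual = np.trapz(p_cross_given_v(v, threshold) * gumbel_r.pdf(v, loc, scale), v)
    print(f"threshold {threshold:.1f} m -> annual crossing probability {p_annual:.3e}")
```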
Procedia PDF Downloads 161
4985 Person Re-Identification using Siamese Convolutional Neural Network
Authors: Sello Mokwena, Monyepao Thabang
Abstract:
In this study, we propose a comprehensive approach to address the challenges in person re-identification models. By combining a centroid tracking algorithm with a Siamese convolutional neural network model, our method excels in detecting, tracking, and capturing robust person features across non-overlapping camera views. The algorithm efficiently identifies individuals in the camera network, while the neural network extracts fine-grained global features for precise cross-image comparisons. The approach's effectiveness is further accentuated by leveraging the camera network topology for guidance. Our empirical analysis on benchmark datasets highlights its competitive performance, particularly evident when background subtraction techniques are selectively applied, underscoring its potential in advancing person re-identification techniques.
Keywords: camera network, convolutional neural network topology, person tracking, person re-identification, siamese
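The Siamese comparison idea admits a minimal PyTorch sketch: twin weight-shared encoders embed two person crops, and their distance scores whether the crops show the same identity. The tiny backbone and contrastive loss below are generic stand-ins, not the authors' architecture.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SiameseNet(nn.Module):
    def __init__(self, emb_dim=128):
        super().__init__()
        # Tiny stand-in encoder; the paper's backbone is not specified here.
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(64, emb_dim),
        )

    def forward(self, a, b):
        # Weight sharing: the same encoder embeds both crops.
        return self.encoder(a), self.encoder(b)

def contrastive_loss(za, zb, same, margin=1.0):
    d = F.pairwise_distance(za, zb)
    return (same * d.pow(2) + (1 - same) * F.relu(margin - d).pow(2)).mean()

net = SiameseNet()
a, b = torch.randn(4, 3, 128, 64), torch.randn(4, 3, 128, 64)  # person crops
za, zb = net(a, b)
loss = contrastive_loss(za, zb, same=torch.tensor([1., 0., 1., 0.]))
```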
Procedia PDF Downloads 73
4984 Optimal Relaxation Parameters for Obtaining Efficient Iterative Methods for the Solution of Electromagnetic Scattering Problems
Authors: Nadaniela Egidi, Pierluigi Maponi
Abstract:
The approximate solution of a time-harmonic electromagnetic scattering problem for inhomogeneous media is required in several application contexts, and its two-dimensional formulation is a Fredholm integral equation of the second kind. This integral equation provides a formulation for the direct scattering problem, but it also has to be solved many times in the numerical solution of the corresponding inverse scattering problem. The discretization of this Fredholm equation produces large and dense linear systems that are usually solved by iterative methods. In order to improve the efficiency of these iterative methods, we use symmetric SOR (SSOR) preconditioning, and we propose an algorithm for the evaluation of the associated relaxation parameter. We show the efficiency of the proposed algorithm by several numerical experiments, where we use two Krylov subspace methods, i.e., Bi-CGSTAB and GMRES.
Keywords: Fredholm integral equation, iterative method, preconditioning, scattering problem
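A minimal sketch of SSOR-preconditioned GMRES with SciPy, assuming the factored form M = (D/ω + L) · (ω/(2−ω)) · D⁻¹ · (D/ω + U) and applying M⁻¹ via two triangular solves. The sparse toy system below merely stands in for the (dense, complex) discretized Fredholm operator; the relaxation parameter ω is fixed here, whereas the paper's algorithm selects it.

```python
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

def ssor_preconditioner(A, omega):
    """Return M^{-1} as a LinearOperator for the SSOR preconditioner
    M = (D/omega + L) * (omega/(2-omega))^{-1}-scaled D^{-1} * (D/omega + U)."""
    d = A.diagonal()
    lower = (sp.diags(d) / omega + sp.tril(A, k=-1)).tocsr()
    upper = (sp.diags(d) / omega + sp.triu(A, k=1)).tocsr()
    scale = (2.0 - omega) / omega

    def apply(r):
        y = spla.spsolve_triangular(lower, r, lower=True)   # forward sweep
        y = scale * (d * y)
        return spla.spsolve_triangular(upper, y, lower=False)  # backward sweep

    n = A.shape[0]
    return spla.LinearOperator((n, n), matvec=apply)

# Toy sparse system standing in for the discretized integral equation.
n = 200
A = sp.diags([-1.0, 4.0, -1.0], [-1, 0, 1], shape=(n, n), format="csr")
b = np.ones(n)
M = ssor_preconditioner(A, omega=1.2)  # relaxation parameter; the paper optimizes it
x, info = spla.gmres(A, b, M=M)
print(info, np.linalg.norm(A @ x - b))
```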
Procedia PDF Downloads 105
4983 Matching on Bipartite Graphs with Applications to School Course Registration Systems
Authors: Zhihan Li
Abstract:
Nowadays, most universities use course enrollment systems that consider students' registration order. However, students' level of preference for certain courses is also an important factor to consider. In this research, the possibility of applying a preference-first system is discussed and analyzed in comparison with an order-first system. A bipartite graph is used to model the relationship between students and the courses they intend to register for. With the graph set up, we apply the Ford-Fulkerson (F.F.) algorithm to maximize pairings between the two sets of nodes, in our case students and courses. Two models are proposed in this paper: one that considers students' registration order first, and one that considers students' preference first. By comparing and contrasting the two models, we highlight their usability, which potentially leads to better designs for school course registration systems.
Keywords: bipartite graph, Ford-Fulkerson (F.F.) algorithm, graph theory, maximum matching
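The augmenting-path idea behind Ford-Fulkerson on a student-course bipartite graph fits in a few lines (Kuhn's algorithm); single-seat courses and the preference-ordered scan below are illustrative assumptions, not the paper's exact models.

```python
def max_bipartite_matching(prefs, n_courses):
    """prefs[s] = list of course ids student s is willing to take,
    scanned in preference order; returns (matched count, course -> student)."""
    match = [-1] * n_courses  # course -> student, -1 if seat is free

    def try_assign(s, seen):
        for c in prefs[s]:            # preference-first scan
            if not seen[c]:
                seen[c] = True
                # Take c if free, or try to re-route its current holder
                # along an augmenting path.
                if match[c] == -1 or try_assign(match[c], seen):
                    match[c] = s
                    return True
        return False

    matched = sum(try_assign(s, [False] * n_courses) for s in range(len(prefs)))
    return matched, match

# Three students, three single-seat courses.
prefs = [[0, 1], [0], [1, 2]]
print(max_bipartite_matching(prefs, 3))  # -> (3, [1, 0, 2])
```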
Procedia PDF Downloads 111
4982 Coordinated Interference Canceling Algorithm for Uplink Massive Multiple Input Multiple Output Systems
Authors: Messaoud Eljamai, Sami Hidouri
Abstract:
Massive multiple-input multiple-output (MIMO) is an emerging technology for new cellular networks such as 5G systems. Its principle is to use many antennas per cell in order to maximize the network's spectral efficiency. Inter-cellular interference remains a fundamental problem, and massive MIMO is no exception to this rule: it improves performance only when the number of antennas is significantly greater than the number of users, which considerably limits the network's spectral efficiency. In this paper, a coordinated detector for an uplink massive MIMO system is proposed in order to mitigate inter-cellular interference. The proposed scheme combines the coordinated multipoint (CoMP) technique with an interference-cancelling algorithm. It requires each serving cell to send its received symbols, after processing, decision, and error detection, to the interfered cells via a backhaul link. Each interfered cell is then capable of eliminating inter-cellular interference by generating and subtracting the interfering users' contribution from the received signal. The resulting signal is more reliable than the original received signal, which allows the uplink massive MIMO system to improve its performance dramatically. Simulation results show that the proposed detector improves system spectral efficiency compared to classical linear detectors.
Keywords: massive MIMO, CoMP, interference canceling algorithm, spectral efficiency
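A numpy sketch of the core cancellation step: the interfered cell reconstructs the out-of-cell users' contribution from the decided symbols received over the backhaul and subtracts it before its own detection. Dimensions, the i.i.d. channel model, and the zero-forcing detector are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
M, K = 64, 8  # antennas per cell, users per cell

# Channels seen at the interfered cell: own users H, out-of-cell interferers G.
H = (rng.standard_normal((M, K)) + 1j * rng.standard_normal((M, K))) / np.sqrt(2)
G = (rng.standard_normal((M, K)) + 1j * rng.standard_normal((M, K))) / np.sqrt(2)

qpsk = np.array([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j]) / np.sqrt(2)
x_own, x_int = rng.choice(qpsk, K), rng.choice(qpsk, K)
noise = 0.05 * (rng.standard_normal(M) + 1j * rng.standard_normal(M))
y = H @ x_own + G @ x_int + noise

# Backhaul delivers the serving cell's decided symbols x_hat (assumed correct
# here; in the paper they are taken after decision and error detection).
x_hat = x_int
y_clean = y - G @ x_hat            # subtract reconstructed interference

# Zero-forcing detection on the cleaned signal.
x_zf = np.linalg.pinv(H) @ y_clean
print(np.round(x_zf - x_own, 2))   # residual error is noise-limited
```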
Procedia PDF Downloads 147
4981 Error Probability of Multi-User Detection Techniques
Authors: Komal Babbar
Abstract:
Multiuser detection is the intelligent estimation/demodulation of transmitted bits in the presence of multiple access interference (MAI). The authors present the bit error rate (BER) achieved by linear multi-user detectors: the matched filter (which treats the MAI as AWGN), the decorrelating detector, and the MMSE detector. In this work, the authors investigate the bit error probability analysis for the matched filter, decorrelating, and MMSE detectors. This problem arises in several practical CDMA applications where the receiver may not have full knowledge of the number of active users and their signature sequences. In particular, the behavior of MAI at the output of the multi-user detectors (MUD) is examined under various asymptotic conditions, including large signal-to-noise ratio, large near-far ratios, and a large number of users. In the last section, the authors also show MATLAB simulation results for the multiuser detection techniques, i.e., matched filter, decorrelating, and MMSE, for 2 users and 10 users.
Keywords: code division multiple access, decorrelating, matched filter, minimum mean square error (MMSE) detection, multiple access interference (MAI), multiuser detection (MUD)
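For synchronous CDMA with signature matrix S, amplitudes A, and received vector r = S A b + n, the three linear detectors compared above are the matched filter Sᵀr, the decorrelator (SᵀS)⁻¹Sᵀr, and the MMSE detector (SᵀS + σ²A⁻²)⁻¹Sᵀr. A numpy sketch (spreading sequences, powers, and noise level are illustrative) might look like this; the abstract's MATLAB results are not reproduced:

```python
import numpy as np

rng = np.random.default_rng(1)
N, K = 31, 10                                      # spreading gain, active users
S = rng.choice([-1.0, 1.0], (N, K)) / np.sqrt(N)   # signature sequences
A = np.diag(rng.uniform(0.5, 2.0, K))              # received amplitudes (near-far)
b = rng.choice([-1.0, 1.0], K)                     # transmitted bits
sigma = 0.3
r = S @ A @ b + sigma * rng.standard_normal(N)

R = S.T @ S                                        # cross-correlation matrix
y_mf = S.T @ r                                     # matched filter (MAI as AWGN)
y_dec = np.linalg.solve(R, y_mf)                   # decorrelating detector
y_mmse = np.linalg.solve(R + sigma**2 * np.linalg.inv(A @ A), y_mf)  # MMSE

for name, y in [("MF", y_mf), ("decorrelator", y_dec), ("MMSE", y_mmse)]:
    print(name, "bit errors:", int(np.sum(np.sign(y) != b)))
```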
Procedia PDF Downloads 528
4980 Application of Artificial Immune Systems Combined with Collaborative Filtering in Movie Recommendation System
Authors: Pei-Chann Chang, Jhen-Fu Liao, Chin-Hung Teng, Meng-Hui Chen
Abstract:
This research combines an artificial immune system with user- and item-based collaborative filtering to create an efficient and accurate recommendation system. By applying the characteristics of antibodies and antigens in the artificial immune system and using the Pearson correlation coefficient as the affinity threshold to cluster the data, our collaborative filtering can effectively find useful users and items for rating prediction. This research uses the MovieLens dataset as the testing target to evaluate the effectiveness of the algorithm developed in this study. The experimental results show that the algorithm can effectively and accurately predict movie ratings. Compared to some state-of-the-art collaborative filtering systems, our system outperforms them in terms of the mean absolute error on the MovieLens dataset.
Keywords: artificial immune system, collaborative filtering, recommendation system, similarity
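The Pearson-affinity step that drives the clustering can be sketched as plain user-based collaborative filtering: neighbors whose correlation exceeds an affinity threshold contribute to the prediction. The antibody/antigen clustering itself is the paper's contribution and is not reproduced; the threshold and toy ratings below are assumptions.

```python
import numpy as np

def pearson(u, v):
    """Pearson correlation over co-rated items of two rating vectors (0 = unrated)."""
    mask = (u > 0) & (v > 0)
    if mask.sum() < 2:
        return 0.0
    du, dv = u[mask] - u[mask].mean(), v[mask] - v[mask].mean()
    denom = np.sqrt((du**2).sum() * (dv**2).sum())
    return float(du @ dv / denom) if denom > 0 else 0.0

def predict(R, user, item, affinity=0.3):
    """Predict R[user, item] from users whose Pearson affinity exceeds a threshold."""
    mean_u = R[user][R[user] > 0].mean()
    num = den = 0.0
    for v in range(R.shape[0]):
        if v == user or R[v, item] == 0:
            continue
        w = pearson(R[user], R[v])
        if w >= affinity:                     # affinity threshold as in the abstract
            num += w * (R[v, item] - R[v][R[v] > 0].mean())
            den += abs(w)
    return mean_u + num / den if den else mean_u

R = np.array([[5, 3, 0, 1], [4, 0, 4, 1], [1, 1, 5, 4], [0, 1, 5, 4.]])
print(predict(R, user=0, item=2))  # -> 4.0 on this toy matrix
```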
Procedia PDF Downloads 536
4979 Polymer Mediated Interaction between Grafted Nanosheets
Authors: Supriya Gupta, Paresh Chokshi
Abstract:
Polymer-particle interactions can be effectively utilized to produce composites that possess physicochemical properties superior to those of the neat polymer. The incorporation of fillers with dimensions comparable to the polymer chain size produces composites with extraordinary properties owing to their very high surface-to-volume ratio. The dispersion of nanoparticles is achieved by inducing steric repulsion, realized by grafting the particles with polymeric chains. A comprehensive understanding of the interparticle interaction between these functionalized nanoparticles plays an important role in the synthesis of a stable polymer nanocomposite. With the focus on the incorporation of clay sheets in a polymer matrix, we theoretically construct the polymer-mediated interparticle potential for two nanosheets grafted with polymeric chains. Self-consistent field theory (SCFT) is employed to obtain the inhomogeneous composition field under equilibrium. Unlike continuum models, SCFT is built from a microscopic description, taking into account the molecular interactions contributed by both intra- and inter-chain potentials. We present the results of SCFT calculations of the interaction potential curve for two grafted nanosheets immersed in a matrix of polymeric chains of dissimilar chemistry to that of the grafted chains. The interaction potential is repulsive at short separation and shows depletion attraction at moderate separations, induced by high grafting density. It is found that the strength of the attraction well can be tuned by altering the compatibility between the grafted and the mobile chains. Further, we construct the interaction potential between two nanosheets grafted with diblock copolymers, with one of the blocks being chemically identical to the free polymeric chains. The interplay between the enthalpic interaction between the dissimilar species and the entropy of the free chains gives rise to a rich behavior in the interaction potential curve, obtained for two separate cases of the free chains being chemically similar to either the grafted block or the free block of the grafted diblock chains.
Keywords: clay nanosheets, polymer brush, polymer nanocomposites, self-consistent field theory
Procedia PDF Downloads 252
4978 Determination of Mechanical Properties of Adhesives via Digital Image Correlation (DIC) Method
Authors: Murat Demir Aydin, Elanur Celebi
Abstract:
Adhesively bonded joints are used as an alternative to traditional joining methods due to the important advantages they provide. The most important consideration in the use of adhesively bonded joints is that they meet the safety requirements for their intended use. To verify this condition, damage analysis of adhesively bonded joints should be performed by determining the mechanical properties of the adhesives. In the literature, the mechanical properties of adhesives are generally determined by traditional measurement methods. In this study, the Digital Image Correlation (DIC) method, which can be an alternative to traditional measurement methods, is used to determine the mechanical properties of adhesives. The DIC method is a relatively new optical measurement method used to determine displacement and strain fields appropriately and accurately. In this study, tensile tests were performed on Thick Adherend Shear Test (TAST) samples formed from DP410 liquid structural adhesive and steel adherends, and on bulk tensile specimens formed from DP410 liquid structural adhesive. The displacement and strain values of the samples were determined by the DIC method, and the shear stress-strain curves of the adhesive were obtained for the TAST specimens, together with the tensile stress-strain curves of the bulk adhesive specimens. Conventional measurement methods (strain gauges, mechanical extensometers, etc.) are not sufficient for determining the strain and displacement values of a very thin adhesive layer, such as that in TAST samples, and would otherwise require additional techniques such as numerical methods. The DIC method removes these requirements and easily achieves displacement measurements with sufficient accuracy.
Keywords: structural adhesive, adhesively bonded joints, digital image correlation, thick adherend shear test (TAST)
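At its core, DIC matches a speckle subset between the reference and deformed images by maximizing a correlation score. A bare-bones integer-pixel version using zero-normalized cross-correlation might look as follows; subset size and search range are illustrative, and real DIC adds sub-pixel interpolation and subset shape functions.

```python
import numpy as np

def ncc(a, b):
    """Zero-normalized cross-correlation of two equally sized subsets."""
    a, b = a - a.mean(), b - b.mean()
    return float((a * b).sum() / np.sqrt((a**2).sum() * (b**2).sum()))

def match_subset(ref, cur, y, x, half=10, search=5):
    """Integer-pixel displacement of the subset centred at (y, x)."""
    tpl = ref[y-half:y+half+1, x-half:x+half+1]
    best, disp = -2.0, (0, 0)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            win = cur[y+dy-half:y+dy+half+1, x+dx-half:x+dx+half+1]
            score = ncc(tpl, win)
            if score > best:
                best, disp = score, (dy, dx)
    return disp  # (v, u); strains follow from gradients of the displacement field

# Synthetic speckle image shifted by (2, 3) pixels.
rng = np.random.default_rng(2)
ref = rng.random((100, 100))
cur = np.roll(ref, shift=(2, 3), axis=(0, 1))
print(match_subset(ref, cur, 50, 50))  # -> (2, 3)
```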
Procedia PDF Downloads 322
4977 Path Planning for Unmanned Aerial Vehicles in Constrained Environments for Locust Elimination
Authors: Aadiv Shah, Hari Nair, Vedant Mittal, Alice Cheeran
Abstract:
Present-day agricultural practices such as blanket spraying not only lead to excessive usage of pesticides but also harm the overall crop yield. This paper introduces an algorithm to optimize the traversal of an unmanned aerial vehicle (UAV) in constrained environments. The proposed system focuses on the agricultural application of targeted spraying for locust elimination. Given a satellite image of a farm, target zones that are prone to locust swarm formation are detected through the calculation of the normalized difference vegetation index (NDVI). This is followed by determining the optimal path for traversal of a UAV through these target zones using the proposed algorithm in order to perform pesticide spraying in the most efficient manner possible. Unlike the classic travelling salesman problem involving point-to-point optimization, the proposed algorithm determines an optimal path for multiple regions, independent of their geometry. Finally, the paper explores the idea of implementing reinforcement learning to model complex environmental behaviour and make the path planning mechanism for UAVs agnostic to external environment changes. This system presents not only a solution to the enormous losses incurred due to locust attacks but also an efficient way to automate agricultural practices across the globe in order to improve farmer ergonomics.
Keywords: locust, NDVI, optimization, path planning, reinforcement learning, UAV
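The target-zone step can be sketched directly: NDVI = (NIR − Red)/(NIR + Red) per pixel, followed by thresholding to mask vegetation patches prone to swarm formation. The threshold value and toy bands below are illustrative assumptions.

```python
import numpy as np

def ndvi_target_zones(nir, red, threshold=0.4):
    """Compute NDVI from NIR/Red bands and mask likely target zones."""
    nir = nir.astype("float64")
    red = red.astype("float64")
    ndvi = (nir - red) / np.maximum(nir + red, 1e-9)  # avoid divide-by-zero
    return ndvi, ndvi > threshold                      # boolean target mask

# Toy 4x4 bands standing in for a satellite image of the farm.
nir = np.array([[90, 80, 20, 10]] * 4)
red = np.array([[10, 20, 60, 70]] * 4)
ndvi, mask = ndvi_target_zones(nir, red)
print(np.round(ndvi, 2))
print(mask.astype(int))  # connected True regions become spray waypoints
```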
Procedia PDF Downloads 251
4976 Curating Pluralistic Futures: Leveling up for Whole-Systems Change
Authors: Daniel Schimmelpfennig
Abstract:
This paper attempts to delineate the idea of curating the leveling up for whole-systems change. Curation is the act of selecting, organizing, looking after, or presenting information from a professional point of view through expert knowledge. The trans-paradigmatic, trans-contextual, trans-disciplinary, trans-perspective stance of trans-media futures studies hopes to enable a move from a monochrome intellectual pursuit towards breathing a higher dimensionality. Progressing to the next level to equip actors for whole-systems change is, in consideration of the commonly known symptoms of our time as well as in anticipation of future challenges, both a necessity and a desirability. Systems of collective intelligence could potentially scale regenerative, adaptive, and anticipatory capacities. How, then, could such a curation be enacted and implemented to initiate the process of leveling up? The suggestion here is to focus on the metasystem transition, the bio-digital fusion; namely, by merging the neurosciences, the ontological design of money as our operating system, and our understanding of the billions of years of time-proven permutations in nature, biomimicry, and biological metaphors like symbiogenesis. Evolutionary cybernetics accompanies the process of whole-systems change.
Keywords: bio-digital fusion, evolutionary cybernetics, metasystem transition, symbiogenesis, transmedia futures studies
Procedia PDF Downloads 155
4975 Micro- and Nanoparticle Transport and Deposition in Elliptic Obstructed Channels by Lattice Boltzmann Method
Authors: Salman Piri
Abstract:
In this study, a two-dimensional lattice Boltzmann method (LBM) was used for the numerical simulation of fluid flow in a channel, and a Lagrangian method was used for particle tracking with one-way coupling. Three hundred spherical particles with specified diameters were released at the channel entry, and an elliptical object was placed in the channel as a flow obstruction. The effects of gravity, the drag force, the Saffman lift, and Brownian forces on the particle trajectories were evaluated. The effects of the geometrical parameter (the ellipse aspect ratio) and the flow characteristic (the Reynolds number) on the transport and deposition of particles were also surveyed. Moreover, the influence of particle diameters between 0.01 and 10 µm was investigated. The results indicate that at small Reynolds numbers, more inertial and gravitational trapping occurred on the obstacle surface for particles with larger diameters, whereas for nano-particles, influenced by Brownian diffusion and the vortices behind the obstacle, the inertial and gravitational mechanisms were insignificant and diffusion was the dominant deposition mechanism. In addition, at Reynolds numbers larger than 400, there was no significant difference between the deposition of finer and larger particles. Also, at higher aspect ratios of the ellipse, more inertial trapping occurred for particles of larger diameter (10 µm), while at lower aspect ratios, the interception and gravitational mechanisms were dominant.
Keywords: ellipse aspect ratio, particle tracking, diffusion, lattice Boltzmann method, Lagrangian particle tracking
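A sketch of the one-way-coupled Lagrangian update described above: each particle is advanced under Stokes drag toward the local fluid velocity, gravity, and a Brownian random force. The fluid field here is a fixed placeholder rather than an LBM solution, the Brownian term follows a Li-and-Ahmadi-style white-noise model with the slip correction taken as unity, and all physical constants are illustrative.

```python
import numpy as np

rng = np.random.default_rng(3)

def step(pos, vel, fluid_vel, dt, d_p, rho_p=1000.0, mu=1.8e-5, T=293.0):
    """One explicit step of one-way-coupled particle motion (2D)."""
    kB = 1.380649e-23
    tau = rho_p * d_p**2 / (18.0 * mu)          # Stokes relaxation time
    drag = (fluid_vel - vel) / tau               # drag acceleration
    gravity = np.array([0.0, -9.81])
    # White-noise Brownian acceleration with spectral intensity
    # S0 = 216 mu kB T / (pi^2 rho_p^2 d^5)  (slip correction Cc = 1).
    S0 = 216.0 * mu * kB * T / (np.pi**2 * rho_p**2 * d_p**5)
    brownian = rng.standard_normal(2) * np.sqrt(np.pi * S0 / dt)
    vel = vel + dt * (drag + gravity + brownian)
    return pos + dt * vel, vel

# A 1-micron particle in a uniform 0.1 m/s channel flow (placeholder for LBM field).
pos, vel = np.array([0.0, 0.0]), np.array([0.0, 0.0])
for _ in range(1000):
    pos, vel = step(pos, vel, fluid_vel=np.array([0.1, 0.0]), dt=1e-6, d_p=1e-6)
print(pos)
```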
Procedia PDF Downloads 79
4974 Development of Construction Cost Optimization System Using Genetic Algorithm Method
Authors: Hyeon-Seung Kim, Young-Hwan Kim, Sang-Mi Park, Min-Seo Kim, Jong-Myeung Shin, Leen-Seok Kang
Abstract:
The project budget set at the planning stage might be changed by an insufficient government budget or by design changes; such cases are common, especially for projects performed over a long period of time. If the actual construction budget is insufficient compared with the planned budget, the construction schedule should also be changed to match the changed budget. In that case, most project managers change the planned construction schedule by a heuristic approach without reasonable consideration of work priority. This study suggests an optimized methodology to modify the construction schedule according to the changed budget. A genetic algorithm was used to optimize the modified construction schedule within the changed budget, and a simulation system of the construction cost histogram in accordance with the construction schedule was developed in a BIM (Building Information Modeling) environment.
Keywords: 5D, BIM, GA, cost optimization
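A toy sketch of the GA step, assuming a simple encoding: chromosomes hold the start period of each activity, fitness penalizes plans whose per-period cost profile exceeds the changed budget while preferring shorter schedules, and standard crossover/mutation evolve the population. The encoding, costs, and operators are illustrative, not the developed 5D BIM system.

```python
import numpy as np

rng = np.random.default_rng(4)
costs = np.array([30, 50, 40, 60, 20])      # activity costs (illustrative)
budget_per_period, horizon, n_act = 70, 8, 5

def fitness(start):                          # start period of each activity
    profile = np.zeros(horizon)
    for s, c in zip(start, costs):
        profile[s] += c                      # lump-sum cash flow per activity
    over = np.maximum(profile - budget_per_period, 0).sum()
    return -(start.max() + 1) - 10.0 * over  # short schedule, heavy budget penalty

pop = rng.integers(0, horizon, (40, n_act))
for _ in range(200):
    fit = np.array([fitness(ind) for ind in pop])
    parents = pop[np.argsort(fit)[-20:]]                 # truncation selection
    cut = rng.integers(1, n_act, 20)
    kids = np.array([np.concatenate([parents[i][:c], parents[(i + 1) % 20][c:]])
                     for i, c in enumerate(cut)])        # one-point crossover
    mut = rng.random(kids.shape) < 0.1
    kids[mut] = rng.integers(0, horizon, mut.sum())      # random-reset mutation
    pop = np.vstack([parents, kids])

best = pop[np.argmax([fitness(ind) for ind in pop])]
print(best, fitness(best))
```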
Procedia PDF Downloads 588
4973 Proposals of Exposure Limits for Infrasound From Wind Turbines
Authors: M. Pawlaczyk-Łuszczyńska, T. Wszołek, A. Dudarewicz, P. Małecki, M. Kłaczyński, A. Bortkiewicz
Abstract:
Human tolerance to infrasound is bounded by the hearing threshold: infrasound that cannot be heard (or felt) is not annoying and is not thought to have any other adverse or health effects. Recent research has largely confirmed earlier findings. ISO 7196:1995 recommends the use of the G-weighting characteristic for the assessment of infrasound, and there is a strong correlation between the G-weighted SPL and the perception of annoyance. The aim of this study was to propose exposure limits for infrasound from wind turbines. Only a few countries have set limits for infrasound; these limits are usually no higher than 85-92 dBG, and none of them are specific to wind turbines. Over the years, a number of studies have been carried out to determine hearing thresholds below 20 Hz. It has been recognized that 10% of young people are able to perceive a 10 Hz tone at around 90 dB, and it has also been found that the difference in median hearing thresholds between young adults aged around 20 years and older adults aged over 60 years is around 10 dB, irrespective of frequency. This shows that older people (up to about 60 years of age) retain good hearing in the low-frequency range, while their sensitivity to higher frequencies is often significantly reduced. In terms of exposure limits for infrasound, the average hearing threshold corresponds to a tone with a G-weighted SPL of about 96 dBG, whereas infrasound at Lp,G levels below 85-90 dBG is usually inaudible. The individual hearing threshold can, therefore, be 10-15 dB lower than the average threshold, so the recommended limits for environmental infrasound could be 75 dBG or 80 dBG. It is worth noting that the G86 curve has been taken as the threshold of auditory perception of infrasound reached by 90-95% of the population, so the G75 and G80 curves can be taken as the criterion curves for wind turbine infrasound. Finally, two assessment methods and corresponding exposure limit values are proposed for wind turbine infrasound: method I, based on G-weighted sound pressure level measurements, and method II, based on frequency analysis in 1/3-octave bands in the frequency range 4-20 Hz. Separate limit values have been set for outdoor living areas in the open countryside (area A) and for noise-sensitive areas (area B). For method I, infrasound limit values of 80 dBG (for areas A) and 75 dBG (for areas B) are proposed, while for method II, the criterion curves G80 and G75 have been chosen (for areas A and B, respectively).
Keywords: infrasound, exposure limit, hearing thresholds, wind turbines
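Method II connects to method I through the standard G-weighted band summation: given the measured 1/3-octave band levels $L_{p,i}$ in the 4-20 Hz range and the corresponding G-weighting corrections $G_i$, the overall G-weighted level is

$$L_{p,G} = 10 \log_{10} \sum_i 10^{(L_{p,i} + G_i)/10}\ \text{dBG},$$

so a band spectrum measured under method II can also be checked directly against the 75/80 dBG limits of method I. (The formula is ordinary level summation; the numerical $G_i$ values come from ISO 7196 and are not reproduced here.)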
Procedia PDF Downloads 83
4972 Optimization Based Extreme Learning Machine for Watermarking of an Image in DWT Domain
Authors: Ram Pal Singh, Vikash Chaudhary, Monika Verma
Abstract:
In this paper, we propose an optimization-based Extreme Learning Machine (ELM) for watermarking the B-channel of a color image in the discrete wavelet transform (DWT) domain. ELM, a regularization algorithm, is based on generalized single-hidden-layer feed-forward neural networks (SLFNs); the hidden-layer parameters, generally called the feature mapping in the context of ELM, need not be tuned. This paper shows the embedding and extraction processes of the watermark with the help of ELM, and the results are compared with machine learning models already used for watermarking. Here, a cover image is divided into a suitable number of non-overlapping blocks of the required size, and the DWT is applied to each block to transform it into the low-frequency sub-band domain. ELM provides a unified learning platform with a feature mapping, that is, a mapping between the hidden layer and the output layer of the SLFN, which is employed for watermark embedding and extraction in the cover image. ELM has widespread applications, from binary and multiclass classification to regression and function estimation. Unlike SVM-based algorithms, which achieve suboptimal solutions with high computational complexity, ELM can provide better generalization performance with very small complexity. The efficacy of the optimization-based ELM algorithm is measured by using quantitative and qualitative parameters on a watermarked image, even when the image is subjected to different types of geometrical and conventional attacks.
Keywords: BER, DWT, extreme learning machine (ELM), PSNR
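The ELM training that underlies the embedding/extraction mapping is short enough to sketch: random, untuned hidden-layer parameters followed by a regularized least-squares solve for the output weights. The watermark-specific feature construction and the paper's optimization step are not reproduced; the toy regression below is a stand-in.

```python
import numpy as np

class ELM:
    """Single-hidden-layer ELM: random feature mapping + ridge-regularized output."""
    def __init__(self, n_hidden=100, C=1e3, seed=0):
        self.n_hidden, self.C = n_hidden, C
        self.rng = np.random.default_rng(seed)

    def _map(self, X):
        return np.tanh(X @ self.W + self.b)   # hidden layer, never re-tuned

    def fit(self, X, T):
        self.W = self.rng.standard_normal((X.shape[1], self.n_hidden))
        self.b = self.rng.standard_normal(self.n_hidden)
        H = self._map(X)
        # Regularized solution: beta = (H^T H + I/C)^{-1} H^T T
        self.beta = np.linalg.solve(H.T @ H + np.eye(self.n_hidden) / self.C, H.T @ T)
        return self

    def predict(self, X):
        return self._map(X) @ self.beta

# Toy regression standing in for the block-feature -> watermark-bit mapping.
X = np.random.default_rng(1).standard_normal((500, 8))
T = np.sin(X.sum(axis=1, keepdims=True))
elm = ELM().fit(X[:400], T[:400])
print(np.mean((elm.predict(X[400:]) - T[400:])**2))  # held-out MSE
```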
Procedia PDF Downloads 311
4971 An Optimization Model for Maximum Clique Problem Based on Semidefinite Programming
Authors: Derkaoui Orkia, Lehireche Ahmed
Abstract:
The topic of this article is to explore the potential of a powerful optimization technique, namely semidefinite programming, for solving NP-hard problems. This approach provides tight relaxations of combinatorial and quadratic problems. In this work, we solve the maximum clique problem using such a relaxation. The clique problem is the computational problem of finding cliques in a graph and is widely acknowledged for its many applications to real-world problems. The numerical results show that it is possible to find a maximum clique using an algorithm based on semidefinite programming. We implement a primal-dual interior-point algorithm to solve this problem based on semidefinite programming; the semidefinite relaxation itself can be solved in polynomial time.
Keywords: semidefinite programming, maximum clique problem, primal-dual interior point method, relaxation
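One standard SDP relaxation in this family is the Lovász theta function, which upper-bounds the clique number when applied to the complement graph. A cvxpy sketch follows, using an off-the-shelf interior-point/conic solver rather than the authors' primal-dual implementation; the toy graph is an assumption.

```python
import cvxpy as cp

def lovasz_theta(n, edges):
    """theta(G): max sum(X) s.t. X PSD, trace(X) = 1, X_ij = 0 on edges of G.
    Applied to the complement of H, it upper-bounds the clique number of H."""
    X = cp.Variable((n, n), PSD=True)
    constraints = [cp.trace(X) == 1]
    constraints += [X[i, j] == 0 for (i, j) in edges]
    constraints += [X[j, i] == 0 for (i, j) in edges]
    cp.Problem(cp.Maximize(cp.sum(X)), constraints).solve()
    return cp.sum(X).value

# H: a triangle plus a pendant vertex; omega(H) = 3.
n = 4
edges_H = {(0, 1), (0, 2), (1, 2), (2, 3)}
edges_complement = [(i, j) for i in range(n) for j in range(i + 1, n)
                    if (i, j) not in edges_H]
print(lovasz_theta(n, edges_complement))  # ~3.0, matching omega(H)
```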
Procedia PDF Downloads 222
4970 Training a Neural Network Using Input Dropout with Aggressive Reweighting (IDAR) on Datasets with Many Useless Features
Authors: Stylianos Kampakis
Abstract:
This paper presents a new algorithm for neural networks called Input Dropout with Aggressive Re-weighting (IDAR), aimed specifically at datasets with many useless features. IDAR combines two techniques (dropout of input neurons and aggressive re-weighting) in order to eliminate the influence of noisy features, and can be seen as a generalization of dropout. The algorithm is tested on two different benchmark datasets: a noisy version of the iris dataset and the MADELON dataset. Its performance is compared against three other popular techniques for dealing with useless features: L2 regularization, LASSO, and random forests. The results demonstrate that IDAR can be an effective technique for handling datasets with many useless features.
Keywords: neural networks, feature selection, regularization, aggressive re-weighting
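The input-dropout half of the method is ordinary dropout applied to the input neurons, as in the PyTorch sketch below; the aggressive re-weighting step is specific to the paper and is deliberately not reproduced here, so this is only the baseline half of IDAR.

```python
import torch
import torch.nn as nn

class InputDropoutNet(nn.Module):
    """MLP with dropout applied directly to the input features, so the
    network cannot rely on any single (possibly useless) feature."""
    def __init__(self, n_in, n_hidden=64, p_input=0.5):
        super().__init__()
        self.input_dropout = nn.Dropout(p_input)   # drops input neurons
        self.net = nn.Sequential(
            nn.Linear(n_in, n_hidden), nn.ReLU(),
            nn.Linear(n_hidden, 1),
        )

    def forward(self, x):
        return self.net(self.input_dropout(x))

# MADELON-like setting: 500 features, only a few informative.
model = InputDropoutNet(n_in=500)
x = torch.randn(32, 500)
print(model(x).shape)  # torch.Size([32, 1])
```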
Procedia PDF Downloads 456
4969 EEG Signal Processing Methods to Differentiate Mental States
Authors: Sun H. Hwang, Young E. Lee, Yunhan Ga, Gilwon Yoon
Abstract:
EEG is a very complex signal contaminated by noise and other bio-potential interference, of which EOG is the most prominent when EEG signals are measured and analyzed. How raw EEG signals are processed is therefore very important for obtaining useful information. In this study, EEG signal processing techniques such as EOG filtering and outlier removal were examined to minimize unwanted EOG signals and other noise. The two different mental states of resting and focusing were examined through EEG analysis; a focused state was induced by having subjects watch a red dot on a white screen. EEG data for 32 healthy subjects were measured. EEG data after 60 Hz notch filtering were processed by a commercially available EOG filter and by our presented algorithm based on the removal of outliers. The ratio of the beta wave to the theta wave was used as a parameter for determining the degree of focusing. The results show that our algorithm was more appropriate than the existing EOG filtering.
Keywords: EEG, focus, mental state, outlier, signal processing
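The beta/theta focus parameter can be computed from band powers of a Welch PSD estimate; a minimal sketch follows, using the conventional theta (4-8 Hz) and beta (13-30 Hz) band edges, which the abstract does not specify, and a synthetic test signal.

```python
import numpy as np
from scipy.signal import welch

def beta_theta_ratio(eeg, fs=256.0):
    """Beta/theta band-power ratio from Welch's PSD; higher values
    indicate a more focused state, as used in the study."""
    f, psd = welch(eeg, fs=fs, nperseg=int(2 * fs))
    def bandpower(lo, hi):
        band = (f >= lo) & (f < hi)
        return np.trapz(psd[band], f[band])
    return bandpower(13, 30) / bandpower(4, 8)   # beta / theta

# Synthetic 10 s signal: strong beta component over theta and background noise.
fs = 256.0
t = np.arange(0, 10, 1 / fs)
eeg = (np.sin(2 * np.pi * 20 * t) + 0.5 * np.sin(2 * np.pi * 6 * t)
       + 0.1 * np.random.default_rng(5).standard_normal(t.size))
print(beta_theta_ratio(eeg, fs))  # > 1: beta dominates, i.e. "focused"
```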
Procedia PDF Downloads 284
4968 Using Genetic Algorithm to Organize Sustainable Urban Landscape in Historical Part of City
Authors: Shahab Mirzaean Mahabadi, Elham Ebrahimi
Abstract:
The urban development process in historical urban contexts has predominantly involved two main approaches: the first is the preservation and conservation of the urban fabric and its value, and the second is urban renewal and redevelopment, generally supported by political and economic aspirations. These two approaches are evidently in conflict. The authors go through the history of urban planning in order to review the historical development of the two approaches. In this article, the various values inherent in the historical fabric of a city are illustrated with an emphasis on cultural identity and activity. An attempt is then made to find an optimized plan that simultaneously maximizes economic development and minimizes change in historical-cultural sites. In the proposed model, regarding the decision maker's intention and the variety of functions, the selected zone is divided into a number of components. For each component, different alternatives can be assigned, namely renovation, refurbishment, destruction, or change in function. The decision variable in this model is the choice of an alternative for each component, and a set of decisions made over all components results in a plan. A plan developed in this way can be evaluated from the decision maker's point of view; that is, the interactions between the selected alternatives form a foundation for the assessment of the urban context in designing a historical-cultural landscape. A genetic algorithm (GA) approach is used to search for the optimal future land use within the historical-cultural landscape for a sustainable high-growth city.
Keywords: urban sustainability, green city, regeneration, genetic algorithm
Procedia PDF Downloads 69
4967 Resource-Constrained Heterogeneous Workflow Scheduling Algorithms in Heterogeneous Computing Clusters
Authors: Lei Wang, Jiahao Zhou
Abstract:
The development of heterogeneous computing clusters provides a strong guarantee of computing power for large-scale workflows (e.g., scientific computing, artificial intelligence (AI), etc.). However, the tasks within large-scale workflows have also gradually become heterogeneous due to their differing demands on computing resources, which adds a task resource-restriction constraint to the workflow scheduling problem on heterogeneous computing platforms. In this paper, we propose a heterogeneous resource-constrained minimum-makespan scheduling algorithm based on a greedy strategy, which provides an efficient solution to the heterogeneous workflow scheduling problem on a heterogeneous platform. We test the effectiveness of the proposed scheduling algorithm by randomly generating heterogeneous workflows on a heterogeneous computing platform, and the experiments show that our method improves on state-of-the-art methods by 15.2%.
Keywords: heterogeneous computing, workflow scheduling, constrained resources, minimal makespan
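The greedy idea can be sketched as an earliest-finish-time list scheduler that respects per-task resource restrictions, i.e., each task may run only on processors that satisfy its demands. This is a generic baseline in the paper's spirit, not the authors' exact algorithm, and workflow dependency edges are omitted for brevity.

```python
def greedy_schedule(tasks, procs):
    """tasks: list of (duration_on_proc dict, eligible_procs set), in priority order.
    Assign each task to the eligible processor that finishes it earliest."""
    free_at = {p: 0.0 for p in procs}
    plan = {}
    for tid, (durations, eligible) in enumerate(tasks):
        # Resource-restriction constraint: only eligible processors considered.
        best = min(eligible, key=lambda p: free_at[p] + durations[p])
        start = free_at[best]
        free_at[best] = start + durations[best]
        plan[tid] = (best, start, free_at[best])
    return plan, max(finish for _, _, finish in plan.values())  # makespan

procs = ["cpu0", "cpu1", "gpu0"]
tasks = [
    ({"cpu0": 4, "cpu1": 4, "gpu0": 1}, {"gpu0"}),          # GPU-only kernel
    ({"cpu0": 3, "cpu1": 2, "gpu0": 8}, {"cpu0", "cpu1"}),  # CPU-bound task
    ({"cpu0": 5, "cpu1": 5, "gpu0": 2}, {"cpu0", "cpu1", "gpu0"}),
]
plan, makespan = greedy_schedule(tasks, procs)
print(plan, makespan)  # third task lands on gpu0 after the kernel; makespan 3
```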
Procedia PDF Downloads 35
4966 Web Page Design Optimisation Based on Segment Analytics
Authors: Varsha V. Rohini, P. R. Shreya, B. Renukadevi
Abstract:
Web analytics is the measurement, collection, and analysis of webpage data, through which information delivery and web usage are optimized. Page statistics and user metrics are the main factors in most web analytics tools, and this is the limitation of the existing tools: they do not provide design inputs for the optimization of information. This paper aims to extend the scope of web analytics to provide analysis and statistics for each segment of a webpage. The click count is calculated and the concentration of links in a web page is obtained; these user metrics are used to help properly design the displayed content of a webpage using the Vision-Based Page Segmentation (VIPS) algorithm. When the algorithm is applied to a web page, it divides the entire page into a visual block tree, which further divides the page into visual blocks or segments that help us understand the usage of each segment in the page and its content. Dynamic web pages and deep web pages are used to extend the scope of web page segment analytics. A space optimization concept is applied with the help of the output obtained from the VIPS algorithm. This technique provides visibility into the user's interaction with web pages, helps place the important links in the appropriate segments of a webpage, and effectively manages the space in a page and the concentration of links.
Keywords: analytics, design optimization, visual block trees, vision based technology
Procedia PDF Downloads 266
4965 Assessment of the Administration and Services of Public Access Computers in Academic Libraries in Kaduna State, Nigeria
Authors: Usman Ahmed Adam, Umar Ibrahim, Ezra S. Gbaje
Abstract:
This study explores the practice of providing Public Access Computers (PACs) in academic libraries in Kaduna State, Nigeria. The study aimed to determine the computers and other tools available, the services they support, and the challenges of the practice. Three questions were framed to identify the number of public computers and tools available, their services, and the problems faced in practice. The study used a qualitative research design with semi-structured interviews and observation as tools for data collection, and descriptive analysis was employed to analyze the data. The sample comprised 52 librarians and IT staff from the seven academic institutions in Kaduna State. The findings revealed that PACs were provided for access to the Internet, digital resources, the library catalogue, and training services. The study further found that, given the limited number of computers, users were not allowed to enjoy many of the services. The study recommends that libraries in Kaduna State should provide more public computers to cover the population of their users and should allow users to use the computers without limitations and restrictions.
Keywords: academic libraries, computers in library, digital libraries, public computers
Procedia PDF Downloads 352
4964 Change of Education Business in the Age of 5G
Authors: Heikki Ruohomaa, Vesa Salminen
Abstract:
Regions face intense competition in attracting companies, businesses, inhabitants, and students, and thereby improving their living and business environments, which are rapidly changing due to digitalization. From industry's point of view, on the other hand, the availability of a skilled labor force and an innovative environment are crucial factors. In this context, qualified staff are expected to utilize the opportunities of digitalization and respond to future skill needs. The World Manufacturing Forum stated in its 2019 report that, within the next five years, 40% of workers will have to change their core competencies. Through digital transformation, with new technologies like cloud, mobile, big data, 5G infrastructure, platform technology, data analysis, and social networks with increasing intelligence and automation, enterprises can capitalize on new opportunities and optimize existing operations to achieve significant business improvement. Digitalization will be an important part of the everyday life of citizens and present in the working day of the average citizen and employee in the future. For that reason, the education system and education programs at all levels, from diaper age to doctorate, have been directed to fulfill this ecosystem strategy. Goal: The Fourth Industrial Revolution will bring unprecedented change to societies, education organizations, and business environments. This article aims to identify how education, education content, the way education proceeds, and the education business as a whole are changing, and, most importantly, how we should respond to this inevitable co-evolution. Methodology: The study aims to verify how the learning process is boosted by new digital content, new learning software and tools, and customer-oriented learning environments. Changes to education programs and individual education modules can be supported by applied research projects, which can be used for making a proof of concept of new technology and of new ways to teach and train, and, through the experiences gathered, for changing education content, ways of educating, and finally the education business as a whole. Major findings: Applied research projects can prove concepts in real-environment field labs, testing technology opportunities and new tools for training purposes. Customer-oriented applied research projects are also excellent opportunities for students to carry out assignments using new knowledge and content, and for teachers to test new tools and create new ways to educate. New content and problem-based learning are used in future education modules. This article introduces case-study experiences from customer-oriented digital transformation projects and shows how the knowledge gathered on new digital content and new ways to educate has influenced education. The case study is related to experiences from research projects, customer-oriented field labs/learning environments, and education programs at Häme University of Applied Sciences.
Keywords: education process, digitalization content, digital tools for education, learning environments, transdisciplinary co-operation
Procedia PDF Downloads 176
4963 Algorithms for Run-Time Task Mapping in NoC-Based Heterogeneous MPSoCs
Authors: M. K. Benhaoua, A. K. Singh, A. E. Benyamina, P. Boulet
Abstract:
Mapping the parallelized tasks of applications onto MPSoCs can be done either at design time (static) or at run time (dynamic). Static mapping strategies find the best placement of tasks at design time; hence, they are not suitable for dynamic workloads and are incapable of run-time resource management. The number of tasks or applications executing on an MPSoC platform can exceed the available resources, requiring efficient run-time mapping strategies to meet these constraints. This paper describes a new spiral dynamic task mapping heuristic for mapping applications onto NoC-based heterogeneous MPSoCs. The heuristic is based on a packing strategy and a routing algorithm, both also proposed in this paper. It tries to map the tasks of an application into a clustered region in order to reduce the communication overhead between communicating tasks: the tasks of an application that are most related to each other are mapped in a spiral manner, seeking the path load that minimizes the communication overhead. In this context, we have built a simulation environment for experimental evaluation, mapping applications with varying numbers of tasks onto an 8x8 NoC-based heterogeneous MPSoC platform. We demonstrate that the new mapping heuristic, together with the modified Dijkstra routing algorithm proposed here, is capable of reducing the total execution time and energy consumption of applications when compared with state-of-the-art run-time mapping heuristics reported in the literature.
Keywords: multiprocessor system on chip, MPSoC, network on chip, NoC, heterogeneous architectures, run-time mapping heuristics, routing algorithm
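The spiral placement can be illustrated by generating the visiting order of mesh tiles around the tile of the first task, so communicating tasks land close together. The routine below is a geometric sketch of that idea only; the packing strategy and path-load routing of the full heuristic are not reproduced.

```python
def spiral_order(rows, cols, seed):
    """Tiles of a rows x cols mesh in order of increasing ring (Chebyshev)
    distance from the seed tile, i.e. a spiral used to cluster an app's tasks."""
    sr, sc = seed
    order = []
    for ring in range(max(rows, cols)):
        ring_nodes = [(r, c) for r in range(rows) for c in range(cols)
                      if max(abs(r - sr), abs(c - sc)) == ring]
        order.extend(sorted(ring_nodes))     # deterministic order within a ring
    return order

# Map a 5-task application onto an 8x8 NoC mesh, clustered around tile (3, 3).
placement = dict(zip(range(5), spiral_order(8, 8, (3, 3))))
print(placement)  # task -> tile; communicating tasks stay adjacent
```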
Procedia PDF Downloads 489
4962 Developing Computational Thinking in Early Childhood Education
Authors: Kalliopi Kanaki, Michael Kalogiannakis
Abstract:
Nowadays, in the digital era, the early acquisition of basic programming skills and knowledge is encouraged, as it facilitates students' exposure to computational thinking and empowers their creativity, problem-solving skills, and cognitive development. More and more researchers and educators investigate the introduction of computational thinking in K-12, since it is expected to be a fundamental skill for everyone by the middle of the 21st century, just like reading, writing, and arithmetic are at the moment. In this paper, doctoral research in progress is presented, which investigates the infusion of computational thinking into the science curriculum in early childhood education. The whole attempt aims to develop young children's computational thinking by introducing them to the fundamental concepts of object-oriented programming in an enjoyable, yet educational, framework. The backbone of the research is the digital environment PhysGramming (an abbreviation of Physical Science Programming), which provides children the opportunity to create their own digital games, turning them from passive consumers into active creators of technology. PhysGramming deploys an innovative hybrid schema of visual and text-based programming techniques, with emphasis on object orientation. Through PhysGramming, young students are familiarized with basic object-oriented programming concepts, such as classes, objects, and attributes, while at the same time getting a view of object-oriented programming syntax. Nevertheless, the most noteworthy feature of PhysGramming is that children create their own digital games within the context of physical science courses, in a way that provides familiarization with the basic principles of object-oriented programming and computational thinking, even though no specific reference is made to these principles. Attuned to the ethical guidelines of educational research, interventions were conducted in two second-grade classes. The interventions were designed with respect to the thematic units of the curriculum of physical science courses, as part of the learning activities of the class. PhysGramming was integrated into the classroom after short introductory sessions. During the interventions, 6-7-year-old children worked in pairs on computers and created their own digital games (group games, matching games, and puzzles). The authors participated in these interventions as observers in order to achieve a realistic evaluation of the proposed educational framework concerning its applicability in the classroom and its educational and pedagogical perspectives. To better examine whether the objectives of the research are met, the investigation focused on six criteria: the educational value of PhysGramming, its engaging and enjoyable characteristics, its child-friendliness, its appropriateness for the proposed purpose, its ability to monitor the user's progress, and its individualizing features. In this paper, the functionality of PhysGramming and the philosophy of its integration in the classroom are both described in detail. Information about the implemented interventions and the results obtained is also provided. Finally, several limitations of the research that deserve attention are noted.
Keywords: computational thinking, early childhood education, object-oriented programming, physical science courses
Procedia PDF Downloads 120
4961 Simulation and Analysis of Inverted Pendulum Controllers
Authors: Sheren H. Salah
Abstract:
The inverted pendulum is a highly nonlinear and open-loop unstable system. An inverted pendulum (IP) is a pendulum that has its mass above its pivot point; it is often implemented with the pivot point mounted on a cart that can move horizontally, and may then be called a cart and pole. The characteristics of the inverted pendulum make identification and control challenging. This paper presents a simulation study of several control strategies for an inverted pendulum system, the goal being to determine which control strategy delivers better performance with respect to the pendulum's angle. The inverted pendulum represents a challenging control problem, since it continually moves toward an uncontrolled state. The simulation study shows that, for controlling the inverted pendulum, sliding mode control (SMC) produced a better response compared to genetic algorithm (GA) control and proportional-integral-derivative (PID) control.
Keywords: inverted pendulum (IP), proportional-integral-derivative (PID), genetic algorithm (GA) control, sliding mode control (SMC)
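A minimal simulation of the PID baseline on simplified pendulum-angle dynamics is shown below; the gains, the pendulum length, and the assumption that the control input acts directly as an angular acceleration are illustrative choices, not the paper's model or tuning.

```python
import numpy as np

# Simplified pendulum angle dynamics: theta_ddot = (g/l) * sin(theta) + u
g, l, dt = 9.81, 0.5, 0.001
Kp, Ki, Kd = 80.0, 1.0, 10.0

theta, omega, integral = 0.2, 0.0, 0.0   # start 0.2 rad from upright
for step in range(5000):                 # 5 s of simulated time
    error = -theta                       # setpoint: upright, theta = 0
    integral += error * dt
    derivative = -omega
    u = Kp * error + Ki * integral + Kd * derivative   # PID control action
    omega += ((g / l) * np.sin(theta) + u) * dt        # explicit Euler step
    theta += omega * dt

print(f"angle after 5 s: {theta:.5f} rad")  # driven close to zero
```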
Procedia PDF Downloads 555
4960 Curve Designing Using an Approximating 4-Point C^2 Ternary Non-Stationary Subdivision Scheme
Authors: Muhammad Younis
Abstract:
A ternary 4-point approximating non-stationary subdivision scheme is introduced that generates a family of $C^2$ limit curves. The theory of asymptotic equivalence is used to analyze the convergence and smoothness of the scheme. The proposed scheme is compared, using different examples, with the existing 4-point ternary approximating schemes, which shows that the limit curves of the proposed scheme behave more pleasantly and can generate conic sections as well.
Keywords: ternary, non-stationary, approximating subdivision scheme, convergence and smoothness
Procedia PDF Downloads 477
4959 Biogeography Based CO2 and Cost Optimization of RC Cantilever Retaining Walls
Authors: Ibrahim Aydogdu, Alper Akin
Abstract:
In this study, the minimization of the cost and the CO2 emission of RC cantilever retaining wall design is performed with the Biogeography-Based Optimization (BBO) algorithm. This is achieved by developing computer programs utilizing the BBO algorithm that minimize the cost and the CO2 emission of RC retaining walls. The objective functions of the optimization problem are defined as the cost, the CO2 emission, and a weighted aggregate of the cost and CO2 functions. In the formulation of the optimum design problem, the height and thickness of the stem, the length of the toe projection, the thickness of the stem at base level, the length and thickness of the base, the depth and thickness of the key, the distance from the toe to the key, and the number and diameter of the reinforcement bars are treated as design variables. Flexural and shear strength constraints and minimum/maximum limitations for the reinforcement bar areas are derived from the American Concrete Institute design code (ACI 318-14), and the development-length conditions for suitable detailing of reinforcement are treated as a further constraint. The obtained optimum designs must satisfy the factors of safety for the failure modes (overturning, sliding, and bearing), strength, serviceability, and the other required limitations to attain practically acceptable shapes. To demonstrate the efficiency and robustness of the presented BBO algorithm, an optimum design example for retaining walls is presented, and the results are compared with previously obtained results available in the literature.
Keywords: biogeography, meta-heuristic search, optimization, retaining wall
Procedia PDF Downloads 401
4958 Technology and the Need for Integration in Public Education
Authors: Eric Morettin
Abstract:
Cybersecurity and digital literacy are pressing issues among Canadian citizens, yet formal education does not provide today's students with the knowledge and skills needed to adapt to these challenges within the physical and digital labor market. Canada's current education systems do not highlight the importance of these fields, aside from using technology for learning management systems and alternative methods of assignment completion. Educators are not properly trained to integrate technology into the compulsory courses within public education in ways that would better prepare their learners in these topics and for Canada's digital economy. ICTC addresses these gaps in education and training through cross-Canadian educational programming in digital literacy and competency, cybersecurity, and coding, bridged with Canada's provincially regulated K-12 curriculum guidelines. An analysis of Canada's provincial education reveals gaps in learning related to technology, as well as inconsistent educational outcomes that do not adequately represent the current Canadian and global economies. Presently, only New Brunswick, Nova Scotia, Ontario, and British Columbia offer curriculum guidelines for cybersecurity, computer programming, and digital literacy; the remaining provinces do not address these skills in their curriculum guidelines, and certain courses in some provinces have not been updated since the 1990s. The three territories take curriculum strands from other provinces and use them as their foundation in education: Yukon uses the British Columbia curriculum in full, while the Northwest Territories and Nunavut each use a hybrid of the Alberta and Saskatchewan curricula as their foundation of learning. Provincially regulated education does not allow for consistency across the country's educational outcomes and what Canada's students will achieve, especially when curriculum outcomes have not been updated to reflect present-day society. In response, ICTC has aligned Canada's provincially regulated curriculum and created opportunities for focused education in the realm of technology to better serve Canada's present learners and teachers, while addressing inequalities and applicability within curriculum strands and outcomes across the country. As a result, lessons, units, and formal assessment strategies have been created to benefit students and teachers in this interdisciplinary, cross-curricular practice, as well as to meet their compulsory education requirements and develop skills and literacy in cyber education. Teachers can access these lessons and units through ICTC's website, as well as receive professional development regarding their assessment and implementation from ICTC's education coordinators, whose combined experience exceeds 50 years of teaching in public, private, international, and Indigenous schools. We encourage you to take this opportunity, which will benefit students and educators and bridge the learning and curriculum gaps in Canadian education to better reflect the ever-changing public, social, and career landscape that all citizens are part of. Students are the future, and we at ICTC strive to ensure their futures are bright and prosperous.
Keywords: cybersecurity, education, curriculum, teachers
Procedia PDF Downloads 82