Search results for: time complexity measurements.
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 7700


7280 A Microcontroller Implementation of Constrained Model Predictive Control

Authors: Amira Kheriji Abbes, Faouzi Bouani, Mekki Ksouri

Abstract:

Model Predictive Control (MPC) is an established control technique in a wide range of process industries. The reason for this success is its ability to handle multivariable systems and systems having input, output or state constraints. Nevertheless, compared to the PID controller, the implementation of MPC in miniaturized devices like Field Programmable Gate Arrays (FPGAs) and microcontrollers has historically been very limited because of its implementation complexity and computation time requirements. At the same time, such embedded technologies have become an enabler for future manufacturing enterprises as well as a transformer of organizations and markets. In this work, we take advantage of recent advances in this area to deploy one of the most studied and applied control techniques in industrial engineering. In this paper, we propose an efficient firmware for the implementation of constrained MPC on the STM32 microcontroller using the interior point method. A performance study shows good execution speed and low computational burden. These results encourage the development of predictive control algorithms to be programmed into standard industrial processes. A PID anti-windup controller was also implemented on the STM32 in order to make a performance comparison with the MPC. The main features of the proposed constrained MPC framework are illustrated through two examples.
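
The sketch below illustrates the kind of computation a constrained MPC step involves: a condensed quadratic program with box input constraints solved by a log-barrier (interior point style) Newton iteration. The system matrices, horizon, weights and barrier parameters are illustrative assumptions, not the paper's STM32 firmware or its test examples.

```python
import numpy as np

# Toy plant: a discretised double integrator with 0.1 s sample time (assumed for illustration).
A = np.array([[1.0, 0.1], [0.0, 1.0]])
B = np.array([[0.005], [0.1]])
Q, R = np.diag([10.0, 1.0]), np.array([[0.1]])
N, u_min, u_max = 10, -1.0, 1.0        # horizon and input bounds

# Prediction matrices: X = F x0 + G U over the horizon.
F = np.vstack([np.linalg.matrix_power(A, i + 1) for i in range(N)])
G = np.zeros((2 * N, N))
for i in range(N):
    for j in range(i + 1):
        G[2 * i:2 * i + 2, j] = (np.linalg.matrix_power(A, i - j) @ B).ravel()
Qb = np.kron(np.eye(N), Q)
Rb = np.kron(np.eye(N), R)
H = 2 * (G.T @ Qb @ G + Rb)            # condensed QP Hessian

def mpc_step(x0, t=1.0, mu=10.0, outer=8, inner=5):
    """Return the first optimal input for state x0 (barrier interior point sketch)."""
    f = 2 * G.T @ Qb @ F @ x0
    u = np.zeros(N)                    # strictly feasible start inside the box
    for _ in range(outer):
        for _ in range(inner):         # Newton steps on the barrier-augmented cost
            d_lo, d_hi = u - u_min, u_max - u
            grad = H @ u + f - (1.0 / t) * (1.0 / d_lo - 1.0 / d_hi)
            hess = H + (1.0 / t) * np.diag(1.0 / d_lo**2 + 1.0 / d_hi**2)
            step = np.linalg.solve(hess, -grad)
            alpha = 1.0                # damping keeps the iterate strictly inside the box
            while np.any(u + alpha * step <= u_min) or np.any(u + alpha * step >= u_max):
                alpha *= 0.5
            u = u + alpha * step
        t *= mu                        # tighten the barrier
    return u[0]                        # receding horizon: apply only the first input

print(mpc_step(np.array([1.0, 0.0])))
```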

Keywords: Embedded software, microcontroller, constrained Model Predictive Control, interior point method, PID anti-windup, Keil tool, C/Cµ language.

7279 Kerma Profile Measurements in CT Chest Scans – A Comparison of Methodologies

Authors: Bruno B. Oliveira, Arnaldo P. Mourão, Teógenes A. da Silva

Abstract:

The Brazilian legislation has established diagnostic reference levels (DRLs) only in terms of the Multiple Scan Average Dose (MSAD) as a quality control parameter for computed tomography (CT) scanners. Compliance with DRLs can be verified by measuring the Computed Tomography Kerma Index (Ca,100) with a pencil ionization chamber or by obtaining the kerma distribution in CT scans with radiochromic films or rod-shaped lithium fluoride thermoluminescent dosimeters (TLD-100). TL dosimeters were used to record kerma profiles and to determine MSAD values of a GE Bright Speed CT scanner. Measurements were made with radiochromic films and TL dosimeters distributed in cylinders positioned in the center and in four peripheral bores of a standard polymethylmethacrylate (PMMA) body CT dosimetry phantom. Irradiations were done using a protocol for an adult chest. The maximum values were found at the midpoint of the longitudinal axis. The MSAD values obtained with the three dosimetric techniques were compared.
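
As a minimal illustration of the quantities mentioned above, the sketch below integrates a measured kerma profile over the central 100 mm and normalises by the nominal beam width, the usual way a Ca,100-type kerma index is computed, and then forms the common 1/3 centre + 2/3 periphery weighted value. The profile arrays and beam width are made-up placeholders, not the paper's TLD or film data.

```python
import numpy as np

z = np.linspace(-150.0, 150.0, 301)                    # position along the scanner axis [mm]
profile_center = 8.0 * np.exp(-0.5 * (z / 30.0)**2)    # kerma profile at the phantom centre [mGy]
profile_periph = 12.0 * np.exp(-0.5 * (z / 30.0)**2)   # average peripheral profile [mGy]
nT = 10.0                                              # total nominal beam width n*T [mm]

def kerma_index_100(z_mm, profile_mGy, nT_mm):
    """Integrate the profile over -50..+50 mm and normalise by the beam width."""
    mask = np.abs(z_mm) <= 50.0
    return np.trapz(profile_mGy[mask], z_mm[mask]) / nT_mm

c_center = kerma_index_100(z, profile_center, nT)
c_periph = kerma_index_100(z, profile_periph, nT)
c_weighted = (c_center + 2.0 * c_periph) / 3.0         # centre/periphery weighting
print(c_center, c_periph, c_weighted)
```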

Keywords: Kerma profile, CT, MSAD, patient dosimetry

7278 A New Heuristic Approach for Large Size Zero-One Multi Knapsack Problem Using Intercept Matrix

Authors: K. Krishna Veni, S. Raja Balachandar

Abstract:

This paper presents a heuristic to solve the large size 0-1 multi-constrained knapsack problem (01MKP), which is NP-hard. Many researchers have used heuristic operators to identify the redundant constraints of a linear programming problem before applying the regular procedure to solve it. We use the intercept matrix to identify the zero-valued variables of the 01MKP, which are known as redundant variables. In this heuristic, first, the dominance property of the intercept matrix of the constraints is exploited to reduce the search space for finding optimal or near-optimal solutions of the 01MKP; second, we improve the solution by using the pseudo-utility ratio based on the surrogate constraint of the 01MKP. This heuristic is tested on benchmark problems of sizes up to 2500 taken from the literature, and the results are compared with optimum solutions. The space and computational complexity of solving the 01MKP using this approach are also presented. The encouraging results, especially for relatively large size test problems, indicate that this heuristic can successfully be used for finding good solutions for highly constrained NP-hard problems.
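
To make the pseudo-utility idea concrete, the sketch below is a plain greedy surrogate-constraint heuristic for the 0-1 multi-constrained knapsack problem (maximise p.x subject to A x <= b, x binary): constraint intercepts b_i supply surrogate weights, items are ordered by the pseudo-utility ratio, and each item is accepted if it still fits every constraint. This conveys only the general flavour; it is not the authors' intercept-matrix dominance procedure, and the small instance is invented.

```python
import numpy as np

def greedy_01mkp(p, A, b):
    m, n = A.shape
    weights = 1.0 / b                      # surrogate multipliers from the constraint intercepts
    pseudo_utility = p / (A.T @ weights)   # p_j / sum_i w_i a_ij
    order = np.argsort(-pseudo_utility)    # best ratio first
    x = np.zeros(n, dtype=int)
    slack = b.astype(float).copy()
    for j in order:
        if np.all(A[:, j] <= slack):       # item still fits in every constraint
            x[j] = 1
            slack -= A[:, j]
    return x, p @ x

p = np.array([10.0, 13.0, 7.0, 8.0, 4.0])
A = np.array([[2.0, 3.0, 1.0, 4.0, 2.0],
              [4.0, 1.0, 3.0, 2.0, 1.0]])
b = np.array([6.0, 7.0])
print(greedy_01mkp(p, A, b))
```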

Keywords: 0-1 Multi constrained Knapsack problem, heuristic, computational complexity, NP-Hard problems.

7277 A Logic Based Framework for Planning for Mobile Agents

Authors: Rajdeep Niyogi

Abstract:

The objective of the paper is twofold. The first is to develop a formal framework for planning for mobile agents: a logical language based on a temporal logic is proposed that can express a type of task which often arises in network management. The second is to design a planning algorithm for such tasks. The aim of this paper is to study the importance of finding plans for mobile agents. Although there has been a lot of research on mobile agents, not much work has been done to incorporate planning ideas for such agents. This paper makes an attempt in this direction. A theoretical study of finding plans for mobile agents is undertaken. A planning algorithm (based on the paradigm of mobile computing) is proposed and its space, time, and communication complexity are analyzed. The algorithm is illustrated by working out an example in detail.
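
Purely to illustrate what "finding a plan for a mobile agent" can mean in a network setting, the toy sketch below computes a migration plan (a sequence of hops) that visits a required set of hosts, using shortest-path search between targets. The network, the task and the brute-force ordering are invented assumptions; the paper's algorithm instead handles tasks expressed in a temporal logic and analyses space, time and communication complexity.

```python
from collections import deque
from itertools import permutations

network = {                                      # adjacency list of hosts (invented)
    "A": ["B", "C"], "B": ["A", "D"], "C": ["A", "D", "E"],
    "D": ["B", "C", "F"], "E": ["C", "F"], "F": ["D", "E"],
}

def shortest_path(src, dst):
    """Breadth-first search for the shortest hop sequence from src to dst."""
    queue, seen = deque([[src]]), {src}
    while queue:
        path = queue.popleft()
        if path[-1] == dst:
            return path
        for nxt in network[path[-1]]:
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None

def plan(start, targets):
    """Cheapest ordering of the target hosts, each leg taken as a shortest path."""
    best = None
    for order in permutations(targets):
        route, here = [start], start
        for t in order:
            route += shortest_path(here, t)[1:]
            here = t
        if best is None or len(route) < len(best):
            best = route
    return best

print(plan("A", ["E", "F", "B"]))
```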

Keywords: Acting, computer network, mobile agent, mobile computing, planning, temporal logic.

7276 Frequency-Variation Based Method for Parameter Estimation of Transistor Amplifier

Authors: Akash Rathee, Harish Parthasarathy

Abstract:

In this paper, a frequency-variation based method is proposed for transistor parameter estimation in a common-emitter transistor amplifier circuit. We design an algorithm to estimate the transistor parameters based on noisy measurements of the output voltage when the input voltage is a sine wave of variable frequency and constant amplitude. The common-emitter amplifier circuit has been modelled using the transistor Ebers-Moll equations, and the perturbation technique has been used to separate the linear and nonlinear parts of the Ebers-Moll equations. This model of the amplifier is used to determine the amplitude of the output sinusoid as a function of the frequency and the parameter vector. Then, applying the proposed method to the frequency components, the transistor parameters are estimated. Compared to the conventional time-domain least squares method, the proposed method requires much less data storage and results in more accurate parameter estimation, as it exploits the information in the time and frequency domains simultaneously. The proposed method can be used for parameter estimation of an analog device over its operating range of frequencies, as it uses output data collected at different frequencies.
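
The essential step, fitting circuit parameters to the measured output amplitude across a frequency sweep, can be sketched as a nonlinear least-squares problem. A single-pole gain model is assumed here purely for illustration; the paper's amplitude model is derived from the Ebers-Moll equations and a perturbation expansion, and the parameter values below are placeholders.

```python
import numpy as np
from scipy.optimize import curve_fit

def amplitude_model(f, A0, fc):
    """Assumed low-pass amplitude response: |H(f)| = A0 / sqrt(1 + (f/fc)^2)."""
    return A0 / np.sqrt(1.0 + (f / fc)**2)

rng = np.random.default_rng(0)
freqs = np.logspace(2, 6, 40)                  # 100 Hz .. 1 MHz sweep, constant input amplitude
true_A0, true_fc = 120.0, 2.0e4
measured = amplitude_model(freqs, true_A0, true_fc) * (1 + 0.02 * rng.standard_normal(freqs.size))

# Estimate (A0, fc) from the noisy amplitude measurements at the swept frequencies.
(est_A0, est_fc), _ = curve_fit(amplitude_model, freqs, measured, p0=[50.0, 1.0e4])
print(est_A0, est_fc)
```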

Keywords: Perturbation Technique, Parameter estimation, frequency-variation based method.

7275 Survey of Gamma Radiation Measurements in Commercially Used Natural Tiling Rocks in Iran

Authors: A. Abbasi, F. Mirekhtiary

Abstract:

The gamma radiation in samples of a variety of natural tiling rocks (granites), produced in or imported into Iran for use in the building industry, was measured employing high-resolution gamma-ray spectroscopy. The rock samples were pulverized, sealed in 0.5 liter plastic Marinelli beakers, and measured in the laboratory with an accumulation time between 50,000 and 80,000 seconds each. From the measured gamma-ray spectra, activity concentrations were determined for 232Th (ranging from 6.5 to 172.2 Bq kg-1), 238U (from 7.5 to 178.1 Bq kg-1), 226Ra (from 3.8 to 94.2 Bq kg-1) and 40K (from 556.9 to 1539.2 Bq kg-1). Among the 29 samples measured in this study, "Nehbndan (Berjand)" appears to present the highest concentration of 232Th, "Big Red Flower (China)" of 238U, "Khoram dareh" of 226Ra and "Peranshahr" of 40K, respectively.
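
The conversion behind such results is the standard gamma-spectrometry relation A = N_net / (eps * P_gamma * t * m), giving the activity concentration in Bq per kg from the net peak counts, detection efficiency, emission probability, counting live time and sample mass. The numerical values in the sketch below are illustrative placeholders, not results from the paper.

```python
def activity_concentration(net_counts, efficiency, gamma_yield, live_time_s, mass_kg):
    """Activity concentration in Bq/kg from a gamma-ray full-energy peak."""
    return net_counts / (efficiency * gamma_yield * live_time_s * mass_kg)

# e.g. a 40K-like line measured for 60,000 s in a 0.4 kg Marinelli sample (assumed numbers)
print(activity_concentration(net_counts=2.5e4, efficiency=0.03,
                             gamma_yield=0.107, live_time_s=60000, mass_kg=0.4))
```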

Keywords: Activity concentration, natural radioactivity, tiling rocks (granites).

7274 A Holistic Framework for Unifying Data Security and Management in Modern Enterprises

Authors: Ashly Joseph

Abstract:

Modern businesses struggle significantly to secure and manage their data properly as the volume and complexity of their data expand exponentially. Through the use of a multi-layered defense strategy, a centralized management platform, and cutting-edge technologies like AI, this research paper presents a comprehensive framework to integrate data security and management. The constraints of current data protection and management strategies, technological advancements, and the evolving threat landscape are all examined in this article. It suggests best practices for implementing integrated data security and governance models, placing an emphasis on ongoing adaptation. The benefits discussed include a strengthened security posture, simpler procedures, lower costs, and reduced complexity. Additionally, issues including skill shortages, antiquated systems, and cultural obstacles are examined. Security executives and Chief Information Security Officers are given practical advice on how to evaluate, plan, and put in place strong data-centric security and management capabilities. The goal of the paper is to provide a thorough study of the data security and management landscape and to arm contemporary businesses with the knowledge they need to be proactive in protecting their data assets.

Keywords: Data security, security management, cloud computing, cybersecurity, data governance, security architecture, data management.

7273 Demand and Supply Chain Simulation in Telecommunication Industry by Multi-Rate Expert Systems

Authors: Andrus Pedai, Igor Astrov

Abstract:

In the modern telecommunications industry, demand and supply chain management (DSCM) needs reliable design and versatile tools to control the material flow. The objective of efficient DSCM is to reduce inventory, lead times and related costs in order to assure reliable and on-time deliveries from manufacturing units towards customers. In this paper, a multi-rate expert system based methodology is proposed for developing simulation tools that enable optimal DSCM in a multi-region, high-volume and high-complexity manufacturing environment.

Keywords: Demand & supply chain management, expert systems, inventory control, multi-rate control, performance metrics.

7272 The Effects of Consumer Inertia and Emotions on New Technology Acceptance

Authors: Chyi Jaw

Abstract:

Prior literature on innovation diffusion or acceptance has almost exclusively concentrated on consumers' positive attitudes and behaviors toward new products and services. Consumers' negative attitudes or behaviors toward innovations have received relatively little marketing attention, although they occur frequently in practice. This study discusses the psychological factors consumers experience when they try to learn or use new technologies. According to recent research, technological innovation acceptance has been considered a dynamic or mediated process. This research argues that consumers can experience inertia and emotions in the initial use of new technologies. Given such consumer psychology, the question is whether the inclusion of consumer inertia (routine seeking and cognitive rigidity) and emotions increases the predictive power of a new technology acceptance model. The data from the empirical study show that technology complexity and consumer inertia can change consumer emotions (independent of performance benefits) and significantly affect the use of innovative technology. Finally, the study presents the superior predictability of the hypothesized model, which lets managers better predict and influence the successful diffusion of complex technological innovations.

Keywords: Cognitive rigidity, consumer emotions, new technology acceptance, routine seeking, technology complexity.

7271 Screening of Congenital Heart Diseases with Fetal Phonocardiography

Authors: F. Kovács, K. Kádár, G. Hosszú, Á. T. Balogh, T. Zsedrovits, N. Kersner, A. Nagy, Gy. Jeney

Abstract:

The paper presents a novel screening method to indicate congenital heart diseases (CHD), which could otherwise remain undetected because of their low level; not belonging to the high-risk population, these pregnancies are therefore not subject to regular fetal monitoring with ultrasound echocardiography. Based on the fact that a CHD is a morphological defect of the heart causing turbulent blood flow, the turbulence appears as a murmur, which can be detected by fetal phonocardiography (fPCG). The proposed method uses measurements on the maternal abdomen, and a sophisticated processing of the recorded sound signal determines the fetal heart murmur. The paper describes the problems and the additional advantages of the fPCG method, including the possibility of measurements at home and its combination with the prescribed regular cardiotocographic (CTG) monitoring. The proposed screening process, implemented on a telemedicine system, provides enhanced safety against hidden cardiac diseases.
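
A very reduced version of such sound processing is sketched below: band-pass filter the abdominal recording to the heart-sound band and extract an amplitude envelope, so that sustained extra energy between the main heart sounds (a possible murmur) can be inspected. The synthetic signal, band edges and threshold are illustrative assumptions, not the authors' screening algorithm.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

fs = 1000.0                                            # sampling rate [Hz] (assumed)
t = np.arange(0, 5.0, 1.0 / fs)
# crude stand-in for periodic fetal heart sounds plus sensor noise
fetal_sounds = 0.5 * np.sin(2 * np.pi * 35 * t) * (np.sin(2 * np.pi * 2.3 * t) > 0.95)
noise = 0.05 * np.random.default_rng(1).standard_normal(t.size)
recording = fetal_sounds + noise

b, a = butter(4, [20.0, 200.0], btype="bandpass", fs=fs)   # keep the heart-sound band
filtered = filtfilt(b, a, recording)
envelope = np.abs(hilbert(filtered))                   # amplitude envelope of the band
murmur_suspect = np.mean(envelope > 0.15)              # fraction of time with sustained energy
print(f"fraction of samples above threshold: {murmur_suspect:.3f}")
```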

Keywords: Cardiac murmurs, fetal phonocardiography, screening of CHDs, telemedicine system.

7270 Simulation Model for Optimizing Energy in Supply Chain Management

Authors: Nazli Akhlaghinia, Ali Rajabzadeh Ghatari

Abstract:

In today's world, with increasing environmental awareness, firms face severe pressure from various stakeholders, including the government and customers, to reduce their harmful effects on the environment. Over the past few decades, the increasing effects of global warming, climate change, waste, and air pollution have drawn the attention of experts to the issue of the green supply chain and led them to look for optimal solutions for greening. Green supply chain management (GSCM) plays an important role in motivating the sustainability of the organization. With increasing environmental concerns, the main objective of the research is to use systems thinking methodology and the Vensim software to design a system dynamics model of a green supply chain and to observe its behavior. Using this methodology, we look for the effects of a green supply chain structure on the behavioral dynamics of output variables. We simulate the complexity of GSCM over a period of 30 months and observe the behavior of variables including sustainability, the provision of green products, and reduced energy consumption and, consequently, reduced pollution.
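
To show the mechanics of a system dynamics simulation of this kind, the toy sketch below integrates a tiny stock-and-flow loop with Euler steps over 30 months. The stocks, flows and coefficients are invented for illustration only; the paper's actual causal structure lives in its Vensim model.

```python
months, dt = 30, 1.0
green_products = 0.0          # cumulative share of green products delivered (stock)
energy_use = 100.0            # monthly energy consumption, index value (stock)
pollution = 50.0              # pollution level, index value (stock)

history = []
for month in range(months):
    greening_rate = 0.04 * (1.0 - green_products)        # adoption slows as the share saturates
    energy_saving = 0.02 * green_products * energy_use    # greener products cut energy use
    emission = 0.3 * energy_use                           # pollution driven by energy consumption
    abatement = 0.25 * pollution

    green_products += dt * greening_rate                  # Euler integration of the stocks
    energy_use += dt * (-energy_saving)
    pollution += dt * (emission - abatement)
    history.append((month + 1, green_products, energy_use, pollution))

print(history[-1])
```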

Keywords: Supply chain management, green supply chain management, system dynamics, energy consumption.

7269 Calibration of 2D and 3D Optical Measuring Instruments in Industrial Environments at Submillimeter Range

Authors: A. Mínguez-Martínez, J. de Vicente

Abstract:

Modern manufacturing processes have led to the miniaturization of systems and, as a result, parts at the micro and nanoscale are produced. This trend seems set to become increasingly important in the near future. Besides, as a requirement of Industry 4.0, the digitalization of production models and processes makes it very important to ensure that the dimensions of newly manufactured parts meet the specifications of the models. This makes it possible to reduce scrap and the cost of non-conformities while ensuring the stability of production. To ensure the quality of manufactured parts, it becomes necessary to carry out traceable measurements at scales below one millimeter. Providing adequate traceability to the SI unit of length (the meter) for 2D and 3D measurements at this scale is a problem that does not have a unique solution in industrial environments. Researchers in the field of dimensional metrology all around the world are working on this issue. A solution for industrial environments, even if incomplete, would enable working with some traceability. At this point, we believe that the study of surfaces could provide a first approximation to a solution. In this paper, we propose a calibration procedure for the scales of optical measuring instruments, particularized for a confocal microscope, using material standards that are easy to find and calibrate in metrology and quality laboratories in industrial environments. Confocal microscopes are measuring instruments capable of filtering out the out-of-focus reflected light so that, when the light reaches the detector, it is possible to take pictures of the part of the surface that is in focus. By taking pictures at different Z levels of focus, specialized software interpolates between the different planes and reconstructs the surface geometry as a 3D model. As is easy to deduce, it is necessary to give traceability to each axis. As a complementary result, the roughness parameter Ra will be traced to the reference. Although the solution is designed for a confocal microscope, it may be used, with minor changes, for the calibration of other optical measuring instruments.
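
The surface reconstruction step described above can be caricatured as follows: for every pixel, pick the Z plane where a simple focus measure is maximal and convert that index into a height. The focus stack below is synthetic and the focus measure (squared intensity gradient) is a deliberately simple assumption; real confocal software interpolates between planes far more carefully.

```python
import numpy as np

rng = np.random.default_rng(2)
nz, ny, nx = 21, 64, 64
z_step_um = 0.5                                        # distance between focus planes [um] (assumed)
stack = rng.random((nz, ny, nx)) * 0.05                # background noise
true_height_idx = (10 + 5 * np.sin(np.linspace(0, np.pi, nx)))[None, :].repeat(ny, 0)
for k in range(nz):                                    # planes near the true surface get sharp texture
    sharpness = np.exp(-0.5 * ((k - true_height_idx) / 1.0) ** 2)
    stack[k] += sharpness * rng.random((ny, nx))

focus = np.empty_like(stack)
for k in range(nz):
    gy, gx = np.gradient(stack[k])
    focus[k] = gy**2 + gx**2                           # simple per-plane focus measure

height_map_um = np.argmax(focus, axis=0) * z_step_um   # Z of best focus for every pixel
print(height_map_um.shape, height_map_um.min(), height_map_um.max())
```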

Keywords: Industrial environment, confocal microscope, optical measuring instrument, traceability.

7268 A DEA Model for Performance Evaluation in the Presence of a Time Lag Effect

Authors: Yanshuang Zhang, Byungho Jeong

Abstract:

Data Envelopment Analysis (DEA) is a methodology that computes efficiency values for decision making units (DMUs) in a given period by comparing their outputs with their inputs. In many cases, there is some time lag between the consumption of inputs and the production of outputs. For a long-term research project, it is hard to avoid the production lead time phenomenon. This time lag effect should be considered in evaluating the performance of organizations. This paper suggests a model to calculate efficiency values for the performance evaluation problem with time lag. In the experimental part, the proposed methods are compared with the CCR model and an existing time lag model using the data set of the 21st Century Frontier R&D Program, a long-term national R&D program of Korea.
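
For readers unfamiliar with the CCR baseline mentioned above, the sketch below computes the classical input-oriented CCR efficiency score with a linear program in multiplier form: maximise the weighted output of the evaluated DMU, normalise its weighted input to one, and require every DMU's weighted output not to exceed its weighted input. The small input/output data set is invented for illustration; it is not the R&D program data.

```python
import numpy as np
from scipy.optimize import linprog

X = np.array([[4.0, 3.0], [7.0, 3.0], [8.0, 1.0], [4.0, 2.0], [2.0, 4.0]])   # inputs  (DMU x m)
Y = np.array([[1.0], [1.0], [1.0], [1.0], [1.0]])                            # outputs (DMU x s)

def ccr_efficiency(o):
    n, m = X.shape
    s = Y.shape[1]
    # decision variables: output weights u (s entries) then input weights v (m entries)
    c = np.concatenate([-Y[o], np.zeros(m)])          # maximise u . y_o
    A_eq = np.concatenate([np.zeros(s), X[o]])[None, :]
    b_eq = [1.0]                                      # normalisation: v . x_o = 1
    A_ub = np.hstack([Y, -X])                         # u . y_j - v . x_j <= 0 for every DMU j
    b_ub = np.zeros(n)
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=(0, None))
    return -res.fun                                   # CCR efficiency of DMU o

print([round(ccr_efficiency(o), 3) for o in range(X.shape[0])])
```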

Keywords: DEA, Efficiency, Time Lag

7267 Forecasting Enrollment Model Based on First-Order Fuzzy Time Series

Authors: Melike Şah, Konstantin Y. Degtiarev

Abstract:

This paper proposes a novel improvement of a forecasting approach based on time-invariant fuzzy time series. In contrast to traditional forecasting methods, fuzzy time series can also be applied to problems in which the historical data are linguistic values. It is shown that the proposed time-invariant method improves the performance of the forecasting process. Further, the effect of using different numbers of fuzzy sets is tested as well. As in most of the cited papers, the historical enrollment of the University of Alabama is used in this study to illustrate the forecasting process. Subsequently, the performance of the proposed method is compared with existing time-invariant fuzzy time series models in terms of forecasting accuracy. The comparison reveals a certain performance superiority of the proposed method over the methods described in the literature.
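
A compact first-order, time-invariant fuzzy time series forecaster in the spirit of Chen's classical method is sketched below: partition the universe of discourse into intervals, fuzzify each observation, build fuzzy logical relationship groups, and forecast with interval midpoints. The enrollment-like numbers and the number of fuzzy sets are placeholders, not the Alabama data or the paper's improved variant.

```python
import numpy as np

data = np.array([13055, 13563, 13867, 14696, 15460, 15311, 15603, 15861, 16807,
                 16919, 16388, 15433, 15497, 15145, 15163, 15984, 16859, 18150], float)
k = 7                                                   # number of fuzzy sets / intervals (assumed)
lo, hi = data.min() - 200, data.max() + 200
edges = np.linspace(lo, hi, k + 1)
mids = (edges[:-1] + edges[1:]) / 2.0

def fuzzify(x):
    """Index of the interval (fuzzy set) containing x."""
    return int(np.clip(np.searchsorted(edges, x, side="right") - 1, 0, k - 1))

labels = [fuzzify(x) for x in data]
groups = {i: set() for i in range(k)}                   # fuzzy logical relationship groups A_i -> {A_j}
for a, b in zip(labels[:-1], labels[1:]):
    groups[a].add(b)

def forecast_next(current_value):
    """Average midpoint of the sets the current set maps to (or its own midpoint if none)."""
    g = groups[fuzzify(current_value)]
    return mids[list(g)].mean() if g else mids[fuzzify(current_value)]

print(forecast_next(data[-1]))
```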

Keywords: Forecasting, fuzzy time series, linguistic values, student enrollment, time-invariant model.

7266 New Explicit Group Newton's Iterative Methods for the Solution of Burgers' Equation

Authors: Tan K. B., Norhashidah Hj. M. Ali

Abstract:

In this article, we discuss the formulation of two explicit group iterative finite difference methods for the time-dependent two-dimensional Burgers' problem on a variable mesh. For the nonlinear problem, the discretization leads to a nonlinear system whose Jacobian is a tridiagonal matrix. We discuss Newton's explicit group iterative methods for a general Burgers' equation. The proposed explicit group methods are derived from the standard point and rotated point Crank-Nicolson finite difference schemes. Their computational complexity analysis is discussed. Numerical results are given to justify the feasibility of these two proposed iterative methods.
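
The combination of a Crank-Nicolson discretization with Newton's method can be shown in its simplest setting: one time step of the 1D viscous Burgers equation u_t + u u_x = nu u_xx, with the nonlinear system solved by Newton iterations and a finite-difference Jacobian. This is only the standard-point idea on a small 1D grid; the paper treats the 2D problem with explicit group and rotated-point schemes, and the grid, viscosity and step sizes below are assumptions.

```python
import numpy as np

nu, nx, dt = 0.05, 41, 0.01
x = np.linspace(0.0, 1.0, nx)
dx = x[1] - x[0]
u = np.sin(np.pi * x)                                  # initial condition, u = 0 at both ends

def spatial_operator(v):
    """Central-difference discretisation of v v_x - nu v_xx at the interior points."""
    n = np.zeros_like(v)
    n[1:-1] = v[1:-1] * (v[2:] - v[:-2]) / (2 * dx) - nu * (v[2:] - 2 * v[1:-1] + v[:-2]) / dx**2
    return n

def residual(v, u_old):
    """Crank-Nicolson residual for the implicit step, with Dirichlet boundaries."""
    r = v - u_old + 0.5 * dt * (spatial_operator(v) + spatial_operator(u_old))
    r[0], r[-1] = v[0], v[-1]
    return r

def newton_step(u_old, iters=10, eps=1e-7):
    v = u_old.copy()
    for _ in range(iters):
        r = residual(v, u_old)
        J = np.empty((nx, nx))
        for j in range(nx):                            # simple finite-difference Jacobian
            pert = v.copy()
            pert[j] += eps
            J[:, j] = (residual(pert, u_old) - r) / eps
        v -= np.linalg.solve(J, r)
        if np.linalg.norm(r) < 1e-10:
            break
    return v

u = newton_step(u)                                     # advance one time level
print(u.max())
```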

Keywords: Standard point Crank-Nicolson (CN), Rotated point Crank-Nicolson (RCN), Explicit Group (EG), Explicit Decoupled Group (EDG).

7265 A Hyper-Domain Image Watermarking Method Based on Macro Edge Block and Wavelet Transform for Digital Signal Processor

Authors: Yi-Pin Hsu, Shin-Yu Lin

Abstract:

In order to protect original data, watermarking is the first consideration for digital information copyright protection. In addition, algorithms that achieve high image quality often cannot run on embedded systems because their computation is too complex. However, nowadays most algorithms need to be built into consumer products, since integrated circuits have made huge progress and become cheap. In this paper, we propose a novel algorithm which efficiently inserts a watermark into a digital image and is very easy to implement on a digital signal processor. Furthermore, we select a general-purpose and cheap digital signal processor made by Analog Devices to fit consumer applications. The experimental results show that the image quality after watermark insertion reaches 46 dB, which is acceptable to human vision, and that the algorithm executes in real time on the digital signal processor.
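
The general wavelet-domain idea can be sketched as follows: decompose the image with a 2D DWT, add a scaled pseudo-random watermark to a detail sub-band, reconstruct, and later detect by correlating the extracted difference with the known mark. This illustrates only the wavelet-domain embedding; the paper additionally selects macro edge blocks and targets a fixed-point DSP implementation, and the image, mark and strength below are placeholders.

```python
import numpy as np
import pywt

rng = np.random.default_rng(3)
image = rng.integers(0, 256, size=(128, 128)).astype(float)   # stand-in host image
watermark = rng.standard_normal((64, 64))                     # pseudo-random mark
alpha = 2.0                                                    # embedding strength (assumed)

cA, (cH, cV, cD) = pywt.dwt2(image, "haar")
cH_marked = cH + alpha * watermark                             # embed in a detail sub-band
watermarked = pywt.idwt2((cA, (cH_marked, cV, cD)), "haar")

# non-blind detection: subtract the original sub-band and correlate with the known mark
cA2, (cH2, _, _) = pywt.dwt2(watermarked, "haar")
extracted = (cH2 - cH) / alpha
corr = np.corrcoef(extracted.ravel(), watermark.ravel())[0, 1]
psnr = 10 * np.log10(255.0**2 / np.mean((watermarked - image) ** 2))
print(f"correlation with mark: {corr:.3f}, PSNR: {psnr:.1f} dB")
```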

Keywords: watermarking, digital signal processor, embedded system

7264 Finite-Time Stability Analysis of Fractional-Order Systems with Multi-State Time Delay

Authors: Liqiong Liu, Shouming Zhong

Abstract:

In this paper, the finite-time stability of a class of fractional-order systems with multi-state time delay is studied. First, we define finite-time stability for the fractional-order system. Second, by using the generalized Gronwall approach and inequality methods, we obtain conditions for finite-time stability of the fractional system with multi-state delay. Finally, a numerical example is given to illustrate the result.
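
For reference, one commonly cited form of the generalized (fractional) Gronwall inequality is stated below; the exact variant and assumptions used in the paper may differ.

```latex
Let $\beta>0$, let $a(t)$ be nonnegative and locally integrable on $[0,T)$, and let $g(t)$ be
nonnegative, nondecreasing, continuous and bounded on $[0,T)$. If
\[
  u(t) \;\le\; a(t) + g(t)\int_0^t (t-s)^{\beta-1}\,u(s)\,\mathrm{d}s ,
\]
then
\[
  u(t) \;\le\; a(t) + \int_0^t \Bigg[\sum_{n=1}^{\infty}
      \frac{\bigl(g(t)\,\Gamma(\beta)\bigr)^{n}}{\Gamma(n\beta)}\,(t-s)^{n\beta-1}\Bigg] a(s)\,\mathrm{d}s ,
\]
and, when $a(t)$ is nondecreasing,
\[
  u(t) \;\le\; a(t)\,E_{\beta}\!\bigl(g(t)\,\Gamma(\beta)\,t^{\beta}\bigr),
\]
where $E_{\beta}$ denotes the one-parameter Mittag-Leffler function.
```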

Keywords: Finite-time stabilization, fractional-order system, Gronwall inequality.

7263 Digital Transformation as the Subject of the Knowledge Model of the Discursive Space

Authors: Rafal Maciag

Abstract:

Due to the development of the current civilization, one must create suitable models of its pervasive, massive phenomena. One such phenomenon is digital transformation, which has a substantial number of disciplined, methodical interpretations forming a diversified body of reflection. This reflection can be understood pragmatically as the current, temporally and locally differentiated state of knowledge. The model of the discursive space is proposed as a model for the analysis and description of this knowledge. Discursive space is understood as an autonomous multidimensional space in which separate discourses traverse specific trajectories, which can be presented in a multidimensional parallel coordinate system. Discursive space built on the world of facts preserves the complex character of that world. Digital transformation as a discursive space has a relativistic character, which means that it is created by the dynamic discourses while, at the same time, these discourses are molded by the shape of this space.

Keywords: Knowledge, digital transformation, discourse, discursive space, complexity.

7262 Symbolic Analysis of Large Circuits Using Discrete Wavelet Transform

Authors: Ali Al-Ataby, Fawzi Al-Naima

Abstract:

Symbolic Circuit Analysis (SCA) is a technique used to generate the symbolic expression of a network. It has become a well-established technique in circuit analysis and design. The symbolic expression of a network offers an excellent way to perform frequency response analysis, sensitivity computation, stability measurements, performance optimization, and fault diagnosis. Many approaches have been proposed in the area of SCA, offering different features and capabilities. Numerical interpolation methods are very common in this context, especially those based on the Fast Fourier Transform (FFT). The aim of this paper is to present a method for SCA that uses the Wavelet Transform (WT) as a mathematical tool to generate the symbolic expression for large circuits while minimizing the analysis time by reducing the number of computations.
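
The interpolation idea behind such methods can be shown on a toy example: evaluate a network function at a few sample points and recover its polynomial coefficients, i.e. the "symbolic" form, by fitting. An RC low-pass denominator 1 + sRC is used here purely as a stand-in; the paper's contribution is to replace the usual FFT-based interpolation with a wavelet-transform approach for large circuits.

```python
import numpy as np

R, C = 1.0e3, 1.0e-6

def denominator(s):
    """Exact network polynomial to be recovered: 1 + RC s."""
    return 1.0 + s * R * C

s_samples = np.linspace(0.0, 5000.0, 6)            # evaluate the function at a few sample points
values = denominator(s_samples)
coeffs = np.polyfit(s_samples, values, deg=1)      # recover the polynomial coefficients
print(coeffs)                                      # approximately [1e-3, 1.0], i.e. 1 + 1e-3 s
```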

Keywords: Numerical Interpolation, Sparse Matrices, Symbolic Analysis, Wavelet Transform.

7261 Extension of Fish Shelf Life by Ozone Treatment

Authors: Behrouz Mosayebi Dehkordi, Neda Zokaie

Abstract:

The shelf life of fish was extended using the disinfection properties of ozone. For this purpose, trout specimens were exposed to ozone in an aqueous medium for two hours, and their microbial growth and biochemical properties were measured over time. Microbial growth in the ozone-treated fish was significantly slower than in the control sample, resulting in lower bacterial counts. According to the biochemical tests, the ozone treatment had no negative effects on the fat, protein, or moisture content of the fish. Peroxide and TVN (Total Volatile Nitrogen) measurements showed that the ozone treatment increased the trout shelf life from 4 days to 6 days. According to the sensory analysis, no changes were observed in the color or flavor of the ozone-treated trout.

Keywords: Fish, Ozone, Shelf life, Trout.

7260 Ranking and Unranking Algorithms for k-ary Trees in Gray Code Order

Authors: Fateme Ashari-Ghomi, Najme Khorasani, Abbas Nowzari-Dalini

Abstract:

In this paper, we present two new ranking and unranking algorithms for k-ary trees represented by x-sequences in Gray code order. These algorithms are based on a Gray code generation algorithm developed by Ahrabian et al. In that paper, a recursive backtracking generation algorithm for the x-sequences corresponding to k-ary trees in Gray code order was presented. That generation algorithm is based on Vajnovszki's algorithm for generating binary trees in Gray code ordering. To our knowledge, no ranking and unranking algorithms have been given for x-sequences in this ordering. We present ranking and unranking algorithms with O(kn²) time complexity for x-sequences in this Gray code ordering.
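
To illustrate what ranking and unranking in a Gray code ordering mean, the sketch below does it for the classical binary reflected Gray code on n-bit strings, where both operations reduce to prefix XOR. This is only the textbook binary case, not the paper's O(kn²) algorithms on the x-sequences that encode k-ary trees.

```python
def gray_rank(bits):
    """Position of a binary-reflected Gray code word in Gray code order."""
    rank, acc = 0, 0
    for b in bits:                   # prefix XOR converts the Gray code word to plain binary
        acc ^= b
        rank = (rank << 1) | acc
    return rank

def gray_unrank(rank, n):
    """The n-bit Gray code word occupying the given position."""
    binary = [(rank >> (n - 1 - i)) & 1 for i in range(n)]
    return [binary[0]] + [binary[i] ^ binary[i - 1] for i in range(1, n)]

# the 3-bit Gray code sequence 000,001,011,010,110,111,101,100 is recovered in order
print([gray_unrank(r, 3) for r in range(8)])
print([gray_rank(gray_unrank(r, 3)) for r in range(8)])
```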

Keywords: k-ary Tree Generation, Ranking, Unranking, Gray Code.

7259 Exploiting Two Intelligent Models to Predict Water Level: A Field Study of Urmia Lake, Iran

Authors: Shahab Kavehkar, Mohammad Ali Ghorbani, Valeriy Khokhlov, Afshin Ashrafzadeh, Sabereh Darbandi

Abstract:

Water level forecasting using records of past time series is of importance in water resources engineering and management. For example, the water level affects groundwater tables in low-lying coastal areas, as well as the hydrological regimes of some coastal rivers. A reliable prediction of sea-level variations is therefore required in coastal engineering and hydrologic studies. During the past two decades, approaches based on Genetic Programming (GP) and Artificial Neural Networks (ANN) have been developed. In the present study, GP is used to forecast daily water level variations for a set of time intervals using observed water levels. The measurements from a single tide gauge at Urmia Lake, Northwest Iran, were used to train and validate the GP approach for the period from January 1997 to July 2008. Two statistics, the root mean square error and the correlation coefficient, are used to verify the model by comparing its outputs with the corresponding outputs of an Artificial Neural Network model. The results show that both artificial intelligence methodologies are satisfactory and can be considered as alternatives to conventional harmonic analysis.
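
A lag-based neural network forecaster of the kind used as the ANN benchmark can be sketched as below: predict the next daily level from the previous few levels and score the held-out predictions with RMSE and the correlation coefficient. The synthetic series, lag length and network settings are illustrative assumptions; neither the GP model nor the Urmia Lake gauge data are reproduced here.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(4)
t = np.arange(2000)
level = 1274.0 + 0.5 * np.sin(2 * np.pi * t / 365.0) + 0.05 * rng.standard_normal(t.size)

lags = 3                                               # use the last 3 daily levels as inputs
X = np.column_stack([level[i:len(level) - lags + i] for i in range(lags)])
y = level[lags:]
split = int(0.8 * len(y))

model = MLPRegressor(hidden_layer_sizes=(10,), max_iter=2000, random_state=0)
model.fit(X[:split], y[:split])
pred = model.predict(X[split:])

rmse = np.sqrt(np.mean((pred - y[split:]) ** 2))       # verification statistics from the abstract
corr = np.corrcoef(pred, y[split:])[0, 1]
print(f"RMSE = {rmse:.4f}, correlation = {corr:.3f}")
```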

Keywords: Water-Level variation, forecasting, artificial neural networks, genetic programming, comparative analysis.

7258 3D Shape Knitting: Loop Alignment on a Surface with Positive Gaussian Curvature

Authors: C. T. Cheung, R. K. P. Ng, T. Y. Lo, Zhou Jinyun

Abstract:

This paper aims at manipulating loop alignment in knitting a three-dimensional (3D) shape according to its geometry. Two loop alignment methods are introduced to handle a surface with positive Gaussian curvature. As weft knitting is a two-dimensional (2D) knitting mechanism in which the knitting cam carrying the feeders moves in only two directions, left and right, the knitted fabric grows in width and length but not in depth. Therefore, a 3D shape has to be flattened to a 2D plane with its surface area preserved for knitting. On this flattened plane, dimensional measurements are taken for loop alignment. The way these measurements are taken gives rise to two different loop alignment methods. In this paper, only the plain knitted structure was considered. Each knitted loop was taken as a basic unit for loop alignment in order to achieve the required geometric dimensions, without the inclusion of other stitches, which give textural dimensions to the fabric. The two loop alignment methods were tested and compared. Only one of the two can successfully preserve the dimensions of the shape.

Keywords: 3D knitting, 3D shape, loop alignment, positive Gaussian curvature.

7257 Courses Pre-Required Visualization Using Force Directed Placement Technique

Authors: Imen Ammari, Mourad Elloumi, Ala Eddine Barouni

Abstract:

Visualizing the "Courses-Pre-Required-Architecture" on screen has proven to be useful and helpful for university actors and especially for students. In fact, students can easily identify courses and their prerequisites, perceive the courses to follow in the future, and then rapidly choose the appropriate course to register for. Given a set of courses and their prerequisites, we present an algorithm for visualizing a graph, entitled the "Courses-Pre-Required-Graph", that presents courses and their prerequisites in order to help students recognize, on their own, which courses to take in the future and perceive the content of all the courses they will study. Our algorithm, using the "Force Directed Placement" technique, visualizes the "Courses-Pre-Required-Graph" in such a way that courses are easily identifiable. The time complexity of our drawing algorithm is O(n²), where n is the number of courses in the "Courses-Pre-Required-Graph".
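
A minimal force-directed placement in the Fruchterman-Reingold style is sketched below: all nodes repel each other and prerequisite edges attract their endpoints, so connected courses end up near each other. The course graph, constants and step size are invented for illustration; each iteration over all node pairs costs O(n²), matching the complexity quoted in the abstract, but this is not the authors' exact drawing algorithm.

```python
import numpy as np

courses = ["Calc I", "Calc II", "Linear Algebra", "Prob & Stats", "Machine Learning"]
prereq_edges = [(0, 1), (1, 3), (2, 4), (3, 4)]        # (prerequisite, course), hypothetical

rng = np.random.default_rng(5)
pos = rng.random((len(courses), 2))
k = 0.4                                                 # ideal edge length

for step in range(200):
    disp = np.zeros_like(pos)
    for i in range(len(courses)):                       # O(n^2) pairwise repulsion
        delta = pos[i] - pos
        dist = np.maximum(np.linalg.norm(delta, axis=1), 1e-6)
        disp[i] += np.sum((delta.T * (k**2 / dist**2)).T, axis=0)
    for a, b in prereq_edges:                           # attraction along prerequisite edges
        delta = pos[a] - pos[b]
        dist = max(np.linalg.norm(delta), 1e-6)
        pull = delta * (dist / k)
        disp[a] -= pull
        disp[b] += pull
    pos += np.clip(0.01 * disp, -0.05, 0.05)            # small, clipped step instead of explicit cooling

for name, (xc, yc) in zip(courses, pos):
    print(f"{name:>16s}: ({xc:5.2f}, {yc:5.2f})")
```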

Keywords: Courses-Pre-Required-Architecture, Courses-Pre-Required-Graph, Courses-Pre-Required-Visualization, Force Directed Placement, Resolution.

7256 Depletion Layer Parameters of Al-MoO3-P-CdTe-Al MOS Structures

Authors: A. C. Sarmah

Abstract:

The Al-MoO3-p-CdTe-Al MOS sandwich structures were fabricated by the vacuum deposition method on cleaned glass substrates. Capacitance versus voltage measurements were performed at different frequencies and sweep rates of the applied voltage for oxide and semiconductor films of different thicknesses. In the negative voltage region of the C-V curve, a high differential capacitance of the semiconductor was observed, and at high frequencies (<10 kHz) the transition from accumulation to depletion and further to deep depletion was observed as the voltage was swept from negative to positive. A study has been undertaken to determine the value of the acceptor density and some depletion layer parameters such as the depletion layer capacitance, depletion width, impurity concentration, flat band voltage, Debye length, flat band capacitance, diffusion or built-in potential, and space charge per unit area. These were determined from C-V measurements for different oxide and semiconductor thicknesses.
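
The textbook MOS depletion-layer relations behind such an extraction are sketched below: the extrinsic Debye length, the flat-band capacitance as the oxide capacitance in series with the semiconductor flat-band capacitance, and the depletion width and space charge at a given surface band bending. The material constants (relative permittivities, acceptor density, oxide thickness, band bending) are rough placeholder values, not the Al-MoO3-p-CdTe-Al device parameters of the paper.

```python
import numpy as np

q = 1.602e-19          # elementary charge [C]
kT = 0.0259 * q        # thermal energy at room temperature [J]
eps0 = 8.854e-12       # vacuum permittivity [F/m]
eps_s = 10.0 * eps0    # semiconductor permittivity (placeholder)
eps_ox = 6.0 * eps0    # oxide permittivity (placeholder)
N_A = 1e22             # acceptor density [m^-3] (placeholder)
t_ox = 50e-9           # oxide thickness [m] (placeholder)

L_D = np.sqrt(eps_s * kT / (q**2 * N_A))          # extrinsic Debye length
C_ox = eps_ox / t_ox                              # oxide capacitance per unit area
C_s_fb = eps_s / L_D                              # semiconductor capacitance at flat band
C_fb = C_ox * C_s_fb / (C_ox + C_s_fb)            # series combination = flat-band capacitance

psi_s = 0.5                                       # assumed surface band bending [V]
W = np.sqrt(2.0 * eps_s * psi_s / (q * N_A))      # depletion width at that band bending
Q_sc = q * N_A * W                                # depletion space charge per unit area

print(f"L_D = {L_D:.3e} m, C_fb = {C_fb:.3e} F/m^2, W = {W:.3e} m, Q_sc = {Q_sc:.3e} C/m^2")
```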

Keywords: Debye length, Depletion width, flat band capacitance, impurity concentration.

7255 Surgery Scheduling Using Simulation with Arena

Authors: J. A. López, C. I. López, J. E. Olguín, C. Camargo, J. M. López

Abstract:

Institutions seek to improve their performance and quality of service so that their patients are satisfied. This research project aims to conduct a time study in the area of gynecological surgery in order to determine the current level of capacity and to optimize the scheduling time so as to respond adequately to demand. The system is analyzed using waiting lines and simulated in ARENA to evaluate proposals for improving and optimizing the scheduling time of each of the surgeries.
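
Before building a full discrete-event model, a waiting-line analysis often starts from the standard M/M/1 formulas for utilisation, queue length and waiting time, as sketched below. The arrival and service rates are invented for illustration, not the gynecological-surgery data of the study, and the actual simulation in the paper is done in ARENA.

```python
def mm1_metrics(arrival_rate, service_rate):
    """Steady-state M/M/1 metrics; requires arrival_rate < service_rate."""
    rho = arrival_rate / service_rate            # server utilisation
    Lq = rho**2 / (1.0 - rho)                    # average number of patients waiting
    Wq = Lq / arrival_rate                       # average waiting time
    W = Wq + 1.0 / service_rate                  # average total time in the system
    return rho, Lq, Wq, W

# e.g. 3 surgery requests per day against a capacity of 4 surgeries per day (assumed)
rho, Lq, Wq, W = mm1_metrics(3.0, 4.0)
print(f"utilisation={rho:.2f}, queue length={Lq:.2f}, wait={Wq:.2f} d, total={W:.2f} d")
```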

Keywords: Time study, waiting lines, reducing time, simulation.

7254 Does Leisure Time Use Contribute to a Wage Increase of the Thai People?

Authors: Siriwan Saksiriruthai

Abstract:

This paper develops models to analyze the relationship between leisure time use and wage changes. Using Thailand's Time Use Survey and Labor Force Survey data, the estimation of wage changes in response to changes in time use indicates that media receiving, personal care, and social participation and volunteer activities are the ones that significantly raise hourly wages. Thus, the findings suggest stimulating time use for media access to enhance knowledge and productivity, for personal care to support attractiveness and health in order to raise productivity, and for social activities to develop connections for possible future opportunities, including wage increases. These activities should be promoted for productive leisure time and for welfare improvement.

Keywords: Leisure, wage, time use, Thailand.

7253 Evaluating the Validity of Computational Fluid Dynamics Model of Dispersion in a Complex Urban Geometry Using Two Sets of Experimental Measurements

Authors: Mohammad R. Kavian Nezhad, Carlos F. Lange, Brian A. Fleck

Abstract:

This research presents the validation study of a computational fluid dynamics (CFD) model developed to simulate the scalar dispersion emitted from rooftop sources around the buildings at the University of Alberta North Campus. The ANSYS CFX code was used to perform the numerical simulation of the wind regime and pollutant dispersion by solving the 3D steady Reynolds-averaged Navier-Stokes (RANS) equations on a building-scale high-resolution grid. The validation study was performed in two steps. First, the CFD model performance in 24 cases (eight wind directions and three wind speeds) was evaluated by comparing the predicted flow fields with the available data from the previous measurement campaign designed at the North Campus, using the standard deviation method (SDM). The estimated results of the numerical model showed maximum average percent errors of approximately 53% and 37% for wind incidents from the North and Northwest, respectively. Good agreement with the measurements was observed for the other six directions, with an average error of less than 30%. In the second step, the reliability of the implemented turbulence model, numerical algorithm, modeling techniques, and the grid generation scheme was further evaluated using the Mock Urban Setting Test (MUST) dispersion dataset. Different statistical measures, including the fractional bias (FB), the mean geometric bias (MG), and the normalized mean square error (NMSE), were used to assess the accuracy of the predicted dispersion field. Our CFD results are in very good agreement with the field measurements.
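
The validation statistics named above are commonly computed for paired observed and predicted concentrations as in the sketch below (one widely used sign convention; the paper's exact definitions may differ slightly). The two arrays are placeholders, not the campaign or MUST data.

```python
import numpy as np

def validation_metrics(obs, pred):
    """Fractional bias, geometric mean bias and normalised mean square error."""
    obs, pred = np.asarray(obs, float), np.asarray(pred, float)
    fb = (obs.mean() - pred.mean()) / (0.5 * (obs.mean() + pred.mean()))
    mg = np.exp(np.mean(np.log(obs)) - np.mean(np.log(pred)))     # requires positive values
    nmse = np.mean((obs - pred) ** 2) / (obs.mean() * pred.mean())
    return fb, mg, nmse

obs = [1.2, 0.8, 2.5, 3.1, 0.6, 1.9]
pred = [1.0, 1.1, 2.2, 2.8, 0.9, 2.4]
fb, mg, nmse = validation_metrics(obs, pred)
print(f"FB={fb:.3f}, MG={mg:.3f}, NMSE={nmse:.3f}")
```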

Keywords: CFD, plume dispersion, complex urban geometry, validation study, wind flow.

7252 Improving Human Hand Localization in Indoor Environment by Using Frequency Domain Analysis

Authors: Wipassorn Vinicchayakul, Pichaya Supanakoon, Sathaporn Promwong

Abstract:

A human hand localization method is improved by using radar cross section (RCS) measurements with a minimum root mean square (RMS) error matching algorithm on a touchless keypad mock-up model. RCS and frequency transfer function measurements were carried out in an indoor environment over the frequency range from 3.0 to 11.0 GHz to cover the federal communications commission (FCC) standards. The touchless keypad model was tested at two different distances between the hand and the keypad. The initial distance of 19.50 cm is identical to the height of the transmitting (Tx) and receiving (Rx) antennas, while the second distance is 29.50 cm from the keypad. Moreover, the effects of the Rx angles relative to the human hand are considered. The RCS input parameters are compared with power loss parameters at each frequency. The results show that the performance of the RCS input parameters at the second distance, 29.50 cm, at 3 GHz is better than the others.
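
Minimum-RMS-error fingerprint matching reduces to comparing the measured feature vector against the stored fingerprint of every keypad position and reporting the position with the smallest RMS error, as in the sketch below. The fingerprint database and the measurement are random placeholders, not the 3-11 GHz RCS data of the paper.

```python
import numpy as np

rng = np.random.default_rng(6)
n_keys, n_freqs = 12, 64                         # 12 keypad positions, 64 frequency points (assumed)
fingerprints = rng.standard_normal((n_keys, n_freqs))      # calibrated feature per key and frequency

true_key = 7
measurement = fingerprints[true_key] + 0.1 * rng.standard_normal(n_freqs)   # noisy observation

rms_errors = np.sqrt(np.mean((fingerprints - measurement) ** 2, axis=1))    # RMS error against every key
estimated_key = int(np.argmin(rms_errors))
print(f"estimated key = {estimated_key}, RMS error = {rms_errors[estimated_key]:.3f}")
```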

Keywords: Radar cross section (RCS), fingerprint-based localization, minimum root mean square (RMS) error matching algorithm, touchless keypad model.

7251 A New Heuristic Approach for the Large-Scale Generalized Assignment Problem

Authors: S. Raja Balachandar, K. Kannan

Abstract:

This paper presents a heuristic approach to solve the Generalized Assignment Problem (GAP), which is NP-hard. It is worth mentioning that many researchers have developed algorithms for identifying redundant constraints and variables in linear programming models. Some of these algorithms use the intercept matrix of the constraints to identify redundant constraints and variables prior to the start of the solution process. Here, a new heuristic approach based on the dominance property of the intercept matrix is proposed to find optimal or near-optimal solutions of the GAP. In this heuristic, the redundant variables of the GAP are identified by applying the dominance property of the intercept matrix repeatedly. The heuristic is tested on 90 benchmark problems of sizes up to 4000 taken from the OR-Library, and the results are compared with optimum solutions. The computational complexity of solving the GAP using this approach is shown to be O(mn²). The performance of our heuristic is compared with the best state-of-the-art heuristic algorithms with respect to the quality of the solutions. The encouraging results, especially for relatively large size test problems, indicate that this heuristic approach can successfully be used for finding good solutions for highly constrained NP-hard problems.
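
For readers unfamiliar with the GAP, a simple regret-based greedy construction is sketched below: each job goes to exactly one agent, agent capacities are respected, and at every step the job whose best feasible agent is hardest to replace is assigned first. This only conveys the flavour of constructive GAP heuristics; the authors' method instead exploits the dominance property of the intercept matrix, and the small instance is invented.

```python
import numpy as np

profit = np.array([[6.0, 9.0, 4.0, 2.0],      # profit[i, j]: agent i doing job j
                   [8.0, 6.0, 7.0, 5.0],
                   [5.0, 7.0, 9.0, 8.0]])
weight = np.array([[3.0, 4.0, 2.0, 2.0],      # resource weight[i, j]
                   [4.0, 3.0, 3.0, 2.0],
                   [2.0, 4.0, 4.0, 3.0]])
capacity = np.array([7.0, 6.0, 7.0])

m, n = profit.shape
assignment = [-1] * n
remaining = capacity.astype(float).copy()
unassigned = set(range(n))
while unassigned:
    best_job, best_agent, best_regret = None, None, -np.inf
    for j in list(unassigned):
        feasible = [i for i in range(m) if weight[i, j] <= remaining[i]]
        if not feasible:                      # job can no longer be placed anywhere
            unassigned.discard(j)
            continue
        profits = sorted((profit[i, j] for i in feasible), reverse=True)
        regret = profits[0] - (profits[1] if len(profits) > 1 else 0.0)
        if regret > best_regret:              # assign the job with the largest regret first
            best_regret, best_job = regret, j
            best_agent = max(feasible, key=lambda i: profit[i, j])
    if best_job is None:
        break
    assignment[best_job] = best_agent
    remaining[best_agent] -= weight[best_agent, best_job]
    unassigned.discard(best_job)

total = sum(profit[a, j] for j, a in enumerate(assignment) if a >= 0)
print(assignment, total)
```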

Keywords: Combinatorial Optimization Problem, Generalized Assignment Problem, Intercept Matrix, Heuristic, Computational Complexity, NP-Hard Problems.
