Search results for: Critical Thinking and Problem Solving Skills and Teamwork Skills
762 Parameter Optimization and Thermal Simulation in Laser Joining of Coach Peel Panels of Dissimilar Materials
Authors: Masoud Mohammadpour, Blair Carlson, Radovan Kovacevic
Abstract:
The quality of laser welded-brazed (LWB) joints depends strongly on the main process parameters; therefore, the effects of laser power (3.2–4 kW), welding speed (60–80 mm/s) and wire feed rate (70–90 mm/s) on mechanical strength and surface roughness were investigated in this study. A comprehensive optimization by means of response surface methodology (RSM) and the desirability function was used for multi-criteria optimization. The experiments were planned based on a Box–Behnken design, implementing linear and quadratic polynomial equations for predicting the desired output properties. Finally, validation experiments were conducted at the optimized process condition and showed good agreement between the predicted and experimental results. AlSi3Mn1 was selected as the filler material for joining aluminum alloy 6022 and hot-dip galvanized steel in the coach peel configuration. The high scanning speed kept the intermetallic compound (IMC) layer as thin as 5 µm. Thermal simulations of the joining process were conducted by the finite element method (FEM), and the results were validated against experimental data. The Fe/Al interfacial thermal history showed that the duration of the critical temperature range (700–900 °C) in this high-scanning-speed process was less than 1 s. This short interaction time leads to a reaction-controlled IMC layer rather than a diffusion-controlled growth mechanism.
Keywords: Laser welding-brazing, finite element, response surface methodology, multi-response optimization, cross-beam laser.
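Illustrative sketch (not from the paper): the response-surface step described above can be reproduced in miniature by fitting a quadratic Box–Behnken-style model to the three process parameters and ranking settings with a simple larger-is-better desirability function. The design points and strength values below are hypothetical placeholders, not the paper's measurements.

```python
import numpy as np

# Hypothetical design points for laser power (kW), welding speed (mm/s)
# and wire feed rate (mm/s); the response values are invented for illustration.
X = np.array([
    [3.2, 60, 80], [4.0, 60, 80], [3.2, 80, 80], [4.0, 80, 80],
    [3.2, 70, 70], [4.0, 70, 70], [3.2, 70, 90], [4.0, 70, 90],
    [3.6, 60, 70], [3.6, 80, 70], [3.6, 60, 90], [3.6, 80, 90],
    [3.6, 70, 80], [3.6, 70, 80], [3.6, 70, 80],
])
strength = np.array([2.1, 2.6, 2.3, 2.8, 2.0, 2.5, 2.2, 2.7,
                     2.4, 2.5, 2.3, 2.6, 2.9, 2.8, 2.9])   # kN, illustrative

def quadratic_design_matrix(X):
    """Linear, two-factor interaction and quadratic terms, as in RSM."""
    x1, x2, x3 = X.T
    return np.column_stack([np.ones(len(X)), x1, x2, x3,
                            x1*x2, x1*x3, x2*x3, x1**2, x2**2, x3**2])

beta, *_ = np.linalg.lstsq(quadratic_design_matrix(X), strength, rcond=None)

def desirability(y, lo, hi):
    """Larger-is-better desirability, linearly scaled and clipped to [0, 1]."""
    return np.clip((y - lo) / (hi - lo), 0.0, 1.0)

# Evaluate the fitted model and its desirability on a coarse parameter grid.
grid = np.array([[p, v, f] for p in np.linspace(3.2, 4.0, 5)
                 for v in np.linspace(60, 80, 5)
                 for f in np.linspace(70, 90, 5)])
pred = quadratic_design_matrix(grid) @ beta
d = desirability(pred, strength.min(), strength.max())
print("most desirable setting (power, speed, feed):", grid[np.argmax(d)])
```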
761 Design of Two-Channel Quincunx Quadrature Mirror Filter Banks Using Digital All-Pass Lattice Filters
Authors: Ju-Hong Lee, Chong-Jia Ciou
Abstract:
This paper deals with the problem of designing two-dimensional (2-D) recursive two-channel quincunx quadrature mirror filter (QQMF) banks. The analysis and synthesis filters of the 2-D recursive QQMF bank are composed of 2-D recursive digital all-pass lattice filters (DALFs) with symmetric half-plane (SHP) support regions. The 2-D doubly complementary half-band (DC-HB) property possessed by the analysis and synthesis filters facilitates the design of the proposed QQMF bank. To find the coefficients of the 2-D recursive SHP DALFs, we present a lattice structure for realizing 2-D recursive digital all-pass filters. The novelty of using 2-D SHP recursive DALFs to construct a 2-D recursive QQMF bank is that the resulting bank provides better performance than existing 2-D recursive QQMF banks. Simulation results are presented for illustration and comparison.
Keywords: All-pass digital filter, doubly complementary, lattice structure, symmetric half-plane digital filter, quincunx QMF bank.
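Illustrative sketch (not from the paper): the doubly complementary property mentioned above is easiest to see in one dimension, where a low-pass/high-pass pair built from two all-pass branches is both all-pass complementary and power complementary by construction. The first-order all-pass coefficients below are illustrative and are not taken from the paper's 2-D symmetric half-plane design.

```python
import numpy as np
from scipy.signal import freqz

# 1-D analogue of the all-pass decomposition used in QMF banks:
#   H_low(z)  = (A0(z^2) + z^{-1} A1(z^2)) / 2
#   H_high(z) = (A0(z^2) - z^{-1} A1(z^2)) / 2
a0, a1 = 0.1416, 0.5898        # first-order all-pass coefficients (assumed)

def allpass_z2(a):
    """A(z^2) = (a + z^{-2}) / (1 + a z^{-2}) as numerator/denominator pairs."""
    return [a, 0.0, 1.0], [1.0, 0.0, a]

b0, den0 = allpass_z2(a0)
b1, den1 = allpass_z2(a1)

w = np.linspace(0, np.pi, 512)
_, A0 = freqz(b0, den0, w)
_, A1 = freqz(b1, den1, w)
H_low  = 0.5 * (A0 + np.exp(-1j * w) * A1)
H_high = 0.5 * (A0 - np.exp(-1j * w) * A1)

# "Doubly complementary": the pair is all-pass complementary and power complementary.
print(np.allclose(np.abs(H_low + H_high), 1.0))
print(np.allclose(np.abs(H_low)**2 + np.abs(H_high)**2, 1.0))
```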
760 Improved Modulo 2^n + 1 Adder Design
Authors: Somayeh Timarchi, Keivan Navi
Abstract:
Efficient modulo 2^n + 1 adders are important for several applications, including residue number systems, digital signal processors and cryptography algorithms. In this paper we present a novel modulo 2^n + 1 addition algorithm for a recently presented number system. The proposed approach is introduced to reduce the power dissipated. In a conventional modulo 2^n + 1 adder, all operands have (n+1)-bit length. To avoid using (n+1)-bit circuits, the diminished-1 and carry-save diminished-1 number systems can be used effectively. We also derive two new architectures for the modulo 2^n + 1 adder based on an n-bit ripple-carry adder; the first architecture is faster, whereas the second uses less hardware. In the proposed method, the special treatment required for zero operands in the diminished-1 number system is removed. The fastest modulo 2^n + 1 adders in the normal binary system require three-operand adders; this problem is also resolved in this paper. The proposed architectures are compared with several efficient adders based on the ripple-carry adder and on high-speed adders. It is shown that the hardware overhead and power consumption are reduced, and in some cases the power-delay product is also reduced.
Keywords: Modulo 2^n + 1 arithmetic, residue number system, low power, ripple-carry adders.
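Illustrative sketch (not the paper's circuit): diminished-1 arithmetic can be demonstrated in software. An operand A in [1, 2^n] is stored as the n-bit value A - 1, and the sum is formed with an end-around complemented carry; the special zero handling that the paper removes is simply skipped here.

```python
# Diminished-1 modulo 2^n + 1 addition (zero operands excluded for brevity).
def dim1_add(a_star: int, b_star: int, n: int) -> int:
    """Return (A + B mod 2^n + 1) - 1, given A - 1 and B - 1 as n-bit integers."""
    mask = (1 << n) - 1
    total = a_star + b_star
    carry = total >> n                     # carry out of the n-bit addition
    return (total + (1 - carry)) & mask    # add the complemented carry back in

# Quick check against ordinary integer arithmetic for n = 4 (modulo 17).
n, m = 4, 17
for A in range(1, 17):
    for B in range(1, 17):
        S = (A + B) % m
        if S == 0:     # zero result needs the special treatment discussed above
            continue
        assert dim1_add(A - 1, B - 1, n) == S - 1
print("diminished-1 addition verified for all non-zero sums modulo 17")
```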
759 Evolutionary Algorithms for Learning Primitive Fuzzy Behaviors and Behavior Coordination in Multi-Objective Optimization Problems
Authors: Li Shoutao, Gordon Lee
Abstract:
Evolutionary robotics is concerned with the design of intelligent systems with life-like properties by means of simulated evolution. Approaches in evolutionary robotics can be categorized according to the control structures that represent the behavior and the parameters of the controller that undergo adaptation. The basic idea is to automatically synthesize behaviors that enable the robot to perform useful tasks in complex environments. The evolutionary algorithm searches through the space of parameterized controllers that map sensory perceptions to control actions, thus realizing a specific robotic behavior. Further, the evolutionary algorithm maintains and improves a population of candidate behaviors by means of selection, recombination and mutation. A fitness function evaluates the performance of the resulting behavior according to the robot's task or mission. In this paper, the focus is on the use of genetic algorithms to solve a multi-objective optimization problem representing robot behaviors; in particular, the A-Compander Law is employed in selecting the weight of each objective during the optimization process. Results using an adaptive fitness function show that this approach can efficiently react to complex tasks under variable environments.
Keywords: Adaptive fuzzy neural inference, evolutionary tuning.
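Illustrative sketch (not from the paper): the core loop of selection, recombination and mutation against a weighted multi-objective fitness can be written compactly. The two toy objectives and the plain weighted-sum scalarization below are simplifying assumptions; the paper's fuzzy behavior encoding and its A-Compander weighting law are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(42)

# Two toy objectives standing in for robot-behavior criteria (e.g. goal progress
# vs. obstacle clearance); both are to be maximized.
def objectives(x):
    f1 = -np.sum((x - 1.0) ** 2, axis=1)
    f2 = -np.sum((x + 1.0) ** 2, axis=1)
    return f1, f2

def fitness(x, w=0.6):
    f1, f2 = objectives(x)
    return w * f1 + (1.0 - w) * f2          # plain weighted sum (assumed scheme)

pop = rng.uniform(-2, 2, size=(60, 4))      # population of controller parameters
for generation in range(100):
    fit = fitness(pop)
    # Tournament selection: keep the better of two randomly drawn individuals.
    i, j = rng.integers(0, len(pop), (2, len(pop)))
    parents = np.where((fit[i] > fit[j])[:, None], pop[i], pop[j])
    # Arithmetic (blend) recombination followed by Gaussian mutation.
    mates = parents[rng.permutation(len(parents))]
    alpha = rng.random((len(parents), 1))
    children = alpha * parents + (1 - alpha) * mates
    children += 0.05 * rng.standard_normal(children.shape)
    pop = children

best = pop[np.argmax(fitness(pop))]
print("best parameter vector:", np.round(best, 3))
```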
758 Support Vector Machine based Intelligent Watermark Decoding for Anticipated Attack
Authors: Syed Fahad Tahir, Asifullah Khan, Abdul Majid, Anwar M. Mirza
Abstract:
In this paper, we present an innovative scheme for blindly extracting message bits from an image distorted by an attack. A Support Vector Machine (SVM) is used to nonlinearly classify the bits of the embedded message. Traditionally, a hard decoder is used under the assumption that the underlying model of the Discrete Cosine Transform (DCT) coefficients does not change appreciably. In case of an attack, the distribution of the image coefficients is heavily altered. The distributions of the sufficient statistics at the receiving end corresponding to the antipodal signals overlap, and a simple hard decoder fails to classify them properly. We treat message retrieval of the antipodal signal as a binary classification problem. Machine learning techniques such as the SVM are used to retrieve the message when a certain specific class of attacks is most probable. In order to validate the SVM-based decoding scheme, we have taken Gaussian noise as a test case. We generate a data set using 125 images and 25 different keys. The polynomial kernel of the SVM achieved 100 percent accuracy on the test data.
Keywords: Bit Correct Ratio (BCR), grid search, intelligent decoding, jackknife technique, Support Vector Machine (SVM), watermarking.
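Illustrative sketch (not the paper's pipeline): a polynomial-kernel SVM with a small grid search, decoding synthetic antipodal bits corrupted by a Gaussian-noise attack. The feature construction, dimensions and noise level are assumptions made purely for demonstration.

```python
import numpy as np
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Synthetic antipodal embedding: each coefficient vector leans positive for
# bit 1 and negative for bit 0, then a Gaussian-noise attack blurs the classes.
n_samples, n_features = 2000, 8
bits = rng.integers(0, 2, n_samples)
signal = (2 * bits - 1)[:, None] * rng.uniform(0.3, 0.7, (n_samples, n_features))
attacked = signal + rng.normal(scale=0.8, size=signal.shape)   # assumed attack strength

X_train, X_test, y_train, y_test = train_test_split(
    attacked, bits, test_size=0.25, random_state=1)

# Polynomial-kernel SVM with a small grid search over C and the degree.
search = GridSearchCV(SVC(kernel="poly"),
                      {"C": [0.1, 1, 10], "degree": [2, 3]}, cv=5)
search.fit(X_train, y_train)

hard = (X_test.sum(axis=1) > 0).astype(int)     # naive hard-threshold decoder
print("hard-decoder bit correct ratio:", (hard == y_test).mean())
print("SVM bit correct ratio:        ", search.score(X_test, y_test))
```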
757 Effect of Fractional Flow Curves on the Heavy Oil and Light Oil Recoveries in Petroleum Reservoirs
Authors: Abdul Jamil Nazari, Shigeo Honma
Abstract:
This paper evaluates and compares the effect of fractional flow curves on heavy oil and light oil recoveries in a petroleum reservoir. Fingering of the flowing water is one of the serious problems in oil displacement by water; another problem is estimating the amount of recoverable oil in a petroleum reservoir. To address these problems, the fractional flow of heavy oil and of light oil is investigated. The fractional flow approach treats the multi-phase flow rate as a total mixed fluid and then describes the individual phases as fractions of the total flow. Laboratory experiments were carried out for two different types of oil, a heavy oil and a light oil, to obtain relative permeability and fractional flow curves experimentally. Application of the light oil fractional flow curve, which exhibits a regular S-shape, to the water flooding method showed that a large amount of mobile oil in the reservoir is displaced by water injection. In contrast, the fractional flow curve of heavy oil does not display an S-shape because of its high viscosity. Although the advance of the injected waterfront is faster than in light oil reservoirs, a significant amount of mobile oil remains behind the waterfront.
Keywords: Fractional flow curve, oil recovery, relative permeability, water fingering.
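Illustrative sketch (not the measured curves): the contrast between the S-shaped light-oil curve and the heavy-oil curve follows directly from the fractional-flow expression f_w = 1 / (1 + k_ro·mu_w / (k_rw·mu_o)). The Corey-type relative permeability exponents and the viscosities below are assumed values, not the experimental ones.

```python
import numpy as np

# Corey-type relative permeabilities (endpoints and exponents assumed).
def rel_perm(Sw, Swc=0.2, Sor=0.2, krw_end=0.4, kro_end=0.9, nw=3.0, no=2.0):
    Se = np.clip((Sw - Swc) / (1 - Swc - Sor), 0, 1)
    return krw_end * Se**nw, kro_end * (1 - Se)**no

def fractional_flow(Sw, mu_w, mu_o):
    krw, kro = rel_perm(Sw)
    return 1.0 / (1.0 + (kro * mu_w) / (krw * mu_o + 1e-12))

Sw = np.linspace(0.2, 0.8, 13)
mu_w = 1.0                                           # water viscosity, mPa.s
for label, mu_o in [("light oil", 5.0), ("heavy oil", 500.0)]:   # assumed viscosities
    print(label, np.round(fractional_flow(Sw, mu_w, mu_o), 3))
# The light-oil curve rises in a regular S-shape over the saturation range, while
# the heavy-oil curve jumps toward 1 almost immediately, which is why injected
# water breaks through early and leaves mobile oil behind the front.
```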
756 Nonlinear Analysis of Postural Sway in Multiple Sclerosis
Authors: Hua Cao, Laurent Peyrodie, Olivier Agnani, Cécile Donzé
Abstract:
Multiple Sclerosis (MS) is a disease which affects the central nervous system and causes balance problems. In clinical practice, this disorder is usually evaluated using static posturography. Linear or nonlinear measures extracted from the posturographic data (i.e., the center of pressure, COP) recorded during a balance test have been used to analyze the postural control of MS patients. In this study, the trend (TREND) and the sample entropy (SampEn), two nonlinear parameters, were chosen to investigate their relationships with the expanded disability status scale (EDSS) score. 40 volunteers with different EDSS scores participated in our experiments with eyes open (EO) and eyes closed (EC). TREND and two types of SampEn (SampEn1 and SampEn2) were calculated for each combined COP position signal. The results show that TREND had a weak negative correlation with EDSS, while SampEn2 had a strong positive correlation with EDSS. Compared to TREND and SampEn1, SampEn2 showed a better significant correlation with EDSS and an ability to discriminate the MS patients in the EC case. In addition, the outcome of the study suggests that multi-dimensional nonlinear analysis could provide information about the impact of disability progression in MS on the dynamics of the COP data.
Keywords: Balance, multiple sclerosis, nonlinear analysis, postural sway.
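Illustrative sketch (not the clinical data): sample entropy can be computed directly from a COP time series using the standard SampEn(m, r) definition. The embedding dimension, tolerance and the synthetic test signals below are placeholders; the paper's two SampEn variants are not specified in the abstract.

```python
import numpy as np

def sample_entropy(x, m=2, r=0.2):
    """Standard SampEn: -ln(A/B), where B and A count template matches of
    length m and m+1 under a Chebyshev tolerance of r times the signal std."""
    x = np.asarray(x, dtype=float)
    tol = r * x.std()
    n = len(x)

    def count_matches(length):
        templates = np.array([x[i:i + length] for i in range(n - m)])
        count = 0
        for i in range(len(templates)):
            dist = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            count += np.sum(dist <= tol)
        return count

    B = count_matches(m)
    A = count_matches(m + 1)
    return -np.log(A / B) if A > 0 and B > 0 else np.inf

rng = np.random.default_rng(3)
regular = np.sin(np.linspace(0, 40 * np.pi, 2000))     # highly regular sway signal
irregular = rng.standard_normal(2000)                  # highly irregular sway signal
print("SampEn regular  :", round(sample_entropy(regular), 3))
print("SampEn irregular:", round(sample_entropy(irregular), 3))
```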
755 Group Learning for the Design of Human Resource Development for Enterprise
Authors: Hao-Hsi Tseng, Hsin-Yun Lee, Yu-Cheng Kuo
Abstract:
In order to understand which learning method performs better and to improve CAD courses for enterprise design human resource development, this research applies two learning approaches to the practical learning of computer graphics software. In this study, the Revit building information model was used as the learning content, and two different modes of learning were designed for the curriculum: functional learning and project learning. A post-test, questionnaires and student interviews were used to comparatively analyze the effectiveness of the two learning modes. Students participated in a nine-hour course over a period of three weeks and finally took written and hands-on tests. In addition, students filled in a questionnaire of fifteen items whose question types covered three directions: basic software operation, software application, and software-based concept features. Beyond the questionnaire, participants in the two learning methods were also interviewed to learn more about their views on the two modes. The study found that the ad hoc, short-term style of course led to better learning outcomes. On the other hand, students were more satisfied with the functional style for the whole course, and students found the ad hoc style of learning difficult to accept.
Keywords: Development, education, human resource, learning.
754 Development of Energy Benchmarks Using Mandatory Energy and Emissions Reporting Data: Ontario Post-Secondary Residences
Authors: C. Xavier Mendieta, J. J McArthur
Abstract:
Governments are playing an increasingly active role in reducing carbon emissions, and a key strategy has been the introduction of mandatory energy disclosure policies. These policies have resulted in a significant amount of publicly available data, providing researchers with a unique opportunity to develop location-specific energy and carbon emission benchmarks, which can then be used to develop building archetypes and to inform urban energy models. This study presents the development of such a benchmark using the public reporting data. The data from Ontario's Ministry of Energy for Post-Secondary Educational Institutions are used to develop a series of building-archetype dynamic building loads and energy benchmarks to fill a gap in the currently available building database. This paper presents the development of a benchmark for college and university residences within ASHRAE climate zone 6 areas in Ontario using the mandatory disclosure energy and greenhouse gas emissions data. The methodology presented includes data cleaning, statistical analysis, and benchmark development, and lessons learned from this investigation are presented and discussed to inform the development of future energy benchmarks from this larger data set. The key findings from this initial benchmarking study are: (1) the importance of careful data screening and outlier identification to develop a valid dataset; (2) the key features used to model the data are building age, size, and occupancy schedules, and these can be used to estimate energy consumption; and (3) policy changes affecting primary energy generation significantly affected greenhouse gas emissions, and consideration of these factors was critical to evaluating the validity of the reported data.
Keywords: Building archetypes, data analysis, energy benchmarks, GHG emissions.
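Illustrative sketch (not the Ontario data): the data-screening step emphasized in the findings can be shown with pandas, using interquartile-range fences on energy use intensity followed by a grouping by construction vintage. The column names and generated records below are hypothetical.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(7)

# Hypothetical residence records: floor area (m^2), year built, annual energy (GJ).
n = 120
df = pd.DataFrame({
    "floor_area_m2": rng.uniform(2000, 20000, n),
    "year_built": rng.integers(1950, 2015, n),
})
df["energy_GJ"] = df["floor_area_m2"] * rng.normal(0.9, 0.15, n)
df.loc[rng.choice(n, 5, replace=False), "energy_GJ"] *= 8     # inject reporting errors

# Energy use intensity, then IQR-based outlier screening (1.5 * IQR fences).
df["eui_GJ_per_m2"] = df["energy_GJ"] / df["floor_area_m2"]
q1, q3 = df["eui_GJ_per_m2"].quantile([0.25, 0.75])
fence_lo, fence_hi = q1 - 1.5 * (q3 - q1), q3 + 1.5 * (q3 - q1)
clean = df[df["eui_GJ_per_m2"].between(fence_lo, fence_hi)]
print(f"screened out {len(df) - len(clean)} suspect records of {len(df)}")

# Benchmark by construction vintage (building age is one of the key features).
clean = clean.assign(vintage=pd.cut(clean["year_built"],
                                    [1949, 1980, 2000, 2015],
                                    labels=["pre-1980", "1980-2000", "post-2000"]))
print(clean.groupby("vintage", observed=True)["eui_GJ_per_m2"]
          .describe()[["count", "mean", "std"]])
```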
753 A Multigrid Approach for Three-Dimensional Inverse Heat Conduction Problems
Authors: Jianhua Zhou, Yuwen Zhang
Abstract:
A two-step multigrid approach is proposed to solve the inverse heat conduction problem in a 3-D object under laser irradiation. In the first step, the location of the laser center is estimated using a coarse and uniform grid system. In the second step, the front-surface temperature is recovered with good accuracy using a multiple grid system in which a fine mesh is used at the laser spot center to capture the drastic temperature rise in this region, while a coarse mesh is employed in the peripheral region to reduce the total number of sensors required. The effectiveness of the two-step approach and the multiple grid system is demonstrated by the illustrative inverse solutions. If the measurement data for the temperature and heat flux on the back surface do not contain random error, the proposed multigrid approach can yield more accurate inverse solutions. When the back-surface measurement data contain random noise, accurate inverse solutions cannot be obtained if both temperature and heat flux are measured on the back surface.
Keywords: Conduction, inverse problems, conjugate gradient method, laser.
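Illustrative sketch (not the paper's 3-D formulation): the keywords list the conjugate gradient method, which is the usual engine for the linear subproblems arising in such inverse heat conduction solutions. Below is a minimal, self-contained CG iteration on a toy symmetric positive-definite system, purely to show the inner solver.

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10, max_iter=500):
    """Solve A x = b for symmetric positive-definite A by conjugate gradients."""
    x = np.zeros_like(b)
    r = b - A @ x
    p = r.copy()
    rs_old = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs_old / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs_old) * p
        rs_old = rs_new
    return x

# Toy SPD system standing in for the discretized inverse-problem normal equations.
rng = np.random.default_rng(5)
M = rng.standard_normal((50, 50))
A = M @ M.T + 50 * np.eye(50)
b = rng.standard_normal(50)
x = conjugate_gradient(A, b)
print("residual norm:", np.linalg.norm(A @ x - b))
```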
752 Large Amplitude Free Vibration of a Very Sag Marine Cable
Authors: O. Punjarat, S. Chucheepsakul, T. Phanyasahachart
Abstract:
This paper focuses on a variational formulation of the large amplitude free vibration behavior of a marine cable with very large sag. In the static equilibrium state, the marine cable has a very large sag configuration. In the motion state, the marine cable is assumed to vibrate in in-plane motion with large amplitude about the static equilibrium position. The total virtual work-energy of the marine cable in the dynamic state is formulated; it involves the virtual strain energy due to axial deformation, the virtual work done by the effective weight, and the inertia forces. The equations of motion for the large amplitude free vibration of the marine cable are obtained by taking into account the difference between Euler's equation in the static state and in the displaced state. Based on the Galerkin finite element procedure, the linear and nonlinear stiffness matrices and the mass matrices of the marine cable are obtained, and the eigenvalue problem is solved. The natural frequency spectrum and the large amplitude free vibration behavior of the marine cable are presented.
Keywords: Axial deformation, free vibration, Galerkin Finite Element Method, large amplitude, variational method.
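Illustrative sketch (not the paper's nonlinear large-sag model): once the Galerkin stiffness and mass matrices are assembled, the natural frequency spectrum comes from the generalized eigenvalue problem K x = w^2 M x. The sketch below does this for a linear taut-cable (string) discretization with assumed tension, mass and length, as a simplified stand-in.

```python
import numpy as np
from scipy.linalg import eigh

# Assumed properties of a taut cable segment (stand-in values).
L, T, rho_A = 100.0, 5.0e4, 10.0     # length (m), tension (N), mass/length (kg/m)
n_el = 40
h = L / n_el

# Two-node element matrices for transverse vibration of a taut cable.
k_e = (T / h) * np.array([[1.0, -1.0], [-1.0, 1.0]])
m_e = (rho_A * h / 6.0) * np.array([[2.0, 1.0], [1.0, 2.0]])

ndof = n_el + 1
K = np.zeros((ndof, ndof))
M = np.zeros((ndof, ndof))
for e in range(n_el):
    K[e:e+2, e:e+2] += k_e
    M[e:e+2, e:e+2] += m_e

# Pin both ends, then solve the generalized eigenvalue problem K x = w^2 M x.
free = slice(1, ndof - 1)
w2, _ = eigh(K[free, free], M[free, free])
freqs = np.sqrt(w2[:4]) / (2 * np.pi)
exact = np.array([n * np.sqrt(T / rho_A) / (2 * L) for n in range(1, 5)])
print("FEM   f1..f4 (Hz):", np.round(freqs, 3))
print("exact f1..f4 (Hz):", np.round(exact, 3))
```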
751 Optimizing Spatial Trend Detection By Artificial Immune Systems
Authors: M. Derakhshanfar, B. Minaei-Bidgoli
Abstract:
Spatial trends are one of the valuable patterns in geo-databases. They play an important role in data analysis and knowledge discovery from spatial data. A spatial trend is a regular change of one or more non-spatial attributes when spatially moving away from a start object. Spatial trend detection is a graph search problem; therefore, heuristic methods can be a good solution. The artificial immune system (AIS) is a particular method for searching and optimizing. AIS is a novel evolutionary paradigm inspired by the biological immune system. Models based on immune system principles, such as the clonal selection theory, the immune network model or the negative selection algorithm, have been finding increasing applications in science and engineering. In this paper, we develop a novel immunological algorithm based on the clonal selection algorithm (CSA) for spatial trend detection. We create a neighborhood graph and neighborhood paths, and then select the spatial trends whose affinity for the antibody is high. In an evolutionary process with the artificial immune algorithm, the affinity of low-affinity trends is increased by mutation until the stop condition is satisfied.
Keywords: Spatial data mining, spatial trend detection, heuristic methods, artificial immune system, clonal selection algorithm (CSA).
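Illustrative sketch (not the paper's graph representation): a minimal clonal selection loop with affinity-proportional cloning, affinity-inverse hypermutation and reselection. The real-valued candidate encoding and the toy affinity function are simplifying assumptions; the paper encodes neighborhood paths instead.

```python
import numpy as np

rng = np.random.default_rng(11)

def affinity(x):
    # Toy affinity standing in for "strength of the spatial trend along a path".
    return -np.sum((x - 0.5) ** 2, axis=-1)

dim, pop_size, n_select, clones_per = 6, 30, 10, 5
antibodies = rng.uniform(-2, 2, (pop_size, dim))

for generation in range(200):
    aff = affinity(antibodies)
    best_idx = np.argsort(aff)[-n_select:]           # highest-affinity antibodies
    selected = antibodies[best_idx]
    ranks = np.arange(1, n_select + 1)               # 1 = worst selected, n_select = best

    clones = []
    for rank, ab in zip(ranks, selected):
        n_clones = clones_per * rank // n_select + 1  # more clones for better antibodies
        sigma = 0.5 / rank                            # less mutation for better antibodies
        clones.append(ab + sigma * rng.standard_normal((n_clones, dim)))
    clones = np.vstack([selected, *clones])           # keep parents (elitism)

    # Reselect the best and refill the rest of the repertoire randomly.
    order = np.argsort(affinity(clones))[::-1]
    survivors = clones[order[:pop_size - 5]]
    newcomers = rng.uniform(-2, 2, (5, dim))
    antibodies = np.vstack([survivors, newcomers])

best = antibodies[np.argmax(affinity(antibodies))]
print("best antibody:", np.round(best, 3), "affinity:", round(float(affinity(best)), 4))
```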
750 Disparities versus Similarities: WHO GPPQCL and ISO/IEC 17025:2017 International Standards for Quality Management Systems in Pharmaceutical Laboratories
Authors: M. A. Okezue, K. L. Clase, S. R. Byrn, P. Shivanand
Abstract:
Medicines regulatory authorities expect pharmaceutical companies and contract research organizations to seek ways to certify that their laboratory control measurements are reliable. Establishing and maintaining laboratory quality standards are essential in ensuring the accuracy of test results. 'ISO/IEC 17025:2017' and the 'WHO Good Practices for Pharmaceutical Quality Control Laboratories (GPPQCL)' are two quality standards commonly employed in developing laboratory quality systems. A review was conducted on the two standards to elaborate on areas of convergence and divergence. The goal was to understand how differences in each standard's requirements may influence laboratories' choices as to which document is easier to adopt for quality systems. A qualitative review method compared similar items in the two standards while mapping out areas where there were specific differences in the requirements of the two documents. The review also provided a detailed description of the clauses and parts covering management and technical requirements in these laboratory standards. The review showed that both documents share requirements for over ten critical areas covering objectives, infrastructure, management systems, and laboratory processes. There were, however, differences in expectations: GPPQCL emphasizes system procedures for planning and future budgets that will ensure continuity, whereas ISO 17025 is more focused on a risk management approach to establishing laboratory quality systems. Elements in the two documents form common standard requirements to assure the validity of laboratory test results and promote mutual recognition. The ISO standard currently has more global patronage than the GPPQCL.
Keywords: ISO/IEC 17025:2017, laboratory standards, quality control, WHO GPPQCL
749 Improvement of Frictional Coefficient of Modified Shoe Soles onto Icy and Snowy Road by Tilting of Added Glass Fibers into Rubber
Authors: Shunya Wakayama, Kazuya Okubo, Toru Fujii, Daisuke Sakata, Noriyuki Kado, Hiroshi Furutachi
Abstract:
The purpose of this study is to propose an effective method for improving the frictional coefficient between shoe rubber soles with added glass fibers and the surfaces of icy and snowy roads, in order to prevent slip-and-fall accidents by the users. The fibers added to the rubber were uniformly tilted with respect to the direction perpendicular to the frictional surface, with tilting angles of -60, -30, +30, +60, 90 and 0 degrees (the normal specimen), respectively. It was found that a parallel arrangement was effective in improving the frictional coefficient when the glass fibers were embedded in the shoe rubber, while an arrangement perpendicular to the normal direction of the embedded glass fibers on the shoe surface was also effective once the fibers had been exposed from the shoe rubber by abrasion. These improvements are explained by the increase in stiffness against shear deformation of the rubber at the critical frictional state and by the adequate scratching of fibers protruding perpendicular to the frictional direction, respectively. The most effective tilting angle for the frictional coefficient between the rubber specimens and a stone was perpendicular (0 degrees) to the frictional direction. To further improve the frictional coefficient, a combinative modified rubber specimen having two layers was fabricated, in which the tilting angle of the protruding fibers was 0 degrees near the contact surface and the tilting angle of the embedded fibers was 90 degrees near the back surface in the thickness direction. The current study suggests that effective arrangements of the tilting angle of the added fibers should be applied in designing rubber shoe soles to keep users safe in regions with cold climates.
Keywords: Frictional coefficient, icy and snowy road, shoe rubber soles, tilting angle.
748 Flow Analysis of Viscous Nanofluid Due to Rotating Rigid Disk with Navier’s Slip: A Numerical Study
Authors: Khalil Ur Rehman, M. Y. Malik, Usman Ali
Abstract:
In this paper, the problem proposed by Von Karman is treated in the presence of additional flow field effects when the liquid is spaced above the rotating rigid disk. To be more specific, a purely viscous fluid flow yielded by a rotating rigid disk with Navier's slip condition is considered in both magnetohydrodynamic and hydrodynamic frames. The rotating flow regime is manifested with a heat source/sink and chemically reactive species. Moreover, the features of thermophoresis and Brownian motion are reported by considering a nanofluid model. The flow field formulation is obtained mathematically in terms of high-order differential equations. The reduced system of equations is solved numerically through a self-coded computational algorithm. The pertinent outcomes are discussed systematically and provided in graphical and tabular form. The simultaneous treatment makes this attempt attractive in the sense that the article contains a dual framework, and the validation of results against existing work confirms the execution of the self-coded algorithm for the fluid flow regime over a rotating rigid disk.
Keywords: Nanoparticles, Newtonian fluid model, chemical reaction, heat source/sink.
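Illustrative sketch (not the paper's extended model): the classical Von Karman similarity equations for flow above a rotating disk can be solved as a boundary value problem. The Navier slip, magnetic field, heat source/sink, thermophoresis and Brownian motion effects of the paper are omitted here; only the base hydrodynamic flow is reproduced.

```python
import numpy as np
from scipy.integrate import solve_bvp

# Von Karman similarity equations (no-slip, purely hydrodynamic case):
#   F'' = F^2 - G^2 + H F',   G'' = 2 F G + H G',   H' = -2 F
def rhs(eta, y):
    F, dF, G, dG, H = y
    return np.vstack([dF, F**2 - G**2 + H*dF, dG, 2*F*G + H*dG, -2*F])

def bc(ya, yb):
    # F(0) = 0, G(0) = 1, H(0) = 0, F(inf) = 0, G(inf) = 0
    return np.array([ya[0], ya[2] - 1.0, ya[4], yb[0], yb[2]])

eta = np.linspace(0.0, 10.0, 200)
y0 = np.zeros((5, eta.size))
y0[0] = 0.4 * eta * np.exp(-eta)        # F: rises near the wall, then decays
y0[2] = np.exp(-eta)                    # G: decays from 1 at the disk surface
y0[4] = -0.8 * (1.0 - np.exp(-eta))     # H: tends to a negative constant

sol = solve_bvp(rhs, bc, eta, y0)
print("converged:", sol.success)
print("F'(0) =", round(float(sol.sol(0.0)[1]), 4), "(about 0.510 in the literature)")
print("G'(0) =", round(float(sol.sol(0.0)[3]), 4), "(about -0.616 in the literature)")
```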
747 A Comparative Study of Additive and Nonparametric Regression Estimators and Variable Selection Procedures
Authors: Adriano Z. Zambom, Preethi Ravikumar
Abstract:
One of the biggest challenges in nonparametric regression is the curse of dimensionality. Additive models are known to overcome this problem by estimating only the individual additive effects of each covariate. However, if the model is misspecified, the accuracy of the estimator compared to the fully nonparametric one is unknown. In this work, the efficiency of completely nonparametric regression estimators such as the Loess is compared to that of estimators that assume additivity in several situations, including additive and non-additive regression scenarios. The comparison is done by computing the oracle mean square error of the estimators with regard to the true nonparametric regression function. Then, a backward elimination selection procedure based on the Akaike Information Criterion is proposed, which is computed from either the additive or the nonparametric model. Simulations show that if the additive model is misspecified, the percentage of time it fails to select important variables can be higher than that of the fully nonparametric approach. A dimension reduction step is included when the nonparametric estimator cannot be computed due to the curse of dimensionality. Finally, the Boston housing dataset is analyzed using the proposed backward elimination procedure and the selected variables are identified.
Keywords: Additive models, local polynomial regression, residuals, mean square error, variable selection.
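Illustrative sketch (not the paper's estimators): the backward elimination idea with AIC as the drop criterion, using ordinary least squares as a stand-in working model and synthetic data. The paper computes the criterion from additive or fully nonparametric fits instead, so the OLS fit here is a simplifying assumption.

```python
import numpy as np

rng = np.random.default_rng(2024)

# Synthetic data: only the first three of eight covariates matter.
n, p = 300, 8
X = rng.standard_normal((n, p))
y = 2.0*X[:, 0] - 1.5*X[:, 1] + 0.8*X[:, 2] + rng.standard_normal(n)

def aic_ols(X, y):
    """AIC of a Gaussian OLS fit with intercept: n*log(RSS/n) + 2*(k+1)."""
    A = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    rss = np.sum((y - A @ beta) ** 2)
    return len(y) * np.log(rss / len(y)) + 2 * (A.shape[1] + 1)

active = list(range(p))
while len(active) > 1:
    current = aic_ols(X[:, active], y)
    # AIC of each candidate model with one covariate removed.
    trial = [aic_ols(X[:, [j for j in active if j != k]], y) for k in active]
    best = int(np.argmin(trial))
    if trial[best] >= current:       # no removal improves the criterion
        break
    active.pop(best)

print("selected covariates:", active)   # expected: [0, 1, 2] for this data
```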
746 Challenges in Adopting 3R Concept in the Heritage Building Restoration
Authors: H. H. Goh, K. C. Goh, T. W. Seow, N. S. Said, S. E. P. Ang
Abstract:
Malaysia is rich in historic buildings, particularly in the Penang and Malacca states. Restoration activities are increasingly important as these states are recognized as UNESCO World Heritage Sites. Restoration activities help to maintain the uniqueness and value of a heritage building. However, the increase in restoration activities has resulted in large quantities of waste. To cope with this problem, the 3R concept (reduce, reuse and recycle) is introduced. The 3R concept is one of the waste management hierarchies. This concept has yet to be applied in the building restoration industry to the extent seen in the construction industry. Therefore, this study aims to promote the 3R concept in the heritage building restoration industry, to examine the importance of the 3R concept and to identify challenges in applying the 3R concept in the heritage building restoration industry. This study focused on contractors and consultants who are involved in heritage restoration projects in Penang. A literature review and interviews help to reach the research objective. The data obtained are analyzed using content analysis. According to the research, application of the 3R concept is important for conserving natural resources and reducing pollution problems; however, limited space to organise waste is the main obstruction during the implementation of this concept. In conclusion, the 3R concept plays an important role in promoting environmental conservation and helping to reduce construction waste.
Keywords: 3R Concept, Heritage building, Restoration activities.
745 Spike Sorting Method Using Exponential Autoregressive Modeling of Action Potentials
Authors: Sajjad Farashi
Abstract:
Neurons in the nervous system communicate with each other by producing electrical signals called spikes. To investigate the physiological function of the nervous system, it is essential to study the activity of neurons by detecting and sorting spikes in the recorded signal. In this paper, a method is proposed for the spike sorting problem which is based on nonlinear modeling of spikes using an exponential autoregressive model. The genetic algorithm is utilized for model parameter estimation. In this regard, selected model coefficients are used as features for sorting purposes. For the optimal selection of model coefficients, a self-organizing feature map is used. The results show that modeling spikes with a nonlinear autoregressive model outperforms its linear counterpart. Also, the features extracted from the coefficients of the exponential autoregressive model are better than wavelet-based features and yield more compact and well-separated clusters. In the case of spikes that differ in small-scale structures, where principal component analysis fails to produce separated clouds in the feature space, the proposed method can obtain well-separated clusters, which removes the necessity of applying complex classifiers.
Keywords: Exponential autoregressive model, Neural data, spike sorting, time series modeling.
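Illustrative sketch (not the paper's spike data): a compact way to see what an exponential autoregressive (ExpAR) model is and how its coefficients could serve as features is to simulate an ExpAR(2) series and re-estimate its parameters by minimizing the one-step prediction error. Differential evolution is used here as a convenient stand-in for the paper's genetic algorithm; the model order and parameter values are illustrative.

```python
import numpy as np
from scipy.optimize import differential_evolution

# ExpAR(2) model: x_t = sum_i (phi_i + pi_i * exp(-gamma * x_{t-1}^2)) * x_{t-i} + e_t
def simulate_expar(theta, n=400, seed=0):
    phi1, phi2, pi1, pi2, gamma = theta
    rng = np.random.default_rng(seed)
    x = np.zeros(n)
    for t in range(2, n):
        w = np.exp(-gamma * x[t-1]**2)
        x[t] = (phi1 + pi1*w)*x[t-1] + (phi2 + pi2*w)*x[t-2] + 0.05*rng.standard_normal()
    return x

def sse(theta, x):
    """Sum of squared one-step prediction errors for a candidate parameter set."""
    phi1, phi2, pi1, pi2, gamma = theta
    w = np.exp(-gamma * x[1:-1]**2)
    pred = (phi1 + pi1*w)*x[1:-1] + (phi2 + pi2*w)*x[:-2]
    return np.sum((x[2:] - pred)**2)

true_theta = [0.8, -0.5, 0.3, -0.2, 1.0]
x = simulate_expar(true_theta)
bounds = [(-1.5, 1.5)]*4 + [(0.01, 5.0)]
result = differential_evolution(sse, bounds, args=(x,), seed=1, tol=1e-8)
print("true parameters     :", true_theta)
print("estimated parameters:", np.round(result.x, 3))
```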
744 Steel Dust as a Coating Agent for Iron Ore Pellets at Ironmaking
Authors: M. Bahgat, H. Hanafy, H. Al-Tassan
Abstract:
Cluster formation is an essential phenomenon during direct reduction processes in shaft furnaces. Decreasing the reducing temperature to avoid this problem can cause a significant drop in throughput. In order to prevent sticking of pellets, a coating material that is basically inactive under the reducing conditions prevailing in the shaft furnace should be applied to cover the outer layer of the pellets. In the present work, steel dust is used as a coating material for iron ore pellets to explore the effectiveness of dust coating and to determine the best coating conditions. Steel dust coating was applied to iron ore pellets in various concentrations. Dust slurry concentrations of 5.0-30% were used to obtain a coated steel dust amount of 1.0-5.0 kg per ton of iron ore. Coated pellets with various concentrations were reduced isothermally by the weight loss technique with a gas mixture simulating the composition of reducing gases in shaft furnaces. The influences of the various coating conditions on the reduction behavior and the morphology were studied. The optimally reduced samples were then used comparatively for sticking index measurement. It was found that the optimized steel dust coating condition, which achieved higher reducibility with a lower sticking index, was a 30% steel dust slurry concentration with 3.0 kg steel dust per ton of ore.
Keywords: Ironmaking, coating, steel dust, reduction.
743 Simulating Economic Order Quantity and Reorder Point Policy for a Repairable Items Inventory System
Authors: Mojahid F. Saeed Osman
Abstract:
A repairable items inventory system is a management tool used to incorporate all information concerning inventory levels and movements for repaired and new items. This paper presents the development of an effective simulation model for managing the inventory of repairable items for a production system where production lines send their faulty items to a repair shop, considering the stochastic failure behavior and repair times. The developed model imitates the process of handling the on-hand inventory of repaired items and the replenishment of the inventory of new items using an Economic Order Quantity and Reorder Point ordering policy in a flexible and risk-free environment. We demonstrate the appropriateness and effectiveness of the proposed simulation model using an illustrative case problem. The developed simulation model can be used as a reliable tool for estimating a healthy on-hand inventory of new and repaired items, backordered items, and downtime due to unavailability of repaired items, and for validating and examining the Economic Order Quantity and Reorder Point ordering policy, which would further be compared with other ordering strategies as future work.
Keywords: Inventory system, repairable items, simulation, maintenance, economic order quantity, reorder point.
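Illustrative sketch (not the paper's model): the ordering policy reduces to two numbers, which a simulation then stresses under random demand. The sketch below computes EOQ and a reorder point and runs a simple day-by-day replenishment simulation for the new-items side only; all cost, demand and lead-time figures are invented, and the repair-shop loop of the paper's system is omitted.

```python
import numpy as np

rng = np.random.default_rng(42)

# Assumed inputs: annual demand D, ordering cost S, holding cost H, mean and
# std of daily demand, replenishment lead time (days) and a service-level factor z.
D, S, H = 3650.0, 120.0, 2.5
d_mean, d_std, lead_time, z = 10.0, 3.0, 7, 1.65

EOQ = np.sqrt(2 * D * S / H)
ROP = d_mean * lead_time + z * d_std * np.sqrt(lead_time)
print(f"EOQ = {EOQ:.0f} units, reorder point = {ROP:.0f} units")

# Day-by-day simulation of the (EOQ, ROP) policy under Poisson demand.
on_hand, pipeline, stockout_days = ROP, [], 0
for day in range(365):
    pipeline = [(due - 1, qty) for due, qty in pipeline]          # age open orders
    arrived = sum(qty for due, qty in pipeline if due <= 0)
    pipeline = [(due, qty) for due, qty in pipeline if due > 0]
    on_hand += arrived

    demand = rng.poisson(d_mean)
    if demand > on_hand:
        stockout_days += 1
    on_hand = max(on_hand - demand, 0)

    inventory_position = on_hand + sum(qty for _, qty in pipeline)
    if inventory_position <= ROP:                                  # place an order
        pipeline.append((lead_time, EOQ))

print(f"stockout days in one simulated year: {stockout_days}")
```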
742 Employing QR Code as an Effective Educational Tool for Quick Access to Sources of Kindergarten Concepts
Authors: Ahmed Amin Mousa, M. Abd El-Salam
Abstract:
This study discusses a simple solution to the problem of the shortage of learning resources for kindergarten teachers. Occasionally, kindergarten teachers cannot access proper resources through the usual search methods, such as libraries or search engines. Furthermore, these methods require considerable time and effort to prepare. The study is expected to facilitate access to learning resources. Moreover, it suggests a potential direction for using QR codes inside the classroom. The present work proposes that QR codes can be used for digitizing kindergarten curricula and accessing various learning resources. It investigates the use of QR codes for saving information related to the concepts that kindergarten teachers use in the current educational situation. The researchers have established a guide for kindergarten teachers based on the Egyptian official curriculum. The guide provides different learning resources for each scientific and mathematical concept in the curriculum, and each learning resource is represented as a QR code image that contains its URL. Therefore, kindergarten teachers can use smartphone applications for reading QR codes and immediately displaying the related learning resources for students. The guide was provided to a group of 108 teachers for use inside their classrooms. The results showed that the teachers approved of the guide and gave good responses.
Keywords: Kindergarten, child, learning resources, QR code, smart phone, mobile.
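Illustrative sketch (not from the study, which does not name a specific tool): generating the QR code images for such a guide is a one-liner with an off-the-shelf library. The `qrcode` package and the resource URLs below are assumptions for demonstration.

```python
# pip install qrcode[pil]
import qrcode

# Map each kindergarten concept to a learning-resource URL (hypothetical examples).
resources = {
    "counting_to_ten": "https://example.org/resources/counting-to-ten",
    "floating_and_sinking": "https://example.org/resources/floating-and-sinking",
}

for concept, url in resources.items():
    img = qrcode.make(url)          # build the QR code image for the resource URL
    img.save(f"{concept}.png")      # image to print next to the concept in the guide
    print(f"saved {concept}.png -> {url}")
```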
741 Finite Element Prediction on the Machining Stability of Milling Machine with Experimental Verification
Authors: Jui P. Hung, Yuan L. Lai, Hui T. You
Abstract:
Chatter vibration has been a troublesome problem for machine tools aiming at high-precision and high-speed machining. Essentially, the machining performance is determined by the dynamic characteristics of the machine tool structure and the dynamics of the cutting process, which can further be identified in terms of the stability lobe diagram. Therefore, understanding the machine tool's dynamic behavior can help to enhance cutting stability. To assess the dynamic characteristics and machining stability of a vertical milling system under the influence of a linear guide, this study developed a finite element model that integrates the modeling of linear components with the implementation of contact stiffness at the rolling interface. Both the finite element simulations and experimental measurements reveal that the linear guide with different preloads greatly affects the vibration behavior and milling stability of the vertical column spindle head system. The results also clearly indicate that the predictions of machining stability agree well with the cutting tests. It is believed that the proposed model can be successfully applied to evaluate the dynamic performance of machine tool systems of various configurations.
Keywords: Machining stability, vertical milling machine, linear guide, contact stiffness.
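Illustrative sketch (not the paper's machine): the stability lobe diagram links the structure's frequency response to a limiting depth of cut. The sketch below evaluates the classic single-degree-of-freedom regenerative-chatter relations; the modal parameters and cutting coefficient are invented placeholders, not values identified for the vertical milling machine or its linear-guide preload cases.

```python
import numpy as np

# Single-DOF regenerative chatter: a_lim(w) = -1 / (2 Kf Re[G(iw)]) where
# Re[G] < 0, and the spindle period follows from the phase of G.
fn, zeta, k_mode = 600.0, 0.03, 2.0e7       # natural freq (Hz), damping, stiffness (N/m)
Kf = 8.0e8                                   # cutting force coefficient (N/m^2), assumed

wn = 2 * np.pi * fn
wc = 2 * np.pi * np.linspace(fn * 1.001, fn * 1.6, 2000)   # candidate chatter freqs
r = wc / wn
G = 1.0 / (k_mode * (1 - r**2 + 2j * zeta * r))             # structural FRF
mask = G.real < 0                                           # chatter-prone region

a_lim = -1.0 / (2.0 * Kf * G.real[mask])                    # limiting depth of cut (m)
psi = np.angle(G[mask])                                     # FRF phase
eps = (np.pi + 2.0 * psi) % (2.0 * np.pi)                   # phase between successive waves
T_first_lobe = (2.0 * np.pi * 1 + eps) / wc[mask]           # regeneration delay, lobe k = 1
rpm = 60.0 / T_first_lobe

i = np.argmin(a_lim)
print(f"absolute stability limit about {a_lim[i]*1e3:.2f} mm of axial depth,")
print(f"occurring near {rpm[i]:.0f} rpm on the first lobe")
```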
740 Comparative Spatial Analysis of a Re-arranged Hospital Building
Authors: Burak Köken, Hatice D. Arslan, Bilgehan Y. Çakmak
Abstract:
Analyzing the relation networks within hospital buildings, which have complex structures and distinctive spatial relationships, is quite difficult. Hospital buildings, which require specialized spatial relationship solutions during design and self-renewal through developing technology, should survive and keep giving service even after disasters such as earthquakes. In this study, a hospital building whose load-bearing system was strengthened because of insufficient earthquake performance, and for which the construction of an additional building was required to meet the increasing need for space, is discussed, and a comparative spatial evaluation of the hospital building is made with regard to its status before and after the change. For this reason, the spatial organizations of the building before and after the change were analyzed by means of the Space Syntax method, and the effects of the change on space organization parameters were examined by applying an analytical procedure. Using the Depthmap UCL software, Connectivity, Visual Mean Depth, Beta and Visual Integration analyses were conducted. Based on the data obtained from the analyses, it was seen that the relationships between the spaces of the building increased after the change and that the building became more explicit and understandable for the occupants. Furthermore, the findings indicate that an increase in depth causes difficulty in perceiving the spaces, and that changes which address this problem generally ease spatial use.
Keywords: Architecture, hospital building, space syntax, strengthening.
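Illustrative sketch (not the analyzed hospital layout): connectivity and mean depth, two of the Space Syntax measures computed in Depthmap, have simple graph definitions: connectivity is a node's degree and mean depth is its average shortest-path distance to every other space. The toy adjacency graph below is hypothetical.

```python
import networkx as nx

# Hypothetical spatial adjacency graph: nodes are spaces, edges are direct connections.
edges = [("entrance", "lobby"), ("lobby", "corridor_A"), ("lobby", "corridor_B"),
         ("corridor_A", "ward_1"), ("corridor_A", "ward_2"),
         ("corridor_B", "radiology"), ("corridor_B", "lab"), ("lab", "ward_2")]
G = nx.Graph(edges)

for space in G.nodes:
    connectivity = G.degree[space]                       # number of direct neighbours
    lengths = nx.shortest_path_length(G, source=space)   # topological depths to all spaces
    mean_depth = sum(lengths.values()) / (len(G) - 1)    # exclude the space itself
    # Low mean depth = spatially integrated; high mean depth = segregated.
    print(f"{space:12s} connectivity={connectivity}  mean depth={mean_depth:.2f}")
```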
739 A Refined Application of QFD in SCM, A New Approach
Authors: Nooshin La'l Mohamadi
Abstract:
Because customers in the new century tend to express globally increasing demands, networks of interconnected businesses have been established, and the management of such networks is a major key to gaining competitive advantage. Supply chain management encompasses such managerial activities. Within a supply chain, a critical role is played by quality. QFD is a widely utilized tool which serves the purpose not only of bringing quality to the ultimate provision of products or service packages required by the end customer or the retailer, but also of initiating a satisfactory relationship with the initial customer, that is, the wholesaler. However, the wholesalers' cooperation is considerably based on capabilities that are heavily dependent on their locations and existing circumstances. Therefore, it is undeniable that, for all companies, each wholesaler possesses a specific importance ratio which can heavily influence the figures calculated in the House of Quality (HOQ) in QFD. Moreover, due to the competitiveness of the marketplace today, it has been widely recognized that consumers' expression of demands is highly volatile over periods of production. Such instability and proneness to change are tangibly noticeable, and taking them into account during the analysis of the HOQ is widely influential and doubtlessly required. For a more reliable outcome in such matters, this article demonstrates the viability of applying the Analytic Network Process to consider the wholesalers' reputation, and simultaneously introduces a mortality coefficient for the reliability and stability of the consumers' expressed demands over time. Following this, the paper elaborates on the relevant contributory factors and approaches through the calculation of such coefficients. In the end, the article concludes that an empirical application is needed to achieve broader validity.
Keywords: Analytic Network Process, Quality Function Deployment, QFD flaws, Supply Chain Management.
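Illustrative sketch (an interpretation, not the paper's derivation): the two adjustments can be expressed as plain weighted arithmetic on the House of Quality. Each wholesaler's ratings are scaled by an importance ratio (which the paper obtains with the Analytic Network Process), and older demand ratings are discounted by a decay factor playing the role of the mortality coefficient. The matrices, weights and the exponential-decay form below are illustrative assumptions.

```python
import numpy as np

# Relationship matrix: rows = customer demands, columns = technical characteristics
# (strength of relationship on the usual 1/3/9 scale); all values are made up.
R = np.array([[9, 3, 0, 1],
              [3, 9, 1, 0],
              [0, 1, 9, 3],
              [1, 0, 3, 9]], dtype=float)

# Demand importance as rated by three wholesalers, one column per wholesaler.
ratings = np.array([[5, 4, 3],
                    [3, 5, 4],
                    [4, 2, 5],
                    [2, 3, 2]], dtype=float)
wholesaler_weight = np.array([0.5, 0.3, 0.2])    # importance ratios (ANP output, assumed)

# "Mortality" of each demand: how many periods old its last confirmation is;
# an exponential decay discounts stale demands (decay rate assumed).
age_periods = np.array([0, 1, 3, 5])
mortality = np.exp(-0.25 * age_periods)

demand_importance = (ratings @ wholesaler_weight) * mortality
tech_scores = R.T @ demand_importance            # standard HOQ column scores
print("technical characteristic ranking (best first):", np.argsort(tech_scores)[::-1])
print("scores:", np.round(tech_scores, 2))
```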
738 On Figuring the City Characteristics and Landscape in Overall Urban Design: A Case Study in Xiangyang Central City, China
Authors: Guyue Zhu, Liangping Hong
Abstract:
Chinese overall urban design faces a large number of problems, such as the neglect of urban characteristics, generalization of content, and difficulty in implementation. Focusing on these issues, this paper proposes the main points of shaping urban characteristics in overall urban design: focusing on core problems in city function and scale, landscape pattern, historical culture, social resources and modern city style, and mining the urban characteristic genes. Then, we put forward 'core problem location and characteristic gene enhancement' as an overall urban design technical method. Firstly, based on the main problems in the urban space as a whole and aiming at operability, the method extracts the key genes and integrates them into a multi-dimensional system in a targeted manner. Secondly, a hierarchical management and guidance system is established that is in line with administrative management. Finally, by converting the results, an action plan is drawn up that can be dynamically implemented. Based on the above idea and method, a practical exploration has been performed in the case of Xiangyang central city.
Keywords: City characteristics, overall urban design, planning implementation, Xiangyang central city.
737 FEM Simulation of HE Blast-Fragmentation Warhead and the Calculation of Lethal Range
Authors: G. Tanapornraweekit, W. Kulsirikasem
Abstract:
This paper presents the simulation of a fragmentation warhead using a hydrocode, Autodyn. The goal of this research is to determine the lethal range of such a warhead. This study investigates the lethal range of warheads with and without steel balls as preformed fragments. The results from the FE simulation, i.e., the initial velocities and ejected spray angles of the fragments, are further processed using an analytical approach so as to determine the fragment hit density and probability of kill of the modelled warhead. Simulating a large number of preformed fragments inside a warhead requires expensive computational resources; therefore, this study models the problem with an alternative approach by considering a mass of preformed fragments equivalent to the mass of the warhead casing. This approach yields approximately 7% and 20% differences in fragment velocities from the analytical results for one and two layers of preformed fragments, respectively. The lethal ranges of the simulated warheads are 42.6 m and 56.5 m for warheads with one and two layers of preformed fragments, respectively, compared to 13.85 m for a warhead without preformed fragments. These lethal ranges are based on the requirement of fragment hit density. The lethal ranges based on the probability of kill are 27.5 m, 61 m and 70 m for warheads with no preformed fragments, one layer and two layers of preformed fragments, respectively.
Keywords: Lethal range, natural fragment, preformed fragment, warhead.
736 Improving Detection of Illegitimate Scores and Assessment in Most Advantageous Tenders
Authors: Hao-Hsi Tseng, Hsin-Yun Lee
Abstract:
Adopting the Most Advantageous Tender (MAT) for government procurement projects has become popular in Taiwan. As time passes, the problems of MAT have gradually appeared. Two points are commonly criticized: the result might be manipulated by a single committee member's partiality, and it is unclear how to make a fair decision when there are two or more winners. Arrow's Impossibility Theorem holds that the best scoring method should meet four reasonable criteria. According to these four criteria, this paper constructs an 'Illegitimate Scores Checking Scheme' for scoring methods and uses the scheme to expose the illegitimacy of the current evaluation method of MAT. This paper also proposes a new scoring method called the 'Standardizing Overall Evaluated Score Method'. This method makes each committee member's influence tend to be identical; thus, the committee members can score freely according to their partiality without losing fairness. Finally, the method was examined by a large-scale simulation, and the experiment revealed that it improved the problem of dictatorship and at the same time avoided the situation of cyclical majorities. This result verifies that the Standardizing Overall Evaluated Score Method is better than the current evaluation methods of MAT.
Keywords: Arrow's impossibility theorem, most advantageous tender, illegitimate scores checking scheme, standard score.
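Illustrative sketch (one plausible reading, not the paper's exact formula): equalizing committee members' influence can be understood as converting each member's raw scores to standard scores before summing. The sketch below applies a per-member z-score; the raw scores are invented.

```python
import numpy as np

# Raw scores: rows = committee members, columns = tenderers. The second member
# scores with a much wider spread, so a raw sum lets that member decide the winner.
scores = np.array([[90.0, 80.0, 70.0],
                   [55.0, 95.0, 60.0],
                   [88.0, 82.0, 75.0]])

raw_total = scores.sum(axis=0)

# Standardize each member's scores (zero mean, unit spread per row) so that
# every member's influence on the aggregate tends to be identical.
z = (scores - scores.mean(axis=1, keepdims=True)) / scores.std(axis=1, keepdims=True)
standardized_total = z.sum(axis=0)

print("raw totals:         ", raw_total, "-> winner", int(np.argmax(raw_total)))
print("standardized totals:", np.round(standardized_total, 2),
      "-> winner", int(np.argmax(standardized_total)))
```

For these numbers, the raw sum picks the tenderer favored only by the wide-spread scorer, while the standardized sum picks the tenderer preferred by two of the three members, which is the dictatorship effect the method is designed to remove.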
735 Business Domain Modelling Using an Integrated Framework
Authors: Mohammed Salahat, Steve Wade
Abstract:
This paper presents an application of a "Systematic Soft Domain Driven Design Framework" as a soft systems approach to the domain-driven design of information systems development. The framework uses SSM as a guiding methodology within which we have embedded a sequence of design tasks based on the UML, leading to the implementation of a software system using the Naked Objects framework. This framework has been used in action research projects that involved the investigation and modelling of business processes using object-oriented domain models and the implementation of software systems based on those domain models. Within this framework, Soft Systems Methodology (SSM) is used as a guiding methodology to explore the problem situation and to develop the domain model using UML for the given business domain. The framework was proposed and evaluated in our previous works, and a real case study, an "Information Retrieval System for academic research", is used in this paper to show further practice and evaluation of the framework in a different business domain. We argue that there are advantages to combining and using techniques from different methodologies in this way for business domain modelling. The framework is overviewed and justified as a multimethodology using Mingers' multimethodology ideas.
Keywords: SSM, UML, domain-driven design, soft domain-driven design, naked objects, soft language, information retrieval, multimethodology.
734 An Improved Method on Static Binary Analysis to Enhance the Context-Sensitive CFI
Authors: Qintao Shen, Lei Luo, Jun Ma, Jie Yu, Qingbo Wu, Yongqi Ma, Zhengji Liu
Abstract:
Control Flow Integrity (CFI) is one of the most promising techniques to defend against Code-Reuse Attacks (CRAs). Traditional CFI systems and the recent Context-Sensitive CFI use coarse control flow graphs (CFGs) to analyze whether a control flow hijack occurs, leaving vast space for attackers at indirect call-sites. Coarse CFGs make it difficult to decide which target may execute at indirect control-flow transfers, and thereby weaken existing CFI systems. Extracting CFGs precisely and completely from binaries is still an unsolved problem. In this paper, we present an algorithm to obtain a more precise CFG from binaries. First, parameters are analyzed at indirect call-sites and functions. By comparing the counts of parameters prepared before call-sites with those consumed by functions, the targets of indirect calls are reduced. The control flow is then more constrained at indirect call-sites at runtime. Combined with CCFI, we implement our policy. Experimental results on some popular programs show that our approach is efficient. Further analysis shows that it can mitigate COOP and other advanced attacks.
Keywords: Context-sensitive, CFI, binary analysis, code reuse attack.
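Illustrative sketch (a toy model, not the paper's binary analysis): the pruning rule compares the number of arguments prepared at an indirect call-site with the number each candidate function consumes. The call-sites, functions and counts below are invented; a real implementation would recover them from the instructions and the calling convention rather than from a dictionary.

```python
# Toy model of the argument-count refinement: an indirect call-site is allowed
# to target a function only if the function consumes no more parameters than
# the site prepares. All names and counts are hypothetical.
prepared_args = {
    "call_0x4011a0": 2,
    "call_0x4020f8": 0,
    "call_0x4033b4": 3,
}
consumed_args = {
    "log_message": 2,
    "close_handle": 1,
    "hash_block": 3,
    "idle_task": 0,
}
coarse_targets = set(consumed_args)        # coarse CFG: every function is a valid target

def refined_targets(n_prepared):
    """Keep only functions whose parameter count does not exceed what the site
    prepares (extra prepared values are harmless; missing ones are not)."""
    return {f for f, n in consumed_args.items() if n <= n_prepared}

for site, n in prepared_args.items():
    allowed = refined_targets(n)
    print(f"{site}: {len(coarse_targets)} coarse targets -> "
          f"{len(allowed)} refined: {sorted(allowed)}")
```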
733 Work System Design in Productivity for Small and Medium Enterprises: A Systematic Literature Review
Authors: S. Halofaki, D. R. Seenivasagam, P. Bijay, K. Singh, R. Ananthanarayanan
Abstract:
This comprehensive literature review delves into the effects and applications of work system design on the performance of Small and Medium-sized Enterprises (SMEs). The review process involved three independent reviewers who screened 514 articles through a four-step procedure: removing duplicates, assessing keyword relevance, evaluating abstract content, and thoroughly reviewing full-text articles. Various criteria, such as relevance to the research topic, publication type, study type, language, publication date, and methodological quality, were employed to exclude certain publications. A portion of the articles, those that met the predefined inclusion criteria, were included in this systematic literature review. These selected publications underwent data extraction and analysis to compile insights regarding the influence of work system design on SME performance. Additionally, the quality of the included studies was assessed, and the level of confidence in the body of evidence was established. The findings of this review shed light on how work system design impacts SME performance, emphasizing important implications and applications. Furthermore, the review offers suggestions for further research in this critical area and summarizes the current state of knowledge in the field. Understanding the intricate connections between work system design and SME success can enhance operational efficiency, employee engagement, and overall competitiveness for SMEs. This comprehensive examination of the literature contributes significantly to both academic research and practical decision-making for SMEs.
Keywords: Literature review, productivity, small and medium-sized enterprises, SMEs, work system design.