Search results for: linear base isolator
4646 Inventory Management System of Seasonal Raw Materials of Feeds at San Jose Batangas through Integer Linear Programming and VBA
Authors: Glenda Marie D. Balitaan
Abstract:
The branch of business management that deals with inventory planning and control is known as inventory management. It comprises keeping track of supply levels and forecasting demand, as well as scheduling when and how much to order. Keeping excess inventory results in a loss of money, takes up physical space, and raises the risk of damage, spoilage, and loss. On the other hand, too little inventory frequently disrupts operations and raises the likelihood of low customer satisfaction, both of which can be detrimental to a company's reputation. The United Victorious Feed Mill Corporation's present inventory management practices were assessed in terms of inventory level, warehouse allocation, ordering frequency, shelf life, and production requirement. To help the company achieve its optimal level of inventory, a mathematical model was created using Integer Linear Programming. Because the raw materials are seasonal, the objective function was to minimize the cost of purchasing US Soya and Yellow Corn. Warehouse space, annual production requirements, and shelf life were all considered as constraints. To ensure that the user needs only one application to record all relevant information, such as production output and delivery, the researcher built a Visual Basic system. Additionally, the system allows management to change the model's parameters.
Keywords: inventory management, integer linear programming, inventory management system, feed mill
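The kind of integer program the abstract describes can be sketched in miniature. The sketch below is purely illustrative: the costs, space coefficients, warehouse capacity, and demands are hypothetical placeholders (the paper's actual coefficients are not given here), and an exhaustive search over integer order quantities stands in for a proper ILP solver.

```python
from itertools import product

# Hypothetical data -- illustrative only, not the paper's coefficients.
COST = {"soya": 520, "corn": 310}        # purchase cost per ton
SPACE = {"soya": 1.4, "corn": 1.1}       # warehouse m^3 occupied per ton
WAREHOUSE_CAPACITY = 60.0                # m^3 available
DEMAND = {"soya": 18, "corn": 25}        # minimum tons required for production

def optimal_order(max_tons=60):
    """Exhaustively search integer order quantities (a stand-in for a
    real ILP solver) and return (cost, soya_tons, corn_tons) for the
    cheapest feasible purchase plan."""
    best = None
    for soya, corn in product(range(max_tons + 1), repeat=2):
        if soya < DEMAND["soya"] or corn < DEMAND["corn"]:
            continue                      # production requirement constraint
        if soya * SPACE["soya"] + corn * SPACE["corn"] > WAREHOUSE_CAPACITY:
            continue                      # warehouse space constraint
        cost = soya * COST["soya"] + corn * COST["corn"]
        if best is None or cost < best[0]:
            best = (cost, soya, corn)
    return best
```

With purely positive costs the optimum lands exactly at the demand levels, which is the sanity check one would expect before adding shelf-life or seasonal-price terms.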
Procedia PDF Downloads 83
4645 Mobile Wireless Investigation Platform
Authors: Dimitar Karastoyanov, Todor Penchev
Abstract:
The paper presents research on a kind of autonomous mobile robot intended for work and adaptive perception in unknown and unstructured environments. The focus is on robots dedicated to multi-sensory environment perception and exploration, such as taking measurements and samples, discovering and marking objects, and interacting with the environment: transportation, carrying equipment and objects in and out. On that basis, a classification of the different types of mobile robots according to the mode of locomotion (wheel- or chain-driven, walking, etc.), drive mechanisms used, kinds of sensors, end effectors, area of application, etc. is made. A modular system for the mechanical construction of the mobile robots is proposed. A special PLC based on the ATmega128 processor is developed for robot control. Electronic modules for wireless communication based on a Jennic processor, as well as the corresponding software, are developed. The methods, means, and algorithms for adaptive behaviour in the environment and for task realization are examined. Methods for group control of mobile robots and for detecting and handling suspicious objects are also discussed.
Keywords: mobile robots, wireless communications, environment investigations, group control, suspicious objects
Procedia PDF Downloads 356
4644 Effect of Irrigation and Hydrogel on the Water Use Efficiency of a Zero-Tilled Green-Gram Relay System in the Eastern Indo-Gangetic Plain
Authors: Benukar Biswas, S. Banerjee, P. K. Bandhyopadhyaya, S. K. Patra, S. Sarkar
Abstract:
Jute can be sown as a relay crop between the lines of 15-20 days old green gram for additional pulse yield without reducing the yield of jute. The main problem of this system is water use efficiency (WUE). An increase in water productivity and a reduction in production cost have been reported for zero-tilled crops. Hydrogel can hold water up to 400 times its weight and can release 95% of the retained water. The present field study was carried out during 2015-16 at BCKV (tropical sub-humid, 1560 mm annual rainfall, 22°58' N, 88°51' E, 9.75 m AMSL, sandy loam soil, aeric Haplaquept, pH 6.75, organic carbon 5.4 g kg-1, available N 85 kg ha-1, P2O5 15.3 kg ha-1 and K2O 40 kg ha-1) with four levels of irrigation regimes: no irrigation (RF), cumulative pan evaporation 250 mm (CPE250), CPE125 and CPE83, and three levels of hydrogel: no hydrogel (H0), 2.5 kg ha-1 (H2.5) and 5 kg ha-1 (H5). Throughout the crop growing period, there was a positive linear relationship between Leaf Area Index (LAI) and evapotranspiration rate. The strength of the relationship between ETa and LAI increased and reached its peak at 7 WAS (R2 = 0.78), when green gram was at maturity and both crops covered nearly the entire base area. The relation starts weakening from 13 WAS due to jute leaf shading. A linear relationship between system yield and ET was also obtained in the present study; 75% of the variation in system yield could be predicted by ET alone. Effective rainfall was reduced with increasing irrigation frequency due to the enhanced water supply, in contrast to hydrogel application, due to the difference in water storage capacity. Irrigation contributed a major source of variability in ET. Higher irrigation frequency resulted in higher ET loss, ranging from 574 mm in RF to 764 mm in CPE83. Hydrogel application also increased water storage on a sustained basis and supplied it to crops, resulting in higher ET, from 639 mm in H0 to 671 mm in H5.
WUE ranged between 0.4 kg m-3 (RF) and 0.63 kg m-3 (CPE83 H5). WUE increased with increased application of irrigation water, from 0.42 kg m-3 in RF to 0.57 kg m-3 in CPE83. Hydrogel application significantly improved the WUE, from 0.45 kg m-3 in H0 to 0.50 in H2.5 and 0.54 in H5. Under a relatively dry root zone (RF), both evaporation and transpiration remain at suboptimal levels, resulting in lower ET as well as lower system yield. A green gram - jute relay system can be water use efficient, with 38% higher yield, with application of hydrogel at 2.5 kg ha-1 under the deficit irrigation regime of CPE125, compared with the rainfed system without gel application. Application of the gel conditioner improved water storage, checked excess water loss from the system, and met the ET demand of the relay system for a longer time. Hence, irrigation frequency was reduced from five times at CPE83 to only three times at CPE125.
Keywords: zero tillage, deficit irrigation, hydrogel, relay system
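The reported WUE figures follow from the standard definition WUE = system yield / seasonal evapotranspiration, together with the conversion that 1 mm of water over one hectare equals 10 m³. A minimal sketch; the yield figure below is a hypothetical number chosen only so the result lands near the abstract's reported H0 value, not data from the study.

```python
def water_use_efficiency(system_yield_kg_per_ha, et_mm):
    """WUE in kg per cubic metre of water consumed: 1 mm of ET over one
    hectare corresponds to 10 m^3 of water, so divide yield (kg/ha) by
    10 * ET (mm)."""
    return system_yield_kg_per_ha / (et_mm * 10.0)

# The abstract reports ET of 639 mm for H0; 2876 kg/ha is a hypothetical
# yield chosen to reproduce a WUE of about 0.45 kg m^-3.
wue_h0 = water_use_efficiency(2876, 639)
```

The same two numbers per treatment (yield and ET) are all that is needed to reproduce the 0.4-0.63 kg m⁻³ range quoted above.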
Procedia PDF Downloads 233
4643 Mechanistic Study of Composite Pavement Behavior in Heavy Duty Area
Authors: Makara Rith, Young Kyu Kim, Seung Woo Lee
Abstract:
In heavy duty areas, asphalt pavement constructed as an entrance roadway may exhibit distresses such as cracking and rutting during its service life. To mitigate these problems, composite pavement with a roller-compacted concrete base may be a good alternative; however, it should first be investigated. Structural performances such as fatigue cracking and rut depth may change with variation of some design factors. Therefore, this study focuses on the effect of varying material modulus, layer thickness, and loading on composite pavement performance. Stress and strain at the critical location are determined and used as the input of the transfer function for the corresponding distresses to evaluate the pavement performance. A composite pavement satisfying the design criteria may then be selected as a design section for heavy duty areas. This investigation indicates that composite pavement can eliminate fatigue cracking in asphalt surfaces and significantly reduce rut depth. In addition, a thick or strong rigid base can significantly reduce rut depth and prolong the fatigue life of this layer.
Keywords: composite pavement, ports, cracking, rutting
Procedia PDF Downloads 206
4642 Proposed Algorithms to Assess Concussion Potential in Rear-End Motor Vehicle Collisions: A Meta-Analysis
Authors: Rami Hashish, Manon Limousis-Gayda, Caitlin McCleery
Abstract:
Introduction: Mild traumatic brain injuries, also referred to as concussions, represent an increasing burden to society. Due to limited objective diagnostic measures, concussions are diagnosed by assessing subjective symptoms, often leading to disputes over their presence. Common biomechanical measures associated with concussion are high linear and/or angular acceleration of the head. With regard to linear acceleration, approximately 80 g has previously been shown to equate to a 50% probability of concussion. Motor vehicle collisions (MVCs) are a leading cause of concussion due to the high head accelerations experienced. The change in velocity (delta-V) of a vehicle in an MVC is an established metric for impact severity. As acceleration is the rate of change of velocity with respect to time, the purpose of this paper is to determine the relation between delta-V (and occupant parameters) and linear head acceleration. Methods: A meta-analysis was conducted of manuscripts collected using the following keywords: head acceleration, concussion, brain injury, head kinematics, delta-V, change in velocity, motor vehicle collision, and rear-end. Ultimately, 280 studies were surveyed, 14 of which fulfilled the inclusion criteria as studies investigating the human response to impacts and reporting the head acceleration and delta-V of the occupant's vehicle. Statistical analysis was conducted with SPSS and R. A best-fit line analysis allowed for an initial understanding of the relation between head acceleration and delta-V. To further investigate the effect of occupant parameters on head acceleration, a quadratic model and a full linear mixed model were developed. Results: From the 14 selected studies, 139 crashes were analyzed, with head accelerations and delta-V values ranging from 0.6 to 17.2 g and 1.3 to 11.1 km/h, respectively.
Initial analysis indicated that the best line of fit (Model 1) was defined as Head Acceleration = 0.465
Keywords: acceleration, brain injury, change in velocity, Delta-V, TBI
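The "best fit line" relating head acceleration to delta-V is an ordinary least-squares fit. A minimal sketch of such a fit is below; the (delta-V, acceleration) pairs are hypothetical illustration values, not the 139 crashes analyzed in the study.

```python
def fit_line(xs, ys):
    """Ordinary least-squares fit y = a*x + b, the kind of best-fit-line
    analysis described above (here via the closed-form slope/intercept
    formulas rather than a statistics package)."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    a = sxy / sxx
    return a, my - a * mx

# Hypothetical (delta-V km/h, head acceleration g) pairs for illustration.
dv = [2.0, 4.0, 6.0, 8.0, 10.0]
acc = [1.0, 2.1, 2.9, 4.2, 4.8]
slope, intercept = fit_line(dv, acc)
```

In the study's framing, the fitted slope is the quantity of interest: it converts an estimated vehicle delta-V into a predicted linear head acceleration.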
Procedia PDF Downloads 233
4641 Detecting Earnings Management via Statistical and Neural Networks Techniques
Authors: Mohammad Namazi, Mohammad Sadeghzadeh Maharluie
Abstract:
Predicting earnings management is vital for capital market participants, financial analysts, and managers. The aim of this research is to answer the following question: Is there a significant difference between the regression model and neural network models in predicting earnings management, and which one leads to a superior prediction? To approach this question, a Linear Regression (LR) model was compared with two neural networks: a Multi-Layer Perceptron (MLP) and a Generalized Regression Neural Network (GRNN). The population of this study includes 94 listed companies in the Tehran Stock Exchange (TSE) market from 2003 to 2011. After the results of all models were acquired, ANOVA was applied to test the hypotheses. In general, the statistical results showed that the precision of GRNN did not differ significantly from that of MLP. In addition, the mean square errors of the MLP and GRNN showed a significant difference from the multivariable LR model. These findings support the notion of nonlinear behavior of earnings management. Therefore, it is more appropriate for capital market participants to analyze earnings management using neural network techniques rather than linear regression models.
Keywords: earnings management, generalized linear regression, neural networks multi-layer perceptron, Tehran stock exchange
Procedia PDF Downloads 421
4640 MBES-CARIS Data Validation for the Bathymetric Mapping of Shallow Water in the Kingdom of Bahrain on the Arabian Gulf
Authors: Abderrazak Bannari, Ghadeer Kadhem
Abstract:
The objectives of this paper are the validation and evaluation of the performance of MBES-CARIS BASE surface data for bathymetric mapping of shallow water in the Kingdom of Bahrain. The latter is an archipelago with a total land area of about 765.30 km², approximately 126 km of coastline, and 8,000 km² of marine area, located in the Arabian Gulf, east of Saudi Arabia and west of Qatar (26° 00' N, 50° 33' E). To achieve our objectives, bathymetric attributed grid files (X, Y, and depth) generated from the coverage of ship-track MBES data with 300 x 300 m cells, processed with CARIS-HIPS, were downloaded from the General Bathymetric Chart of the Oceans (GEBCO). They were then brought into ArcGIS and converted into a raster format in five steps: exportation of the GEBCO BASE surface data to an ASCII file; conversion of the ASCII file to a point shapefile; extraction of the points covering the water boundary of the Kingdom of Bahrain; multiplication of the depth values by -1 to obtain negative values; and interpolation with the simple kriging method in the ArcMap environment to generate a new raster bathymetric grid surface with 30 x 30 m cells, which was the basis of the subsequent analysis. Finally, for validation purposes, 2,200 bathymetric points were extracted from a medium-scale nautical map (1:100,000), considering different depths over the Bahrain national water boundary. The nautical map was scanned, georeferenced, and overlaid on the raster bathymetric grid surface generated from the MBES-CARIS data, and homologous depth points were selected. Statistical analysis, expressed as a linear error at the 95% confidence level, showed a strong correlation coefficient (R² = 0.96) and a low RMSE (± 0.57 m) between the nautical map and the derived MBES-CARIS depths when considering only the shallow areas with depths of less than 10 m (about 800 validation points).
When considering only the deeper areas (> 10 m), the correlation coefficient is 0.73 and the RMSE is ± 2.43 m, while for the totality of the 2,200 validation points, including all depths, the correlation coefficient is still significant (R² = 0.81) with a satisfactory RMSE (± 1.57 m). This variation is likely caused by the MBES not completely covering the bottom in several of the deeper pockmarks because of the rapid change in depth. In addition, steep slopes and the rough seafloor probably affect the acquired MBES raw data. Moreover, the interpolation of missing values between MBES acquisition swath lines (ship-tracked sounding data) may not reflect the true depths of these missed areas. Globally, however, the MBES-CARIS data are very appropriate for bathymetric mapping of shallow water areas.
Keywords: bathymetry mapping, multibeam echosounder systems, CARIS-HIPS, shallow water
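The validation statistics quoted above (R² and RMSE between chart depths and grid depths at homologous points) are straightforward to compute. The five depth pairs below are hypothetical stand-ins for the 2,200 validation points, shown only to make the two formulas concrete.

```python
def rmse(observed, predicted):
    """Root-mean-square error between homologous depth samples."""
    n = len(observed)
    return (sum((o - p) ** 2 for o, p in zip(observed, predicted)) / n) ** 0.5

def r_squared(observed, predicted):
    """Coefficient of determination: 1 - SS_residual / SS_total."""
    mean_o = sum(observed) / len(observed)
    ss_res = sum((o - p) ** 2 for o, p in zip(observed, predicted))
    ss_tot = sum((o - mean_o) ** 2 for o in observed)
    return 1 - ss_res / ss_tot

# Hypothetical homologous depth pairs (nautical chart vs. MBES grid), metres.
chart = [-2.0, -4.5, -6.0, -8.5, -9.5]
grid = [-2.3, -4.1, -6.4, -8.2, -9.9]
```

Running the same two functions on the shallow (< 10 m) and deep (> 10 m) subsets separately is exactly the stratified comparison reported in the abstract.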
Procedia PDF Downloads 381
4639 Tribological Behavior of Pongamia Oil Based Biodiesel Blended Lubricant at Different Load
Authors: Yashvir Singh, Amneesh Singla, Swapnil Bhurat
Abstract:
Around the globe, there is demand for the development of bio-based lubricants that are biodegradable, non-toxic, and environmentally friendly. This paper outlines the friction and wear characteristics of a Pongamia biodiesel contaminated bio-lubricant, studied using a pin-on-disc tribometer. To formulate the bio-lubricants, Pongamia oil based biodiesel was blended in ratios of 5, 10, and 20% by volume with the base lubricant SAE 20W-40. Tribological characterization of these blends was carried out at a sliding velocity of 2.5 m/s under applied loads of 50, 100, and 150 N. Experimental results showed that the lubrication regime during the tests was boundary lubrication, while the main wear mechanism was adhesive wear. During testing, the lowest wear was found with the addition of 5 and 10% Pongamia oil based biodiesel; above this contamination level, the wear rate increased considerably. The addition of 5 and 10% Pongamia oil based biodiesel to the base lubricant acted as a very good lubricant additive, reducing friction and wear rate during the tests. It is concluded that PBO 5 and PBO 10 can act as alternative lubricants to increase mechanical efficiency at a sliding velocity of 2.5 m/s and contribute to reducing dependence on petroleum based products.
Keywords: friction, load, pongamia oil blend, sliding velocity, wear
Procedia PDF Downloads 309
4638 On the Representation of Actuator Faults Diagnosis and Systems Invertibility
Authors: F. Sallem, B. Dahhou, A. Kamoun
Abstract:
In this work, the main problem considered is the detection and isolation of actuator faults. A new formulation of the linear system is generated to obtain the conditions for actuator fault diagnosis. The proposed method is based on representing the actuator as a subsystem connected in cascade with the process system. Detectability conditions are expressed in terms of invertibility notions. An example and a comparative analysis with the classic formulation illustrate the performance of this approach for simple actuator fault diagnosis using a linear model of a nuclear reactor.
Keywords: actuator fault, fault detection, left invertibility, nuclear reactor, observability, parameter intervals, system inversion
Procedia PDF Downloads 405
4637 A Web-Based Self-Learning Grammar for Spoken Language Understanding
Authors: S. Biondi, V. Catania, R. Di Natale, A. R. Intilisano, D. Panno
Abstract:
One of the major goals of Spoken Dialog Systems (SDS) is to understand what the user utters. In the SDS domain, the Spoken Language Understanding (SLU) module classifies user utterances by means of predefined conceptual knowledge. The SLU module is able to recognize only meanings previously included in its knowledge base. Due to the vastness of that knowledge, storing the information is a very expensive process. Updating and managing the knowledge base are time-consuming and error-prone processes because of the rapidly growing number of entities such as proper nouns and domain-specific nouns. This paper proposes a solution to the problem of Named Entity Recognition (NER) applied to the SDS domain. The proposed solution attempts to automatically recognize the meaning associated with an utterance by using the PANKOW (Pattern based Annotation through Knowledge On the Web) method at runtime. The proposed method extracts information from the Web to extend the SLU knowledge module and reduces the development effort. In particular, the Google Search Engine is used to extract information from the Facebook social network.
Keywords: spoken dialog system, spoken language understanding, web semantic, named entity recognition
Procedia PDF Downloads 338
4636 Simulation and Analysis of Passive Parameters of Building in eQuest: A Case Study in Istanbul, Turkey
Authors: Mahdiyeh Zafaranchi
Abstract:
With the rapid development of urbanization and improvement of living standards in the world, energy consumption and carbon emissions of the building sector are expected to increase in the near future; because of this, energy-saving issues have become more important among engineers. The building sector is a major contributor to energy consumption and carbon emissions. The concept of the efficient building appeared as a response to the need for reducing energy demand in this sector, with the main purpose of shifting from standard buildings to low-energy buildings. Although energy saving should happen in all steps of a building's life cycle (material production, construction, demolition), the main concept of the energy-efficient building is saving energy during the life expectancy of a building by using passive and active systems, without sacrificing comfort and quality. The main aim of this study is to investigate passive strategies (which do not need energy consumption or use renewable energy) to achieve energy-efficient buildings. Energy retrofit measures were explored with the eQuest software using a case study as a base model. The study investigates the influence of major factors such as the thermal transmittance (U-value) of the materials, windows, shading devices, thermal insulation, ratio of exposed envelope, window-to-wall ratio, and lighting system on the energy consumption of the building. The base model was located in Istanbul, Turkey. The impact of eight passive parameters on energy consumption was assessed. After analyzing the base model in eQuest, a final scenario with good energy performance was suggested. The results showed that decreases in the U-values of the materials, the ratio of exposed envelope, and the windows had a significant effect on energy consumption.
Finally, annual savings of about 10.5% in electricity consumption and about 8.37% in gas consumption were achieved in the suggested model.
Keywords: efficient building, electric and gas consumption, eQuest, passive parameters
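The dominance of the U-value among the passive parameters reflects the steady-state conduction relation Q = U · A · ΔT: lowering the envelope's U-value reduces its conduction loss proportionally. A minimal sketch with hypothetical wall figures (the area, temperature difference, and U-values below are illustrative, not the case-study building's data):

```python
def envelope_heat_loss_w(u_value, area_m2, delta_t):
    """Steady-state conduction loss through an envelope element, in watts:
    Q = U * A * dT, with U in W/(m^2 K), A in m^2, and dT in K."""
    return u_value * area_m2 * delta_t

# Hypothetical wall: 100 m^2 of envelope, 20 K indoor-outdoor difference.
before = envelope_heat_loss_w(1.8, 100, 20)   # poorly insulated wall
after = envelope_heat_loss_w(0.6, 100, 20)    # insulated retrofit
```

Cutting U from 1.8 to 0.6 W/(m²K) cuts the conduction loss to one third, which is the mechanism behind the U-value sensitivity seen in the eQuest runs.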
Procedia PDF Downloads 112
4635 Investigation of the Material Behaviour of Polymeric Interlayers in Broken Laminated Glass
Authors: Martin Botz, Michael Kraus, Geralt Siebert
Abstract:
The use of laminated glass is gaining increasing importance in structural engineering. For safety reasons, at least two glass panes are laminated together with a polymeric interlayer. In case of breakage of one or all of the glass panes, the glass fragments remain connected to the interlayer due to adhesion forces, and a certain residual load-bearing capacity is left in the system. Polymer interlayers used in laminated glass show viscoelastic material behavior, i.e. stresses and strains in the interlayer depend on load duration and temperature. In the intact stage, only small strains appear in the interlayer, so the material can be described linearly. In the broken stage, large strains can appear, and a non-linear viscoelastic material theory is necessary. Relaxation tests on two different types of polymeric interlayers are performed at different temperatures and strain amplitudes to determine the boundary of the non-linear material regime. Based on the small-scale specimen results, further tests on broken laminated glass panes are conducted. So-called 'through-crack-bending' (TCB) tests are performed, in which the laminated glass has a defined crack pattern. The test set-up is realized in such a way that one glass layer can still transfer compressive stresses, but tensile stresses have to be transferred by the interlayer alone. The TCB tests are also conducted at different temperatures but under constant force (creep tests). The aim of these experiments is to determine whether the results of small-scale tests on the interlayer are transferable to a laminated glass system in the broken stage. In this study, limits of the applicability of linear viscoelasticity are established for two commercially available polymer interlayers. Furthermore, it is shown that the results of the small-scale tests agree to a certain degree with the results of the TCB large-scale experiments.
In a future step, the results can be used to develop material models for the post-breakage performance of laminated glass.
Keywords: glass breakage, laminated glass, relaxation test, viscoelasticity
Procedia PDF Downloads 121
4634 Digital Phase Shifting Holography in a Non-Linear Interferometer using Undetected Photons
Authors: Sebastian Töpfer, Marta Gilaberte Basset, Jorge Fuenzalida, Fabian Steinlechner, Juan P. Torres, Markus Gräfe
Abstract:
This work introduces a combination of digital phase-shifting holography with a non-linear interferometer using undetected photons. Non-linear interferometers can be used in combination with a measurement scheme called quantum imaging with undetected photons, which allows the wavelength used for sampling an object to be separated from the wavelength detected at the imaging sensor. This method has recently attracted increasing attention, as it allows the use of exotic wavelengths (e.g., mid-infrared, ultraviolet) for object interaction while keeping the detection in spectral regions with highly developed, comparably low-cost imaging sensors. The object information, including its transmission and phase influence, is recorded in the form of an interferometric pattern. To collect these patterns, this work combines quantum imaging with undetected photons with digital phase-shifting holography using minimal sampling of the interference. This extends the measurement capabilities of the quantum imaging scheme and brings it one step closer to application. Quantum imaging with undetected photons uses correlated photons generated by spontaneous parametric down-conversion in a non-linear interferometer to create indistinguishable photon pairs, which leads to an effect called induced coherence without induced emission. Placing an object inside changes the interferometric pattern depending on the object's properties. Digital phase-shifting holography records multiple images of the interference with determined phase shifts to reconstruct the complete interference shape, which can afterwards be used to analyze the changes introduced by the object and infer its properties. An extensive characterization of this method was done using a proof-of-principle setup. The measured spatial resolution, phase accuracy, and transmission accuracy are compared for different combinations of camera exposure times and numbers of interference sampling steps.
The current limits of this method are shown, along with possible further improvements. In summary, this work presents an alternative holographic measurement method using non-linear interferometers in combination with quantum imaging to enable new ways of measuring and to motivate continuing research.
Keywords: digital holography, quantum imaging, quantum holography, quantum metrology
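The phase-shifting step at the heart of the scheme can be illustrated with the textbook four-step algorithm: with phase shifts of 0, π/2, π, and 3π/2, the object phase is recovered as atan2(I₄ − I₂, I₁ − I₃). This is the standard formula for four recorded frames, not necessarily the paper's exact minimal-sampling variant, and the fringe model below is a hypothetical single-pixel illustration.

```python
import math

def four_step_phase(i1, i2, i3, i4):
    """Standard four-step phase-shifting reconstruction for frames taken
    at phase shifts 0, pi/2, pi, 3*pi/2: phi = atan2(I4 - I2, I1 - I3)."""
    return math.atan2(i4 - i2, i1 - i3)

def interference(a, b, phase, shift):
    """Idealized single-pixel fringe intensity: bias a, modulation b."""
    return a + b * math.cos(phase + shift)

# Simulate four frames for a known object phase, then recover it.
true_phase = 0.7
frames = [interference(1.0, 0.5, true_phase, k * math.pi / 2) for k in range(4)]
recovered = four_step_phase(*frames)
```

Because the bias and modulation cancel in the two differences, the recovered phase is independent of the (unknown) fringe contrast, which is what makes phase-shifting robust in practice.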
Procedia PDF Downloads 92
4633 Heritage Tourism Balance between Historic Culture and Marketing Innovation: The Case Study of Taiwan
Authors: Lin Chih-Ken
Abstract:
This paper explores the Alishan hotel of Taiwan, built during the Japanese occupation period. After more than a hundred years, it has been handed over to a hotel management enterprise to retain the historic building and its culture. Applying innovative marketing strategies, coordinating with the local government's tourism policy, and integrating marketing with the local tea, agriculture, and forestry specialties, the enterprise created a special hotel located in the Alishan National Scenic Area with the combined characteristics of landscape, innovative marketing, and history, to attract domestic tourists and visitors from around the world. This study interviewed the hotel owner, managers, employees, and guests, and in addition collected feedback from the reservation website, applying ambidexterity marketing theory and resource-based theory to analyze the main impact factors. The conclusion showed that the integration of several key factors and good use of resource strengths generate heterogeneous product characteristics that attract a wider range of visitors.
Keywords: heritage tourism, historic hotel, marketing ambidexterity, resource base theory
Procedia PDF Downloads 265
4632 Absorbed Dose Measurements for Teletherapy Prediction of Superficial Dose Using Halcyon Linear Accelerator
Authors: Raymond Limen Njinga, Adeneye Samuel Olaolu, Akinyode Ojumoola Ajimo
Abstract:
Introduction: Measurement of the entrance dose and dose at different depths is essential to avoid overdosing and underdosing patients. The aim of this study is to verify the variation in the absorbed dose using a water-equivalent material. Materials and Methods: A plastic phantom was arranged on the couch of the Halcyon linear accelerator by Varian, with a Farmer ionization chamber inserted and connected to the electrometer. An image of the setup was taken using the High-Quality Single 1280x1280x16 mode in service mode to check the alignment with the isocenter. The beam quality TPR₂₀,₁₀ (tissue phantom ratio) measurement was done to check the beam quality of the machine at a field size of 10 cm x 10 cm. The calibration was done using an SAD-type set-up at a depth of 5 cm. This process was repeated for ten consecutive weeks, and the values were recorded. Results: The results of the beam output for the teletherapy machine were satisfactory and acceptable in comparison with the commissioned measurement of 0.62. The beam quality TPR₂₀,₁₀ (tissue phantom ratio) was reasonable with respect to the beam quality of the machine at a field size of 10 cm x 10 cm. Conclusion: The results of the beam quality and the absorbed dose rate showed good consistency with the commissioned measurement value over the period of ten weeks.
Keywords: linear accelerator, absorbed dose rate, isocenter, phantom, ionization chamber
Procedia PDF Downloads 61
4631 A New Reliability Allocation Method Based on Fuzzy Numbers
Authors: Peng Li, Chuanri Li, Tao Li
Abstract:
Reliability allocation is quite important during the early design and development stages of a system, in order to apportion its specified reliability goal to subsystems. This paper improves the fuzzy reliability allocation method and gives concrete processes for determining the factor set, the factor weight set, the judgment set, and the multi-grade fuzzy comprehensive evaluation. To determine the weights of the factor set, modified trapezoidal fuzzy numbers are proposed to reduce errors caused by subjective factors. To decrease the fuzziness in the fuzzy division, an approximation method based on linear programming is employed. To compute the explicit values of the fuzzy numbers, the centroid method of defuzzification is used. An example is provided to illustrate the application of the proposed reliability allocation method based on fuzzy arithmetic.
Keywords: reliability allocation, fuzzy arithmetic, allocation weight, linear programming
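The centroid defuzzification step mentioned above has a closed form for a trapezoidal fuzzy number (a, b, c, d): the centre of area is (c² + d² + cd − a² − b² − ab) / (3 (c + d − a − b)). The sketch below shows the plain trapezoidal case; the paper's modified trapezoidal numbers may differ, so this is the standard formula rather than the authors' exact construction.

```python
def trapezoid_centroid(a, b, c, d):
    """Centroid (centre of area) defuzzification of a trapezoidal fuzzy
    number (a, b, c, d): membership rises linearly on [a, b], is 1 on
    [b, c], and falls linearly on [c, d]. Assumes a < d (non-degenerate
    support, so the denominator is nonzero)."""
    return (c**2 + d**2 + c * d - a**2 - b**2 - a * b) / (3.0 * (c + d - a - b))
```

Two sanity checks: a symmetric trapezoid defuzzifies to its axis of symmetry, and a triangular number (b = c) reduces to the familiar (a + b + d) / 3 triangle centroid.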
Procedia PDF Downloads 342
4630 The Role of Dynamic Ankle Foot Orthosis on Temporo-Spatial Parameters of Gait and Balance in Patients with Hereditary Spastic Paraparesis: Six-Months Follow Up
Abstract:
Background: Recently, a supramalleolar type of dynamic ankle foot orthosis (DAFO) has been increasingly used to support all of the dynamic arches of the foot and redistribute the pressure under the plantar surface of the foot in order to reduce muscle tone. A DAFO helps to maintain balance and postural control by providing stability and proprioceptive feedback in children with conditions such as cerebral palsy, muscular dystrophies, Down syndrome, and congenital hypotonia. Aim: The aim of this study was to investigate the role of the dynamic ankle foot orthosis (DAFO) on temporo-spatial parameters of gait and balance in three children with hereditary spastic paraparesis (HSP). Materials and Methods: Three children with HSP, aged 13, 14, and 8 years, were included in the study. To provide correction during weight bearing and to improve gait, DAFOs were made. Lower extremity spasticity (including the gastrocnemius, hamstring, and hip adductor muscles) was assessed using the Modified Ashworth Scale (MAS) (0-5), and the temporo-spatial gait parameters (walking speed, cadence, base of support, step length) and the Timed Up and Go test (TUG) were evaluated. All gait assessments were compared between the conditions with DAFO and shoes and without DAFO (shoes only). After a six-month follow-up period, the assessments were repeated by the same physical therapist. Results: MAS scores for the lower extremities were between 2 and 3 for the first child, 0 and 2 for the second child, and 1 and 2 for the third child. TUG scores (sec) decreased from 20.2 to 18 for case one, from 9.4 to 9 for case two, and from 12.4 to 12 for case three in the shoes-only condition, and from 15.2 to 14 for case one, from 7.2 to 7.1 for case two, and from 10 to 7.3 for case three in the DAFO-and-shoes condition. Gait speed (m/sec) while wearing shoes only was similar, but while wearing DAFO and shoes it increased from 0.4 to 0.5 for case one, from 1.5 to 1.6 for case two, and from 1.0 to 1.2 for case three.
Base of support (cm) while wearing shoes only decreased from 18.5 to 14 for case one and from 13 to 12 for case three, and remained similar at 11 for case two. While wearing DAFO and shoes, the base of support decreased from 10 to 9 for case one and from 11.5 to 10 for case three, and remained similar at 8 for case two. Conclusion: The use of a DAFO in patients with HSP normalized the temporo-spatial gait parameters and improved balance. Walking speed is a gold standard for evaluating gait quality; with the use of the DAFO, walking speed increased in these three children with HSP. The better TUG scores with the DAFO show that functional ambulation improved. The reduction in the base of support and more symmetrical step lengths with the DAFO indicate better balance. These encouraging results warrant further study in larger series.
Keywords: dynamic ankle foot orthosis, gait, hereditary spastic paraparesis, balance
Procedia PDF Downloads 354
4629 Stability of Hybrid Systems
Authors: Kreangkri Ratchagit
Abstract:
This paper is concerned with the exponential stability of switched linear systems with interval time-varying delays. The time delay is any continuous function belonging to a given interval, in which the lower bound of the delay is not restricted to zero. By constructing a suitable augmented Lyapunov-Krasovskii functional combined with Leibniz-Newton’s formula, a switching rule for the exponential stability of switched linear systems with interval time-varying delays and new delay-dependent sufficient conditions for the exponential stability of the systems are first established in terms of LMIs. Finally, some examples are exploited to illustrate the effectiveness of the proposed schemes. Keywords: exponential stability, hybrid systems, time-varying delays, Lyapunov-Krasovskii functional, Leibniz-Newton’s formula
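As a minimal sketch of the kind of Lyapunov-based exponential stability certificate the abstract builds on (here for a single delay-free linear subsystem, not the full switched delayed case, and with an arbitrary example matrix and decay rate):

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

def exponential_decay_certificate(A, alpha):
    # Solve the shifted Lyapunov equation (A + alpha*I)^T P + P (A + alpha*I) = -I.
    # If P is positive definite, V(x) = x^T P x certifies that solutions of
    # x' = A x decay at least like exp(-alpha * t).
    n = A.shape[0]
    As = A + alpha * np.eye(n)
    P = solve_continuous_lyapunov(As.T, -np.eye(n))
    pos_def = bool(np.all(np.linalg.eigvalsh((P + P.T) / 2) > 0))
    return P, pos_def

# Arbitrary stable example subsystem with eigenvalues -2 and -3
A = np.array([[-2.0, 1.0], [0.0, -3.0]])
P, ok = exponential_decay_certificate(A, alpha=0.5)
print(ok)  # True: decay rate 0.5 is certified
```

The LMI conditions in the paper generalize this check to switched systems with interval delays; the delay-free version above only illustrates the Lyapunov mechanism.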
Procedia PDF Downloads 458
4628 A Continuous Boundary Value Method of Order 8 for Solving the General Second Order Multipoint Boundary Value Problems
Authors: T. A. Biala
Abstract:
This paper deals with the numerical integration of the general second order multipoint boundary value problems. This has been achieved by the development of a continuous linear multistep method (LMM). The continuous LMM is used to construct a main discrete method to be used with some initial and final methods (also obtained from the continuous LMM) so that they form a discrete analogue of the continuous second order boundary value problems. These methods are used as boundary value methods and adapted to cope with the integration of the general second order multipoint boundary value problems. The convergence, the use and the region of absolute stability of the methods are discussed. Several numerical examples are implemented to elucidate our solution process. Keywords: linear multistep methods, boundary value methods, second order multipoint boundary value problems, convergence
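The idea of forming a discrete analogue of a second order BVP and solving the resulting algebraic system can be sketched with a much simpler, lower-order scheme than the order-8 method described above; the central-difference solver below and its test problem are illustrative assumptions, not the authors' method:

```python
import numpy as np

def solve_linear_bvp(q_func, a, b, ya, yb, n=200):
    # Solve y'' = q(x) * y with Dirichlet data y(a) = ya, y(b) = yb using
    # second-order central differences: the continuous BVP becomes a
    # tridiagonal linear system over the interior grid points.
    x = np.linspace(a, b, n + 1)
    h = x[1] - x[0]
    q = q_func(x[1:-1])
    A = np.zeros((n - 1, n - 1))
    rhs = np.zeros(n - 1)
    for i in range(n - 1):
        A[i, i] = -2.0 - h * h * q[i]
        if i > 0:
            A[i, i - 1] = 1.0
        if i < n - 2:
            A[i, i + 1] = 1.0
    rhs[0] -= ya    # known boundary values move to the right-hand side
    rhs[-1] -= yb
    y_inner = np.linalg.solve(A, rhs)
    return x, np.concatenate(([ya], y_inner, [yb]))

# Test problem: y'' = -y, y(0) = 0, y(pi/2) = 1, exact solution y = sin(x)
x, y = solve_linear_bvp(lambda x: -np.ones_like(x), 0.0, np.pi / 2, 0.0, 1.0)
print(float(np.max(np.abs(y - np.sin(x)))))  # small discretization error
```

A boundary value method of order 8 would shrink this error far faster as the grid is refined; the structure (discrete analogue plus one global linear solve) is the shared idea.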
Procedia PDF Downloads 377
4627 Reconstructed Phase Space Features for Estimating Post Traumatic Stress Disorder
Authors: Andre Wittenborn, Jarek Krajewski
Abstract:
Trauma-related sadness in speech can alter the voice in several ways. The generation of non-linear aerodynamic phenomena within the vocal tract is crucial when analyzing trauma-influenced speech production. These include non-laminar flow and the formation of jets rather than well-behaved laminar flow. In particular, state-space reconstruction methods based on chaotic dynamics and fractal theory have been suggested to describe these aerodynamic, turbulence-related phenomena of the speech production system. To extract the non-linear properties of the speech signal, we used the time-delay embedding method to reconstruct a phase space from the scalar time series (reconstructed phase space, RPS). This approach results in the extraction of 7238 features per .wav file (N = 47, 32 m, 15 f). The speech material was prompted by telling about autobiographically related sadness-inducing experiences (sampling rate 16 kHz, 8-bit resolution). After combining these features in a support vector machine based machine learning approach (leave-one-sample-out validation), we achieved a correlation of r = .41 with the well-established self-report ground truth measure (RATS) of post-traumatic stress disorder (PTSD). Keywords: non-linear dynamics features, post traumatic stress disorder, reconstructed phase space, support vector machine
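A minimal sketch of the time-delay embedding step is shown below; the embedding dimension and delay are arbitrary illustrative choices, not the parameters behind the 7238-feature extraction:

```python
import numpy as np

def delay_embed(signal, dim, tau):
    # Reconstruct a phase space from a scalar time series via Takens-style
    # time-delay embedding: row t is [x(t), x(t + tau), ..., x(t + (dim-1)*tau)].
    signal = np.asarray(signal)
    n = len(signal) - (dim - 1) * tau
    return np.column_stack([signal[i * tau : i * tau + n] for i in range(dim)])

# Toy example: embed a sampled sine wave in a 3-dimensional RPS
t = np.linspace(0, 2 * np.pi, 1000)
rps = delay_embed(np.sin(t), dim=3, tau=25)
print(rps.shape)  # (950, 3)
```

Non-linear features (e.g., fractal dimensions or Lyapunov-type statistics) are then computed on such reconstructed trajectories rather than on the raw waveform.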
Procedia PDF Downloads 102
4626 OpenFOAM Based Simulation of High Reynolds Number Separated Flows Using Bridging Method of Turbulence
Authors: Sagar Saroha, Sawan S. Sinha, Sunil Lakshmipathy
Abstract:
Reynolds averaged Navier-Stokes (RANS) modeling is the popular computational tool for the prediction of turbulent flows. Being computationally less expensive than direct numerical simulation (DNS), RANS has received wide acceptance in industry and the research community. However, for high Reynolds number flows, the traditional RANS approach based on the Boussinesq hypothesis cannot capture all the essential flow characteristics, and thus its performance is restricted in high Reynolds number flows of practical interest. RANS performance turns out to be inadequate in regimes like flow over curved surfaces, flows with rapid changes in the mean strain rate, duct flows involving secondary streamlines, and three-dimensional separated flows. In the recent decade, the partially averaged Navier-Stokes (PANS) methodology has gained acceptability among seamless bridging methods of turbulence, placed between DNS and RANS. The PANS methodology, being a scale-resolving bridging method, is inherently more suitable than RANS for simulating turbulent flows. The superior ability of the PANS method has been demonstrated for cases like swirling flows, high-speed mixing environments, and high Reynolds number turbulent flows. In our work, we intend to evaluate PANS for separated turbulent flows past bluff bodies, which are of broad interest in aerodynamic research and industrial application. The PANS equations, being derived from base RANS, inherit the inadequacies of the parent RANS model based on the linear eddy-viscosity model (LEVM) closure. To enhance PANS' capabilities for simulating separated flows, the shortcomings of the LEVM closure need to be addressed. The inabilities of LEVMs have inspired the development of non-linear eddy viscosity models (NLEVM). To explore the potential improvement in PANS performance, in our study we evaluate the PANS behavior in conjunction with an NLEVM.
Our work can be categorized into three significant steps: (i) extraction of a PANS version of the NLEVM from the RANS model, (ii) testing the model in the homogeneous turbulence environment, and (iii) application and evaluation of the model in the canonical case of separated non-homogeneous flow fields (flow past prismatic bodies and bodies of revolution at high Reynolds number). The PANS version of the NLEVM shall be derived and implemented in OpenFOAM, an open-source solver. The homogeneous flow evaluation will comprise the study of the influence of the PANS filter-width control parameter on the turbulent stresses, homogeneous analysis performed over typical velocity fields, and asymptotic analysis of the Reynolds stress tensor. The non-homogeneous flow case will include the study of mean integrated quantities and various instantaneous flow field features, including wake structures. The performance of PANS + NLEVM shall be compared against the LEVM-based PANS and LEVM-based RANS. This assessment will contribute to significant improvement of the predictive ability of computational fluid dynamics (CFD) tools in massively separated turbulent flows past bluff bodies. Keywords: bridging methods of turbulence, high Re-CFD, non-linear PANS, separated turbulent flows
Procedia PDF Downloads 145
4625 Integrated Information System on Human Resource Management in Project-Based Organizations
Authors: Akbar Farahani, Afsaneh Hassani, Peyman M. Farkhondeh
Abstract:
Human resource management, as one of the core processes of project-based companies, is relatively unknown despite its key role in success and competitive advantage. In project-based companies, due to the accelerated movement of knowledge in work activities and the temporary nature of projects, developing mechanisms for achieving optimal management of these issues is very challenging. The approach to human resource management in these companies evolves with goals, strategies, and operational processes. Therefore, the need for appropriate tools to facilitate implementation of optimized human resource management in projects is greater than before. With the development of information technology and modern communication, an appropriate approach for the dynamic management of human resources in projects can now be provided. This is done by using the referral system implemented in Mahab GCE, which provides (1) the ability to use people in projects without geographic limitation and (2) information on the activities and outcomes of referrals. Furthermore, this system provides for recording the lessons learned after any particular activity on a project, accessing quantitative information and procedures, and documenting learned practices stored in the database, as well as using them in future projects. Keywords: human resource management, project-based company, ERP, referrals system
Procedia PDF Downloads 477
4624 N₂O₂ Salphen-Like Ligand and Its Pd(II), Ag(I) and Cu(II) Complexes as Potentially Anticancer Agents: Design, Synthesis, Antimicrobial, CT-DNA Binding and Molecular Docking
Authors: Laila H. Abdel-Rahman, Mohamed Shaker S. Adam, Ahmed M. Abu-Dief, Hanan El-Sayed Ahmed
Abstract:
In this investigation, Cu(II), Pd(II) and Ag(I) complexes with the tetradentate DSPH Schiff base ligand were synthesized. The DSPH Schiff base and its complexes were characterized using different physicochemical and spectral analyses. The results revealed that the metal ions coordinate with the DSPH ligand through the azomethine nitrogen and phenolic oxygen. The Cu(II), Pd(II) and Ag(I) complexes are present in a 1:1 molar ratio. The Pd(II) and Ag(I) complexes have square planar geometries, while Cu(II) has a distorted octahedral (Oh) geometry. All investigated complexes are non-electrolytes. The investigated compounds were tested against different strains of bacteria and fungi, and showed good inhibition against the selected pathogenic microorganisms. Moreover, the interaction of the investigated complexes with CT-DNA was studied via various techniques; the binding modes are mainly intercalative and groove binding. The Molecular Operating Environment (MOE) package was used for docking studies of the investigated complexes to explore the potential binding mode and energy. Furthermore, the growth inhibitory effect of the investigated compounds was examined on some cancer cell lines. Keywords: tetradentate, antimicrobial, CT-DNA interaction, docking, anticancer
Procedia PDF Downloads 244
4623 Innovative Screening Tool Based on Physical Properties of Blood
Authors: Basant Singh Sikarwar, Mukesh Roy, Ayush Goyal, Priya Ranjan
Abstract:
This work combines two bodies of knowledge: the biomedical basis of blood stain formation and the fluid-mechanics community's wisdom that such stain formation depends heavily on physical properties. Moreover, biomedical research shows that different patterns in blood stains are robust indicators of a blood donor's health or lack thereof. Based on these valuable insights, an innovative screening tool is proposed which can act as an aid in the diagnosis of diseases such as anemia, hyperlipidaemia, tuberculosis, blood cancer, leukemia, malaria, etc., with enhanced confidence in the proposed analysis. To realize this technique, simple, robust and low-cost micro-fluidic devices, a micro-capillary viscometer and a pendant drop tensiometer, are designed and proposed to be fabricated to measure the viscosity, surface tension and wettability of various blood samples. Once prognosis and diagnosis data have been generated, automated linear and nonlinear classifiers are applied for automated reasoning and presentation of results. A support vector machine (SVM) classifies data in a linear fashion. Discriminant analysis and nonlinear embeddings are coupled with nonlinear manifold detection in the data, and decisions are made accordingly. In this way, physical properties can be used, via linear and non-linear classification techniques, for screening of various diseases in humans and cattle. Experiments are carried out to validate the physical properties measurement devices. This framework can be further developed towards a real-life portable disease screening cum diagnostics tool. Small-scale production of screening cum diagnostic devices is proposed to carry out independent tests. Keywords: blood, physical properties, diagnostic, nonlinear, classifier, device, surface tension, viscosity, wettability
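A hedged sketch of the linear SVM classification stage is given below, on synthetic stand-in data; the class means, units, and the "healthy"/"flagged" labels are invented for illustration and are not measurements from this study:

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

# Synthetic illustration only: two hypothetical groups described by three
# physical features (viscosity, surface tension, contact angle).
rng = np.random.default_rng(0)
healthy = rng.normal([4.0, 55.0, 70.0], [0.5, 3.0, 5.0], size=(50, 3))
flagged = rng.normal([6.5, 45.0, 85.0], [0.5, 3.0, 5.0], size=(50, 3))
X = np.vstack([healthy, flagged])
y = np.array([0] * 50 + [1] * 50)

# Standardize features, then fit a linear-kernel SVM
clf = make_pipeline(StandardScaler(), SVC(kernel="linear"))
clf.fit(X, y)
print(clf.score(X, y))
```

In practice, the measured viscosity, surface tension and wettability of real samples would replace the synthetic arrays, and the nonlinear classifiers mentioned above would be evaluated alongside the linear SVM.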
Procedia PDF Downloads 376
4622 Formulation and Characterization of Drug Loaded Niosomal Gel for Anti-Inflammatory Activity
Authors: Sunil Kamboj, Vipin Saini, Suman Bala, Gaurav Sharma
Abstract:
The main aim of the present research was to encapsulate mefenamic acid in niosomes and to incorporate the prepared niosomes in a carbopol gel base for sustained therapeutic action. Mefenamic acid loaded niosomes were prepared by the thin film hydration technique and evaluated for entrapment efficiency, vesicular size and zeta potential. The entrapment efficiency of the prepared niosomes was found to increase with decreasing HLB values of the surfactants, and vesicle size was found to increase with increasing cholesterol concentration. Niosomal vesicles with good entrapment efficiencies were incorporated in the carbopol gel base to form the niosomal gel. The prepared niosomal gel was evaluated for pH, viscosity, spreadability, extrudability and skin permeation across rat skin. The results of the permeation study revealed that the gel formulated with Span 60 niosomes sustained the drug release for 12 h. Further, the in vivo study showed good inhibition of inflammation by the gel prepared with Span 60 niosomes. Keywords: mefenamic acid, niosomal gel, nonionic surfactants, sustained release
Procedia PDF Downloads 409
4621 Optimization of Slider Crank Mechanism Using Design of Experiments and Multi-Linear Regression
Authors: Galal Elkobrosy, Amr M. Abdelrazek, Bassuny M. Elsouhily, Mohamed E. Khidr
Abstract:
Crankshaft length, connecting rod length, crank angle, engine rpm, cylinder bore, mass of piston and compression ratio are the inputs that control the performance of the slider crank mechanism and hence its efficiency. Several combinations of these seven inputs are used and compared. The engine torque predicted by the simulation is analyzed through two different regression models, with and without interaction terms, developed according to multi-linear regression using LU decomposition to solve the system of algebraic equations. These models are validated. A regression model in the seven inputs, including their interaction terms, lowered the polynomial degree from 3rd to 1st and suggested valid predictions and stable explanations. Keywords: design of experiments, regression analysis, SI engine, statistical modeling
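The regression-fitting step can be sketched as follows: the normal equations of a multi-linear least-squares fit are solved with an LU decomposition, as the abstract describes. The two-input toy data and coefficients below are assumptions for illustration, not values from the engine study:

```python
import numpy as np
from scipy.linalg import lu_factor, lu_solve

def fit_multilinear(X, y):
    # Least-squares fit of y = b0 + b1*x1 + b2*x2 + ... by solving the
    # normal equations (A^T A) b = A^T y via LU decomposition.
    A = np.column_stack([np.ones(len(X)), X])
    lu, piv = lu_factor(A.T @ A)
    return lu_solve((lu, piv), A.T @ y)

# Hypothetical noise-free toy data: a torque-like response of two inputs
rng = np.random.default_rng(1)
X = rng.uniform(0.0, 1.0, size=(30, 2))
y = 2.0 + 3.0 * X[:, 0] - 1.5 * X[:, 1]

b = fit_multilinear(X, y)
print(np.round(b, 3))  # ~[2.0, 3.0, -1.5]
```

Interaction terms would be added by appending product columns (e.g., x1*x2) to the design matrix before forming the normal equations.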
Procedia PDF Downloads 186
4620 An Absolute Femtosecond Rangefinder for Metrological Support in Coordinate Measurements
Authors: Denis A. Sokolov, Andrey V. Mazurkevich
Abstract:
In the modern world, there is an increasing demand for highly precise measurements in various fields, such as aircraft manufacturing, shipbuilding, and rocket engineering. This has resulted in the development of measuring instruments capable of measuring the coordinates of objects within a range of up to 100 meters, with an accuracy of up to one micron. The calibration process for such optoelectronic measuring devices (trackers and total stations) involves comparing their measurement results to a reference measurement on a linear or spatial basis. The reference used in such measurements could be a reference base or a reference range finder (EDM) with the capability to measure angle increments; the base would serve as a set of reference points for this purpose. The concept of the EDM for replicating the unit of measurement has been implemented on a mobile platform, which allows for angular changes in the direction of laser radiation in two planes. To determine the distance to an object, a high-precision interferometer of in-house design is employed. The laser radiation travels to corner reflectors, which form a spatial reference with precisely known positions. When the femtosecond pulses from the reference arm and the measuring arm coincide, an interference signal is created, repeating at the frequency of the laser pulses. The distance between reference points determined by the interference signals is calculated in accordance with recommendations from the International Bureau of Weights and Measures for the indirect measurement of the time of light passage, following the definition of the meter. This distance is D/2 = c/(2nF), approximately 2.5 meters, where c is the speed of light in a vacuum, n is the refractive index of the medium, and F is the repetition frequency of the femtosecond pulses.
The achieved Type A uncertainty for measurement of the distance to reflectors 64 m away (N·D/2, where N is an integer) and spaced 1 m apart from each other does not exceed 5 microns. The angular uncertainty is calculated theoretically, since standard high-precision ring encoders will be used and are not a focus of this study. The Type B uncertainty components are not taken into account either, as the components that contribute most do not depend on the selected coordinate measuring method. This technology is being explored in the context of laboratory applications under controlled environmental conditions, where it is possible to achieve an advantage in terms of accuracy. In general, the EDM tests showed high accuracy, and theoretical calculations and experimental studies on an EDM prototype have shown that the Type A uncertainty of distance measurements to reflectors can be less than 1 micrometer. The results of this research will be utilized to develop a highly accurate mobile absolute range finder designed for the calibration of high-precision laser trackers and laser rangefinders, as well as other equipment, using a 64-meter laboratory comparator as a reference. Keywords: femtosecond laser, pulse correlation, interferometer, laser absolute range finder, coordinate measurement
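The pulse-spacing relation D/2 = c/(2nF) quoted in the abstract can be checked numerically; the repetition rate F below is an assumed round value chosen to reproduce the quoted ~2.5 m, not a figure from the paper:

```python
# Numeric check of D/2 = c / (2 * n * F) for propagation in air,
# with an assumed femtosecond pulse repetition rate of about 60 MHz.
c = 299_792_458.0   # speed of light in vacuum, m/s (exact by definition)
n = 1.000273        # approximate refractive index of air
F = 60e6            # assumed pulse repetition rate, Hz

half_D = c / (2 * n * F)
print(round(half_D, 4))  # ~2.4976 m, i.e. about 2.5 m as stated
```

The inverse dependence on F also shows why the repetition rate must be known (or stabilized) very precisely: a relative error in F maps directly into the same relative error in the reproduced length.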
Procedia PDF Downloads 59
4619 Pakistan’s Taxation System: A Critical Appraisal
Authors: Khalid Javed, Rashid Mahmood
Abstract:
The constitution empowers the Federal Government to collect taxes on income other than agricultural income, taxes on capital value, customs, excise duties and sales taxes. The Central Board of Revenue (CBR) and its subordinate departments administer the tax system. Each of the three principal taxes has a different history and a different set of issues. For a large number of income taxpayers, the core of the business process is pre-audit and assessment by a tax official. This process gives considerable discretion to tax officials, with potential for abuse. Moreover, this process is not tenable as the number of taxpayers increases. The report is focused on a total overhaul of the process and organization of income tax. Sales tax is recent, and its process and organization are adjusted to the needs of an expanding tax base; these are based on self-assessment and selective audit. Similarly, in customs the accent is on accelerating and broadening the changes begun in recent years. Before long, central excise will be subsumed in sales tax. During the nineties, despite many changes in the tax regime and the introduction of withholding and presumptive taxes, the Federal Government's tax-to-GDP ratio has varied narrowly around eleven percent. The tax base has grown but still remains narrow and skewed. The number of income tax filers is around one million. Keywords: Central Board of Revenue, GDP, sales tax, income tax
Procedia PDF Downloads 442
4618 Discrete Sliding Modes Regulator with Exponential Holder for Non-Linear Systems
Authors: G. Obregon-Pulido , G. C. Solis-Perales, J. A. Meda-Campaña
Abstract:
In this paper, we present a sliding mode controller in discrete time. The design of the controller is based on the theory of regulation for nonlinear systems. In the problem of disturbance rejection and/or output tracking, it is known that in discrete time a controller that uses the zero-order holder only guarantees tracking at the sampling instants but not between them. It is shown that using the so-called exponential holder, it is possible to guarantee asymptotically zero output tracking error also between the sampling instants. To stabilize the closed-loop system, we introduce the sliding mode approach, relaxing the requirement of the existence of a linear stabilizing control law. Keywords: regulation theory, sliding modes, discrete controller, ripple-free tracking
Procedia PDF Downloads 54
4617 Correlation of Unsuited and Suited 5ᵗʰ Female Hybrid III Anthropometric Test Device Model under Multi-Axial Simulated Orion Abort and Landing Conditions
Authors: Christian J. Kennett, Mark A. Baldwin
Abstract:
As several companies are working towards returning American astronauts back to space on US-made spacecraft, NASA developed a human flight certification-by-test and analysis approach due to the cost-prohibitive nature of extensive testing. This process relies heavily on the quality of analytical models to accurately predict crew injury potential specific to each spacecraft and under dynamic environments not tested. As the prime contractor on the Orion spacecraft, Lockheed Martin was tasked with quantifying the correlation of analytical anthropometric test devices (ATDs), also known as crash test dummies, against test measurements under representative impact conditions. Multiple dynamic impact sled tests were conducted to characterize Hybrid III 5th ATD lumbar, head, and neck responses with and without a modified shuttle-era advanced crew escape suit (ACES) under simulated Orion landing and abort conditions. Each ATD was restrained via a 5-point harness in a mockup Orion seat fixed to a dynamic impact sled at the Wright Patterson Air Force Base (WPAFB) Biodynamics Laboratory in the horizontal impact accelerator (HIA). ATDs were subjected to multiple impact magnitudes, half-sine pulse rise times, and XZ ‘eyeballs out/down’ or Z-axis ‘eyeballs down’ orientations for landing, or an X-axis ‘eyeballs in’ orientation for abort. Several helmet constraint devices were evaluated during suited testing. Unique finite element models (FEMs) were developed for the unsuited and suited sled test configurations using an analytical 5th ATD model developed by LSTC (Livermore, CA) and deformable representations of the seat, suit, helmet constraint countermeasures, and body restraints. Explicit FE analyses were conducted using the non-linear solver LS-DYNA.
Head linear and rotational acceleration, head rotational velocity, upper neck force and moment, and lumbar force time histories were compared between test and analysis using the enhanced error assessment of response time histories (EEARTH) composite score index. The EEARTH rating, paired with the correlation and analysis (CORA) corridor rating, provided a composite ISO score that was used to assess model correlation accuracy. NASA occupant protection subject matter experts established an ISO score of 0.5 or greater as the minimum expectation for correlating analytical and experimental ATD responses. Unsuited 5th ATD head X, Z, and resultant linear accelerations, head Y rotational accelerations and velocities, neck X and Z forces, and lumbar Z forces all showed consistent ISO scores above 0.5 in the XZ impact orientation, regardless of peak g-level or rise time. Upper neck Y moments were near or above the 0.5 score for most of the XZ cases. Similar trends were found in the XZ and Z-axis suited tests despite the addition of several different countermeasures for restraining the helmet. For the X-axis ‘eyeballs in’ loading direction, only resultant head linear acceleration and lumbar Z-axis force produced ISO scores above 0.5, whether unsuited or suited. The analytical LSTC 5th ATD model showed good correlation across multiple head, neck, and lumbar responses in both the unsuited and suited configurations when loaded in the XZ ‘eyeballs out/down’ direction. Upper neck moments were consistently the most difficult to predict, regardless of impact direction or test configuration. Keywords: impact biomechanics, manned spaceflight, model correlation, multi-axial loading
Procedia PDF Downloads 114