Search results for: automatic selective door operations
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3446

896 The Usefulness of Medical Scribes in the Emergency Department

Authors: Victor Kang, Sirene Bellahnid, Amy Al-Simaani

Abstract:

Efficient documentation and completion of clerical tasks are pillars of efficient patient-centered care in acute settings such as the emergency department (ED). Medical scribes aid physicians with documentation, navigation of electronic health records, results gathering, and communication coordination with other healthcare teams. However, the use of medical scribes is not widespread, with some hospitals even discontinuing their programs. One reason for this could be the lack of studies that have outlined concrete improvements in efficiency and patient and provider satisfaction in emergency departments before and after incorporating scribes. Methods: We conducted a review of the literature concerning the implementation of a medical scribe program and emergency department performance. For this review, a narrative synthesis accompanied by textual commentaries was chosen to present the selected papers. PubMed was searched exclusively. Initially, no date limits were set, but because the electronic medical record was officially implemented in Canada in 2013, studies published after this date were preferred as they provided insight into the interplay between its implementation and scribes on quality improvement. Results: Throughput, efficiency, and cost-effectiveness were the most commonly used parameters in evaluating scribes in the emergency department. Important throughput metrics, specifically door-to-doctor and disposition time, were significantly decreased in emergency departments that utilized scribes. Of note, this was shown to be the case in community hospitals, where the burden of documentation and clerical tasks would fall directly upon the attending physician. Academic centers differ in that they rely heavily on residents and students, so the implementation of scribes has been shown to have limited effect on these metrics. However, unique to academic centers was the providers' perception of increased time for teaching. Consequently, providers express increased work satisfaction in relation to time spent with patients and in teaching. Patients, on the other hand, did not demonstrate a decrease in satisfaction with regard to the care that was provided, but there was no significant increase observed either. Of the studies we reviewed, one of the biggest limitations was the lack of significance in the data. While many individual studies reported that medical scribes in emergency rooms improved relative value units, patient satisfaction, provider satisfaction, and the number of patients seen, there was no statistically significant improvement in these criteria when compiled in a systematic review. There is also a clear publication bias; very few studies with negative results were published. To prove significance, data from more emergency rooms with scribe programs would need to be compiled, including emergency rooms that did not report noticeable benefits. Furthermore, most data sets focused only on scribes in academic centers. Conclusion: Ultimately, the literature suggests that while emergency room physicians who have access to medical scribes report higher satisfaction due to lower clerical burdens and can see more patients per shift, there is still variability in terms of patient and provider satisfaction.
Whether this variability exists due to differences in training (in-house trainees versus contractors), population profile (adult versus pediatric), setting (academic versus community), or which shifts scribes work cannot be determined based on the studies that exist. More scribe programs need to be evaluated to determine whether these variables affect outcomes and to prove whether scribes significantly improve emergency room efficiency.

Keywords: emergency medicine, medical scribe, scribe, documentation

Procedia PDF Downloads 87
895 Radio Frequency Identification Device Based Emergency Department Critical Care Billing: A Framework for Actionable Intelligence

Authors: Shivaram P. Arunachalam, Mustafa Y. Sir, Andy Boggust, David M. Nestler, Thomas R. Hellmich, Kalyan S. Pasupathy

Abstract:

Emergency departments (EDs) provide urgent care to patients throughout the day in a complex and chaotic environment. Real-time location systems (RTLS) are increasingly being utilized in healthcare settings and have been shown to improve safety, reduce cost, and increase patient satisfaction. Radio Frequency Identification Device (RFID) data in an ED can be used to compute variables such as patient-provider contact time, which is associated with patient outcomes such as 30-day hospitalization. These variables can provide avenues for improving ED operational efficiency. A major challenge with ED financial operations is under-coding of critical care services due to physicians' difficulty reporting accurate times for critical care provided under Current Procedural Terminology (CPT) codes 99291 and 99292. In this work, the authors propose a framework to optimize ED critical care billing using RFID data. RFID-estimated physician-patient contact times could accurately quantify direct critical care services, which will help model a data-driven approach for ED critical care billing. This paper will describe the framework and provide insights into opportunities to prevent under-coding as well as over-coding to avoid insurance audits. Future work will focus on data analytics to demonstrate the feasibility of the framework described.
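
To make the time-to-code mapping concrete, the following minimal Python sketch converts an RFID-derived contact time into candidate CPT codes. It is not the authors' implementation; the 99291/99292 time thresholds used here (first 30-74 minutes, then each additional 30 minutes) reflect a common reading of the CPT rules and should be treated as assumptions to verify against current billing guidance.

```python
# Minimal sketch (not the authors' implementation): mapping RFID-derived
# physician-patient contact minutes to critical care CPT codes.
# Thresholds assume 99291 covers the first 30-74 minutes and 99292 each
# additional 30 minutes; verify against current billing rules before use.

def critical_care_codes(contact_minutes: float) -> list[str]:
    """Return a list of CPT codes suggested by total critical care time."""
    if contact_minutes < 30:
        return []                      # below the 99291 threshold
    codes = ["99291"]                  # first 30-74 minutes
    extra = contact_minutes - 74       # time beyond the 99291 block
    while extra >= 30:                 # each additional 30-minute block
        codes.append("99292")
        extra -= 30
    return codes

if __name__ == "__main__":
    # Hypothetical RFID-estimated contact times (minutes) for three visits
    for minutes in (25, 60, 130):
        print(minutes, critical_care_codes(minutes))
```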

Keywords: critical care billing, CPT codes, emergency department, RFID

Procedia PDF Downloads 128
894 Analysis of Long-term Results After External Dacryocystorhinostomy Surgery in Patients Suffering from Diabetes Mellitus

Authors: N. Musayeva, N. Rustamova, N. Bagirov, S. Ibadov

Abstract:

Purpose: To analyze the long-term results of external dacryocystorhinostomy (DCR), which remains the preferred primary procedure in the surgical treatment of lacrimal duct obstruction in chronic dacryocystitis. Methodology: Long-term results (after 3 years) of external DCR performed on 90 patients (90 eyes) with chronic dacryocystitis from 2018 to 2020 at the Azerbaijan National Center of Ophthalmology named after acad. Zarifa Aliyeva were evaluated. Fifteen of the patients were men and 75 were women. The average age was 45±3.2 years. Surgical operations were performed under local anesthesia. All patients had suffered from diabetes mellitus for more than 3 years. All patients underwent external DCR, and a silicone drainage tube was implanted. In the postoperative period (after 3 years), lacrimation, purulent discharge, and the condition of the scar at the operation site were assessed. Results: All patients were under observation for more than 18 months. Overall, the effectiveness of the surgical operation was 93.34%. Recurrence of the disease was observed in 6 patients; in 3 patients (3.33%), the scar at the site of the operation was rough (non-cosmetic), and in 3 patients (3.33%), the surgically formed anastomosis between the lacrimal sac and the nasal bone was obstructed by scar tissue. These patients were reoperated by transcanalicular laser DCR. Conclusion: Despite its long-term (more than a hundred years) use, external DCR remains one of the primary techniques in the surgery of chronic dacryocystitis. Due to the high success rate and good long-term results of DCR in the treatment of chronic dacryocystitis in patients suffering from diabetes mellitus, we recommend external DCR for this group of patients.

Keywords: chronic dacryocystitis, diabetes mellitus, external dacryocystorhinostomy, long-term results

Procedia PDF Downloads 57
893 Developmental Response of Portunus pelagicus Larvae to Predigested Artificial Feed

Authors: Siti Aslamyah, Yushinta Fujaya, Okto Rimaldi

Abstract:

One of the problems faced in crab hatchery operations is the reliance on natural feed. This study aims to analyze the larval developmental response and determine the earliest stage at which crab larvae are fully able to accept predigested artificial feed prepared with the help of the probiotic Bacillus sp. The experiment was conducted from June 2014 to July 2014 at backyard-scale hatcheries in Bojo village, Mallusettasi sub-district, Barru district. The study was conducted in two stages of larval rearing. The first stage was designed as a completely randomized design with 5 treatments, each with 3 repetitions, i.e., no artificial feed; predigested feed given from zoea 1 to megalopa; predigested feed given from zoea 2 to megalopa; predigested feed given from zoea 3 to megalopa; and predigested feed given from zoea 4 to megalopa. The second stage compared two treatments, i.e., artificial feed with and without predigestion. The results showed that, based on survival rate, the predigested artificial feed was able to replace natural feed starting from zoea 3. The predigested artificial feed provided a higher survival rate (16%) than the artificial diet without predigestion (10.8%). However, predigestion did not have a different effect on the rate of stadial development. Cell activity was higher in larvae that received the predigested artificial feed, with an RNA/DNA ratio of 8.88 compared with only 5.36 without predigestion. This research provides valuable information for household-scale crab hatcheries that have limited capacity to prepare natural feed.

Keywords: artificial feeding, development of stadia, larvae Portunus pelagicus, predigest

Procedia PDF Downloads 528
892 Analytical and Numerical Results for Free Vibration of Laminated Composite Plates

Authors: Mohamed Amine Ben Henni, Taher Hassaine Daouadji, Boussad Abbes, Yu Ming Li, Fazilay Abbes

Abstract:

The reinforcement and repair of concrete structures by bonding composite materials have become relatively common operations. Different types of composite materials can be used: carbon fiber reinforced polymer (CFRP), glass fiber reinforced polymer (GFRP), as well as functionally graded materials (FGM). The development of analytical and numerical models describing the mechanical behavior of civil engineering structures reinforced by composite materials is necessary. These models will enable engineers to select, design, and size adequate reinforcements for the various types of damaged structures. This study focuses on the free vibration behavior of orthotropic laminated composite plates using a refined shear deformation theory. In these models, the distribution of transverse shear stresses is considered parabolic, satisfying the zero shear stress condition on the top and bottom surfaces of the plates without using shear correction factors. In this analysis, the equation of motion for simply supported thick laminated rectangular plates is obtained by using Hamilton's principle. The accuracy of the developed model is demonstrated by comparing our results with solutions derived from other higher-order models and with data found in the literature. Besides, a finite-element analysis is used to calculate the natural frequencies of laminated composite plates and is compared with those obtained by the analytical approach.
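
For readers less familiar with the variational formulation referenced here, the following generic statements (in assumed notation, not necessarily the authors' own symbols) summarize Hamilton's principle and the discrete free-vibration eigenvalue problem it leads to after finite-element discretization:

```latex
% Hamilton's principle over a time interval [t_1, t_2], with kinetic energy T
% and strain energy U (generic notation assumed here):
\delta \int_{t_1}^{t_2} \left( T - U \right)\, dt = 0
% After discretization (e.g., by finite elements), free vibration reduces to
% the generalized eigenvalue problem for natural frequencies \omega:
\left( \mathbf{K} - \omega^{2} \mathbf{M} \right) \mathbf{d} = \mathbf{0}
```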

Keywords: composite materials, laminated composite plate, finite-element analysis, free vibration

Procedia PDF Downloads 286
891 Identification of Membrane Foulants in Direct Contact Membrane Distillation for the Treatment of Reject Brine

Authors: Shefaa Mansour, Hassan Arafat, Shadi Hasan

Abstract:

Management of reverse osmosis (RO) brine has become a major area of research due to the environmental concerns associated with it. This study examined the feasibility of a direct contact membrane distillation (DCMD) system for the treatment of RO brine. The system displayed great potential in terms of its flux and salt rejection, where different operating conditions such as the feed temperature, feed salinity, and feed and permeate flow rates were varied. The highest flux of 16.7 LMH was reported with a salt rejection of 99.5%. Although DCMD has displayed potential for enhanced water recovery from highly saline solutions, one of the major drawbacks associated with the operation is the fouling of the membranes, which impairs system performance. An operational run of 77 hours for the treatment of RO brine of 56,500 ppm salinity was performed in order to investigate the impact of membrane fouling on the overall operation of the system over long operating times. Over this time period, the flux was observed to have decreased to a quarter of its initial value. The fouled membrane was characterized through different techniques for the identification of the organic and inorganic foulants deposited on the membrane surface. Infrared spectroscopy (IR) was used to identify the organic foulants, while SEM images displayed the surface characteristics of the membrane. As for the inorganic foulants, they were identified using X-ray diffraction (XRD), ion chromatography (IC), and energy dispersive spectroscopy (EDS). The major foulants found on the surface of the membrane were inorganic salts such as sodium chloride and calcium sulfate.

Keywords: brine treatment, membrane distillation, fouling, characterization

Procedia PDF Downloads 431
890 Study on Shelf Life and Textural Properties of Minimally Processed Mixed Fruits

Authors: Kaavya Rathnakumar

Abstract:

Minimally processed fruits have the attributes of convenience and fresh-like quality. In minimally processed products, the cells of the tissue are alive, and the essential nutrients and flavours are retained. Some of the procedures include washing, trimming, sorting, cutting, slicing and shredding. Fruits such as pineapple and guava were taken for the study of textural properties over a period of five days. After the various unit operations were performed, 50 g cubes of pineapple and guava were weighed. For determining the textural properties, a set of 12 samples was treated with 1% citric acid solution and dried for 5 minutes, while the remaining set of 12 samples was left untreated. Of the treated samples, 6 were vacuum packed and stored in the refrigerator, and the others were packed normally; the untreated samples were handled in the same way. In texture profile analysis, the force required for 1 cm penetration of a 2 mm cylindrical needle into the fruits was recorded for all packages. It was observed that for guava the fresh sample had a penetration force reading of 3250, and as the days increased the reading decreased to 357.4 for vacuum-packed refrigerated storage. In the case of pineapple, the reading for the fresh sample was 2325, which decreased to 26.3 on the fourth day and was very low on the fifth day for vacuum-packed refrigerated storage. In the case of the untreated samples, the fruits spoiled, possibly because of the lack of pre-treatment and packaging. Comparatively, it was found that vacuum-packed refrigerated samples had a longer shelf life than normally packed samples under ambient conditions.

Keywords: 1% citric acid solution, normal packed, refrigerated storage, vacuum packed

Procedia PDF Downloads 190
889 Agile Implementation of 'PULL' Principles in a Manufacturing Process Chain for Aerospace Composite Parts

Authors: Torsten Mielitz, Dietmar Schulz, York C. Roth

Abstract:

Market forecasts show a significant increase in the demand for aircraft within the next two decades, and production rates will be adapted accordingly. Improvements and optimizations in the industrial system are becoming more important to cope with future challenges in manufacturing and assembly. The highest quality standards have to be met for aerospace parts, while cost-effective production in industrial systems and methodologies is also a key driver. A look at other industries, e.g., automotive, shows well-established processes to streamline existing manufacturing systems. In this paper, the implementation of 'PULL' principles in an existing manufacturing process chain for a large-scale composite part is presented. A nonlinear extrapolation based on Little's Law showed a risk of a significant increase in the number of parts needed in the process chain to meet future demand. A project was set up to mitigate this risk, and the methodology was changed from a traditional milestone approach at the beginning towards an agile way of working at the end, in order to facilitate immediate benefits on the shop floor. Finally, delivery rates could be increased while avoiding more semi-finished parts in the process chain (work in progress and inventory) through the successful implementation of the 'PULL' philosophy on the shop floor between the work stations. Lessons learned during the running project as well as during the implementation and operations phases are discussed in order to share best practices.
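
For reference, Little's Law, on which the extrapolation above is based, can be stated as follows (the process-chain reading of the symbols is an assumption added here, not the authors' notation):

```latex
% Little's Law: the average number of items in a system equals the average
% arrival (throughput) rate times the average time an item spends in the system.
L = \lambda \, W
% Read for a manufacturing process chain (assumed interpretation):
\text{WIP} = \text{throughput rate} \times \text{lead time}
```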

Keywords: aerospace composite part manufacturing, PULL principles, shop-floor implementation, lessons learned

Procedia PDF Downloads 167
888 Development of a Tilt-Rotor Aircraft Model Using System Identification Technique

Authors: Ferdinando Montemari, Antonio Vitale, Nicola Genito, Giovanni Cuciniello

Abstract:

The introduction of tilt-rotor aircraft into the existing civilian air transportation system will provide beneficial effects due to the tilt-rotor's capability to combine the characteristics of a helicopter and a fixed-wing aircraft into one vehicle. The availability of reliable tilt-rotor simulation models supports the development of such vehicles. Indeed, simulation models are required to design automatic control systems that increase safety, reduce the pilot's workload and stress, and ensure the optimal aircraft configuration with respect to flight envelope limits, especially during the most critical flight phases such as conversion from helicopter to aircraft mode and vice versa. This article presents a process to build a simplified tilt-rotor simulation model, derived from the analysis of flight data. The model aims to reproduce the complex dynamics of the tilt-rotor during the in-flight conversion phase. It uses a set of scheduled linear transfer functions to relate the autopilot reference inputs to the most relevant rigid body state variables. The model also computes information about the rotor flapping dynamics, which is useful to evaluate the aircraft control margin in terms of rotor collective and cyclic commands. The rotor flapping model is derived through a mixed theoretical-empirical approach, which includes physical analytical equations (applicable to the helicopter configuration) and parametric corrective functions. The latter are introduced to best fit the actual rotor behavior and balance the differences existing between helicopter and tilt-rotor during flight. Time-domain system identification from flight data is exploited to optimize the model structure and to estimate the model parameters. The presented model-building process was applied to simulated flight data of the ERICA tilt-rotor, generated by using a high-fidelity simulation model implemented in the FlightLab environment. The validation of the obtained model was very satisfactory, confirming the validity of the proposed approach.
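
As an illustration of what "scheduled linear transfer functions" can look like in practice, the sketch below interpolates the gain and time constant of a first-order transfer function over the nacelle conversion angle and simulates a step response. All numbers, names, and the model structure are hypothetical; this is not the ERICA model described above.

```python
# Illustrative sketch only: a gain-scheduled first-order transfer function
# relating an autopilot reference input to one rigid-body state, with
# coefficients interpolated over the nacelle conversion angle.
import numpy as np
from scipy import signal

# Identified (hypothetical) gain K and time constant tau at a few scheduling
# points of the conversion angle (deg): 0 = airplane mode, 90 = helicopter mode
angles = np.array([0.0, 45.0, 90.0])
gains  = np.array([1.2, 0.9, 0.6])
taus   = np.array([0.8, 1.1, 1.5])

def scheduled_tf(angle_deg: float) -> signal.TransferFunction:
    """Interpolate K and tau at the requested conversion angle: K / (tau s + 1)."""
    k   = np.interp(angle_deg, angles, gains)
    tau = np.interp(angle_deg, angles, taus)
    return signal.TransferFunction([k], [tau, 1.0])

# Simulate the response to a step reference at a 45 deg conversion angle
t = np.linspace(0.0, 10.0, 500)
u = np.ones_like(t)
tout, y, _ = signal.lsim(scheduled_tf(45.0), U=u, T=t)
print(f"steady-state response ~ {y[-1]:.2f}")
```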

Keywords: flapping dynamics, flight dynamics, system identification, tilt-rotor modeling and simulation

Procedia PDF Downloads 192
887 A Normalized Non-Stationary Wavelet Based Analysis Approach for a Computer Assisted Classification of Laryngoscopic High-Speed Video Recordings

Authors: Mona K. Fehling, Jakob Unger, Dietmar J. Hecker, Bernhard Schick, Joerg Lohscheller

Abstract:

Voice disorders originate from disturbances of the vibration patterns of the two vocal folds located within the human larynx. Consequently, the visual examination of vocal fold vibrations is an integral part of the clinical diagnostic process. For an objective analysis of the vocal fold vibration patterns, the two-dimensional vocal fold dynamics are captured during sustained phonation using an endoscopic high-speed camera. In this work, we present an approach allowing a fully automatic analysis of the high-speed video data, including a computerized classification of healthy and pathological voices. The approach is based on a wavelet analysis of so-called phonovibrograms (PVG), which are extracted from the high-speed videos and comprise the entire two-dimensional vibration pattern of each vocal fold individually. Using a principal component analysis (PCA) strategy, a low-dimensional feature set is computed from each phonovibrogram. From the PCA-space, clinically relevant measures can be derived that objectively quantify vibration abnormalities. In the first part of the work, it will be shown that, using a machine learning approach, the derived measures are suitable to distinguish automatically between healthy and pathological voices. Within the approach, the formation of the PCA-space and consequently the extracted quantitative measures depend on the clinical data that were used to compute the principal components. Therefore, in the second part of the work, we propose a strategy to achieve a normalization of the PCA-space by registering it to a coordinate system using a set of synthetically generated vibration patterns. The results show that, owing to the normalization step, potential ambiguity of the parameter space can be eliminated. The normalization further allows a direct comparison of research results that are based on PCA-spaces obtained from different clinical subjects.
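
The following sketch illustrates the general PCA-plus-classifier idea on synthetic feature vectors; it is not the authors' pipeline, and the feature dimensions, classifier choice, and data are assumptions.

```python
# Schematic sketch (not the authors' pipeline): dimensionality reduction of
# flattened phonovibrogram feature vectors with PCA followed by a simple
# classifier separating healthy from pathological voices. Data are synthetic.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_per_class, n_features = 40, 500          # e.g., wavelet coefficients per PVG
healthy    = rng.normal(0.0, 1.0, (n_per_class, n_features))
pathologic = rng.normal(0.4, 1.2, (n_per_class, n_features))
X = np.vstack([healthy, pathologic])
y = np.array([0] * n_per_class + [1] * n_per_class)

# PCA to a low-dimensional feature space, then an SVM on the PCA scores
clf = make_pipeline(StandardScaler(), PCA(n_components=10), SVC(kernel="rbf"))
scores = cross_val_score(clf, X, y, cv=5)
print(f"cross-validated accuracy: {scores.mean():.2f}")
```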

Keywords: wavelet-based analysis, multiscale product, normalization, computer assisted classification, high-speed laryngoscopy, vocal fold analysis, phonovibrogram

Procedia PDF Downloads 261
886 Smart Automated Furrow Irrigation: A Preliminary Evaluation

Authors: Jasim Uddin, Rod Smith, Malcolm Gillies

Abstract:

Surface irrigation is the most popular irrigation method all over the world. However, two issues, low efficiency and heavy labour involvement, have concerned irrigators due to increasing water scarcity in recent years. To address these issues, a smart automated furrow irrigation system that can be operated using digital devices like a smartphone, iPad or computer was conceptualised, and a preliminary evaluation was conducted in this study. The smart automated system is an integration of commercially available software and hardware. It includes the real-time surface irrigation optimisation software SISCO and Rubicon Water's surface irrigation automation hardware and software. The automated system consists of an automatic water delivery system with 300 mm flexible pipes attached to both sides of a remotely controlled valve to operate the irrigation, a water level sensor to obtain the real-time inflow rate from the measured head in the channel, advance sensors to measure the advance time to particular points of the irrigated field, and a solar-powered telemetry system including a base station to link all the field sensors with the main server. On the basis of the field data, the software (SISCO) optimises the ongoing irrigation, determines the optimum cut-off for the particular irrigation, and sends this information to the control valve to stop the irrigation at the (cut-off) time. The preliminary evaluation shows that the automated surface irrigation worked reasonably well without manual intervention. The evaluation of farmer-managed irrigation events shows the potential to save a significant amount of water and labour. Substantial economic and social benefits are expected in rural industries by adopting this system. The future outcome of this work would be a fully tested commercial adaptive real-time furrow irrigation system able to compete with the pressurised alternatives of centre pivot or lateral move machines on capital cost and water and labour savings, but without the massive energy costs.
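
To make the closed-loop idea concrete, here is a schematic Python sketch of the sense-optimise-actuate cycle. The sensor model, rating curve, and the stand-in optimiser are invented placeholders; the actual system uses the SISCO optimiser and Rubicon Water hardware mentioned above.

```python
# Schematic sketch only: the sensing, optimisation (SISCO) and valve logic
# below are simulated placeholders, not the real Rubicon/SISCO interfaces.
import random

def read_channel_head() -> float:
    """Simulated water-level sensor reading in the supply channel (m)."""
    return 0.30 + random.uniform(-0.02, 0.02)

def head_to_inflow(head_m: float) -> float:
    """Convert measured head to inflow rate (L/s) via an assumed rating curve."""
    return 120.0 * head_m ** 1.5

def optimise_cutoff(inflow_ls: float, advance_min: list[float]) -> float:
    """Stand-in for the real-time optimiser: returns a cut-off time (min).
    Here simply scaled from the last advance reading; SISCO does far more."""
    return advance_min[-1] + 1200.0 / inflow_ls

def run_irrigation(advance_min: list[float], step_min: float = 1.0) -> None:
    elapsed = 0.0
    while True:
        inflow = head_to_inflow(read_channel_head())
        cutoff = optimise_cutoff(inflow, advance_min)
        if elapsed >= cutoff:
            print(f"closing valve at {elapsed:.0f} min (cut-off {cutoff:.0f} min)")
            break
        elapsed += step_min            # in the field: wait one minute, re-optimise

run_irrigation(advance_min=[35.0, 70.0, 110.0])
```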

Keywords: furrow irrigation, smart automation, infiltration, SISCO, real-time irrigation, adaptive control

Procedia PDF Downloads 444
885 Optimization of Personnel Selection Problems via Unconstrained Geometric Programming

Authors: Vildan Kistik, Tuncay Can

Abstract:

From a business perspective, cost and profit are two key factors for businesses. The intent of most businesses is to minimize cost in order to maximize or equalize profit, so as to provide the greatest benefit to themselves. However, the physical system is very complicated because of technological structures, rapidly intensifying competitive environments and similar factors. In such a system it is not easy to maximize profits or to minimize costs. Businesses must decide on the competence and suitability of the personnel to be recruited, taking many criteria into consideration when selecting personnel. There are many criteria for determining the competence and suitability of a staff member. Factors such as the level of education, experience, psychological and sociological position, and the human relationships that exist in the field are just some of the important factors in selecting staff for a firm. Personnel selection is a very important and costly process for businesses in today's competitive market. Although many mathematical methods have been developed for the selection of personnel, unfortunately the use of these mathematical methods is rarely encountered in real life. In this study, unlike other methods, an exponential programming model was established based on the probability of failure after the selected personnel start to work. With the necessary transformations, the problem was transformed into an unconstrained geometric programming problem, and the personnel selection problem is approached with the geometric programming technique. Personnel selection scenarios for a classroom were established with the help of the normal distribution, and optimum solutions were obtained. In the most appropriate solutions, the personnel selection process for the classroom was achieved with minimum cost.
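
For readers unfamiliar with the technique, a generic unconstrained geometric program and its standard convexifying change of variables are shown below (generic notation, not the specific model of the paper):

```latex
% A generic unconstrained geometric program: minimize a posynomial
\min_{x > 0} \; f(x) = \sum_{k=1}^{K} c_k \prod_{j=1}^{n} x_j^{a_{kj}},
\qquad c_k > 0 .
% With the change of variables y_j = \log x_j, the problem becomes convex:
\min_{y} \; \log \sum_{k=1}^{K} \exp\!\left( a_k^{\top} y + \log c_k \right).
```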

Keywords: geometric programming, personnel selection, non-linear programming, operations research

Procedia PDF Downloads 264
884 Integrated Microsystem for Multiplexed Genosensor Detection of Biowarfare Agents

Authors: Samuel B. Dulay, Sandra Julich, Herbert Tomaso, Ciara K. O'Sullivan

Abstract:

Early, rapid and definite detection of the presence of biowarfare agents, pathogens, viruses and toxins is required in different situations, including civil rescue and security units, homeland security, military operations, and public transportation security such as airports, metro and railway stations, due to their harmful effects on the human population. In this work, an electrochemical genosensor array that allows simultaneous detection of different biowarfare agents within an integrated microsystem, providing easy handling of the technology by combining a microfluidics setup with a multiplexing genosensor array, has been developed and optimised for the following targets: Bacillus anthracis, Brucella abortus and melitensis, Bacteriophage lambda, Francisella tularensis, Burkholderia mallei and pseudomallei, Coxiella burnetii, Yersinia pestis, and Bacillus thuringiensis. The electrode array was modified via co-immobilisation of a 1:100 (mol/mol) mixture of a thiolated probe and an oligoethyleneglycol-terminated monopodal thiol. PCR products from these relevant biowarfare agents were detected reproducibly through a sandwich assay format, with the target hybridised between a probe immobilised on the electrode surface and a horseradish peroxidase-labelled secondary reporter probe, which provided an enzyme-based electrochemical signal. Cross-reactivity studies against potentially interfering DNA sequences demonstrated the high selectivity and high throughput of the developed multiplexed genosensor microsystem.

Keywords: biowarfare agents, genosensors, multiplexed detection, microsystem

Procedia PDF Downloads 268
883 A Study on Marble-Slag Based Geopolymer Green Concrete

Authors: Zong-Xian Qiu, Ta-Wui Cheng, Wei-Hao Lee, Yung-Chin Ding

Abstract:

The greenhouse effect is an important issue since it is responsible for global warming, and carbon dioxide plays an important role in the greenhouse effect. Therefore, humans have a responsibility to reduce CO₂ emissions in their daily operations. Besides iron making and power plants, another major CO₂-producing industry is the cement industry. According to statistics from the EPA of Taiwan, producing 1 ton of Portland cement generates 520.29 kg of CO₂, and over 7.8 million tons of CO₂ are produced annually. Thus, developing low-CO₂-emission green concrete is an important issue, as it can reduce CO₂ emission problems in Taiwan. The purpose of this study is to use marble waste and slag as the raw materials to fabricate geopolymer green concrete. The results show that the marble-based geopolymer green concrete has good workability, and the compressive strength after curing for 28 days and 365 days reached 44 MPa and 53 MPa in an indoor environment and 28 MPa and 40.43 MPa in an outdoor environment. The acid resistance test shows that the geopolymer green concrete has good resistance to chemical attack, and its coefficient of permeability is better than that of Portland concrete. Compared with Portland cement products, the marble-based geopolymer not only reduces CO₂ emission problems but also provides great performance in practice. According to the experimental results, geopolymer concrete has great potential for further engineering development, and the new material could be expected to replace Portland cement products in the future.

Keywords: marble, slag, geopolymer, green concrete, CO₂ emission

Procedia PDF Downloads 134
882 Degradation of Heating, Ventilation, and Air Conditioning Components across Locations

Authors: Timothy E. Frank, Josh R. Aldred, Sophie B. Boulware, Michelle K. Cabonce, Justin H. White

Abstract:

Materials degrade at different rates in different environments depending on factors such as temperature, aridity, salinity, and solar radiation. Therefore, predicting asset longevity depends, in part, on the environmental conditions to which the asset is exposed. Heating, ventilation, and air conditioning (HVAC) systems are critical to building operations yet are responsible for a significant proportion of their energy consumption. HVAC energy use increases substantially with slight operational inefficiencies. Understanding the environmental influences on HVAC degradation in detail will inform maintenance schedules and capital investment, reduce energy use, and increase lifecycle management efficiency. HVAC inspection records spanning 14 years from 21 locations across the United States were compiled and associated with the climate conditions to which they were exposed. Three environmental features were explored in this study: average high temperature, average low temperature, and annual precipitation, as well as four non-environmental features. Initial insights showed no correlations between individual features and the rate of HVAC component degradation. Using neighborhood component analysis, however, the most critical features related to degradation were identified. Two models were considered, and results varied between them. However, longitude and latitude emerged as potentially the best predictors of average HVAC component degradation. Further research is needed to evaluate additional environmental features, increase the resolution of the environmental data, and develop more robust models to achieve more conclusive results.
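
As an illustration of the feature-weighting step described here, the sketch below applies scikit-learn's neighborhood component analysis to synthetic climate and location features against a class label obtained by binning degradation rates. The data, feature names, binning, and relevance measure are all assumptions, not the study's records or method details.

```python
# Illustrative sketch only (synthetic data, not the study's records): using
# neighborhood component analysis to rank features against a class label
# derived by binning component degradation rates into three classes.
import numpy as np
from sklearn.neighbors import NeighborhoodComponentsAnalysis
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
n = 200
X = np.column_stack([
    rng.normal(30, 8, n),       # average high temperature
    rng.normal(10, 6, n),       # average low temperature
    rng.normal(900, 300, n),    # annual precipitation
    rng.uniform(-125, -70, n),  # longitude
    rng.uniform(25, 49, n),     # latitude
])
# Hypothetical degradation rate driven mostly by the last two columns
rate = 0.02 * X[:, 3] + 0.03 * X[:, 4] + rng.normal(0, 0.3, n)
y = np.digitize(rate, np.quantile(rate, [0.33, 0.66]))  # 3 degradation classes

nca = NeighborhoodComponentsAnalysis(n_components=2, random_state=0)
nca.fit(StandardScaler().fit_transform(X), y)
# Column norms of the learned transform give a rough feature-relevance ranking
relevance = np.linalg.norm(nca.components_, axis=0)
for name, r in zip(["high T", "low T", "precip", "lon", "lat"], relevance):
    print(f"{name:>7}: {r:.2f}")
```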

Keywords: climate, degradation, HVAC, neighborhood component analysis

Procedia PDF Downloads 421
881 Social Identification among Employees: A System Dynamic Approach

Authors: Muhammad Abdullah, Salman Iqbal, Mamoona Rasheed

Abstract:

Social identity among people is an important source of pride and self-esteem; consequently, people strive to preserve a positive perception of their groups and collectives. The purpose of this paper is to explain the process of social identification and to highlight the underlying causal factors of social identity among employees. There is little research on how the social identity of employees is shaped in Pakistan's organizational culture. This study is based on social identity theory and uses a systems approach as its research methodology. The feedback loop approach is applied to explain the underlying key elements of employee behavior that collectively form social identity among social groups in the corporate arena. The findings of this study reveal that the affective, evaluative and cognitive components of an individual's personality are associated with social identification. The system dynamics feedback loop approach has revealed the underlying structure associated with social identity and social group formation, and the affective component proved to be the most strongly associated factor. This may also help in understanding how social groups become stable and how individuals act according to group requirements. The value of this paper lies in the understanding gained about the underlying key factors that play a crucial role in social group formation in organizations. It may help to understand the rationale behind how employees socially categorize themselves within organizations. It may also help to design effective and more cohesive teams for better operations and long-term results, and it may help to share knowledge among employees as well. The underlying structure behind social identification is highlighted with the help of system modeling.
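
As a purely conceptual illustration of the feedback-loop (stock-and-flow) modeling style used here, the short simulation below integrates a single reinforcing loop between social identification and its affective driver. The loop structure, coefficients, and variable names are illustrative assumptions, not the model estimated in the paper.

```python
# Conceptual stock-and-flow sketch only; the loop structure and coefficients
# are illustrative assumptions, not the model estimated in the paper.
import numpy as np

dt, steps = 0.25, 200
identification = np.zeros(steps)   # stock: strength of social identification
affective = 0.5                    # affective component (strongest factor above)
evaluative, cognitive = 0.3, 0.2   # weaker contributing components
identification[0] = 0.1

for t in range(1, steps):
    # Reinforcing loop: identification raises commitment, which feeds back,
    # saturating as identification approaches 1; a small outflow models turnover.
    inflow = affective * identification[t - 1] * (1 - identification[t - 1]) \
             + 0.05 * (evaluative + cognitive)
    outflow = 0.08 * identification[t - 1]
    identification[t] = identification[t - 1] + dt * (inflow - outflow)

print(f"equilibrium identification level ~ {identification[-1]:.2f}")
```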

Keywords: affective commitment, cognitive commitment, evaluative commitment, systems thinking

Procedia PDF Downloads 132
880 Porcelain Paste Processing by Robocasting 3D: Parameters Tuning

Authors: A. S. V. Carvalho, J. Luis, L. S. O. Pires, J. M. Oliveira

Abstract:

Additive manufacturing (AM) technologies have experienced remarkable growth in recent years due to the development and diffusion of a wide range of three-dimensional (3D) printing techniques. Nowadays, some techniques, like fused filament fabrication, are available to non-industrial users, while techniques like 3D printing, polyjet, selective laser sintering and stereolithography are mainly used in industry. Robocasting (R3D) shows great potential due to its ability to shape materials with a wide range of viscosities. Industrial porcelain compositions showing different rheological behaviour can be prepared and used as candidate materials to be processed by R3D; the use of this AM technique in industry is still very limited. In this work, a specific porcelain composition with suitable rheological properties is processed by R3D, and a systematic study of the printing parameter tuning is shown. The porcelain composition was formulated based on an industrial spray-dried porcelain powder. The powder particle size and morphology were analysed. The powders were mixed with water and an organic binder in a ball mill at 200 rpm for 24 hours. The batch viscosity was adjusted by the addition of an acid solution, and the batch was mixed again. The paste density, viscosity, zeta potential, particle size distribution and pH were determined. In the R3D system, different speed and pressure settings were studied to assess their impact on the fabrication of porcelain models. These models were dried at 80 °C for 24 hours and sintered in air at 1350 °C for 2 hours. The stability of the models and the quality of their walls and surfaces were studied, and their physical properties were assessed. The microstructure and layer adhesion were observed by SEM. The studied processing parameters have a high impact on model quality; moreover, they have a high impact on the stacking of the filaments. Adequate tuning of the parameters has a strong influence on the final properties of the porcelain models. This work contributes to a better assimilation of AM technologies in the ceramic industry. Acknowledgments: The RoboCer3D project – project of additive rapid manufacturing through 3D printing ceramic material (POCI-01-0247-FEDER-003350), financed by Compete 2020, PT 2020, European Regional Development Fund – FEDER through the International and Competitive Operational Program (POCI) under the PT2020 partnership agreement.

Keywords: additive manufacturing, porcelain, robocasting, R3D

Procedia PDF Downloads 158
879 The Results of the Systematic Archaeological Survey of Sistan (Iran)

Authors: Reza Mehrafarin, Nafiseh Mirshekari

Abstract:

The Sistan plain has always been a site for the settlement of various human societies, thanks to its favorable environmental conditions, such as abundant water from the Hirmand River and fertile sedimentary soil. Consequently, there was a need for a systematic archaeological investigation in the area. The survey had multiple objectives, with the most significant ones being the creation of an archaeological map and the identification and documentation of all ancient sites to establish their records and chronology. The survey was carried out in two phases, with each phase covering half of the area. The research method involved fieldwork, with two teams of professional archaeologists conducting a comprehensive survey of each of the 22 areas in Sistan. Once an area was identified, various recording, scientific, and field operations were executed to study the site. In the first phase (2007), an intensive field survey focused on the residential area of Sistan, including its northern and eastern regions. This phase resulted in the identification of 808 sites in eleven selected areas. In the second phase (2009), the desert area of Sistan, or its southern half, was surveyed, leading to the identification of approximately 853 sites. Overall, these surveys resulted in the identification of 1661 sites in Sistan. Among these sites, approximately 899 belong to the Bronze Age (late 4th millennium BCE to early 2nd millennium BCE). Of these sites, around 501 date back to the historical period, while nearly 590 sites pertain to the Islamic period. The archaeological investigations of both phases revealed that Sistan has consistently possessed fertile soil, abundant water, and a skilled workforce, making it capable of becoming Iran's granary and the center of the East once again if these conditions are restored.

Keywords: sistan, field surveys, archaeology, archaeological map

Procedia PDF Downloads 60
878 Cybersecurity Strategies for Protecting Oil and Gas Industrial Control Systems

Authors: Gaurav Kumar Sinha

Abstract:

The oil and gas industry is a critical component of the global economy, relying heavily on industrial control systems (ICS) to manage and monitor operations. However, these systems are increasingly becoming targets for cyber-attacks, posing significant risks to operational continuity, safety, and environmental integrity. This paper explores comprehensive cybersecurity strategies for protecting oil and gas industrial control systems. It delves into the unique vulnerabilities of ICS in this sector, including outdated legacy systems, integration with IT networks, and the increased connectivity brought by the Industrial Internet of Things (IIoT). We propose a multi-layered defense approach that includes the implementation of robust network security protocols, regular system updates and patch management, advanced threat detection and response mechanisms, and stringent access control measures. We illustrate the effectiveness of these strategies in mitigating cyber risks and ensuring the resilient and secure operation of oil and gas industrial control systems. The findings underscore the necessity for a proactive and adaptive cybersecurity framework to safeguard critical infrastructure in the face of evolving cyber threats.

Keywords: cybersecurity, industrial control systems, oil and gas, cyber-attacks, network security, IoT, threat detection, system updates, patch management, access control, cybersecurity awareness, critical infrastructure, resilience, cyber threats, legacy systems, IT integration, multi-layered defense, operational continuity, safety, environmental integrity

Procedia PDF Downloads 32
877 Study of the Hydrodynamic of Electrochemical Ion Pumping for Lithium Recovery

Authors: Maria Sofia Palagonia, Doriano Brogioli, Fabio La Mantia

Abstract:

In the last decade, lithium has become an important raw material in various sectors, in particular for rechargeable batteries. Its production is expected to grow more and more in the future, especially for mobile energy storage and electromobility. Until now, it has mostly been produced by the evaporation of water from salt lakes, which leads to huge water consumption, a large amount of waste and a strong environmental impact. A new, clean and faster electrochemical technique to recover lithium has recently been proposed: electrochemical ion pumping. It consists of capturing lithium ions from a feed solution by intercalation into a lithium-selective material, followed by releasing them into a recovery solution; both steps are driven by the passage of a current. In this work, a new configuration of the electrochemical cell is presented and used to study and optimize the intercalation of lithium ions through the hydrodynamic conditions. Lithium manganese oxide (LiMn₂O₄) was used as a cathode to intercalate lithium ions selectively during the reduction, while nickel hexacyanoferrate (NiHCF), used as an anode, releases positive ions. The effect of hydrodynamics on the process was studied by conducting the experiments at various fluxes of the electrolyte through the electrodes, in terms of charge circulated through the cell, captured lithium per unit mass of material, and overvoltage. The results show that flowing the electrolyte inside the cell improves lithium capture, in particular at low lithium concentration. Indeed, in Atacama feed solution, at 40 mM of lithium, the amount of lithium captured does not increase considerably with the flux of the electrolyte. Instead, when the concentration of lithium ions is 5 mM, the amount of captured lithium in a single capture cycle increases with increasing flux, leading to the conclusion that the slowest step in the process is the transport of the lithium ion in the liquid phase. Furthermore, an influence of the concentration of other cations in solution on the process performance was observed. In particular, lithium capture was performed using different concentrations of NaCl together with 5 mM of LiCl, and the results show that the presence of NaCl limits the amount of captured lithium. Further studies can be performed in order to understand why the full capacity of the material is not reached at the highest flow rate. This is probably due to the porous structure of the material, since the liquid phase is likely not affected by the convection flow inside the pores. This work proves that electrochemical ion pumping, with a suitable hydrodynamic design, enables the recovery of lithium from feed solutions at lower concentrations than the sources that are currently exploited, down to 1 mM.

Keywords: desalination battery, electrochemical ion pumping, hydrodynamic, lithium

Procedia PDF Downloads 204
876 An Information-Based Approach for Preference Method in Multi-Attribute Decision Making

Authors: Serhat Tuzun, Tufan Demirel

Abstract:

Multi-Criteria Decision Making (MCDM) is the modelling of real-life situations in order to solve the problems we encounter. It is a discipline that aids decision makers who are faced with conflicting alternatives in making an optimal decision. MCDM problems can be classified into two main categories, Multi-Attribute Decision Making (MADM) and Multi-Objective Decision Making (MODM), based on their different purposes and different data types. Although various MADM techniques have been developed for the problems encountered, their methodology is limited in modelling real life. Moreover, objective results are hard to obtain, and the findings are generally derived from subjective data. Although new and modified techniques have been developed by presenting new approaches such as fuzzy logic, comprehensive techniques, even though they are better at modelling real life, could not find a place in real-world applications because they are hard to apply due to their complex structure. These constraints restrict the development of MADM. This study aims to conduct a comprehensive analysis of preference methods in MADM and to propose an approach based on information. For this purpose, a detailed literature review has been conducted, and current approaches with their advantages and disadvantages have been analyzed. Then, the approach is introduced. In this approach, performance values of the criteria are calculated in two steps: first by determining the distribution of each attribute and standardizing it, then by calculating the information of each attribute as informational energy.
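
For clarity, informational energy is commonly taken in Onicescu's sense; writing the attribute's standardized value distribution as probabilities (notation assumed here, not necessarily the authors' own), it reads:

```latex
% Informational energy (Onicescu) of an attribute whose standardized values
% form a probability distribution p_1, \dots, p_m (assumed notation):
E = \sum_{i=1}^{m} p_i^{2}, \qquad \sum_{i=1}^{m} p_i = 1 .
```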

Keywords: literature review, multi-attribute decision making, operations research, preference method, informational energy

Procedia PDF Downloads 218
875 Design of Effective Decoupling Point in Build-To-Order Systems: Focusing on Trade-Off Relation between Order-To-Delivery Lead Time and Work in Progress

Authors: Zhiyong Li, Hiroshi Katayama

Abstract:

Since the 1990s, e-commerce and internet business have grown gradually around the world, and customers tend to express their demand attributes in terms of specification requirements on parts, components, product structure, etc. This paper deals with designing effective decoupling points for build-to-order systems in an e-commerce environment, which can be realized through trade-off analysis between two major criteria, customer order lead time and the value of work in progress. These KPIs are critical for a successful BTO business, namely time-based service effectiveness in coping with customer requirements for the first issue and cost effectiveness with risk-averse operations for the second issue. The approach of this paper consists of an investigation of successful businesses operating under the BTO scheme, development of manufacturing models for this scheme, quantitative evaluation of the proposed models by calculating the two KPI values under various decoupling point distributions, and discussion of the results produced by each pattern of decoupling point distribution, where some cases provide Pareto-optimal performance. To extract the relevant trade-off relation between the considered KPIs from the two-dimensional resultant performance, logic developed in former research work, i.e., by Katayama and Fonseca, is applied. The obtained characteristics are evaluated as effective information for managing BTO manufacturing businesses.

Keywords: build-to-order (BTO), decoupling point, e-commerce, order-to-delivery lead time (ODLT), work in progress (WIP)

Procedia PDF Downloads 320
874 Observations on the Eastern Red Sea Elasmobranchs: Data on Their Distribution and Ecology

Authors: Frappi Sofia, Nicolas Pilcher, Sander DenHaring, Royale Hardenstine, Luis Silva, Collin Williams, Mattie Rodrigue, Vincent Pieriborne, Mohammed Qurban, Carlos M. Duarte

Abstract:

Nowadays, elasmobranch populations are disappearing at a dangerous rate, mainly due to overexploitation, extensive fisheries, and climate change. The decline of these species can trigger a cascade effect, which may eventually lead to detrimental impacts on local ecosystems. Elasmobranchs in the Red Sea face one of the highest risks of extinction, mainly due to unregulated fisheries activities. Thus, it is of paramount importance to assess their current distribution and unveil their environmental preferences in order to improve conservation measures. To achieve this goal, important data were collected throughout the whole Red Sea during the Red Sea Decade Expedition (RSDE). Elasmobranch sightings were gathered through the use of submarines, remotely operated underwater vehicles (ROV), scuba diving operations, and helicopter surveys. Over a period of 5 months, we collected 891 sightings: 52 with submarines, 138 with the ROV, 67 with the scuba diving teams, and 634 from helicopters. In total, we observed 657 and 234 individuals from the superorders Batoidea and Selachimorpha, respectively. The most commonly encountered shark was Iago omanensis, a deep-water shark of the order Carcharhiniformes. Data on temperature, salinity, density, and dissolved oxygen were integrated with each sighting to reveal the favorable conditions for each species. Additionally, an extensive literature review on elasmobranch research in the Eastern Red Sea was carried out in order to obtain more data on local populations and to highlight patterns of their distribution.

Keywords: distribution, elasmobranchs, habitat, rays, red sea, sharks

Procedia PDF Downloads 76
873 Epoxomicin Affects Proliferating Neural Progenitor Cells of Rat

Authors: Bahaa Eldin A. Fouda, Khaled N. Yossef, Mohamed Elhosseny, Ahmed Lotfy, Mohamed Salama, Mohamed Sobh

Abstract:

Developmental neurotoxicity (DNT) entails the toxic effects imparted by various chemicals on the brain during the early childhood period. As human brains are vulnerable during this period, various chemicals would have their maximum effects on the brain during early childhood. Some toxicants have been confirmed to induce developmental toxic effects on the CNS, e.g., lead; however, most agents cannot be identified with certainty due to the limitations of the predictive toxicology models used. A novel alternative method that can overcome most of the limitations of conventional techniques is the use of the 3D neurosphere system. This in-vitro system can recapitulate most of the changes during the period of brain development, making it an ideal model for predicting neurotoxic effects. In the present study, we verified the possible DNT of epoxomicin, which is a naturally occurring selective proteasome inhibitor with anti-inflammatory activity. Rat neural progenitor cells were isolated from rat embryos (E14) extracted from placental tissue. The cortices were aseptically dissected out from the brains of the fetuses, and the tissues were triturated by repeated passage through a fire-polished constricted Pasteur pipette. The dispersed tissues were allowed to settle for 3 min. The supernatant was then transferred to a fresh tube and centrifuged at 1,000 g for 5 min. The pellet was placed in Hank's balanced salt solution and cultured as free-floating neurospheres in proliferation medium. Two doses of epoxomicin (1 µM and 10 µM) were applied to cultured neurospheres for a period of 14 days. For proliferation analysis, spheres were cultured in proliferation medium, and after 0, 4, 5, 11, and 14 days, sphere size was determined by software analysis. The diameter of each neurosphere was measured and exported to an Excel file for further statistical analysis. For viability analysis, trypsin-EDTA solution was added to the neurospheres for 3 min to dissociate them into a single-cell suspension, and viability was then evaluated by the Trypan Blue exclusion test. Epoxomicin was found to affect the proliferation and viability of neurospheres, and these effects were positively correlated with dose and time. This study confirms the DNT effects of epoxomicin in the 3D neurosphere model. The effects on proliferation suggest possible gross morphologic changes, while the decrease in viability suggests possible focal lesions on exposure to epoxomicin during early childhood.

Keywords: neural progenitor cells, epoxomicin, neurosphere, medical and health sciences

Procedia PDF Downloads 418
872 Investigating Visual Statistical Learning during Aging Using the Eye-Tracking Method

Authors: Zahra Kazemi Saleh, Bénédicte Poulin-Charronnat, Annie Vinter

Abstract:

This study examines the effects of aging on visual statistical learning, using eye-tracking techniques to investigate this cognitive phenomenon. Visual statistical learning is a fundamental brain function that enables the automatic and implicit recognition, processing, and internalization of environmental patterns over time. Some previous research has suggested the robustness of this learning mechanism throughout the aging process, underscoring its importance in the context of education and rehabilitation for the elderly. The study included three distinct groups of participants: 21 young adults (Mage: 19.73), 20 young-old adults (Mage: 67.22), and 17 old-old adults (Mage: 79.34). Participants were exposed to a series of 12 arbitrary black shapes organized into 6 pairs, each with different spatial configurations and orientations (horizontal, vertical, and oblique). These pairs were not explicitly revealed to the participants, who were instructed to passively observe 144 grids presented sequentially on the screen for a total duration of 7 min. In the subsequent test phase, participants performed a two-alternative forced-choice task in which they had to identify the most familiar pair in 48 trials, each consisting of a base pair and a non-base pair. Behavioral analysis using t-tests revealed notable findings. The mean score for the first group was significantly above chance, indicating the presence of visual statistical learning. Similarly, the second group also performed significantly above chance, confirming the persistence of visual statistical learning in young-old adults. Conversely, the third group, consisting of old-old adults, showed a mean score that was not significantly above chance. This lack of statistical learning in the old-old adult group suggests a decline in this cognitive ability with age. Preliminary eye-tracking results showed a decrease in the number and duration of fixations during the exposure phase for all groups. The main difference was that older participants fixated more often on empty cells than younger participants, likely due to a decline in the ability to ignore irrelevant information, resulting in a decrease in statistical learning performance.
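
The "above chance" comparison reported here is typically a one-sample t-test of per-participant accuracy against the 0.5 chance level of a two-alternative forced-choice task. A minimal sketch with synthetic scores (not the study's data) is shown below.

```python
# Minimal sketch (synthetic scores, not the study's data): testing whether a
# group's two-alternative forced-choice accuracy exceeds the 0.5 chance level.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_participants, n_trials, chance = 21, 48, 0.5
# Hypothetical per-participant proportions of correct choices
accuracy = rng.binomial(n_trials, 0.58, n_participants) / n_trials

t_stat, p_two_sided = stats.ttest_1samp(accuracy, popmean=chance)
p_one_sided = p_two_sided / 2 if t_stat > 0 else 1 - p_two_sided / 2
print(f"mean accuracy = {accuracy.mean():.2f}, t = {t_stat:.2f}, p = {p_one_sided:.4f}")
```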

Keywords: aging, eye tracking, implicit learning, visual statistical learning

Procedia PDF Downloads 73
871 Evaluation of Alpha-Glucosidase Inhibitory Effect of Two Plants from Brazilian Cerrado

Authors: N. A. P. Camaforte, P. M. P. Vareda, L. L. Saldanha, A. L. Dokkedal, J. M. Rezende-Neto, M. R. Senger, F. P. Silva-Jr, J. R. Bosqueiro

Abstract:

Diabetes mellitus is a disease characterized by a deficiency of insulin secretion and/or action, which results in hyperglycemia. Nowadays, acarbose is a medicine used by diabetic people to inhibit alpha-glucosidases, leading to a decrease in post-feeding glycaemia, but with low effectiveness and many side effects. Medicinal plants have been used for the treatment of many diseases, including diabetes, and their action occurs through the modulation of insulin-dependent processes, pancreas regeneration, or inhibition of glucose absorption by the intestine. Previous studies in our laboratory showed that treatment using two crude extracts of plants from the Brazilian cerrado was able to decrease fasting blood glucose and improve glucose tolerance in streptozotocin-diabetic mice. Because of this, and the importance of the search for new alternatives to decrease hyperglycemia, we decided to evaluate the inhibitory action of two plants from the Brazilian cerrado - B.H. and Myrcia bella. The enzymatic assay was performed in a 50 µL final volume using pancreatic α-amylase and maltase together with their commercial substrates. The inhibition potency (IC50) was determined by incubating eight different concentrations of both extracts with the enzymes for 5 minutes at 37ºC; afterwards, the substrate was added to start the reaction. The glucosidase assay was evaluated by measuring the quantity of p-nitrophenol at 405 nm in a 384-well automatic reader. The in vitro assay with the extracts of B.H. and M. bella showed an IC50 of 28.04 µg/mL and 16.93 µg/mL for α-amylase, and 43.01 µg/mL and 17 µg/mL for maltase, respectively. The M. bella extract showed a higher inhibitory activity for these enzymes than the B.H. extract. The crude extracts tested showed a higher inhibition rate for α-amylase, but were less effective against maltase in comparison with acarbose (IC50 of 36 µg/mL and 9 µg/mL, respectively). In conclusion, the crude extracts of B.H. and M. bella showed a potent inhibitory effect against α-amylase and showed promising results for the possible development of new medicines to treat diabetes with fewer or even no side effects.
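
As an illustration of how an IC50 value like those above can be obtained from an eight-point inhibition series, the sketch below fits a four-parameter logistic dose-response curve to synthetic data; the concentrations and activities are invented, not the study's measurements.

```python
# Illustrative sketch with synthetic data (not the study's measurements):
# estimating IC50 by fitting a four-parameter logistic dose-response curve.
import numpy as np
from scipy.optimize import curve_fit

def four_pl(conc, bottom, top, ic50, hill):
    """Four-parameter logistic: remaining activity vs. inhibitor concentration."""
    return bottom + (top - bottom) / (1.0 + (conc / ic50) ** hill)

# Eight hypothetical extract concentrations (ug/mL) and remaining enzyme activity (%)
conc = np.array([1, 3, 10, 20, 30, 50, 80, 120], dtype=float)
activity = np.array([98, 92, 75, 58, 45, 30, 18, 10], dtype=float)

params, _ = curve_fit(four_pl, conc, activity, p0=[5, 100, 25, 1])
bottom, top, ic50, hill = params
print(f"estimated IC50 ~ {ic50:.1f} ug/mL (Hill slope {hill:.2f})")
```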

Keywords: alpha-glucosidases, diabetes mellitus, glycaemia, medicinal plants

Procedia PDF Downloads 233
870 Corpus-Based Neural Machine Translation: Empirical Study of a Multilingual Corpus for Machine Translation of Opaque Idioms - Cloud AutoML Platform

Authors: Khadija Refouh

Abstract:

Culture-bound expressions have been a bottleneck for natural language processing (NLP) and comprehension, especially in the case of machine translation (MT). In the last decade, the field of machine translation has greatly advanced. Neural machine translation (NMT), which applies artificial intelligence (AI) and deep neural networks to language processing, has recently achieved considerable improvements in translation quality and outperforms previous traditional translation systems for many language pairs. Despite this progress, serious challenges remain when NMT translates culture-bound expressions, especially for low-resource language pairs such as Arabic-English and Arabic-French, in contrast to well-established pairs such as English-French. Machine translation of opaque idioms from English into French is likely to be more accurate than translating them from English into Arabic. For example, the Google Translate application rendered the sentence “What a bad weather! It rains cats and dogs.” as “يا له من طقس سيء! تمطر القطط والكلاب” in Arabic, an inaccurate literal translation, whereas its French output, “Quel mauvais temps! Il pleut des cordes.”, used the corresponding French idiom correctly. This paper aims to perform NMT experiments towards better translation of opaque idioms using a high-quality, clean multilingual corpus collected analytically from human-generated idiom translations. AutoML Translation, a Google neural machine translation platform, is used to train a custom translation model intended to improve the translation of opaque idioms. The automatic evaluation of the custom model is compared to Google NMT using the Bilingual Evaluation Understudy (BLEU) score, an algorithm for evaluating the quality of text that has been machine-translated from one natural language to another. Human evaluation is integrated to test the reliability of the BLEU score. The researcher will also examine syntactic, lexical, and semantic features using Halliday's functional theory.
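The kind of corpus-level BLEU comparison described above can be sketched in a few lines of Python with the sacrebleu library; this is only an illustrative sketch, not the paper's evaluation pipeline, and the hypothesis and reference sentences are hypothetical stand-ins.

# Minimal sketch: compare a custom model's output with a baseline using
# corpus-level BLEU via sacrebleu. Sentences below are hypothetical examples.
import sacrebleu

references = [["Quel mauvais temps ! Il pleut des cordes."]]  # one human reference per sentence
baseline_output = ["Quel mauvais temps ! Il pleut des chats et des chiens."]
custom_output   = ["Quel mauvais temps ! Il pleut des cordes."]

for name, hypotheses in [("baseline NMT", baseline_output), ("custom model", custom_output)]:
    bleu = sacrebleu.corpus_bleu(hypotheses, references)
    print(f"{name}: BLEU = {bleu.score:.1f}")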

Keywords: multilingual corpora, natural language processing (NLP), neural machine translation (NMT), opaque idioms

Procedia PDF Downloads 139
869 Staphylococcus aureus Exotoxin Recognition Using a Nanobiosensor Designed by an Antibody-Attached Nanosilica Method

Authors: Hamed Ahari, Behrouz Akbari Adreghani, Vadood Razavilar, Amirali Anvar, Sima Moradi, Hourieh Shalchi

Abstract:

Considering the ever-increasing population and the industrialization of food production, traditional techniques are no longer adequate for detecting the toxins produced in food products. The isolation time required is not cost-effective, and in most cases the precision of practical techniques such as bacterial cultivation suffers from operator error or from errors in the mixtures used. With the advent of nanotechnology, the design of selective and smart sensors is one of the great industrial advances in food quality control: within a few minutes, and with very high precision, such sensors can identify bacterial load and toxicity. Methods and Materials: The sensor used in this technique is based on attaching a bacterial antibody to a nanoparticle. As the absorption basis for recognizing the bacterial toxin, medium-sized silica nanoparticles of 10 nm (Notrino brand) in the form of a solid powder were used. A suspension of the agent-linked nanosilica, conjugated to the bacterial antibody, was then placed next to distilled-water samples contaminated with Staphylococcus aureus toxin at a 10⁻³ dilution, so that if any toxin were present in the sample, a toxin antigen-antibody complex would form. Finally, the light absorbance associated with binding of the antigen to the particle-attached antibody was measured by spectrophotometry. The 23S rRNA gene, which is conserved in all Staphylococcus spp., was also used as a control. The accuracy of the test was monitored using serial dilutions (down to 10⁻⁶) of an overnight culture of Staphylococcus spp. (OD600 of 0.02 ≈ 10⁷ cells); the sensitivity of PCR was 10 bacteria per mL within a few hours. Result: The results indicate that the sensor detects the toxin down to a 10⁻⁴ dilution. Additionally, the sensitivity of the sensor was examined over 60 days: it gave confirmatory results up to day 56 and began to decline thereafter. Conclusions: Compared with conventional approaches such as culture and biotechnology methods (for example, the polymerase chain reaction), the nanobiosensor offers accuracy, sensitivity, and specificity, and it reduces the detection time from hours to about 30 minutes.
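The serial-dilution arithmetic implied by the control experiment can be illustrated with a short Python sketch; the calibration (OD600 of 0.02 ≈ 10⁷ cells) and the PCR sensitivity figure come from the abstract, while the helper itself is hypothetical and not part of the study.

# Minimal sketch: estimate cell numbers across ten-fold serial dilutions and
# flag which dilutions remain above the stated PCR detection limit.
STOCK_CELLS_PER_ML = 1e7       # from the OD600 = 0.02 calibration in the abstract
PCR_LIMIT_CELLS_PER_ML = 10    # stated PCR sensitivity (10 bacteria per mL)

for step in range(0, 7):       # ten-fold dilutions down to 10^-6
    cells = STOCK_CELLS_PER_ML * 10 ** -step
    detectable = cells >= PCR_LIMIT_CELLS_PER_ML
    print(f"dilution 10^-{step}: ~{cells:.0e} cells/mL, above PCR limit: {detectable}")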

Keywords: exotoxin, nanobiosensor, recognition, Staphylococcus aureus

Procedia PDF Downloads 381
868 Organization Culture: Mediator of Information Technology Competence and IT Governance Effectiveness

Authors: Sonny Nyeko, Moses Niwe

Abstract:

Purpose: This research paper examined the mediating effect of organization culture in the relationship between information technology (IT) competence and IT governance effectiveness in Ugandan public universities. Design/methodology/approach: The paper adopted the MedGraph program, Sobel tests, and the Baron and Kenny approach for testing mediation effects. Findings: IT competence and organization culture are clear drivers of IT governance effectiveness in Ugandan public universities; however, organization culture shows only partial mediation of the relationship between IT competence and IT governance effectiveness. Research limitations/implications: The empirical investigation relies solely on public universities; future research in Ugandan private universities could be undertaken to compare results. Practical implications: Achieving IT governance effectiveness requires senior management to possess IT knowledge, a vital ingredient of IT competence. Moreover, organizations today ought to adopt cultures that keep them competitive in their businesses, with IT operations integrated rather than treated in isolation. Originality/value: Spending thousands of dollars on IT resources in advanced institutes of learning necessitates IT control. Preliminary studies in Ugandan public universities have revealed ineffective utilization of IT resources, and questions about IT governance, IT competence, and organization culture remain outstanding. This is therefore a new study testing the mediating effect of organization culture on the association between IT competence and IT governance effectiveness in Ugandan universities.
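For readers unfamiliar with the Sobel test mentioned above, the following minimal Python sketch shows how the test statistic is typically computed from the two mediation path coefficients and their standard errors; the coefficients below are hypothetical and are not the paper's results.

# Minimal sketch of a Sobel test for mediation (illustrative only).
import math
from scipy import stats

a, se_a = 0.45, 0.08   # path: IT competence -> organization culture (hypothetical)
b, se_b = 0.38, 0.07   # path: organization culture -> IT governance effectiveness (hypothetical)

sobel_z = (a * b) / math.sqrt(b**2 * se_a**2 + a**2 * se_b**2)
p_value = 2 * (1 - stats.norm.cdf(abs(sobel_z)))  # two-tailed p-value
print(f"Sobel z = {sobel_z:.2f}, p = {p_value:.4f}")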

Keywords: organization culture, IT competence, IT governance, effectiveness, mediating effect, universities, Uganda

Procedia PDF Downloads 130
867 Heuristic Algorithms for the Time-Based Weapon-Target Assignment Problem

Authors: Hyun Seop Uhm, Yong Ho Choi, Ji Eun Kim, Young Hoon Lee

Abstract:

Weapon-target assignment (WTA) is the problem of assigning available launchers to appropriate targets in order to defend assets. Various algorithms for WTA have been developed over the past years for both static and dynamic environments (denoted SWTA and DWTA, respectively). Because the problem must be solved within an operationally relevant computation time, WTA has suffered from limited solution efficiency, and as a result SWTA and DWTA problems have been solved only for restricted battlefield situations. In this paper, the general continuous-time situation is addressed through the time-based weapon-target assignment (TWTA) problem. TWTA is studied using a mixed integer programming model, and three heuristic algorithms are suggested: decomposed opt-opt, decomposed opt-greedy, and greedy. Although the TWTA optimization model becomes inefficient for large instances, the decomposed opt-opt algorithm, based on linearization and decomposition, extracted efficient solutions in a reasonable computation time. Because the scheduling part takes too long to solve with the optimization model, several greedy-based algorithms are also proposed; these yield lower performance values than the decomposed opt-opt algorithm but require very little computation time. Hence, this paper proposes an improved method by applying decomposition to TWTA, from which more practical and effective methods can be developed for using TWTA on the battlefield.
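To give a flavor of the greedy end of the algorithm spectrum, here is a minimal Python sketch of a generic greedy weapon-target assignment heuristic; it is illustrative only, not the paper's decomposed opt-opt or opt-greedy algorithms, and all values, kill probabilities, and variable names are hypothetical.

# Minimal sketch: greedy WTA heuristic. Each weapon is assigned, in turn, to the
# target whose expected surviving value it reduces the most. Data are hypothetical.
target_values = [10.0, 6.0, 8.0]       # value of each target
kill_prob = [                          # kill_prob[w][t]: probability weapon w destroys target t
    [0.7, 0.3, 0.5],
    [0.4, 0.6, 0.2],
    [0.5, 0.5, 0.6],
]

survival = [1.0] * len(target_values)  # probability each target survives the weapons assigned so far
assignment = []
for w, probs in enumerate(kill_prob):
    # marginal reduction in expected surviving value if weapon w engages target t
    gains = [target_values[t] * survival[t] * probs[t] for t in range(len(target_values))]
    best_t = max(range(len(gains)), key=gains.__getitem__)
    assignment.append((w, best_t))
    survival[best_t] *= (1.0 - probs[best_t])

print("assignment (weapon -> target):", assignment)
print("expected surviving target value:", sum(v * s for v, s in zip(target_values, survival)))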

Keywords: air and missile defense, weapon target assignment, mixed integer programming, piecewise linearization, decomposition algorithm, military operations research

Procedia PDF Downloads 334