Search results for: binary morphological operation
1123 In Silico Modeling of Drugs' Milk/Plasma Ratio in Human Breast Milk Using Structural Descriptors
Authors: Navid Kaboudi, Ali Shayanfar
Abstract:
Introduction: Feeding infants with safe milk from the beginning of their life is an important issue. Drugs used by mothers can alter the composition of milk in ways that are not only unsuitable but even toxic for infants, and consuming permeable drugs during that sensitive period can cause serious side effects in the infant. Due to the ethical restrictions on drug testing in humans, especially in women during lactation, computational approaches based on structural parameters can be useful. The aim of this study is to develop mechanistic models to predict the milk/plasma (M/P) ratio of drugs during the breastfeeding period based on their structural descriptors. Methods: Two hundred and nine different chemicals with known M/P ratios were used in this study. All drugs were categorized into two groups based on their M/P value, following the Malone classification: (1) drugs with M/P > 1, considered high risk, and (2) drugs with M/P ≤ 1, considered low risk. Thirty-eight chemical descriptors were calculated with ACD/Labs 6.00 and DataWarrior software to assess penetration during the breastfeeding period. Four specific models, based on the number of hydrogen bond acceptors, polar surface area, total surface area, and number of acidic oxygens, were then established for the prediction; these descriptors can predict penetration with acceptable accuracy. For the compounds retained in each model (N = 147, 158, 160, and 174 for models 1 to 4, respectively), binary logistic regression in SPSS 21 was performed to obtain a model predicting the penetration class of compounds. Only structural descriptors with p-value < 0.1 remained in the final model.
Results and discussion: Four different models, based on the number of hydrogen bond acceptors, polar surface area, total surface area, and number of acidic oxygens, were obtained to predict the penetration of drugs into human milk during the breastfeeding period. About 3-4% of milk consists of lipids, and the lipid content increases after parturition. Lipid-soluble drugs diffuse along with fats from plasma to the mammary glands, so lipophilicity plays a vital role in predicting the penetration class of drugs during lactation. The logistic regression models showed that compounds with a number of hydrogen bond acceptors above 5, PSA above 90, or TSA above 25 are less permeable to milk because they are less soluble in the milk fat fraction. The pH of milk is acidic, and because of that basic compounds tend to be more concentrated in milk than in plasma, while acidic compounds may reach lower concentrations in milk than in plasma. Conclusion: In this study, we developed four regression-based models to predict the penetration class of drugs during the lactation period. The obtained models can speed up the drug development process and save energy and costs. Milk/plasma ratio assessment of drugs requires multiple steps of animal testing, which raises its own ethical issues. QSAR modeling can help scientists reduce the amount of animal testing, and our models are suitable for that purpose.
Keywords: logistic regression, breastfeeding, descriptors, penetration
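The binary logistic classification described above can be sketched as follows. The coefficients and the 0.5 cutoff are illustrative placeholders, not the fitted SPSS values, and only two of the four descriptors are used for brevity:

```python
import math

# Hypothetical coefficients for one binary logistic model. The signs follow
# the abstract (HBA > 5 and PSA > 90 are associated with low penetration);
# the numeric values are invented for illustration.
B0, B_HBA, B_PSA = 3.0, -0.4, -0.02

def prob_high_penetration(hba, psa):
    """Probability that a drug falls in the high-risk class (M/P > 1)."""
    score = B0 + B_HBA * hba + B_PSA * psa
    return 1.0 / (1.0 + math.exp(-score))

def classify(hba, psa, cutoff=0.5):
    """Map the probability to the two Malone classes used in the study."""
    if prob_high_penetration(hba, psa) >= cutoff:
        return "high risk (M/P > 1)"
    return "low risk (M/P <= 1)"
```

With these placeholder coefficients, a drug with few hydrogen bond acceptors and a small polar surface area (e.g. HBA = 2, PSA = 40) lands in the high-risk class, while HBA = 8, PSA = 120 lands in the low-risk class, matching the direction of the thresholds reported above.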
Procedia PDF Downloads 72
1122 Android-Based Edugame Application for Earthquakes Disaster Mitigation Education
Authors: Endina P. Purwandari, Yolanda Hervianti, Feri Noperman, Endang W. Winarni
Abstract:
An earthquake is a disaster event that can threaten at any moment and cause damage and loss of life. An earthquake disaster mitigation game is a useful educational tool to enhance children's insight, knowledge, and understanding of how to respond to the impact of an earthquake. This study aims to build an educational game application on the Android platform as a learning medium for earthquake mitigation education and to determine the effect of the application on children's understanding of earthquake disaster mitigation. The method used was research and development: the development stage produced the edugame application for earthquake mitigation education, and the research stage involved elementary school students as a sample to test the developed application. The research results were a validated Android-based edugame application and a measurement of its effect on children's understanding. The application contains an earthquake simulation video, an earthquake mitigation video, and a game consisting of three stages: before the earthquake, during the earthquake, and after the earthquake. The feasibility test showed that the application falls into the 'Excellent' category, with average percentages of 76% for operation, 67% for appearance, and 74% for content. Students' responses scored 80%, indicating a positive response to the application. The understanding test showed that the average pretest score was 71.33 and the post-test score was 97.00. The t-test gave a t value of 8.02, exceeding the table value of 2.001. This indicates that the Android-based earthquake disaster mitigation edugame application affects children's understanding of earthquake disaster mitigation.
Keywords: android, edugame, mitigation, earthquakes
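The reported pretest/post-test comparison can be reproduced in outline with a paired-samples t-test. The scores below are illustrative, since the study's raw data are not given:

```python
import math

def paired_t(pre, post):
    """Paired-samples t statistic for pretest/post-test scores:
    t = mean(d) / (sd(d) / sqrt(n)), with d the per-student differences."""
    assert len(pre) == len(post)
    d = [b - a for a, b in zip(pre, post)]
    n = len(d)
    mean_d = sum(d) / n
    var_d = sum((x - mean_d) ** 2 for x in d) / (n - 1)  # sample variance
    return mean_d / math.sqrt(var_d / n)

# Illustrative per-student scores (not the study's data)
pre = [70, 72, 68, 74, 71]
post = [95, 98, 96, 99, 97]
t = paired_t(pre, post)
```

If the computed t exceeds the critical table value for the given degrees of freedom (2.776 for df = 4 at the 5% level here; the study cites 2.001 for its larger sample), the improvement is statistically significant.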
Procedia PDF Downloads 364
1121 Development of a Diagnostic Device to Predict Clinically Significant Inflammation Associated with Cardiac Surgery
Authors: Mohamed Majrashi, Patricia Connolly, Terry Gourlay
Abstract:
Cardiopulmonary bypass is known to cause an inflammatory response during open heart surgery, involving the initiation of different cascades such as coagulation, the complement system, and cytokines. Although the immune system is the body's key defense mechanism against external assault, when overexpressed it can be injurious to the patient, particularly in a cohort of patients with a heightened and uncontrolled response. In these patients the inflammatory response develops to an exaggerated level, resulting in autoimmune injury, and may lead to poor postoperative outcomes (systemic inflammatory response syndrome and multi-organ failure). Previous studies by this group have suggested a correlation between the level of IL-6 measured in a patient's blood before surgery and after polymeric activation and the inflammatory response observed during surgery. Based on these findings, the present work aims to use this response to develop a test that can identify high-risk patients before open heart surgery. The work will be accomplished in three main phases: pilot in-vitro studies, device development, and clinical investigation. Current findings from studies using animal blood, employing DEHP and DEHP-plasticized PVC materials as the activator, support the earlier results in patient samples. Having established this relationship, ongoing work will focus on developing an activated lateral-flow strip technology as a screening device for heightened inflammatory propensity.
Keywords: cardiopulmonary bypass, cytokines, inflammatory response, overexpression
Procedia PDF Downloads 284
1120 Patterns of TV Simultaneous Interpreting of Emotive Overtones in Trump’s Victory Speech from English into Arabic
Authors: Hanan Al-Jabri
Abstract:
Simultaneous interpreting is deemed by many scholars to be the most challenging mode of interpreting. The special constraints involved in this task, including time pressure, different linguistic systems, and stress, pose a great challenge to most interpreters, and these constraints are likely to intensify when the interpreting is done live on TV. The TV interpreter is exposed to a wide variety of audiences with different backgrounds and needs and is mostly asked to interpret high-profile events, which raises stress levels and further complicates the task. Under these constraints, which require fast and efficient performance, the TV interpreters of four channels were asked to render Trump's victory speech into Arabic, while also dealing with the burden of rendering the English emotive overtones employed by the speaker into a wholly different linguistic system. The current study investigates how TV interpreters working in the simultaneous mode handled this task; it explores and evaluates the interpreters' linguistic choices and whether the original emotive effect was maintained, upgraded, downgraded, or abandoned in their renditions. It also explores the difficulties and challenges that emerged during this process and may have influenced the interpreters' linguistic choices. To achieve these aims, the study analysed Trump's victory speech, delivered on November 9, 2016, along with four Arabic simultaneous interpretations produced by four TV channels: Al-Jazeera, RT, CBC News, and France 24. The analysis relied on two frameworks: a macro and a micro framework. The former presents an overview of the wider context of the English speech and of the speaker and his political background, to help understand the linguistic choices he made in the speech; the latter investigates the linguistic tools the speaker employed to stir people's emotions.
These tools were investigated based on Shamaa's (1978) classification of emotive meaning according to linguistic level: phonological, morphological, syntactic, and semantic/lexical. The micro framework also investigates the patterns of rendition detected in the Arabic deliveries. The analysis identified different rendition patterns, including parallel rendition, approximation, condensation, elaboration, transformation, expansion, generalisation, explicitation, paraphrase, and omission. The emerging patterns were influenced by factors such as the speedy and continuous delivery of some stretches and highly dense segments, among others. The study aims to contribute to a better understanding of TV simultaneous interpreting between English and Arabic, as well as of the practices of TV interpreters when rendering emotiveness, especially since little is known about interpreting practices in the field of TV, particularly between Arabic and English.
Keywords: emotive overtones, interpreting strategies, political speeches, TV interpreting
Procedia PDF Downloads 159
1119 Characterization of New Sources of Maize (Zea mays L.) Resistance to Sitophilus zeamais (Coleoptera: Curculionidae) Infestation in Stored Maize
Authors: L. C. Nwosu, C. O. Adedire, M. O. Ashamo, E. O. Ogunwolu
Abstract:
The maize weevil, Sitophilus zeamais Motschulsky is a notorious pest of stored maize (Zea mays L.). The development of resistant maize varieties to manage weevils is a major breeding objective. The study investigated the parameters and mechanisms that confer resistance on a maize variety to S. zeamais infestation using twenty elite maize varieties. Detailed morphological, physical and chemical studies were conducted on whole-maize grain and the grain pericarp. Resistance was assessed at 33, 56, and 90 days post infestation using weevil mortality rate, weevil survival rate, percent grain damage, percent grain weight loss, weight of grain powder, oviposition rate and index of susceptibility as indices rated on a scale developed by the present study and on Dobie’s modified scale. Linear regression models that can predict maize grain damage in relation to the duration of storage were developed and applied. The resistant varieties identified particularly 2000 SYNEE-WSTR and TZBRELD3C5 with very high degree of resistance should be used singly or best in an integrated pest management system for the control of S. zeamais infestation in stored maize. Though increases in the physical properties of grain hardness, weight, length, and width increased varietal resistance, it was found that the bases of resistance were increased chemical attributes of phenolic acid, trypsin inhibitor and crude fibre while the bases of susceptibility were increased protein, starch, magnesium, calcium, sodium, phosphorus, manganese, iron, cobalt and zinc, the role of potassium requiring further investigation. Characters that conferred resistance on the test varieties were found distributed in the pericarp and the endosperm of the grains. Increases in grain phenolic acid, crude fibre, and trypsin inhibitor adversely and significantly affected the bionomics of the weevil on further assessment. The flat side of a maize grain at the point of penetration was significantly preferred by the weevil. 
Why the south area of the flattened side of a maize grain was significantly preferred by the weevil is not yet clear, although grain face type appeared to be a contributing factor in the study. The preference shown for the south area of the grain's flat side has implications for seed viability. The study identified antibiosis, preference, antixenosis, and host evasion as the mechanisms of maize post-harvest resistance to Sitophilus zeamais infestation.
Keywords: maize weevil, resistant, parameters, mechanisms, preference
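The index of susceptibility used above (Dobie's scale) is conventionally computed from the number of F1 progeny and the median development period; a minimal sketch, with illustrative values rather than the study's measurements:

```python
import math

def dobie_index(f1_progeny, median_dev_days):
    """Dobie's index of susceptibility:
    100 * ln(number of F1 adults emerged) / median development period (days).
    Lower values indicate a more resistant variety."""
    return 100.0 * math.log(f1_progeny) / median_dev_days

# Illustrative values: many fast-developing progeny vs. few slow ones
susceptible = dobie_index(f1_progeny=50, median_dev_days=35)
resistant = dobie_index(f1_progeny=5, median_dev_days=45)
```

Varieties are then binned into resistance classes by thresholds on this index; the exact class boundaries vary between studies (the abstract mentions both a scale developed in the study and Dobie's modified scale).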
Procedia PDF Downloads 307
1118 Low Frequency Ultrasonic Degassing to Reduce Void Formation in Epoxy Resin and Its Effect on the Thermo-Mechanical Properties of the Cured Polymer
Authors: A. J. Cobley, L. Krishnan
Abstract:
The demand for multi-functional lightweight materials in sectors such as automotive, aerospace, and electronics is growing, and for this reason fibre-reinforced epoxy polymer composites are widely utilized. The fibre reinforcement is mainly responsible for the strength and stiffness of the composite, whilst the main role of the epoxy polymer matrix is to distribute the applied load among the fibres and to protect them from harmful environmental conditions. The superior properties of fibre-reinforced composites are achieved by combining the best properties of both constituents. Although factors such as the chemical nature of the epoxy and how it is cured strongly influence the properties of the epoxy matrix, the method of mixing and degassing the resin can also have a significant impact. The production of a fibre-reinforced epoxy polymer composite usually begins with mixing the epoxy pre-polymer with a hardener and accelerator. Mechanical mixing is often employed for this stage, but such processes naturally introduce air into the mixture, which, if entrapped, leads to voids in the cured polymer. Degassing is therefore normally performed after mixing, often by placing the epoxy resin mixture in a vacuum chamber. Although this is reasonably effective, it is an additional process stage; a mixing method that simultaneously degassed the resin mixture would give shorter production times, more effective degassing, and fewer voids in the final polymer. In this study, the effects of four different methods for mixing and degassing the pre-polymer with hardener and accelerator were investigated. The first two methods were manual stirring and magnetic stirring, both followed by vacuum degassing. The other two were ultrasonic mixing/degassing using a 40 kHz ultrasonic bath and a 20 kHz ultrasonic probe.
The cured cast resin samples were examined with a scanning electron microscope (SEM) and an optical microscope, and analysed with the ImageJ software to study morphological changes, void content, and void distribution. Three-point bending tests and differential scanning calorimetry (DSC) were also performed to determine the mechanical and thermal properties of the cured resin. The 20 kHz ultrasonic probe gave the lowest percentage of voids of all the mixing methods in the study. In addition, the percentage of voids found when a 40 kHz ultrasonic bath was used to mix/degas the epoxy polymer mixture was only slightly higher than with magnetic stirring followed by vacuum degassing. The effect of ultrasonic mixing/degassing on the thermal and mechanical properties of the cured resin will also be reported. The results suggest that low-frequency ultrasound is an effective means of mixing/degassing a pre-polymer mixture and could enable a significant reduction in production times.
Keywords: degassing, low frequency ultrasound, polymer composites, voids
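The ImageJ-style void-content measurement amounts to thresholding a grayscale micrograph and counting the pixels classified as voids. A minimal sketch with a toy image; the threshold value and the assumption that voids image darker than the resin matrix are illustrative:

```python
def void_fraction(gray_image, threshold=128):
    """Fraction of pixels classified as void in a grayscale micrograph,
    mimicking a threshold-based area measurement. Pixels darker than
    the threshold are counted as void (an illustrative convention)."""
    total = 0
    voids = 0
    for row in gray_image:
        for px in row:
            total += 1
            if px < threshold:
                voids += 1
    return voids / total

# Toy 4x4 "micrograph": four dark (void) pixels out of sixteen
img = [
    [200, 210, 30, 25],
    [205, 220, 35, 28],
    [215, 210, 200, 205],
    [220, 200, 210, 215],
]
pct_voids = 100 * void_fraction(img)
```

Real micrographs would of course be loaded from image files and may need contrast normalization before a single global threshold is meaningful.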
Procedia PDF Downloads 296
1117 Co-Alignment of Comfort and Energy Saving Objectives for U.S. Office Buildings and Restaurants
Authors: Lourdes Gutierrez, Eric Williams
Abstract:
Post-occupancy research shows that only 11% of commercial buildings meet the ASHRAE thermal comfort standard: many buildings are too warm in winter and/or too cool in summer, wasting energy without providing comfort. This paper estimates the potential energy savings in U.S. offices and restaurants if thermostat settings are chosen according to the updated ASHRAE 55-2013 comfort model, which accounts for outdoor temperature and clothing choice in different climate zones. eQUEST building models are calibrated to reproduce the aggregate energy consumption reported in the U.S. Commercial Building Energy Consumption Survey. Changes in energy consumption due to the new settings are analyzed for 14 cities in different climate zones, and the results are extrapolated to estimate potential national savings. It is found that, depending on the climate zone, each degree increase in the summer setting saves 0.6-1.0% of total building electricity consumption, and each degree the winter setting is lowered saves 1.2-8.7% of total building natural gas consumption. With the new thermostat settings, national savings are 2.5% of the total consumed in all office buildings and restaurants, amounting to 69.6 million GJ annually, comparable to total U.S. solar PV generation in 2015. The goals of improved comfort and energy/economic savings are thus co-aligned, raising the importance of thermostat management as an energy efficiency strategy.
Keywords: energy savings quantifications, commercial building stocks, dynamic clothing insulation model, operation-focused interventions, energy management, thermal comfort, thermostat settings
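The per-degree figures above can be turned into a simple back-of-the-envelope estimator. The linear per-degree extrapolation and the default percentages are assumptions drawn from the ranges quoted in the abstract, not a replacement for the eQUEST simulations:

```python
def electricity_savings_pct(degrees_raised, pct_per_degree=0.8):
    """Summer electricity savings (% of total building electricity) from
    raising the cooling setpoint; the abstract reports 0.6-1.0% per degree
    depending on climate zone (0.8 is a mid-range placeholder)."""
    return degrees_raised * pct_per_degree

def gas_savings_pct(degrees_lowered, pct_per_degree=1.2):
    """Winter natural-gas savings from lowering the heating setpoint;
    the abstract reports 1.2-8.7% per degree."""
    return degrees_lowered * pct_per_degree

# National scale: 2.5% of total office/restaurant consumption = 69.6 million GJ,
# implying a total stock consumption of roughly 2,784 million GJ per year.
total_stock_gj = 69.6e6 / 0.025
```

For example, a 2-degree summer setpoint increase saves about 1.6% of building electricity under the mid-range assumption.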
Procedia PDF Downloads 302
1116 Cognitive Model of Analogy Based on Operation of the Brain Cells: Glial, Axons and Neurons
Authors: Ozgu Hafizoglu
Abstract:
Analogy is an essential tool of human cognition that connects diffuse and diverse systems through attributional, deep-structural, and causal relations that are essential to learning, to innovation in artificial worlds, and to discovery in science. The Cognitive Model of Analogy (CMA) guides and creates information pattern transfer within and between domains and disciplines in science. This paper presents CMA as an evolutionary approach to scientific research. The model addresses the challenge of deep uncertainty about the future, emphasizing the need for system flexibility so that the reasoning methodology can adapt to changing conditions. In this paper, the model of analogical reasoning is built on brain cells and their fractal and operational forms within the system itself, with visualization techniques used to show correspondences. The problem-solving process is divided into distinct phases: encoding, mapping, inference, and response. The system is related to brain activation in each of these phases, with an emphasis on better visualization of the brain cells (glial cells, axons, axon terminals, and neurons) relative to the matching conditions of analogical reasoning and relational information. It is found that the encoding, mapping, inference, and response processes in four-term analogical reasoning correspond to the fractal and operational forms of the brain cells: glia, axons, and neurons.
Keywords: analogy, analogical reasoning, cognitive model, brain and glials
Procedia PDF Downloads 185
1115 Sentiment Analysis of Chinese Microblog Comments: Comparison between Support Vector Machine and Long Short-Term Memory
Authors: Xu Jiaqiao
Abstract:
Text sentiment analysis is an important branch of natural language processing, widely used in public opinion analysis and web browsing recommendations. At present, mainstream sentiment analysis methods fall into three categories: methods based on a sentiment dictionary, on traditional machine learning, and on deep learning. This paper analyzes and compares the advantages and disadvantages of the SVM method from traditional machine learning and the Long Short-Term Memory (LSTM) method from deep learning for Chinese sentiment analysis, using Chinese comments on Sina Microblog as the dataset. First, the original comment dataset obtained by a web crawler is classified and labelled, then segmented with Jieba and stripped of stop words. Next, text feature vectors are extracted and document word vectors are built to facilitate model training. Finally, SVM and LSTM models are trained separately. Accuracy evaluation shows that the LSTM model reaches 85.80%, while the SVM reaches 91.07%; at the same time, the LSTM needs only 2.57 seconds, whereas the SVM model needs 6.06 seconds. The paper therefore concludes that, compared with the SVM model, the LSTM model is worse in accuracy but faster in processing speed.
Keywords: sentiment analysis, support vector machine, long short-term memory, Chinese microblog comments
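The preprocessing pipeline described above (label, segment, remove stop words, vectorize) can be sketched as follows. To keep the sketch self-contained, Jieba segmentation is replaced by whitespace tokenisation and the stop-word list is illustrative; the resulting count vectors are what an SVM or LSTM would then be trained on:

```python
# Illustrative stop-word list; a real Chinese pipeline would use Jieba
# segmentation plus a standard Chinese stop-word list.
STOP_WORDS = {"the", "a", "is", "of"}

def tokenize(comment):
    """Segment a comment and drop stop words."""
    return [w for w in comment.lower().split() if w not in STOP_WORDS]

def build_vocab(comments):
    """Map each vocabulary word to a fixed feature index."""
    vocab = sorted({w for c in comments for w in tokenize(c)})
    return {w: i for i, w in enumerate(vocab)}

def vectorize(comment, vocab):
    """Bag-of-words count vector for one comment."""
    vec = [0] * len(vocab)
    for w in tokenize(comment):
        if w in vocab:
            vec[vocab[w]] += 1
    return vec

comments = ["the service is great", "a terrible waste of time"]
vocab = build_vocab(comments)
vectors = [vectorize(c, vocab) for c in comments]
```

In the paper's setup these vectors (or learned word embeddings, for the LSTM) feed the classifier; the comparison then measures accuracy and wall-clock training/inference time for each model.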
Procedia PDF Downloads 94
1114 Performance and Processing Evaluation of Solid Oxide Cells by Co-Sintering of GDC Buffer Layer and LSCF Air Electrode
Authors: Hyun-Jong Choi, Minjun Kwak, Doo-Won Seo, Sang-Kuk Woo, Sun-Dong Kim
Abstract:
Solid Oxide Cell (SOC) systems can contribute to the transition to a hydrogen society by serving as power and hydrogen generators through electrochemical reactions, with high efficiency at high operating temperature (>750 ℃). The La1-xSrxCo1-yFeyO3 (LSCF) air electrode suffers stability degradation due to reaction with, and delamination from, the yttria-stabilized zirconia (YSZ) electrolyte in water electrolysis mode. To mitigate this, SOCs need a gadolinium-doped ceria (GDC) buffer layer between the electrolyte and the air electrode. However, the GDC buffer layer requires a high sintering temperature, which causes a reaction with the YSZ electrolyte. This study achieved low-temperature sintering of the GDC layer by applying Cu oxide as a sintering aid, investigating the effect of the copper additive in lowering the sintering temperature for the construction of solid oxide fuel cells (SOFCs). GDC buffer layers with 0.25-10 mol% CuO sintering aid were prepared by reacting GDC powder with a copper nitrate solution followed by heating at 600 ℃. The sintering of CuO-added GDC powder was optimized by investigating the linear shrinkage, microstructure, grain size, ionic conductivity, and activation energy of CuO-GDC electrolytes at temperatures ranging from 1100 to 1400 ℃. The sintering temperature of the CuO-GDC electrolyte decreases from 1400 ℃ to 1100 ℃ with the CuO sintering aid, and the ionic conductivity shows a maximum at 0.5 mol% CuO, while the addition of CuO has no significant effect on the activation energy of the GDC electrolyte. GDC-LSCF layers were co-sintered at 1050 and 1100 ℃, and button cell tests were carried out at 750 ℃.
Keywords: co-sintering, GDC-LSCF, sintering aid, solid oxide cells
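The activation energy reported above is conventionally extracted from the Arrhenius form of ionic conductivity. A minimal sketch with synthetic data; the pre-factor and the ~0.8 eV value (a typical order of magnitude for doped ceria) are illustrative, not the paper's measurements:

```python
import math

KB_EV = 8.617333e-5  # Boltzmann constant in eV/K

def activation_energy(t1, sigma1, t2, sigma2):
    """Activation energy (eV) from two ionic-conductivity measurements,
    assuming the Arrhenius form sigma*T = A * exp(-Ea / (kB*T))."""
    lhs = math.log((sigma1 * t1) / (sigma2 * t2))
    return KB_EV * lhs / (1.0 / t2 - 1.0 / t1)

def sigma(T, A=1.0e5, ea=0.80):
    """Synthetic conductivity generated from a known activation energy."""
    return A * math.exp(-ea / (KB_EV * T)) / T

# Two synthetic measurement points (600 and 800 degrees C)
ea_fit = activation_energy(873.0, sigma(873.0), 1073.0, sigma(1073.0))
```

With real data one would fit ln(sigma*T) against 1/T over many temperatures rather than using just two points.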
Procedia PDF Downloads 245
1113 Optimization of Structures with Mixed Integer Non-linear Programming (MINLP)
Authors: Stojan Kravanja, Andrej Ivanič, Tomaž Žula
Abstract:
This contribution focuses on structural optimization in civil engineering using mixed integer non-linear programming (MINLP). MINLP is a versatile method that handles continuous and discrete optimization variables simultaneously: continuous variables optimize parameters such as dimensions, stresses, masses, or costs, while discrete variables represent binary decisions determining the presence or absence of structural elements, as well as the choice of discrete materials and standard sections. The optimization process is divided into three main steps. First, a mechanical superstructure is generated, comprising a variety of topology, material, and dimension alternatives. Next, a MINLP model is formulated to encapsulate the optimization problem. Finally, an optimal solution is sought in the direction of the defined objective function while respecting the structural constraints. The economic or mass objective function, covering the material and labor costs of a structure, is subjected to the constraints known from structural analysis. These constraints include equations for the calculation of internal forces and deflections, as well as equations for the dimensioning of structural components (in accordance with the Eurocode standards). Given the complex, non-convex, and highly non-linear nature of optimization problems in civil engineering, the Modified Outer-Approximation/Equality-Relaxation (OA/ER) algorithm is applied. This algorithm alternately solves non-linear programming (NLP) subproblems and mixed-integer linear programming (MILP) master problems, thereby gradually refining the solution space toward the optimal solution.
The NLP corresponds to the continuous optimization of parameters (with fixed topology, discrete materials, and standard dimensions, all determined in the previous MILP), while the MILP involves a global approximation to the superstructure of alternatives, in which a new topology, materials, and standard dimensions are determined. For a convex problem, the optimization stops when the MILP solution becomes better than the best NLP solution; otherwise, it terminates when the NLP solution can no longer be improved. While the OA/ER algorithm, like all other algorithms, does not guarantee global optimality in the presence of non-convex functions, various modifications, including convexity tests, are implemented in OA/ER to mitigate these difficulties. The effectiveness of the proposed MINLP approach is demonstrated by its application to various structural optimization tasks, such as mass optimization of steel buildings, cost optimization of timber halls, composite floor systems, etc., for which dedicated optimization models have been developed. The MINLP optimizations, facilitated by the user-friendly software package MIPSYN, provide insight into mass- or cost-optimal solutions, optimal structural topologies, and optimal material and standard cross-section choices, confirming MINLP as a valuable method for the optimization of structures in civil engineering.
Keywords: MINLP, mixed-integer non-linear programming, optimization, structures
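The NLP/MILP interplay can be illustrated on a toy problem: a binary variable selects one of two structural alternatives (e.g. two standard sections), and for each discrete choice a closed-form continuous subproblem finds the optimal dimension. All coefficients are invented for illustration, and a real OA/ER implementation builds linearized MILP master problems rather than enumerating the binaries:

```python
import math

# Two discrete alternatives, each with its own cost coefficients in the
# illustrative objective f(x, y) = c_y * x + k_y / x (x > 0 is a dimension).
ALTERNATIVES = {
    0: {"c": 4.0, "k": 100.0},  # e.g. cheaper material, heavier section
    1: {"c": 9.0, "k": 36.0},   # e.g. costlier material, lighter section
}

def solve_nlp(c, k):
    """Continuous 'NLP subproblem': min over x > 0 of c*x + k/x.
    For this convex objective the optimum is x* = sqrt(k/c)."""
    x_opt = math.sqrt(k / c)
    return x_opt, c * x_opt + k / x_opt

def solve_minlp():
    """Enumerate the binary choices (the role the MILP master plays in a
    real OA/ER loop) and keep the best NLP solution."""
    best = None
    for y, p in ALTERNATIVES.items():
        x, cost = solve_nlp(p["c"], p["k"])
        if best is None or cost < best[2]:
            best = (y, x, cost)
    return best  # (discrete choice, optimal dimension, minimal cost)

y_best, x_best, cost_best = solve_minlp()
```

Here alternative 1 wins (cost 36 at x = 2 versus cost 40 at x = 5), showing how the discrete choice and the continuous sizing interact in a single objective.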
Procedia PDF Downloads 46
1112 Adsorption of 17a-Ethinylestradiol on Activated Carbon Based on Sewage Sludge in Aqueous Medium
Authors: Karoline Reis de Sena
Abstract:
Endocrine disruptors are compounds that are unregulated or not fully regulated, even in the most developed countries, and can endanger the environment and human health. They pass untreated through the secondary stage of conventional wastewater treatment plants, whose effluent is discharged into rivers upstream and downstream of drinking water treatment plants drawing on the same river. Long-term consumption of drinking water containing even low concentrations of these compounds can cause health problems, as they are persistent in nature and difficult to remove. Research on emerging pollutants is therefore expanding, fueled by progress in finding appropriate wastewater treatment methods. Adsorption is the most common separation process; it is a simple and low-cost operation, but not always eco-efficient. Biosorption, a subcategory of adsorption in which the sorbent is biomass, presents numerous advantages over conventional treatment methods, such as low cost, high efficiency, minimal use of chemicals, no need for additional nutrients, and regeneration capacity, and the biomasses used to produce biosorbents are abundant in nature. The use of alternative materials such as sewage sludge for the synthesis of adsorbents has thus proved economically viable, while also valorizing a by-product stream and addressing the problem of its correct disposal. In this work, an alternative for the management of sewage sludge is proposed: transforming it into activated carbon and using it in the adsorption of 17a-ethinylestradiol.
Keywords: 17α-ethinylestradiol, adsorption, activated carbon, sewage sludge, micropollutants
Procedia PDF Downloads 95
1111 Numerical Analysis of Fluid Mixing in Three Split and Recombine Micromixers at Different Inlets Volume Ratio
Authors: Vladimir Viktorov, M. Readul Mahmud, Carmen Visconte
Abstract:
Numerical simulations were carried out to study the mixing of miscible liquids at different inlet volume ratios (1 to 3) within two existing mixers, Chain and Tear-drop, and one new 'C-H' mixer. The new passive C-H micromixer is based on split-and-recombine principles, combining the operating concepts of the known Chain mixer and H mixer. The mixing performance of the three micromixers was first predicted by a preliminary numerical analysis of the flow patterns inside the channel, in terms of the segregation or distribution of path lines. The efficiency and the pressure drop were then investigated numerically, taking species transport into account. All numerical calculations were carried out over a wide range of Reynolds numbers, from 1 to 100. Among the three micromixers, the Tear-drop mixer provides fairly good efficiency except in the middle range of Re numbers, but has a high pressure drop. Its efficiency depends significantly on the inlet flow ratio, especially for Re between 10 and 50; the maximum increase in efficiency is almost 10% when the inlet flow ratio is increased by 1. The Chain mixer presents relatively low mixing efficiency in the low and middle Re range (5 ≤ Re ≤ 50) but a reasonable pressure drop, and shows almost no dependence on the inlet flow ratio. In contrast, the C-H mixer shows excellent mixing efficiency (more than 93%) over the whole Re range and causes the lowest pressure drop, with only a slight dependence of efficiency on the inlet flow ratio; its pressure drop is about three and two times lower than those of the Tear-drop and Chain mixers, respectively.
Keywords: CFD, micromixing, passive micromixer, SAR
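Mixing efficiency in such CFD studies is commonly quantified from the concentration field sampled across the outlet cross-section. One common definition, sketched here with toy samples (the abstract does not state which variant the authors used):

```python
import math

def mixing_index(samples):
    """Mixing index M = 1 - sigma/sigma_max from concentration samples
    across the outlet (M = 1: perfectly mixed, M = 0: fully segregated).
    sigma_max = sqrt(cbar*(1 - cbar)) is the fully segregated limit for
    a normalized concentration in [0, 1]."""
    n = len(samples)
    mean = sum(samples) / n
    sigma = math.sqrt(sum((c - mean) ** 2 for c in samples) / n)
    sigma_max = math.sqrt(mean * (1.0 - mean))
    return 1.0 - sigma / sigma_max

perfect = mixing_index([0.5] * 8)            # uniform concentration
segregated = mixing_index([0, 1, 0, 1])      # two unmixed streams
partial = mixing_index([0.3, 0.7, 0.4, 0.6]) # partially mixed
```

The ">93% efficiency" figure quoted for the C-H mixer corresponds to an index of this kind evaluated at the outlet over the full Reynolds range.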
Procedia PDF Downloads 482
1110 Design of SAE J2716 Single Edge Nibble Transmission Digital Sensor Interface for Automotive Applications
Authors: Jongbae Lee, Seongsoo Lee
Abstract:
Modern sensors often embed a small digital controller for sensor control, value calibration, and signal processing. These sensors require digital data communication with host microprocessors, but conventional digital communication protocols are too heavy for low-cost implementation. The SAE J2716 SENT (single edge nibble transmission) protocol transmits digital waveforms directly instead of complicated analog-modulated signals. In this paper, a SENT interface is designed in Verilog HDL (hardware description language) and implemented on an FPGA (field-programmable gate array) evaluation board. The designed SENT interface consists of a frame encoder/decoder, a configuration register, a tick period generator, a CRC (cyclic redundancy code) generator/checker, and TX/RX (transmission/reception) buffers. The frame encoder/decoder is implemented as a finite state machine and controls the whole SENT interface. The configuration register contains parameters such as the operation mode, tick length, CRC option, pause pulse option, and number of data nibbles. The tick period generator derives tick signals from the input clock, the CRC generator/checker generates or checks the CRC in the SENT data frame, and the TX/RX buffers store transmitted/received data. The designed SENT interface can send or receive digital data at 25-65 kbps with a 3 us tick. Synthesized in a 0.18 um fabrication technology, it occupies about 2,500 gates.
Keywords: digital sensor interface, SAE J2716, SENT, Verilog HDL
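The CRC generator/checker block can be sketched behaviourally (in Python rather than Verilog): SAE J2716 protects the frame's data nibbles with a 4-bit CRC seeded with 0b0101, shown here in the lookup-table formulation commonly given for the protocol. Treat the table values and the trailing zero-nibble augmentation as illustrative and consult the standard for the normative definition:

```python
# CRC-4 lookup table commonly cited for SAE J2716 (illustrative; verify
# against the standard before use in a real implementation).
CRC4_TABLE = [0, 13, 7, 10, 14, 3, 9, 4, 1, 12, 6, 11, 15, 2, 8, 5]

def sent_crc4(nibbles):
    """CRC-4 over a frame's data nibbles (each in 0..15)."""
    checksum = 5  # seed 0b0101
    for n in nibbles:
        assert 0 <= n <= 15, "SENT data is carried as 4-bit nibbles"
        checksum = n ^ CRC4_TABLE[checksum]
    return 0 ^ CRC4_TABLE[checksum]  # augmenting zero nibble at the end

frame = [1, 2, 3, 4, 5, 6]  # six data nibbles, a typical SENT payload
crc = sent_crc4(frame)
```

In the hardware design this logic corresponds to the CRC generator on the TX path and the CRC checker on the RX path; the FSM appends the computed nibble after the data nibbles of each frame.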
Procedia PDF Downloads 300
1109 Hysteresis Modeling in Iron-Dominated Magnets Based on a Deep Neural Network Approach
Authors: Maria Amodeo, Pasquale Arpaia, Marco Buzio, Vincenzo Di Capua, Francesco Donnarumma
Abstract:
Different deep neural network architectures have been compared and tested to predict magnetic hysteresis in the context of pulsed electromagnets for experimental physics applications. Modelling quasi-static or dynamic major and, especially, minor hysteresis loops is one of the most challenging topics in computational magnetism. Recent attempts at mathematical prediction in this context using Preisach models could not attain better than percent-level accuracy. Hence, this work explores neural network approaches and shows that the architecture that best fits the measured magnetic field behaviour, including the effects of hysteresis and eddy currents, is the nonlinear autoregressive exogenous (NARX) neural network. This architecture aims to achieve a relative RMSE of the order of a few hundred ppm for complex magnetic field cycling, including arbitrary sequences of pseudo-random high-field and low-field cycles. The NARX-based architecture is compared with the state of the art, showing better performance than the classical operator-based and differential models, and is tested on a reference quadrupole magnetic lens used for CERN particle beams, chosen as a case study. The training and test datasets are a representative example of real-world magnet operation; this makes the good results obtained very promising for future applications in this context.
Keywords: deep neural network, magnetic modelling, measurement and empirical software engineering, NARX
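The NARX idea, predicting the next output from lagged outputs and lagged exogenous inputs, can be illustrated with a minimal sketch. Here a linear least-squares readout stands in for the trained nonlinear network, and the lag orders and the synthetic system below are invented purely for illustration:

```python
import numpy as np

def narx_design(y, u, ny=2, nu=1):
    """Build the NARX regressor matrix [y[t-1..t-ny], u[t-1..t-nu]] and targets y[t]."""
    T = len(y)
    p = max(ny, nu)
    X = np.column_stack(
        [y[p - k:T - k] for k in range(1, ny + 1)]
        + [u[p - k:T - k] for k in range(1, nu + 1)]
    )
    return X, y[p:]

# Synthetic system: y[t] = 0.6 y[t-1] - 0.2 y[t-2] + 0.5 u[t-1]
rng = np.random.default_rng(0)
u = rng.normal(size=500)
y = np.zeros(500)
for t in range(2, 500):
    y[t] = 0.6 * y[t - 1] - 0.2 * y[t - 2] + 0.5 * u[t - 1]

X, target = narx_design(y, u)
coef, *_ = np.linalg.lstsq(X, target, rcond=None)  # recovers [0.6, -0.2, 0.5]
```

Replacing the least-squares readout with a multilayer network and feeding predictions back in closed loop gives the NARX network proper, as used in the work above.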
Procedia PDF Downloads 130
1108 Modeling and Simulation of Ship Structures Using Finite Element Method
Authors: Javid Iqbal, Zhu Shifan
Abstract:
The development of unconventional ship construction and the implementation of lightweight materials have given a strong impulse to the finite element (FE) method, making it a general tool for ship design. This paper briefly presents modeling and analysis techniques for ship structures using the FE method under complex boundary conditions which are difficult to analyze with existing Ship Classification Societies' rules. During operation, all ships experience complex loading conditions. These loads are generally categorized into thermal, linear static, dynamic, and non-linear loads. The overall strength of the ship structure is analyzed using static FE analysis. The FE method is also suitable for considering the local loads generated by ballast tanks and cargo in addition to hydrostatic and hydrodynamic loads. Vibration analysis of a ship structure and its components can be performed using the FE method, which helps in assessing the dynamic stability of the ship. The FE method has produced better techniques for calculating natural frequencies and the different mode shapes of ship structures to avoid resonance both globally and locally. There has been substantial progress towards ideal design in the ship industry over the past few years in solving complex engineering problems by employing the data stored in the FE model. This paper provides an overview of ship modeling methodology for FE analysis and its general application. Historical background, the basic concepts of FE, and the advantages and disadvantages of FE analysis are also reported, along with examples related to hull strength and structural components.
Keywords: dynamic analysis, finite element methods, ship structure, vibration analysis
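As a toy illustration of the FE modal analysis mentioned above (not the paper's ship models), consider the axial vibration of a clamped-free rod: assemble element stiffness and consistent mass matrices, apply the boundary condition, and solve the generalized eigenproblem for natural frequencies. The material values below are generic steel, chosen only for the example:

```python
import numpy as np

def rod_natural_frequencies(E, rho, A, L, n_elems):
    """Natural frequencies (rad/s) of a clamped-free rod via 1D finite elements."""
    le = L / n_elems
    ndof = n_elems + 1
    K = np.zeros((ndof, ndof))
    M = np.zeros((ndof, ndof))
    ke = E * A / le * np.array([[1.0, -1.0], [-1.0, 1.0]])       # element stiffness
    me = rho * A * le / 6.0 * np.array([[2.0, 1.0], [1.0, 2.0]])  # consistent mass
    for e in range(n_elems):
        K[e:e + 2, e:e + 2] += ke
        M[e:e + 2, e:e + 2] += me
    K, M = K[1:, 1:], M[1:, 1:]  # clamp the first node
    lam = np.linalg.eigvals(np.linalg.solve(M, K))
    return np.sort(np.sqrt(np.abs(lam.real)))

# Steel rod, 10 m long: analytic fundamental is (pi/2) * sqrt(E/rho) / L
w = rod_natural_frequencies(E=210e9, rho=7850.0, A=0.01, L=10.0, n_elems=40)
w_exact = (np.pi / 2.0) * np.sqrt(210e9 / 7850.0) / 10.0
```

With 40 elements the first computed frequency agrees with the analytic value to well under one percent; real ship analyses use the same generalized eigenproblem, only with far larger shell and beam models.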
Procedia PDF Downloads 136
1107 Aerodynamic Heating and Drag Reduction of Pegasus-XL Satellite Launch Vehicle
Authors: Syed Muhammad Awais Tahir, Syed Hossein Raza Hamdani
Abstract:
In the last two years, there has been a substantial increase in the rate of satellite launches. To keep up with the technology, it is imperative that launch costs be made affordable, especially for developing and underdeveloped countries. Launch cost is directly affected by the launch vehicle's aerodynamic performance. The Pegasus-XL SLV (Satellite Launch Vehicle) has been serving as a commercial SLV for the last 26 years, conducting commercial flight operations from six operational sites around the US, Europe, and the Marshall Islands. Aerodynamic heating and drag contribute largely to Pegasus's flight performance. The objective of this study is to reduce the aerodynamic heating and drag on Pegasus's body significantly in the supersonic and hypersonic flight regimes. Aerodynamic data for Pegasus's first flight has been validated through CFD (computational fluid dynamics), and drag and aerodynamic heating are then reduced by using a combination of a forward-facing cylindrical spike and a conical aero-disk at the actual operational flight conditions. CFD analysis using ANSYS Fluent will be carried out for Mach numbers ranging from 0.83 to 7.8 and AoA (angle of attack) ranging from -4 to +24 degrees for both the simple and spiked configurations, and the comparison will be drawn using a variety of graphs and contours. The expected drag reduction is around 15% to 25% for supersonic flight and around 30% to 50% for hypersonic flight, especially for AoA < 15°. A 5% to 10% reduction in aerodynamic heating is expected for hypersonic regions. In conclusion, the aerodynamic performance of the air-launched Pegasus-XL SLV can be further enhanced, leading to optimal fuel usage and a more economical orbital flight.
Keywords: aerodynamics, Pegasus-XL, drag reduction, aerodynamic heating, satellite launch vehicle, SLV, spike, aero-disk
Procedia PDF Downloads 106
1106 Horse Racing on Life Support: How to Save the Sport of Kings in the United States
Authors: Mick Jackowski
Abstract:
In terms of popularity in the United States, horse racing has been in a steady state of decline since the 1970s. This trend can be attributed to deterioration in the prestige of the sport, due to a shift in cultural values around the treatment of horses, as well as the growing appeal of other sports and gambling options. Despite this decline, horse racing still commands a significant piece of the sports landscape through specific events like the Triple Crown and the Breeders' Cup. The 2024 Kentucky Derby enjoyed its largest peak television audience (20.1 million) ever. It is because of this still-significant attraction to thoroughbred racing that hope exists, not only for the survival of one of the oldest organized sports in North America, but also for its future growth. But the spectacle that makes select races very popular must be expanded to tracks around the country on a regular basis. The first step is to create a centralized governing body that regulates the operation of all races at all tracks in the country, instead of the state-by-state government fiefdoms that currently oversee operations in each jurisdiction. One league office, if you will, can also better coordinate marketing efforts to promote races. These promotions, though, must be targeted to specific audiences, focusing on the strengths that horse racing has in relation to other recreational activities. The industry should utilize a multi-segment strategy that targets the following four groups: families, young adults, the fashion-conscious, and sports bettors. Beyond the traditional marketing mix, the most vital means of establishing and maintaining relationships with each of these consumer segments is community building.
Keywords: community building, horse racing, sport marketing, thoroughbreds
Procedia PDF Downloads 19
1105 A Study of Behaviors in Using Social Networks of Corporate Personnel of Suan Sunandha Rajabhat University
Authors: Wipada Chaiwchan
Abstract:
This research aims to study the behaviors in using social networks of corporate personnel of Suan Sunandha Rajabhat University. The sample used in the study comprised two groups: 1) 70 academic officers and 2) 143 operation officers. The research tool was a questionnaire, and the data were analyzed using percentage, mean (X) and standard deviation (S.D.), an independent-samples t-test to test the difference between the mean values obtained from two independent samples, one-way ANOVA for analysis of variance, and multiple comparisons between pairs of means by Fisher's Least Significant Difference (LSD). The study found that most corporate personnel use social networks for information awareness, knowledge, and online conferencing with social media, spending on average more than 3 hours per day, every day, during working hours; most also have computers connected to the Internet at home and use social media for communication in operational processes. Behaviors in using social networks were examined in relation to gender, age, job title, department, and type of personnel. Hypothesis testing and analysis of variance covered three aspects: the use of online social networks, the attitude of the users, and security. The analysis found that, for the corporate personnel of Suan Sunandha Rajabhat University, all aspects were rated at a high level, both overall and item by item: use of the social network (X=3.22), attitude of the users (X=3.06), security (X=3.11), and overall behavior (X=3.11).
Keywords: social network, behaviors, social media, computer information systems
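The independent-samples t-test used above compares the mean scores of the two personnel groups. A minimal sketch of the pooled-variance statistic, with made-up toy data rather than the survey's, is:

```python
from math import sqrt
from statistics import mean, variance

def independent_t(a, b):
    """Pooled-variance independent-samples t statistic and degrees of freedom."""
    na, nb = len(a), len(b)
    # Pooled sample variance (statistics.variance uses the n-1 denominator)
    sp2 = ((na - 1) * variance(a) + (nb - 1) * variance(b)) / (na + nb - 2)
    t = (mean(a) - mean(b)) / sqrt(sp2 * (1 / na + 1 / nb))
    return t, na + nb - 2
```

The t value is then compared against the Student-t critical value for na + nb - 2 degrees of freedom; the study's one-way ANOVA and LSD comparisons extend the same pooled-variance idea to more than two groups.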
Procedia PDF Downloads 394
1104 A Single Stage Cleft Rhinoplasty Technique for Primary Unilateral Cleft Lip and Palate 'The Gujrat Technique'
Authors: Diaa Othman, Muhammad Adil Khan, Muhammad Riaz
Abstract:
Without early intervention to correct the unilateral complete cleft lip and palate deformity, the nasal architecture can progress to an exaggerated cleft nose deformity. We present the results of a modified unilateral cleft rhinoplasty procedure, 'the Gujrat technique', to correct this deformity. Ninety pediatric and adult patients with non-syndromic unilateral cleft lip underwent primary and secondary composite cleft rhinoplasty using the Gujrat technique as a single-stage operation over a 10-year period. The technique involved an open rhinoplasty with Tennison lip repair and employed a combination of three autologous cartilage grafts, seven cartilage-moulding sutures, and a prolene mesh graft for alar base support. Post-operative evaluation of nasal symmetry was undertaken using the validated computer program 'SymNose'. Functional outcome and patient satisfaction were assessed using the NOSE scale and ROE (rhinoplasty outcome evaluation) questionnaires. The single-group study design used the non-parametric matched-pairs Wilcoxon signed-rank test (p < 0.001) and showed overall good to excellent functional and aesthetic outcomes, including nasal projection and tip definition, and higher scores on the digital SymNose grading system. Objective assessment of the Gujrat cleft rhinoplasty technique demonstrates its aesthetic appeal and functional versatility. Overall it is a simple and reproducible technique, with no significant complications.
Keywords: cleft lip and palate, congenital rhinoplasty, nasal deformity, secondary rhinoplasty
Procedia PDF Downloads 203
1103 Safe Zone: A Framework for Detecting and Preventing Drones Misuse
Authors: AlHanoof A. Alharbi, Fatima M. Alamoudi, Razan A. Albrahim, Sarah F. Alharbi, Abdullah M Almuhaideb, Norah A. Almubairik, Abdulrahman Alharby, Naya M. Nagy
Abstract:
Recently, drones have attracted rapid interest in different industries worldwide due to their powerful impact. However, limitations still exist in this emerging technology, especially regarding privacy violation. These aircraft consistently threaten the security of entities by entering restricted areas accidentally or deliberately. Therefore, this research project aims to develop a drone detection and prevention mechanism to protect restricted areas. Until now, none of the existing solutions has met the optimal requirements for detection, which are cost-effectiveness, high accuracy, long range, convenience, immunity to noise, and generalization. In terms of prevention, the existing methods focus on impractical solutions such as catching a drone with a larger drone, training an eagle, or using a gun. In addition, the practical solutions have limitations, such as No-Fly Zones and PITBULL jammers. According to our study and analysis of previous related works, none of the solutions combines detection and prevention at the same time. The proposed solution is a combination of detection and prevention methods. To implement the detection system, a passive radar will be used to properly identify the drone against any other possible flying objects. As for prevention, jamming signals and forced safe landing of the drone are integrated to stop the drone's operation. We believe that applying this mechanism will limit drones' invasion of privacy against highly restricted properties. Consequently, it can effectively accelerate drone usage at personal and governmental levels.
Keywords: detection, drone, jamming, prevention, privacy, RF, radar, UAV
Procedia PDF Downloads 211
1102 Event Related Brain Potentials Evoked by Carmen in Musicians and Dancers
Authors: Hanna Poikonen, Petri Toiviainen, Mari Tervaniemi
Abstract:
Event-related potentials (ERPs) evoked in the brain by simple tones have been extensively studied. However, in reality the music surrounding us is spectrally and temporally complex and dynamic. Thus, research using natural sounds is crucial to understanding the operation of the brain in its natural environment. Music is an excellent example of natural stimulation, which, in various forms, has always been an essential part of different cultures. In addition to sensory responses, music elicits vast cognitive and emotional processes in the brain. Compared to laymen, professional musicians have stronger ERP responses when processing individual musical features in simple tone sequences, such as changes in pitch, timbre, and harmony. Here we show that the ERP responses evoked by rapid changes in individual musical features are more intense in musicians than in laymen also while listening to long excerpts of the composition Carmen. Interestingly, for professional dancers, the amplitudes of the cognitive P300 response are weaker than for musicians but still stronger than for laymen. Also, the cognitive P300 latencies of musicians are significantly shorter, whereas the latencies of laymen are significantly longer. In contrast, the sensory N100 responses do not differ in amplitude or latency between musicians and laymen. These results, acquired with a novel ERP methodology for natural music, suggest that we can take the leap of studying the brain with long pieces of natural music using the ERP method of electroencephalography (EEG), as has already been done with functional magnetic resonance imaging (fMRI), since these two brain imaging methods complement each other.
Keywords: electroencephalography, expertise, musical features, real-life music
Procedia PDF Downloads 484
1101 Injunctions, Disjunctions, Remnants: The Reverse of Unity
Authors: Igor Guatelli
Abstract:
The universe of aesthetic perception entails impasses about sensitive divergences that each text or visual object may be subjected to. If approached through intertextuality that is not based on the misleading notion of kinships or similarities a priori admissible, the possibility of anachronistic, heterogeneous - and non-diachronic - assemblies can enhance the emergence of interval movements, intermediate, and conflicting, conducive to a method of reading, interpreting, and assigning meaning that escapes the rigid antinomies of the mere being and non-being of things. In negative, they operate in a relationship built by the lack of an adjusted meaning set by their positive existences, with no remainders; the generated interval becomes the remnant of each of them; it is the opening that obscures the stable positions of each one. Without the negative of absence, of that which is always missing or must be missing in a text, concept, or image made positive by history, nothing is perceived beyond what has been already given. Pairings or binary oppositions cannot lead only to functional syntheses; on the contrary, methodological disturbances accumulated by the approximation of signs and entities can initiate a process of becoming as an opening to an unforeseen other, transformation until a moment when the difficulties of [re]conciliation become the mainstay of a future of that sign/entity, not envisioned a priori. A counter-history can emerge from these unprecedented, misadjusted approaches, beginnings of unassigned injunctions and disjunctions, in short, difficult alliances that open cracks in a supposedly cohesive history, chained in its apparent linearity with no remains, understood as a categorical historical imperative. Interstices are minority fields that, because of their opening, are capable of causing opacity in that which, apparently, presents itself with irreducible clarity. 
Resulting from an incomplete and maladjusted [at the least dual] marriage between the signs/entities that originate them, this interval may destabilize and cause disorder in these entities and their own meanings. The interstitials offer a hyphenated relationship: a simultaneous union and separation, a spacing between the entity's identity and its otherness, or alterity. One and the other may no longer be seen without the crack or fissure that now separates them while uniting them by a space-time lapse. Ontological, semantic shifts are caused by this fissure, an absence between one and the other, one with and against the other. Based on an improbable approximation between some conceptual and semantic shifts within the design production of the architect Rem Koolhaas and the textual production of the philosopher Jacques Derrida, this article questions the notions of unity, coherence, affinity, and complementarity in the construction of thought from these ontological, epistemological, and semiological fissures that rattle the signs/entities and their stable meanings. Fissures in a thought that is considered coherent, cohesive, formatted are the negativity that constitutes the interstices that allow us to move towards what still remains as non-identity, and so to begin another story.
Keywords: clearing, interstice, negative, remnant, spectrum
Procedia PDF Downloads 134
1100 A Comparative Approach for Modeling the Toxicity of Metal Mixtures in Two Ecologically Related Three-Spined (Gasterosteus aculeatus L.) and Nine-Spined (Pungitius pungitius L.) Sticklebacks
Authors: Tomas Makaras
Abstract:
Sticklebacks (Gasterosteiformes) are increasingly used in ecological and evolutionary research and have a well-established role as model species for biologists. However, ecotoxicology studies concerning behavioural effects in sticklebacks regarding stress responses, mainly those induced by chemical mixtures, have hardly been addressed. Moreover, although many authors have emphasised the similarity between the three-spined and nine-spined stickleback in morphological, neuroanatomical, and behavioural adaptations to environmental changes, several comparative studies have revealed considerable differences between these species in their susceptibility and resistance to various stressors in laboratory experiments. The hypothesis of this study was that the three-spined and nine-spined stickleback species would demonstrate apparent differences in response patterns and sensitivity to metal-based chemical stimuli. For this purpose, we investigated the swimming behaviour (including the mortality rate based on 96-h LC50 values) of the two ecologically similar species, the three-spined (Gasterosteus aculeatus) and nine-spined (Pungitius pungitius) stickleback, under short-term (up to 24 h) metal mixture (MIX) exposure. We evaluated the relevance and efficacy of behavioural responses of the test species in the early toxicity assessment of chemical mixtures. Fish were exposed to six metals (Zn, Pb, Cd, Cu, Ni and Cr) in a mixture, each singled out by the Water Framework Directive either as a priority substance or as a relevant substance in surface water, prepared according to the environmental quality standards (EQSs) of these metals set for inland waters in the European Union (EU) (Directive 2013/39/EU). Based on the acute toxicity results, G. aculeatus was found to be slightly (1.4-fold) more tolerant of the MIX impact than P. pungitius specimens. The behavioural analysis showed a main effect of the interaction between the time, species, and treatment variables.
Although both species exposed to MIX revealed a decreasing tendency in swimming activity, the two species' responsiveness to MIX was somewhat different. Substantial changes in the activity of G. aculeatus were established after 3-h exposure to MIX solutions at concentrations 1.43-fold lower, while in the case of P. pungitius 1.96-fold higher, than the established 96-h LC50 values for each species. This study demonstrated species-specific differences in response sensitivity to metal-based water pollution, indicating behavioural insensitivity of P. pungitius compared to G. aculeatus. While many studies highlight the usefulness and suitability of nine-spined sticklebacks for evolutionary and ecological research, attested by their increasing popularity in these fields, great caution must be exercised when using them as model species in ecotoxicological research to probe metal contamination. Meanwhile, G. aculeatus appears to be a promising bioindicator species in the field of environmental ecotoxicology.
Keywords: acute toxicity, comparative behaviour, metal mixture, swimming activity
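The 96-h LC50 values used to scale the exposures are typically obtained from a concentration-mortality regression; on a logit scale, the LC50 lies where the fitted linear predictor crosses zero. A small sketch of that back-calculation (the coefficients below are invented, not the study's):

```python
def lc50_from_logit(intercept, slope):
    """LC50 from a fitted logit model: logit(p) = intercept + slope * log10(C).

    Mortality p = 0.5 exactly where the linear predictor equals zero,
    i.e. log10(LC50) = -intercept / slope.
    """
    return 10 ** (-intercept / slope)
```

For example, a fitted intercept of -4 and slope of 2 (per log10 concentration unit) would place the LC50 at 100 concentration units; the 1.4-fold tolerance difference reported above is a ratio of two such values.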
Procedia PDF Downloads 162
1099 Localization of Frontal and Temporal Speech Areas in Brain Tumor Patients by Their Structural Connections with Probabilistic Tractography
Authors: B.Shukir, H.Woo, P.Barzo, D.Kis
Abstract:
Preoperative brain mapping in tumors involving the speech areas has an important role in reducing surgical risks. Functional magnetic resonance imaging (fMRI) is the gold-standard method to localize cortical speech areas preoperatively, but its availability in clinical routine is limited. Diffusion MRI-based probabilistic tractography, by contrast, is available from standard head MRI and can be used to segment cortical subregions by their structural connectivity. In our study, we used probabilistic tractography to localize the frontal and temporal cortical speech areas. 15 patients with left frontal tumors were enrolled in our study. Speech fMRI and diffusion MRI were acquired preoperatively. The standard automated anatomical labelling atlas 3 (AAL3) was used to define 76 left frontal and 118 left temporal potential speech areas. Four types of tractography were run according to the structural connection of these regions to the left arcuate fascicle (FA) to localize the cortical areas with speech function: 1) frontal through FA; 2) frontal with FA; 3) temporal to FA; and 4) temporal with FA connections were determined. Thresholds of 1%, 5%, 10% and 15% were applied. At each level, the number of frontal and temporal regions identified by fMRI and by tractography was determined, and the sensitivity and specificity were calculated. The 1% threshold showed the best results: sensitivity was 61.63 ± 1.4% and 67.15 ± 23.12%, and specificity was 87.2 ± 10.4% and 75.6 ± 11.37%, for frontal and temporal regions, respectively. From our study, we conclude that probabilistic tractography is a reliable preoperative technique for localizing cortical speech areas. However, its results are not yet dependable enough for the neurosurgeon to rely on during the operation.
Keywords: brain mapping, brain tumor, fMRI, probabilistic tractography
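The sensitivity and specificity above follow from counting, at each threshold, which atlas regions tractography marks as speech-related versus the fMRI reference. A minimal sketch of that computation (the example counts are arbitrary, not the study's):

```python
def confusion(predicted, reference):
    """Confusion counts for paired boolean labels (tractography vs. fMRI)."""
    tp = sum(p and r for p, r in zip(predicted, reference))
    fn = sum((not p) and r for p, r in zip(predicted, reference))
    tn = sum((not p) and (not r) for p, r in zip(predicted, reference))
    fp = sum(p and (not r) for p, r in zip(predicted, reference))
    return tp, fn, tn, fp

def sensitivity_specificity(tp, fn, tn, fp):
    """Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP)."""
    return tp / (tp + fn), tn / (tn + fp)
```

Repeating this per threshold over the patient cohort yields the mean ± standard-deviation figures reported for the frontal and temporal region sets.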
Procedia PDF Downloads 166
1098 Attitudes of Resort Hotel Managers toward Climate Change Adaptation and Mitigation Practices, Bishoftu, Ethiopia
Authors: Mohammed Aman Kassim
Abstract:
This study explored the attitudes of hotel managers toward climate change adaptation and mitigation practices in resort hotels located in Bishoftu town, Ethiopia. Weak resource management in the area causes serious environmental problems, so a sustainable way forward is needed for the destination in order to reduce environmental damage. Six resorts were selected out of the twelve resort hotels in Bishoftu by systematic sampling, and a total of fifty-six managers were included in the study. The data analyzed came from self-administered questionnaires, site observation, and short face-to-face interviews with general managers. The results showed that 99% of hotel managers hold positive attitudes toward climate change adaptation and mitigation practices, but they did not show a high commitment to adopting all adaptation and mitigation practices in their hotels' actions and day-to-day operation. The key factors influencing adoption were identified as owners' commitment to sustainability, the enforcement of government rules and regulations, and incentives for good achievement. The findings also revealed that the attitudes of resort hotel managers toward climate change adaptation and mitigation practices are significantly influenced by social factors such as level of education and age. The study demonstrated that in order to increase managers' commitment and help hotels become green, government-led education and training programs, green certification schemes, and the application of government environmental regulation are important.
Keywords: climate change, climate change adaptation and mitigation practices, environmental attitude, resort hotels
Procedia PDF Downloads 102
1097 Mobile Augmented Reality for Collaboration in Operation
Authors: Chong-Yang Qiao
Abstract:
Mobile augmented reality (MAR) tracks targets in the surroundings and aids operators with interactive visualization of data and procedures, making equipment and systems easier to understand. Operators remotely communicate and coordinate with each other during continuous tasks, with information and data exchanged between the control room and the work-site. In routine work, distributed control system (DCS) monitoring and work-site manipulation require operators to interact in real time. The critical question is how to improve the user experience in cooperative work by applying augmented reality in the traditional industrial field. The purpose of this exploratory study is to find a cognitive model for multiple-task performance with MAR. In particular, the focus is on the comparison between different tasks and environment factors which influence information processing. Three experiments use interface and interaction design, with the content of start-up, maintenance, and stop procedures embedded in the mobile application. With time demands and human errors as evaluation criteria, and with analysis of the mental processes and behavioral actions during the multiple tasks, heuristic evaluation was used to assess the operators' performance under different situational factors and to record the information processing involved in recognition, interpretation, judgment, and reasoning. The research will identify the functional properties of MAR and constrain the development of the cognitive model. Conclusions can be drawn suggesting that MAR is easy to use and useful for operators in remote collaborative work.
Keywords: mobile augmented reality, remote collaboration, user experience, cognition model
Procedia PDF Downloads 197
1096 The Relationship between Operating Condition and Sludge Wasting of an Aerobic Suspension-Sequencing Batch Reactor (ASSBR) Treating Phenolic Wastewater
Authors: Ali Alattabi, Clare Harris, Rafid Alkhaddar, Ali Alzeyadi
Abstract:
Petroleum refinery wastewater (PRW) can be considered one of the most significant sources of aquatic environmental pollution. It consists of oil and grease along with many other toxic organic pollutants. In recent years, a new technique was implemented using different types of membranes and sequencing batch reactors (SBRs) to treat PRW. The SBR is a fill-and-draw activated sludge system which operates in time instead of space. Many researchers have optimised SBR operating conditions to obtain maximum removal of undesired wastewater pollutants. The SBR has gained importance mainly because of its essential flexibility in cycle time: it can handle shock loads, requires less area for operation, and is easy to operate. However, bulking sludge, or the discharge of floating or settled sludge during the draw or decant phase in some SBR configurations, is still one of the problems of the SBR system. The main aim of this study is to develop an innovative design for the SBR, optimising the process variables to yield a more robust and efficient process. Several experimental tests will be developed to determine the removal percentages of chemical oxygen demand (COD), phenol, and nitrogen compounds from synthetic PRW. Furthermore, the dissolved oxygen (DO), pH, and oxidation-reduction potential (ORP) of the SBR system will be monitored online to ensure a good environment for the microorganisms to biodegrade the organic matter effectively.
Keywords: petroleum refinery wastewater, sequencing batch reactor, hydraulic retention time, phenol, COD, mixed liquor suspended solids (MLSS)
Procedia PDF Downloads 260
1095 Linux Security Management: Research and Discussion on Problems Caused by Different Aspects
Authors: Ma Yuzhe, Burra Venkata Durga Kumar
Abstract:
The computer is a great invention. As people use computers more and more frequently, the demand for PCs is growing, and the performance of computer hardware is rising to face more complex processing and operation. However, operating system development has at times lagged behind. Faced with the high price of UNIX (originally UNICS, the Uniplexed Information and Computing Service), batch after batch of personal computer owners could only give up. The Disk Operating System was too simple and left little room for innovation, so it was not a good choice either. And macOS is a special operating system for Apple computers that cannot be widely used on other personal computers. In this environment, Linux, based on the UNIX system, was born. Linux combines the advantages of earlier operating systems in a modular kernel architecture that is relatively powerful at its core. The Linux system supports all major Internet protocols, so it has very good network functionality. Linux supports multiple users, and each user's files are protected from the others. Linux can also multitask, running different programs independently at the same time. Linux is a completely open-source operating system: users can obtain and modify the source code for free. Because of these advantages, Linux has attracted a large number of users and programmers. The Linux system is constantly upgraded and improved, and many different distributions have been released, suitable for both community and commercial use. The Linux system has good security because it relies on its file permission system. However, due to constantly emerging vulnerabilities and hazards, the security of the operating system in use also needs continued attention. This article will focus on the analysis and discussion of Linux security issues.
Keywords: Linux, operating system, system management, security
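The per-user file protection mentioned above rests on the classic Unix permission bits. A small standard-library sketch shows how a file's mode encodes owner, group, and other access:

```python
import os
import stat
import tempfile

# Create a scratch file and restrict it: owner read/write, group read, others nothing.
fd, path = tempfile.mkstemp()
os.close(fd)
os.chmod(path, 0o640)

mode = os.stat(path).st_mode
symbolic = stat.filemode(mode)             # e.g. '-rw-r-----'
owner_can_write = bool(mode & stat.S_IWUSR)
others_can_read = bool(mode & stat.S_IROTH)

os.remove(path)
```

Because the kernel checks these bits on every open, one user's processes cannot read or modify another user's protected files, which is the basis of the multi-user security the abstract refers to.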
Procedia PDF Downloads 108
1094 A Cost Effective Approach to Develop Mid-Size Enterprise Software Adopted the Waterfall Model
Authors: Mohammad Nehal Hasnine, Md Kamrul Hasan Chayon, Md Mobasswer Rahman
Abstract:
Organizational tendencies towards computer-based information processing have been observed noticeably in third-world countries. Many enterprises are taking major initiatives towards a computerized working environment because of the massive benefits of computer-based information processing. However, designing and developing information resource management software for small and mid-size enterprises under tight budgets and strict deadlines is always challenging for software engineers. Therefore, we introduce an approach to designing mid-size enterprise software in a cost-effective way using the Waterfall model, one of the classic SDLCs (software development life cycles). To fulfill the research objectives, in this study we developed a mid-size enterprise software product named “BSK Management System” that assists enterprise software clients with information resource management and with performing complex organizational tasks. The Waterfall model phases have been applied to ensure that all functions, user requirements, strategic goals, and objectives are met. In addition, Rich Pictures, Structured English, and a Data Dictionary have been implemented and investigated properly in an engineering manner. Furthermore, an assessment survey with 20 participants was conducted to investigate the usability and performance of the proposed software. The survey results indicated that our system features simple interfaces, easy operation and maintenance, quick processing, and reliable and accurate transactions.
Keywords: end-user application development, enterprise software design, information resource management, usability
Procedia PDF Downloads 438