Search results for: real time pest tracking
17081 Experimental Study Analysis of Flow over Pickup Truck’s Cargo Area Using Bed Covers
Authors: Jonathan Rodriguez, Dominga Guerrero, Surupa Shaw
Abstract:
Automobiles are modeled in various forms, and they interact with air when in motion. Aerodynamics is the study of such interactions, where solid bodies affect the way air moves around them. The shape of a solid body affects how easily it moves through the airflow; consequently, any additional freightage, or load, alters a vehicle's aerodynamics. It is important to transport people and cargo safely, yet despite various safety measures, vehicle-related accidents remain numerous. This study explores the effects an automobile experiences with added cargo and covers. The addition of these items changes the original vehicle shape and the approved design for safe driving. This paper showcases the effects of the changed vehicle shape and design via experimental testing conducted on a physical 1:27 scale model and a CAD model of an F-150 pickup truck, the most common pickup truck in the United States, with differently shaped loads and weights traveling at a constant speed. The additional freightage produces unwanted drag or lift, resulting in lower fuel efficiency and unsafe driving conditions. This study employs an adjustable external shell on the F-150 pickup truck to create a controlled aerodynamic geometry that combats the detrimental effects of additional freightage. The results utilize colored powder, which acts as a visual medium for the interaction of air with the vehicle, to highlight the impact of the additional freight on the automobile's external shell. This is complemented by simulation models, built in Altair CFD software, of twelve cases concerning the effects of an added load on an F-150 pickup truck. This paper is an attempt toward standardizing the geometric design of the external shell, given the uniqueness of every load and its placement on the vehicle, while providing real-time data to be compared with simulation results from the existing literature.
Keywords: aerodynamics, CFD, freightage, pickup cover
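The drag penalty the abstract refers to follows the standard drag equation, F_d = ½ρv²C_dA. A minimal sketch of the comparison is below; the drag coefficients and frontal areas are illustrative assumptions, not measurements from the study.

```python
def drag_force(rho, v, cd, area):
    """Aerodynamic drag in newtons (SI units): 0.5 * rho * v^2 * C_d * A."""
    return 0.5 * rho * v**2 * cd * area

rho = 1.225        # air density, kg/m^3
v = 26.8           # ~60 mph in m/s
baseline = drag_force(rho, v, cd=0.45, area=3.2)   # bare truck (assumed values)
loaded = drag_force(rho, v, cd=0.55, area=3.4)     # truck with bed cargo (assumed)
print(f"added drag ~ {loaded - baseline:.0f} N")   # ~189 N extra at highway speed
```

Even modest changes in C_d and frontal area translate into a noticeable drag increase, which is the effect the external shell is meant to suppress.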
Procedia PDF Downloads 168
17080 Effects of Feeding Time on Survival Rates, Growth Performance and Feeding Behavior of Juvenile Catfish
Authors: Abdullahi Ibrahim
Abstract:
The culture of Clarias gariepinus for fish production is becoming increasingly essential, as the fish contributes to food abundance, nutritional benefits for family health, income generation, and employment opportunities. The effect of feeding frequency was investigated over a period of ten (10) weeks; the experiment was conducted to monitor survival rates, growth performance, and feeding behavior of juvenile catfish. The experimental fish were randomly assigned to five treatment groups (i.e., with different feeding frequency intervals) of 100 fish each. Each treatment was replicated twice with 50 fish per replicate. All the groups were fed with floating fish feed (blue crown®). The five treatments (feeding frequencies) were: T1- once a day, at night hours only; T2- twice a day, in the morning and at night; T3- thrice a day, in the morning, evening, and night; T4- four times a day, in the morning, afternoon, evening, and night; T5- five times a day, at four-hour intervals. There were significant differences (p < 0.05) among treatments. Feed intake and weight gain improved significantly (p < 0.05) in T4 and T3. The best results for weight gain, survival rate, and feed conversion ratio were obtained with feeding three times a day (T3) compared to the other treatments, especially those on the once-daily and five-times-daily regimens. This might be attributed to the high level of dissolved oxygen and less stress. Feeding fish three times a day is therefore recommended for efficient catfish production to maximize profits, as the feed represents more than 50% of aquaculture inputs, particularly in intensive farming systems.
Keywords: catfish, floating fish feed, dissolved oxygen, juvenile
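The feed conversion ratio compared across treatments is simply feed given divided by weight gained. A minimal sketch, with made-up illustrative numbers rather than data from the trial:

```python
def fcr(feed_given_g, weight_gain_g):
    """Feed conversion ratio: lower means feed is converted to body mass more efficiently."""
    return feed_given_g / weight_gain_g

feed_given = 900.0    # g of floating feed over the trial (assumed)
initial_w = 250.0     # g, total weight of stocked juveniles (assumed)
final_w = 850.0       # g at week 10 (assumed)
print(f"FCR = {fcr(feed_given, final_w - initial_w):.2f}")  # 900/600 = 1.50
```

Since feed is over half of the input cost, a small FCR improvement at the optimal feeding frequency directly raises profitability.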
Procedia PDF Downloads 155
17079 Application of Gamma Frailty Model in Survival of Liver Cirrhosis Patients
Authors: Elnaz Saeedi, Jamileh Abolaghasemi, Mohsen Nasiri Tousi, Saeedeh Khosravi
Abstract:
Goals and Objectives: A typical analysis of survival data involves the modeling of time-to-event data, such as the time till death. A frailty model is a random effect model for time-to-event data, where the random effect has a multiplicative influence on the baseline hazard function. This article aims to investigate the use of a gamma frailty model with concomitant variables in order to individualize the prognostic factors that influence liver cirrhosis patients' survival times. Methods: During the one-year study period (May 2008-May 2009), data were used from the recorded information of patients with liver cirrhosis who were scheduled for liver transplantation and were followed up for at least seven years in Imam Khomeini Hospital in Iran. In order to determine the effective factors for cirrhotic patients' survival in the presence of latent variables, the gamma frailty distribution was applied. Parametric models, namely the Exponential and Weibull distributions, were considered for survival time. Data analysis was performed using R software, and an error level of 0.05 was considered for all tests. Results: 305 patients with liver cirrhosis, including 180 (59%) men and 125 (41%) women, were studied. The average age of the patients was 39.8 years. At the end of the study, 82 (26%) patients had died, among them 48 (58%) men and 34 (42%) women. The main cause of liver cirrhosis was hepatitis B (23%), followed by cryptogenic cirrhosis (22.6%) as the second most frequent factor. Overall, mean survival over the seven-year follow-up was 28.44 months; for deceased and censored patients it was 19.33 and 31.79 months, respectively. Exponential and Weibull survival models incorporating the gamma frailty distribution were fitted to the cirrhosis data. In both models, factors including age, serum bilirubin, serum albumin, and encephalopathy had a significant effect on the survival time of cirrhotic patients.
Conclusion: To investigate the effective factors for the time of death of patients with liver cirrhosis in the presence of latent variables, a gamma frailty model with parametric distributions seems desirable.
Keywords: frailty model, latent variables, liver cirrhosis, parametric distribution
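The gamma frailty specification described above can be stated compactly. With covariates x_i and a Weibull baseline hazard (the Exponential case is ρ = 1), a standard formulation (not quoted from the paper) is:

```latex
h_i(t \mid z_i) = z_i \, h_0(t)\, \exp\!\left(\mathbf{x}_i^{\top} \boldsymbol{\beta}\right),
\qquad h_0(t) = \lambda \rho\, t^{\rho - 1},
\qquad z_i \sim \mathrm{Gamma}\!\left(\tfrac{1}{\theta}, \tfrac{1}{\theta}\right),
```

so that E[z_i] = 1 and Var(z_i) = θ. Here z_i is the unobserved (latent) frailty acting multiplicatively on the baseline hazard; θ = 0 corresponds to no unobserved heterogeneity, and ρ = 1 reduces the Weibull baseline to the Exponential model.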
Procedia PDF Downloads 261
17078 A Game Theory Analysis of the Effectiveness of Passenger Profiling for Transportation Security
Authors: Yael Deutsch, Arieh Gavious
Abstract:
The threat of aviation terrorism and its potential damage became significant after the 9/11 terror attacks. These attacks have led authorities and leaders to suggest that security personnel should overcome politically correct scruples about profiling and use it openly. However, there is a lack of knowledge about the smart usage of profiling and its advantages. We analyze game models that are suitable to specific real-world scenarios, focusing on profiling as a tool to detect potential violators, such as terrorists and smugglers. We provide analytical and clear answers to difficult questions, and thereby help the fight against harmful acts.
Keywords: game theory, profiling, security, Nash equilibrium
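The kind of model the abstract refers to can be illustrated with the classic inspector-violator game, where the mixed-strategy Nash equilibrium follows from each player's indifference condition. This is a generic textbook sketch with assumed payoff values, not the paper's model.

```python
# Mixed-strategy Nash equilibrium of a simple inspection game.
# Each player randomizes so as to make the other indifferent between actions.

def inspection_equilibrium(v, f, c, L):
    """v: violator's gain if undetected, f: fine if caught,
       c: cost of one inspection, L: inspector's loss from an undetected violation."""
    q = v / (v + f)   # inspection probability making the violator indifferent
    p = c / L         # violation probability making the inspector indifferent
    return p, q

p, q = inspection_equilibrium(v=1.0, f=4.0, c=0.5, L=5.0)
print(f"violate with prob {p:.2f}, inspect with prob {q:.2f}")
```

In this setting, profiling corresponds to running such a game separately per passenger group, letting the inspector concentrate scarce inspections where the equilibrium violation probability is highest.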
Procedia PDF Downloads 109
17077 Investigation of User Position Accuracy for Stand-Alone and Hybrid Modes of the Indian Navigation with Indian Constellation Satellite System
Authors: Naveen Kumar Perumalla, Devadas Kuna, Mohammed Akhter Ali
Abstract:
Satellite navigation systems such as the United States Global Positioning System (GPS) play a significant role in determining the user position. Similar to GPS, the Indian Regional Navigation Satellite System (IRNSS) is a satellite navigation system indigenously developed by the Indian Space Research Organization (ISRO), India, to meet the country's navigation applications. This system is also known as Navigation with Indian Constellation (NavIC). The NavIC system's main objective is to offer Positioning, Navigation and Timing (PNT) services to users in its two service areas, i.e., covering the Indian landmass and the Indian Ocean. Six NavIC satellites are already deployed in space, and their receivers are in the performance evaluation stage. Four NavIC dual-frequency receivers are installed in the 'Advanced GNSS Research Laboratory' (AGRL) in the Department of Electronics and Communication Engineering, University College of Engineering, Osmania University, India. The NavIC receivers can be operated in two positioning modes: stand-alone IRNSS and hybrid (IRNSS+GPS). In this paper, an analysis of parameters such as Dilution of Precision (DoP), three-dimensional (3D) Root Mean Square (RMS) position error, and horizontal position error with respect to satellite visibility is carried out using real-time IRNSS data obtained by operating the receiver in both positioning modes. Data from two typical days (6th and 7th July 2017) at the Hyderabad station (Latitude 17°24'28.07''N, Longitude 78°31'4.26''E) are analyzed. It is found that, with respect to the considered parameters, the hybrid mode of the NavIC receiver gives better results than the stand-alone positioning mode. This work finds application in the development of NavIC receivers for civilian navigation applications.
Keywords: DoP, GPS, IRNSS, GNSS, position error, satellite visibility
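The 3D RMS position error used to compare the two modes is computed from the per-epoch error vectors against a known reference point. A minimal sketch, with made-up coordinates rather than the station's data:

```python
import numpy as np

def rms_3d_error(fixes, truth):
    """fixes: (N, 3) array of position solutions, truth: (3,) reference position.
       Returns the 3D RMS error in the same units as the inputs."""
    d = fixes - truth                               # per-epoch error vectors
    return np.sqrt(np.mean(np.sum(d**2, axis=1)))   # root of mean squared 3D error

truth = np.array([0.0, 0.0, 0.0])
fixes = np.array([[3.0, 0.0, 0.0],
                  [0.0, 4.0, 0.0],
                  [0.0, 0.0, 5.0]])
print(rms_3d_error(fixes, truth))   # sqrt((9+16+25)/3), about 4.08
```

In practice the fixes and the reference would be in a common frame (e.g. ECEF), and the same statistic is computed separately for the stand-alone and hybrid solutions.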
Procedia PDF Downloads 213
17076 Offshore Facilities Load Out: Case Study of Jacket Superstructure Loadout by Strand Jacking Skidding Method
Authors: A. Rahim Baharudin, Nor Arinee binti Mat Saaud, Muhammad Afiq Azman, Farah Adiba A. Sani
Abstract:
Objectives: This paper shares a case study of the engineering analysis, data analysis, and real-time data comparison for qualifying the strand wires' minimum breaking load and safe working load for loadout operation in a new project and, at the same time, eliminating the risk due to discrepancies and the misalignment of COMPANY Technical Standards with Industry Standards and Practices. This paper demonstrates 'Lean Construction' for the COMPANY's project by sustaining fit-for-purpose technical requirements for the loadout strand wire Factor of Safety (F.S). The case study utilizes historical engineering data from several loadout operations by skidding methods in different projects. It also demonstrates and qualifies the skidding wires' minimum breaking load and safe working load for future loadout operations of substructures and other facilities. Methods: Engineering analysis and data comparison were carried out with reference to international standards and internal COMPANY standard requirements. Data were taken from nine (9) previous projects, covering both topside and jacket facilities, executed at several local fabrication yards where loadout was conducted by three (3) different service providers, with emphasis on four (4) basic elements: i) Industry Standards for Loadout Engineering and Operation Reference: the COMPANY internal standard referred to the superseded documents DNV-OS-H201 and DNV/GL 0013/ND. DNV/GL 0013/ND and DNVGL-ST-N001 do not mention any requirement of a strand wire F.S of 4.0 for skidding/pulling operations. ii) Reference to past Loadout Engineering and Execution Packages: reference was made to projects delivered by three (3) major offshore facilities operators. The strand wire F.S observed ranges from 2.0 MBL (min) to 2.5 MBL (max); no loadout operation using a requirement of 4.0 MBL was sighted in the references.
iii) Strand Jack Equipment Manufacturer Datasheet Reference: referring to strand jack equipment datasheets from different loadout service providers, the designed F.S for the equipment also ranges between 2.0 and 2.5. Eight (8) strand jack datasheet models, ranging from 15 Mt to 850 Mt capacity, were referred to; however, no designed F.S of 4.0 was observed. iv) Site Monitoring of Actual Loadout Data and Parameters: the maximum load on a strand wire was captured during the 2nd breakout, under static conditions, at 12.9 Mt per strand wire (67.9% utilization). The maximum load on a strand wire under dynamic conditions, during Step 8 and Step 12, was 9.4 Mt per strand wire (49.5% utilization). Conclusion: This analysis and study demonstrated that the strand wires supplied by the service provider were technically sufficient in terms of strength; via the engineering analysis conducted, the minimum breaking load and safe working load utilized and calculated for the projects were satisfied and operated safely. It is recommended from this study that the COMPANY's technical requirements be revised for future projects.
Keywords: construction, load out, minimum breaking load, safe working load, strand jacking, skidding
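The safe-working-load check behind the utilization figures is SWL = MBL / F.S and utilization = load / SWL. A minimal sketch; the 38 t MBL below is an illustrative assumption chosen so that an F.S of 2.0 reproduces the reported 67.9% static utilization, not a value from the paper.

```python
def swl(mbl_t, fs):
    """Safe working load (tonnes) = minimum breaking load / factor of safety."""
    return mbl_t / fs

def utilization(load_t, swl_t):
    """Percentage of the safe working load taken up by the measured load."""
    return 100.0 * load_t / swl_t

swl_wire = swl(mbl_t=38.0, fs=2.0)                        # 19.0 t per strand wire
print(f"{utilization(12.9, swl_wire):.1f} % utilization")  # ~67.9 % static breakout case
```

The same arithmetic shows why an F.S of 4.0 is so restrictive: halving the SWL would push the same 12.9 t static load to roughly 136% utilization, i.e. an apparent overload with wires that are in fact adequate.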
Procedia PDF Downloads 112
17075 Different Approaches to Teaching a Database Course to Undergraduate and Graduate Students
Authors: Samah Senbel
Abstract:
Database design is a fundamental part of the Computer Science and Information Technology curricula in any school, as well as of the study of management, business administration, and data analytics. In this study, we compare the performance of two groups of students studying the same database design and implementation course at Sacred Heart University in the fall of 2018. Both courses used the same textbook and were taught by the same professor, one for seven graduate students and one for 26 undergraduate students (juniors). The undergraduate students were around 20 years old with little work experience, while the graduate students averaged 35 years old and were all employed in computer-related or management-related jobs. The textbook used was 'Database Systems, Design, Implementation, and Management' by Coronel and Morris, and the course was designed to follow the textbook at roughly a chapter per week. The first six weeks covered the design aspect of a database, followed by a paper exam. The next six weeks covered the implementation aspect of the database using SQL, followed by a lab exam. Since the undergraduate students are on a 16-week semester, we spent the last three weeks of the course covering NoSQL; this part of the course was not included in this study. After the course was over, we analyzed the results of the two groups of students. An interesting discrepancy was observed: in the database design part of the course, the average grade of the graduate students was 92%, while that of the undergraduate students was 77% on the same exam. In the implementation part of the course, we observed the opposite: the average grade of the graduate students was 65%, while that of the undergraduate students was 73%. The overall grades were quite similar: the graduate average was 78% and the undergraduate average was 75%. Based on these results, we concluded that having both classes follow the same time schedule was not beneficial, and an adjustment was needed.
The graduates could spend less time on design, and the undergraduates would benefit from more design time. In the fall of 2019, 30 students registered for the undergraduate course and 15 students registered for the graduate course. To test our conclusion, the undergraduates spent about 67% of the time (eight classes) on the design part of the course and 33% (four classes) on the implementation part, using the exact same exams as the previous year. This resulted in an improvement in their average grade on the design part from 77% to 83%, and also in their implementation average grade, from 73% to 79%. In conclusion, we recommend using two separate schedules for teaching the database design course. For undergraduate students, it is important to spend more time on the design part rather than the implementation part of the course, while for the older graduate students, we recommend spending more time on the implementation part, as that seems to be the part they struggle with, even though they have a higher understanding of the design component of databases.
Keywords: computer science education, database design, graduate and undergraduate students, pedagogy
Procedia PDF Downloads 121
17074 Volatility Switching between Two Regimes
Authors: Josip Visković, Josip Arnerić, Ante Rozga
Abstract:
Based on the fact that volatility is time-varying in high-frequency data and that periods of high volatility tend to cluster, the most successful and popular models for modelling time-varying volatility are GARCH-type models. When financial returns exhibit sudden jumps due to structural breaks, standard GARCH models show high volatility persistence, i.e. integrated behaviour of the conditional variance. In such situations, models in which the parameters are allowed to change over time are more appropriate. This paper compares different GARCH models in terms of their ability to describe structural changes in returns caused by the financial crisis on the stock markets of six selected Central and East European countries. The empirical analysis demonstrates that the Markov regime-switching GARCH model resolves the problem of excessive persistence and outperforms uni-regime GARCH models in forecasting volatility when sudden switching occurs in response to the financial crisis.
Keywords: Central and East European countries, financial crisis, Markov switching GARCH model, transition probabilities
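The uni-regime building block the paper compares against is the GARCH(1,1) conditional-variance recursion, whose persistence is measured by α + β. A minimal sketch with illustrative parameters (α + β = 0.98, the near-integrated behaviour the abstract mentions), not estimates from the paper:

```python
def garch11_variance(returns, omega, alpha, beta):
    """GARCH(1,1) recursion: sigma2_t = omega + alpha * r_{t-1}^2 + beta * sigma2_{t-1}."""
    var = [omega / (1.0 - alpha - beta)]        # start at the unconditional variance
    for r in returns[:-1]:
        var.append(omega + alpha * r**2 + beta * var[-1])
    return var

returns = [0.01, -0.02, 0.015, -0.005]
var = garch11_variance(returns, omega=1e-6, alpha=0.08, beta=0.90)
print(var)
```

A Markov-switching GARCH model runs one such recursion per regime and mixes them with the transition probabilities, so a crisis can be absorbed by a regime change instead of being misread as near-unit-root persistence.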
Procedia PDF Downloads 226
17073 Integration of Hybrid PV-Wind in Three Phase Grid System Using Fuzzy MPPT without Battery Storage for Remote Area
Authors: Thohaku Abdul Hadi, Hadyan Perdana Putra, Nugroho Wicaksono, Adhika Prajna Nandiwardhana, Onang Surya Nugroho, Heri Suryoatmojo, Soedibjo
Abstract:
Access to electricity is now a basic requirement of mankind. Unfortunately, there are still many places around the world with no access to electricity, such as small islands, where there could potentially be a factory, a plantation, a residential area, or resorts. Many of these places might have substantial potential for energy generation, such as photovoltaic (PV) and wind turbine (WT) systems, which can be used to generate electricity independently. Solar energy and wind power are renewable energy sources mostly found in nature and are kinds of alternative energy still developing at a rapid pace to help meet the demand for electricity. The power output of PV and WT systems depends on the solar irradiation and wind speed characteristic of the area's geography. This paper presents a control methodology for a hybrid small-scale PV/wind energy system that uses a fuzzy logic controller (FLC) for maximum power point tracking (MPPT) under different solar irradiation and wind speed conditions. This paper discusses the simulation and analysis of the generation process of the hybrid resources at the MPP and the power conditioning unit (PCU) of the PV and WT systems connected to the three-phase low-voltage electricity grid (380 V) without battery storage. The capacities of the sources used are 2.2 kWp PV and a 2.5 kW PMSG (Permanent Magnet Synchronous Generator) WT power rating. The modeling of the hybrid PV/wind system, as well as the integrated power electronics components in the grid-connected system, is simulated using MATLAB/Simulink.
Keywords: fuzzy MPPT, grid connected inverter, photovoltaic (PV), PMSG wind turbine
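The abstract does not detail the fuzzy controller itself, but the MPPT task it performs can be illustrated with the simpler perturb-and-observe hill-climbing baseline that fuzzy MPPT typically refines (smoother steps, less oscillation). This is a deliberately toy sketch: the P-V curve below is a stand-in model, not a PV cell equation.

```python
def pv_power(v):
    """Toy P-V curve with a single maximum near v = 4.0 (stand-in model)."""
    return max(0.0, v * (8.0 - 0.4 * v**1.5))

def p_and_o(v0=2.0, step=0.05, iters=200):
    """Perturb-and-observe: keep stepping while power rises, reverse when it falls."""
    v, p_prev, direction = v0, pv_power(v0), 1.0
    for _ in range(iters):
        v += direction * step
        p = pv_power(v)
        if p < p_prev:          # power fell: reverse the perturbation direction
            direction = -direction
        p_prev = p
    return v

v_final = p_and_o()
print(round(v_final, 2))        # oscillates around the ~4.0 V maximum-power point
```

The fixed step size causes the steady-state oscillation visible here; a fuzzy logic controller replaces it with a rule-based variable step, which is the advantage the paper exploits.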
Procedia PDF Downloads 355
17072 Solar Collectors for Northern Countries
Authors: Ilze Pelece, Imants Ziemelis, Henriks Putans
Abstract:
Traditionally, solar energy has been used in southern countries, but it has been used in northern ones as well. The most popular use of solar energy in Latvia is the solar collector for water heating. Traditionally, flat-plate solar collectors are used because of their simplicity of manufacture. However, some peculiarities of solar energy use in northern countries must be taken into account: there is lower irradiance, but a longer day and a longer path of the sun during summer. Therefore, traditional flat-plate solar collectors are not appropriate enough in northern countries, and new forms must be developed. Two forms of solar collector, cylindrical and semi-spherical, are proposed in this work. Such collectors can be made for either water or air heating. Theoretical calculations and measurements of the energy gain from these two collectors have been done. Results show that the daily energy received from the sun at the middle of summer by the semi-spherical collector is 1.43 times more than that of the flat one, while for the cylindrical collector it is 1.74 times more than that of the flat one, or equal to that of a sun-tracking flat-plate collector. The resulting difference in energy gain from the collectors will be not so large because of differences in heat losses. Heat losses can be decreased by switching off the water circulation pump when the sun is covered by clouds; for this purpose, a solar-battery-powered pump can be used instead of complicated and expensive automatics. Even more important than the overall energy gain is the fact that the semi-spherical and cylindrical collectors work all day (17 hours in the middle of summer at 57° northern latitude), while the flat-plate collector works only about 11 hours. The yearly energy received from the sun is 1.5 and 1.9 times larger for the semi-spherical and cylindrical collectors, respectively, than for the flat one.
The cylindrical solar collector is easier to manufacture, but the semi-spherical one is more aesthetic and more durable against the impact of wind. Although solar collectors for water and air heating are studied in this article, the main ideas are applicable also to solar batteries.
Keywords: cylindrical, semi-spherical, solar collector, solar energy, water heating
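The geometric argument above can be checked numerically with standard solar-position formulas. The sketch below computes the midsummer day length at 57° N and compares the daily beam energy intercepted by a horizontal flat plate (proportional to sin h, h being the solar elevation) with a vertical cylinder's cross-section (proportional to cos h). It assumes a constant clear-sky beam and ignores diffuse radiation and heat losses, so it only approximates the paper's 1.74 factor.

```python
import math

phi = math.radians(57.0)          # latitude
delta = math.radians(23.44)       # solar declination at midsummer

def elevation(hour_angle):
    """Solar elevation from latitude, declination and hour angle."""
    return math.asin(math.sin(phi) * math.sin(delta) +
                     math.cos(phi) * math.cos(delta) * math.cos(hour_angle))

H0 = math.acos(-math.tan(phi) * math.tan(delta))          # sunset hour angle
print(f"day length ~ {2 * math.degrees(H0) / 15:.1f} h")  # ~17.6 h, matching the '17 hours'

n = 1000
flat = cyl = 0.0
for i in range(n):                      # midpoint integration over the daylight hours
    H = -H0 + (i + 0.5) * 2 * H0 / n
    h = elevation(H)
    flat += max(0.0, math.sin(h))       # horizontal flat plate
    cyl += math.cos(h)                  # vertical cylinder cross-section
print(f"cylinder/flat daily energy ratio ~ {cyl / flat:.2f}")
```

The simplified model lands around 1.6, in the same range as the reported 1.74; the cylinder's advantage comes from the low-elevation morning and evening sun that a horizontal plate barely sees.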
Procedia PDF Downloads 266
17071 Identification of Vehicle Dynamic Parameters by Using Optimized Exciting Trajectory on 3-DOF Parallel Manipulator
Authors: Di Yao, Gunther Prokop, Kay Buttner
Abstract:
Dynamic parameters, including the center of gravity, mass, and moments of inertia of a vehicle, play an essential role in vehicle simulation, collision testing, and the real-time control of vehicle active systems. To identify these important vehicle dynamic parameters, a systematic parameter identification procedure is studied in this work. In the first step of the procedure, a conceptual parallel manipulator (virtual test rig), which possesses three rotational degrees of freedom, is proposed. To realize the kinematic characteristics of the conceptual parallel manipulator, a kinematic analysis consisting of inverse kinematics and singularity architecture is carried out. Based on Euler's rotation equations for rigid body dynamics, the dynamic model of the parallel manipulator and the derivation of the measurement matrix for parameter identification are presented subsequently. In order to reduce the sensitivity of the parameter identification to measurement noise and other unexpected disturbances, a parameter optimization process searching for the optimal exciting trajectory of the parallel manipulator is conducted in the following section. For this purpose, 321-Euler angles defined by a parameterized finite Fourier series are used to describe the general exciting trajectory of the parallel manipulator. To minimize the condition number of the measurement matrix and thereby achieve better parameter identification accuracy, the unknown coefficients of the parameterized finite Fourier series are estimated by employing an iterative algorithm based on MATLAB®. Meanwhile, the iterative algorithm ensures that the parallel manipulator remains in an achievable working status during the execution of the optimal exciting trajectory.
It is shown that the proposed procedure and methods can effectively identify the vehicle dynamic parameters and could be an important application of parallel manipulators in the fields of parameter identification and test rig development.
Keywords: parameter identification, parallel manipulator, singularity architecture, dynamic modelling, exciting trajectory
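The optimization objective described above, the condition number of the measurement matrix, is the ratio of its largest to smallest singular value; a small value means the stacked trajectory samples excite all parameters evenly and the identification is robust to noise. A minimal sketch with a toy random matrix standing in for the real stacked measurement matrix:

```python
import numpy as np

def condition_number(W):
    """Ratio of the largest to the smallest singular value of the measurement matrix."""
    s = np.linalg.svd(W, compute_uv=False)
    return s[0] / s[-1]

# Toy measurement matrix: 30 stacked trajectory samples, 6 unknown parameters
# (an assumed shape; the paper's matrix follows from its dynamic model).
rng = np.random.default_rng(0)
W = rng.standard_normal((30, 6))
print(f"cond(W) = {condition_number(W):.2f}")
```

In the paper's procedure, the Fourier-series coefficients of the exciting trajectory are iterated precisely to drive this number down while keeping the trajectory inside the manipulator's workspace.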
Procedia PDF Downloads 266
17070 Use and Effects of Kanban Board from the Aspects of Brothers Furniture Limited
Authors: Kazi Rizvan, Yamin Rekhu
Abstract:
Due to high competitiveness in industries throughout the world, every industry is trying hard to utilize all its resources to keep productivity as high as possible. Many tools have been used to ensure the smoother flow of an operation, to balance tasks, to maintain proper schedules and sequences for tasks, and to reduce unproductive time; all of these tools are used to augment productivity within an industry. The Kanban board is one of them and one of the many important tools of the lean production system. A Kanban board is a visual depiction of the status of tasks: it shows their actual status and conveys their progress and issues as well. Using a Kanban board, tasks can be distributed among workers, and operation targets can be visually represented to them. In this paper, an example of a Kanban board from Brothers Furniture Limited is presented: how the Kanban board system was implemented, how the board was designed, and how it was made easily perceivable for less literate or illiterate workers. The Kanban board was designed for the packing section of Brothers Furniture Limited. It was implemented to represent the task flow to the workers and to mitigate the time that was wasted while workers wondered what task to start after finishing one. The Kanban board comprised seven columns, including a column for comments on any problems that occurred while working on the tasks. The board was helpful to the workers, as it showed the urgency of the tasks. It was also helpful to the store section, as they could understand which products, and how many of them, could be delivered to the store at any given time. The Kanban board centralized all the information, which is why the workflow sped up and idle time was minimized.
Even though many workers were illiterate or less literate, the Kanban board was still explicable to them because the Kanban cards were colored. Since the significance of colors can be conveniently interpreted, the colored cards helped a great deal in that matter: the illiterate or less literate workers did not have to spend time wondering about the meaning of the cards. Even when the workers were not told the significance of the colored cards, they could develop a feeling for their meaning, as colors can trigger anyone's mind to perceive the situation. As a result, the board elucidated to the workers what it required them to do, when to do it, and what to do next. The Kanban board alleviated excessive time between tasks by setting a day plan for targeted tasks, and it also reduced time during tasks, as the workers were made aware of the forthcoming tasks for the day. Being very specific to the tasks, the Kanban board helped the workers become more focused and do their jobs with more perfection. As a result, the Kanban board helped achieve an 8.75% increase in productivity over the productivity before it was implemented.
Keywords: color, Kanban board, lean tool, literacy, packing, productivity
Procedia PDF Downloads 233
17069 Creation of a Mentoring Program for Improving the Education of Industrial Engineers
Authors: Maria Da Glória Diniz De Almeida, Andreia M. P. Salgado
Abstract:
This paper presents the creation of a mentoring program to be applied in developing future junior industrial engineers for professional practice. Its objective is to contribute to better professional performance as engineers. It is a case study of the RIP region (comprising the cities of Resende, Itatiaia, and Porto Real), an industrial area in Rio de Janeiro State, Brazil. As a result, 87% of mentors and mentees approved the program as efficient, based on the initial targets.
Keywords: mentoring program, mentors and mentees, student professional development, young engineers education
Procedia PDF Downloads 466
17068 Affective Robots: Evaluation of Automatic Emotion Recognition Approaches on a Humanoid Robot towards Emotionally Intelligent Machines
Authors: Silvia Santano Guillén, Luigi Lo Iacono, Christian Meder
Abstract:
One of the main aims of current social robotics research is to improve robots' abilities to interact with humans. In order to achieve an interaction similar to that among humans, robots should be able to communicate in an intuitive and natural way and appropriately interpret human affects during social interactions. Similarly to how humans are able to recognize emotions in other humans, machines are capable of extracting information from the various ways humans convey emotions, including facial expression, speech, gesture, or text, and using this information for improved human-computer interaction. This can be described as Affective Computing, an interdisciplinary field that expands into otherwise unrelated fields like psychology and cognitive science and involves the research and development of systems that can recognize and interpret human affects. Leveraging these emotional capabilities by embedding them in humanoid robots is the foundation of the concept of Affective Robots, whose objective is to make robots capable of sensing the user's current mood and personality traits and adapting their behavior in the most appropriate manner. In this paper, the emotion recognition capabilities of the humanoid robot Pepper are experimentally explored, based on facial expressions for the so-called basic emotions, and compared against other state-of-the-art approaches, using both expression databases compiled in academic environments and real subjects showing posed expressions as well as spontaneous emotional reactions. The experiments' results show that the detection accuracy amongst the evaluated approaches differs substantially. The introduced experiments offer a general structure and approach for conducting such experimental evaluations.
The paper further suggests that the most meaningful results are obtained by conducting experiments with real subjects expressing the emotions as spontaneous reactions.
Keywords: affective computing, emotion recognition, humanoid robot, human-robot interaction (HRI), social robots
Procedia PDF Downloads 235
17067 Investigation of Some Flotation Parameters and the Role of Dispersants in the Flotation of Chalcopyrite
Authors: H. A. Taner, V. Önen
Abstract:
A suitable choice of flotation parameters and reagents has a strong effect on the effectiveness of the flotation process. The objective of this paper is to give an overview of the flotation of chalcopyrite under different conditions and with different dispersants. Flotation parameters such as grinding time, pH, and the type and dosage of dispersant were investigated. In order to understand the interaction of some dispersants, sodium silicate, sodium hexametaphosphate, and sodium polyphosphate were used. The optimum results were obtained at a pH of 11.5 and a grinding time of 10 minutes. A copper concentrate assaying 29.85% CuFeS2 was produced with a flotation recovery of 65.97% under optimum rougher flotation conditions with sodium silicate.
Keywords: chalcopyrite, dispersant, flotation, reagent
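Recovery figures like the 65.97% quoted above are conventionally computed with the standard two-product formula from feed, concentrate, and tailings grades. A minimal sketch; the feed and tailings grades below are illustrative assumptions chosen so the result lands near the reported value, not data from the paper.

```python
def recovery(f, c, t):
    """Two-product flotation recovery (%) from feed (f), concentrate (c) and
       tailings (t) grades: R = 100 * c * (f - t) / (f * (c - t))."""
    return 100.0 * c * (f - t) / (f * (c - t))

# Assumed grades: 2.0% chalcopyrite feed, 0.712% tailings; 29.85% is the
# concentrate grade reported in the abstract.
print(f"{recovery(f=2.0, c=29.85, t=0.712):.2f} % recovery")
```

The formula follows from a mass balance: the concentrate-to-feed mass ratio is (f - t)/(c - t), and recovery is that ratio scaled by the grade ratio c/f.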
Procedia PDF Downloads 182
17066 Modification of Rubber Swab Tool with Brush to Reduce Rubber Swab Fraction Fishing Time
Authors: T. R. Hidayat, G. Irawan, F. Kurniawan, E. H. I. Prasetya, Suharto, T. F. Ridwan, A. Pitoyo, A. Juniantoro, R. T. Hidayat
Abstract:
Swabbing is an activity that lifts fluid from inside the well using a sand line, either to determine the fluid influx after perforation or to reduce the fluid level in order to create a difference between the formation pressure and the hydrostatic pressure in the well for underbalanced perforation. During swab activity, problems frequently occur with the rubber swab: it often breaks and becomes a fish inside the well. This rubber swab fishing activity makes the rig operation take longer, delays the swab result data, and creates potential well-operation losses for the company. The average time needed for fishing the fractions of a rubber swab, plus the swab work itself, is 42 hours. The innovation made for this problem is a modification of the rubber swab tool. The tool is modified by providing a series of brushes at its end with a threaded connection in order to improve work safety: when the rubber swab breaks, the broken swab is lifted by the brush underneath, which reduces the time lost to rubber swab fishing. This tool has been applied, and it is proven that with this rubber swab tool modification, the rig operation becomes more efficient because the rubber swab fishing activity is not carried out; the fish fractions of the rubber swab are lifted up to the surface. Therefore, fuel costs are saved and well production potential is obtained. The average time to do the swab work after the application of this modified tool is 8 hours.
Keywords: rubber swab, swab modification, brush, fishing rubber swab, cost saving
Procedia PDF Downloads 167
17065 Genetically Encoded Tool with Time-Resolved Fluorescence Readout for the Calcium Concentration Measurement
Authors: Tatiana R. Simonyan, Elena A. Protasova, Anastasia V. Mamontova, Eugene G. Maksimov, Konstantin A. Lukyanov, Alexey M. Bogdanov
Abstract:
Here, we describe two variants of calcium indicators based on the GCaMP sensitive core and the BrUSLEE fluorescent protein (GCaMP-BrUSLEE and GCaMP-BrUSLEE-145). In contrast to the conventional GCaMP6-family indicators, these fluorophores are characterized by the well-marked responsiveness of their fluorescence decay kinetics to the external calcium concentration, both in vitro and in cellulo. Specifically, we show that purified GCaMP-BrUSLEE and GCaMP-BrUSLEE-145 exhibit three-component fluorescence decay kinetics, with the amplitude-normalized lifetime component (t3*A3) of GCaMP-BrUSLEE-145 changing four-fold (500-2000 a.u.) in response to a Ca²⁺ concentration shift in the range of 0-350 nM. Time-resolved fluorescence microscopy of live cells displays a two-fold change of the GCaMP-BrUSLEE-145 mean lifetime upon histamine-stimulated calcium release. The aforementioned Ca²⁺ dependence calls for considering GCaMP-BrUSLEE-145 as a prospective Ca²⁺ indicator with signal read-out in the time domain.
Keywords: calcium imaging, fluorescence lifetime imaging microscopy, fluorescent proteins, genetically encoded indicators
Procedia PDF Downloads 158
17064 GAILoc: Improving Fingerprinting-Based Localization System Using Generative Artificial Intelligence
Authors: Getaneh Berie Tarekegn
Abstract:
A precise localization system is crucial for many artificial intelligence Internet of Things (AI-IoT) applications in the era of smart cities, including traffic monitoring, emergency alarming, environmental monitoring, location-based advertising, intelligent transportation, and smart health care. The most common method for providing continuous positioning services in outdoor environments is a global navigation satellite system (GNSS). Due to non-line-of-sight propagation, multipath, and weather conditions, GNSS systems do not perform well in dense urban, urban, and suburban areas. This paper proposes a generative AI-based positioning scheme for large-scale wireless settings using fingerprinting techniques. We present a novel semi-supervised deep convolutional generative adversarial network (S-DCGAN)-based radio map construction method for real-time device localization. We also employ a reliable signal fingerprint feature extraction method with t-distributed stochastic neighbor embedding (t-SNE), which extracts dominant features while eliminating noise from hybrid WLAN and long-term evolution (LTE) fingerprints. The proposed scheme reduced the workload of site surveying required to build the fingerprint database by up to 78.5% and significantly improved positioning accuracy. The results show that the average positioning error of GAILoc is less than 39 cm, and more than 90% of the errors are less than 82 cm. That is, the numerical results proved that, in comparison to traditional methods, the proposed GAILoc method can significantly improve positioning performance and reduce radio map construction costs.
Keywords: location-aware services, feature extraction technique, generative adversarial network, long short-term memory, support vector machine
Procedia PDF Downloads 75
17063 Removal of Vanadium from Industrial Effluents by Natural Ion Exchanger
Authors: Shashikant R. Kuchekar, Haribhau R. Aher, Priti M. Dhage
Abstract:
The removal of vanadium from aqueous solution using a natural exchanger was investigated. The effects of pH, contact time, and exchanger dose were studied at ambient temperature (25 °C ± 2 °C). The equilibrium process was described by the Langmuir isotherm model with adsorption capacity for vanadium. The natural exchanger, i.e., Tamarindus indica seed powder, was treated with formaldehyde and sulphuric acid to increase the adsorptivity of metals. The maximum exchange level attained was 80.1% at pH 3 with an exchanger dose of 5 g and a contact time of 60 min. The method was applied for the removal of vanadium from industrial effluents.
Keywords: industrial effluent, natural ion exchange, Tamarindus indica, vanadium
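The Langmuir isotherm invoked above can be sketched numerically. The q_max and K values below are illustrative placeholders, not parameters fitted in the study:

```python
# Langmuir isotherm: q = q_max * K * C / (1 + K * C).
# q_max and K below are illustrative, not values from the study.

def langmuir(c, q_max, k):
    """Equilibrium uptake q (mg/g) at residual concentration c (mg/L)."""
    return q_max * k * c / (1.0 + k * c)

# Illustrative parameters: monolayer capacity 10 mg/g, affinity 0.5 L/mg.
q_max, k = 10.0, 0.5

# Uptake rises with concentration and saturates toward q_max.
uptakes = [langmuir(c, q_max, k) for c in (0.1, 1.0, 10.0, 100.0)]
```

Fitting q_max and K to measured equilibrium data (e.g. by linearising C/q against C) is how the adsorption capacity reported in such studies is typically obtained.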
Procedia PDF Downloads 251
17062 Poly-ε-Caprolactone Nanofibers with Synthetic Growth Factor Enriched Liposomes as Controlled Drug Delivery System
Authors: Vera Sovkova, Andrea Mickova, Matej Buzgo, Karolina Vocetkova, Eva Filova, Evzen Amler
Abstract:
PCL (poly-ε-caprolactone) nanofibrous scaffolds with adhered liposomes were prepared and tested as a possible drug delivery system for various synthetic growth factors. TGFβ, bFGF, and IGF-I have been shown to increase hMSC (human mesenchymal stem cell) proliferation and to induce hMSC differentiation. Functionalized PCL nanofibers were prepared with synthetic growth factors encapsulated in liposomes adhered to them in three different concentrations; other samples contained PCL nanofibers with free synthetic growth factors adhered directly. A medium free of synthetic growth factors served as a control. The interaction of liposomes with the PCL nanofibers was visualized by SEM, and the release kinetics were determined by ELISA testing. The potential of liposomes immobilized on the biodegradable scaffolds as a delivery system for synthetic growth factors, and as a suitable system for MSC adhesion, proliferation, and differentiation in vitro, was evaluated by MTS assay, dsDNA quantification, confocal microscopy, flow cytometry, and real-time PCR. The results showed that the growth factors adhered to the PCL nanofibers stimulated cell proliferation mainly up to day 11, after which their effect declined. By contrast, the release of the lowest concentration of growth factors from liposomes resulted in gradual proliferation of MSCs throughout the experiment. Moreover, liposomes, as well as free growth factors, stimulated type II collagen production, which was confirmed by immunohistochemical staining using a monoclonal antibody against type II collagen. The results of this study indicate that growth-factor-enriched liposomes adhered to the surface of PCL nanofibers could be useful as a drug delivery instrument for short-timescale applications and could be combined with nanofiber scaffolds to promote local, persistent delivery while mimicking the local microenvironment.
This work was supported by project LO1508 from the Ministry of Education, Youth and Sports of the Czech Republic.
Keywords: drug delivery, growth factors, hMSC, liposomes, nanofibres
Procedia PDF Downloads 290
17061 A Mathematical Model of Blood Perfusion Dependent Temperature Distribution in Transient Case in Human Dermal Region
Authors: Yogesh Shukla
Abstract:
Many attempts have been made to study the temperature distribution problem in human tissues under normal environmental and physiological conditions at constant arterial blood temperature, but very few have investigated the temperature distribution in human tissues under different arterial blood temperatures. In view of the above, a finite element model has been developed to study unsteady temperature distribution in the dermal region of the human body. The model has been developed for the one-dimensional unsteady-state case. The variation in parameters like thermal conductivity, blood mass flow, and metabolic activity with respect to position and time has been incorporated in the model, and appropriate boundary conditions have been framed. The central difference approach has been used in the space variable, and the trapezoidal rule has been employed along the time variable. Numerical results have been obtained to study the relationship between temperature and time.
Keywords: rate of metabolism, blood mass flow rate, thermal conductivity, heat generation, finite element method
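The discretisation described above, central differences in the space variable combined with the trapezoidal rule in time, is the Crank-Nicolson scheme. A minimal sketch for a constant-coefficient 1D diffusion equation follows; the grid, diffusivity, and boundary temperatures are illustrative, and the paper's model additionally varies conductivity, blood flow, and metabolism with position and time:

```python
import numpy as np

# Crank-Nicolson step for dT/dt = alpha * d2T/dx2 on a 1D grid:
# central differences in space, trapezoidal rule in time.
# Parameters below are illustrative stand-ins, not the paper's values.

def crank_nicolson_step(T, alpha, dx, dt, T_left, T_right):
    n = len(T)
    r = alpha * dt / (2.0 * dx * dx)
    A = np.zeros((n, n))
    b = np.zeros(n)
    # Dirichlet boundaries: skin surface and body core held at fixed temperatures.
    A[0, 0] = A[-1, -1] = 1.0
    b[0], b[-1] = T_left, T_right
    for i in range(1, n - 1):
        # Implicit half of the trapezoidal average on the left-hand side ...
        A[i, i - 1], A[i, i], A[i, i + 1] = -r, 1.0 + 2.0 * r, -r
        # ... explicit half on the right-hand side.
        b[i] = r * T[i - 1] + (1.0 - 2.0 * r) * T[i] + r * T[i + 1]
    return np.linalg.solve(A, b)

# Tissue slab initially at 37 C, surface suddenly cooled to 23 C.
T = np.full(21, 37.0)
for _ in range(50):
    T = crank_nicolson_step(T, alpha=1e-3, dx=0.05, dt=1.0,
                            T_left=23.0, T_right=37.0)
```

The scheme is unconditionally stable and second-order accurate in both space and time, which is why this pairing of central differences with the trapezoidal rule is a common choice for transient bioheat problems.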
Procedia PDF Downloads 353
17060 Analysis of Maternal Death Surveillance and Response: Causes and Contributing Factors in Addis Ababa, Ethiopia, 2022
Authors: Sisay Tiroro Salato
Abstract:
Background: Ethiopia has been implementing the maternal death surveillance and response system to provide real-time actionable information, including causes of death and contributing factors. An analysis of maternal mortality surveillance data was conducted to identify the causes and underlying factors in Addis Ababa, Ethiopia. Methods: We carried out a retrospective analysis of surveillance data on 324 maternal deaths reported in Addis Ababa, Ethiopia, from 2017 to 2021. The data were extracted from the national maternal death surveillance and response database, including information from case investigation, verbal autopsy, and facility extraction forms. The data were analyzed by computing frequencies and presented as numbers, proportions, and ratios. Results: Of the 324 maternal deaths, 92% occurred in health facilities, 6.2% in transit, and 1.5% at home. The mean age at death was 28 years, ranging from 17 to 45. The maternal mortality ratio per 100,000 live births was 77 for the five years, ranging from 126 in 2017 to 21 in 2021. Direct and indirect causes were responsible for 87% and 13% of deaths, respectively. The direct causes included obstetric haemorrhage, hypertensive disorders in pregnancy, puerperal sepsis, embolism, obstructed labour, and abortion. The third delay (delay in receiving care after reaching health facilities) accounted for 57% of deaths, while the first delay (delay in deciding to seek health care) and the second delay (delay in reaching health facilities) accounted for 34% and 24%, respectively. Late arrival at the referral facility, delayed management after admission, and non-recognition of danger signs were underlying factors. Conclusion: Over 86% of maternal deaths were attributable to avoidable direct causes. The majority of women do try to reach health services when an emergency occurs, but the third delay presents a major problem.
Improving the quality of care at the healthcare facility level will help to reduce maternal deaths.
Keywords: maternal death, surveillance, delays, factors
Procedia PDF Downloads 113
17059 Reduction of Biofilm Formation in Closed Circuit Cooling Towers
Authors: Irfan Turetgen
Abstract:
Closed-circuit cooling towers are cooling units that operate according to the indirect cooling principle. Unlike the open-loop cooling tower, which cools water directly over plastic fill material, the closed-circuit tower contains a closed-loop, water-operated heat exchanger in place of the fill. The main purpose of this heat exchanger is to prevent the cooled process water from contacting the external environment. The hot process water is cooled by the air flow and the tower's circulating water as it passes through the pipe. Closed-circuit towers are now more commonly used than open-loop cooling towers that provide cooling with plastic fill material. As with all surfaces in contact with water, a biofilm forms on the outer surface of the pipe. Although biofilm has been studied very well on plastic surfaces in open-loop cooling towers, no studies were found on the biofilm layer formed on the heat exchangers of closed-circuit towers. In the present study, natural biofilm formation on the heat exchangers of a closed-loop tower was observed for 6 months. At the same time, a nano-silica coating, which is known to reduce biofilm formation, was applied, and the two different surfaces were compared in terms of biofilm formation potential. Test surfaces were placed into a biofilm reactor along with untreated control coupons for a period of up to 6 months for biofilm maturation. Natural bacterial communities were monitored in order to mimic real-life conditions. Surfaces were analyzed in situ monthly for their microbial load using epifluorescence microscopy. Wettability is known to play a key role in biofilm formation on surfaces because surface properties affect bacterial adhesion. Results showed that surface conditioning with nano-silica significantly reduces biofilm formation (by up to 90%).
This easy coating process is a facile and low-cost method to prepare a hydrophobic surface without any expensive compounds or methods.
Keywords: biofilms, cooling towers, fill material, nano silica
Procedia PDF Downloads 129
17058 On the Dwindling Supply of the Observable Cosmic Microwave Background Radiation
Authors: Jia-Chao Wang
Abstract:
The cosmic microwave background radiation (CMB) freed during the recombination era can be considered as a photon source of small duration; a one-time event that happened everywhere in the universe simultaneously. If space is divided into concentric shells centered at an observer's location, one can imagine that the CMB photons originating from nearby shells would reach and pass the observer first, and those in shells farther away would follow as time goes forward. In the Big Bang model, space expands rapidly in a time-dependent manner as described by the scale factor. This expansion results in an event horizon coincident with one of the shells, and its radius can be calculated using cosmological calculators available online. Using Planck 2015 results, its value during the recombination era at cosmological time t = 0.379 million years (My) is calculated to be Revent = 56.95 million light-years (Mly). The event horizon sets a boundary beyond which the freed CMB photons will never reach the observer. The photons within the event horizon also exhibit a peculiar behavior. Calculated results show that the CMB observed today was freed in a shell located 41.8 Mly away (inside the boundary set by Revent) at t = 0.379 My. These photons traveled 13.8 billion years (Gy) to reach here. Similarly, the CMB reaching the observer at t = 1, 5, 10, 20, 40, 60, 80, 100 and 120 Gy is calculated to originate at shells of R = 16.98, 29.96, 37.79, 46.47, 53.66, 55.91, 56.62, 56.85 and 56.92 Mly, respectively. The results show that as time goes by, the R value approaches Revent = 56.95 Mly but never exceeds it, consistent with the earlier statement that beyond Revent the freed CMB photons will never reach the observer. The difference Revent - R can be used as a measure of the remaining observable CMB photons. Its value becomes smaller and smaller as R approaches Revent, indicating a dwindling supply of the observable CMB radiation.
In this paper, detailed dwindling effects near the event horizon are analyzed with the help of online cosmological calculators based on the lambda cold dark matter (ΛCDM) model. It is demonstrated in the literature that if the CMB is assumed to be a blackbody at recombination (about 3000 K), then it will remain so over time under cosmological redshift and homogeneous expansion of space, but with the temperature lowered (2.725 K now). The present result suggests that the observable CMB photon density, besides changing with space expansion, can also be affected by the dwindling supply associated with the event horizon. This raises the question of whether the blackbody spectrum of the CMB at recombination can remain so over time. Being able to explain the blackbody nature of the observed CMB is an important part of the success of the Big Bang model. The present results cast some doubt on that and suggest that the model may have an additional challenge to deal with.
Keywords: blackbody of CMB, CMB radiation, dwindling supply of CMB, event horizon
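The shell radii quoted above can be checked numerically: as the observation time t grows, R approaches Revent from below, so the gap Revent - R, the stated measure of the remaining observable CMB, shrinks toward zero. A small sketch using the values from the abstract:

```python
# Numeric check of the shell radii quoted in the abstract: the gap
# R_event - R shrinks monotonically toward zero as observation time grows.

R_EVENT = 56.95  # Mly, event-horizon radius at recombination (Planck 2015)

# (t in Gy, R in Mly) pairs as calculated in the abstract.
shells = [(1, 16.98), (5, 29.96), (10, 37.79), (20, 46.47), (40, 53.66),
          (60, 55.91), (80, 56.62), (100, 56.85), (120, 56.92)]

# Remaining observable supply at each epoch; positive and strictly shrinking.
gaps = [R_EVENT - r for _, r in shells]
```

Every gap is positive (no shell ever crosses Revent) and the sequence decreases monotonically, which is the dwindling-supply behavior the paper describes.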
Procedia PDF Downloads 119
17057 Cotton Fiber Quality Improvement by Introducing Sucrose Synthase (SuS) Gene into Gossypium hirsutum L.
Authors: Ahmad Ali Shahid, Mukhtar Ahmed
Abstract:
The demand for long-staple fiber with better strength and length is increasing with the introduction of a modern spinning and weaving industry in Pakistan. Work on gene discovery from developing cotton fibers has helped to identify dozens of genes that take part in cotton fiber development, and several genes have been characterized for their role in fiber development. Sucrose synthase (SuS) is a key enzyme in the metabolism of sucrose in a plant cell; in cotton fiber, it catalyzes a reversible reaction but preferentially converts sucrose and UDP into fructose and UDP-glucose. UDP-glucose (UDPG) is a nucleotide sugar that acts as a donor of glucose residues in many glycosylation reactions; it is essential for the cytosolic formation of sucrose and is involved in the synthesis of cell wall cellulose. The study focused on successful Agrobacterium-mediated stable transformation of the SuS gene, carried in pCAMBIA 1301, into cotton under the CaMV35S promoter. Integration and expression of the gene were confirmed by PCR, GUS assay, and real-time PCR. Young leaves of SuS-overexpressing lines showed increased total soluble sugars and plant biomass compared to non-transgenic control plants. The cellulose content of the fiber was significantly increased. SEM analysis revealed that fibers from transgenic cotton were highly spiral and that the number of fiber twists per unit length increased compared with the control. Morphological data from field plants showed that transgenic plants performed better under field conditions. Incorporation of genes related to cotton fiber length and quality can provide new avenues for fiber improvement. The utilization of this technology would provide efficient import substitution and sustained production of long-staple fiber in Pakistan to fulfill industrial requirements.
Keywords: Agrobacterium-mediated transformation, cotton fiber, sucrose synthase gene, staple length
Procedia PDF Downloads 233
17056 Bianchi Type-I Viscous Fluid Cosmological Models with Stiff Matter and Time Dependent Λ-Term
Authors: Rajendra Kumar Dubey
Abstract:
Einstein’s field equations with a variable cosmological term Λ are considered in the presence of a viscous fluid for the Bianchi type-I space-time. Exact solutions of Einstein’s field equations are obtained by assuming the cosmological term Λ proportional to R⁻ᵐ (where R is the scale factor and m is a constant). We observed that the shear viscosity is responsible for faster removal of the initial anisotropy in the universe. The physical significance of the cosmological models has also been discussed.
Keywords: Bianchi type-I cosmological model, viscous fluid, cosmological constant Λ
Procedia PDF Downloads 528
17055 Expression of DNMT Enzymes-Regulated miRNAs Involving in Epigenetic Event of Tumor and Margin Tissues in Patients with Breast Cancer
Authors: Fatemeh Zeinali Sehrig
Abstract:
Background: miRNAs play an important role in the post-transcriptional regulation of genes, including genes involved in DNA methylation (DNMTs), and are also important regulators of oncogenic pathways. The study of microRNAs and DNMTs in breast cancer allows the development of targeted treatments and early detection of this cancer. Methods and Materials: Clinical patients and samples: Institutional guidelines, including ethical approval and informed consent, were followed under the approval of the Ethics Committee (ethics code: IR.IAU.TABRIZ.REC.1401.063) of Tabriz Azad University, Tabriz, Iran. In this study, tissues of 100 patients with breast cancer and tissues of 100 healthy women were collected from Noor Nejat Hospital in Tabriz. The basic characteristics of the patients with breast cancer included: 1) tumor grade (grade 3 = 5%, grade 2 = 87.5%, grade 1 = 7.5%); 2) lymph node involvement (yes = 87.5%, no = 12.5%); 3) family cancer history (yes = 47.5%, no = 41.3%, unknown = 11.2%); 4) abortion history (yes = 36.2%). In silico methods (data gathering, processing, and network building): Gene Expression Omnibus (GEO), a high-throughput genomic database, was queried for miRNA expression profiles in breast cancer. For the experimental protocol, tissue processing, total RNA isolation, complementary DNA (cDNA) synthesis, and quantitative real-time PCR (qRT-PCR) analysis were performed. Results: In the present study, we found significant (p-value < 0.05) changes in the expression levels of miRNAs and DNMTs in patients with breast cancer. In bioinformatics studies, the GEO microarray dataset, consistent with the qPCR results, showed decreased expression of miRNAs and increased expression of DNMTs in breast cancer.
Conclusion: According to the results of the present study, which showed decreased expression of miRNAs and increased expression of DNMTs in breast cancer, these genes can serve as important diagnostic and therapeutic biomarkers in breast cancer.
Keywords: Gene Expression Omnibus, microarray dataset, breast cancer, miRNA, DNMT (DNA methyltransferases)
Procedia PDF Downloads 35
17054 The Military and Motherhood: Identity and Role Expectation within Two Greedy Institutions
Authors: Maureen Montalban
Abstract:
The military is a predominantly male-dominated organisation with entrenched hierarchical and patriarchal norms. Since 1975, women have been allowed to continue active service in the Australian Defence Force during pregnancy and after the birth of a child; prior to this time, pregnancy was grounds for automatic termination. The military and the family, as institutions, make great demands on individuals with respect to their commitment, loyalty, time, and energy. This research explores, through a gender lens, what it means to serve in the Australian Army as a woman during a specific period of service: pregnancy, birth, and motherhood. It investigates the external demands faced by servicewomen who are mothers, whether from society, the Army, their teammates, their partners, or their children, and how they internally make sense of these demands with respect to their identities and roles as mothers, servicewomen, partners, and individuals. It also seeks to uncover how Australian Army servicewomen who are also mothers attempt to manage the dilemma of serving two greedy institutions when both expect and demand so much, and whether this is, in fact, an impossible dilemma.
Keywords: women's health, gender studies, military culture, identity
Procedia PDF Downloads 102
17053 Design and Performance Analysis of Resource Management Algorithms in Response to Emergency and Disaster Situations
Authors: Volkan Uygun, H. Birkan Yilmaz, Tuna Tugcu
Abstract:
This study focuses on the development and use of algorithms that address the issue of resource management in response to emergency and disaster situations. The presented system, named the Disaster Management Platform (DMP), takes the data from the data sources of service providers and distributes the incoming requests accordingly, both to manage load balancing and to minimize service time, which results in improved user satisfaction. Three resource management algorithms, which give different levels of importance to load balancing and service time, are proposed. The first is the Minimum Distance algorithm, which assigns the request to the closest resource. The second is the Minimum Load algorithm, which assigns the request to the resource with the minimum load. The last is the Hybrid algorithm, which combines the previous two approaches. The performance of the proposed algorithms is evaluated with respect to waiting time, success ratio, and maximum load ratio. The metrics are monitored from simulations to find the optimal scheme for different loads. Two different simulations are performed in the study: one time-based and the other lambda-based. The results indicate that the Minimum Load algorithm is generally the best in all metrics, whereas the Minimum Distance algorithm is the worst in all cases and in all metrics. The leading position in performance switches between the Minimum Load and the Hybrid algorithms as lambda values change.
Keywords: emergency and disaster response, resource management algorithm, disaster situations, disaster management platform
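The three assignment policies can be sketched as follows. The resource and request representations here are illustrative stand-ins, not the DMP's actual data model:

```python
import math

# Illustrative sketch of the three assignment policies: each resource has a
# location and a current load; each request carries only a location.

def minimum_distance(resources, request):
    # Assign to the closest resource, ignoring load.
    return min(resources, key=lambda r: math.dist(r["loc"], request))

def minimum_load(resources, request):
    # Assign to the least-loaded resource, ignoring distance.
    return min(resources, key=lambda r: r["load"])

def hybrid(resources, request, w=0.5):
    # Weighted combination of normalised distance and normalised load.
    max_d = max(math.dist(r["loc"], request) for r in resources) or 1.0
    max_l = max(r["load"] for r in resources) or 1.0
    return min(resources,
               key=lambda r: w * math.dist(r["loc"], request) / max_d
                             + (1 - w) * r["load"] / max_l)

resources = [{"name": "A", "loc": (0, 0), "load": 9},
             {"name": "B", "loc": (5, 5), "load": 1}]
request = (1, 1)
```

With these values, Minimum Distance picks the nearby but heavily loaded resource A, Minimum Load picks the distant but idle resource B, and the Hybrid policy trades the two criteria off via the weight w.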
Procedia PDF Downloads 338
17052 Effect of Temperature and Time on the Yield of Silica from Rice Husk Ash
Authors: Mohammed Adamu Musa, Shehu Saminu Babba
Abstract:
The technological trend toward waste utilization and cost reduction in industrial processing has attracted the use of rice husk as a value-added material. Both rice husk (RH) and rice husk ash (RHA) have been found suitable for a wide range of domestic as well as industrial applications. The purpose of this research is to produce high-grade sodium silicate from rice husk ash, considering the temperature and time of heating as the process variables. The experiment was performed by heating the rice husk at temperatures of 500 °C, 600 °C, 700 °C, and 800 °C for times of 60 min, 90 min, 120 min, and 150 min to obtain the ash. A 1.0 M aqueous sodium hydroxide solution was used to dissolve the silicate from the ash, which contained crude sodium silicate. In addition, the ash was neutralized by adding 5 M HCl until the pH reached 3.5 to give silica gel. At 600 °C and 120 min, 94.23% silica was obtained from the RHA. At higher temperatures (700 °C and 800 °C), the percentage yield of silica was reduced due to surface melting and carbon fixation in the lattice caused by the presence of potassium. For this research, 600 °C is considered to be the optimum temperature for silica production from RHA. Silica produced from RHA can generate aggregate value and can be used in areas such as the pulp and paper, plastic, and rubber reinforcement industries.
Keywords: burning, rice husk, rice husk ash, silica, silica gel, temperature
Procedia PDF Downloads 243