Search results for: operation notes
248 Relationship Between Occupants' Behaviour And Indoor Air Quality In Malaysian Public Hospital Outpatient Department
Authors: Farha Ibrahim, Ely Zarina Samsudin, Ahmad Razali Ishak, Jeyanthini Sathasivam
Abstract:
Introduction: Indoor air quality (IAQ) has recently gained substantial traction as the airborne transmission of infectious respiratory disease has become an increasing public health concern. IAQ in the public hospital outpatient department (OPD) warrants special consideration, as it is the most visited department, and patients and staff alike are directly impacted by poor IAQ. However, there is limited evidence on IAQ in these settings. Moreover, occupants' behaviours, such as movement and the operation of doors, windows, and appliances, have been shown to significantly affect IAQ, yet their influence on IAQ in such settings has not been established. Objectives: This study aims to examine IAQ in Malaysian public hospital OPDs and assess its relationships with occupants' behaviour. Methodology: A multicentre cross-sectional study was conducted in which Johor public hospital OPDs (n=6) were stratified by building age and randomly sampled. IAQ measurements included indoor air temperature, relative humidity (RH), air velocity (AV), carbon dioxide (CO2), total bacterial count (TBC) and total fungal count (TFC). Occupants' behaviours were assessed using observation forms. Descriptive statistics were used to characterize all study variables, whereas non-parametric Spearman rank correlation analysis was used to assess the correlation between IAQ and occupants' behaviour. Results: After adjusting for potential confounders, the study suggests that occupant movement, such as sitting quietly, is significantly correlated with AV in the new buildings (r 0.642, p-value 0.010), CO2 in the new (r 0.772, p-value <0.001) and old buildings (r -0.559, p-value 0.020), TBC in the new (r 0.747, p-value 0.001) and old buildings (r -0.559, p-value 0.020), and TFC in the new (r 0.777, p-value <0.001) and old buildings (r -0.485, p-value 0.049).
In addition, standing relaxed is correlated with indoor air temperature (r 0.823, p-value <0.001) in the new buildings, and with CO2 (r 0.559, p-value 0.020), TBC (r 0.559, p-value 0.020) and TFC (r -0.485, p-value 0.049) in the old buildings, while walking is correlated with AV in the new buildings (r -0.642, p-value 0.001), CO2 in the new (r -0.772, p-value <0.001) and old buildings (r 0.559, p-value 0.020), TBC in the new (r -0.747, p-value 0.001) and old buildings (r 0.559, p-value 0.020), and TFC in the old buildings (r -0.485, p-value 0.049). Indoor air temperature is significantly correlated with the number of doors kept open (r 0.522, p-value 0.046), frequency of door adjustments (r 0.753, p-value 0.001), number of windows kept open (r 0.522, p-value 0.046), number of air conditioners (AC) switched on (r 0.698, p-value 0.004) and frequency of AC adjustment (r 0.753, p-value 0.001) in the new hospital OPD buildings. AV is significantly correlated with the number of doors kept open (r 0.642, p-value 0.01), frequency of door adjustments (r 0.553, p-value 0.032), number of windows kept open (r 0.642, p-value 0.01), and frequency of AC adjustment, number of fans switched on, and frequency of fan adjustment (all with r 0.553, p-value 0.032) in the new buildings.
In the old hospital OPD buildings, the number of doors kept open is significantly correlated with CO₂ and TBC (both r -0.559, p-value 0.020) and TFC (r -0.495, p-value 0.049); frequency of door adjustment with CO₂ and TBC (both r -0.559, p-value 0.020) and TFC (r -0.495, p-value 0.049); number of windows kept open with CO₂ and TBC (both r 0.559, p-value 0.020) and TFC (r 0.495, p-value 0.049); frequency of window adjustment with CO₂ and TBC (both r -0.559, p-value 0.020) and TFC (r -0.495, p-value 0.049); number of AC switched on with CO₂ and TBC (both r -0.559, p-value 0.020) and TFC (r -0.495, p-value 0.049); frequency of AC adjustment with CO2 (r 0.559, p-value 0.020), TBC (r 0.559, p-value 0.020) and TFC (r -0.495, p-value 0.049); number of fans switched on with CO2 and TBC (both r 0.559, p-value 0.020) and TFC (r 0.495, p-value 0.049); and frequency of fan adjustment with CO2 and TBC (both r -0.559, p-value 0.020) and TFC (r -0.495, p-value 0.049). Conclusion: This study provides evidence on IAQ parameters in Malaysian public hospital OPDs and on significant factors that may be effective targets of prospective interventions, enabling stakeholders to develop appropriate policies and programs to mitigate IAQ issues in Malaysian public hospital OPDs.
Keywords: outpatient department, IAQ, occupants' practice, public hospital
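As an aside on method, the Spearman rank correlation reported above is obtained by ranking both variables and taking the Pearson correlation of the ranks. The sketch below is a minimal pure-Python illustration of that calculation; the data values are entirely hypothetical and are not taken from the study.

```python
def average_ranks(xs):
    """Assign ranks 1..n, averaging ranks over tied values."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    ranks = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        # extend j over a run of tied values
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average of ranks i+1 .. j+1
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman_rho(x, y):
    """Spearman rank correlation: Pearson correlation of the rank vectors."""
    rx, ry = average_ranks(x), average_ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Hypothetical illustration: a perfectly monotone relationship gives rho = 1.
print(round(spearman_rho([1, 2, 3, 4, 5], [2, 4, 6, 8, 10]), 6))  # 1.0
```

In practice a statistics package (e.g. SciPy's `spearmanr`) would also supply the p-values quoted in the abstract; the sketch only shows where the r values come from.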
Procedia PDF Downloads 93
247 The French Ekang Ethnographic Dictionary: The Quantum Approach
Authors: Henda Gnakate Biba, Ndassa Mouafon Issa
Abstract:
Dictionaries modeled on the Western model [tonic-accent languages] are not suitable for tonal languages and do not account for them phonologically, which is why this [prosodic and phonological] ethnographic dictionary was designed. It is a glossary that expresses the tones and the rhythm of words. It recreates exactly the speech or singing of a tonal language, and allows a non-speaker of the language to pronounce the words as if they were a native. It is a dictionary adapted to tonal languages. It was built from ethnomusicological theorems and phonological processes, following Jean-Jacques Rousseau's 1776 hypothesis that 'to say and to sing were once the same thing'. Each word in the French dictionary finds its corresponding word in Ekaη, and each Ekaη word is written on a musical staff. This ethnographic dictionary is also an inventive, original and innovative research thesis: a contribution to the theoretical, musicological, ethnomusicological and linguistic conceptualization of languages, giving rise to the practice of interlocution between the social and cognitive sciences, the activities of artistic creation, and the question of modeling in the human sciences: mathematics, computer science, translation automation and artificial intelligence. When this theory is applied to any folk-song text in a tonal language, one pieces together not only the exact melody, rhythm and harmonies of that song, as if one knew it in advance, but also the exact speech of the language. The author believes that the issue of the disappearance of tonal languages and their preservation has been structurally resolved, as well as one of the greatest cultural equations related to the composition and creation of tonal, polytonal and random music.
The experimentation confirming the theorization produced a semi-digital, semi-analog application which translates the tonal languages of Africa (about 2,100 languages) into blues, jazz, world music, polyphonic music, tonal and atonal music, and deterministic and random music. To test this application, I use music reading and writing software that allows me to collect the data extracted from my mother tongue, already modeled in the musical staves saved in the ethnographic (semiotic) dictionary for automatic translation (volume 2 of the book). Translation is done from writing to writing, from writing to speech, and from writing to music. Mode of operation: you type a structured song text (chorus-verse) on your computer and request from the machine a melody in blues, jazz, world music, variety, etc. The software runs, giving you the option to choose harmonies, and then you select your melody.
Keywords: music, language, entanglement, science, research
Procedia PDF Downloads 69
246 Forging a Distinct Understanding of Implicit Bias
Authors: Benjamin D Reese Jr
Abstract:
Implicit bias is understood as unconscious attitudes, stereotypes, or associations that can influence the cognitions, actions, decisions, and interactions of an individual without intentional control. These unconscious attitudes or stereotypes are often targeted toward specific groups of people based on their gender, race, age, perceived sexual orientation or other social categories. Since the late 1980s, there has been a proliferation of research hypothesizing that the operation of implicit bias is the result of the brain needing to process millions of bits of information every second. Hence, one's prior individual learning history provides 'shortcuts'. As soon as one sees someone of a certain race, one has immediate associations based on past learning, and one might make assumptions about their competence, skill, or danger. These assumptions are outside of conscious awareness. In recent years, an alternative conceptualization has been proposed. The 'bias of crowds' theory hypothesizes that a given context or situation influences the degree of accessibility of particular biases. For example, in certain geographic communities in the United States, there is a long-standing and deeply ingrained history of structures, policies, and practices that contribute to racial inequities and bias toward African Americans. Hence, negative biases toward African Americans are more accessible in such contexts or communities. This theory does not focus on individual brain functioning or cognitive 'shortcuts.' Therefore, attempts to modify individual perceptions or learning might have negligible impact on the embedded environmental systems or policies within certain contexts or communities.
From the 'bias of crowds' perspective, high levels of racial bias in a community can be reduced by making fundamental changes in structures, policies, and practices to create a more equitable context or community rather than focusing on training or education aimed at reducing an individual's biases. The current paper acknowledges and supports the foundational role of long-standing structures, policies, and practices that maintain racial inequities, as well as inequities related to other social categories, and highlights the critical need to continue organizational, community, and national efforts to eliminate those inequities. It also makes a case for providing individual leaders with a deep understanding of the dynamics of how implicit biases impact cognitions, actions, decisions, and interactions so that those leaders might more effectively develop structural changes in the processes and systems under their purview. This approach incorporates both the importance of an individual's learning history as well as the important variables within the 'bias of crowds' theory. The paper also offers a model for leadership education, as well as examples of structural changes leaders might consider.
Keywords: implicit bias, unconscious bias, bias, inequities
Procedia PDF Downloads 6
245 Design, Construction, Validation and Use of a Novel Portable Fire Effluent Sampling Analyser
Authors: Gabrielle Peck, Ryan Hayes
Abstract:
Current large-scale fire tests focus on flammability and heat release measurements. Smoke toxicity is not considered, despite being a leading cause of death and injury in unwanted fires. A key reason could be that the practical difficulties associated with quantifying the individual toxic components present in a fire effluent often require specialist equipment and expertise. Fire effluent contains a mixture of unreactive and reactive gases, water, organic vapours and particulate matter, which interact with each other. This interferes with the operation of the analytical instrumentation, and the interferents must be removed without changing the concentration of the target analyte. To mitigate the need for expensive equipment and time-consuming analysis, a portable gas analysis system was designed, constructed and tested for use in large-scale fire tests as a simpler and more robust alternative to online FTIR measurements. The novel equipment aimed to: be easily portable and able to run on battery or mains electricity; be calibratable at the test site; be capable of quantifying CO, CO2, O2, HCN, HBr, HCl, NOx and SO2 accurately and reliably; be capable of independent data logging; be capable of automated switchover of 7 bubblers; withstand fire effluents; be simple to operate; allow individual bubbler times to be pre-set; and be capable of remote control. To test the analyser's functionality, it was used alongside the ISO/TS 19700 steady state tube furnace (SSTF). A series of tests was conducted, using PMMA and PA 6.6, to assess the validity of the box analyser measurements and the data logging abilities of the apparatus. The data obtained from the bench-scale assessments showed excellent agreement. Following this, the portable analyser was used to monitor gas concentrations during large-scale testing using the ISO 9705 room corner test.
The analyser was set up, calibrated and set to record smoke toxicity measurements in the doorway of the test room. It operated without manual interference and successfully recorded data for all 12 tests conducted in the ISO room tests. At the end of each test, the analyser created a data file (formatted as .csv) containing the measured gas concentrations throughout the test, which does not require specialist knowledge to interpret. This validated the portable analyser's ability to monitor fire effluent without operator intervention at both bench and large scale. The portable analyser is a validated and significantly more practical alternative to FTIR, proven to work in large-scale fire testing for quantification of smoke toxicity. The analyser is a cheaper, more accessible option to assess smoke toxicity, mitigating the need for expensive equipment and specialist operators.
Keywords: smoke toxicity, large-scale tests, ISO 9705, analyser, novel equipment
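The automated bubbler switchover described above is, in essence, a cumulative timing schedule: each of the 7 bubblers samples for its pre-set duration before the gas path moves to the next. The sketch below illustrates that idea only; the function name and the fixed durations are hypothetical and do not reflect the analyser's actual firmware.

```python
def switchover_schedule(durations_s):
    """Given pre-set sampling durations (seconds) for each bubbler,
    return (bubbler_number, start_s, end_s) tuples for the run."""
    schedule, t = [], 0
    for i, d in enumerate(durations_s, start=1):
        schedule.append((i, t, t + d))
        t += d
    return schedule

# Hypothetical run: 7 bubblers sampling for 60 s each.
for entry in switchover_schedule([60] * 7):
    print(entry)
```

Pre-setting individual bubbler times, as the abstract specifies, simply means the `durations_s` list need not be uniform.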
Procedia PDF Downloads 77
244 The Effect of Psychosocial, Behavioral and Disease-Specific Characteristics on Health-Related Quality of Life after Primary Surgery for Colorectal Cancer: A Cross-Sectional Study of a Regional Australian Population
Authors: Lakmali Anthony, Madeline Gillies
Abstract:
Background: Colorectal cancer (CRC) is usually managed with surgical resection. Many of the outcomes traditionally used to define successful operative management, such as resection margin, do not adequately reflect patients' experience. Patient-reported outcomes (PRO), such as health-related quality of life (HRQoL), provide a means by which the impact of surgery for cancer can be reported in a patient-centered way. HRQoL has previously been shown to be impacted by psychosocial, behavioral and disease-specific characteristics. This exploratory cross-sectional study aims to: (1) describe postoperative HRQoL in patients who underwent primary resection in a regional Australian hospital; (2) describe the prevalence of anxiety, depression and clinically significant fear of cancer recurrence (FCR) in this population; and (3) identify demographic, psychosocial, disease and treatment factors associated with poorer self-reported HRQoL. Methods: Consecutive patients who had resection of colorectal cancer in a single regional Australian hospital between 2015 and 2022 were eligible. Participants were asked to complete a survey instrument designed to assess HRQoL, as well as validated instruments assessing several other psychosocial PROs hypothesized to be associated with HRQoL: emotional distress, fear of cancer recurrence, social support, dispositional optimism, body image and spirituality. Demographic and disease-specific data were also collected via medical record review. Results: Forty-six patients completed the survey. Clinically significant levels of fear of recurrence, as well as emotional distress, were present in this group. Many domains of HRQoL were significantly worse than in an Australian reference population for CRC. Demographic and disease factors associated with poor HRQoL included smoking and ongoing adjuvant systemic therapy. The primary operation was not associated with HRQoL; however, the operative approach (laparoscopic vs.
open) was associated with HRQoL for these patients. All psychosocial factors measured were associated with HRQoL, including cancer worry, emotional distress, body image and dispositional optimism. Conclusion: HRQoL is an important outcome in surgery for both research and clinical practice. This study provides an overview of quality of life in a regional Australian population of postoperative colorectal cancer patients and the factors that affect it. Understanding HRQoL, together with awareness of patients particularly vulnerable to poor outcomes, should be used to aid the informed consent and shared decision-making process between surgeon and patient.
Keywords: surgery, colorectal, cancer, PRO, HRQoL
Procedia PDF Downloads 70
243 Placement Characteristics of Major Stream Vehicular Traffic at Median Openings
Authors: Tathagatha Khan, Smruti Sourava Mohapatra
Abstract:
Median openings are provided in the raised medians of multilane roads to facilitate U-turn movement. The U-turn is a highly complex and risky maneuver because the U-turning vehicle (minor stream) makes a 180° turn at the median opening and merges with the approaching through traffic (major stream). A U-turning vehicle requires a suitable gap in the major stream to merge, and during this process the possibility of merging conflict develops. These median openings are therefore potential hot spots of conflict and pose safety concerns. Traffic at median openings can be managed efficiently, and with enhanced safety, when the capacity of the facility has been estimated correctly. The capacity of U-turns at median openings is estimated by Harder's formula, which requires three basic parameters: critical gap, follow-up time and conflicting flow rate. The estimation of conflicting flow rate under mixed traffic conditions is complicated by the absence of lane discipline and the discourteous behavior of drivers. Understanding the placement of major stream vehicles at median openings is therefore essential for estimating the conflicting traffic faced by the U-turning movement. Placement data of major stream vehicles were collected at different sections of 4-lane and 6-lane divided multilane roads. All the test sections were free from the effects of intersections, bus stops, parked vehicles, curvature, pedestrian movements or any other side friction. For the purpose of analysis, vehicles were divided into 6 categories: motorized two-wheelers (2W), autorickshaws (3W), small cars, big cars, light commercial vehicles, and heavy vehicles. For the collection of placement data, the entire road width was divided into sections of 25 cm each, numbered seriatim from the pavement edge (curbside) to the end of the road.
The placement of each major stream vehicle crossing the reference line was recorded by videographic technique on various weekdays. The collected data for each category of vehicle at all test sections were converted into a frequency table with a class interval of 25 cm and plotted as a placement frequency curve. Separate distribution fittings were tried for 4-lane and 6-lane divided roads. The effect of major stream traffic volume on the placement characteristics of major stream vehicles was also explored. The findings of this study will be helpful in determining the conflict volume at median openings. The present work therefore holds significance for traffic planning, operation and design to alleviate bottlenecks, collision risk and delay at median openings in general, and at median openings in developing countries in particular.
Keywords: median opening, U-turn, conflicting traffic, placement, mixed traffic
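The conversion of lateral placement observations into 25 cm class-interval frequencies, as described above, can be sketched as follows; the sample placements and road width are hypothetical, purely to illustrate the binning step.

```python
def placement_frequency(placements_cm, road_width_cm, class_cm=25):
    """Bin lateral placements (cm from the curbside pavement edge)
    into class intervals of class_cm, returning counts per class."""
    n_classes = road_width_cm // class_cm
    freq = [0] * n_classes
    for p in placements_cm:
        idx = min(int(p // class_cm), n_classes - 1)  # clamp the far edge
        freq[idx] += 1
    return freq

# Hypothetical placements on a 100 cm strip -> classes 0-25, 25-50, 50-75, 75-100
print(placement_frequency([10, 30, 55, 70], 100))  # [1, 1, 2, 0]
```

The resulting counts per class are what the study plots as the placement frequency curve before distribution fitting.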
Procedia PDF Downloads 138
242 Healthcare Fire Disasters: Readiness, Response and Resilience Strategies: A Real-Time Experience of a Healthcare Organization of North India
Authors: Raman Sharma, Ashok Kumar, Vipin Koushal
Abstract:
Healthcare facilities are usually seen as havens of protection when managing external incidents, but the situation becomes more difficult and challenging when such facilities are themselves affected by internal hazards. Such internal hazards are arguably more disruptive than external incidents because the patients affected are vulnerable: they depend on supportive measures and are neither in a position to respond to such a crisis nor aware of how to respond. The situation becomes even more arduous and exigent if critical care areas such as intensive care units (ICUs) and operating rooms (OR) are involved, since the critically ill patients housed there are difficult to move at short notice. Healthcare organisations use many types of electrical equipment, flammable liquids, and medical gases, often at a single point of use; hence, any error can spark a fire. Even though healthcare facilities face many fire hazards, damage caused by smoke rather than flames is often more severe. Besides burns, smoke inhalation is the primary cause of fatality in fire-related incidents. The greatest cause of illness and mortality in fire victims, particularly in enclosed places, appears to be the inhalation of fire smoke, which contains a complex mixture of gases in addition to carbon monoxide. Therefore, healthcare organizations are required to have a well-planned disaster mitigation strategy and proactive, well-prepared manpower to cater for all exigencies resulting from internal as well as external hazards. This case report delineates a true OR fire incident in the emergency operating theatre (OT) of a tertiary care multispecialty hospital and details real-life evidence of the challenges encountered by OR staff in preserving both life and property.
No adverse event was reported during or after this fire incident; nonetheless, this case report aims to consolidate the lessons identified from the incident in a sequential and logical manner. Timely smoke evacuation, and prevention of smoke spread to adjoining patient care areas through appropriate measures, viz. compartmentation, pressurisation, dilution, ventilation, buoyancy, and airflow, helped to reduce smoke-related fatalities. Henceforth, precautionary measures may be implemented to mitigate such incidents. Careful coordination, continuous training, and fire drill exercises can improve overall outcomes and minimize the possibility of these potentially fatal problems, thereby making a safer healthcare environment for every worker and patient.
Keywords: healthcare, fires, smoke, management, strategies
Procedia PDF Downloads 68
241 The Spatial Circuit of the Audiovisual Industry in Argentina: From Monopoly and Geographic Concentration to New Regionalization and Democratization Policies
Authors: André Pasti
Abstract:
Historically, the communication sector in Argentina has been characterized by intense monopolization and geographical concentration in the city of Buenos Aires. In 2000, the four major media conglomerates in operation (Clarín, Telefónica, America and Hadad) controlled 84% of the national media market. By 2009, new policies had been implemented as a result of civil society organizations' demands. Legally, a new regulatory framework was approved: Law 26,522 on Audiovisual Communication Services. These policies were intended to create new conditions for the development of the audiovisual economy in the territory of Argentina. The regionalization of audiovisual production and the democratization of channels and access to media were among the priorities. This paper analyses the main changes and continuities in the organization of the spatial circuit of the audiovisual industry in Argentina provoked by these new policies, which aim to increase the diversity of audiovisual producers and promote regional audiovisual industries. For this purpose, a national program for the development of audiovisual centers within the country was created. This program fostered a federalized production network based on nine audiovisual regions and 40 nodes. Each node created the technical, financial and organizational conditions to gather different actors in audiovisual production, such as SMEs, social movements and local associations. The expansion of access to technical networks was also a concern of other policies, such as 'Argentina Connected', whose objective was to expand access to broadband Internet. The Open Digital Television network also received considerable investment. Furthermore, measures were carried out to impose limits on the concentration of ownership, to eliminate oligopolies and to ensure more competition in the sector. These actions were intended to force a division of the media conglomerates into smaller groups.
Nevertheless, the corporations that compose these conglomerates resisted strongly, making full use of their economic and judicial power. Indeed, the absence of effective impact of such measures is attested by the fact that the audiovisual industry remains strongly concentrated in Argentina. Overall, these new policies were properly designed to decentralize audiovisual production and expand the regional diversity of the audiovisual industry. However, the effective transformation of the organization of the audiovisual circuit in the territory faced several forms of resistance. This can be explained first and foremost by the ideological and economic power of the media conglomerates. Second, there is an inherited inertia from the unequal distribution of the objects needed for audiovisual production and consumption. Lastly, the resistance also rests on financial needs and on the excessive dependence on the state for the promotion of regional audiovisual production.
Keywords: Argentina, audiovisual industry, communication policies, geographic concentration, regionalization, spatial circuit
Procedia PDF Downloads 216
240 Measuring Oxygen Transfer Coefficients in Multiphase Bioprocesses: The Challenges and the Solution
Authors: Peter G. Hollis, Kim G. Clarke
Abstract:
Accurate quantification of the overall volumetric oxygen transfer coefficient (KLa) is ubiquitous in bioprocesses; KLa is measured by analysing the response of dissolved oxygen (DO) to a step change in the oxygen partial pressure of the sparge gas using a DO probe. Typically, the response lag (τ) of the probe has been ignored in the calculation of KLa when τ is less than the reciprocal of KLa, failing which a constant τ has invariably been assumed. These conventions have now been reassessed in the context of multiphase bioprocesses, such as hydrocarbon-based systems, where significant variation of τ in response to changes in process conditions has been documented. Experiments were conducted in a 5 L baffled stirred tank bioreactor (New Brunswick) in a simulated hydrocarbon-based bioprocess comprising a C14-20 alkane-aqueous dispersion with suspended non-viable Saccharomyces cerevisiae solids. DO was measured with a polarographic DO probe fitted with a Teflon membrane (Mettler Toledo). The DO concentration response to a step change in the sparge gas oxygen partial pressure was recorded, from which KLa was calculated using a first-order model (without incorporation of τ) and a second-order model (incorporating τ). τ was determined as the time taken to reach 63.2% of the saturation DO after the probe was transferred from a nitrogen-saturated vessel to an oxygen-saturated bioreactor, and is represented as the inverse of the probe constant (KP). The relative effects of the process parameters on KP were quantified using a central composite design with factor levels typical of hydrocarbon bioprocesses, namely 1-10 g/L yeast, 2-20 vol% alkane and 450-1000 rpm. A response surface was fitted to the empirical data, while ANOVA was used to determine the significance of the effects with a 95% confidence interval. KP varied with changes in the system parameters, with the impact of solids loading statistically significant at the 95% confidence level.
Increased solids loading consistently reduced KP, an effect which was magnified at high alkane concentrations, with a minimum KP of 0.024 s-1 observed at the highest solids loading of 10 g/L. This KP was 2.8-fold lower than the maximum of 0.0661 s-1 recorded at 1 g/L solids, demonstrating a substantial increase in τ from 15.1 s to 41.6 s as a result of differing process conditions. Importantly, exclusion of KP from the calculation of KLa was shown to under-predict KLa for all process conditions, with an error of up to 50% at the highest KLa values. Accurate quantification of KLa, and therefore KP, has far-reaching impact on industrial bioprocesses, ensuring these systems are not transport-limited during scale-up and operation. This study has shown the incorporation of τ to be essential to ensure KLa measurement accuracy in multiphase bioprocesses. Moreover, since τ has been conclusively shown to vary significantly with process conditions, it is essential for τ to be determined individually for each set of process conditions.
Keywords: effect of process conditions, measuring oxygen transfer coefficients, multiphase bioprocesses, oxygen probe response lag
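The first- and second-order response models referred to above can be written explicitly. Assuming the standard textbook forms (the paper's exact equations are not reproduced here), the probe-free response to a step change is C/C* = 1 - exp(-KLa*t), while a first-order probe with constant KP reading that process gives the measured response C/C* = 1 - (KP*exp(-KLa*t) - KLa*exp(-KP*t))/(KP - KLa). A minimal numeric sketch, using the KP values reported in the abstract and a hypothetical KLa:

```python
import math

def do_first_order(t, kla):
    """Normalised DO response ignoring probe lag: C/C* = 1 - exp(-KLa*t)."""
    return 1.0 - math.exp(-kla * t)

def do_second_order(t, kla, kp):
    """Normalised measured response when a first-order probe (constant KP)
    reads a first-order process (KLa); requires kp != kla."""
    return 1.0 - (kp * math.exp(-kla * t) - kla * math.exp(-kp * t)) / (kp - kla)

# KP values from the abstract: 0.0661 s^-1 (1 g/L solids) vs 0.024 s^-1 (10 g/L);
# the KLa value below is hypothetical, for illustration only.
kla = 0.01
for kp in (0.0661, 0.024):
    # the slower the probe (smaller KP), the more the measured signal lags
    print(round(do_second_order(60.0, kla, kp), 3), round(do_first_order(60.0, kla), 3))
```

Fitting the first-order expression to a signal that actually follows the second-order one under-reads the response, which is consistent with the under-prediction of KLa noted above.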
Procedia PDF Downloads 266
239 Possibility of Membrane Filtration for Treatment of Effluent from Digestate
Authors: Marcin Debowski, Marcin Zielinski, Magdalena Zielinska, Paulina Rusanowska
Abstract:
The management of digestate is one of the most important factors influencing the development and operation of a biogas plant. Turbidity and bacterial contamination negatively affect the growth of algae, which can limit the use of the effluent in large-scale production of algal biomass. These problems can be overcome by cultivating algae species resistant to environmental factors, such as Chlorella sp. or Scenedesmus sp., or by reducing the load of organic compounds to prevent bacterial contamination. The effluent requires dilution and/or purification. One method of effluent treatment is membrane technology: microfiltration (MF), ultrafiltration (UF), nanofiltration (NF) or reverse osmosis (RO), depending on the membrane pore size and the cut-off point. Membranes are a physical barrier to solids and particles larger than the pore size. MF membranes have the largest pores and are used to remove turbidity, suspensions, bacteria and some viruses. UF membranes also remove colour, odour and organic compounds of high molecular weight. In the treatment of wastewater or other waste streams, MF and UF can provide a sufficient degree of purification. NF membranes are used to remove natural organic matter from water, water disinfection by-products and sulfates. RO membranes are applied to remove monovalent ions such as Na⁺ or K⁺. The UF permeate was used as a medium for the cultivation of two microalgae: Chlorella sp. and Phaeodactylum tricornutum. Growth rates of Chlorella sp. and P. tricornutum were similar: 0.216 d⁻¹ and 0.200 d⁻¹ (Chlorella sp.) and 0.128 d⁻¹ and 0.126 d⁻¹ (P. tricornutum) on synthetic medium and on permeate from UF, respectively. The final biomass composition was also similar, regardless of the medium. Removal of nitrogen was 92% and 71% by Chlorella sp. and P. tricornutum, respectively. The fermentation effluents after UF and dilution were also used for cultivation of the alga Scenedesmus sp.
that is resistant to environmental conditions. The authors recommend the development of biorefineries based on the production of algae for biogas production. There are examples of using a multi-stage membrane system to purify the liquid fraction of digestate. After initial UF, RO is used to remove ammonium nitrogen and COD. To obtain a permeate with an ammonium nitrogen concentration allowing discharge into the environment, it was necessary to apply three-stage RO. The composition of the permeate after two-stage RO was: COD 50–60 mg/dm³, dry solids 0 mg/dm³, ammonium nitrogen 300–320 mg/dm³, total nitrogen 320–340 mg/dm³, total phosphorus 53 mg/dm³. The composition of the permeate after three-stage RO, however, was: COD < 5 mg/dm³, dry solids 0 mg/dm³, ammonium nitrogen 0 mg/dm³, total nitrogen 3.5 mg/dm³, total phosphorus < 0.05 mg/dm³. The last stage of RO might be replaced by an ion exchange process. A negative aspect of membrane filtration systems is that the permeate is about 50% of the introduced volume; the remainder is retentate. Management of the retentate might involve recirculation to the biogas plant.
Keywords: digestate, membrane filtration, microalgae cultivation, Chlorella sp.
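The specific growth rates quoted above (e.g. 0.216 d⁻¹ for Chlorella sp.) are conventionally obtained from exponential-phase biomass as mu = ln(X2/X1)/(t2 - t1). A small sketch, assuming that standard formula (the paper's exact calculation method is not stated):

```python
import math

def specific_growth_rate(x1, x2, dt_days):
    """Specific growth rate mu (d^-1) from biomass densities x1, x2
    measured dt_days apart, assuming exponential growth."""
    return math.log(x2 / x1) / dt_days

# Hypothetical check: a culture growing at 0.216 d^-1 multiplies its
# biomass by exp(0.216) ≈ 1.24 per day.
x1 = 1.0
x2 = x1 * math.exp(0.216)
print(round(specific_growth_rate(x1, x2, 1.0), 3))  # 0.216
```

The same formula applied to the P. tricornutum figures (0.128 vs 0.126 d⁻¹) shows how close the synthetic-medium and UF-permeate cultures were.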
Procedia PDF Downloads 352
238 Study on the Rapid Start-up and Functional Microorganisms of the Coupled Process of Short-Range Nitrification and Anammox in Landfill Leachate Treatment
Authors: Lina Wu
Abstract:
The excessive discharge of nitrogen in sewage greatly intensifies the eutrophication of water bodies and poses a threat to water quality, and nitrogen pollution control has become a global concern. The problem of water pollution in China remains serious. As a typical high-ammonia-nitrogen organic wastewater, landfill leachate is more difficult to treat than domestic sewage because of its complex composition, high toxicity, and high concentration. Many studies have shown that autotrophic anammox bacteria in nature can combine nitrite and ammonium nitrogen without a carbon source, through their functional genes, to achieve total nitrogen removal, which is very suitable for removing nitrogen from leachate. In addition, the process saves considerable aeration energy compared with the traditional nitrogen removal process. Therefore, anammox plays an important role in nitrogen conversion and energy saving. A process composed of short-range nitrification and denitrification coupled with anammox ensures the removal of total nitrogen, improves removal efficiency, and meets society's need for an ecologically friendly and cost-effective nutrient removal technology. A continuous-flow process for treating mature leachate [an up-flow anaerobic sludge blanket reactor (UASB), anoxic/oxic (A/O)–anaerobic ammonia oxidation reactor (ANAOR or anammox reactor)] has been developed to achieve autotrophic deep nitrogen removal. In this process, optimal parameters such as hydraulic retention time and nitrification flow rate have been obtained and applied to the rapid start-up and stable operation of the system with high removal efficiency. Moreover, characterizing the microbial community during start-up of the anammox process system and analyzing its microbial ecological mechanism provide a basis for enriching the anammox community under high environmental stress.
One study developed partial nitrification-anammox (PN/A) using an internal circulation (IC) system and a biological aerated filter (BAF) biofilm reactor (IBBR), in which the water treated is closer in character to landfill leachate. However, high-throughput sequencing is still needed to analyze changes in the microbial diversity of this system, the related functional genera, and functional genes under optimal conditions, providing a theoretical and practical basis for the engineering application of novel anammox systems in biogas slurry treatment and resource utilization. Keywords: nutrient removal and recovery, leachate, anammox, partial nitrification
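The reason partial nitrification is coupled with anammox is stoichiometric: the commonly cited anammox reaction (roughly NH₄⁺ + 1.32 NO₂⁻ → N₂ + 0.26 NO₃⁻ + …, after Strous et al.) fixes the nitrite-to-ammonium ratio the anammox stage must receive. A minimal sketch (the 1.32 coefficient is the literature value, not one measured in this work):

```python
# Commonly cited anammox stoichiometric ratio: mol NO2- consumed per mol NH4+
NITRITE_PER_AMMONIUM = 1.32

def nitritation_fraction() -> float:
    """Fraction of influent ammonium that the partial-nitrification step must
    oxidize to nitrite so the anammox stage sees the right NO2-:NH4+ ratio."""
    return NITRITE_PER_AMMONIUM / (1 + NITRITE_PER_AMMONIUM)

print(round(nitritation_fraction(), 3))  # ~0.569, i.e. oxidize roughly 57%
```

This is why the upstream nitrification must be "short-range" (stopped at nitrite) and only partial, rather than complete.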
Procedia PDF Downloads 51
237 Working Capital Management Practices in Small Businesses in Victoria
Authors: Ranjith Ihalanayake, Lalith Seelanatha, John Breen
Abstract:
In this study, we explored current working capital management practices as applied in small businesses in Victoria, filling an existing theoretical and empirical gap in the literature in general and in Australia in particular. Amid the current globally competitive and dynamic environment, avoiding short-term insolvency is critical for the long-run survival of small businesses. A firm's short-term solvency depends on the availability of sufficient working capital to feed day-to-day operational activities. Given small businesses' reliance on short-term funding, it has been recognized that efficient management of working capital is crucial to the prosperity and survival of such firms. Against this background, this research attempts to understand the working capital management strategies and practices used by small-scale businesses. To this end, we conducted an internet survey of 220 small businesses operating in Victoria, Australia. The survey results suggest that the majority of respondents are owner-managers (73%) and male (68%). The most common education level among respondents is a degree (46%), and about half of respondents are more than 50 years old. Most respondents (64%) have more than ten years of business management experience; similarly, the majority (63%) have experience in the area of their current business. Business types are: private limited company (41%), sole proprietorship (37%), and partnership (15%). In addition, the majority of the firms are service companies (63%), followed by retail (25%) and manufacturing (17%). Company size varies: 32% have annual sales of $100,000 or under, while 22% have annual revenue of more than $1,000,000. Regarding total assets, 43% of respondents have total assets of $100,000 or less, while 20% have total assets of more than $1,000,000.
Regarding working capital management practices (WCMPs), results indicate that almost 70% of respondents said they are responsible for managing their business's working capital. The survey shows that the majority of respondents (65.5%) use their business experience to identify the level of investment in working capital, compared to 22% who seek advice from professionals; the other 10% follow industry practice. The survey also shows that more than half of respondents maintain a good liquidity position for their business by keeping accounts payable below accounts receivable. This study finds that the majority of small businesses in the western area of Victoria have a WCM policy, but only about 8% have a formal policy; the majority (52.7%) have an informal policy, while 39.5% have no policy. Of those who have a policy, 44% described their working capital management policy as a compromise policy, 35% as conservative, and only 6% as aggressive. Overall, the results indicate that small businesses pay little attention to the management of working capital despite its significance to the successful operation of the business. This approach may suffice during favourable economic times; however, during relatively turbulent economic conditions, it could lead to greater financial difficulties, i.e., short-term financial insolvency. Keywords: small business, working capital management, Australia, sufficient, financial insolvency
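One standard way to quantify how much short-term funding day-to-day operations tie up is the cash conversion cycle. This metric is not reported in the survey; the sketch below, with invented figures, is purely illustrative of the working-capital arithmetic involved:

```python
def cash_conversion_cycle(dio: float, dso: float, dpo: float) -> float:
    """Days inventory outstanding + days sales outstanding
    - days payables outstanding = days of operations to finance."""
    return dio + dso - dpo

# Hypothetical small retailer: 40 days in inventory, 30 days to collect
# receivables, 35 days of supplier credit
print(cash_conversion_cycle(40, 30, 35))  # 35 days must be financed
```

A shorter cycle means less working capital is needed, which is why keeping payables in balance with receivables (as over half of respondents do) supports liquidity.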
Procedia PDF Downloads 354
236 Design of a Low-Cost, Portable, Sensor Device for Longitudinal, At-Home Analysis of Gait and Balance
Authors: Claudia Norambuena, Myissa Weiss, Maria Ruiz Maya, Matthew Straley, Elijah Hammond, Benjamin Chesebrough, David Grow
Abstract:
The purpose of this project is to develop a low-cost, portable sensor device that can be used at home for long-term analysis of gait and balance abnormalities. One area of particular concern involves the asymmetries in movement and balance that can accompany certain types of injuries and/or the devices used in repair and rehabilitation (e.g., splints and casts), which can increase the chance of falls and additional injuries. This device can monitor a patient during rehabilitation after injury or operation, increasing the patient's access to healthcare while decreasing the number of visits to the patient's clinician. The sensor device may thereby improve the quality of the patient's care, particularly in rural areas where access to the clinician may be limited, while simultaneously decreasing the overall cost of care. The device consists of nine interconnected accelerometer/gyroscope/compass chips (9-DOF IMU, Adafruit, New York, NY). The sensors attach to, and are used to determine the orientation and acceleration of, the patient's lower abdomen, C7 vertebra (lower neck), L1 vertebra (middle back), the anterior side of each thigh and tibia, and the dorsal side of each foot. In addition, pressure sensors are embedded in shoe inserts, with one sensor (ESS301, Tekscan, Boston, MA) beneath the heel and three sensors (Interlink 402, Interlink Electronics, Westlake Village, CA) beneath the metatarsal bones of each foot. These sensors measure the distribution of the weight applied to each foot as well as stride duration. A small microcontroller (Arduino Mega, Arduino, Ivrea, Italy) collects data from these sensors into a CSV file. MATLAB is then used to analyze the data and output the hip, knee, ankle, and trunk angles projected on the sagittal plane. The open-source program Processing is then used to generate an animation of the patient's gait.
The accuracy of the sensors was validated through comparison to goniometric measurements (±2° error). The sensor device was also shown to have sufficient sensitivity to observe various gait abnormalities. Several patients used the sensor device, and the data collected faithfully represented each patient's movements. Further, the sensors were able to detect gait abnormalities caused by the addition of a small amount of weight (4.5–9.1 kg) to one side of the patient. The user-friendly interface and portability of the sensor device will help to build a bridge between patients and their clinicians with fewer necessary inpatient visits. Keywords: biomedical sensing, gait analysis, outpatient, rehabilitation
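A hedged sketch of how a sagittal-plane joint angle can be derived from two adjacent IMU segments using only the static gravity components (the readings and function names are illustrative assumptions, not the project's actual MATLAB pipeline, which also fuses gyroscope and magnetometer data):

```python
import math

def sagittal_tilt_deg(ax: float, az: float) -> float:
    """Static tilt of an IMU segment in the sagittal plane, from the
    gravity components (in g) measured along the segment's x and z axes."""
    return math.degrees(math.atan2(ax, az))

def joint_angle_deg(proximal: tuple, distal: tuple) -> float:
    """Angle between two adjacent segments (e.g. thigh vs tibia -> knee)."""
    return sagittal_tilt_deg(*proximal) - sagittal_tilt_deg(*distal)

# Hypothetical readings: thigh tilted 30 deg forward, tibia vertical
print(round(joint_angle_deg((0.5, 0.866), (0.0, 1.0)), 1))  # ~30.0 deg flexion
```

Accelerometer-only tilt is valid only when the segment is near-stationary; during walking, the gyroscope data the device also records would be fused in to reject motion acceleration.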
Procedia PDF Downloads 289
235 Investigation of a Single Feedstock Particle during Pyrolysis in Fluidized Bed Reactors via X-Ray Imaging Technique
Authors: Stefano Iannello, Massimiliano Materazzi
Abstract:
Fluidized bed reactor technologies are one of the most valuable pathways for the thermochemical conversion of biogenic fuels due to their good operating flexibility. Nevertheless, there are still issues related to the mixing and separation of heterogeneous phases during operation with highly volatile feedstocks, including biomass and waste. At high temperatures, the volatile content of the feedstock is released in the form of so-called endogenous bubbles, which generally exert a “lift” effect on the particle itself by dragging it up to the bed surface. This phenomenon leads to a high release of volatile matter into the freeboard and limited mass and heat transfer with particles of the bed inventory. The aim of this work is to better understand the behaviour of a single reacting particle in a hot fluidized bed reactor during the devolatilization stage. The analysis was undertaken at different fluidization regimes and temperatures to closely mirror the operating conditions of waste-to-energy processes. Beech wood and polypropylene particles were used to resemble the biomass and plastic fractions of waste materials, respectively. The non-invasive X-ray technique was coupled with particle tracking algorithms to characterize the motion of a single feedstock particle during devolatilization with high resolution. A high-energy X-ray beam passes through the vessel, where absorption occurs depending on the distribution and amount of solids and fluids along the beam path. A high-speed video camera is synchronised to the beam and provides frame-by-frame imaging of the flow patterns of fluids and solids within the fluidized bed at up to 72 fps (frames per second). A comprehensive mathematical model has been developed to validate the experimental results. Beech wood and polypropylene particles showed very different dynamic behaviour during the pyrolysis stage.
When the feedstock is fed from the bottom, the plastic material tends to spend more time within the bed than the biomass. This behaviour can be attributed to the endogenous bubbles, whose drag effect is more pronounced during the devolatilization of biomass, resulting in a shorter residence time of the particle within the bed. At the typical operating temperatures of thermochemical conversion, the synthetic polymer softens and melts, and the bed particles attach to its outer surface, generating a wet plastic-sand agglomerate. Consequently, this additional layer of sand may hinder the rapid evolution of volatiles in the form of endogenous bubbles, and therefore produce a weaker drag effect on the feedstock itself. Information about the mixing and segregation of solid feedstock is of prime importance for the design and development of more efficient industrial-scale operations. Keywords: fluidized bed, pyrolysis, waste feedstock, X-ray
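As a hedged illustration of how frame-by-frame tracking at 72 fps yields particle kinematics (the track data below are invented; the paper's own tracking algorithms are not described in this abstract):

```python
FPS = 72  # camera frame rate used in the experiments

def rise_velocity(y_positions_mm):
    """Mean vertical velocity (mm/s) of a tracked particle, given one
    centroid y-position per consecutive frame."""
    dt = 1.0 / FPS
    steps = len(y_positions_mm) - 1
    return (y_positions_mm[-1] - y_positions_mm[0]) / (steps * dt)

# Hypothetical track: particle rises 10 mm linearly over 36 frames (0.5 s)
track = [i * (10 / 36) for i in range(37)]
print(round(rise_velocity(track), 1))  # 20.0 mm/s
```

The same per-frame centroids would also give residence time directly, by counting frames between the particle entering the bed and reaching the surface.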
Procedia PDF Downloads 172
234 The Structuring of the Brazilian Economics of Innovation and an Institutional Proposal for Legal Management in Global Conformity to Treat Technological Risks
Authors: Daniela Pellin, Wilson Engelmann
Abstract:
Brazil has sought to accelerate its development through technology and innovation in response to global influences, which it has absorbed into internal management practices. To this end, it enacted the Brazilian Innovation Law 13.243/2016. However, the Law overestimates economic aspects, and its application will not consider stakeholders and technological risks, because these receive no legal treatment. Economic exploitation and technological risks must be controlled within the limits of the democratic system, in order to achieve better social development and to help economic agents make decisions that conform with global directions. The research understands this to be a problem, given the social particularities of the country, because the North American Triple Helix theory, consolidated in developed countries, has been imported literally, with negative consequences when applied in developing countries. Because of this symptomatic scenario, adjustments must be made to the management of the Law, alongside social democratic interests, to increase the country's development. To this end, the Government will have to adopt certain conducts, promoting, side by side with universities, civil society and companies: informational transparency; the pursuit of partnerships; the creation of a Comfort Letter document to ensure operations; the joint elaboration of a Manual of Good Practices; accountability; and data dissemination. The universities, likewise, must promote informational transparency, draw up partnership contracts that generate revenue, and develop information. In addition, civil society must analyze the proposals received and offer its opinion on them. Finally, companies have to provide public and transparent information about investments and economic benefits, risks, and the innovations manufactured.
As a general objective, the research intends to demonstrate that efficient deployment of the triple helix will be possible if the innovative decision-making process goes through an institutional logic. As specific objectives, the American influence must undergo some modifications to better suit the economic-legal incentives and potentiate the development of the social system. The hypothesis is that an institutional model for application to the legal system can be elaborated from the emerging characteristics of the country, in such a way that technological risks can be foreseen and global conformity achieved, with attention to the full development of society as proposed by the researchers. The method of approach will be systemic-constructivist, with bibliographical review, data collection and analysis, and the construction of an institutional and democratic model for the management of the Law. Keywords: development, governance of law, institutionalization, triple helix
Procedia PDF Downloads 140
233 Requirement Engineering for Intrusion Detection Systems in Wireless Sensor Networks
Authors: Afnan Al-Romi, Iman Al-Momani
Abstract:
Applying Software Engineering (SE) processes is of vital importance and a key feature in critical, complex, large-scale systems, for example, safety systems, security service systems, and network systems. Inevitably, associated with these systems are risks, such as vulnerabilities and security threats. The probability of those risks increases in unsecured environments, such as wireless networks in general and Wireless Sensor Networks (WSNs) in particular. A WSN is a self-organizing network of sensor nodes connected by wireless links. WSNs consist of hundreds to thousands of low-power, low-cost, multi-function sensor nodes that are small in size and communicate over short ranges. The distribution of sensor nodes in an open, possibly unattended environment, in addition to resource constraints in terms of processing, storage and power, places such networks under stringent limitations, such as lifetime (i.e., period of operation) and security. The importance of WSN applications, found in many military and civilian domains, has drawn the attention of many researchers to WSN security. To address this important issue and overcome one of the main challenges of WSNs, researchers have developed security solution systems in the form of software-based network Intrusion Detection Systems (IDSs). However, it has been observed that these IDSs are neither secure enough nor accurate enough to detect all malicious behaviours of attacks. Thus, the problem is the lack of coverage of all malicious behaviours in proposed IDSs, leading to unpleasant results such as delays in the detection process, low detection accuracy, or, even worse, detection failure, as illustrated in previous studies. Another problem is the energy consumption that IDSs cause in WSNs. In other words, not all requirements are implemented and then traced.
Moreover, not all requirements are identified or satisfied, as some requirements have been compromised. The drawbacks in current IDSs are due to researchers and developers not following structured software development processes when developing IDSs, resulting in inadequate requirement management, process, validation, and verification of requirements quality. Unfortunately, the WSN and SE research communities have been mostly impermeable to each other. Integrating SE and WSNs is a real subject that will expand as technology evolves and spreads into industrial applications. Therefore, this paper studies the importance of Requirement Engineering when developing IDSs. It also studies a set of existing IDSs and illustrates the absence of Requirement Engineering and its effects. Conclusions are then drawn regarding the application of requirement engineering to systems so that they deliver the required functionalities, with respect to operational constraints, within an acceptable level of performance, accuracy and reliability. Keywords: software engineering, requirement engineering, Intrusion Detection System, IDS, Wireless Sensor Networks, WSN
Procedia PDF Downloads 322
232 A Case of Prosthetic Vascular-Graft Infection Due to Mycobacterium fortuitum
Authors: Takaaki Nemoto
Abstract:
Case presentation: A 69-year-old Japanese man presented with a low-grade fever and fatigue that had persisted for one month. The patient had had an aortic dissection of the aortic arch 13 years prior, an abdominal aortic aneurysm seven years prior, and an aortic dissection of the distal aortic arch one year prior, all treated with artificial blood-vessel replacement surgery. Laboratory tests revealed an inflammatory response (CRP 7.61 mg/dL), high serum creatinine (Cr 1.4 mg/dL), and elevated transaminases (AST 47 IU/L, ALT 45 IU/L). The patient was admitted to our hospital on suspicion of prosthetic vascular graft infection. Following further workup of the inflammatory response, enhanced chest computed tomography (CT) and non-enhanced chest diffusion-weighted MRI (DWI) were performed. The patient was diagnosed with a pulmonary fistula and a prosthetic vascular graft infection of the distal aortic arch. After admission, the patient was administered Ceftriaxone and Vancomycin for 10 days, but his fever and inflammatory response did not improve. On day 13 of hospitalization, lung fistula repair surgery and an omental filling operation were performed, and Meropenem and Vancomycin were administered. The fever and inflammatory response continued, and we therefore took repeated blood cultures. M. fortuitum was detected in a blood culture on day 16 of hospitalization. As a result, we changed the treatment regimen to Amikacin (400 mg/day), Meropenem (2 g/day), and Cefmetazole (4 g/day), and the fever and inflammatory response began to decrease gradually. A drug susceptibility test for Mycobacterium fortuitum showed a low MIC for fluoroquinolone antibacterial agents. The clinical course was good, and the patient was discharged after a total of 8 weeks of intravenous drug administration.
At discharge, we changed the treatment regimen to Levofloxacin (500 mg/day) and Clarithromycin (800 mg/day), prescribing these two drugs as long-term suppressive therapy. Discussion: There are few reported cases of prosthetic vascular graft infection caused by mycobacteria, and a standard therapy remains to be established. For prosthetic vascular graft infections, it is ideal to provide surgical and medical treatment in parallel; in this case, however, surgical treatment was difficult, and conservative treatment was therefore chosen. We attempted to increase the treatment success rate for this refractory disease by conducting a susceptibility test for mycobacteria and treating with different combinations of antimicrobial agents, which was ultimately effective. With our treatment approach, a good clinical course was obtained and continues at the present stage. Conclusion: Although prosthetic vascular graft infection due to mycobacteria is a refractory infectious disease, administering appropriate antibiotics based on susceptibility testing, in addition to surgical treatment, may be curative. Keywords: prosthetic vascular graft infection, lung fistula, Mycobacterium fortuitum, conservative treatment
Procedia PDF Downloads 156
231 Assessing Sydney Tar Ponds Remediation and Natural Sediment Recovery in Nova Scotia, Canada
Authors: Tony R. Walker, N. Devin MacAskill, Andrew Thalhiemer
Abstract:
Sydney Harbour, Nova Scotia has long been subject to effluent and atmospheric inputs of metals, polycyclic aromatic hydrocarbons (PAHs), and polychlorinated biphenyls (PCBs) from a large coking operation and steel plant that operated in Sydney for nearly a century until its closure in 1988. Contaminated effluents from the industrial site resulted in the creation of the Sydney Tar Ponds, one of Canada's largest contaminated sites. Since the closure, there have been several attempts to remediate this former industrial site, and finally, in 2004, the governments of Canada and Nova Scotia committed to remediating the site to reduce potential ecological and human health risks to the environment. The Sydney Tar Ponds and Coke Ovens cleanup project has become the most prominent remediation project in Canada today. As an integral part of remediation of the site (which consisted of solidification/stabilization and associated capping of the Tar Ponds), an extensive multiple-media environmental effects program was implemented to assess what effects remediation had on the surrounding environment and, in particular, harbour sediments. Additionally, longer-term natural sediment recovery rates of select contaminants predicted for the harbour sediments were compared to current conditions. During remediation, potential contributions to sediment quality beyond the remedial efforts were evaluated, including a significant harbour dredging project, propeller wash from harbour traffic, storm events, adjacent loading/unloading of coal, and municipal wastewater treatment discharges. Two sediment sampling methodologies, sediment grab and gravity corer, were also compared to evaluate the detection of subtle changes in sediment quality. Results indicated that the overall spatial distribution pattern of historical contaminants remains unchanged, although at much lower concentrations than previously reported, due to natural recovery.
Measurements of sediment indicator parameter concentrations confirmed that natural recovery rates of Sydney Harbour sediments were in broad agreement with predicted concentrations, in spite of ongoing remediation activities. Overall, most measured parameters in sediments showed little temporal variability, even when using different sampling methodologies, during three years of remediation compared to baseline, except for significant increases in total PAH concentrations detected during one year of remediation monitoring. The data confirmed the effectiveness of mitigation measures implemented during construction relative to harbour sediment quality, despite other anthropogenic activities and the dynamic nature of the harbour. Keywords: contaminated sediment, monitoring, recovery, remediation
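Natural recovery of sediment contaminant concentrations is often approximated by first-order decay. A hedged sketch of that comparison logic, where both the initial concentration and the rate constant are hypothetical illustrations, not values from the monitoring program:

```python
import math

def recovery_concentration(c0: float, k: float, t_years: float) -> float:
    """First-order natural recovery: C(t) = C0 * exp(-k * t)."""
    return c0 * math.exp(-k * t_years)

# Hypothetical: 100 mg/kg total PAH with k = 0.05 /yr
print(round(recovery_concentration(100.0, 0.05, 3.0), 1))  # ~86.1 mg/kg after 3 yr
```

Comparing such a predicted C(t) against measured concentrations is one simple way to test whether monitoring results are "in broad agreement" with predicted recovery.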
Procedia PDF Downloads 236
230 Research on the Performance Management of Social Organizations Participating in Home-Based Care
Authors: Qiuhu Shao
Abstract:
The community home-based care service system, which is based on family care, supported by community care, and supplemented by institutional care, is an effective elderly-care system for addressing China's accelerating aging. However, given the fundamental realities of our country, the government is not able to bear unilateral supply of community elderly-care services. Therefore, based on the theory of welfare pluralism, the participation of social organizations in home-based care service centers has become an important part of the diversified supply of services for the elderly. Meanwhile, the home-based care service industry is still in its early stage and its management is relatively rough, which has resulted in a large waste of social resources. Thus, scientific, objective and long-term implementation is needed to guide the performance management of social organizations participating in home-based care services. In order to design such a performance management system, the author conducted research clarifying the state of research on social organizations' participation in home-based care services. Relevant theories, such as welfare pluralism, community care theory, and performance management theory, have been used to demonstrate the feasibility of the data envelopment analysis method in social organization performance research. This paper analyzes the characteristics of the operating mode of home-based care service centers and reviews the national and local documents, standards and norms related to the development of the home-based care industry, particularly those in Nanjing. On this basis, the paper designs a PDCA performance management system for home-based care service centers in Nanjing and clarifies each step of the system in detail.
Subsequently, the research methods for performance evaluation and for performance management and feedback, the two core steps of performance management, were compared and screened in order to establish the overall framework of the performance management system for home-based care service centers. Through extensive research, the paper summarizes and analyzes the characteristics of home-based care service centers. Based on the research results, combined with industry development practice in Nanjing, the paper puts forward a targeted performance evaluation index system for home-based care service centers in Nanjing. Finally, the paper evaluated and classified the performance of 186 home-based care service centers in Nanjing and then designed performance optimization directions and performance improvement paths based on the results. This study constructs an index system for performance evaluation of home-based care services, details the indices down to the implementation level, and thus constructs an evaluation index system that can be applied directly. Meanwhile, the quantitative evaluation of social organizations participating in home-based care services replaces the subjective impressions of previous evaluation practice. Keywords: data envelopment analysis, home-based care, performance management, social organization
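Data envelopment analysis scores each decision-making unit by an output/input ratio with weights chosen optimally per unit via a linear program. As a hedged, simplified stand-in (fixed weights instead of the per-unit optimization real DEA performs, and invented center data, purely to illustrate the ratio idea):

```python
def efficiency_ratio(outputs, inputs, out_w, in_w):
    """Fixed-weight weighted-output / weighted-input ratio -- a simplified
    stand-in for DEA, which would instead optimize the weights per unit."""
    return sum(o * w for o, w in zip(outputs, out_w)) / \
           sum(i * w for i, w in zip(inputs, in_w))

# Hypothetical centers: outputs = (elders served, satisfaction score),
# inputs = (staff count, budget in 10k CNY)
centers = {"A": ((120, 0.9), (10, 30)), "B": ((80, 0.95), (12, 25))}
scores = {name: efficiency_ratio(out, inp, (1.0, 100.0), (5.0, 1.0))
          for name, (out, inp) in centers.items()}
best = max(scores, key=scores.get)
print(best)  # the relatively more efficient center under these weights
```

Real DEA would let each center pick the weights most favorable to itself, subject to no center scoring above 1, which is what makes the resulting efficiency frontier non-arbitrary.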
Procedia PDF Downloads 270
229 Solutions to Reduce CO2 Emissions in Autonomous Robotics
Authors: Antoni Grau, Yolanda Bolea, Alberto Sanfeliu
Abstract:
Mobile robots can be used in many different applications, including mapping, search and rescue, reconnaissance, hazard detection, carpet cleaning, and exploration. However, they are limited by their reliance on traditional energy sources such as electricity and oil, which cannot always provide a convenient energy source in all situations. In an ever more eco-conscious world, solar energy offers the most environmentally clean option of all energy sources. Electricity presents threats of pollution resulting from its production process, and oil poses a huge threat to the environment: not only does it cause harm through the toxic emissions (for instance, CO2) of the combustion process necessary to produce energy, but there is also the ever-present risk of oil spillages and damage to ecosystems. Solar energy can help to mitigate carbon emissions by replacing more carbon-intensive sources of heat and power. The challenge of this work is to propose the design and implementation of electric battery recharge stations. These recharge docks are based on the use of renewable energy, namely solar energy (with photovoltaic panels), with the aim of reducing CO2 emissions. In this paper, a comparative study of CO2 emission production (from the use of different energy sources: natural gas, gas oil, fuel and solar panels) in the charging process of Segway PT batteries is carried out. To carry out the study with solar energy, a photovoltaic panel and a buck-boost DC/DC block were used. Specifically, the STP005S-12/Db solar panel was used for our experiments. This is a 5 Wp photovoltaic (PV) module, configured with 36 monocrystalline cells connected in series. With these elements, a battery recharge station was built to recharge the robot batteries. For the energy storage DC/DC block, a series of ultracapacitors was used.
Due to the variation of the PV panel output with temperature and irradiation, the non-integer behavior of the ultracapacitors, and the non-linearities of the whole system, the authors used a fractional control method so that the solar panels supply the maximum allowed power to recharge the robots in the least time. Greenhouse gas emissions from the production of electricity vary due to regional differences in source fuel. The impact of an energy technology on the climate can be characterised by its carbon emission intensity, a measure of the amount of CO2, or CO2 equivalent, emitted per unit of energy generated. In our work, coal is the most hazardous fossil energy source, producing 53% more gas emissions than natural gas and 30% more than fuel oil. Moreover, it is remarkable that existing fossil fuel technologies produce high carbon emission intensity through the combustion of carbon-rich fuels, whilst renewable technologies such as solar produce little or no emissions during operation, but may incur emissions during manufacture. Solar energy can thus help to mitigate carbon emissions. Keywords: autonomous robots, CO2 emissions, DC/DC buck-boost, solar energy
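The source comparison reduces to charging energy multiplied by carbon emission intensity. A hedged sketch, where the intensity values and battery size are illustrative assumptions (chosen so the ordering roughly mirrors the coal-vs-gas and coal-vs-fuel percentages quoted above), not the paper's measured figures:

```python
# Hypothetical operational emission intensities, kg CO2 per kWh delivered
INTENSITY = {
    "coal": 0.94,
    "fuel_oil": 0.72,
    "natural_gas": 0.61,
    "solar_pv_operation": 0.0,  # manufacture emissions excluded
}

def charge_emissions(energy_kwh: float, source: str) -> float:
    """kg CO2 emitted to deliver a given charging energy from one source."""
    return energy_kwh * INTENSITY[source]

# Recharging a Segway-class battery (~0.4 kWh, hypothetical) from each source
for src in INTENSITY:
    print(src, round(charge_emissions(0.4, src), 3))
```

Under these assumed intensities the coal/natural-gas ratio is about 1.54 and coal/fuel-oil about 1.31, consistent with the relative figures the abstract cites.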
Procedia PDF Downloads 422
228 Determination of the Knowledge Level of Healthcare Professionals Working at the Emergency Services in Turkey about Their Approaches to Common Forensic Cases
Authors: E. Tuğba Topçu, Ebru E. Kazan, Erhan Büken
Abstract:
Emergency nurses are generally the first healthcare professionals to observe patients, communicate with patients’ families or relatives, handle patients’ belongings, and deal with patients’ laboratory samples. They also frequently encounter crime-related incidents, people who engage in violence, and suspicious injuries. Therefore, documentation of the condition in which patients arrive at the hospital and preservation of evidence are important in forensic medical inquiry. The aim of the study was to determine the knowledge level of healthcare professionals working at emergency services regarding their approaches to common forensic cases. The study comprised 404 healthcare professionals (nurses, emergency medicine technicians, health officers) working at the emergency services of 6 state hospitals, 6 training and research hospitals, and 3 university hospitals in Ankara. Data were collected using a questionnaire developed by the researchers on the basis of the literature. The questionnaire comprises two sections. The first section includes 17 questions on demographic information about the healthcare professionals and 4 questions on Turkish laws. The second section includes 43 questions for determining the knowledge level of healthcare professionals working in the emergency department about approaches to frequently encountered forensic cases. For data evaluation, the Mann-Whitney U test, the Kruskal-Wallis H test with Bonferroni correction, and chi-square tests were used. According to the study, 73.4% of healthcare professionals stated that there was no forensic medicine expert in their institution. Two-thirds (66%) of the participants reported a daily average of 7 or more forensic cases presenting to the emergency department, and 52.1% of participants did not evaluate incidents arriving at the emergency department as forensic cases.
Most of the participants stated that the 'duty of preservation of evidence' is the healthcare professional's duty in forensic cases. In conclusion, we determined that the knowledge level of healthcare professionals working in the emergency department about approaches to frequently encountered forensic cases is not at the expected level, as we found that most of them had not received education about forensic nursing. Postgraduate participants, health professionals educated in forensic nursing, staff who had consulted sources on forensic nursing, and staff who evaluated emergency department cases as forensic cases had significantly higher levels of knowledge. Moreover, forensic case diagnosis scores were highest among health officers and university graduates. Healthcare professionals' deficiency in knowledge about forensic cases can cause defects in the operation of the forensic process because of mistakes in collecting and conserving evidence. It is obvious that training on the approach to forensic nursing should be arranged.
Keywords: emergency nurses, forensic case, forensic nursing, level of knowledge
Procedia PDF Downloads 294
227 Improving the Management Systems of the Ownership Risks in Conditions of Transformation of the Russian Economy
Authors: Mikhail V. Khachaturyan
Abstract:
The article analyzes problems of improving ownership risk management systems under the transformation of the Russian economy. Among the main sources of threats that business owners should highlight is the inefficiency of implementing business models and of interaction with hired managers. In this context, it is particularly important to analyze the relationship between business models and ownership risks. The analysis of this problem appears relevant for a number of reasons: firstly, the increased risk appetite of the owner directly affects the business model and the composition of his holdings; secondly, owners with significant stakes in the company shape particular types of owner risks, and these relations have a significant influence on a firm's competitiveness and ultimately determine its survival; and thirdly, an inefficient ownership risk management system is one of the main causes of mass bankruptcies, which significantly affects the stable operation of the economy as a whole. The separation of the processes of possession, disposal, and use in modern organizations is the cause not only of problems in the interaction between the owner and managers in managing the organization as a whole, but also of asymmetric information about the kinds and forms of the main risks. Managers tend to avoid risky projects and inhibit the diversification of the organization's assets, while owners may insist on developing such projects, with the aim not only of creating new value for themselves and consumers, but also of increasing the value of the company as a result of increasing capital. Where ownership and management are separated, evaluation of projects by their risk-yield ratio requires preserving the owner's influence on the process of developing and making management decisions.
It is obvious that without a clearly structured system for the owner's participation in managing the risks of their business, further development is hopeless. In modern conditions of forming a risk management system, owners are compelled to compromise between the desire to increase the organization's ability to produce new value, and consequently its worth, through risky projects, and the need to bear the cost of the lost opportunities of risk diversification. Improving the effectiveness of ownership risk management may also contribute to the revitalization of creditors in pursuing claims against inefficient owners, which ultimately will contribute to efficient models of ownership control that exclude scenarios of insolvency. It is obvious that in modern conditions, the success of a model of ownership risk management and audit is largely determined by the ability and willingness of the owner to find a compromise between the potential for expanding the firm's ability to create new value through risk, and maintaining the current level of value creation at an acceptable level of risk through the use of diversification models.
Keywords: improving, ownership risks, problem, Russia
Procedia PDF Downloads 350
226 Development of the Integrated Quality Management System of Cooked Sausage Products
Authors: Liubov Lutsyshyn, Yaroslava Zhukova
Abstract:
Over the past twenty years, there has been a drastic change in the mode of nutrition in many countries, which has been reflected in the development of new products and production techniques, and has also led to the expansion of sales markets for food products. Studies have shown that solving food safety problems is almost impossible without the active and systematic work of organizations directly involved in the production, storage, and sale of food products, as well as without end-to-end traceability management and exchange of information. The aim of this research is the development of an integrated quality management and safety assurance system based on the principles of HACCP, traceability, and the system approach, with the creation of an algorithm for the identification and monitoring of parameters of the technological process of manufacturing cooked sausage products. A methodology has been developed for implementing the integrated system based on the principles of HACCP, traceability, and the system approach during the manufacture of cooked sausage products, to effectively provide the defined properties of the finished product. As a result of the research, an evaluation technique and performance criteria for the implementation and operation of the HACCP-based quality management and safety assurance system have been developed and substantiated. The paper reveals regularities of the influence of applying HACCP principles, traceability, and the system approach on the quality and safety parameters of the finished product. The study also determines regularities in the identification of critical control points.
The algorithm of functioning of the integrated quality management and safety assurance system has also been described, and key requirements have been defined for the development of software allowing the prediction of finished product properties, timely correction of the technological process, and traceability of manufacturing flows. Based on the obtained results, a typical scheme of the integrated quality management and safety assurance system based on HACCP principles, with elements of end-to-end traceability and the system approach, has been developed for the manufacture of cooked sausage products. As a result of the studies, quantitative criteria for evaluating the performance of the quality management and safety assurance system have been developed. A set of guidance documents for the implementation and evaluation of the integrated HACCP-based system in meat processing plants has also been developed. The research demonstrates the effectiveness of continuous monitoring of the manufacturing process at the identified critical control points, and substantiates the optimal number of critical control points for the manufacture of cooked sausage products. The main results of the research were appraised during 2013-2014 at seven enterprises of the meat processing industry and have been implemented at JSC «Kyiv meat processing plant».
Keywords: cooked sausage products, HACCP, quality management, safety assurance
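The identification and monitoring of critical control points described above can be sketched as a simple monitoring loop. A minimal sketch, assuming hypothetical control points, critical limits, and units (the actual CCPs and limits substantiated in the paper are not given in the abstract):

```python
from dataclasses import dataclass

# Hypothetical critical control points (CCPs) for cooked sausage
# manufacture; the names and critical limits below are illustrative
# assumptions only, not the parameters determined in the study.
@dataclass
class CriticalControlPoint:
    name: str
    low: float    # lower critical limit
    high: float   # upper critical limit
    unit: str

CCPS = [
    CriticalControlPoint("cooking core temperature", 70.0, 75.0, "°C"),
    CriticalControlPoint("chilling temperature", 0.0, 4.0, "°C"),
]

def check_ccp(ccp: CriticalControlPoint, measured: float) -> bool:
    """Return True if the measured value is within the critical limits."""
    return ccp.low <= measured <= ccp.high

def monitor(readings: dict) -> list:
    """Compare a batch of readings against each CCP; list the deviations
    that would trigger a corrective action and a traceability record."""
    deviations = []
    for ccp in CCPS:
        value = readings.get(ccp.name)
        if value is not None and not check_ccp(ccp, value):
            deviations.append(f"{ccp.name}: {value}{ccp.unit} "
                              f"outside [{ccp.low}, {ccp.high}]")
    return deviations
```

In a real HACCP system each deviation would also be logged against the batch identifier so that end-to-end traceability, as described above, links the corrective action to the affected product flow.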
Procedia PDF Downloads 247
225 Using Pump as Turbine in Drinking Water Networks to Monitor and Control Water Processes Remotely
Authors: Sara Bahariderakhshan, Morteza Ahmadifar
Abstract:
Leakage is one of the most important problems that water distribution networks face, and its primary cause is excess pressure. There are many approaches to controlling this excess pressure, among them the use of pressure reducing valves (PRVs) and reducing pipe diameters. On the other hand, pumps use electricity or fossil fuels to supply the needed pressure in distribution networks, but excess pressure arises in some branches due to topology problems and water network variables, so the use of pressure valves is inevitable. Although using PRVs is unavoidable, it effectively wastes the electricity or fuel consumed by the pumps, because PRVs simply dissipate the excess hydraulic pressure. Pumps working in reverse, or pumps as turbines (called PaT in this article), are easily available and also effective means of reducing equipment cost in small hydropower plants. Urban areas of developing countries are growing in area and may face water scarcity in the near future. These cities need wider water networks, which makes it hard to predict, control, and operate the urban water cycle well. Higher energy use and therefore more pollution, slower repair services, more user dissatisfaction, and more leakage are these networks' serious problems. Therefore, more effective systems are needed to monitor and act in these complicated networks than what is used now. In this article a new approach is proposed and evaluated: using PaT to produce enough energy for remote valves and sensors in the water network. These sensors can be used to determine the discharge, pressure, water quality, and other important network characteristics. With the help of remote valves, pipeline discharge can be controlled; so instead of wasting excess hydraulic pressure, which may be destructive in some cases, this article's goal is to obtain the extra pressure from the pipeline and produce clean electricity for the remote instruments.
Furthermore, due to the increasing area of the network, there is unwanted high pressure at some critical points which is not destructive, but lowering the pressure results in a longer lifetime for pipeline networks without user dissatisfaction. The strategy proposed in this article leads to the wide use of PaT for pressure containment and for producing the energy needed by remote valves and sensors, as happens in supervisory control and data acquisition (SCADA) systems, making it easy to monitor and receive data from the urban water cycle and to make any needed changes in pipeline discharge and pressure easily and remotely. This is a clean energy-production project without significant environmental impacts; it can be used in urban drinking water networks without any problem for consumers, leading to a stable and dynamic network with lower leakage and pollution.
Keywords: new energies, pump as turbine, drinking water, distribution network, remote control equipment
Procedia PDF Downloads 463
224 Using Pump as Turbine in Urban Water Networks to Control, Monitor, and Simulate Water Processes Remotely
Authors: Morteza Ahmadifar, Sarah Bahari Derakhshan
Abstract:
Leakage is one of the most important problems that water distribution networks face, and its primary cause is excess pressure. There are many approaches to controlling this excess pressure, among them the use of pressure reducing valves (PRVs) and reducing pipe diameters. On the other hand, pumps use electricity or fossil fuels to supply the needed pressure in distribution networks, but excess pressure arises in some branches due to topology problems and water network variables; therefore the use of pressure valves is inevitable. Although using PRVs is unavoidable, it effectively wastes the electricity or fuel consumed by the pumps, because PRVs simply dissipate the excess hydraulic pressure. Pumps working in reverse, or pumps as turbines (called PAT in this article), are easily available and also effective means of reducing equipment cost in small hydropower plants. Urban areas of developing countries are growing in area and may face water scarcity in the near future. These cities need wider water networks, which makes it hard to predict, control, and operate the urban water cycle well. Higher energy use and therefore more pollution, slower repair services, more user dissatisfaction, and more leakage are these networks' serious problems. Therefore, more effective systems are needed to monitor and act in these complicated networks than what is used now. In this article a new approach is proposed and evaluated: using PAT to produce enough energy for remote valves and sensors in the water network. These sensors can be used to determine the discharge, pressure, water quality, and other important network characteristics. With the help of remote valves, pipeline discharge can be controlled; so instead of wasting excess hydraulic pressure, which may be destructive in some cases, this article's goal is to obtain the extra pressure from the pipeline and produce clean electricity for the remote instruments.
Furthermore, due to the increasing area of the network, there is unwanted high pressure at some critical points which is not destructive, but lowering the pressure results in a longer lifetime for pipeline networks without user dissatisfaction. The strategy proposed in this article leads to the wide use of PAT for pressure containment and for producing the energy needed by remote valves and sensors, as happens in supervisory control and data acquisition (SCADA) systems, making it easy to monitor and receive data from the urban water cycle and to make any needed changes in pipeline discharge and pressure easily and remotely. This is a clean energy-production project without significant environmental impacts; it can be used in urban drinking water networks without any problem for consumers, leading to a stable and dynamic network with lower leakage and pollution.
Keywords: clean energies, pump as turbine, remote control, urban water distribution network
Procedia PDF Downloads 394
223 Commissioning, Test and Characterization of Low-Tar Biomass Gasifier for Rural Applications and Small-Scale Plant
Authors: M. Mashiur Rahman, Ulrik Birk Henriksen, Jesper Ahrenfeldt, Maria Puig Arnavat
Abstract:
Using biomass gasification to make producer gas is one of the promising sustainable energy options available for small-scale plants and rural applications for power and electricity. The tar content of the producer gas is the main problem if it is used directly as a fuel. A low-tar biomass (LTB) gasifier of approximately 30 kW capacity has been developed to solve this. The LTB gasifier is a moving-bed gasifier with internal recirculation of pyrolysis gas, built around the concept of mixing the pyrolysis gases with the gasifying air and burning the mixture in a separate combustion chamber. Five tests were carried out using wood pellets and wood chips separately, with moisture contents of 9-34%. The LTB gasifier offers excellent opportunities for achieving extremely low tar in the producer gas. The gasifier's producer gas had an extremely low average tar content of 21.2 mg/Nm³ and an average lower heating value (LHV) of 4.69 MJ/Nm³; tar content across the tests ranged from 10.6 to 29.8 mg/Nm³. This low tar content makes the producer gas suitable for direct use in an internal combustion engine. Using mass and energy balances, the average gasifier capacity and cold gas efficiency (CGE) were found to be 23.1 kW and 82.7% for wood chips, and 33.1 kW and 60.5% for wood pellets, respectively. Average heat loss in terms of higher heating value (HHV) was 3.2% of thermal input for wood chips and 1% for wood pellets, while the heat loss in terms of enthalpy was 1% of thermal input. Thus, the LTB gasifier performs better than typical gasifiers in terms of heat loss. Equivalence ratios (ER) in the range of 0.29 to 0.41 give better performance in terms of heating value and CGE. The specific gas production yields in this ER range were 2.1-3.2 Nm³/kg. Heating value and CGE change proportionally with the producer gas yield.
The average gas composition obtained for wood chips (H₂ 19%, CO 19%, CO₂ 10%, CH₄ 0.7%, N₂ 51%) is richer in combustible species than typical producer gas. The temperature profile of the LTB gasifier is also relatively low compared to a typical moving-bed gasifier: the average partial oxidation zone temperature observed for wood chips was 970°C, and the use of a separate combustor for the partial oxidation zone substantially lowers the bed temperature to 750°C. During the tests, the engine was started and operated entirely on the producer gas. The engine ran well on the produced gas, and no deposits were observed in the engine afterwards. Part of the producer gas flow was used for engine operation; the corresponding electrical power was 1.5 kW continuously, with a maximum of 2.5 kW observed against a maximum generator capacity of 3 kW. A thermodynamic equilibrium model is in good agreement with the experimental results and correctly predicts the equilibrium bed temperature, gas composition, LHV of the producer gas, and ER, when a heat loss of 4% of the energy input is considered.
Keywords: biomass gasification, low-tar biomass gasifier, tar elimination, engine, deposits, condensate
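The reported LHV and CGE can be cross-checked from the gas composition. A minimal sketch, assuming standard handbook component heating values, plus an assumed wood LHV and gas yield for the CGE example (only the composition and the 4.69 MJ/Nm³ figure come from the abstract):

```python
# Hedged sketch: estimating the lower heating value (LHV) of producer gas
# from its composition, and the cold gas efficiency (CGE). Component LHVs
# are standard handbook values (MJ/Nm^3); the wood LHV and gas yield used
# in the CGE example are illustrative assumptions.

COMPONENT_LHV = {"H2": 10.8, "CO": 12.6, "CH4": 35.8}  # MJ/Nm^3

def gas_lhv(vol_fractions: dict) -> float:
    """LHV of the gas mixture (MJ/Nm^3) from volume fractions (0-1);
    non-combustible species (CO2, N2) contribute nothing."""
    return sum(COMPONENT_LHV.get(sp, 0.0) * x for sp, x in vol_fractions.items())

def cold_gas_efficiency(gas_yield_nm3_per_kg: float, lhv_gas: float,
                        lhv_fuel_mj_per_kg: float) -> float:
    """CGE = chemical energy in the cold gas / chemical energy in the feedstock."""
    return gas_yield_nm3_per_kg * lhv_gas / lhv_fuel_mj_per_kg

if __name__ == "__main__":
    # Average wood-chip composition reported in the abstract
    composition = {"H2": 0.19, "CO": 0.19, "CO2": 0.10, "CH4": 0.007, "N2": 0.51}
    lhv = gas_lhv(composition)                  # ~4.7 MJ/Nm^3
    cge = cold_gas_efficiency(3.0, lhv, 17.0)   # assumed yield and wood LHV
    print(f"LHV = {lhv:.2f} MJ/Nm^3, CGE = {cge:.1%}")
```

With the reported composition this gives an LHV of about 4.70 MJ/Nm³, in close agreement with the 4.69 MJ/Nm³ measured; with an assumed yield of 3.0 Nm³/kg and wood LHV of 17 MJ/kg, the CGE comes out near the 82.7% reported for wood chips.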
Procedia PDF Downloads 114
222 Bio-Medical Equipment Technicians: Crucial Workforce to Improve Quality of Health Services in Rural Remote Hospitals in Nepal
Authors: C. M. Sapkota, B. P. Sapkota
Abstract:
Background: Continuous developments in science and technology are increasing the availability of thousands of medical devices, all of which should be of good quality and used appropriately to address global health challenges. It is obvious that biomedical devices are becoming ever more indispensable in health service delivery, and among the key workforce responsible for their design, development, regulation, evaluation, and training in their use, the biomedical equipment technician (BMET) is crucial. As a pivotal member of the health workforce, biomedical technicians are an essential component of a quality health service delivery mechanism supporting the attainment of the Sustainable Development Goals. Methods: The study was based on a cross-sectional descriptive design. Indicators measuring the quality of health services were assessed in Mechi Zonal Hospital (MZH) and Sagarmatha Zonal Hospital (SZH). Indicators were calculated from the data on hospital utilization and performance for 2018 available in the medical record sections of both hospitals. MZH employed a BMET during 2018, but SZH had no BMET in 2018. Focus group discussions with health workers in both hospitals were conducted to validate the hospital records. Client exit interviews were conducted to assess the level of client satisfaction in both hospitals. Results: In MZH there was round-the-clock availability and utilization of radio-diagnostic and laboratory equipment, and the operation theater was functional throughout the year. The bed occupancy rate in MZH was 97%, but in SZH it was only 63%. In SZH, the operation theater was functional on only 54% of the days in 2018, the CT scan machine was newly installed but not functional, and the computerized X-ray was functional on only 72% of the days. The level of client satisfaction was 87% in MZH but just 43% in SZH. MZH performed all 256 of its caesarean sections, but SZH performed only 36% of its 210 caesarean sections in 2018.
In the annual performance ranking of government hospitals for 2018, MZH was placed 1st while SZH was placed 19th out of 32 referral hospitals nationwide. Conclusion: Biomedical equipment technicians are crucial members of the human resources for health team, with a pivotal role. Trained and qualified BMET professionals are required within health-care systems in order to design, evaluate, regulate, acquire, maintain, manage, and train on safe medical technologies. They apply knowledge of engineering and technology to health-care systems to ensure the availability, affordability, accessibility, acceptability, and utilization of safer, higher-quality, effective, appropriate, and socially acceptable biomedical technology for preventive, promotive, curative, rehabilitative, and palliative care across all levels of health service delivery.
Keywords: biomedical equipment technicians, BMET, human resources for health, HRH, quality health service, rural hospitals
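Two of the utilization indicators compared above follow standard definitions. A minimal sketch, assuming hypothetical inpatient-day, bed, and functional-day counts (the abstract reports only the resulting percentages, not the raw figures):

```python
# Hedged sketch of two standard hospital utilization indicators used in
# the comparison above. The inpatient-day and bed figures below are
# hypothetical illustrations, not data from the study.

def bed_occupancy_rate(inpatient_days: int, beds: int, period_days: int = 365) -> float:
    """Occupied bed-days as a percentage of available bed-days."""
    return 100.0 * inpatient_days / (beds * period_days)

def service_functionality(days_functional: int, period_days: int = 365) -> float:
    """Share of days in the period on which a service was available (%)."""
    return 100.0 * days_functional / period_days

if __name__ == "__main__":
    # Hypothetical example: a 100-bed hospital with 35,405 inpatient days
    print(f"BOR = {bed_occupancy_rate(35405, 100):.0f}%")        # ~97%, as for MZH
    print(f"OT functional = {service_functionality(197):.0f}%")  # ~54%, as for SZH
```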
Procedia PDF Downloads 126
221 Leveraging Power BI for Advanced Geotechnical Data Analysis and Visualization in Mining Projects
Authors: Elaheh Talebi, Fariba Yavari, Lucy Philip, Lesley Town
Abstract:
The mining industry generates vast amounts of data, necessitating robust data management systems and advanced analytics tools to achieve better decision-making in developing mining production and maintaining safety. This paper highlights the advantages of Power BI, a powerful business intelligence tool, over traditional Excel-based approaches for effectively managing and harnessing mining data. Power BI enables professionals to connect and integrate multiple data sources, ensuring real-time access to up-to-date information. Its interactive visualizations and dashboards offer an intuitive interface for exploring and analyzing geotechnical data. Advanced analytics is a collection of data analysis techniques for improving decision-making; leveraging some of the most complex techniques in data science, it is used for everything from detecting data errors and ensuring data accuracy to directing the development of future project phases. However, while Power BI is a robust tool, specific visualizations required by geotechnical engineers may exceed its built-in capabilities. This paper therefore studies the use of Python or R programming within the Power BI dashboard to enable advanced analytics, additional functionality, and customized visualizations. The dashboard provides comprehensive tools for analyzing and visualizing key geotechnical data, including spatial representation on maps, field and laboratory test results, and subsurface rock and soil characteristics. Advanced visualizations such as borehole logs and stereonets were implemented using Python programming within the Power BI dashboard, enhancing the understanding and communication of geotechnical information. Moreover, the dashboard's flexibility allows for the incorporation of additional data and visualizations based on the project scope and available data, such as pit design, rockfall analyses, rock mass characterization, and drone data.
This further enhances the dashboard's usefulness in future projects, including the operation, development, closure, and rehabilitation phases, and helps minimize the need for multiple software programs within a project. This geotechnical dashboard in Power BI serves as a user-friendly solution for analyzing, visualizing, and communicating both new and historical geotechnical data, aiding informed decision-making and efficient project management throughout the various project stages. Its ability to generate dynamic reports and share them with clients in a collaborative manner further enhances decision-making and facilitates effective communication within geotechnical projects in the mining industry.
Keywords: geotechnical data analysis, Power BI, visualization, decision-making, mining industry
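Embedding Python in a Power BI visual works by Power BI passing the fields placed on the visual to the script as a pandas DataFrame named `dataset`. A minimal sketch of the kind of per-borehole aggregation that could feed a borehole-log or bar-chart visual; the column names and values are illustrative assumptions, and a stand-in DataFrame replaces the `dataset` Power BI would supply so the sketch is self-contained:

```python
# Hedged sketch of a Power BI Python visual for geotechnical data. In a
# real Power BI Python visual, the selected fields arrive as a pandas
# DataFrame named `dataset`; here we construct a stand-in with
# illustrative (hypothetical) borehole data.

import pandas as pd

# Stand-in for the `dataset` DataFrame Power BI would supply.
dataset = pd.DataFrame({
    "borehole_id": ["BH01", "BH01", "BH02", "BH02"],
    "depth_m":     [5.0, 10.0, 4.0, 12.0],
    "rqd_percent": [85.0, 60.0, 92.0, 74.0],  # rock quality designation
})

# Summarise rock quality per borehole, as one might feed to a chart.
summary = (dataset.groupby("borehole_id")
                  .agg(max_depth_m=("depth_m", "max"),
                       mean_rqd=("rqd_percent", "mean"))
                  .reset_index())
print(summary)

# Inside Power BI, a matplotlib figure built from `summary` would then be
# rendered as the visual, e.g. a bar chart of mean RQD per borehole.
```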
Procedia PDF Downloads 92
220 Adapting Cyber Physical Production Systems to Small and Mid-Size Manufacturing Companies
Authors: Yohannes Haile, Dipo Onipede, Jr., Omar Ashour
Abstract:
The main thrust of our research is to determine the Industry 4.0 readiness of small and mid-size manufacturing companies in our region and to assist them in implementing Cyber Physical Production System (CPPS) capabilities. Adopting CPPS capabilities will help organizations realize improved quality, order delivery, and throughput, new value creation, and reduced idle time of machines and work centers in their manufacturing operations. The key metrics for the assessment include the level of intelligence, internal and external connections, responsiveness to internal and external environmental changes, capability to customize products with reference to cost, the level of additive manufacturing, automation, and robotics integration, and the capability to manufacture hybrid products in the near term, where near term is defined as 0 to 18 months. In our initial evaluation of several manufacturing firms, which are profitable and successful in what they do, we found a low level of the Physical-Digital-Physical (PDP) loop in their manufacturing operations, whereas 100% of the firms included in this research have specialized manufacturing core competencies that differentiate them from their competitors. The level of automation and robotics integration is in the low to medium range, where low is defined as less than 30%, and medium as 30 to 70%, of manufacturing operations including automation and robotics; however, there is a significant drive to include these capabilities at the present time. The intelligence and connectivity of manufacturing systems are observed to be low, with significant variance in tying manufacturing operations management to Enterprise Resource Planning (ERP). Furthermore, the integration of additive manufacturing in general, and 3D printing in particular, is observed to be low, but with significant upside for integration into manufacturing operations in the near future.
To hasten the readiness of local and regional manufacturing companies for Industry 4.0 and their transition towards CPPS capabilities, our working group (ADMAR Working Group), in partnership with our university, has been engaging with local and regional manufacturing companies. The goal is to increase awareness, share know-how and capabilities, initiate joint projects, and investigate the possibility of establishing the Center for Cyber Physical Production Systems Innovation (C2P2SI). The center is intended to support local and regional university-industry research on implementing intelligent factories, enhance new value creation through disruptive innovations, and foster the development of hybrid and data-enhanced products and the creation of digital manufacturing enterprises. All these efforts will enhance local and regional economic development and educate students with well-developed knowledge and applications of cyber physical manufacturing systems and Industry 4.0.
Keywords: automation, cyber-physical production system, digital manufacturing enterprises, disruptive innovation, new value creation, physical-digital-physical loop
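The low/medium banding used in the assessment above can be sketched as a tiny classification rule. A minimal sketch of that rubric (low: under 30%; medium: 30 to 70%; anything above is taken as high, an assumption since the abstract does not name the upper band), with hypothetical firm data:

```python
# Hedged sketch of the banding described above: the share of manufacturing
# operations that include automation and robotics is classed as low (<30%)
# or medium (30-70%); values above 70% are labelled "high" here as an
# assumption. The firm data below are hypothetical.

def integration_band(share_percent: float) -> str:
    """Band a firm's automation/robotics integration share per the rubric."""
    if not 0.0 <= share_percent <= 100.0:
        raise ValueError("share must be a percentage in [0, 100]")
    if share_percent < 30.0:
        return "low"
    if share_percent <= 70.0:
        return "medium"
    return "high"

if __name__ == "__main__":
    # Hypothetical survey results for three firms
    for firm, share in {"Firm A": 12.0, "Firm B": 45.0, "Firm C": 80.0}.items():
        print(f"{firm}: {share:.0f}% -> {integration_band(share)}")
```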
Procedia PDF Downloads 140
219 A Versatile Data Processing Package for Ground-Based Synthetic Aperture Radar Deformation Monitoring
Authors: Zheng Wang, Zhenhong Li, Jon Mills
Abstract:
Ground-based synthetic aperture radar (GBSAR) represents a powerful remote sensing tool for deformation monitoring of various geohazards, e.g. landslides, mudflows, avalanches, infrastructure failures, and the subsidence of residential areas. Unlike spaceborne SAR with a fixed revisit period, GBSAR data can be acquired with an adjustable temporal resolution through either continuous or discontinuous operation. However, challenges arise in processing high-temporal-resolution continuous GBSAR data, including the extreme computational random-access memory (RAM) cost, the delay in producing displacement maps, and the loss of temporal evolution. Moreover, repositioning errors between discontinuous campaigns impede the accurate measurement of surface displacements. Therefore, a versatile package with two complete chains is developed in this study to process both continuous and discontinuous GBSAR data and address the aforementioned issues. The first chain is based on a small-baseline-subset concept and processes continuous GBSAR images unit by unit, where the images within a window form a basic unit. With this strategy, the RAM requirement is reduced to a single unit of images, and the chain can theoretically process an infinite number of images. The evolution of surface displacements can be detected, as the chain keeps temporarily coherent pixels that are present only in certain units rather than throughout the whole observation period. The chain supports real-time processing of continuous data, so the delay in creating displacement maps can be shortened without waiting for the entire dataset. The other chain aims to measure deformation between discontinuous campaigns. Temporal averaging is carried out on the stack of images from a single campaign in order to improve the signal-to-noise ratio of discontinuous data and minimise the loss of coherence.
The temporally averaged images are then processed by a dedicated interferometry procedure that integrates advanced interferometric SAR algorithms such as robust coherence estimation, non-local filtering, and the selection of partially coherent pixels. Experiments are conducted using both synthetic and real-world GBSAR data. Displacement time series at the sub-millimetre level are achieved in several applications (e.g. a coastal cliff, a sand dune, a bridge, and a residential area), indicating the feasibility of the developed GBSAR data processing package for deformation monitoring across a wide range of scientific and practical applications.
Keywords: ground-based synthetic aperture radar, interferometry, small baseline subset algorithm, deformation monitoring
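Two of the building blocks mentioned above, temporal averaging of a campaign's image stack and coherence estimation, have compact generic formulations. A minimal sketch of those generic operations (not the package's actual implementation): the stack is averaged along the time axis to raise SNR, and the sample coherence magnitude between two complex images is |Σ s₁s₂*| / √(Σ|s₁|² Σ|s₂|²).

```python
# Hedged sketch of temporal averaging of a complex SAR image stack and
# sample coherence estimation between two complex images. This is the
# textbook formulation, not the developed package's implementation.

import numpy as np

def temporal_average(stack: np.ndarray) -> np.ndarray:
    """Average a (n_images, rows, cols) complex stack to raise SNR."""
    return stack.mean(axis=0)

def coherence(s1: np.ndarray, s2: np.ndarray) -> float:
    """Sample interferometric coherence magnitude between two complex images."""
    num = np.abs(np.sum(s1 * np.conj(s2)))
    den = np.sqrt(np.sum(np.abs(s1) ** 2) * np.sum(np.abs(s2) ** 2))
    return float(num / den)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    stack = rng.standard_normal((10, 4, 4)) + 1j * rng.standard_normal((10, 4, 4))
    avg = temporal_average(stack)
    # An image is fully coherent with itself, so this prints a value of ~1.
    print("coherence of averaged image with itself:", coherence(avg, avg))
```

Averaging within a single campaign is what suppresses noise before the two campaign averages are interfered, which is why repositioning errors between campaigns, rather than per-image noise, remain the dominant error source to correct.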
Procedia PDF Downloads 161