Search results for: iterative methods
12993 Analysis of the Influence of Frequency Variation on the Characterization of Nano-Particles in the Pretreatment of Bioethanol from Oil Palm Stem (Elaeis guineensis Jacq.) Using the Sonication Method with Alkaline Peroxide Activators to Improve Cellulose Content
Authors: Luristya Nur Mahfut, Nada Mawarda Rilek, Ameiga Cautsarina Putri, Mujaroh Khotimah
Abstract:
The use of bioethanol from lignocellulosic material has begun to be developed. In Indonesia, the most abundant lignocellulosic material is oil palm stem, which contains 32.22% cellulose; Indonesia produces approximately 300,375,000 tons of oil palm stem each year. To produce bioethanol from lignocellulosic material, the first process is pretreatment. Until now, however, lignocellulosic pretreatment methods have been ineffective. This is related to suboptimal particle size and pretreatment method, which lead to insufficient breakdown of lignin; consequently, the increase in cellulose content is not significant, resulting in a low bioethanol yield. To address this problem, this research applied an ultrasonication pretreatment method in order to produce pulp with nano-sized particles and thereby obtain a higher ethanol yield from oil palm stem. The research used a randomized block design (RAK) with one factor, the ultrasonic wave frequency, at three levels (30 kHz, 40 kHz, and 50 kHz), with the NaOH concentration held constant. The analysis conducted in this research covered the influence of the wave frequency on the increase in cellulose content and on the reduction of particle size to the nanometer scale during pretreatment, using particle size analysis (PSA) and the Chesson method. The results, data, and best treatment were analyzed using ANOVA and the least significant difference (BNT) test at a 5% significance level. The best treatment was obtained with combination X3 (sonication frequency of 50 kHz), giving lignin (19.6%), cellulose (59.49%), and hemicellulose (11.8%) contents, with a particle size of 385.2 nm (18.8%). Keywords: bioethanol, pretreatment, oil palm stem, cellulose
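The analysis described above (one factor, three frequency levels, ANOVA followed by a BNT/LSD test) can be illustrated with a minimal one-way ANOVA sketch. This is not the authors' code, and the cellulose values below are hypothetical stand-ins for the three sonication frequencies:

```python
def one_way_anova_f(groups):
    """One-way ANOVA F statistic for a list of sample groups."""
    all_vals = [x for g in groups for x in g]
    grand_mean = sum(all_vals) / len(all_vals)
    # Between-group sum of squares (one factor: sonication frequency)
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2 for g in groups)
    # Within-group (error) sum of squares
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    df_between = len(groups) - 1
    df_within = len(all_vals) - len(groups)
    return (ss_between / df_between) / (ss_within / df_within)

# Hypothetical cellulose contents (%) for the three frequency levels
cellulose = {30: [52.1, 53.0, 51.6], 40: [55.4, 56.2, 54.9], 50: [59.1, 59.8, 58.6]}
f_stat = one_way_anova_f(list(cellulose.values()))
```

The F statistic would then be compared against the critical value at the 5% significance level, with the BNT (least significant difference) test used to separate the frequency levels.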
Procedia PDF Downloads 327
12992 An Audit on Tracheal Tube Cuff Pressure Check and Monitoring during Current Practice
Authors: Mahmoud Hassanin, Roshan Thawale, Kiran Yelamati
Abstract:
Background: During current practice, regular intraoperative monitoring of endotracheal cuff pressure is not routine, despite the significant number of clinicians interested in checking it after intubation to ensure a good seal and adequate ventilation. Aims and objectives: To highlight that current practice has no guidance on regular intra-operative monitoring of endotracheal tube cuff pressure, which could improve patient safety and the post-operative experience. Methods: A local departmental survey was conducted on anaesthetists' current practice, measuring their knowledge and problem awareness, with the aim of improving patient satisfaction and changing the current approach. Results: The participants were not using the manometer, despite their interest in ensuring that the cuff pressure was high enough to provide a proper seal. More than 50% of the participants did not know the ideal range of endotracheal tube cuff pressure, and 32% did not know whether a manometer was available in the theatre. Despite these findings, 100% of the participants used various methods to ensure adequate cuff pressure. The collected data revealed that at least 26% of the participants confirmed that they had seen patients with post-intubation complications. Conclusion: There is an increasing importance placed on quality assurance. Clinical practice varies widely among practitioners, with the only consistency being the omission of cuff manometers during routine intra-operative management, despite their proven benefit and efficacy. Anaesthetists and ODPs should be encouraged to use cuff pressure manometers. The availability of portable pressure manometers can help to maintain safe cuff pressures in patients requiring endotracheal intubation. Keywords: endotracheal cuff pressure, intra-operative monitoring, current practice, patient satisfaction
Procedia PDF Downloads 106
12991 Time-Domain Analysis Approaches of Soil-Structure Interaction: A Comparative Study
Authors: Abdelrahman Taha, Niloofar Malekghaini, Hamed Ebrahimian, Ramin Motamed
Abstract:
This paper compares the substructure and direct methods for soil-structure interaction (SSI) analysis in the time domain. In the substructure SSI method, the soil domain is replaced by a set of springs and dashpots, also referred to as the impedance function, derived through the study of the behavior of a massless rigid foundation. The impedance function is inherently frequency dependent, i.e., it varies as a function of the frequency content of the structural response. To use the frequency-dependent impedance function for time-domain SSI analysis, the impedance function is approximated at the fundamental frequency of the structure-soil system. To explore the potential limitations of the substructure modeling process, a two-dimensional reinforced concrete frame structure is modeled using substructure and direct methods in this study. The results show discrepancies between the simulated responses of the substructure and the direct approaches. To isolate the effects of higher modal responses, the same study is repeated using a harmonic input motion, in which a similar discrepancy is still observed between the substructure and direct approaches. It is concluded that the main source of discrepancy between the substructure and direct SSI approaches is likely attributed to the way the impedance functions are calculated, i.e., assuming a massless rigid foundation without considering the presence of the superstructure. Hence, a refined impedance function, considering the presence of the superstructure, shall be developed. This refined impedance function is expected to significantly improve the simulation accuracy of the substructure approach for structural systems whose behavior is dominated by the fundamental mode response.Keywords: direct approach, impedance function, soil-structure interaction, substructure approach
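A minimal sketch of the approximation step described above: a frequency-dependent impedance K(ω) = k(ω) + iωc(ω) is evaluated once at the fundamental frequency of the structure-soil system to obtain the constant spring and dashpot values used in the time domain. The linear frequency dependence and all numerical values below are hypothetical illustrations, not values from the study:

```python
import math

def impedance(omega, k_static, c_dashpot, alpha=0.0):
    """Frequency-dependent impedance K(w) = k(w) + i*w*c(w).
    The linear frequency dependence k(w) = k_static * (1 - alpha*w)
    is a hypothetical placeholder, not a fitted soil model."""
    return complex(k_static * (1.0 - alpha * omega), omega * c_dashpot)

# Hypothetical fundamental frequency of the structure-soil system (~2.5 Hz)
omega_1 = 2.0 * math.pi * 2.5
# Evaluate once at omega_1 to get the constant spring/dashpot pair
K1 = impedance(omega_1, k_static=5e8, c_dashpot=2e6, alpha=0.004)
k_equiv = K1.real            # equivalent spring constant (N/m)
c_equiv = K1.imag / omega_1  # equivalent dashpot coefficient (N*s/m)
```

The resulting `k_equiv` and `c_equiv` are the constants that replace the soil domain in the time-domain substructure model; the paper's point is that deriving K(ω) from a massless rigid foundation, without the superstructure, is a likely source of discrepancy.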
Procedia PDF Downloads 117
12990 Challenges Encountered by Small Business Owners in Building Their Social Media Marketing Competency
Authors: Nilay Balkan
Abstract:
Introductory statement: The purpose of this study is to understand how small business owners develop social media marketing competency, the challenges they encounter in doing so, and to establish the social media training needs of such businesses. These challenges impact the extent to which small business owners build effective social media knowledge and, in turn, impact their ability to implement effective social media marketing in their business practices. This means small businesses are not fully able to benefit from social media, for example through improved customer relationship management or an enhanced brand image, which would support the overall operations of these businesses. This research is part one of a two-phase study. The first phase aims to establish the challenges small business owners face in building social media marketing competency and their specific training needs. Phase two will then focus in more depth on the barriers and challenges emerging from phase one. Summary of Methodology: Interviews were conducted with ten small business owners from various sectors, including fitness, tourism, food, and drink. These businesses were located in the central belt of Scotland, the area with the highest population and business density in Scotland. The interviews were in-depth and semi-structured, designed to be investigative and to understand the phenomena from the lived experience of the small business owners. Purposive sampling was used: small business owners fulfilling certain criteria were approached to take part in the interviews. Key findings: The study found four ways in which small business owners develop their social media competency (informal methods, formal methods, learning through a network, and experimenting) and the various challenges they face with these methods.
Further, the study established four barriers impacting the development of social media marketing competency among the interviewed small business owners. In doing so, preliminary support needs also emerged. Concluding statement: The contribution of this study is to understand the challenges small business owners face when learning how to use social media for business purposes and to identify their training needs. This understanding can help in the development of specific and tailored support. In addition, specific and tailored training can help small businesses build competency. This supports small businesses in progressing to the next stage of their development, which could be furthering their digital transformation or growing their business. The insights from this study can be used to support business competitiveness and to help small businesses become more resilient. Moreover, small businesses and entrepreneurs share some similar characteristics, such as limited resources and conflicting priorities, and the findings of this study may also support entrepreneurs in their social media marketing strategies. Keywords: small business, marketing theory and applications, social media marketing, strategic management, digital competency, digitalisation, marketing research and strategy, entrepreneurship
Procedia PDF Downloads 91
12989 Baseline Study of Water Quality in Indonesia Using Dynamic Methods and Technologies
Authors: R. L. P. de Lima, F. C. B. Boogaard, D. Setyo Rini, P. Arisandi, R. E. de Graaf-Van Dinther
Abstract:
Water quality in many Asian countries is very poor due to inefficient solid waste management, high population growth, and the lack of sewage and purification systems for households and industry. A consortium of Indonesian and Dutch organizations has begun a large-scale international research project to evaluate and propose solutions to the surface water pollution challenges in the Brantas Basin, Indonesia (East Java: Malang / Surabaya). The first phase of the project consisted of a baseline study to assess the current status of surface water bodies and to determine the ambitions and strategies among local stakeholders. This study was conducted with strongly participatory, collaborative, and knowledge-sharing objectives. Several methods, such as mobile sensors (attached to boats or underwater drones), test strips and mobile apps, bio-monitoring (sediments), ecology scans using underwater cameras, and continuous / static measurements, were applied in different locations in the regions of the basin, at multiple locations within the water systems (e.g. springs, upstream / downstream of industry and urban areas, the mouth of the Surabaya River, groundwater). Results gave an indication of (reference) values for basic water quality parameters such as turbidity, electrical conductivity, dissolved oxygen, and nutrients (ammonium / nitrate). An important outcome was that collecting random samples may not be representative of a body of water, given that water quality parameters can vary widely in space (x, y, and depth) and time (day / night and seasonal). Innovative / dynamic monitoring methods (e.g. underwater drones, sensors on boats) can contribute to a better understanding of the quality of the living environment (water, ecology, sediment) and the factors that affect it. The field work activities, in particular the underwater drones, showed potential as awareness actions, as they attracted interest from locals and the local press. This baseline study involved cooperation between local managing organizations and Dutch partners, whose willingness to work together is important to ensure participatory actions and social awareness regarding the process of adapting and strengthening regulations, and for the construction of facilities such as sewage systems. Keywords: water quality monitoring, pollution, underwater drones, social awareness
Procedia PDF Downloads 192
12988 Political Corruption in an Authoritarian Regime: A Story from the Kingdom of Morocco
Authors: Noureddine Radouai
Abstract:
Corruption is an endemic phenomenon in many countries around the globe. Morocco, as an authoritarian regime, relies on corruption for the survival of the monarchy. I analyze the structure of the Makhzen and the methods it uses to exchange corruption for political loyalty. The abuse of power in Morocco is sponsored by the monarchy itself, as it is its way of maintaining its importance within the regime. Keywords: corruption, clientelism, authoritarian regime, Morocco
Procedia PDF Downloads 140
12987 Vertebral Pain Features in Women of Different Age Depending on Body Mass Index
Authors: Vladyslav Povoroznyuk, Tetiana Orlуk, Nataliia Dzerovych
Abstract:
Introduction: Back pain is an extremely common health care problem worldwide. Many studies show a link between obesity and the risk of lower back pain. The aim is to study the correlation and peculiarities of vertebral pain in women of different ages depending on their anthropometric indicators. Materials: 1886 women aged 25-89 years were examined. The patients were divided into groups according to age (25-44, 45-59, 60-74, 75-89 years old) and body mass index (BMI: up to 18.4 kg/m2 (underweight), 18.5-24.9 kg/m2 (normal), 25-30 kg/m2 (overweight), and more than 30.1 kg/m2 (obese)). Methods: The presence and intensity of pain were evaluated in the thoracic and lumbar spine using a visual analogue scale (VAS). BMI was calculated by the standard formula based on body weight and height measurements. Statistical analysis was performed using parametric and nonparametric methods. Changes were considered significant at p < 0.05. Results: The intensity of pain in the thoracic spine was significantly higher in the underweight women in the age groups of 25-44 years (p = 0.04) and 60-74 years (p = 0.005). The intensity of pain in the lumbar spine was significantly higher in the women of 45-59 years (p = 0.001) and 60-74 years (p = 0.0003) with obesity. In the women of 45-74 years, BMI was significantly positively correlated with the level of pain in the lumbar spine. Obesity significantly increases the relative risk of pain in the lumbar region (RR = 1.07 (95% CI: 1.03-1.12; p = 0.002)), while underweight significantly increases the risk of pain in the thoracic region (RR = 1.21 (95% CI: 1.00-1.46; p = 0.05)). Conclusion: In women, vertebral pain syndrome may be related to anthropometric characteristics (e.g., BMI). Underweight may indirectly influence the development of pain in the thoracic spine and increases the risk of pain in this region by 1.21 times. Obesity influences the development of pain in the lumbar spine, increasing the risk by 1.07 times. Keywords: body mass index, age, pain in thoracic and lumbar spine, women
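A small sketch of the quantities used above: the standard BMI formula, the study's BMI groups, and a relative-risk calculation. The cohort counts in the example are hypothetical; only the formulas follow standard definitions:

```python
def bmi(weight_kg, height_m):
    """Body mass index by the standard formula, kg/m^2."""
    return weight_kg / height_m ** 2

def bmi_group(b):
    """BMI groups as used in the study (boundaries slightly regularized)."""
    if b <= 18.4:
        return "underweight"
    if b <= 24.9:
        return "normal"
    if b <= 30.0:
        return "overweight"
    return "obese"

def relative_risk(cases_exposed, n_exposed, cases_control, n_control):
    """Relative risk: incidence among the exposed over incidence among controls."""
    return (cases_exposed / n_exposed) / (cases_control / n_control)

group = bmi_group(bmi(85.0, 1.62))  # about 32.4 kg/m2, i.e. "obese"
```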
Procedia PDF Downloads 365
12986 Empowering Transformers for Evidence-Based Medicine
Authors: Jinan Fiaidhi, Hashmath Shaik
Abstract:
Breaking the barrier to practicing evidence-based medicine relies on effective methods for rapidly identifying relevant evidence from the body of biomedical literature. An important challenge confronted by medical practitioners is the long time needed to browse, filter, summarize, and compile information from different medical resources. Deep learning can help solve this through automatic question answering (Q&A) and transformers. However, Q&A and transformer technologies are not trained to answer clinical queries that can be used for evidence-based practice, nor can they respond to structured clinical questioning protocols like PICO (Patient/Problem, Intervention, Comparison and Outcome). This article describes the use of deep learning techniques for Q&A, based on transformer models like BERT and GPT, to answer PICO clinical questions that can be used for evidence-based practice, extracted from sound medical research resources like PubMed. We report acceptable clinical answers that are supported by findings from PubMed. Our transformer methods reach an acceptable state-of-the-art performance based on a two-stage bootstrapping process involving filtering relevant articles followed by identifying articles that support the requested outcome expressed by the PICO question. Moreover, we also report experiments that empower our bootstrapping techniques with patched attention to the most important keywords in the clinical case and the PICO questions. Our bootstrapping patched with attention shows the relevance of the evidence collected based on entropy metrics. Keywords: automatic question answering, PICO questions, evidence-based medicine, generative models, LLM transformers
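The abstract mentions assessing the relevance of collected evidence with entropy metrics over key terms. As an illustration only (the authors' actual scoring scheme is not specified), one could score retrieved abstracts by the Shannon entropy of their PICO-keyword coverage; the keywords and abstracts below are invented:

```python
import math
from collections import Counter

def keyword_entropy(text, keywords):
    """Shannon entropy of the PICO-keyword distribution in a text.
    Higher entropy means the text covers the keywords more evenly."""
    counts = Counter(w for w in text.lower().split() if w in keywords)
    total = sum(counts.values())
    if total == 0:
        return 0.0
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# Invented PICO terms and candidate abstracts, for illustration only
pico_keywords = {"metformin", "placebo", "hba1c", "diabetes"}
abstracts = [
    "metformin lowered hba1c versus placebo in type 2 diabetes",
    "metformin metformin metformin dosing pharmacokinetics",
]
ranked = sorted(abstracts, key=lambda a: keyword_entropy(a, pico_keywords), reverse=True)
```

An abstract that touches all four PICO elements evenly scores the maximum entropy (log2 of the number of covered terms), while one that repeats a single term scores zero.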
Procedia PDF Downloads 43
12985 Development of a Direct Immunoassay for Human Ferritin Using Diffraction-Based Sensing Method
Authors: Joel Ballesteros, Harriet Jane Caleja, Florian Del Mundo, Cherrie Pascual
Abstract:
Diffraction-based sensing was utilized in the quantification of human ferritin in blood serum to provide an alternative to the label-based immunoassays currently used in clinical diagnostics and research. The diffraction intensity was measured by the diffractive optics technology (dotLab™) system. Two methods were evaluated in this study: a direct immunoassay and a direct sandwich immunoassay. In the direct immunoassay, human ferritin was captured by human ferritin antibodies immobilized on an avidin-coated sensor, while the direct sandwich immunoassay had an additional step for the binding of a detector human ferritin antibody to the analyte complex. Both methods were repeatable, with coefficient of variation values below 15%. The direct sandwich immunoassay had a linear response from 10 to 500 ng/mL, which is wider than the 100-500 ng/mL of the direct immunoassay. The direct sandwich immunoassay also has a higher calibration sensitivity of 0.004 Diffractive Intensity (ng mL-1)-1, compared to the 0.002 Diffractive Intensity (ng mL-1)-1 of the direct immunoassay. The limit of detection and limit of quantification of the direct immunoassay were found to be 29 ng/mL and 98 ng/mL, respectively, while the direct sandwich immunoassay has a limit of detection (LOD) of 2.5 ng/mL and a limit of quantification (LOQ) of 8.2 ng/mL. In terms of accuracy, the direct immunoassay had a percent recovery of 88.8-93.0% in PBS, while the direct sandwich immunoassay had 94.1-97.2%. Based on the results, the direct sandwich immunoassay is the better diffraction-based immunoassay in terms of accuracy, LOD, LOQ, linear range, and sensitivity. The direct sandwich immunoassay was utilized in the determination of human ferritin in blood serum, and the results were validated by chemiluminescent magnetic immunoassay (CMIA). The calculated Pearson correlation coefficient was 0.995 and the p-values of the paired-sample t-test were less than 0.5, which shows that the results of the direct sandwich immunoassay were comparable to those of CMIA, and that it could be utilized as an alternative analytical method. Keywords: biosensor, diffraction, ferritin, immunoassay
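The abstract reports LOD and LOQ values without stating the estimation rule. A common convention (e.g. ICH guidance) derives them from the blank's standard deviation σ and the calibration slope S as LOD = 3.3σ/S and LOQ = 10σ/S; the sketch below uses hypothetical inputs, not the study's raw data:

```python
def lod_loq(sigma_blank, slope):
    """Detection and quantification limits by the common
    LOD = 3.3*sigma/S and LOQ = 10*sigma/S convention."""
    return 3.3 * sigma_blank / slope, 10.0 * sigma_blank / slope

# Hypothetical blank standard deviation (diffractive-intensity units)
# and calibration slope (intensity per ng/mL)
lod, loq = lod_loq(sigma_blank=0.002, slope=0.004)
```

Note how a steeper calibration slope (higher sensitivity) directly lowers both limits, which matches the sandwich format's better LOD/LOQ reported above.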
Procedia PDF Downloads 354
12984 Study of the Allelopathic Effects of Certain Aromatic Plants on Grapevines
Authors: Tinatin Shengelia, Mzia Beruashvili
Abstract:
In organic farming, including organic viticulture, biodiversity plays a crucial role. Properly selected ‘companion’ and helper plants create favorable conditions for the growth and development of the main crop. Additionally, they can provide protection from pests and diseases, suppress weeds, improve the crop’s visual and taste characteristics, enhance nutrient absorption from the soil, and, as a result of all these factors, increase yields. The use of companion plants is particularly relevant for organic farms, where the range of pesticides and fertilizers is significantly restricted by organic regulations and must be replaced with alternative, environmentally safe methods. Therefore, the aim of this research was to study the allelopathic effects of companion aromatic plants on grapevines. The research employed methods used in organic farming and the biological control of harmful organisms. The experiments were conducted in control and experimental plots, each with three replications on equal areas (50 m²). The allelopathic potential of medicinal hyssop (Hyssopus officinalis), basil (Ocimum basilicum), marigold or Imeretian saffron (Tagetes patula), and lavender (Lavandula angustifolia L.) was studied in vineyards located in the Mtskheta-Mtianeti and Kakheti regions. The impact of these plants on grapevines (Vitis vinifera L., variety Muscat petitgrain), their growth and development according to the BBCH scale, yields, and diseases caused by certain pathogenic microorganisms (downy mildew, powdery mildew, anthracnose) was determined. Additionally, the biological, agricultural, and economic efficiency of using these companion plants was assessed. Keywords: organic farming, biodiversity, allelopathy, aromatic plants
Procedia PDF Downloads 20
12983 Efficacy of Deep Learning for Below-Canopy Reconstruction of Satellite and Aerial Sensing Point Clouds through Fractal Tree Symmetry
Authors: Dhanuj M. Gandikota
Abstract:
Sensor-derived three-dimensional (3D) point clouds of trees are invaluable in remote sensing analysis for the accurate measurement of key structural metrics, bio-inventory values, spatial planning/visualization, and ecological modeling. Machine learning (ML) holds potential for addressing the restrictive tradeoffs in cost, spatial coverage, resolution, and information gain that exist in current point cloud sensing methods. Terrestrial laser scanning (TLS) remains the highest-fidelity source of both canopy and below-canopy structural features, but its usage is limited in both coverage and cost, requiring manual deployment to map out large, forested areas. While aerial laser scanning (ALS) remains a reliable avenue of active LiDAR remote sensing, it is also cost-restrictive in its deployment methods. Space-borne photogrammetry from high-resolution satellite constellations is an avenue of passive remote sensing with promising viability in research for the accurate construction of vegetation 3-D point clouds. It provides both the lowest comparative cost and the largest spatial coverage across remote sensing methods. However, both space-borne photogrammetry and ALS demonstrate technical limitations in the capture of valuable below-canopy point cloud data. Looking to minimize these tradeoffs, we explored a class of powerful ML algorithms called deep learning (DL) that shows promise in recent research on 3-D point cloud reconstruction and interpolation. Our research details the efficacy of applying these DL techniques to reconstruct accurate below-canopy point clouds from space-borne and aerial remote sensing through learned patterns of tree species fractal symmetry properties and the supplementation of locally sourced bio-inventory metrics. From our dataset, consisting of tree point clouds obtained from TLS, we deconstructed the point clouds of each tree into those that would be obtained through ALS and satellite photogrammetry of varying resolutions.
We fed this ALS/satellite point cloud dataset, along with the simulated local bio-inventory metrics, into the DL point cloud reconstruction architectures to generate the full 3-D tree point clouds (the truth values are denoted by the full TLS tree point clouds containing the below-canopy information). Point cloud reconstruction accuracy was validated both through the measurement of error from the original TLS point clouds as well as the error of extraction of key structural metrics, such as crown base height, diameter above root crown, and leaf/wood volume. The results of this research additionally demonstrate the supplemental performance gain of using minimum locally sourced bio-inventory metric information as an input in ML systems to reach specified accuracy thresholds of tree point cloud reconstruction. This research provides insight into methods for the rapid, cost-effective, and accurate construction of below-canopy tree 3-D point clouds, as well as the supported potential of ML and DL to learn complex, unmodeled patterns of fractal tree growth symmetry.Keywords: deep learning, machine learning, satellite, photogrammetry, aerial laser scanning, terrestrial laser scanning, point cloud, fractal symmetry
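Reconstruction error against the TLS reference clouds, as described above, can be quantified in several ways; one common choice (an assumption here, not necessarily the authors' metric) is the symmetric Chamfer distance:

```python
import math

def chamfer_distance(cloud_a, cloud_b):
    """Symmetric Chamfer distance between two 3-D point clouds:
    mean nearest-neighbour distance, taken in both directions."""
    def mean_nearest(src, dst):
        return sum(min(math.dist(p, q) for q in dst) for p in src) / len(src)
    return mean_nearest(cloud_a, cloud_b) + mean_nearest(cloud_b, cloud_a)

# Toy example: a TLS reference slice vs. a slightly offset reconstruction
tls = [(0.0, 0.0, 0.0), (0.0, 0.0, 1.0), (0.0, 0.0, 2.0)]
reconstructed = [(0.1, 0.0, 0.0), (0.0, 0.1, 1.0), (0.0, 0.0, 2.0)]
err = chamfer_distance(tls, reconstructed)
```

This brute-force version is O(n²); real point clouds would use a k-d tree for the nearest-neighbour search, and structural-metric errors (crown base height, volume) would be validated separately.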
Procedia PDF Downloads 102
12982 Periareolar Zigzag Incision in the Conservative Surgical Treatment of Breast Cancer
Authors: Beom-Seok Ko, Yoo-Seok Kim, Woo-Sung Lim, Ku-Sang Kim, Hyun-Ah Kim, Jin-Sun Lee, An-Bok Lee, Jin-Gu Bong, Tae-Hyun Kim, Sei-Hyun Ahn
Abstract:
Background: Breast conserving surgery (BCS) followed by radiation therapy is today the standard therapy for early breast cancer. It is a safe therapeutic procedure in early breast cancer because it provides the same level of overall survival as mastectomy. A number of different types of incisions are used in BCS. Women wish to avoid scars on the breast, and numerous minimal approaches have evolved due to this concern. A periareolar incision is often used when a small tumor lies relatively close to the nipple, but its disadvantages include limited exposure of the surgical field. In plastic surgery, various methods such as zigzag incisions have been recommended to achieve satisfactory esthetic results. The periareolar zigzag incision offers not only a good surgical field but also better surgical scars. The purpose of this study was to evaluate the oncological safety of the procedure by studying the status of the surgical margins of the excised tumor specimen, and to reduce the need for further surgery. Methods: Between January 2016 and September 2016, 148 women with breast cancer underwent BCS or mastectomy by the same surgeon at ASAN Medical Center. Patients were excluded from this study if they had bilateral breast cancer, underwent resection of other tumors, underwent axillary dissection, or received other incision methods. A periareolar zigzag incision was performed, and the excision margins of the specimen were examined on frozen sections and paraffin-embedded (permanent) sections in all patients. We retrospectively analyzed tumor characteristics, the operative time, the size of the specimen, and the distance from the tumor to the nipple. Results: A total of 148 patients were reviewed; 72 were included in the final analysis and 76 excluded. The mean age of the patients was 52.6 (range 25-19 years), median tumor size was 1.6 cm (range, 0.2-8.8), median tumor distance from the nipple was 4.0 cm (range, 1.0-9.0), median excised specimen size was 5.1 cm (range, 2.8-15.0), and median operation time was 70.0 minutes (range, 39-138). All patients were discharged with no sign of infection or skin necrosis. Free resection margins were confirmed by frozen biopsy and permanent biopsy in all samples. No patients underwent reoperation. Conclusions: We suggest that the periareolar zigzag incision can provide a good surgical field to remove a relatively large tumor and may provide cosmetically good outcomes. Keywords: periareolar zigzag incision, breast conserving surgery, breast cancer, resection margin
Procedia PDF Downloads 230
12981 Importance of Risk Assessment in Managers' Decision-Making Process
Authors: Mária Hudáková, Vladimír Míka, Katarína Hollá
Abstract:
Decision-making is the core of management and the result of conscious activity carried out in a particular environment and under concrete conditions. Managers decide on goals and procedures and on how to respond to changes and to the problems that arise. Their decisions affect the effectiveness, quality, economy, and overall success of every organisation. In spite of this fact, they do not pay sufficient attention to the individual steps of the decision-making process. They emphasise how to handle the individual methods and techniques of decision-making and neglect how to analyse the problem or assess the individual solution variants. In many cases, underestimating the analytical phase can lead to an incorrect assessment of the problem, which can then negatively influence its further solution. Based on our analysis of the theoretical approaches of individual authors dealing with this area, and on research carried out in Slovakia and abroad, we observe insufficient interest among managers in assessing risks in the decision-making process. The goal of this paper is to assess the risks in managers' decision-making processes relating to the conditions of the environment, to the subject's activity (the manager's personality), to the insufficient assessment of individual solution variants, and also to situations in which the arisen problem is not solved. The benefit of this paper is its effort to increase managers' need to deal with risks during the decision-making process. It is important for every manager to assess the risks in his or her decision-making process and to strive to take decisions that best reflect the basic conditions, states, and development of the environment, and, especially, decisions that contribute to achieving the organisation's goals as effectively as possible. Keywords: risk, decision-making, manager, process, analysis, source of risk
Procedia PDF Downloads 264
12980 Modern Information Security Management and Digital Technologies: A Comprehensive Approach to Data Protection
Authors: Mahshid Arabi
Abstract:
With the rapid expansion of digital technologies and the internet, information security has become a critical priority for organizations and individuals. The widespread use of digital tools such as smartphones and internet networks facilitates the storage of vast amounts of data, but simultaneously, vulnerabilities and security threats have significantly increased. The aim of this study is to examine and analyze modern methods of information security management and to develop a comprehensive model to counteract threats and information misuse. This study employs a mixed-methods approach, including both qualitative and quantitative analyses. Initially, a systematic review of previous articles and research in the field of information security was conducted. Then, using the Delphi method, interviews with 30 information security experts were conducted to gather their insights on security challenges and solutions. Based on the results of these interviews, a comprehensive model for information security management was developed. The proposed model includes advanced encryption techniques, machine learning-based intrusion detection systems, and network security protocols. AES and RSA encryption algorithms were used for data protection, and machine learning models such as Random Forest and Neural Networks were utilized for intrusion detection. Statistical analyses were performed using SPSS software. To evaluate the effectiveness of the proposed model, T-Test and ANOVA statistical tests were employed, and results were measured using accuracy, sensitivity, and specificity indicators of the models. Additionally, multiple regression analysis was conducted to examine the impact of various variables on information security. The findings of this study indicate that the comprehensive proposed model reduced cyber-attacks by an average of 85%. 
Statistical analysis showed that the combined use of encryption techniques and intrusion detection systems significantly improves information security. Based on the obtained results, it is recommended that organizations continuously update their information security systems and use a combination of multiple security methods to protect their data. Additionally, educating employees and raising public awareness about information security can serve as an effective tool in reducing security risks. This research demonstrates that effective and up-to-date information security management requires a comprehensive and coordinated approach, including the development and implementation of advanced techniques and continuous training of human resources.Keywords: data protection, digital technologies, information security, modern management
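The evaluation above reports accuracy, sensitivity, and specificity for the intrusion-detection models. As a brief sketch (with hypothetical confusion-matrix counts, not the study's data), these indicators are computed as:

```python
def classification_metrics(tp, fp, tn, fn):
    """Accuracy, sensitivity (true-positive rate) and specificity
    (true-negative rate) from confusion-matrix counts."""
    accuracy = (tp + tn) / (tp + fp + tn + fn)
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return accuracy, sensitivity, specificity

# Hypothetical intrusion-detection results: 85/100 attacks flagged,
# 5 false alarms among 100 benign sessions
acc, sens, spec = classification_metrics(tp=85, fp=5, tn=95, fn=15)
```

In an intrusion-detection setting, sensitivity measures the share of attacks actually caught, while specificity measures how rarely benign traffic triggers a false alarm; both are needed to judge a detector fairly.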
Procedia PDF Downloads 29
12979 African Swine Fever Situation and Diagnostic Methods in Lithuania
Authors: Simona Pileviciene
Abstract:
On 24th January 2014, Lithuania notified two primary cases of African swine fever (ASF) in wild boars. The animals tested positive for the ASF virus (ASFV) genome by real-time PCR at the National Reference Laboratory for ASF in Lithuania (NRL); the results were confirmed by the European Union Reference Laboratory for African swine fever (CISA-INIA). An intensive monitoring program for wild and domestic animals was started. During the period 2014-2017, ASF was confirmed in two large commercial pig holdings with the highest biosecurity; the pigs were killed and destroyed. Since 2014, the ASF-affected territory has expanded from the east and south toward the middle of Lithuania. Diagnosis by PCR is one of the diagnostic methods highly recommended by the World Organisation for Animal Health (OIE) for the diagnosis of ASF. The aim of the present study was to compare singleplex real-time PCR assays with a duplex assay allowing the identification of ASF and an internal control in a single PCR tube, and to compare the effectiveness of primers that target the p72 gene (ASF 250 bp and ASF 75 bp). Multiplex real-time PCR assays prove to be less time-consuming and more cost-efficient and therefore have a high potential to be applied in routine analysis. It is important to have an effective and fast method that allows virus detection at the beginning of the disease in the wild boar population and in outbreaks among domestic pigs. For the experiments, we used reference samples (INIA, Spain) and positive samples from infected animals in Lithuania. Results show 100% sensitivity and specificity.
Keywords: African swine fever, real-time PCR, wild boar, domestic pig
Procedia PDF Downloads 166
12978 Prevalence and Genetic Determinant of Drug Resistant Tuberculosis among Patients Completing Intensive Phase of Treatment in a Tertiary Referral Center in Nigeria
Authors: Aminu Bashir Mohammad, Agwu Ezera, Abdulrazaq G. Habib, Garba Iliyasu
Abstract:
Background: Drug-resistant tuberculosis (DR-TB) continues to be a challenge in resource-poor developing countries. Routine screening for primary DR-TB before commencing treatment is not done in public hospitals in Nigeria, even with the large body of evidence showing a high prevalence of primary DR-TB. Data on drug resistance and its genetic determinants among follow-up TB patients are lacking in Nigeria. Hence, the aim of this study was to determine the prevalence and genetic determinants of drug resistance among follow-up TB patients in a tertiary hospital in Nigeria. Methods: This was a cross-sectional laboratory-based study conducted on 384 sputum samples collected from consenting follow-up tuberculosis patients. Standard microbiology methods (Ziehl-Neelsen staining and microscopy) and PCR (Line Probe Assay) were used to analyze the samples collected. Pearson's chi-square test was used to analyze the data generated. Results: Out of the three hundred and eighty-four (384) sputum samples analyzed for Mycobacterium tuberculosis (MTB) and DR-TB, twenty-five (25; 6.5%) were found to be AFB positive. These samples were subjected to PCR (Line Probe Assay), out of which 18 (72%) tested positive for DR-TB. Mutations conferring resistance to rifampicin (rpoB) and isoniazid (katG and/or inhA) were detected in 12/18 (66.7%) and 6/18 (33.3%), respectively. The transmission dynamics of DR-TB were not significantly (p>0.05) dependent on demographic characteristics. Conclusion: There is a need to strengthen the laboratory capacity for the diagnosis of TB and drug resistance testing and to make these services available, affordable, and accessible to the patients who need them.
Keywords: drug resistance tuberculosis, genetic determinant, intensive phase, Nigeria
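The chi-square test used above to relate DR-TB to demographic characteristics can be sketched as follows; the 2x2 counts are hypothetical, not the study's data:

```python
import numpy as np

def chi2_2x2(table):
    """Pearson chi-square statistic for a 2x2 contingency table."""
    t = np.asarray(table, dtype=float)
    # Expected counts under independence: (row total * column total) / grand total
    expected = t.sum(axis=1, keepdims=True) * t.sum(axis=0, keepdims=True) / t.sum()
    return float(((t - expected) ** 2 / expected).sum())

# Hypothetical counts (NOT the study's data): DR-TB status by sex.
#            DR-TB   no DR-TB
# male        10        90
# female       8        92
stat = chi2_2x2([[10, 90], [8, 92]])
significant = stat > 3.841   # 5% critical value for 1 degree of freedom
```

With these made-up counts the statistic is about 0.24, far below 3.841, mirroring the study's non-significant (p>0.05) association with demographics.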
Procedia PDF Downloads 285
12977 Using Seismic Base Isolation Systems in High-Rise Hospital Buildings and a Hybrid Proposal
Authors: Elif Bakkaloglu, Necdet Torunbalci
Abstract:
Earthquakes in Turkiye are an inevitable natural hazard, so buildings must be prepared for them. Earthquake resistance is an especially essential point for hospital buildings, because hospitals are among the first places people go after an earthquake. Although hospital buildings are better suited to horizontal architecture, it is often necessary to construct and expand multi-storey hospital buildings because excessive urbanization makes appropriately sized land scarce and land values high. In Turkiye, the use of seismic isolators in public hospitals that are located in first-degree earthquake zones and have more than 100 beds has been made obligatory by a general directive. As a result of this decision, it may sometimes be necessary to construct seismically isolated multi-storey hospital buildings in cities where those land problems are experienced. Although seismic isolators are widely used in Japan, there are few multi-storey buildings in Turkiye in which they are used. Base isolation systems are among the most effective methods of earthquake protection; however, as the number of floors increases, the center of gravity moves away from the base, increasing the overturning effect and limiting the use of these systems. In this context, this study investigates the structural systems of multi-storey buildings built around the world using seismic isolation methods. In addition, a working principle is suggested for disseminating seismic isolators in multi-storey hospital buildings. The results obtained from the study will guide architects who design multi-storey hospital buildings in their architectural designs, and engineers in terms of structural system design.
Keywords: earthquake, energy absorbing systems, hospital, seismic isolation systems
Procedia PDF Downloads 151
12976 Homogenization of a Non-Linear Problem with a Thermal Barrier
Authors: Hassan Samadi, Mustapha El Jarroudi
Abstract:
In this work, we consider the homogenization of a non-linear problem in a periodic medium consisting of two periodic connected media exchanging a heat flux across their common interface. The interfacial exchange coefficient λ is assumed to tend to zero or to infinity at a rate λ=λ(ε) as the size ε of the basic cell tends to zero. Three homogenized problems are determined according to a critical value depending on λ and ε. Our method is based on Γ-convergence techniques.
Keywords: variational methods, epiconvergence, homogenization, convergence technique
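Although the abstract does not state the functional explicitly, two-component interface-exchange problems of this type are typically posed (written here as an assumption, not the authors' exact model) via an energy with a bulk term in each component and an interfacial exchange term weighted by λ(ε):

```latex
F_\varepsilon(u_1,u_2)
  \;=\; \sum_{i=1}^{2}\int_{\Omega_i^{\varepsilon}} W_i(\nabla u_i)\,dx
  \;+\; \lambda(\varepsilon)\int_{\Gamma_{\varepsilon}} \lvert u_1-u_2\rvert^{2}\,d\sigma .
```

The three homogenized limits then correspond to whether λ(ε), scaled by the interface area per unit volume, tends to zero, to a finite positive constant, or to infinity as ε → 0.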
Procedia PDF Downloads 525
12975 Contactless Electromagnetic Detection of Stress Fluctuations in Steel Elements
Authors: M. A. García, J. Vinolas, A. Hernando
Abstract:
Steel is nowadays one of the most important structural materials because of its outstanding mechanical properties. Therefore, in the search for a sustainable economic model that optimizes the use of extensive resources, new methods to monitor and prevent the failure of steel-based facilities are required. Classical mechanical tests, such as load testing of buildings, are invasive and destructive. Moreover, for facilities where the steel element is embedded (as in reinforced concrete), these techniques are simply not applicable. Hence, non-invasive monitoring techniques that prevent failure without altering the structural properties of the elements are required. Among them, electromagnetic methods are particularly suitable for the non-invasive inspection of the mechanical state of steel-based elements. Magnetoelastic coupling effects induce a modification of the electromagnetic properties of an element upon applied stress. Since most steels are ferromagnetic because of their large Fe content, it is possible to inspect their structure and state in a non-invasive way. We present here a distinct electromagnetic method for the contactless evaluation of internal stress in steel-based elements. In particular, this method relies on measuring the magnetic induction between two coils with the steel specimen placed between them. We found that the stress-induced alteration of the electromagnetic properties of the steel specimen changed the measured induction, allowing us to detect stress well below half of the elastic limit of the material. Hence, it represents an outstanding non-invasive method for preventing failure in steel-based facilities. We describe the theoretical model, present experimental results to validate it, and finally show a practical application for the detection of stress and inhomogeneities in train railways.
Keywords: magnetoelastic, magnetic induction, mechanical stress, steel
Procedia PDF Downloads 50
12974 Suicide, Help-Seeking and LGBT Youth: A Mixed Methods Study
Authors: Elizabeth McDermott, Elizabeth Hughes, Victoria Rawlings
Abstract:
Globally, suicide is the second leading cause of death among 15-29 year-olds. Young people who identify as lesbian, gay, bisexual and transgender (LGBT) have elevated rates of suicide and self-harm. Despite the increased risk, there is a paucity of research on LGBT help-seeking and suicidality. This is the first national study to investigate LGBT youth help-seeking for suicidal feelings and self-harm. We report on a UK sequential exploratory mixed-methods study that employed face-to-face and online methods in two stages. Stage one involved 29 online (n=15) and face-to-face (n=14) semi-structured interviews with LGBT youth aged under 25 years. Stage two utilized an online LGBT youth questionnaire employing a community-based sampling strategy (n=789). We found across the sample that LGBT youth who self-harmed or felt suicidal were reluctant to seek help. Results indicated that participants normalized their emotional distress and only asked for help when they reached a crisis point and were no longer coping. Those who self-harmed (p<0.001, OR=2.82), had attempted or planned suicide (p<0.05, OR=1.48), or had experienced abuse related to their sexuality or gender (p<0.01, OR=1.80) were the most likely to seek help. A number of interconnecting reasons contributed to participants' problems in accessing help, the most prominent being: negotiating norms in relation to sexuality, gender, mental health and age; being unable to talk about emotions; and coping and self-reliance. It is crucial that policies and practices aiming to prevent LGBT youth suicide recognize that norms and normalizing processes connected to sexual orientation and gender identity are additional difficulties that LGBT youth face in accessing mental health support.
Keywords: help-seeking, LGBT, suicide, youth
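The odds ratios reported above come from standard 2x2 contingency analysis; a minimal sketch, using hypothetical counts rather than the survey's data, is:

```python
def odds_ratio(exposed_cases, exposed_controls, unexposed_cases, unexposed_controls):
    """Odds ratio for a 2x2 table: the odds of the outcome among the
    exposed group divided by the odds among the unexposed group."""
    return (exposed_cases * unexposed_controls) / (exposed_controls * unexposed_cases)

# Hypothetical counts only: help-seeking (outcome) among youth who
# self-harmed (exposed) versus those who did not. An OR > 1 means the
# exposed group sought help more often.
or_self_harm = odds_ratio(40, 60, 20, 80)
```

Confidence intervals and p-values would normally accompany each OR, as in the abstract's reporting.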
Procedia PDF Downloads 275
12973 Green Organic Chemistry, a New Paradigm in Pharmaceutical Sciences
Authors: Pesaru Vigneshwar Reddy, Parvathaneni Pavan
Abstract:
Green organic chemistry, one of the most actively researched topics in chemistry today, has been in development since the 1990s. Organic chemicals are important starting materials for a great number of major chemical industries: their production as raw materials or reagents for other applications is a major manufacturing sector, covering polymers, pharmaceuticals, pesticides, paints, artificial fibers, food additives, etc. Organic synthesis on a large scale, compared to the laboratory scale, involves the use of energy, basic chemical ingredients from the petrochemical sector, and catalysts, and, after the end of the reaction, separation, purification, storage, packing, distribution, etc. These processes raise many health and safety problems for workers, in addition to the environmental problems caused by the use of chemicals and their deposition as waste. Green chemistry, with its 12 principles, seeks changes to the conventional ways that were used for decades to make synthetic organic chemicals, including the use of less toxic starting materials. Green chemistry aims to increase the efficiency of synthetic methods, to use less toxic solvents, to reduce the number of stages in synthetic routes, and to minimize waste as far as practically possible. In this way, organic synthesis will be part of the effort for sustainable development. Green chemistry is also the subject of research and innovative alternatives covering many practical aspects of organic synthesis in university and institutional research laboratories. By changing the methodologies of organic synthesis, health and safety will be advanced not only at the small laboratory scale but will also be extended to industrial large-scale production processes through new techniques.
Three key developments in green chemistry include the use of supercritical carbon dioxide as a green solvent, aqueous hydrogen peroxide as an oxidising agent, and the use of hydrogen in asymmetric synthesis. The field also focuses on replacing traditional methods of heating with modern methods such as microwave irradiation, so that the carbon footprint is reduced as far as possible. Another benefit of green chemistry is that it will reduce environmental pollution through the use of less toxic reagents, the minimization of waste, and more biodegradable by-products. In this paper, some of the basic principles, approaches, and early achievements of green chemistry are considered, together with a summary of its principles. A discussion of the E-factor, the old and new syntheses of ibuprofen, microwave techniques, and some recent advancements is also included.
Keywords: energy, e-factor, carbon footprint, microwave, sonochemistry, advancement
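One of the quantitative tools mentioned above, Sheldon's E-factor, is simple to state: the mass of waste generated per mass of product. A minimal sketch with illustrative masses (not figures from the paper):

```python
def e_factor(total_waste_kg, product_kg):
    """Sheldon's E-factor: kg of waste generated per kg of product.
    Lower is greener; pharmaceutical processes have historically run
    in the range of roughly 25-100."""
    return total_waste_kg / product_kg

# Illustrative batch: 250 kg of total waste to make 10 kg of product.
ef = e_factor(250.0, 10.0)
```

Comparing E-factors before and after a route change (e.g., the old versus new ibuprofen syntheses discussed in the paper) gives a direct measure of the greenness gained.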
Procedia PDF Downloads 306
12972 Current Epizootic Situation of Q Fever in Polish Cattle
Authors: Monika Szymańska-Czerwińska, Agnieszka Jodełko, Krzysztof Niemczuk
Abstract:
Q fever (coxiellosis) is an infectious disease of animals and humans caused by C. burnetii and widely distributed throughout the world. Cattle and small ruminants are commonly known shedders of C. burnetii. The aims of this study were to evaluate the seroprevalence and shedding of C. burnetii in cattle. Genotypes of the pathogen present in the tested specimens were also identified using the MLVA (Multiple Locus Variable-Number Tandem Repeat Analysis) and MST (multispacer sequence typing) methods. Sampling was conducted in different regions of Poland in 2018-2021. In total, 2180 bovine serum samples from 801 cattle herds were tested by ELISA (enzyme-linked immunosorbent assay). 489 specimens from 157 cattle herds, including individual milk samples (n=407), bulk tank milk (n=58), vaginal swabs (n=20), placenta (n=3) and feces (n=1), were subjected to C. burnetii-specific qPCR. The qPCR (targeting the IS1111 transposon-like repetitive region) was performed using the Adiavet COX RealTime PCR kit. Genotypic characterization of the strains was conducted utilizing the MLVA and MST methods; MLVA was performed using 6 variable loci. The overall herd-level seroprevalence of C. burnetii infection was 36.74% (801/2180). Shedders were detected in 29.3% (46/157) of cattle herds in all tested regions. The ST61 sequence type was identified in 10 out of 18 genotyped strains. Interestingly, one strain represented a sequence type that has never been recorded previously. The MLVA method identified three previously known genotypes: the most common was J, but I and BE were also recognized. Moreover, one genotype had never been described previously. Seroprevalence and shedding of C. burnetii in cattle are common, and the strains are genetically diverse.
Keywords: Coxiella burnetii, cattle, MST, MLVA, Q fever
Procedia PDF Downloads 86
12971 Relative Effectiveness of Inquiry Approach and Expository Instructional Methods in Fostering Students’ Retention in Chemistry
Authors: Joy Johnbest Egbo
Abstract:
The study was designed to investigate the relative effectiveness of the inquiry-role approach and the expository instructional method in fostering students’ retention in chemistry. Two research questions were answered, and three null hypotheses were formulated and tested at the 0.05 level of significance. A quasi-experimental (non-equivalent pretest, posttest control group) design was adopted for the study. The population for the study comprised all senior secondary school class two (SS II) students offering Chemistry in single-sex schools in the Enugu Education Zone. The instrument for data collection was a self-developed Chemistry Retention Test (CRT). Relevant data were collected from a sample of one hundred and forty-one (141) students drawn from two secondary schools (1 male and 1 female school) using a simple random sampling technique. A reliability coefficient of 0.82 was obtained for the instrument using the Kuder-Richardson formula 20 (KR-20). Mean and standard deviation scores were used to answer the research questions, while two-way analysis of covariance (ANCOVA) was used to test the hypotheses. The findings showed that students taught with the inquiry-role approach retained the chemistry concepts significantly better than their counterparts taught with the expository method. Female students retained slightly better than their male counterparts. There is a significant interaction between instructional package and gender on chemistry students’ retention. It was recommended, among others, that teachers should be encouraged to employ the inquiry-role approach more in the teaching of chemistry and other subjects in general; by so doing, students’ retention of the subject could be increased.
Keywords: inquiry role approach, retention, exposition method, chemistry
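The KR-20 reliability coefficient used for the Chemistry Retention Test can be computed from a dichotomous (right/wrong) item-response matrix; a sketch with a small made-up matrix, not the study's data:

```python
import numpy as np

def kr20(responses):
    """Kuder-Richardson formula 20: reliability of a test with
    dichotomous (0/1) items. Rows = examinees, columns = items."""
    x = np.asarray(responses, dtype=float)
    k = x.shape[1]                          # number of items
    p = x.mean(axis=0)                      # proportion correct per item
    var_total = x.sum(axis=1).var(ddof=0)   # variance of total scores
    return (k / (k - 1)) * (1 - (p * (1 - p)).sum() / var_total)

# Made-up responses of 4 examinees to a 4-item test.
r = kr20([[1, 1, 1, 1],
          [1, 1, 1, 0],
          [1, 1, 0, 0],
          [0, 0, 0, 0]])
```

A value near the study's 0.82 would similarly indicate acceptable internal consistency for a retention test.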
Procedia PDF Downloads 513
12970 Antagonistic Potential of Epiphytic Bacteria Isolated in Kazakhstan against Erwinia amylovora, the Causal Agent of Fire Blight
Authors: Assel E. Molzhigitova, Amankeldi K. Sadanov, Elvira T. Ismailova, Kulyash A. Iskandarova, Olga N. Shemshura, Ainur I. Seitbattalova
Abstract:
Fire blight is a quarantine bacterial disease that is very harmful to commercial apple and pear production. To date, several different methods have been proposed for disease control, including the use of copper-based preparations and antibiotics, which are not always reliable or effective. The use of bacteria as biocontrol agents is one of the most promising and eco-friendly alternative methods. Bacteria with protective activity against the causal agent of fire blight are often present among the epiphytic microorganisms of the phyllosphere of host plants. Therefore, the main objective of our study was to screen local epiphytic bacteria as possible antagonists against Erwinia amylovora, the causal agent of fire blight. Samples of infected organs of apple and pear trees (shoots, leaves, fruits) were collected from industrial horticulture areas in various agro-ecological zones of Kazakhstan. Epiphytic microorganisms were isolated by standard and modified methods on specific nutrient media. The primary screening of selected microorganisms under laboratory conditions, to determine their ability to suppress the growth of Erwinia amylovora, was performed by the agar diffusion test. Among 142 bacteria isolated from fire blight host plants, 5 isolates, belonging to the genera Bacillus, Lactobacillus, Pseudomonas, Paenibacillus and Pantoea, showed high antagonistic activity against the pathogen. The diameters of the inhibition zones depended on the species and ranged from 10 mm to 48 mm. The maximum inhibition zone diameter (48 mm) was exhibited by B. amyloliquefaciens. A smaller inhibitory effect was shown by Pantoea agglomerans PA1 (19 mm). The study of the inhibitory effect of Lactobacillus species against E. amylovora showed that, among the 7 isolates tested, only one (Lactobacillus plantarum 17M) produced an inhibition zone (30 mm).
In summary, this study was devoted to detecting beneficial epiphytic bacteria from the plant organs of pear and apple trees for fire blight control in Kazakhstan. Results obtained from the in vitro experiments showed that the most efficient bacterial isolates are Lactobacillus plantarum 17M, Bacillus amyloliquefaciens MB40, and Pantoea agglomerans PA1. These antagonists are suitable for development as biocontrol agents for fire blight control. Their efficacies will additionally be evaluated in biological tests under in vitro and field conditions in our further study.
Keywords: antagonists, epiphytic bacteria, Erwinia amylovora, fire blight
Procedia PDF Downloads 166
12969 Determination of Klebsiella Pneumoniae Susceptibility to Antibiotics Using Infrared Spectroscopy and Machine Learning Algorithms
Authors: Manal Suleiman, George Abu-Aqil, Uraib Sharaha, Klaris Riesenberg, Itshak Lapidot, Ahmad Salman, Mahmoud Huleihel
Abstract:
Klebsiella pneumoniae is one of the most aggressive multidrug-resistant bacteria associated with human infections, resulting in high mortality and morbidity. Thus, for effective treatment, it is important to diagnose both the species of the infecting bacteria and their susceptibility to antibiotics. Currently used methods for diagnosing bacterial susceptibility to antibiotics are time-consuming (about 24 h following the first culture). Thus, there is a clear need for rapid methods to determine bacterial susceptibility to antibiotics. Infrared spectroscopy is a well-known sensitive and simple method that is able to detect minor biomolecular changes in biological samples associated with developing abnormalities. The main goal of this study is to evaluate the potential of infrared spectroscopy in tandem with the Random Forest and XGBoost machine learning algorithms to diagnose the susceptibility of Klebsiella pneumoniae to antibiotics within approximately 20 minutes following the first culture. In this study, 1190 Klebsiella pneumoniae isolates were obtained from different patients with urinary tract infections. The isolates were measured with an infrared spectrometer, and the spectra were analyzed by the Random Forest and XGBoost machine learning algorithms to determine their susceptibility to nine specific antibiotics. Our results confirm that it was possible to classify the isolates into sensitive and resistant to specific antibiotics with success rates in the range of 80%-85% for the different tested antibiotics. These results prove the promising potential of infrared spectroscopy as a powerful diagnostic method for determining Klebsiella pneumoniae susceptibility to antibiotics.
Keywords: urinary tract infection (UTI), Klebsiella pneumoniae, bacterial susceptibility, infrared spectroscopy, machine learning
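The spectra-to-label classification step can be illustrated end to end. To keep the sketch dependency-free, a nearest-centroid classifier stands in for the Random Forest/XGBoost models used in the study, and the "spectra" are synthetic, with resistant isolates carrying a small absorbance shift in one spectral window:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in spectra: 100 "wavenumbers" per isolate.
n, d = 60, 100
X_sensitive = rng.normal(0.0, 0.1, (n, d))
X_resistant = rng.normal(0.0, 0.1, (n, d))
X_resistant[:, 40:45] += 0.5            # biomolecular signature of resistance
X = np.vstack([X_sensitive, X_resistant])
y = np.array([0] * n + [1] * n)         # 0 = sensitive, 1 = resistant

# Nearest-centroid classification (a minimal stand-in for RF/XGBoost).
c0 = X[y == 0].mean(axis=0)
c1 = X[y == 1].mean(axis=0)
pred = (np.linalg.norm(X - c1, axis=1) < np.linalg.norm(X - c0, axis=1)).astype(int)
accuracy = float((pred == y).mean())
```

In practice, per-antibiotic models and held-out validation would be used, and accuracy would be reported alongside sensitivity and specificity as in the abstract.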
Procedia PDF Downloads 168
12968 A Relative Entropy Regularization Approach for Fuzzy C-Means Clustering Problem
Authors: Ouafa Amira, Jiangshe Zhang
Abstract:
Clustering is an unsupervised machine learning technique; its aim is to extract the data structure, grouping similar data objects in the same cluster and dissimilar objects in different clusters. Clustering methods are widely utilized in different fields, such as image processing, computer vision, and pattern recognition. Fuzzy c-means clustering (FCM) is one of the most well-known fuzzy clustering methods. It is based on solving an optimization problem in which a given cost function is minimized. This minimization aims to decrease the dissimilarity inside clusters, where dissimilarity is measured by the distances between data objects and cluster centers. The degree of belonging of a data point to a cluster is measured by a membership value lying in the interval [0, 1]. In FCM clustering, the membership degree is constrained by the condition that the sum of a data object’s memberships in all clusters must be equal to one. This constraint can cause several problems, especially when the data objects lie in a noisy space. Regularization has therefore been introduced into the fuzzy c-means technique: it adds information in order to solve an ill-posed optimization problem. In this study, we focus on regularization by a relative entropy approach, where the optimization problem aims to minimize the dissimilarity inside clusters. Finding an appropriate membership degree for each data object is our objective, because an appropriate membership degree leads to an accurate clustering result. Our clustering results on synthetic data sets, Gaussian-based data sets, and real-world data sets show that the proposed model achieves good accuracy.
Keywords: clustering, fuzzy c-means, regularization, relative entropy
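Entropy-regularized fuzzy clustering admits a closed-form membership update: minimizing the within-cluster distances plus an entropy penalty λ·Σ u·log u, subject to memberships summing to one per point, yields u_ij ∝ exp(-d_ij²/λ). The sketch below implements this generic scheme (an assumption; it is not necessarily the exact model proposed in the paper):

```python
import numpy as np

def entropy_fcm(X, centers, lam=0.5, iters=50):
    """Alternate the closed-form membership update u_ij ∝ exp(-d_ij^2/lam)
    with weighted-mean center updates."""
    centers = np.array(centers, dtype=float)
    for _ in range(iters):
        # Squared distances of every point to every center: shape (n, k)
        d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=-1)
        u = np.exp(-d2 / lam)
        u /= u.sum(axis=1, keepdims=True)           # memberships sum to 1 per point
        centers = (u.T @ X) / u.sum(axis=0)[:, None]  # membership-weighted means
    return u, centers

# Two well-separated Gaussian blobs; initialize with one point from each.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0.0, 0.3, (50, 2)),
               rng.normal(5.0, 0.3, (50, 2))])
u, centers = entropy_fcm(X, centers=X[[0, -1]])
labels = u.argmax(axis=1)
```

The parameter λ plays the fuzzifier's role: larger λ spreads membership more evenly across clusters, smaller λ approaches hard k-means assignments.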
Procedia PDF Downloads 259
12967 Groupthink: The Dark Side of Team Cohesion
Authors: Farhad Eizakshiri
Abstract:
The potential for groupthink to explain the issues contributing to the deterioration of decision-making ability within a unitary team, and so to cause poor outcomes, has attracted a great deal of attention from a variety of disciplines, including psychology, social and organizational studies, political science, and others. Yet what remains unclear is how and why team members’ strivings for unanimity and cohesion override their motivation to realistically appraise alternative courses of action. This paper reports the findings of a sequential explanatory mixed-methods study that investigated this issue through an experiment with thirty groups of three persons each and interviews with all experimental groups. The experiment examined how individuals aggregate their views in order to reach a consensual group decision concerning the completion time of a task. The results indicated that groups made better estimates when there was no interaction between members than when groups collectively agreed on time estimates. To understand the reasons, the qualitative data and informal observations collected during the task were analyzed through conversation analysis, leading to four reasons that caused teams to neglect divergent viewpoints and reduce the number of ideas being considered: the concurrence-seeking tendency, pressure on dissenters, self-censorship, and the illusion of invulnerability. It is suggested that understanding the dynamics behind these causes of groupthink will help project teams avoid making premature group decisions by enhancing careful evaluation of available information and analysis of available decision alternatives and choices.
Keywords: groupthink, group decision, cohesiveness, project teams, mixed-methods research
Procedia PDF Downloads 396
12966 The Sensitivity of Electrical Geophysical Methods for Mapping Salt Stores within the Soil Profile
Authors: Fathi Ali Swaid
Abstract:
Soil salinization is one of the most hazardous phenomena accelerating land degradation processes. It either occurs naturally or is human-induced. High levels of soil salinity negatively affect crop growth and productivity, ultimately leading to land degradation. Thus, it is important to monitor and map soil salinity at an early stage so as to enact an effective soil reclamation program that helps lessen or prevent future increases in soil salinity. Geophysical methods have outperformed traditional methods for assessing soil salinity, offering more informative and rapid assessment techniques for monitoring and mapping. Soil sampling, EM38, and 2D conductivity imaging have been evaluated for their ability to delineate and map the level of salinity variations at Second Ponds Creek. The three methods have shown that the subsoil in the study area is saline, and salt variations were successfully observed with each method. However, the EM38 readings and the 2D inversion data show a clearer spatial structure than the EC1:5 of the soil samples, even though the soil samples, EM38 readings, and 2D imaging were all collected at the same locations. Because EM38 readings and 2D imaging data are a weighted average of electrical soil conductance, they are more representative of soil properties than the soil samples. The mapping of subsurface soil in the study area has been successful, and resistivity imaging has proven to be an advantage. The soil salinity analysis (EC1:5) corresponds well with the true resistivity, yielding a good characterization of soil salinity. Soil salinity clearly indicated by the previous EM38 investigation has been confirmed by the interpretation of the true resistivity at the study area.
Keywords: 2D conductivity imaging, EM38 readings, soil salinization, true resistivity, urban salinity
Procedia PDF Downloads 376
12965 Evaluation of Electro-Flocculation for Biomass Production of the Marine Microalga Phaeodactylum tricornutum
Authors: Luciana C. Ramos, Leandro J. Sousa, Antônio Ferreira da Silva, Valéria Gomes Oliveira Falcão, Suzana T. Cunha Lima
Abstract:
The commercial production of biodiesel using microalgae demands a high energy input for harvesting biomass, making production economically unfeasible. Methods currently used involve mechanical, chemical, and biological procedures. In this work, a flocculation system is presented as a cost- and energy-effective process to increase the biomass production of Phaeodactylum tricornutum. This diatom, the only species of its genus, presents the fast growth and lipid accumulation ability that are of great interest for biofuel production. The algae, selected from the Bank of Microalgae, Institute of Biology, Federal University of Bahia (Brazil), were grown in a tubular reactor with a 12 h (light/dark) photoperiod, providing an irradiance of about 35 μmol photons m-2 s-1, at a temperature of 22 °C. The medium used for growing the cells was Conway medium, with the addition of silica. The growth curve was followed by cell counting in a Neubauer chamber and by optical density in a spectrophotometer at 680 nm. Harvesting took place at the end of the stationary phase of growth, 21 days after inoculation, using two methods: centrifugation at 5000 rpm for 5 min, and electro-flocculation at 19 EPD and 95 W. After harvesting, the cells were frozen at -20 °C and subsequently lyophilized. The biomass obtained by electro-flocculation was approximately four times greater than that achieved by centrifugation. The benefits of this method are that no chemical flocculants need to be added and that similar cultivation conditions can be used for both biodiesel production and pharmacological purposes. The results may contribute to reducing biodiesel production costs using marine microalgae.
Keywords: biomass, diatom, flocculation, microalgae
Procedia PDF Downloads 330
12964 On the Existence of Homotopic Mapping Between Knowledge Graphs and Graph Embeddings
Authors: Jude K. Safo
Abstract:
Knowledge Graphs (KG) and their relation to Graph Embeddings (GE) represent a unique data structure in the landscape of machine learning (relative to image, text, and acoustic data). Unlike the latter, GEs are the only data structure sufficient for representing the hierarchically dense, semantic information needed for use cases like supply chain data and protein folding, where the search space exceeds the limits of traditional search methods (e.g., PageRank, Dijkstra, etc.). While GEs are effective for compressing low-rank tensor data, at scale they begin to introduce a new problem of 'data retrieval', which we also observe in Large Language Models. Notable attempts such as TransE, TransR, and other prominent industry standards have shown peak performance just north of 57% on the WN18 and FB15K benchmarks, which is insufficient for practical industry applications. They are also limited in scope to next-node/link prediction. Traditional linear methods like Tucker, CP, PARAFAC, and CANDECOMP quickly hit memory limits on tensors exceeding 6.4 million nodes. This paper outlines a topological framework for a linear mapping between concepts in KG space and GE space that preserves cardinality. Most importantly, we introduce a traceable framework for composing dense linguistic structures, and we demonstrate the performance this model achieves on the WN18 benchmark. The model does not rely on Large Language Models (LLMs), though the applications are certainly relevant there as well.
Keywords: representation theory, large language models, graph embeddings, applied algebraic topology, applied knot theory, combinatorics
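The TransE baseline referenced above treats a relation as a translation in embedding space: a triple (h, r, t) is plausible when h + r ≈ t. A minimal sketch with random vectors (illustrative only, not trained embeddings):

```python
import numpy as np

def transe_score(h, r, t):
    """TransE plausibility score: negative L2 distance ||h + r - t||.
    Scores closer to zero indicate more plausible triples."""
    return -float(np.linalg.norm(h + r - t))

rng = np.random.default_rng(0)
dim = 8
h = rng.normal(size=dim)                             # head entity embedding
r = rng.normal(size=dim)                             # relation embedding
t_true = h + r + rng.normal(scale=0.01, size=dim)    # tail that fits the translation
t_random = rng.normal(size=dim)                      # unrelated entity

plausible_wins = transe_score(h, r, t_true) > transe_score(h, r, t_random)
```

Link prediction then amounts to ranking candidate tails by this score, which is exactly the next-node/link-prediction scope the abstract criticizes as too narrow.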
Procedia PDF Downloads 68