Search results for: sensory processing sensitivity
1464 Gender Policies and Political Culture: An Examination of the Canadian Context
Authors: Chantal Maille
Abstract:
This paper is about gender-based analysis plus (GBA+), an intersectional gender policy used in Canada to assess the impact of policies and programs for men and women from different origins. It looks at Canada’s political culture to explain the nature of its gender policies. GBA+ is defined as an analysis method that makes it possible to assess the eventual effects of policies, programs, services, and other initiatives on women and men of different backgrounds because it takes account of gender and other identity factors. The ‘plus’ in the name serves to emphasize that GBA+ goes beyond gender to include an examination of a wide range of other related identity factors, such as age, education, language, geography, culture, and income. The point of departure for GBA+ is that women and men are not homogeneous populations and gender is never the only factor in defining a person’s identity; rather, it interacts with factors such as ethnic origin, age, disabilities, where the person lives, and other aspects of individual and social identity. GBA+ takes account of these factors and thus challenges notions of similarity or homogeneity within populations of women and men. Comparative analysis based on sex and gender may serve as a gateway to studying a given question, but women, men, girls, and boys do not form homogeneous populations. In the 1990s, intersectionality emerged as a new feminist framework. The popularity of the notion of intersectionality corresponds to a time when, in hindsight, the damage done to minoritized groups by state disengagement policies in concert with global intensification of neoliberalism, and vice versa, can be measured. Although GBA+ constitutes a form of intersectionalization of GBA, it must be understood that the two frameworks do not spring from a similar logic. Intersectionality first emerged as a dynamic analysis of differences between women that was oriented toward change and social justice, whereas GBA is a technique developed by state feminists in a context of analyzing governmental policies and aiming to promote equality between men and women. It can nevertheless be assumed that there might be interest in such a policy and program analysis grid that is decentred from gender and offers enough flexibility to take account of a group of inequalities. In terms of methodology, the research is supported by a qualitative analysis of governmental documents about GBA+ in Canada. Research findings identify links between Canadian gender policies and its political culture. In Canada, diversity has been taken into account as an element at the basis of gendered analysis of public policies since 1995. The GBA+ adopted by the government of Canada conveys an opening to intersectionality and a sensitivity to multiculturalism. The Canadian Multiculturalism Act, adopted 1988, proposes to recognize the fact that multiculturalism is a fundamental characteristic of the Canadian identity and heritage and constitutes an invaluable resource for the future of the country. In conclusion, Canada’s distinct political culture can be associated with the specific nature of its gender policies.Keywords: Canada, gender-based analysis, gender policies, political culture
Procedia PDF Downloads 222
1463 Development of Mineral Carbonation Process from Ultramafic Tailings, Enhancing the Reactivity of Feedstocks
Authors: Sara Gardideh, Mansoor Barati
Abstract:
The mineral carbonation approach to mitigating global warming has garnered worldwide interest. Owing to the benefits of permanent storage and abundant mineral resources, mineral carbonation (MC) is one of the most effective strategies for sequestering CO₂. Combining mineral processing for primary metal recovery with mineral carbonation for carbon sequestration is an emerging field of study with the potential to minimize capital costs. A detailed study of low-pressure solid carbonation of ultramafic tailings in a dry environment has been accomplished. Thermogravimetry was used to track the changing structure of serpentine minerals and their reactivity as a function of temperature (300-900 °C), CO₂ partial pressure (25-90 mol%), and thermal preconditioning. The mismatch between the CO₂ van der Waals molecular diameter and the octahedral-tetrahedral lattice constants of serpentine was used to explain the mild carbonation reactivity. Serpentine requires additional thermal treatment to remove hydroxyl groups, resulting in the chemical transformation to pseudo-forsterite, a mineral composed of isolated SiO₄ tetrahedra linked by octahedrally coordinated magnesium ions. Heat treatment above 850 °C is adequate to remove chemically bound water from the lattice. Particles with a diameter below 34 μm are desirable, and serpentine thermally treated at 850 °C for 2.30 hours reached 65% CO₂ storage capacity. Decreasing the particle size, increasing the temperature, and applying magnetic separation can dramatically enhance carbonation.
Keywords: particle size, thermogravimetry, thermal treatment, serpentine
Procedia PDF Downloads 91
1462 Bi-objective Network Optimization in Disaster Relief Logistics
Authors: Katharina Eberhardt, Florian Klaus Kaiser, Frank Schultmann
Abstract:
Last-mile distribution is one of the most critical parts of a disaster relief operation. Various uncertainties, such as infrastructure conditions, resource availability, and fluctuating beneficiary demand, render last-mile distribution challenging in disaster relief operations. The need to balance critical performance criteria like response time, meeting demand and cost-effectiveness further complicates the task. The occurrence of disasters cannot be controlled, and the magnitude is often challenging to assess. In summary, these uncertainties create a need for additional flexibility, agility, and preparedness in logistics operations. As a result, strategic planning and efficient network design are critical for an effective and efficient response. Furthermore, the increasing frequency of disasters and the rising cost of logistical operations amplify the need to provide robust and resilient solutions in this area. Therefore, we formulate a scenario-based bi-objective optimization model that integrates pre-positioning, allocation, and distribution of relief supplies extending the general form of a covering location problem. The proposed model aims to minimize underlying logistics costs while maximizing demand coverage. Using a set of disruption scenarios, the model allows decision-makers to identify optimal network solutions to address the risk of disruptions. We provide an empirical case study of the public authorities’ emergency food storage strategy in Germany to illustrate the potential applicability of the model and provide implications for decision-makers in a real-world setting. Also, we conduct a sensitivity analysis focusing on the impact of varying stockpile capacities, single-site outages, and limited transportation capacities on the objective value. The results show that the stockpiling strategy needs to be consistent with the optimal number of depots and inventory based on minimizing costs and maximizing demand satisfaction. The strategy has the potential for optimization, as network coverage is insufficient and relies on very high transportation and personnel capacity levels. As such, the model provides decision support for public authorities to determine an efficient stockpiling strategy and distribution network and provides recommendations for increased resilience. However, certain factors have yet to be considered in this study and should be addressed in future works, such as additional network constraints and heuristic algorithms.Keywords: humanitarian logistics, bi-objective optimization, pre-positioning, last mile distribution, decision support, disaster relief networks
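The abstract describes a scenario-based, bi-objective covering location model (minimize logistics cost, maximize demand coverage) without giving the formulation. As a rough, hypothetical illustration of how such a model can be scalarized and prototyped, the Python sketch below uses the open-source PuLP solver; the depots, demand nodes, costs, coverage matrix, and trade-off weight are invented for illustration and are not the authors' data.

```python
# Minimal weighted-sum sketch of a bi-objective covering location model
# (minimize depot opening cost, maximize covered demand). All data are hypothetical.
from pulp import LpProblem, LpMinimize, LpVariable, LpBinary, lpSum, value

depots = ["D1", "D2", "D3"]
nodes = ["N1", "N2", "N3", "N4"]
open_cost = {"D1": 100, "D2": 80, "D3": 120}            # cost of pre-positioning a depot
demand = {"N1": 40, "N2": 25, "N3": 60, "N4": 30}       # beneficiary demand per node
covers = {("D1", "N1"): 1, ("D1", "N2"): 1, ("D2", "N2"): 1,
          ("D2", "N3"): 1, ("D3", "N3"): 1, ("D3", "N4"): 1}  # 1 if depot can serve node

w = 0.5  # trade-off weight between cost and coverage
prob = LpProblem("relief_network", LpMinimize)
y = {j: LpVariable(f"open_{j}", cat=LpBinary) for j in depots}
z = {i: LpVariable(f"covered_{i}", cat=LpBinary) for i in nodes}

# A node counts as covered only if at least one opened depot can reach it.
for i in nodes:
    prob += z[i] <= lpSum(covers.get((j, i), 0) * y[j] for j in depots)

cost = lpSum(open_cost[j] * y[j] for j in depots)
coverage = lpSum(demand[i] * z[i] for i in nodes)
prob += w * cost - (1 - w) * coverage  # weighted-sum scalarization of the two objectives

prob.solve()
print("open depots:", [j for j in depots if y[j].value() == 1])
print("covered demand:", value(coverage), "cost:", value(cost))
```

Sweeping the weight w between 0 and 1 traces out the kind of cost/coverage trade-off curve a decision-maker would inspect when comparing network designs.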
Procedia PDF Downloads 79
1461 Glaucoma Detection in Retinal Tomography Using the Vision Transformer
Authors: Sushish Baral, Pratibha Joshi, Yaman Maharjan
Abstract:
Glaucoma is a chronic eye condition that causes vision loss that is irreversible. Early detection and treatment are critical to prevent vision loss because it can be asymptomatic. For the identification of glaucoma, multiple deep learning algorithms are used. Transformer-based architectures, which use the self-attention mechanism to encode long-range dependencies and acquire extremely expressive representations, have recently become popular. Convolutional architectures, on the other hand, lack knowledge of long-range dependencies in the image due to their intrinsic inductive biases. The aforementioned statements inspire this thesis to look at transformer-based solutions and investigate the viability of adopting transformer-based network designs for glaucoma detection. Using retinal fundus images of the optic nerve head to develop a viable algorithm to assess the severity of glaucoma necessitates a large number of well-curated images. Initially, data is generated by augmenting ocular pictures. After that, the ocular images are pre-processed to make them ready for further processing. The system is trained using pre-processed images, and it classifies the input images as normal or glaucoma based on the features retrieved during training. The Vision Transformer (ViT) architecture is well suited to this situation, as it allows the self-attention mechanism to utilise structural modeling. Extensive experiments are run on the common dataset, and the results are thoroughly validated and visualized.Keywords: glaucoma, vision transformer, convolutional architectures, retinal fundus images, self-attention, deep learning
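The abstract does not state which ViT variant or training regime was used. A minimal, assumption-laden sketch of fine-tuning a standard pre-trained ViT backbone for binary normal/glaucoma classification (here via the timm library, with random tensors standing in for the pre-processed fundus images) could look like this:

```python
# Minimal sketch: fine-tuning a Vision Transformer for binary glaucoma classification.
# The data, ViT variant, and hyperparameters are placeholders, not the authors' setup.
import torch
import timm

model = timm.create_model("vit_base_patch16_224", pretrained=True, num_classes=2)
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)
criterion = torch.nn.CrossEntropyLoss()

# Stand-in for a DataLoader over augmented, pre-processed fundus images (224x224 RGB).
images = torch.randn(8, 3, 224, 224)
labels = torch.randint(0, 2, (8,))           # 0 = normal, 1 = glaucoma

model.train()
logits = model(images)                        # self-attention encodes global retinal context
loss = criterion(logits, labels)
loss.backward()
optimizer.step()

model.eval()
with torch.no_grad():
    probs = torch.softmax(model(images), dim=1)[:, 1]   # predicted glaucoma probability
print(probs)
```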
Procedia PDF Downloads 191
1460 Application of GPRS in Water Quality Monitoring System
Authors: V. Ayishwarya Bharathi, S. M. Hasker, J. Indhu, M. Mohamed Azarudeen, G. Gowthami, R. Vinoth Rajan, N. Vijayarangan
Abstract:
Identification of water quality conditions in a river system based on limited observations is an essential task for meeting the goals of environmental management. The traditional method of water quality testing is to collect samples manually and then send them to a laboratory for analysis; however, this approach can no longer meet the demands of present-day water quality monitoring. An automatic measurement and reporting system for water quality has therefore been developed. In this project, water quality parameters collected by a multi-parameter probe are transmitted to a data processing and monitoring center through the GPRS wireless communication network of the mobile system. The multi-parameter sensor is placed directly above the water level. The monitoring center consists of a GPRS module and a micro-controller that monitor the data, and the collected data can be observed at any instant of time. At the pollution control board, the water quality sensor data are monitored on a computer using Visual Basic software. The system collects, transmits, and processes water quality parameters automatically, so production efficiency and economic benefit are improved greatly. GPRS technology performs well in complex environments where water quality would otherwise go unmonitored, and it is specifically applicable to automatic data transmission from the collection points, supporting field water analysis equipment for data transmission and monitoring.
Keywords: multiparameter sensor, GPRS, visual basic software, RS232
Procedia PDF Downloads 412
1459 Predicting Response to Cognitive Behavioral Therapy for Psychosis Using Machine Learning and Functional Magnetic Resonance Imaging
Authors: Eva Tolmeijer, Emmanuelle Peters, Veena Kumari, Liam Mason
Abstract:
Cognitive behavioral therapy for psychosis (CBTp) is effective in many but not all patients, making it important to better understand the factors that determine treatment outcomes. To date, no studies have examined whether neuroimaging can make clinically useful predictions about who will respond to CBTp. To this end, we used machine learning methods that make predictions about symptom improvement at the individual patient level. Prior to receiving CBTp, 22 patients with a diagnosis of schizophrenia completed a social-affective processing task during functional MRI. Multivariate pattern analysis assessed whether treatment response could be predicted by brain activation responses to facial affect that was either socially threatening or prosocial. The resulting models did significantly predict symptom improvement, with distinct multivariate signatures predicting psychotic (r=0.54, p=0.01) and affective (r=0.32, p=0.05) symptoms. Psychotic symptom improvement was accurately predicted from relatively focal threat-related activation across hippocampal, occipital, and temporal regions; affective symptom improvement was predicted by a more dispersed profile of responses to prosocial affect. These findings enrich our understanding of the neurobiological underpinning of treatment response. This study provides a foundation that will hopefully lead to greater precision and tailoring of the interventions offered to patients.Keywords: cognitive behavioral therapy, machine learning, psychosis, schizophrenia
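As an illustration of the kind of individual-level prediction pipeline described (multivariate activation patterns predicting symptom change, evaluated with cross-validation and a Pearson correlation between predicted and observed improvement), here is a hedged scikit-learn sketch on synthetic data; the model choice, regularization, and features are assumptions, not the study's actual MVPA setup.

```python
# Sketch of individual-level outcome prediction from multivoxel activation patterns.
# Data here are synthetic stand-ins; the study's features are task-fMRI responses.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import LeaveOneOut, cross_val_predict
from scipy.stats import pearsonr

rng = np.random.default_rng(0)
n_patients, n_voxels = 22, 500
X = rng.normal(size=(n_patients, n_voxels))          # activation to threat/prosocial faces
y = X[:, :10].sum(axis=1) + rng.normal(scale=2.0, size=n_patients)   # symptom change

# Leave-one-patient-out predictions, as is common with small clinical samples.
pred = cross_val_predict(Ridge(alpha=10.0), X, y, cv=LeaveOneOut())
r, p = pearsonr(y, pred)
print(f"predicted vs. observed improvement: r = {r:.2f}, p = {p:.3f}")
```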
Procedia PDF Downloads 274
1458 Perceiving Casual Speech: A Gating Experiment with French Listeners of L2 English
Authors: Naouel Zoghlami
Abstract:
Spoken-word recognition involves the simultaneous activation of potential word candidates which compete with each other for final correct recognition. In continuous speech, the activation-competition process gets more complicated due to speech reductions existing at word boundaries. Lexical processing is more difficult in L2 than in L1 because L2 listeners often lack phonetic, lexico-semantic, syntactic, and prosodic knowledge in the target language. In this study, we investigate the on-line lexical segmentation hypotheses that French listeners of L2 English form and then revise as subsequent perceptual evidence is revealed. Our purpose is to shed further light on the processes of L2 spoken-word recognition in context and better understand L2 listening difficulties through a comparison of skilled and unskilled reactions at the point where their working hypothesis is rejected. We use a variant of the gating experiment in which subjects transcribe an English sentence presented in increments of progressively greater duration. The spoken sentence was “And this amazing athlete has just broken another world record”, chosen mainly because it included common reductions and phonetic features in English, such as elision and assimilation. Our preliminary results show that there is an important difference in the manner in which proficient and less-proficient L2 listeners handle connected speech. Less-proficient listeners delay recognition of words as they wait for lexical and syntactic evidence to appear in the gates. Further statistical results are currently being undertaken.Keywords: gating paradigm, spoken word recognition, online lexical segmentation, L2 listening
Procedia PDF Downloads 464
1457 Denoising Transient Electromagnetic Data
Authors: Lingerew Nebere Kassie, Ping-Yu Chang, Hsin-Hua Huang, Chaw-Son Chen
Abstract:
Transient electromagnetic (TEM) data plays a crucial role in hydrogeological and environmental applications, providing valuable insights into geological structures and resistivity variations. However, the presence of noise often hinders the interpretation and reliability of these data. Our study addresses this issue by utilizing a FASTSNAP system for the TEM survey, which operates at different modes (low, medium, and high) with continuous adjustments to discretization, gain, and current. We employ a denoising approach that processes the raw data obtained from each acquisition mode to improve signal quality and enhance data reliability. We use a signal-averaging technique for each mode, increasing the signal-to-noise ratio. Additionally, we utilize wavelet transform to suppress noise further while preserving the integrity of the underlying signals. This approach significantly improves the data quality, notably suppressing severe noise at late times. The resulting denoised data exhibits a substantially improved signal-to-noise ratio, leading to increased accuracy in parameter estimation. By effectively denoising TEM data, our study contributes to a more reliable interpretation and analysis of underground structures. Moreover, the proposed denoising approach can be seamlessly integrated into existing ground-based TEM data processing workflows, facilitating the extraction of meaningful information from noisy measurements and enhancing the overall quality and reliability of the acquired data.Keywords: data quality, signal averaging, transient electromagnetic, wavelet transform
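The two denoising steps named in the abstract, signal averaging over repeated acquisitions followed by wavelet shrinkage, can be prototyped in a few lines. The sketch below uses synthetic decaying transients and the PyWavelets library; the wavelet, decomposition level, and threshold rule are common defaults, not the settings used in the study.

```python
# Sketch of the two-step denoising idea: stack averaging followed by wavelet thresholding.
# Synthetic decaying transients stand in for the FASTSNAP acquisitions.
import numpy as np
import pywt

rng = np.random.default_rng(1)
t = np.linspace(1e-5, 1e-2, 1024)
clean = np.exp(-t / 2e-3)                                    # idealized TEM decay
stack = clean + rng.normal(scale=0.05, size=(64, t.size))    # 64 noisy repeats in one mode

averaged = stack.mean(axis=0)                                # signal averaging raises the SNR

# Wavelet shrinkage: decompose, soft-threshold detail coefficients, reconstruct.
coeffs = pywt.wavedec(averaged, "db4", level=5)
sigma = np.median(np.abs(coeffs[-1])) / 0.6745               # robust noise estimate (finest scale)
thr = sigma * np.sqrt(2 * np.log(averaged.size))             # universal threshold
coeffs = [coeffs[0]] + [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
denoised = pywt.waverec(coeffs, "db4")[: averaged.size]

print("residual RMS:", np.sqrt(np.mean((denoised - clean) ** 2)))
```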
Procedia PDF Downloads 85
1456 The Internet of Things: A Survey of Authentication Mechanisms and Protocols for the Shifting Paradigm of Communicating Entities
Authors: Nazli Hardy
Abstract:
A multidisciplinary application of computer science and interactive database-driven web applications, the Internet of Things (IoT) represents a digital ecosystem that has a pervasive technological, social, and economic impact on the human population. It is a long-term technology, and its development is built around the connection of everyday objects to the Internet. It is estimated that by 2020, with billions of people connected to the Internet, the number of connected devices will exceed 50 billion; thus, IoT represents a paradigm shift in our current interconnected ecosystem, a communication shift that will unavoidably affect people, businesses, consumers, clients, and employees. By nature, in order to provide a cohesive and integrated service, connected devices need to collect, aggregate, store, mine, and process personal and personalized data on individuals and corporations in a variety of contexts and environments. A significant factor in this paradigm shift is the necessity for secure and appropriate transmission, processing, and storage of the data. Thus, while the benefits of the applications appear to be boundless, these same opportunities are bounded by concerns such as trust, privacy, security, loss of control, and related issues. This poster and presentation look at multi-factor authentication (MFA) mechanisms that need to evolve from the login-password tuple to an Identity and Access Management (IAM) model, and further to the more cohesive Identity Relationship Management (IRM) standard. It also compares and contrasts messaging protocols that are appropriate for the IoT ecosystem.
Keywords: Internet of Things (IoT), authentication, protocols, survey
Procedia PDF Downloads 299
1455 Evolutionary Analysis of Influenza A (H1N1) Pdm 09 in Post Pandemic Period in Pakistan
Authors: Nazish Badar
Abstract:
In early 2009, Pandemic type A (H1N1) Influenza virus emerged globally. Since then, it has continued circulation causing considerable morbidity and mortality. The purpose of this study was to evaluate the evolutionary changes in Influenza A (H1N1) pdm09 viruses from 2009-15 and their relevance with the current vaccine viruses. Methods: Respiratory specimens were collected with influenza-like illness and Severe Acute Respiratory Illness. Samples were processed according to CDC protocol. Sequencing and phylogenetic analysis of Haemagglutinin (HA) and neuraminidase (NA) genes was carried out comparing representative isolates from Pakistan viruses. Results: Between Jan2009 - Feb 2016, 1870 (13.2%) samples were positive for influenza A out of 14086. During the pandemic period (2009–10), Influenza A/ H1N1pdm 09 was the dominant strain with 366 (45%) of total influenza positives. In the post-pandemic period (2011–2016), a total of 1066 (59.6%) cases were positive Influenza A/ H1N1pdm 09 with co-circulation of different Influenza A subtypes. Overall, the Pakistan A(H1N1) pdm09 viruses grouped in two genetic clades. Influenza A(H1N1)pdm09 viruses only ascribed to Clade 7 during the pandemic period whereas viruses belong to clade 7 (2011) and clade 6B (2015) during the post-pandemic years. Amino acid analysis of the HA gene revealed mutations at positions S220T, I338V and P100S specially associated with outbreaks in all the analyzed strains. Sequence analyses of post-pandemic A(H1N1)pdm09 viruses showed additional substitutions at antigenic sites; S179N,K180Q (SA), D185N, D239G (CA), S202A (SB) and at receptor binding sites; A13T, S200P when compared with pandemic period. Substitution at Genetic markers; A273T (69%), S200P/T (15%) and D239G (7.6%) associated with severity and E391K (69%) associated with virulence was identified in viruses isolated during 2015. Analysis of NA gene revealed outbreak markers; V106I (23%) among pandemic and N248D (100%) during post-pandemic Pakistan viruses. Additional N-Glycosylation site; HA S179N (23%), NA I23T(7.6%) and N44S (77%) in place of N386K(77%) were only found in post-pandemic viruses. All isolates showed histidine (H) at position 275 in NA indicating sensitivity to neuraminidase inhibitors. Conclusion: This study shows that the Influenza A(H1N1)pdm09 viruses from Pakistan clustered into two genetic clades, with co-circulation of some variants. Certain key substitutions in the receptor binding site and few changes indicative of virulence were also detected in post-pandemic strains. Therefore, it is imperative to continue monitoring of the viruses for early identification of potential variants of high virulence or emergence of drug-resistant variants.Keywords: Influenza A (H1N1) pdm09, evolutionary analysis, post pandemic period, Pakistan
Procedia PDF Downloads 207
1454 The Time-Frequency Domain Reflection Method for Aircraft Cable Defects Localization
Authors: Reza Rezaeipour Honarmandzad
Abstract:
This paper introduces an aircraft cable fault detection and location method based on time-frequency domain reflectometry (TFDR), with the goal of recognizing intermittent faults adequately and coping with serial and after-connector faults that are hard to distinguish with time domain reflectometry. In this approach, the correlation function of the reflected and reference signals is used to detect and locate the cable fault according to the characteristics of the two signals in the time-frequency domain, so the hit rate for detecting and locating intermittent faults can be improved effectively. In practice, the reflected signal is corrupted by noise and false alarms occur frequently, so a threshold de-noising technique based on wavelet decomposition is used to reduce the noise interference and lower the false alarm rate. The time-frequency cross-correlation function of the reference signal and the reflected signal, based on the Wigner-Ville distribution, is then computed in order to locate the fault position. Finally, LabVIEW is used to implement the operation and control interface, whose primary function is to connect to and control MATLAB and LabSQL. Using the strong computing capability and rich function library of MATLAB, the signal processing is easily realized; in addition, LabVIEW helps make the system more reliable and easier to upgrade.
Keywords: aircraft cable, fault location, TFDR, LabVIEW
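The core localization idea, correlating the reference and reflected signals and reading the fault position from the correlation peak, can be illustrated with a simplified, purely time-domain sketch on synthetic signals. The sampling rate, propagation velocity, and chirp parameters below are assumptions, and the paper's actual method performs the correlation in the time-frequency (Wigner-Ville) domain, which is not reproduced here.

```python
# Simplified reflectometry sketch: locate a fault from the lag of the strongest echo.
import numpy as np
from scipy.signal import chirp, correlate

fs = 1e9                                     # 1 GS/s sampling rate (assumed)
v = 2e8                                      # propagation velocity in the cable, m/s (assumed)
t = np.arange(0, 2e-6, 1 / fs)
ref = chirp(t, f0=1e6, f1=50e6, t1=t[-1])    # injected reference chirp

delay_s = 4e-7                               # round-trip delay of the simulated fault echo
echo = 0.3 * np.roll(ref, int(delay_s * fs))
reflected = echo + np.random.default_rng(2).normal(scale=0.05, size=t.size)

xcorr = correlate(reflected, ref, mode="full")
lag = np.argmax(xcorr) - (len(ref) - 1)      # lag (in samples) of the strongest echo
distance = 0.5 * v * lag / fs                # one-way distance to the fault
print(f"estimated fault distance: {distance:.1f} m")   # expect roughly 40 m here
```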
Procedia PDF Downloads 476
1453 Efficacy of Carvacrol as an Antimicrobial Wash Treatment for Reducing Both Campylobacter jejuni and Aerobic Bacterial Counts on Chicken Skin
Authors: Sandip Shrestha, Ann M. Donoghue, Komala Arsi, Basanta R. Wagle, Abhinav Upadhyay, Dan J. Donoghue
Abstract:
Campylobacter, one of the major causes of foodborne illness worldwide, is commonly present in the intestinal tract of poultry. Many strategies are currently being investigated to reduce Campylobacter counts on commercial poultry during processing, with limited success. This study investigated the efficacy of the generally recognized as safe compound carvacrol (CR), a component of wild oregano oil, as a wash treatment for reducing C. jejuni and aerobic bacteria on chicken skin. A total of two trials were conducted, and in each trial, a total of 75 skin samples (4 cm × 4 cm each) were randomly allocated into 5 treatment groups (0%, 0.25%, 0.5%, 1% and 2% CR). Skin samples were inoculated with a cocktail of four wild strains of C. jejuni (~8 log₁₀ CFU/skin). After 30 min of attachment, inoculated skin samples were dipped in the respective treatment solution for 1 min, allowed to drip dry for 2 min, and processed at 0, 8, and 24 h post treatment for enumeration of C. jejuni and aerobic bacterial counts (n=5/treatment/time point). The data were analyzed by ANOVA using the PROC GLM procedure of SAS 9.3. All the tested doses of CR suspension consistently reduced C. jejuni counts across all time points. The 2% CR wash was the most effective treatment and reduced C. jejuni counts by ~4 log₁₀ CFU/sample (P < 0.05). Aerobic counts were reduced for the 0.5% CR dose at 0 and 24 h in Trial 1 and at 0, 8, and 24 h in Trial 2. The 1 and 2% CR doses consistently reduced aerobic counts in both trials by up to 2 log₁₀ CFU/skin.
Keywords: Campylobacter jejuni, carvacrol, chicken skin, postharvest
Procedia PDF Downloads 181
1452 Localized Analysis of Cellulosic Fibrous Insulation Materials
Authors: Chady El Hachem, Pan Ye, Kamilia Abahri, Rachid Bennacer
Abstract:
Considered as a building construction material, and in view of its environmental benefits, wood fiber insulation is the material of interest in this work. The definition of an adequate representative elementary volume that guarantees a reliable understanding of the macroscopic hygrothermal phenomena is critical. At the microscopic scale, when subjected to hygric solicitations, the fibers undergo local dimensional variations. It is therefore necessary to master this behavior, which affects the global response of the material. This study consists of an experimental procedure using a non-destructive method, X-ray tomography, followed by morphological post-processing analysis using the ImageJ software. A refined investigation was carried out in order to identify the representative elementary volume and the resolution sufficient for accurate structural analysis. The second part of this work was to evaluate the microscopic hygric behavior of the studied material. Several parameters were taken into consideration, such as the evolution of the fiber diameters and their distribution along the sorption cycle, the porosity, and the evolution of the water content. In addition, heat transfer simulations based on the resolution of the energy equation were carried out on the real structure. Furthermore, the problem of the representative elementary volume was elaborated for such a heterogeneous material. Moreover, the material's porosity and its fibers' thicknesses show a strong correlation with the water content. These results provide the literature with a good understanding of the behavior of wood fiber insulation.
Keywords: hygric behavior, morphological characterization, wood fiber insulation material, x-ray tomography
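The representative-elementary-volume question raised in the abstract is often checked by watching a property such as porosity converge as the analysis window grows. A minimal sketch of that check on a segmented (binary) tomographic volume is given below; the random volume and window sizes are placeholders, not the study's ImageJ workflow.

```python
# Minimal sketch: estimating porosity from a segmented (binary) tomographic volume
# and checking convergence as the analysis window grows (the REV question).
import numpy as np

rng = np.random.default_rng(5)
volume = rng.random((200, 200, 200)) > 0.35      # True = pore voxel, False = fibre voxel

for size in (25, 50, 100, 200):
    sub = volume[:size, :size, :size]
    porosity = sub.mean()                         # fraction of pore voxels in the window
    print(f"window {size}^3 voxels -> porosity = {porosity:.3f}")
# Porosity stabilizing with window size suggests the window has reached a
# representative elementary volume for this property.
```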
Procedia PDF Downloads 267
1451 Wireless Information Transfer Management and Case Study of a Fire Alarm System in a Residential Building
Authors: Mohsen Azarmjoo, Mehdi Mehdizadeh Koupaei, Maryam Mehdizadeh Koupaei, Asghar Mahdlouei Azar
Abstract:
The increasing prevalence of wireless networks in our daily lives has made them indispensable. The aim of this research is to investigate the management of information transfer in wireless networks and the integration of renewable solar energy resources in a residential building. The focus is on the transmission of electricity and information through wireless networks, as well as the utilization of sensors and wireless fire alarm systems. The research employs a descriptive approach to examine the transmission of electricity and information on a wireless network with electric and optical telephone lines. It also investigates the transmission of signals from sensors and wireless fire alarm systems via radio waves. The methodology includes a detailed analysis of security, comfort conditions, and costs related to the utilization of wireless networks and renewable solar energy resources. The study reveals that it is feasible to transmit electricity on a network cable using two pairs of network cables without the need for separate power cabling. Additionally, the integration of renewable solar energy systems in residential buildings can reduce dependence on traditional energy carriers. The use of sensors and wireless remote information processing can enhance the safety and efficiency of energy usage in buildings and the surrounding spaces.Keywords: renewable energy, intelligentization, wireless sensors, fire alarm system
Procedia PDF Downloads 54
1450 Morphological and Molecular Abnormalities of the Skeletal Muscle Tissue from Pediatric Patient Affected by a Rare Genetic Chaperonopathy Associated with Motor Neuropathy
Authors: Leila Noori, Rosario Barone, Francesca Rappa, Antonella Marino Gammazza, Alessandra Maria Vitale, Giuseppe Donato Mangano, Giusy Sentiero, Filippo Macaluso, Kathryn H. Myburgh, Francesco Cappello, Federica Scalia
Abstract:
The neuromuscular system controls, directs, and allows movement of the body through the action of neural circuits, which include motor neurons, sensory neurons, and skeletal muscle fibers. Protein homeostasis of the involved cytotypes appears crucial to maintain the correct and prolonged functions of the neuromuscular system, and both neuronal cells and skeletal muscle fibers express significant quantities of protein chaperones, the molecular machinery responsible to maintain the protein turnover. Genetic mutations or defective post-translational modifications of molecular chaperones (i.e., genetic or acquired chaperonopathies) may lead to neuromuscular disorders called as neurochaperonopathies. The limited knowledge of the effects of the defective chaperones on skeletal muscle fibers and neurons impedes the progression of therapeutic approaches. A distinct genetic variation of CCT5 gene encoding for the subunit 5 of the chaperonin CCT (Chaperonin Containing TCP1; also known as TRiC, TCP1 Ring Complex) was recently described associated with severe distal motor neuropathy by our team. In this study, we investigated the histopathological abnormalities of the skeletal muscle biopsy of the pediatric patient affected by the mutation Leu224Val in the CCT5 subunit. We provide molecular and structural features of the diseased skeletal muscle tissue that we believe may be useful to identify undiagnosed cases of this rare genetic disorder. We investigated the histological abnormalities of the affected tissue via hematoxylin and eosin staining. Then we used immunofluorescence and qPCR techniques to explore the expression and distribution of CCT5 in diseased and healthy skeletal muscle tissue. Immunofluorescence and immunohistochemistry assays were performed to study the sarcomeric and structural proteins of skeletal muscle, including actin, myosin, tubulin, troponin-T, telethonin, and titin. We performed Western blot to examine the protein expression of CCT5 and some heat shock proteins, Hsp90, Hsp60, Hsp27, and α-B crystallin, along with the main client proteins of the CCT5, actin, and tubulin. Our findings revealed muscular atrophy, abnormal morphology, and different sizes of muscle fibers in affected tissue. The swollen nuclei and wide interfiber spaces were seen. Expression of CCT5 had been decreased and showed a different distribution pattern in the affected tissue. Altered expression, distribution, and bandage pattern were detected by confocal microscopy for the interested muscular proteins in tissue from the patient compared to the healthy control. Protein levels of the studied Hsps normally located at the Z-disk were reduced. Western blot results showed increased levels of the actin and tubulin proteins in the diseased skeletal muscle biopsy compared to healthy tissue. Chaperones must be expressed at high levels in skeletal muscle to counteract various stressors such as mechanical, oxidative, and thermal crises; therefore, it seems relevant that defects of molecular chaperones may result in damaged skeletal muscle fibers. So far, several chaperones or cochaperones involved in neuromuscular disorders have been defined. Our study shows that alteration of the CCT5 subunit is associated with the damaged structure of skeletal muscle fibers and alterations of chaperone system components and paves the way to explore possible alternative substrates of chaperonin CCT. 
However, further studies are underway to investigate the CCT mechanisms of action to design applicable therapeutic strategies.
Keywords: molecular chaperones, neurochaperonopathy, neuromuscular system, protein homeostasis
Procedia PDF Downloads 71
1449 Perceiving Interpersonal Conflict and the Big Five Personality Traits
Authors: Emily Rivera, Toni DiDona
Abstract:
The Big Five personality traits is a hierarchical classification of personality traits that applies factor analysis to a personality survey data in order to describe human personality using five broad dimensions: Extraversion, Agreeableness, Conscientiousness, Neuroticism, and Openness (Fetvadjiev & Van de Vijer, 2015). Research shows that personality constructs underline individual differences in processing conflict and interpersonal relations. (Graziano et al., 1996). This research explores the understudied correlation between the Big Five personality traits and perceived interpersonal conflict in the workplace. It revises social psychological literature on Big Five personality traits within a social context and discusses organizational development journal articles on the perceived efficacy of conflict tactics and approach to interpersonal relationships. The study also presents research undertaken on a survey group of 867 subjects over the age of 18 that were recruited by means of convenience sampling through social media, email, and text messaging. The central finding of this study is that only two of the Big Five personality traits had a significant correlation with perceiving interpersonal conflict in the workplace. Individuals who score higher on agreeableness and neuroticism, perceive more interpersonal conflict in the workplace compared to those that score lower on each dimension. The relationship between both constructs is worthy of research due to its everyday frequency and unique individual psycho-social consequences. This multimethod research associated the Big Five personality dimensions to interpersonal conflict. Its findings that can be utilized to further understand social cognition, person perception, complex social behavior and social relationships in the work environment.Keywords: five-factor model, interpersonal conflict, personality, The Big Five personality traits
Procedia PDF Downloads 157
1448 Effects of Fermentation Techniques on the Quality of Cocoa Beans
Authors: Monday O. Ale, Adebukola A. Akintade, Olasunbo O. Orungbemi
Abstract:
Fermentation as an important operation in the processing of cocoa beans is now affected by the recent climate change across the globe. The major requirement for effective fermentation is the ability of the material used to retain sufficient heat for the required microbial activities. Apart from the effects of climate on the rate of heat retention, the materials used for fermentation plays an important role. Most Farmers still restrict fermentation activities to the use of traditional methods. Improving on cocoa fermentation in this era of climate change makes it necessary to work on other materials that can be suitable for cocoa fermentation. Therefore, the objective of this study was to determine the effects of fermentation techniques on the quality of cocoa beans. The materials used in this fermentation research were heap-leaves (traditional), stainless steel, plastic tin, plastic basket and wooden box. The period of fermentation varies from zero days to 10 days. Physical and chemical tests were carried out for variables in quality determination in the samples. The weight per bean varied from 1.0-1.2 g after drying across the samples and the major color of the dry beans observed was brown except with the samples from stainless steel. The moisture content varied from 5.5-7%. The mineral content and the heavy metals decreased with increase in the fermentation period. A wooden box can conclusively be used as an alternative to heap-leaves as there was no significant difference in the physical features of the samples fermented with the two methods. The use of a wooden box as an alternative for cocoa fermentation is therefore recommended for cocoa farmers.Keywords: fermentation, effects, fermentation materials, period, quality
Procedia PDF Downloads 207
1447 Experimental Study on Mechanical Properties of Commercially Pure Copper Processed by Severe Plastic Deformation Technique-Equal Channel Angular Extrusion
Authors: Krishnaiah Arkanti, Ramulu Malothu
Abstract:
The experiments have been conducted to study the mechanical properties of commercially pure copper processing at room temperature by severe plastic deformation using equal channel angular extrusion (ECAE) through a die of 90oangle up to 3 passes by route BC i.e. rotating the sample in the same direction by 90o after each pass. ECAE is used to produce from existing coarse grains to ultra-fine, equiaxed grains structure with high angle grain boundaries in submicron level by introducing a large amount of shear strain in the presence of hydrostatic pressure into the material without changing billet shape or dimension. Mechanical testing plays an important role in evaluating fundamental properties of engineering materials as well as in developing new materials and in controlling the quality of materials for use in design and construction. Yield stress, ultimate tensile stress and ductility are structure sensitive properties and vary with the structure of the material. Microhardness and tensile tests were carried out to evaluate the hardness, strength and ductility of the ECAE processed materials. The results reveal that the strength and hardness of commercially pure copper samples improved significantly without losing much ductility after each pass.Keywords: equal channel angular extrusion, severe plastic deformation, copper, mechanical properties
Procedia PDF Downloads 189
1446 A Sectional Control Method to Decrease the Accumulated Survey Error of Tunnel Installation Control Network
Authors: Yinggang Guo, Zongchun Li
Abstract:
In order to decrease the accumulated survey error of tunnel installation control network of particle accelerator, a sectional control method is proposed. Firstly, the accumulation rule of positional error with the length of the control network is obtained by simulation calculation according to the shape of the tunnel installation-control-network. Then, the RMS of horizontal positional precision of tunnel backbone control network is taken as the threshold. When the accumulated error is bigger than the threshold, the tunnel installation control network should be divided into subsections reasonably. On each segment, the middle survey station is taken as the datum for independent adjustment calculation. Finally, by taking the backbone control points as faint datums, the weighted partial parameters adjustment is performed with the adjustment results of each segment and the coordinates of backbone control points. The subsections are jointed and unified into the global coordinate system in the adjustment process. An installation control network of the linac with a length of 1.6 km is simulated. The RMS of positional deviation of the proposed method is 2.583 mm, and the RMS of the difference of positional deviation between adjacent points reaches 0.035 mm. Experimental results show that the proposed sectional control method can not only effectively decrease the accumulated survey error but also guarantee the relative positional precision of the installation control network. So it can be applied in the data processing of tunnel installation control networks, especially for large particle accelerators.Keywords: alignment, tunnel installation control network, accumulated survey error, sectional control method, datum
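The first step described, obtaining the accumulation rule of positional error with network length by simulation, can be illustrated with a toy Monte Carlo in which independent per-station errors are summed and the RMS is compared against a sectioning threshold. The station count, per-station error, and threshold below are illustrative assumptions, not the linac network's values.

```python
# Toy Monte Carlo of how transverse positional error accumulates along an elongated
# control network, and where a sectioning threshold would be triggered.
import numpy as np

rng = np.random.default_rng(3)
n_stations, n_trials = 200, 5000
sigma_station = 0.05                     # mm of random transverse error added per station

# Error propagates as a cumulative sum of independent per-station contributions.
walks = np.cumsum(rng.normal(scale=sigma_station, size=(n_trials, n_stations)), axis=1)
rms = np.sqrt(np.mean(walks ** 2, axis=0))            # RMS error versus network length

threshold = 0.5                                       # mm, allowed accumulated RMS
section_end = int(np.argmax(rms > threshold))         # first station exceeding the threshold
print(f"RMS at far end: {rms[-1]:.2f} mm; start a new section at station {section_end}")
```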
Procedia PDF Downloads 191
1445 Coping with Incompatible Identities in Russia: Case of Orthodox Gays
Authors: Siuzan Uorner
Abstract:
The era of late modernity is characterized, on the one hand, by social disintegration, values of personal freedom, tolerance, and self-expression. Boundaries between the accessible and the elitist, normal and abnormal are blurring. On the other hand, traditional social institutions, such as religion (especially Russian Orthodox Church), exist, criticizing lifestyle and worldview other than conventionally structured canons. Despite the declared values and opportunities in late modern society, people's freedom is ambivalent. Personal identity and its aspects are becoming a subject of choice. Hence, combinations of identity aspects can be incompatible. Our theoretical framework is based on P. Ricoeur's concept of narrative identity and hermeneutics, E. Goffman’s theory of social stigma, self-presentation, discrepant roles and W. James lectures about varieties of religious experience. This paper aims to reconstruct ways of coping with incompatible identities of Orthodox gays (an extreme sampling of a combination of sexual orientation and religious identity in a heteronormative society). This study focuses on the discourse of Orthodox gay parishioners and ROC gay priests in Russia (sampling ‘hard to reach’ populations because of the secrecy of gay community in ROC and sensitivity of the topic itself). We conducted a qualitative research design, using in-depth personal semi-structured online-interviews. Recruiting of informants took place in 'Nuntiare et Recreare' (Russian movement of religious LGBT) page in VKontakte through the post with an invitation to participate in the research. In this work, we analyzed interview transcripts using axial coding. We chose the Grounded Theory methodology to construct a theory from empirical data and contribute to the growing body of knowledge in ways of harmonizing incompatible identities in late modern societies. The research has found that there are two types of conflicts Orthodox gays meet with: canonic contradictions (postulates of Scripture and its interpretations) and problems in social interaction, mainly with ROC priests and Orthodox parishioners. We have revealed semantic meanings of most commonly used words that appear in the narratives (words such as ‘love’, ‘sin’, ‘religion’ etc.). Finally, we have reconstructed biographical patterns of LGBT social movements’ involvement. This paper argues that all incompatibilities are harmonizing in the narrative itself. As Ricoeur has suggested, the narrative configuration allows the speaker to gather facts and events together and to compose causal relationships between them. Sexual orientation and religious identity are getting along and harmonizing in the narrative.Keywords: gay priests, incompatible identities, narrative identity, Orthodox gays, religious identity, ROC, sexual orientation
Procedia PDF Downloads 137
1444 Artificial Intelligent-Based Approaches for Task Offloading, Resource Allocation and Service Placement of Internet of Things Applications: State of the Art
Authors: Fatima Z. Cherhabil, Mammar Sedrati, Sonia-Sabrina Bendib
Abstract:
In order to support the continued growth, critical latency of IoT applications, and various obstacles of traditional data centers, mobile edge computing (MEC) has emerged as a promising solution that extends cloud data-processing and decision-making to edge devices. By adopting a MEC structure, IoT applications could be executed locally, on an edge server, different fog nodes, or distant cloud data centers. However, we are often faced with wanting to optimize conflicting criteria such as minimizing energy consumption of limited local capabilities (in terms of CPU, RAM, storage, bandwidth) of mobile edge devices and trying to keep high performance (reducing response time, increasing throughput and service availability) at the same time. Achieving one goal may affect the other, making task offloading (TO), resource allocation (RA), and service placement (SP) complex processes. It is a nontrivial multi-objective optimization problem to study the trade-off between conflicting criteria. The paper provides a survey on different TO, SP, and RA recent multi-objective optimization (MOO) approaches used in edge computing environments, particularly artificial intelligent (AI) ones, to satisfy various objectives, constraints, and dynamic conditions related to IoT applications.Keywords: mobile edge computing, multi-objective optimization, artificial intelligence approaches, task offloading, resource allocation, service placement
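As a minimal illustration of the weighted-sum style of multi-objective trade-off the survey discusses (for example, choosing where a task runs while balancing latency against energy), here is a toy Python sketch; the placement options and their costs are invented, and any real MOO approach from the surveyed literature would be considerably richer.

```python
# Toy weighted-sum trade-off for a single task: execute locally, on an edge server,
# or in the cloud. Energy/latency numbers are invented purely to illustrate the idea.
options = {
    "local": {"latency_ms": 120.0, "energy_mj": 900.0},
    "edge":  {"latency_ms": 35.0,  "energy_mj": 250.0},
    "cloud": {"latency_ms": 80.0,  "energy_mj": 150.0},
}

def scalarized_cost(opt, w_latency=0.6, w_energy=0.4):
    # Normalize each criterion by its worst case so the weights are comparable.
    max_l = max(o["latency_ms"] for o in options.values())
    max_e = max(o["energy_mj"] for o in options.values())
    return w_latency * opt["latency_ms"] / max_l + w_energy * opt["energy_mj"] / max_e

best = min(options, key=lambda name: scalarized_cost(options[name]))
print("chosen placement:", best)
```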
Procedia PDF Downloads 115
1443 Laser Shock Peening of Additively Manufactured Nickel-Based Superalloys
Authors: Michael Munther, Keivan Davami
Abstract:
One significant roadblock for additively manufactured (AM) parts is the buildup of residual tensile stresses during the fabrication process. These residual stresses are formed due to the intense localized thermal gradients and high cooling rates that cause non-uniform material expansion/contraction and mismatched strain profiles during powder-bed fusion techniques, such as direct metal laser sintering (DMLS). The residual stresses adversely affect the fatigue life of the AM parts. Moreover, if the residual stresses become higher than the material’s yield strength, they will lead to acute geometric distortion. These are limiting the applications and acceptance of AM components for safety-critical applications. Herein, we discuss laser shock peening method as an advanced technique for the manipulation of the residual stresses in AM parts. An X-ray diffraction technique is used for the measurements of the residual stresses before and after the laser shock peening process. Also, the hardness of the structures is measured using a nanoindentation technique. Maps of nanohardness and modulus are obtained from the nanoindentation, and a correlation is made between the residual stresses and the mechanical properties. The results indicate that laser shock peening is able to induce compressive residual stresses in the structure that mitigate the tensile residual stresses and increase the hardness of AM IN718, a superalloy, almost 20%. No significant changes were observed in the modulus after laser shock peening. The results strongly suggest that laser shock peening can be used as an advanced post-processing technique to optimize the service lives of critical components for various applications.Keywords: additive manufacturing, Inconel 718, laser shock peening, residual stresses
Procedia PDF Downloads 127
1442 3D Object Retrieval Based on Similarity Calculation in 3D Computer Aided Design Systems
Authors: Ahmed Fradi
Abstract:
Nowadays, recent technological advances in the acquisition, modeling, and processing of three-dimensional (3D) objects data lead to the creation of models stored in huge databases, which are used in various domains such as computer vision, augmented reality, game industry, medicine, CAD (Computer-aided design), 3D printing etc. On the other hand, the industry is currently benefiting from powerful modeling tools enabling designers to easily and quickly produce 3D models. The great ease of acquisition and modeling of 3D objects make possible to create large 3D models databases, then, it becomes difficult to navigate them. Therefore, the indexing of 3D objects appears as a necessary and promising solution to manage this type of data, to extract model information, retrieve an existing model or calculate similarity between 3D objects. The objective of the proposed research is to develop a framework allowing easy and fast access to 3D objects in a CAD models database with specific indexing algorithm to find objects similar to a reference model. Our main objectives are to study existing methods of similarity calculation of 3D objects (essentially shape-based methods) by specifying the characteristics of each method as well as the difference between them, and then we will propose a new approach for indexing and comparing 3D models, which is suitable for our case study and which is based on some previously studied methods. Our proposed approach is finally illustrated by an implementation, and evaluated in a professional context.Keywords: CAD, 3D object retrieval, shape based retrieval, similarity calculation
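One classical shape-based similarity measure of the kind the proposal builds on is the D2 shape distribution: a histogram of distances between random point pairs, compared between models. The sketch below is a generic, assumption-laden illustration on random point clouds, not the approach actually developed in the paper.

```python
# Sketch of a shape-based similarity measure: the D2 shape distribution
# (histogram of distances between random point pairs), compared with an L1 distance.
import numpy as np

rng = np.random.default_rng(4)

def d2_descriptor(points, n_pairs=20000, bins=64):
    i = rng.integers(0, len(points), size=n_pairs)
    j = rng.integers(0, len(points), size=n_pairs)
    d = np.linalg.norm(points[i] - points[j], axis=1)
    hist, _ = np.histogram(d / d.max(), bins=bins, range=(0.0, 1.0), density=True)
    return hist / hist.sum()

def similarity(p, q):
    return 1.0 - 0.5 * np.abs(p - q).sum()        # 1 = identical distributions

sphere = rng.normal(size=(2000, 3))
sphere /= np.linalg.norm(sphere, axis=1, keepdims=True)   # points on a unit sphere
box = rng.uniform(-1, 1, size=(2000, 3))                  # points in a cube

print("sphere vs. sphere:", similarity(d2_descriptor(sphere), d2_descriptor(sphere)))
print("sphere vs. box:   ", similarity(d2_descriptor(sphere), d2_descriptor(box)))
```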
Procedia PDF Downloads 262
1441 Strategic Interventions to Combat Socio-economic Impacts of Drought in Thar - A Case Study of Nagarparkar
Authors: Anila Hayat
Abstract:
Pakistan is one of those developing countries that are least involved in emissions but has the most vulnerable environmental conditions. Pakistan is ranked 8th in most affected countries by climate change on the climate risk index 1992-2011. Pakistan is facing severe water shortages and flooding as a result of changes in rainfall patterns, specifically in the least developed areas such as Tharparkar. Nagarparkar, once an attractive tourist spot located in Tharparkar because of its tropical desert climate, is now facing severe drought conditions for the last few decades. This study investigates the present socio-economic situation of local communities, major impacts of droughts and their underlying causes and current mitigation strategies adopted by local communities. The study uses both secondary (quantitative in nature) and primary (qualitative in nature) methods to understand the impacts and explore causes on the socio-economic life of local communities of the study area. The relevant data has been collected through household surveys using structured questionnaires, focus groups and in-depth interviews of key personnel from local and international NGOs to explore the sensitivity of impacts and adaptation to droughts in the study area. This investigation is limited to four rural communities of union council Pilu of Nagarparkar district, including Bheel, BhojaBhoon, Mohd Rahan Ji Dhani and Yaqub Ji Dhani villages. The results indicate that drought has caused significant economic and social hardships for the local communities as more than 60% of the overall population is dependent on rainfall which has been disturbed by irregular rainfall patterns. The decline in Crop yields has forced the local community to migrate to nearby areas in search of livelihood opportunities. Communities have not undertaken any appropriate adaptive actions to counteract the adverse effect of drought; they are completely dependent on support from the government and external aid for survival. Respondents also reported that poverty is a major cause of their vulnerability to drought. An increase in population, limited livelihood opportunities, caste system, lack of interest from the government sector, unawareness shaped their vulnerability to drought and other social issues. Based on the findings of this study, it is recommended that the local authorities shall create awareness about drought hazards and improve the resilience of communities against drought. It is further suggested to develop, introduce and implement water harvesting practices at the community level to promote drought-resistant crops.Keywords: migration, vulnerability, awareness, Drought
Procedia PDF Downloads 132
1440 The Effect of Alkaline Treatment on Tensile Strength and Morphological Properties of Kenaf Fibres for Yarn Production
Authors: A. Khalina, K. Shaharuddin, M. S. Wahab, M. P. Saiman, H. A. Aisyah
Abstract:
This paper investigates the effect of alkali treatment and mechanical properties of kenaf (Hibiscus cannabinus) fibre for the development of yarn. Two different fibre sources are used for the yarn production. Kenaf fibres were treated with sodium hydroxide (NaOH) in the concentration of 3, 6, 9, and 12% prior to fibre opening process and tested for their tensile strength and Young’s modulus. Then, the selected fibres were introduced to fibre opener at three different opening processing parameters; namely, speed of roller feeder, small drum, and big drum. The diameter size, surface morphology, and fibre durability towards machine of the fibres were characterized. The results show that concentrations of NaOH used have greater effects on fibre mechanical properties. From this study, the tensile and modulus properties of the treated fibres for both types have improved significantly as compared to untreated fibres, especially at the optimum level of 6% NaOH. It is also interesting to highlight that 6% NaOH is the optimum concentration for the alkaline treatment. The untreated and treated fibres at 6% NaOH were then introduced to fibre opener, and it was found that the treated fibre produced higher fibre diameter with better surface morphology compared to the untreated fibre. Higher speed parameter during opening was found to produce higher yield of opened-kenaf fibres.Keywords: alkaline treatment, kenaf fibre, tensile strength, yarn production
Procedia PDF Downloads 246
1439 Increasing the Efficiency of the Biomass Gasification Technology Using the Organic Rankine Cycle
Authors: Jaroslav Frantík, Jan Najser
Abstract:
This article deals with increasing the energy efficiency of a plant by optimizing the process. The European Union is striving to meet the climate-energy package targets in the area of energy efficiency, with the goal of reducing primary energy consumption within the EU by 20% by 2020. The energy saving objective for the Czech Republic was set at 47.84 PJ (13.29 TWh). To reduce electricity consumption, it is possible to choose: a) mandatory increases in energy efficiency, b) an alternative scheme, or c) a combination of both actions. The Czech Republic has chosen the alternative scheme for reducing electricity consumption. The presentation focuses on the proposal of a technological unit based on the gasification of biomass with increased power output. The synthesis gas produced by gasifying the biomass is used as fuel in a cogeneration process in a reciprocating internal combustion engine, with conventional production of heat and electricity. Subsequently, the ORC system is explained, which converts waste heat to electricity using a closed steam-cycle process with an organic working medium. The resulting electricity is distributed to the power grid as a further energy source, or it is used to partially cover the needs of the technological unit. Furthermore, a schematic description of the technology is presented, identifying the energy flows from biomass treatment by drying, through its conversion to gaseous fuel, to the production of electricity and the utilization of thermal energy while minimizing losses. It has been found that using the ORC system increased the efficiency of the produced electricity by 7.5%.
Keywords: biomass, efficiency, gasification, ORC system
Procedia PDF Downloads 217
1438 Mobile Traffic Management in Congested Cells using Fuzzy Logic
Authors: A. A. Balkhi, G. M. Mir, Javid A. Sheikh
Abstract:
To cater to the demands of increasing traffic and new applications, cellular mobile networks face changes in infrastructure deployment that make them heterogeneous. To reduce processing overhead, densely deployed cells require smart behavior with self-organizing capabilities and high adaptation to their neighborhood. We propose the self-organization of unused resources, typically the excess unused channels of neighbouring cells, with densely populated cells in order to reduce handover failure rates. Neighboring cells share unused channels after fulfilling a conditional candidature criterion based on threshold values, so that they do not themselves suffer channel starvation in case of an abrupt change in traffic pattern. Cells are classified as ‘red’, ‘yellow’, or ‘green’ according to the number of available channels in the cell, which is governed by the traffic pattern and the thresholds. To combat the channel deficiency in a red cell, the migration of unused channels from under-loaded cells, taken hierarchically from the qualified candidate neighboring cells, is explored. The resources are returned when the congested cell is again capable of self-contained traffic management. In either case, conditional sharing of resources is executed for enhanced traffic management so that User Equipment (UE) is provided uninterrupted service with high Quality of Service (QoS). The fuzzy logic-based simulation results show that the proposed algorithm efficiently improves the rate of successful handoffs.
Keywords: candidate cell, channel sharing, fuzzy logic, handover, small cells
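A stripped-down, non-authoritative sketch of the colour classification and channel-borrowing idea is given below; the thresholds, the simple fuzzy suitability membership, and the cell data are invented for illustration and do not correspond to the paper's fuzzy rule base.

```python
# Toy sketch of the colour classification and channel-borrowing idea.
def classify(free_channels, low=5, high=15):
    if free_channels < low:
        return "red"           # congested, needs to borrow
    if free_channels <= high:
        return "yellow"        # borderline, keeps its own channels
    return "green"             # under-loaded, candidate donor

def donor_suitability(free_channels, high=15, capacity=50):
    # Simple fuzzy membership: 0 at the 'green' threshold, 1 when fully idle.
    return max(0.0, min(1.0, (free_channels - high) / (capacity - high)))

cells = {"A": 2, "B": 28, "C": 40, "D": 12}    # free channels per cell (illustrative)

for name, free in cells.items():
    if classify(free) != "red":
        continue
    # Rank green neighbours by fuzzy suitability and borrow from the best one.
    donors = {n: donor_suitability(f) for n, f in cells.items()
              if n != name and classify(f) == "green"}
    if donors:
        best = max(donors, key=donors.get)
        print(f"cell {name} borrows spare channels from cell {best} "
              f"(suitability {donors[best]:.2f})")
```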
Procedia PDF Downloads 120
1437 Client Hacked Server
Authors: Bagul Abhijeet
Abstract:
Background: The client-server model is the backbone of today’s Internet communication, in which a normal user cannot take control of a particular website or server. Using the same processing model, however, one can gain unauthorized access to a particular server. In this paper, we discuss an application scenario for hacking a simple website or server that consists of gaining unauthorized access to the server database. The application autonomously takes direct access to a simple website or server and retrieves all the essential information maintained by the administrator. In this system, the IP address of the server is given as input in order to retrieve the user-id and password of the server. This breaks the administrative security of the server and acquires control of the server database, whereas a virus helps to bypass server security by crashing the whole server. Objective: To control malicious attacks, to protect government websites, and to identify illegal hacker activity. Results: After implementing different hacking as well as non-hacking techniques, the system hacks simple websites with normal security credentials. It provides access to the server database and allows the attacker to perform database operations from the client machine. The experimental results of this application on different servers were satisfactory, as required. Conclusion: In this paper, we have presented a view of how to hack a server, including some hacking as well as non-hacking methods. These algorithms and methods provide an efficient way to access a server database. Breaking the network security in this way allows new and better security frameworks to be introduced. The term “hacking” should not only be considered for its illegal activities but should also be used to strengthen our global network.
Keywords: Hacking, Vulnerabilities, Dummy request, Virus, Server monitoring
Procedia PDF Downloads 251
1436 Music Training as an Innovative Approach to the Treatment of Language Disabilities
Authors: Jonathan Bolduc
Abstract:
Studies have demonstrated the effectiveness of music training approaches to help children with language disabilities. Because music is closely associated with a number of cognitive functions, including language, it has been hypothesized that musical skills transfer to other domains. Research suggests that music training strengthens basic auditory processing skills in dyslexic children and may ameliorate phonological deficits. Furthermore, music instruction has the particular advantage of being non-literacy-based, thus removing the frustrations that can be associated with reading and writing activities among children with specific learning disabilities. In this study, we assessed the effect of implementing an intensive music program on the development of language skills (phonological and reading) in 4- to 9-year-old children. Seventeen children (N=17) participated in the study. The experiment took place over 6 weeks in a controlled environment. Eighteen lessons of 40 minutes were offered during this period by two music specialists. The Dalcroze, Orff, and Kodaly approaches were used. A series of qualitative measures were implemented to document the contribution of music training to this population. Currently, the data is being analyzed. The first results show that learning music seems to significantly improve verbal memory. We already know that language disabilities are considered one of the main causes of school dropout as well as later professional and social failure. We aim to corroborate that an integrated music education program can provide children with language disabilities with the same opportunities to develop and succeed in school as their classmates. Scientifically, the results will contribute to advance the knowledge by identifying the more effective music education strategies to improve the overall development of children worldwide.Keywords: music education, music, art education, language diasabilities
Procedia PDF Downloads 231
1435 Investigation of Mass Transfer for RPB Distillation at High Pressure
Authors: Amiza Surmi, Azmi Shariff, Sow Mun Serene Lock
Abstract:
In recent decades, there has been a significant emphasis on the pivotal role of Rotating Packed Beds (RPBs) in absorption processes, encompassing the removal of Volatile Organic Compounds (VOCs) from groundwater, deaeration, CO2 absorption, desulfurization, and similar critical applications. The primary focus is elevating mass transfer rates, enhancing separation efficiency, curbing power consumption, and mitigating pressure drops. Additionally, substantial efforts have been invested in exploring the adaptation of RPB technology for offshore deployment. This comprehensive study delves into the intricacies of nitrogen removal under low temperature and high-pressure conditions, employing the high gravity principle via innovative RPB distillation concept with a specific emphasis on optimizing mass transfer. Based on the author's knowledge and comprehensive research, no cryogenic experimental testing was conducted to remove nitrogen via RPB. The research identifies pivotal process control factors through meticulous experimental testing, with pressure, reflux ratio, and reboil ratio emerging as critical determinants in achieving the desired separation performance. The results are remarkable, with nitrogen removal reaching less than one mole% in the Liquefied Natural Gas (LNG) product and less than three moles% methane in the nitrogen-rich gas stream. The study further unveils the mass transfer coefficient, revealing a noteworthy trend of decreasing Number of Transfer Units (NTU) and Area of Transfer Units (ATU) as the rotational speed escalates. Notably, the condenser and reboiler impose varying demands based on the operating pressure, with lower pressures at 12 bar requiring a more substantial duty than the 15-bar operation of the RPB. In pursuit of optimal energy efficiency, a meticulous sensitivity analysis is conducted, pinpointing the ideal combination of pressure and rotating speed that minimizes overall energy consumption. These findings underscore the efficiency of the RPB distillation approach in effecting efficient separation, even when operating under the challenging conditions of low temperature and high pressure. This achievement is attributed to a rigorous process control framework that diligently manages the operational pressure and temperature profile of the RPB. Nonetheless, the study's conclusions point towards the need for further research to address potential scaling challenges and associated risks, paving the way for the industrial implementation of this transformative technology.Keywords: mass transfer coefficient, nitrogen removal, liquefaction, rotating packed bed
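The NTU/ATU quantities mentioned above follow from the standard transfer-unit relation NTU = ∫ dy/(y − y*). A back-of-the-envelope numerical sketch is shown below; the inlet/outlet compositions, the linear equilibrium assumption, and the HTU value are placeholders rather than results from this study.

```python
# Back-of-the-envelope sketch of the number of transfer units (NTU) for one section,
# using NTU = integral dy / (y - y*). All values below are illustrative assumptions.
from scipy.integrate import quad

y_in, y_out = 0.12, 0.03      # vapour mole fraction of the volatile component, in/out

def y_star(y):
    return 0.6 * y            # crude linear equilibrium assumption, y* = 0.6 y

ntu, _ = quad(lambda y: 1.0 / (y - y_star(y)), y_out, y_in)
htu = 0.04                    # assumed height of a transfer unit in the RPB packing, m
print(f"NTU = {ntu:.2f}, required packing depth = {ntu * htu:.3f} m")
```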
Procedia PDF Downloads 54