Search results for: natural features
7092 A Trends Analysis of Yacht Simulator
Authors: Jae-Neung Lee, Keun-Chang Kwak
Abstract:
This paper presents an analysis of international trends in yacht simulators and also provides background on yachts. Examples of image processing used in yacht simulators include counting the total number of vehicles, edge/target detection, detection-and-evasion algorithms, SIFT (scale-invariant feature transform) matching, and the application of median filtering and thresholding.
Keywords: yacht simulator, simulator, trends analysis, SIFT
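The median filtering and thresholding named in the abstract can be sketched in a few lines. This is an illustrative, self-contained example (not the authors' implementation), using a 1-D signal for brevity:

```python
def median_filter_1d(signal, window=3):
    """Apply a simple 1-D median filter with edge replication."""
    half = window // 2
    padded = [signal[0]] * half + list(signal) + [signal[-1]] * half
    return [sorted(padded[i:i + window])[half] for i in range(len(signal))]

def threshold(signal, level):
    """Binarize a signal: 1 where the value exceeds the level, else 0."""
    return [1 if v > level else 0 for v in signal]

# Example: suppress an impulse spike, then binarize
noisy = [10, 10, 95, 10, 10, 60, 60, 60]
smoothed = median_filter_1d(noisy)   # the spike at index 2 is removed
binary = threshold(smoothed, 30)
```

The same idea extends to 2-D images by sorting the pixel values inside a sliding square window.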
Procedia PDF Downloads 432
7091 Fate of Organic Waste, Refuse and Inert from Municipal Discards as Source of Energy and Nutrient in India: A Brief Review
Authors: Kunwar Paritosh, Vivekanand Vivekanand, Nidhi Pareek
Abstract:
Presently, India depends primarily on fossil fuels to meet its acute energy demand. The swift development of India over the last two decades is straining its natural resources and compelling expenditures to secure energy for its inhabitants. A population of 1.2 billion, undergoing growing industrialization, generates 68.8 million tonnes of municipal solid waste per year; 53.7 million tonnes are collected, and only a trifling 10.3 million tonnes are treated per year, the remainder accumulating into massive landfills. In India, waste is mostly landfilled and/or incinerated with low technology and is poorly managed. Underutilization of this waste not only squanders resources but also stresses the environment, public health and the bionetwork, thus affecting the bioeconomy negatively. It also creates conditions that invoke inevitable expenditures and forfeits the waste's renewable energy potential. A non-scientific approach to managing waste may lead to economic losses and to the underutilization and degradation of natural resources. Waste treatment technologies must be scientifically tailored and engineered to the type of waste, so that sorted waste may be utilized as a source of energy (here, biogas) and nutrients through anaerobic digestion. This paper presents a brief review of current practices, key achievements and forthcoming aspects of harnessing energy from municipal solid waste in the Indian scenario.
Keywords: municipal discards, organic waste, anaerobic digestion, incineration, energy
Procedia PDF Downloads 262
7090 Designing an Operational Control System for the Continuous Cycle of Industrial Technological Processes Using Fuzzy Logic
Authors: Teimuraz Manjapharashvili, Ketevani Manjaparashvili
Abstract:
Fuzzy logic is a modeling method for complex or ill-defined systems and is a relatively new mathematical approach. Its basis is to consider overlapping cases of parameter values and to define operations that manipulate these cases. Fuzzy logic can successfully support operative automatic management or appropriate advisory systems. Fuzzy logic techniques in various operational control technologies have grown rapidly in the last few years, and fuzzy logic is used in many areas of human technological activity. In recent years, fuzzy logic has proven its great potential, especially in the automation of industrial process control, where it allows a control design to be formed from the experience of experts and the results of experiments. The engineering of chemical technological processes uses fuzzy logic in optimal management, and it is also used in process control, including the operational control of continuous-cycle chemical industrial technological processes, where special features appear due to the continuous cycle and correct management acquires special importance. This paper discusses how intelligent systems can be developed, in particular, how fuzzy logic can be used to build knowledge-based expert systems in chemical process engineering. The implemented projects reveal that the use of fuzzy logic in technological process control has already given better solutions than standard control techniques. Fuzzy logic makes it possible to develop an advisory system for decision-making based on the historical experience of the managing operator and of experienced experts. The present paper deals with operational control and management systems of continuous-cycle chemical technological processes, including advisory systems. Because of the continuous cycle, such systems have many features not found in the operational control of other chemical technological processes.
Among these features are a greater risk of transitioning to emergency mode, and the need to return from emergency mode to normal mode very quickly, since the technological process cannot be stopped and defective products released during this period represent a loss; accordingly, the operator managing the process must be highly qualified. For these reasons, operational control systems of continuous-cycle chemical technological processes are discussed specifically, as they are distinct systems. The special features of such systems in control and management were identified, and these determine how control and management systems are constructed. To verify the findings, the paper discusses the development of an advisory decision-making information system for the operational control of a lime kiln using fuzzy logic, based on the creation of a relevant expert-targeted knowledge base. The control system has been implemented in a real lime production plant with a lime-burning kiln, showing that suitable, intelligent automation improves operational management, reduces the risk of releasing defective products, and therefore reduces costs. The advisory system was also used successfully in the plant to train new operators, given the lack of an appropriate training institution.
Keywords: chemical process control systems, continuous cycle industrial technological processes, fuzzy logic, lime kiln
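As a hedged illustration of the kind of fuzzy advisory logic described (the variable, membership ranges and rules below are hypothetical, not taken from the implemented lime kiln system), a minimal Sugeno-style controller mapping a kiln temperature error to a fuel-rate adjustment might look like:

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def kiln_fuel_adjustment(temp_error):
    """Map a hypothetical kiln temperature error (degrees C below/above setpoint)
    to a fuel-rate change (%) via three fuzzy rules and weighted-average
    (Sugeno-style) defuzzification."""
    # (rule firing strength, recommended output level)
    rules = [
        (tri(temp_error, -60, -30, 0), +10.0),   # too cold  -> add fuel
        (tri(temp_error, -30, 0, 30), 0.0),      # near setpoint -> hold
        (tri(temp_error, 0, 30, 60), -10.0),     # too hot  -> cut fuel
    ]
    num = sum(w * out for w, out in rules)
    den = sum(w for w, _ in rules)
    return num / den if den else 0.0
```

Overlapping memberships give a smooth interpolation between rules, e.g. an error of +15 °C fires "hold" and "cut fuel" equally, yielding a −5% adjustment.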
Procedia PDF Downloads 29
7089 Contribution to the Study of Automatic Epileptiform Pattern Recognition in Long Term EEG Signals
Authors: Christine F. Boos, Fernando M. Azevedo
Abstract:
Electroencephalogram (EEG) is a record of the electrical activity of the brain that has many applications, such as monitoring alertness, coma and brain death; locating damaged areas of the brain after head injury, stroke and tumor; monitoring anesthesia depth; researching physiology and sleep disorders; and researching epilepsy and localizing the seizure focus. Epilepsy is a chronic condition, or a group of diseases of high prevalence, still poorly explained by science and whose diagnosis is still predominantly clinical. The EEG recording is considered an important test for epilepsy investigation, and its visual analysis is very often applied for clinical confirmation of an epilepsy diagnosis. Moreover, this EEG analysis can also be used to help define the type of epileptic syndrome, determine the epileptiform zone, assist in the planning of drug treatment and provide additional information about the feasibility of surgical intervention. In the context of diagnosis confirmation, the analysis is made using long-term EEG recordings at least 24 hours long, acquired by a minimum of 24 electrodes, in which the neurophysiologists perform a thorough visual evaluation of EEG screens in search of specific electrographic patterns called epileptiform discharges. Considering that the EEG screens usually display 10 seconds of the recording, the neurophysiologist has to evaluate 360 screens per hour of EEG, or a minimum of 8,640 screens per long-term EEG recording. Analyzing thousands of EEG screens in search of patterns that have a maximum duration of 200 ms is a very time-consuming, complex and exhaustive task. Because of this, over the years several studies have proposed automated methodologies that could facilitate the neurophysiologists' task of identifying epileptiform discharges, and a large number of these methodologies used neural networks for the pattern classification.
One of the differences between all of these methodologies is the type of input stimuli presented to the networks, i.e., how the EEG signal is introduced to the network. Five types of input stimuli have been commonly found in the literature: the raw EEG signal, morphological descriptors (i.e., parameters related to the signal's morphology), the Fast Fourier Transform (FFT) spectrum, Short-Time Fourier Transform (STFT) spectrograms and Wavelet Transform features. This study evaluates the application of these five types of input stimuli and compares the classification results of neural networks that were implemented using each of these inputs. The performance using the raw signal varied between 43% and 84% efficiency. The results of the FFT spectrum and the STFT spectrograms were quite similar, with average efficiencies of 73% and 77%, respectively. The efficiency of Wavelet Transform features varied between 57% and 81%, while the morphological descriptors presented efficiency values between 62% and 93%. After the simulations, we observed that the best results were achieved when either morphological descriptors or Wavelet features were used as input stimuli.
Keywords: artificial neural network, electroencephalogram signal, pattern recognition, signal processing
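As an illustrative sketch of one of the five input stimuli (the FFT spectrum), the following computes a naive DFT power spectrum of a short signal segment; it is a didactic example, not the study's implementation:

```python
import cmath
import math

def power_spectrum(samples):
    """Naive DFT power spectrum (O(n^2)) -- adequate for short epochs."""
    n = len(samples)
    return [abs(sum(samples[t] * cmath.exp(-2j * cmath.pi * k * t / n)
                    for t in range(n))) ** 2 / n
            for k in range(n // 2 + 1)]

def dominant_bin(samples):
    """Index of the strongest non-DC frequency bin."""
    spec = power_spectrum(samples)
    return max(range(1, len(spec)), key=spec.__getitem__)

# A clean 5-cycle sinusoid over 64 samples concentrates power in bin 5
wave = [math.sin(2 * math.pi * 5 * t / 64) for t in range(64)]
```

In practice an FFT (O(n log n)) would replace the naive DFT, and the spectrum values, rather than a single dominant bin, would be fed to the network.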
Procedia PDF Downloads 528
7088 The Impact of Data Science on Geography: A Review
Authors: Roberto Machado
Abstract:
We conducted a systematic review using the Preferred Reporting Items for Systematic Reviews and Meta-Analyses methodology, analyzing 2,996 studies and synthesizing 41 of them to explore the evolution of data science and its integration into geography. By employing optimization algorithms, we accelerated the review process, significantly enhancing the efficiency and precision of literature selection. Our findings indicate that data science has developed over five decades, facing challenges such as the diversified integration of data and the need for advanced statistical and computational skills. In geography, the integration of data science underscores the importance of interdisciplinary collaboration and methodological innovation. Techniques like large-scale spatial data analysis and predictive algorithms show promise in natural disaster management and transportation route optimization, enabling faster and more effective responses. These advancements highlight the transformative potential of data science in geography, providing tools and methodologies to address complex spatial problems. The relevance of this study lies in the use of optimization algorithms in systematic reviews and the demonstrated need for deeper integration of data science into geography. Key contributions include identifying specific challenges in combining diverse spatial data and the necessity for advanced computational skills. Examples of connections between these two fields encompass significant improvements in natural disaster management and transportation efficiency, promoting more effective and sustainable environmental solutions with a positive societal impact.
Keywords: data science, geography, systematic review, optimization algorithms, supervised learning
Procedia PDF Downloads 30
7087 Reduced Lung Volume: A Possible Cause of Stuttering
Authors: Shantanu Arya, Sachin Sakhuja, Gunjan Mehta, Sanjay Munjal
Abstract:
Stuttering may be defined as a speech disorder affecting the fluency domain of speech, characterized by covert features like word substitution, omission and circumlocution, and overt features like prolongations of sounds and syllables, blocks, etc. Many etiologies have been postulated to explain stuttering based on various experiments and research. Moreover, breathlessness has also been reported by many individuals with stuttering, for which breathing exercises are generally advised. However, no studies reporting objective evaluation of pulmonary capacity, and further objective assessment of the efficacy of breathing exercises, have been conducted. The Pulmonary Function Test (PFT), which evaluates parameters like Forced Vital Capacity, Peak Expiratory Flow Rate and Forced Expiratory Flow Rate, can be used to study the pulmonary behavior of individuals with stuttering. The study aimed: a) to identify speech motor and physiologic behaviours associated with stuttering by administering the PFT; b) to recognize possible reasons for an association between speech motor behaviour and stuttering severity. In this regard, PFT tests were administered to individuals who reported signs and symptoms of stuttering and showed abnormal scores on the Stuttering Severity Index. Parameters like Forced Vital Capacity, Forced Expiratory Volume, Peak Expiratory Flow Rate (L/min) and Forced Expiratory Flow Rate (L/min) were evaluated and correlated with scores on the Stuttering Severity Index. Results showed a significant decrease in these parameters (lower than normal scores) in individuals with established stuttering. A strong correlation was also found between the degree of stuttering and the degree of decrease in the pulmonary volumes. Thus, it is evident that fluent speech requires strong support from lung pressure and the requisite volumes.
Further research demonstrating the efficacy of abdominal breathing exercises in this regard is needed.
Keywords: forced expiratory flow rate, forced expiratory volume, forced vital capacity, peak expiratory flow rate, stuttering
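The reported correlation between stuttering severity and pulmonary volumes could be quantified with a Pearson coefficient; the sketch below uses synthetic, illustrative numbers, not the study's data:

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical data: stuttering severity scores vs. Forced Vital Capacity (litres)
ssi = [12, 18, 24, 30, 36]
fvc = [4.8, 4.4, 4.1, 3.6, 3.2]
# A strongly negative r would match the paper's finding that pulmonary
# volumes decrease as severity increases.
```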
Procedia PDF Downloads 275
7086 Hibiscus Sabdariffa Extracts: A Sustainable and Eco-Friendly Resource for Multifunctional Cellulosic Fibers
Authors: Mohamed Rehan, Gamil E. Ibrahim, Mohamed S. Abdel-Aziz, Shaimaa R. Ibrahim, Tawfik A. Khattab
Abstract:
The utilization of natural products in finishing textiles toward multifunctional applications without side effects is an extremely motivating goal. Hibiscus sabdariffa has traditionally been used in many applications of traditional medicine. To develop an additional use for Hibiscus sabdariffa, an extraction of bioactive compounds from Hibiscus sabdariffa, followed by finishing of cellulosic fibers, was designed for the cleaner production of value-added textile fibers with multifunctional applications. The objective of this study is to explore, identify and evaluate the bioactive compounds extracted from Hibiscus sabdariffa by different solvents via an ultrasonic technique as a potential eco-friendly agent for multifunctional cellulosic fabrics via two approaches. In the first approach, Hibiscus sabdariffa extract was used as a sustainable, eco-friendly source for the simultaneous coloration and multi-finishing of cotton fabrics via in situ incorporation of nanoparticles (silver and metal oxide). In the second approach, micro-encapsulation of Hibiscus sabdariffa extracts was followed by coating onto cotton gauze to introduce multifunctional healthcare applications. The effect of solvent type, with extraction accelerated by ultrasound, on the phytochemical, antioxidant and volatile compounds of Hibiscus sabdariffa was examined. The surface morphology and elemental content of the treated fabrics were explored using Fourier transform infrared spectroscopy (FT-IR), scanning electron microscopy (SEM) and energy-dispersive X-ray spectroscopy (EDX). The multifunctional properties of the treated fabrics, including coloration, sensor properties, protective properties against pathogenic microorganisms and UV radiation, and wound healing, were evaluated.
The results showed that water, as well as ethanol/water, was selected as a solvent for the extraction of natural compounds from Hibiscus sabdariffa, being high in extract yield, total phenolic contents, flavonoid contents and antioxidant activity. These natural compounds were utilized to enhance cellulosic fiber functionalization by imparting a faint/dark red color, antimicrobial activity against different organisms, and antioxidant as well as UV protection properties. The encapsulation of Hibiscus sabdariffa extracts, as well as wound healing, is under consideration and evaluation. As a result, the current study presents a sustainable and eco-friendly approach to designing cellulosic fabrics for multifunctional medical and healthcare applications.
Keywords: cellulosic fibers, Hibiscus sabdariffa extract, multifunctional application, nanoparticles
Procedia PDF Downloads 146
7085 Machine Learning Strategies for Data Extraction from Unstructured Documents in Financial Services
Authors: Delphine Vendryes, Dushyanth Sekhar, Baojia Tong, Matthew Theisen, Chester Curme
Abstract:
Much of the data that inform the decisions of governments, corporations and individuals are harvested from unstructured documents. Data extraction is defined here as a process that turns non-machine-readable information into a machine-readable format that can be stored, for instance, in a database. In financial services, introducing more automation in data extraction pipelines is a major challenge. Information sought by financial data consumers is often buried within vast bodies of unstructured documents, which have historically required thorough manual extraction. Automated solutions provide faster access to non-machine-readable datasets, in a context where untimely information quickly becomes irrelevant. Data quality standards cannot be compromised, so automation requires high data integrity. This multifaceted task is broken down into smaller steps: ingestion, table parsing (detection and structure recognition), text analysis (entity detection and disambiguation), schema-based record extraction, user feedback incorporation. Selected intermediary steps are phrased as machine learning problems. Solutions leveraging cutting-edge approaches from the fields of computer vision (e.g. table detection) and natural language processing (e.g. entity detection and disambiguation) are proposed.
Keywords: computer vision, entity recognition, finance, information retrieval, machine learning, natural language processing
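As a toy illustration of the schema-based record extraction step (the schema, regex and sample sentences below are hypothetical, not the authors' pipeline), unstructured sentences can be turned into machine-readable records like so:

```python
import re

# Hypothetical schema: pull (entity, amount, currency) records from free text
RECORD_RE = re.compile(
    r"(?P<entity>[A-Z][A-Za-z&. ]+?) reported revenue of "
    r"(?P<currency>USD|EUR) (?P<amount>[\d,]+(?:\.\d+)?) million"
)

def extract_records(text):
    """Turn matching sentences into machine-readable dicts."""
    return [
        {
            "entity": m.group("entity").strip(),
            "amount": float(m.group("amount").replace(",", "")),
            "currency": m.group("currency"),
        }
        for m in RECORD_RE.finditer(text)
    ]

doc = ("Acme Corp reported revenue of USD 1,250.5 million in Q3. "
       "Beta Ltd reported revenue of EUR 300 million over the same period.")
```

A production pipeline would replace the fixed regex with learned entity detection and table structure recognition, as the abstract describes, but the output contract (structured records for a database) is the same.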
Procedia PDF Downloads 113
7084 Engineering Topology of Ecological Model for Orientation Impact of Sustainability Urban Environments: The Spatial-Economic Modeling
Authors: Moustafa Osman Mohammed
Abstract:
The modeling of a spatial-economic database is crucial in relating economic network structure to social development. Sustainability within the spatial-economic model gives attention to green businesses that comply with Earth's systems. The natural exchange patterns of ecosystems have consistent and periodic cycles that preserve energy and material flows in systems ecology. When network topology influences formal and informal communication to function in systems ecology, ecosystems are postulated to balance the basic level of spatial sustainability outcomes (i.e., project compatibility success). These instrumentalities impact various aspects of the second level of spatial sustainability outcomes (i.e., participant social security satisfaction). The sustainability outcomes are modeled as a composite structure, based on a network analysis model, to calculate the prosperity of panel databases for efficiency value from 2005 to 2025. The database models a spatial structure to represent state-of-the-art value-orientation impact and the corresponding complexity of sustainability issues (e.g., build a consistent database necessary to approach spatial structure; construct the spatial-economic-ecological model; develop a set of sustainability indicators associated with the model; allow quantification of social, economic and environmental impact; use value-orientation as a set of important sustainability policy measures), and demonstrates spatial structure reliability. The structure of the spatial-ecological model is established for management schemes, from the perspective of pollutants of multiple sources, through input-output criteria. These criteria evaluate the spillover effect to conduct Monte Carlo simulations and sensitivity analysis in a unique spatial structure. The balance within "equilibrium patterns," such as collective biosphere features, has a composite index of many distributed feedback flows.
These feedback flows have a dynamic structure, related to physical and chemical properties, that extends gradually into incremental patterns. While these spatial structures argue from ecological modeling of resource savings, static loads are not decisive from an artistic/architectural perspective. The model attempts to unify analytic and analogical spatial structure for the development of urban environments in a relational database setting, using optimization software to integrate the spatial structure, where the process is based on the engineering topology of systems ecology.
Keywords: ecological modeling, spatial structure, orientation impact, composite index, industrial ecology
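The Monte Carlo simulations and sensitivity analysis mentioned in the abstract can be sketched minimally; the pollutant inputs, uniform ranges and index weights below are invented for illustration only:

```python
import random

def monte_carlo_spillover(n_runs=10_000, seed=42):
    """Monte Carlo sketch: distribution of a toy spillover index when two
    hypothetical normalized pollutant inputs are uncertain (uniform ranges)."""
    rng = random.Random(seed)      # fixed seed for reproducibility
    samples = []
    for _ in range(n_runs):
        industrial = rng.uniform(0.8, 1.2)   # normalized input 1
        transport = rng.uniform(0.5, 1.5)    # normalized input 2
        samples.append(0.6 * industrial + 0.4 * transport)  # weighted index
    mean = sum(samples) / n_runs
    return mean, min(samples), max(samples)
```

Sensitivity analysis would then vary one input range at a time and observe how the spread of the index responds.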
Procedia PDF Downloads 68
7083 Surface Temperature of Asphalt Pavements with Colored Cement-Based Grouting Materials Containing Ceramic Waste Powder and Zeolite
Authors: H. Higashiyama, M. Sano, F. Nakanishi, M. Sugiyama, M. Kawanishi, S. Tsukuma
Abstract:
The heat island phenomenon and an extremely hot summer climate are becoming environmental problems in Japan. Cool pavements reduce the surface temperature compared to conventional asphalt pavements in the hot summer climate and improve the thermal environment in urban areas. The authors have studied cement-based grouting materials poured into the voids of porous asphalt pavements to reduce the road surface temperature. For the cement-based grouting material, cement, ceramic waste powder and natural zeolite were used. The cement-based grouting material developed reduced the road surface temperature by 20 °C or more in the hot summer season. Considering the urban landscape, this study investigates the surface temperature reduction achieved by colored cement-based grouting materials containing pigments, poured into the voids of porous asphalt pavements, by measuring the surface temperature of asphalt pavements outdoors. The yellow color performed the same as the original cement-based grouting material containing no pigment and showed better thermal performance than the other colors. However, all the tested cement-based grouting materials performed well in reducing the surface temperature and in creating the urban landscape.
Keywords: ceramic waste powder, natural zeolite, road surface temperature, asphalt pavement, urban landscape
Procedia PDF Downloads 315
7082 Improve Student Performance Prediction Using Majority Vote Ensemble Model for Higher Education
Authors: Wade Ghribi, Abdelmoty M. Ahmed, Ahmed Said Badawy, Belgacem Bouallegue
Abstract:
In higher education institutions, the most pressing priority is to improve student performance and retention. Large volumes of student data are used in Educational Data Mining techniques to find new hidden information from students' learning behavior, particularly to uncover the early symptom of at-risk pupils. On the other hand, data with noise, outliers, and irrelevant information may provide incorrect conclusions. By identifying features of students' data that have the potential to improve performance prediction results, comparing and identifying the most appropriate ensemble learning technique after preprocessing the data, and optimizing the hyperparameters, this paper aims to develop a reliable students' performance prediction model for Higher Education Institutions. Data was gathered from two different systems: a student information system and an e-learning system for undergraduate students in the College of Computer Science of a Saudi Arabian State University. The cases of 4413 students were used in this article. The process includes data collection, data integration, data preprocessing (such as cleaning, normalization, and transformation), feature selection, pattern extraction, and, finally, model optimization and assessment. Random Forest, Bagging, Stacking, Majority Vote, and two types of Boosting techniques, AdaBoost and XGBoost, are ensemble learning approaches, whereas Decision Tree, Support Vector Machine, and Artificial Neural Network are supervised learning techniques. Hyperparameters for ensemble learning systems will be fine-tuned to provide enhanced performance and optimal output. The findings imply that combining features of students' behavior from e-learning and students' information systems using Majority Vote produced better outcomes than the other ensemble techniques.
Keywords: educational data mining, student performance prediction, e-learning, classification, ensemble learning, higher education
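A hard majority vote over the class predictions of several base learners, the core of the best-performing ensemble here, can be sketched as follows (the three base learners and their predictions are synthetic placeholders):

```python
from collections import Counter

def majority_vote(predictions):
    """Combine per-model class predictions by hard majority vote.
    `predictions` is a list of equally long label lists, one per model."""
    return [Counter(labels).most_common(1)[0][0]
            for labels in zip(*predictions)]

# Three hypothetical base learners predicting at-risk (1) / not at-risk (0)
tree_preds = [1, 0, 1, 1]
svm_preds = [1, 0, 0, 1]
ann_preds = [0, 0, 1, 1]
combined = majority_vote([tree_preds, svm_preds, ann_preds])
```

With an odd number of binary classifiers there are no ties; in general, tie-breaking policy (or soft voting on probabilities) must be chosen explicitly.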
Procedia PDF Downloads 108
7081 Caught in the Crossfire: Natural Resources, Energy Transition, and Conflict in the Democratic Republic of Congo
Authors: Koami West Togbetse
Abstract:
The global shift towards clean and sustainable energy sources, known as the energy transition, is compelling numerous countries to transition from polluting energy systems to cleaner alternatives, commonly referred to as green energies. In this context, cobalt holds significant importance as a crucial mineral in facilitating this energy transition due to its pivotal role in electric batteries. Considering the Democratic Republic of Congo’s reputation for political instability and its position as the largest producer of cobalt, possessing over 50% of the world’s reserves, we have assessed the potential conflicts that may arise as a result of the rapid increase in cobalt demand. The results show that cobalt does not appear to be a determinant contributing to all past conflicts over the study period in the Democratic Republic of Congo (DRC). Gold, on the other hand, stands out as one of the coveted metals for rebel groups engaged in rampant exploitation, increasing the likelihood of conflicts occurring. However, a more in-depth analysis reveals a shift in the relationship between cobalt production and conflict events around 2006. Prior to 2006, increased cobalt production was significantly associated with a reduction in conflict events. However, after 2006, this relationship became positive, indicating that higher cobalt production is now linked to a slight increase in conflict events. This suggests a change in the dynamics affecting conflicts related to cobalt production before and after 2006. According to our predictive model, cobalt has the potential to emerge increasingly as a contributing factor, just like gold.
Keywords: conflicts, natural resources, energy transition, geopolitics
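The pre/post-2006 shift in the cobalt-conflict relationship could be examined by fitting separate least-squares slopes on either side of the break year; the sketch below uses invented data that merely mimics the described sign change:

```python
def ols_slope(xs, ys):
    """Least-squares slope of y on x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

def slopes_around_break(years, production, conflicts, break_year=2006):
    """Slope of conflicts on production before and after a break year."""
    rows = list(zip(years, production, conflicts))
    before = [(p, c) for y, p, c in rows if y < break_year]
    after = [(p, c) for y, p, c in rows if y >= break_year]
    return ols_slope(*zip(*before)), ols_slope(*zip(*after))

# Synthetic illustration: negative association pre-2006, positive after
years      = [2002, 2003, 2004, 2005, 2006, 2007, 2008, 2009]
production = [10, 12, 14, 16, 18, 20, 22, 24]
conflicts  = [30, 28, 26, 24, 25, 27, 29, 31]
```

An equivalent, more standard formulation fits one regression with a post-2006 interaction term and tests the interaction coefficient.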
Procedia PDF Downloads 31
7080 Improving Security by Using Secure Servers Communicating via Internet with Standalone Secure Software
Authors: Carlos Gonzalez
Abstract:
This paper describes the use of the Internet as a feature to enhance the security of software that is to be distributed/sold to users potentially all over the world. By placing some of the features of the secure software on a secure server, we increase the security of that software. The communication between the protected software and the secure server is done by a double-lock algorithm. This paper also includes an analysis of intruders and describes possible responses to detect threats.
Keywords: internet, secure software, threats, cryptography process
Procedia PDF Downloads 333
7079 Urban River as Living Infrastructure: Tidal Flooding and Sea Level Rise in a Working Waterway in Hampton Roads, Virginia
Authors: William Luke Hamel
Abstract:
Existing conceptions of urban flooding caused by tidal fluctuations and sea-level rise have been inadequately conceptualized by metrics of resilience and methods of flow modeling. While a great deal of research has been devoted to the effects of urbanization on pluvial flooding, the kind of tidal flooding experienced by locations like Hampton Roads, Virginia, has not been adequately conceptualized as being a result of human factors such as urbanization and gray infrastructure. Resilience from sea level rise and its associated flooding has been pioneered in the region with the 2015 Norfolk Resilience Plan from 100 Resilient Cities as well as the 2016 Norfolk Vision 2100 plan, which envisions different patterns of land use for the city. Urban resilience still conceptualizes the city as having the ability to maintain an equilibrium in the face of disruptions. This economic and social equilibrium relies on the Elizabeth River, narrowly conceptualized. Intentionally or accidentally, the river was made to be a piece of infrastructure. Its development was meant to serve the docks, shipyards, naval yards, and port infrastructure that gives the region so much of its economic life. Inasmuch as it functions to permit the movement of cargo; the raising and lowering of ships to be repaired, commissioned, or decommissioned; or the provisioning of military vessels, the river as infrastructure is functioning properly. The idea that the infrastructure is malfunctioning when high tides and sea-level rise create flooding is predicated on the idea that the infrastructure is truly a human creation and can be controlled. The natural flooding cycles of an urban river, combined with the action of climate change and sea-level rise, are only abnormal so much as they encroach on the development that first encroached on the river. 
The urban political ecology of water provides the ability to view the river as an infrastructural extension of urban networks while also calling for its emancipation from stationarity and human control. Understanding the river and city as a hydrosocial territory or as a socio-natural system liberates both actors from the duality of the natural and the social while repositioning river flooding as a normal part of coexistence on a floodplain. This paper argues for the adoption of an urban political ecology lens in the analysis and governance of urban rivers like the Elizabeth River as a departure from the equilibrium-seeking and stability metrics of urban resilience.
Keywords: urban flooding, political ecology, Elizabeth river, Hampton roads
Procedia PDF Downloads 169
7078 Effects of Using Alternative Energy Sources and Technologies to Reduce Energy Consumption and Expenditure of a Single Detached House
Authors: Gul Nihal Gugul, Merih Aydinalp-Koksal
Abstract:
In this study, an hourly energy consumption model of a single detached house in Ankara, Turkey is developed using the ESP-r building energy simulation software. Natural gas is used for space heating, cooking, and domestic water heating in this two-story, 4,500-square-foot, four-bedroom home. Hourly electricity consumption of the home is monitored by an automated meter reading system, and daily natural gas consumption was recorded by the owners during 2013. Climate data of the region and building envelope data are used to develop the model. The heating energy consumption of the house estimated by the ESP-r model is then compared with the actual heating demand to determine the performance of the model. Scenarios are applied to the model to determine the amount of reduction in the total energy consumption of the house. The scenarios are using photovoltaic panels to generate electricity, ground source heat pumps for space heating, and solar panels for domestic hot water generation. Alternative scenarios, such as improving wall and roof insulation and window glazing, are also applied. These scenarios are evaluated based on annual energy, associated CO2 emissions, and fuel expenditure savings. The payback periods for each scenario are also calculated to determine the best alternative energy source or technology option for this home to reduce annual energy use and CO2 emissions.
Keywords: ESP-r, building energy simulation, residential energy saving, CO2 reduction
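The payback-period calculation used to rank the scenarios can be sketched simply; the capital cost, annual saving and discount rate below are hypothetical:

```python
def simple_payback_years(capital_cost, annual_saving):
    """Undiscounted payback period in years."""
    return capital_cost / annual_saving

def discounted_payback_years(capital_cost, annual_saving, rate):
    """Years until cumulative discounted savings recover the capital cost;
    returns None if the investment never pays back within 50 years."""
    cumulative = 0.0
    for year in range(1, 51):
        cumulative += annual_saving / (1 + rate) ** year
        if cumulative >= capital_cost:
            return year
    return None

# Hypothetical PV retrofit: 12,000 installed cost, 1,500/yr saved, 5% discount rate
simple = simple_payback_years(12_000, 1_500)
discounted = discounted_payback_years(12_000, 1_500, 0.05)
```

Discounting lengthens the payback (here from 8 to 11 years), which is why the discount-rate assumption matters when comparing retrofit scenarios.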
Procedia PDF Downloads 199
7077 Analysis of Eco-Efficiency and the Determinants of Family Agriculture in Southeast Spain
Authors: Emilio Galdeano-Gómez, Ángeles Godoy-Durán, Juan C. Pérez-Mesa, Laura Piedra-Muñoz
Abstract:
Eco-efficiency is receiving ever-increasing interest as an indicator of sustainability, as it links environmental and economic performances in productive activities. In agriculture, these indicators and their determinants prove relevant due to the close relationships in this activity between the use of natural resources, which is generally limited, and the provision of basic goods to society. In this context, various analyses have focused on eco-efficiency by considering individual family farms as the basic production unit. However, not only must the measure of efficiency be taken into account, but also the existence of a series of factors which constitute socio-economic, political-institutional, and environmental determinants. Said factors have been studied to a lesser extent in the literature. The present work analyzes eco-efficiency at a micro level, focusing on small-scale family farms as the main decision-making units in horticulture in southeast Spain, a sector which represents about 30% of the fresh vegetables produced in the country and about 20% of those consumed in Europe. The objectives of this study are a) to obtain a series of eco-efficiency indicators by estimating several pressure ratios and economic value added in farming, b) to analyze the influence of specific social, economic and environmental variables on the aforementioned eco-efficiency indicators. The present work applies the method of Data Envelopment Analysis (DEA), which calculates different combinations of environmental pressures (water usage, phytosanitary contamination, waste management, etc.) and aggregate economic value. In a second stage, an analysis is conducted on the influence of the socio-economic and environmental characteristics of family farms on the eco-efficiency indicators, as endogeneous variables, through the use of truncated regression and bootstrapping techniques, following Simar-Wilson methodology. 
The results reveal considerable inefficiency in aspects such as waste management, while there is relatively little inefficiency in water usage and nitrogen balance. On the other hand, characteristics such as product specialization, the adoption of quality certifications, and membership in a cooperative have a positive impact on eco-efficiency. These results are deemed to be of interest to agri-food systems structured around small-scale producers, and they may prove useful to policy-makers in managing public environmental programs in agriculture.
Keywords: data envelopment analysis, eco-efficiency, family farms, horticulture, socioeconomic features
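The pressure-ratio idea behind these indicators can be illustrated with a simplified sketch: eco-efficiency expressed as economic value added per unit of each environmental pressure. The farm names, value-added figures, and pressure levels below are hypothetical, and this is not the paper's DEA program, only the ratio step that precedes it.

```python
# Eco-efficiency ratio sketch (hypothetical data, not the paper's DEA model):
# eco-efficiency = economic value added per unit of each environmental pressure.

def eco_efficiency(value_added, pressures):
    """Return one eco-efficiency ratio per environmental pressure."""
    return {name: value_added / amount for name, amount in pressures.items()}

# Three hypothetical family farms: (value added in EUR, pressure levels).
farms = {
    "farm_A": (50_000, {"water_m3": 8_000, "waste_kg": 1_200}),
    "farm_B": (42_000, {"water_m3": 6_000, "waste_kg": 2_100}),
    "farm_C": (61_000, {"water_m3": 9_500, "waste_kg": 900}),
}

ratios = {farm: eco_efficiency(v, p) for farm, (v, p) in farms.items()}
for farm, r in sorted(ratios.items()):
    print(farm, {k: round(val, 2) for k, val in r.items()})
```

In the DEA stage, such pressures enter as inputs and value added as the output, so that each farm is scored against the best-practice frontier rather than ratio by ratio.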
Procedia PDF Downloads 193
7076 Numerical and Simulation Analysis of Composite Friction Materials Using Single Plate Clutch Pad in Agricultural Tractors
Authors: Ravindra Raju, Vidhu Kampurath
Abstract:
For a smooth transition of power from the engine to the transmission system, a clutch is used. In agricultural tractors, friction clutches are widely used in power transmission applications. To transmit the maximum torque in friction clutches, the selection of materials is one of the important tasks. The materials presently used for friction discs are asbestos, ceramics, etc. In this study, the analysis is performed using composite materials, which are considered due to their high strength-to-weight ratio. Composite materials such as Kevlar 49 and Kevlar 29U were used in the study. The paper presents a systematic approach to optimizing the structural and thermal characteristics of the clutch friction pad. A single-plate clutch is modeled using Creo 2.0 software and analyzed using ANSYS. The thermal analysis considers the reduction of the heat generated between the friction surfaces and of the temperature rise during the steady-state period. The structural analysis is done to minimize the stresses developed as a result of the loading contact between the friction surfaces. A modal analysis is also done to optimize the natural frequency of the friction plate so that it avoids resonance with the engine frequency range. The analysis was carried out in ANSYS Workbench to identify the most appropriate friction material for the clutch. From the results, the stress, strain/total deformation values and natural frequencies were compared for all the composite materials and the best one was selected. For the purposes of the study, the specifications of the clutch were obtained from the MF1035 (47 kW) tractor model.
Keywords: ANSYS, clutch, composite materials, Creo
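For background on the torque-transmission requirement the friction material must meet, the classical uniform-wear hand calculation for a plate clutch can be sketched as follows. The coefficient of friction, axial force, and radii below are hypothetical illustration values, not parameters taken from the MF1035 study.

```python
def clutch_torque_uniform_wear(mu, axial_force_N, r_outer_m, r_inner_m, n_surfaces=2):
    """Torque capacity (N·m) of a friction clutch under the uniform-wear assumption:
    T = n * mu * W * (Ro + Ri) / 2, i.e. friction force times the mean radius,
    summed over the n friction surfaces (a single-plate clutch has two)."""
    mean_radius = (r_outer_m + r_inner_m) / 2
    return n_surfaces * mu * axial_force_N * mean_radius

# Hypothetical single-plate clutch: mu = 0.3, W = 4 kN, Ro = 0.15 m, Ri = 0.10 m.
T = clutch_torque_uniform_wear(0.3, 4000, 0.15, 0.10)
print(f"Torque capacity: {T:.1f} N·m")
```

A candidate friction material's coefficient of friction feeds directly into this capacity, which is why material selection is tied to the maximum torque requirement.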
Procedia PDF Downloads 299
7075 Modeling Visual Memorability Assessment with Autoencoders Reveals Characteristics of Memorable Images
Authors: Elham Bagheri, Yalda Mohsenzadeh
Abstract:
Image memorability refers to the phenomenon where certain images are more likely to be remembered by humans than others. It is a quantifiable and intrinsic attribute of an image. Understanding how visual perception and memory interact is important in both cognitive science and artificial intelligence. It reveals the complex processes that support human cognition and helps to improve machine learning algorithms by mimicking the brain's efficient data processing and storage mechanisms. To explore the computational underpinnings of image memorability, this study examines the relationship between an image's reconstruction error, distinctiveness in latent space, and its memorability score. A trained autoencoder is used to replicate human-like memorability assessment inspired by the visual memory game employed in memorability estimations. This study leverages a VGG-based autoencoder that is pre-trained on the vast ImageNet dataset, enabling it to recognize patterns and features that are common to a wide and diverse range of images. An empirical analysis is conducted using the MemCat dataset, which includes 10,000 images from five broad categories: animals, sports, food, landscapes, and vehicles, along with their corresponding memorability scores. The memorability score assigned to each image represents the probability of that image being remembered by participants after a single exposure. The autoencoder is finetuned for one epoch with a batch size of one, attempting to create a scenario similar to human memorability experiments where memorability is quantified by the likelihood of an image being remembered after being seen only once. The reconstruction error, which is quantified as the difference between the original and reconstructed images, serves as a measure of how well the autoencoder has learned to represent the data. 
The reconstruction error of each image, the error reduction, and its distinctiveness in latent space are calculated and correlated with the memorability score. Distinctiveness is measured as the Euclidean distance between each image's latent representation and its nearest neighbor within the autoencoder's latent space. Different structural and perceptual loss functions are considered to quantify the reconstruction error. The results indicate a strong correlation between the reconstruction error and distinctiveness of images and their memorability scores. This suggests that images with more unique, distinctive features, which challenge the autoencoder's compressive capacities, are inherently more memorable. There is also a negative correlation between memorability and the reduction in reconstruction error relative to the autoencoder pre-trained on ImageNet, which suggests that highly memorable images are harder to reconstruct, probably because they have features that are more difficult for the autoencoder to learn. These insights suggest a new pathway for evaluating image memorability, which could potentially impact industries reliant on visual content and mark a step forward in merging the fields of artificial intelligence and cognitive science. The current research opens avenues for utilizing neural representations as instruments for understanding and predicting visual memory.
Keywords: autoencoder, computational vision, image memorability, image reconstruction, memory retention, reconstruction error, visual perception
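The distinctiveness measure described above, the Euclidean distance from each latent code to its nearest neighbor, correlated with memorability, can be sketched in a simplified form. The latent codes and memorability scores below are toy values for illustration only, not data or results from the study.

```python
import math

def euclidean(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def nearest_neighbor_distance(latents):
    """Distinctiveness of each item: Euclidean distance to its nearest
    neighbor in latent space (larger = more distinct)."""
    out = []
    for i, a in enumerate(latents):
        out.append(min(euclidean(a, b) for j, b in enumerate(latents) if j != i))
    return out

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Toy latent codes and memorability scores (hypothetical, for illustration only).
latents = [(0.0, 0.0), (0.1, 0.0), (5.0, 5.0), (0.0, 0.2)]
memorability = [0.30, 0.32, 0.90, 0.35]

distinct = nearest_neighbor_distance(latents)
print("distinctiveness:", [round(d, 3) for d in distinct])
print("r =", round(pearson(distinct, memorability), 3))
```

In the study itself the latent codes come from the VGG-based autoencoder and the scores from the MemCat dataset; the computation here only mirrors the shape of the analysis.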
Procedia PDF Downloads 91
7074 Classification of ECG Signal Based on Mixture of Linear and Non-Linear Features
Authors: Mohammad Karimi Moridani, Mohammad Abdi Zadeh, Zahra Shahiazar Mazraeh
Abstract:
In recent years, the use of intelligent systems in biomedical engineering has increased dramatically, especially in the diagnosis of various diseases. In addition, due to the relatively simple recording of the electrocardiogram (ECG) signal, this signal is a good tool for showing the function of the heart and the diseases associated with it. The aim of this paper is to design an intelligent system for automatically distinguishing a normal electrocardiogram signal from an abnormal one. Using this diagnostic system, it is possible to identify a person's heart condition in a very short time and with high accuracy. The data used in this article are from the PhysioNet database, made available in 2016 for use by researchers to develop the best method for detecting normal signals from abnormal ones. The data come from both genders, and the recording time varies from several seconds to several minutes. All data are also labeled normal or abnormal. Due to the limited duration of the ECG recordings and the similarity of the signal in some diseases to the normal signal, the heart rate variability (HRV) signal was used. Measuring and analyzing heart rate variability over time to evaluate the activity of the heart and to differentiate different types of heart failure from one another is of interest to experts. In the preprocessing stage, after noise cancellation by an adaptive Kalman filter and extraction of the R wave by the Pan-Tompkins algorithm, R-R intervals were extracted and the HRV signal was generated. In the processing stage, a new idea was presented: in addition to using the statistical characteristics of the signal, a return map was created and nonlinear characteristics of the HRV signal were extracted, owing to the nonlinear nature of the signal. Finally, artificial neural networks, widely used in the field of ECG signal processing, together with the distinctive features were used to classify the normal signals from the abnormal ones.
To evaluate the efficiency of the classifiers proposed in this paper, the area under the ROC curve (AUC) was used. The results of the simulation in the MATLAB environment showed that the AUC of the MLP neural network and of the SVM was 0.893 and 0.947, respectively. Moreover, the results of the proposed algorithm indicated that greater use of nonlinear characteristics in classifying normal versus patient signals yielded better performance. Today, research is aimed at quantitatively analyzing the linear and non-linear, or deterministic and random, nature of the heart rate variability signal, because it has been shown that the extent of these properties can be used to indicate the health status of an individual's heart. The study of the nonlinear behavior and dynamics of the heart's neural control system in the short and long term provides new information on how the cardiovascular system functions and has led to the development of research in this field. Given that the ECG signal contains important information and is one of the common tools used by physicians to diagnose heart disease, and that some information in this signal is hidden from the viewpoint of physicians, the intelligent system proposed in this paper can help physicians diagnose normal and patient individuals with greater speed and accuracy, and can be used as a complementary system in treatment centers.
Keywords: heart rate variability, signal processing, linear and non-linear features, classification methods, ROC curve
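The AUC used above can be computed directly from classifier scores via its Mann-Whitney formulation: the fraction of (abnormal, normal) pairs that the classifier ranks correctly, with ties counted as half. A minimal sketch with hypothetical scores (not the paper's MLP/SVM outputs):

```python
def auc(scores_positive, scores_negative):
    """Area under the ROC curve via the Mann-Whitney statistic:
    fraction of (positive, negative) score pairs ranked correctly,
    with ties contributing 0.5."""
    wins = 0.0
    for p in scores_positive:
        for n in scores_negative:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_positive) * len(scores_negative))

# Hypothetical classifier outputs for abnormal (positive) and normal ECG recordings.
abnormal = [0.9, 0.8, 0.7, 0.55]
normal = [0.6, 0.4, 0.3, 0.2]
print("AUC =", auc(abnormal, normal))
```

An AUC of 0.5 corresponds to random ranking and 1.0 to perfect separation, which is why the paper's 0.947 for the SVM indicates strong discrimination.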
Procedia PDF Downloads 262
7073 A Study of the Planning and Designing of the Built Environment under the Green Transit-Oriented Development
Authors: Wann-Ming Wey
Abstract:
In recent years, the problems of global climate change and natural disasters have drawn public concern and attention to environmental sustainability issues. Alongside environmental planning efforts for the human environment, Transit-Oriented Development (TOD) has been widely used as one of the future solutions for sustainable city development. In order to be more consistent with sustainable urban development, the built environment planning adopted here is based on the concept of Green TOD, which combines TOD and Green Urbanism. The connotation of urban development under Green TOD includes design oriented toward environmental protection, maximum enhancement of resources and of the efficiency of energy use, the use of technology to construct green buildings, and the linking of protected areas, natural ecosystems, and communities. Green TOD not only provides a solution to urban traffic problems but also directs more sustainable and greener consideration toward future urban development planning and design. In this study, we use both the TOD and Green Urbanism concepts to study built environment planning and design. The Fuzzy Delphi Technique (FDT) is utilized to screen suitable criteria for Green TOD. Furthermore, the Fuzzy Analytic Network Process (FANP) and Quality Function Deployment (QFD) were then applied to evaluate the criteria and prioritize the alternatives. The study results can be regarded as future guidelines for built environment planning and design under Green TOD development in Taiwan.
Keywords: green TOD, built environment, fuzzy delphi technique, quality function deployment, fuzzy analytic network process
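The FDT screening step can be sketched in a simplified form, assuming a common variant in which each criterion's expert ratings form a triangular fuzzy number (minimum, geometric mean, maximum) that is defuzzified by simple averaging. The criteria names, ratings, and threshold below are hypothetical, not taken from the study.

```python
def fuzzy_delphi_screen(ratings_by_criterion, threshold=0.6):
    """Fuzzy Delphi screening sketch: the expert ratings (0-1) for each criterion
    form a triangular fuzzy number (min, geometric mean, max); defuzzify by the
    simple average of the three values and keep criteria meeting the threshold."""
    result = {}
    for criterion, ratings in ratings_by_criterion.items():
        low, high = min(ratings), max(ratings)
        product = 1.0
        for r in ratings:
            product *= r
        mid = product ** (1.0 / len(ratings))  # geometric mean of the ratings
        score = (low + mid + high) / 3.0       # simple-average defuzzification
        if score >= threshold:
            result[criterion] = round(score, 3)
    return result

# Hypothetical expert ratings for candidate green-TOD criteria.
ratings = {
    "mixed_land_use": [0.7, 0.8, 0.9],
    "transit_access": [0.8, 0.9, 0.85],
    "parking_supply": [0.3, 0.4, 0.5],
}
screened = fuzzy_delphi_screen(ratings)
print(screened)
```

Criteria surviving this screen would then be weighted against one another in the FANP/QFD stages.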
Procedia PDF Downloads 384
7072 Humanizing Industrial Architecture: When Form Meets Function and Emotion
Authors: Sahar Majed Asad
Abstract:
Industrial structures have historically focused on functionality and efficiency, often disregarding aesthetics and human experience. However, a new approach is emerging that prioritizes humanizing industrial architecture and creating spaces that promote well-being, sustainability, and social responsibility. This study explores the motivations and design strategies behind this shift towards more human-centered industrial environments, providing practical guidance for architects, designers, and other stakeholders interested in incorporating these principles into their work. Through in-depth interviews with architects, designers, and industry experts, as well as a review of relevant literature, this study uncovers the reasons for this change in industrial design. The findings reveal that this shift is driven by a desire to create environments that prioritize the needs and experiences of the people who use them. The study identifies strategies such as incorporating natural elements, flexible design, and advanced technologies as crucial in achieving human-centric industrial design. It also emphasizes that effective communication and collaboration among stakeholders are crucial for successful human-centered design outcomes. This paper provides a comprehensive analysis of the motivations and design strategies behind the humanization of industrial architecture. It begins by examining the history of industrial architecture and highlights the focus on functionality and efficiency. The paper then explores the emergence of human-centered design principles in industrial architecture, discussing the benefits of this approach, including creating more sustainable and socially responsible environments. The paper explains specific design strategies that prioritize the human experience of industrial spaces. It outlines how incorporating natural elements like greenery and natural lighting can create more visually appealing and comfortable environments for industrial workers.
Flexible design solutions, such as movable walls and modular furniture, can make spaces more adaptable to changing needs and promote a sense of ownership and creativity among workers. Advanced technologies, such as sensors and automation, can improve the efficiency and safety of industrial spaces while also enhancing the human experience. To provide practical guidance, the paper offers recommendations for incorporating human-centered design principles into industrial structures. It emphasizes the importance of understanding the needs and experiences of the people who use these spaces and provides specific examples of how natural elements, flexible design, and advanced technologies can be incorporated into industrial structures to promote human well-being. In conclusion, this study demonstrates that the humanization of industrial architecture is a growing trend that offers tremendous potential for creating more sustainable and socially responsible built environments. By prioritizing the human experience of industrial spaces, designers can create environments that promote well-being, sustainability, and social responsibility. This research study provides practical guidance for architects, designers, and other stakeholders interested in incorporating human-centered design principles into their work, demonstrating that a human-centered approach can lead to functional and aesthetically pleasing industrial spaces that promote human well-being and contribute to a better future for all.
Keywords: human-centered design, industrial architecture, sustainability, social responsibility
Procedia PDF Downloads 161
7071 Study on the Efficiency of Some Antioxidants on Reduction of Maillard Reaction in Low Lactose Milk
Authors: Farnaz Alaeimoghadam, Farzad Alaeimoghadam
Abstract:
In low-lactose milk, due to lactose hydrolysis and its conversion to monosaccharides such as glucose and galactose, the Maillard reaction (non-enzymatic browning) occurs more readily than in non-hydrolyzed milk. This reaction imparts off-flavors and a dark color, as well as a decrease in the nutritional value of the milk. The target of this research was to evaluate the effect of natural antioxidants in diminishing browning in low-lactose milk. Three antioxidants, namely ascorbic acid, gallic acid, and pantothenic acid, in the concentration range of 0-1 mM/L, either in combination with each other or separately, were added to low-lactose milk. After heat treatment (120 °C for 3 min), the milk samples were incubated at 55 °C for one day and then stored at 4 °C for 9 days. Quality indices, including total phenol content, antioxidant activity, color indices, and sensory characteristics, were measured at intervals of 0, 2, 5, 7, and 9 days. The results of this research showed that the effects of storage time and antioxidant addition were significant for pH, antioxidant activity, total phenolic compounds (both before and after heating), the L* index, color change, and sensory characteristics (p < 0.05); however, acidity, the a* and b* indices, chroma, and hue angle showed no significant changes (p > 0.05). The findings showed that the simultaneous application of gallic acid and ascorbic acid was better than that of pantothenic acid in diminishing non-enzymatic browning and color change, increasing pH, shelf life, and antioxidant activity after heat treatment, and augmenting phenolic compounds before heat treatment.
Keywords: Maillard, low-lactose milk, non-enzymatic browning, natural antioxidant
Procedia PDF Downloads 138
7070 Potential Applications of Biosurfactants from Corn Steep Liquor in Cosmetic
Authors: J. M. Cruz, X. Vecino, L. Rodríguez-López, J. M. Dominguez, A. B. Moldes
Abstract:
The cosmetic and personal care industries are the fields where biosurfactants could have the greatest possibilities of success, because in these kinds of products the replacement of synthetic detergents by natural surfactants provides additional added value to the product, while at the same time the harmful effects produced by some synthetic surfactants can be avoided or reduced. Nowadays, consumers are therefore disposed to pay an additional cost if they obtain more natural products. In this work, we provide data about the potential of biosurfactants in the cosmetic and personal care industry. Biosurfactants from corn steep liquor, which is a fermented and condensed stream, have shown good surface-active properties, substantially reducing the surface tension of water. The bacteria that usually grow in corn steep liquor comprise Lactobacillus species, generally recognized as safe. The biosurfactant extracted from CSL consists of a lipopeptide, composed of fatty acids, which can reduce the surface tension of water by more than 30 units. It is a yellow and viscous liquid with a density of 1.053 mg/mL and pH = 4. Owing to these properties, it could be introduced into the formulation of cosmetic creams, hair conditioners, or shampoos. Moreover, this biosurfactant extracted from corn steep liquor has shown a potent antimicrobial effect on different strains of Streptococcus. Some species of Streptococcus are commonly found living in the human respiratory and genitourinary systems, producing several diseases in humans, including skin diseases. For instance, Streptococcus pyogenes produces many toxins and enzymes that help to establish skin infections; biosurfactants from corn steep liquor can probably inhibit the mechanisms of the S. pyogenes enzymes. S. pyogenes is an important cause of pharyngitis, impetigo, cellulitis and necrotizing fasciitis.
In this work, it was observed that 50 mg/L of the biosurfactant extract obtained from corn steep liquor is able to inhibit the growth of S. pyogenes by more than 50%. Thus, cosmetic and personal care products formulated with biosurfactants from corn steep liquor could have prebiotic properties. The natural biosurfactant presented in this work, obtained from corn milling industry streams, has shown high potential to provide an interesting and sustainable alternative to the chemically synthesized antibacterial and surfactant ingredients used in cosmetic and personal care manufacturing, which can cause irritation and often show only short-term effects.
Keywords: antimicrobial activity, biosurfactants, cosmetic, personal care
Procedia PDF Downloads 257
7069 Lithuanian Sign Language Literature: Metaphors at the Phonological Level
Authors: Anželika Teresė
Abstract:
In order to solve issues in sign language linguistics, address matters pertaining to maintaining the high quality of sign language (SL) translation, contribute to dispelling misconceptions about SL and deaf people, and raise awareness and understanding of the deaf community heritage, this presentation discusses literature in Lithuanian Sign Language (LSL) and the metaphors inherent in it that are created by using the phonological parameters: handshape, location, movement, palm orientation, and nonmanual features. The study covered in this presentation is twofold, involving both micro-level analysis of metaphors in terms of phonological parameters as a sub-lexical feature and macro-level analysis of the poetic context. Cognitive theories underlie research on metaphors in sign language literature across a range of SLs, and this study follows that practice. The presentation covers the qualitative analysis of 34 pieces of LSL literature. The analysis employs the ELAN software widely used in SL research. The target is to examine how specific types of each phonological parameter are used for the creation of metaphors in LSL literature and what metaphors are created. The results of the study show that LSL literature employs a range of metaphors created by using classifier signs and by modifying established signs. The study also reveals that LSL literature tends to create reference metaphors indicating status and power. As the study shows, LSL poets metaphorically encode status by encoding another meaning in the same sign, which results in double metaphors. A metaphor of identity has also been determined. Notably, the poetic context has revealed that the latter metaphor can also be identified as a metaphor for life. The study goes on to note that deaf poets create metaphors related to the significance of various phenomena for the lyrical subject.
Notably, the study has also detected locations, nonmanual features, and other elements never mentioned in previous SL research as being used for the creation of metaphors.
Keywords: Lithuanian sign language, sign language literature, sign language metaphor, metaphor at the phonological level, cognitive linguistics
Procedia PDF Downloads 136
7068 Antibacterial and Anti-Biofilm Activity of Vaccinium meridionale S. Pomace Extract Against Staphylococcus aureus, Escherichia coli and Salmonella enterica
Authors: Carlos Y. Soto, Camila A. Lota, G. Astrid Garzón
Abstract:
Bacterial biofilms pose an ongoing problem for food safety. They are formed when microorganisms aggregate into a community that attaches to solid surfaces. Biofilms increase the resistance of pathogens to cleaning, disinfection, and antibacterial products. This resistance gives rise to problems for human health, industry, and agriculture. At present, plant extracts rich in polyphenolics are being investigated as natural alternatives for degrading bacterial biofilms. The pomace of the tropical berry Vaccinium meridionale S. contains high amounts of phenolic compounds. Therefore, in the current study, the antimicrobial and anti-biofilm effects of extracts from the pomace of Vaccinium meridionale S. were tested on three foodborne pathogens: enterohaemorrhagic Escherichia coli O157:H7 (ATCC® 700728™), Staphylococcus aureus subsp. aureus (ATCC® 6538™), and Salmonella enterica serovar Enteritidis (ATCC® 13076™). Microwave-assisted extraction was used to extract polyphenols with aqueous methanol (80% v/v) at a solid-to-solvent ratio of 1:10 (w/v) for 20 min. The magnetic stirring was set at 400 rpm, and the microwave power was adjusted to 400 W. The antimicrobial effect of the extract was assessed by determining the half-maximal inhibitory concentration (IC50) against the three food-poisoning pathogens at concentrations ranging from 50 to 2,850 μg gallic acid equivalents (GAE)/mL of the extract. Biofilm inhibition was assessed using a crystal violet assay applying the same range of concentrations. Three replications of the experiments were carried out, and all analyses were run in triplicate. IC50 values were determined using the GraphPad Prism 8® program. Significant differences (P < 0.05) among means were identified using one-factor analysis of variance (ANOVA) and the post-hoc least significant difference (LSD) test using the Statgraphics Plus program, version 2.1. There was a significant difference among the mean IC50 values for the tested bacteria. The IC50 for S. aureus was 48 ± 9 μg GAE/mL, followed by 123 ± 49 μg GAE/mL for Salmonella and 376 ± 32 μg GAE/mL for E. coli. The percent inhibition of biofilm formation by the extract was significantly higher for S. aureus (85.8 ± 0.3), followed by E. coli (74.5 ± 1.0) and Salmonella (53.6 ± 9.7). These findings suggest that polyphenolic extracts obtained from the pomace of V. meridionale S. might be used as natural antimicrobial and anti-biofilm agents, effective against S. aureus, E. coli and Salmonella enterica.
Keywords: antibiofilm, antimicrobial, E. coli, S. aureus, salmonella, IC50, pomace, V. meridionale
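The study determined IC50 values with GraphPad Prism; a much simpler way to estimate an IC50 from a few dose-response points is log-linear interpolation between the two concentrations bracketing 50% inhibition. The concentrations and inhibition percentages below are hypothetical illustration values, not the paper's data.

```python
import math

def ic50_log_interp(concentrations, inhibition_pct):
    """Estimate IC50 by linear interpolation of % inhibition against
    log10(concentration), between the two points bracketing 50%."""
    pts = sorted(zip(concentrations, inhibition_pct))
    for (c1, y1), (c2, y2) in zip(pts, pts[1:]):
        if y1 <= 50.0 <= y2:
            frac = (50.0 - y1) / (y2 - y1)
            log_ic50 = math.log10(c1) + frac * (math.log10(c2) - math.log10(c1))
            return 10 ** log_ic50
    raise ValueError("50% inhibition not bracketed by the data")

# Hypothetical dose-response points (concentration in ug GAE/mL, % inhibition).
conc = [10, 100, 1000]
inhib = [20.0, 80.0, 95.0]
print(f"IC50 ~ {ic50_log_interp(conc, inhib):.1f} ug GAE/mL")
```

A full sigmoidal (four-parameter logistic) fit, as Prism performs, is more robust when more dose points are available; the interpolation above only illustrates the quantity being estimated.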
Procedia PDF Downloads 63
7067 Sustainability of Ecotourism Related Activities in the Town of Yercaud: A Modeling Study
Authors: Manoj Gupta Charan Pushparaj
Abstract:
Tourism-related activities are becoming more popular by the day, and tourism has become an integral part of everyone's life. Ecotourism initiatives have grown enormously in the past decade, and the concept of ecotourism has been shown to bring great benefits in terms of environmental conservation and improving the livelihood of local people. However, the potential of ecotourism to keep improving the livelihood of the local population into the remote future is a topic of active debate. A primary challenge in this regard is the enormous cost of limiting the impacts of tourism-related activities on the environment. Here, we employed a systems modeling approach using computer simulations to determine whether ecotourism activities in the small hill town of Yercaud (Tamil Nadu, India) can be sustained over the years in improving the livelihood of the local population. Increasing damage to the natural environment as a result of tourism-related activities has plagued the pristine hill station of Yercaud. Though ecotourism efforts can help conserve the environment and enrich the local population, questions remain as to whether this can be sustained in the distant future. The vital state variables in the model are the existing tourism foundation (labor, services available to tourists, etc.) in the town of Yercaud and its natural environment (water, flora and fauna). Another state variable is the textile industry that drives the local economy. Our results would help to understand whether environmental conservation efforts in Yercaud are sustainable and would also offer suggestions for making them sustainable over the course of several years.
Keywords: ecotourism, simulations, modeling, Yercaud
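The stock-and-flow character of such a systems model can be illustrated with a toy two-stock simulation, an environment-quality stock drained by tourist volume and restored by conservation, coupled to a tourist-volume stock that grows with environment quality. All parameter values and functional forms below are hypothetical and are not the paper's Yercaud model.

```python
def simulate(years=30, dt=1.0):
    """Toy stock-and-flow sketch (hypothetical parameters, not the paper's model):
    environment quality degrades with tourist volume and recovers via conservation,
    while tourist volume grows logistically in proportion to environment quality."""
    env = 1.0        # environment quality, normalized to (0, 1]
    tourists = 0.1   # relative tourist volume, normalized to (0, 1)
    history = []
    for _ in range(int(years / dt)):
        degradation = 0.05 * tourists * env       # outflow from the environment stock
        recovery = 0.03 * (1.0 - env)             # conservation-driven inflow
        growth = 0.10 * env * tourists * (1.0 - tourists)  # tourist inflow
        env += dt * (recovery - degradation)
        tourists += dt * growth
        history.append((round(env, 3), round(tourists, 3)))
    return history

traj = simulate()
print("final (env, tourists):", traj[-1])
```

Euler stepping like this is the standard numerical core of system-dynamics tools; the research question then becomes whether the environment stock settles at an acceptable level once tourism saturates.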
Procedia PDF Downloads 275
7066 Dido: An Automatic Code Generation and Optimization Framework for Stencil Computations on Distributed Memory Architectures
Authors: Mariem Saied, Jens Gustedt, Gilles Muller
Abstract:
We present Dido, a source-to-source auto-generation and optimization framework for multi-dimensional stencil computations. It enables a large programmer community to easily and safely implement stencil codes on distributed-memory parallel architectures with Ordered Read-Write Locks (ORWL) as an execution and communication back-end. ORWL provides inter-task synchronization for data-oriented parallel and distributed computations. It has been proven to guarantee equity, liveness, and efficiency for a wide range of applications, particularly for iterative computations. Dido consists mainly of an implicitly parallel domain-specific language (DSL) implemented as a source-level transformer. It captures domain semantics at a high level of abstraction and generates parallel stencil code that leverages all ORWL features. The generated code is well-structured and lends itself to different possible optimizations. In this paper, we enhance Dido to handle both Jacobi and Gauss-Seidel grid traversals. We integrate temporal blocking into the Dido code generator in order to reduce the communication overhead and minimize data transfers. To increase data locality and improve intra-node data reuse, we couple the code generation technique with the polyhedral parallelizer Pluto. The accuracy and portability of the generated code are guaranteed thanks to a parametrized solution. The combination of ORWL features, the code generation pattern, and the suggested optimizations makes Dido a powerful code generation framework for stencil computations in general, and for distributed-memory architectures in particular. We present a wide range of experiments over a number of stencil benchmarks.
Keywords: stencil computations, ordered read-write locks, domain-specific language, polyhedral model, experiments
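As a minimal serial reference for the Jacobi traversal that Dido targets (this is not Dido-generated code, which would add ORWL synchronization, distribution, and blocking), each sweep updates every interior point from the previous iterate only, in contrast to Gauss-Seidel, which reads values already updated in the current sweep:

```python
def jacobi_sweeps(grid, sweeps):
    """Plain Jacobi sweeps on a 1D grid with fixed boundary values:
    each interior point becomes the average of its two neighbors,
    reading only from the previous iteration's copy of the grid."""
    cur = list(grid)
    for _ in range(sweeps):
        nxt = list(cur)  # Jacobi: write to a fresh copy, never to `cur`
        for i in range(1, len(cur) - 1):
            nxt[i] = 0.5 * (cur[i - 1] + cur[i + 1])
        cur = nxt
    return cur

# Boundary values 0 and 1; the interior converges to the linear profile.
result = jacobi_sweeps([0.0, 0.0, 0.0, 0.0, 1.0], sweeps=200)
print([round(x, 3) for x in result])
```

Because each sweep depends only on the previous iterate, Jacobi updates are embarrassingly parallel within a time step; temporal blocking then fuses several sweeps per communication round, which is the overhead reduction the paper pursues.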
Procedia PDF Downloads 127
7065 iPSCs More Effectively Differentiate into Neurons on PLA Scaffolds with High Adhesive Properties for Primary Neuronal Cells
Authors: Azieva A. M., Yastremsky E. V., Kirillova D. A., Patsaev T. D., Sharikov R. V., Kamyshinsky R. A., Lukanina K. I., Sharikova N. A., Grigoriev T. E., Vasiliev A. L.
Abstract:
The adhesive properties of scaffolds, which depend predominantly on the chemical and structural features of their surface, play the most important role in tissue engineering. The basic requirements for such scaffolds are biocompatibility, biodegradability, and high cell adhesion, which promotes cell proliferation and differentiation. In many cases, synthetic polymer scaffolds have proven advantageous because they are easy to shape, they are tough, and they have high tensile properties. The regeneration of nerve tissue still remains a big challenge for medicine, and neural stem cells provide promising therapeutic potential for cell replacement therapy. However, experiments with stem cells have their limitations, such as low cell viability and poor control of cell differentiation, whereas the study of already differentiated neuronal cell cultures obtained from newborn mouse brain is limited mainly to cell adhesion. The growth and implantation of neuronal cultures require proper scaffolds; moreover, polymer scaffold implants with neuronal cells may demand a specific morphology. To date, numerous synthetic polymers have been proposed for these purposes, including polystyrene, polylactic acid (PLA), polyglycolic acid, and polylactide-glycolic acid. Tissue regeneration experiments have demonstrated good biocompatibility of PLA scaffolds, despite the hydrophobic nature of the compound. The problem of poor wettability of the PLA scaffold surface can be overcome in several ways: the surface can be pre-treated with poly-D-lysine or polyethyleneimine peptides; the roughness and hydrophilicity of the PLA surface can be increased by plasma treatment; or PLA can be combined with natural fibers, such as collagen or chitosan.
This work presents a study of the adhesion of both induced pluripotent stem cells (iPSCs) and mouse primary neuronal cell cultures on polylactide scaffolds of various types: oriented and non-oriented fibrous nonwoven materials and sponges, with and without plasma treatment, as well as composites with collagen and chitosan. To evaluate the effect of the different types of PLA scaffolds on the neuronal differentiation of iPSCs, we assess the expression of NeuN in differentiated cells through immunostaining. iPSCs more effectively differentiate into neurons on PLA scaffolds with high adhesive properties for primary neuronal cells.
Keywords: PLA scaffold, neurons, neuronal differentiation, stem cells, polylactide
Procedia PDF Downloads 85
7064 A Theoretical Study on Pain Assessment through Human Facial Expression
Authors: Mrinal Kanti Bhowmik, Debanjana Debnath Jr., Debotosh Bhattacharjee
Abstract:
Facial expression is an integral part of human behavior. It is a significant channel for human communication and can be used to extract emotional features accurately. People in pain often show changes in facial expression that are readily observable to others, and a core set of facial actions is likely to occur, or to increase in intensity, when people are in pain. To describe such changes in facial appearance, the Facial Action Coding System (FACS) was pioneered by Ekman and Friesen for human observers. According to Prkachin and Solomon, a subset of these actions carries the bulk of the information about pain; on this basis, the Prkachin and Solomon pain intensity (PSPI) metric is defined. Facial expressions, as a behavioral source in communication, thus provide an important window into non-verbal communication of pain. People express their pain in many ways, and this pain behavior is the basis on which most inferences about pain are drawn in clinical and research settings. Hence, to understand the roles of different pain behaviors, it is essential to study their properties. For the past several years, studies have concentrated on the properties of one specific form of pain behavior, i.e., facial expression. This paper presents a comprehensive study of pain assessment methods that can model and estimate the intensity of pain a patient is suffering. It also reviews the historical background of different pain assessment techniques in the context of painful expressions. Different approaches incorporate FACS from a psychological viewpoint and a pain intensity score using the PSPI metric in pain estimation. The paper provides an in-depth analysis of the different approaches used in pain estimation and presents the observations drawn from each technique. It also offers a brief study of the features distinguishing real from fake pain.
Therefore, the relevance of this study lies in the emerging field of facial pain assessment in clinical settings. Keywords: facial action coding system (FACS), pain, pain behavior, Prkachin and Solomon pain intensity (PSPI)
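As a concrete illustration of the PSPI metric discussed above, the following sketch implements its commonly cited formulation, PSPI = AU4 + max(AU6, AU7) + max(AU9, AU10) + AU43, where the action-unit (AU) intensities are coded on a 0-5 scale and AU43 (eye closure) is binary. The specific action units and scales follow the standard Prkachin and Solomon definition, not details given in this abstract.

```python
def pspi(au4, au6, au7, au9, au10, au43):
    """Prkachin-Solomon pain intensity from FACS action-unit intensities.

    au4:  brow lowering (0-5)
    au6:  cheek raising (0-5), au7: lid tightening (0-5)
    au9:  nose wrinkling (0-5), au10: upper-lip raising (0-5)
    au43: eye closure (0 or 1)
    Returns a score in the range 0-16.
    """
    return au4 + max(au6, au7) + max(au9, au10) + au43

# Example frame: moderate brow lowering, strong lid tightening,
# slight upper-lip raise, eyes closed.
print(pspi(au4=3, au6=2, au7=4, au9=1, au10=2, au43=1))  # -> 10
```

A per-frame score like this is what sequence-level pain estimators are typically trained to regress.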
Procedia PDF Downloads 346
7063 Study on the Impact of Window Location on Occupancy Thermal Comfort by Computational Fluid Dynamics (CFD) Simulation
Authors: Farhan E Shafrin, Khandaker Shabbir Ahmed
Abstract:
Natural ventilation strategies continue to be a key alternative to costly mechanical ventilation systems, especially in healthcare facilities, due to increasing energy problems in developing countries, including Bangladesh. Moreover, overcrowding and insufficient ventilation remain significant causes of thermal discomfort and hospital infection in Bangladesh. With proper location of inlet and outlet windows, uniform airflow through the occupied zone can be achieved, supporting thermal comfort; window location also determines the airflow pattern of the ward, which governs the movement of contaminated air. This paper aims to establish a relationship between the location of windows and the thermal comfort of occupants in a naturally ventilated hospital ward. It identifies the opening and ventilation variables that interact to enhance or limit the health and thermal comfort of occupants. The study conducts a full-scale experiment in one of the naturally ventilated wards of a primary health care hospital in Manikganj, Dhaka. CFD simulation is used to explore the performance of various opening positions in terms of ventilation efficiency and thermal comfort in the study area. The results indicate that the location of openings in the hospital ward has a significant impact on the thermal comfort of the occupants and the airflow pattern inside the ward. The findings can contribute to the design of naturally ventilated hospital wards by identifying and predicting solutions related to occupants' thermal comfort. Keywords: CFD simulation, hospital ward, natural ventilation, thermal comfort, window location
Procedia PDF Downloads 197
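To give a sense of the bulk-airflow quantities that a CFD study such as the one above resolves in spatial detail, here is a minimal sketch of the standard wind-driven cross-ventilation estimate, Q = Cd x A x U, converted to air changes per hour. The discharge coefficient, opening area, wind speed, and ward volume below are illustrative assumptions, not values from the paper.

```python
def cross_vent_flow(cd, area_m2, wind_speed_ms):
    """Volumetric airflow Q = Cd * A * U through an inlet opening (m^3/s).

    cd: discharge coefficient of the opening (~0.6 for a sharp-edged orifice)
    area_m2: free opening area in m^2
    wind_speed_ms: reference wind speed at the opening in m/s
    """
    return cd * area_m2 * wind_speed_ms

def air_changes_per_hour(q_m3s, room_volume_m3):
    """Convert a volumetric flow rate to air changes per hour (ACH)."""
    return q_m3s * 3600.0 / room_volume_m3

# Assumed ward: 1.5 m^2 inlet, 1 m/s wind, 180 m^3 volume.
q = cross_vent_flow(cd=0.6, area_m2=1.5, wind_speed_ms=1.0)  # 0.9 m^3/s
print(air_changes_per_hour(q, room_volume_m3=180.0))         # -> 18.0 ACH
```

Such a lumped estimate only gives the ward-average exchange rate; predicting where stagnant or contaminated zones form around beds requires the CFD approach the paper employs.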