Search results for: interior point methods
18046 Simulation Studies of High-Intensity, Nanosecond Pulsed Electric Fields Induced Dynamic Membrane Electroporation
Authors: Jiahui Song
Abstract:
The application of an electric field can cause poration at cell membranes. This includes the outer plasma membrane, as well as the membranes of intracellular organelles. In order to analyze and predict such electroporation effects, it becomes necessary to first evaluate the electric fields and the transmembrane voltages. This information can then be used to assess changes in the pore formation energy that finally yields the pore distributions and their radii based on the Smoluchowski equation. The dynamic pore model is obtained by including a dynamic aspect and a dependence on the pore population density in the pore formation energy equation. These changes make the pore formation energy E(r) self-adjusting in response to pore formation without causing uncontrolled growth and expansion. By using dynamic membrane tension, membrane electroporation in response to a 180 kV/cm trapezoidal pulse with a 10 ns on time and 1.5 ns rise- and fall-times is discussed. Poration is predicted to occur at times beyond the peak, at around 9.2 ns. Modeling also yields time-dependent distributions of the membrane pore population after multiple pulses. It shows that the pore distribution shifts to larger values of the radius with multiple pulsing. Molecular dynamics (MD) simulations are also carried out for a fixed field of 0.5 V/nm to demonstrate nanopore formation from a microscopic point of view. The result shows that the pore is predicted to be about 0.9 nm in diameter and somewhat narrower at the central point. Keywords: high-intensity, nanosecond, dynamics, electroporation
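A minimal numerical sketch of the kind of pore-creation dynamics referred to above is given below. It uses a common asymptotic reduction of the Smoluchowski description (a single ODE for the pore density driven by the transmembrane voltage); the parameter values, the toy transmembrane voltage waveform, and the explicit Euler integration are illustrative assumptions, not the model or the values used in the paper.

```python
import numpy as np

# Illustrative asymptotic pore-creation model (a common reduction of the
# Smoluchowski description); the values below are placeholders, not the paper's.
alpha = 1e9        # creation rate coefficient (1/(m^2 s))
V_ep = 0.258       # characteristic electroporation voltage (V)
N0 = 1.5e9         # equilibrium pore density at Vm = 0 (1/m^2)
q = 2.46           # pore-creation constant (dimensionless)

def dNdt(N, Vm):
    """Rate of change of pore density for transmembrane voltage Vm."""
    k = (Vm / V_ep) ** 2
    return alpha * np.exp(k) * (1.0 - (N / N0) * np.exp(-q * k))

# Explicit Euler integration over a toy 10 ns trapezoidal pulse (1.5 ns rise/fall)
dt = 1e-12                       # 1 ps time step
t = np.arange(0.0, 12e-9, dt)
Vm = np.interp(t, [0, 1.5e-9, 10e-9, 11.5e-9, 12e-9], [0, 1.0, 1.0, 0, 0])  # toy Vm(t) in volts
N = np.zeros_like(t)
for i in range(1, len(t)):
    N[i] = N[i - 1] + dt * dNdt(N[i - 1], Vm[i - 1])

print(f"peak pore density ~ {N.max():.3e} pores/m^2")
```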
Procedia PDF Downloads 159
18045 Exploration of Correlation between Design Principles and Elements with the Visual Aesthetic in Residential Interiors
Authors: Ikra Khan, Reenu Singh
Abstract:
Composition is essential when designing the interiors of residential spaces. The ability to adopt a unique style of using design principles and design elements is another. This research report explores how the visual aesthetic within a space is achieved through the use of design principles and design elements while maintaining a signature style. It also observes the relationship between design styles and compositions that are achieved as a result of the implementation of the principles. Information collected from books and the internet helped to understand how a composition can be achieved in residential interiors by resorting to design principles and design elements as tools for achieving an aesthetic composition. It also helped determine the results of authentic representation of design ideas and how they make one’s work exceptional. A questionnaire survey was also conducted to understand the impact of a visually aesthetic residential interior of a signature style on the lifestyle of individuals residing in them. The findings denote a pattern in the application of design principles and design elements. Individual principles and elements or a combination of the same are used to achieve an aesthetically pleasing composition. This was supported by creating CAD illustrations of two different residential projects with varying approaches and design styles. These illustrations include mood boards, 3D models, and sectional elevations as rendered views to understand the concept design and its translation via these mediums. A direct relation is observed between the application of design principles and design elements to achieve visually aesthetic residential interiors that suit an individual’s taste. These practices can be applied when designing bespoke commercial as well as industrial interiors that are suited to specific aesthetic and functional needs.Keywords: composition, design principles, elements, interiors, residential spaces
Procedia PDF Downloads 103
18044 Exploring the Challenges to Usage of Building Construction Cost Indices in Ghana
Authors: Jerry Gyimah, Ernest Kissi, Safowaa Osei-Tutu, Charles Dela Adobor, Theophilus Adjei-Kumi, Ernest Osei-Tutu
Abstract:
The price fluctuation contract is imperative and of paramount essence in the construction industry, as it provides adequate relief and cushioning for changes in the prices of input resources during construction. As a result, several methods have been devised to help arrive at fair recompense in the event of price changes. However, stakeholders often appear not to be satisfied with the existing methods of fluctuation evaluation, ostensibly because of the challenges associated with them. The aim of this study was to identify the challenges to the usage of building construction cost indices in Ghana. Data were gathered from contractors and quantity surveying firms. The study utilized a survey questionnaire approach to elicit responses from the contractors and the consultants. The data gathered were analyzed scientifically, using the relative importance index (RII) to rank the problems associated with the existing methods. The findings revealed late release of data, inadequate recovery of costs, and work items of interest not being included in the published indices, among others, as the main challenges of the existing methods. The findings provide useful lessons for policymakers and practitioners in decision making towards the usage and improvement of available indices. Keywords: building construction cost indices, challenges, usage, Ghana
Procedia PDF Downloads 152
18043 Feature Evaluation Based on Random Subspace and Multiple-K Ensemble
Authors: Jaehong Yu, Seoung Bum Kim
Abstract:
Clustering analysis can facilitate the extraction of intrinsic patterns in a dataset and reveal its natural groupings without requiring class information. For effective clustering analysis in high dimensional datasets, unsupervised dimensionality reduction is an important task. Unsupervised dimensionality reduction can generally be achieved by feature extraction or feature selection. In many situations, feature selection methods are more appropriate than feature extraction methods because of their clear interpretation with respect to the original features. The unsupervised feature selection can be categorized as feature subset selection and feature ranking method, and we focused on unsupervised feature ranking methods which evaluate the features based on their importance scores. Recently, several unsupervised feature ranking methods were developed based on ensemble approaches to achieve their higher accuracy and stability. However, most of the ensemble-based feature ranking methods require the true number of clusters. Furthermore, these algorithms evaluate the feature importance depending on the ensemble clustering solution, and they produce undesirable evaluation results if the clustering solutions are inaccurate. To address these limitations, we proposed an ensemble-based feature ranking method with random subspace and multiple-k ensemble (FRRM). The proposed FRRM algorithm evaluates the importance of each feature with the random subspace ensemble, and all evaluation results are combined with the ensemble importance scores. Moreover, FRRM does not require the determination of the true number of clusters in advance through the use of the multiple-k ensemble idea. Experiments on various benchmark datasets were conducted to examine the properties of the proposed FRRM algorithm and to compare its performance with that of existing feature ranking methods. The experimental results demonstrated that the proposed FRRM outperformed the competitors.Keywords: clustering analysis, multiple-k ensemble, random subspace-based feature evaluation, unsupervised feature ranking
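A small sketch of the general random-subspace, multiple-k idea described above is given below; it is not the authors' FRRM algorithm. Random feature subsets are clustered with several candidate values of k, and each feature is credited with how well it separates the resulting clusters, using the ANOVA F statistic as a simple stand-in for the importance score. The dataset, the subspace size, and the scoring criterion are illustrative assumptions.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.datasets import load_iris
from sklearn.feature_selection import f_classif

rng = np.random.default_rng(0)
X, _ = load_iris(return_X_y=True)         # class labels are ignored: unsupervised setting
n_features = X.shape[1]

scores = np.zeros(n_features)
counts = np.zeros(n_features)

# Random-subspace, multiple-k ensemble: cluster random feature subsets with
# several candidate k values and credit each feature with how well it separates
# the resulting clusters (ANOVA F statistic as a simple proxy for importance).
for _ in range(50):
    subspace = rng.choice(n_features, size=2, replace=False)
    k = int(rng.integers(2, 6))            # multiple-k: k drawn from {2, ..., 5}
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X[:, subspace])
    f_vals, _ = f_classif(X[:, subspace], labels)
    scores[subspace] += f_vals
    counts[subspace] += 1

importance = scores / np.maximum(counts, 1)
print("feature ranking (most important first):", np.argsort(importance)[::-1])
```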
Procedia PDF Downloads 339
18042 Geometrical Fluid Model for Blood Rheology and Pulsatile Flow in Stenosed Arteries
Authors: Karan Kamboj, Vikramjeet Singh, Vinod Kumar
Abstract:
Considering blood to be a non-Newtonian Carreau liquid, this mathematical model investigates pulsatile blood flow in a narrow tapered artery with multiple mild stenoses in the presence of body acceleration. Asymptotic solutions are obtained for the flow rate, pressure gradient, velocity profile, wall shear stress, and longitudinal impedance to flow after applying a double perturbation approach to the resulting non-linear boundary value problem. It has been observed that the velocity of the blood increases with an increase in the angle of tapering of the artery, the body acceleration, and the power-law index, whereas the longitudinal impedance to flow and the wall shear stress show the opposite behaviour when each of these parameters increases. It has also been seen that the wall shear stress in the bloodstream greatly increases with an increase in the maximum depth of the stenosis but significantly decreases with an increase in the pulsatile Reynolds number. The estimates of the increase in longitudinal impedance to flow grow overall with the maximum depth of the stenosis and the Weissenberg number. Additionally, it is noted that the mean velocity of blood increases noticeably with the angle of tapering of the artery and with the body acceleration parameter. Keywords: geometry of artery, pulsatile blood flow, multiple stenoses
Procedia PDF Downloads 99
18041 Dialogue Meetings as an Arena for Collaboration and Reflection among Researchers and Practitioners
Authors: Kerstin Grunden, Ann Svensson, Berit Forsman, Christina Karlsson, Ayman Obeid
Abstract:
The research question of the article is to explore whether the dialogue meetings method could be relevant for reflective learning among researchers and practitioners when welfare technology should be implemented in municipalities, or not. A testbed was planned to be implemented in a retirement home in a Swedish municipality, and the practitioners worked with a pre-study of that testbed. In the article, the dialogue between the researchers and the practitioners in the dialogue meetings is described and analyzed. The potential of dialogue meetings as an arena for learning and reflection among researchers and practitioners is discussed. The research methodology approach is participatory action research with mixed methods (dialogue meetings, focus groups, participant observations). The main findings from the dialogue meetings were that the researchers learned more about the use of traditional research methods, and the practitioners learned more about how they could improve their use of the methods to facilitate change processes in their organization. These findings have the potential both for the researchers and the practitioners to result in more relevant use of research methods in change processes in organizations. It is concluded that dialogue meetings could be relevant for reflective learning among researchers and practitioners when welfare technology should be implemented in a health care organization.Keywords: dialogue meetings, implementation, reflection, test bed, welfare technology, participatory action research
Procedia PDF Downloads 146
18040 Mathematical Study for Traffic Flow and Traffic Density in Kigali Roads
Authors: Kayijuka Idrissa
Abstract:
This work presents a mathematical study of traffic flow and traffic density on Kigali city roads, using data collected from the national police of Rwanda in 2012. While working on this topic, some mathematical models were used in order to analyze and compare traffic variables. The work was carried out on Kigali roads, specifically at the roundabouts from the Kigali Business Center (KBC) to Prince House, as the study sites. In this project, we used some mathematical tools to analyze the data collected and to understand the relationship between traffic variables. We applied the Poisson distribution method to analyze the number of accidents that occurred in this section of the road, from KBC to Prince House. The results show that accidents occurred in 2012 at very high rates because this section has a very narrow single lane on each side, which leads to high congestion of vehicles, and consequently accidents occur very frequently. Using the speed and density data collected from this section of road, we found that an increment in density results in a decrement in vehicle speed; at the point where the density equals the jam density, the speed becomes zero. The approach is promising in capturing sudden changes in flow patterns and is open to being utilized in a series of intelligent management strategies, especially in non-recurrent congestion detection and control. Keywords: statistical methods, traffic flow, Poisson distribution, car moving techniques
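To make the two analysis steps mentioned above concrete (a Poisson model for accident counts, and a linear speed-density relation that reaches zero speed at the jam density), here is a small Python sketch. The accident counts and the speed-density observations are invented for illustration; they are not the Kigali data.

```python
import numpy as np
from scipy import stats

# --- Accident counts: fit a Poisson model (illustrative counts, not the Kigali data) ---
monthly_accidents = np.array([4, 7, 5, 9, 6, 8, 5, 7, 10, 6, 8, 7])
lam = monthly_accidents.mean()                      # MLE of the Poisson rate
p_ge_10 = 1 - stats.poisson.cdf(9, mu=lam)          # P(10 or more accidents in a month)
print(f"estimated rate = {lam:.2f} accidents/month, P(>=10) = {p_ge_10:.3f}")

# --- Speed-density relation: linear (Greenshields-type) fit, v = v_f * (1 - k/k_jam) ---
density = np.array([20, 40, 60, 80, 100, 120])      # vehicles/km (illustrative)
speed = np.array([55, 47, 36, 27, 15, 5])           # km/h (illustrative)
slope, intercept, r, _, _ = stats.linregress(density, speed)
v_free = intercept                                   # free-flow speed (density -> 0)
k_jam = -intercept / slope                           # density at which speed reaches zero
print(f"free-flow speed ~ {v_free:.1f} km/h, jam density ~ {k_jam:.0f} veh/km, r = {r:.3f}")
```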
Procedia PDF Downloads 282
18039 A New Model to Perform Preliminary Evaluations of Complex Systems for the Production of Energy for Buildings: Case Study
Authors: Roberto de Lieto Vollaro, Emanuele de Lieto Vollaro, Gianluca Coltrinari
Abstract:
The building sector is responsible, in many industrialized countries, for about 40% of the total energy requirements, so it seems necessary to devote some effort to this area in order to achieve a significant reduction of energy consumption and of greenhouse gas emissions. The paper presents a study aiming at providing a design methodology able to identify the best configuration of the building/plant system from a technical, economic, and environmental point of view. Normally, the classical approach involves an analysis of the building's energy loads under steady-state conditions and a subsequent selection of measures aimed at improving the energy performance, based on previous experience gained by the architects and engineers in the design team. Instead, the proposed approach uses a sequence of two well-known, scientifically validated calculation methods (TRNSYS and RETScreen) that allow quite a detailed feasibility analysis. To assess the validity of the calculation model, an existing historical building in Central Italy, which will be the object of restoration and preservative redevelopment, was selected as a case study. The building is made up of a basement and three floors, with a total floor area of about 3,000 square meters. The first step has been the determination of the heating and cooling energy loads of the building in a dynamic regime by means of TRNSYS, which makes it possible to simulate the real energy needs of the building as a function of its use. Traditional methodologies, based as they are on steady-state conditions, cannot faithfully reproduce the effects of varying climatic conditions and of the inertial properties of the structure. With TRNSYS it is possible to obtain quite accurate and reliable results that allow effective building-HVAC system combinations to be identified. The second step consisted of using the output data obtained with TRNSYS as input to the calculation model RETScreen, which makes it possible to compare different system configurations from the energy, environmental, and financial points of view, with an analysis of investment and operation and maintenance costs, thus allowing the economic benefit of possible interventions to be determined. The classical methodology often leads to the choice of conventional plant systems, while RETScreen provides a financial-economic assessment for innovative energy systems with low environmental impact. Computational analysis can help in the design phase, particularly in the case of complex structures with centralized plant systems, by comparing the data returned by the calculation model RETScreen for different design options. For example, the analysis performed on the building taken as a case study found that the most suitable plant solution, taking into account technical, economic, and environmental aspects, is the one based on a CCHP system (Combined Cooling, Heating, and Power) using an internal combustion engine. Keywords: energy, system, building, cooling, electrical
Procedia PDF Downloads 573
18038 Production of Fish Hydrolyzates by Single and Multiple Protease Treatments under Medium High Pressure of 300 MPa
Authors: Namsoo Kim, So-Hee Son, Jin-Soo Maeng, Yong-Jin Cho, Chong-Tai Kim
Abstract:
It has been reported that some enzymes such as trypsin and Alcalase 2.4L are tolerant to a medium high pressure of 300 MPa, and that preparation of protein hydrolyzates under 300 MPa is advantageous with regard to hydrolysis rate, and thus production yield, compared with the counterpart under ambient pressure (1, 2). In this study, nine species, including halibut, soft shell clam, and carp, were hydrolyzed using Flavourzyme 500MG only, and using the combination of Flavourzyme 500MG, Alcalase 2.4L, Marugoto E, and Protamex, under 300 MPa. Then, the effects of single and multiple protease treatments were determined with respect to the contents of soluble solids (SS) and soluble nitrogen, sensory attributes, electrophoretic profiles, and HPLC peak patterns of the fish hydrolyzates (FHs) from the various species. The SS contents of the FHs were quite species-specific, and the hydrolyzates of halibut showed the highest SS contents. Multiple protease treatment increased the SS content conspicuously in all fish tested. The contents of total soluble nitrogen and TCA-soluble nitrogen were well correlated with those of SS, irrespective of fish species and method of enzyme treatment. Also, it was noticed that multiple protease treatment improved the sensory attributes of the FHs considerably. Electropherograms of the FHs showed fast-migrating peptide bands that had molecular masses mostly lower than 1 kDa, and this was confirmed by the peptide patterns from HPLC analysis for some FHs that had good sensory quality. Keywords: production, fish hydrolyzates, protease treatments, high pressure
Procedia PDF Downloads 283
18037 Post-Structural Study of Gender in Shakespearean Othello from Butlerian Perspective
Authors: Muhammad Shakeel Rehman Hissam
Abstract:
This study aims at analyzing gender in Othello by applying Judith Butler's post-structural theory of gender and gender performance. The analysis of the play provides a context for examining what effects the drama has on the researchers' understanding of gender identity. The study sets out to examine whether there is any evidence in the selected Shakespearean work that challenges the taken-for-granted gender roles prescribed by patriarchy. The focal point in the study of Othello is that the actions and performances of the characters, rather than their sexuality, determine their gender identity. It argues that the gender of Shakespearean characters carries no constant, fixed, or structural impression; on the contrary, they undergo consistent variations in their behavior and performance, which impart fluidity and volatility to them. The theoretical underpinning of the study is Butler's prominent work, Gender Trouble: Feminism and the Subversion of Identity, and her post-structural theory of gender performativity. It analyzes the selected play from a post-structural gender perspective. The gender-centric plot of the play is riddled with fluidity of gender. The most fascinating aspect of the play is the transformation of genders on the basis of the performances of different characters; through these transformations, gender identity is revealed and determined. The study reconstructs accepted gender norms by challenging the traditional concept of gender that is based on the sexual differences of characters. Keywords: post structural, gender, performativity, socio-cultural gender norms, binaries, Othello, Butler, identity
Procedia PDF Downloads 372
18036 Protein Remote Homology Detection and Fold Recognition by Combining Profiles with Kernel Methods
Authors: Bin Liu
Abstract:
Protein remote homology detection and fold recognition are two most important tasks in protein sequence analysis, which is critical for protein structure and function studies. In this study, we combined the profile-based features with various string kernels, and constructed several computational predictors for protein remote homology detection and fold recognition. Experimental results on two widely used benchmark datasets showed that these methods outperformed the competing methods, indicating that these predictors are useful computational tools for protein sequence analysis. By analyzing the discriminative features of the training models, some interesting patterns were discovered, reflecting the characteristics of protein superfamilies and folds, which are important for the researchers who are interested in finding the patterns of protein folds.Keywords: protein remote homology detection, protein fold recognition, profile-based features, Support Vector Machines (SVMs)
Procedia PDF Downloads 161
18035 Vision and Challenges of Developing VR-Based Digital Anatomy Learning Platforms and a Solution Set for 3D Model Marking
Authors: Gizem Kayar, Ramazan Bakir, M. Ilkay Koşar, Ceren U. Gencer, Alperen Ayyildiz
Abstract:
Anatomy classes are crucial to the general education of medical students, yet learning anatomy is quite challenging and requires the memorization of thousands of structures. In traditional teaching methods, learning materials are still based on books, anatomy mannequins, or videos, and many important structures are forgotten after several years. However, more interactive teaching methods like virtual reality, augmented reality, gamification, and motion sensors are becoming more popular, since such methods ease learning and keep the material in mind for longer. During our study, we designed a virtual reality-based digital head anatomy platform to investigate whether a fully interactive anatomy platform is effective for learning anatomy and to understand the level of teaching and learning optimization. The head is one of the most complicated human anatomy structures, with thousands of tiny, unique components. This makes head anatomy one of the most difficult parts to understand during class sessions. Therefore, we developed a fully interactive digital tool with 3D model marking, quiz structures, 2D/3D puzzle structures, and VR support so as to integrate the power of VR and gamification. The project has been developed in the Unity game engine with an HTC Vive Cosmos VR headset. The head anatomy 3D model was selected with full skeletal, muscular, integumentary, head, teeth, lymph, and vein systems. The biggest issue during development was the complexity of our model and the marking of it in the 3D world coordinate system. 3D model marking requires access to each unique structure in the listed subsystems, which means hundreds of markings need to be made. Some parts of our 3D head model were monolithic, which is why we worked on dividing such parts into subparts, a very time-consuming task. In order to subdivide monolithic parts, one must use an external modeling tool. However, such tools generally come with high learning curves, and seamless division is not ensured. The second option was to attach tiny colliders to all unique items for mouse interaction. However, outside colliders which cover inner trigger colliders cause overlapping, and these colliders repel each other. The third option is raycasting. However, due to its view-based nature, raycasting has some inherent problems: as the model rotates, the view direction changes very frequently, and directional computations become even harder. This is why, finally, we worked with the local coordinate system. By taking the pivot point of the model (the back of the nose) into consideration, each sub-structure is marked with its own local coordinate with respect to the pivot. After converting the mouse position to the world position and checking its relation to the corresponding structure's local coordinate, we were able to mark all points correctly. The advantage of this method is its applicability and accuracy for all types of monolithic anatomical structures. Keywords: anatomy, e-learning, virtual reality, 3D model marking
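The pivot-relative marking approach described at the end of this abstract can be sketched independently of Unity. The short Python/NumPy example below shows the core idea of converting a world-space click into the model's pivot-relative local frame and picking the nearest marked structure; the structure names, positions, tolerance, and transform are invented for illustration.

```python
import numpy as np

# Each anatomical structure is stored by its marker position in the model's
# local frame, i.e. relative to the chosen pivot (back of the nose in the paper).
structures = {
    "frontal_bone": np.array([0.00, 0.06, 0.03]),
    "mandible":     np.array([0.00, -0.08, 0.01]),
    "left_orbit":   np.array([-0.03, 0.03, 0.04]),
}

def world_to_local(p_world, pivot_world, rotation):
    """Convert a world-space point into the model's pivot-relative local frame.
    `rotation` is the 3x3 world-from-local rotation of the model."""
    return rotation.T @ (p_world - pivot_world)

def pick_structure(click_world, pivot_world, rotation, max_dist=0.02):
    """Return the nearest marked structure to a world-space click, if close enough."""
    p_local = world_to_local(click_world, pivot_world, rotation)
    name, pos = min(structures.items(), key=lambda kv: np.linalg.norm(kv[1] - p_local))
    return name if np.linalg.norm(pos - p_local) <= max_dist else None

# Example: model translated and rotated 90 degrees about the vertical axis.
theta = np.pi / 2
R = np.array([[np.cos(theta), 0, np.sin(theta)], [0, 1, 0], [-np.sin(theta), 0, np.cos(theta)]])
pivot = np.array([1.0, 1.5, 2.0])
click = pivot + R @ np.array([0.0, 0.061, 0.031])   # a click just beside the frontal bone marker
print(pick_structure(click, pivot, R))               # -> 'frontal_bone'
```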
Procedia PDF Downloads 100
18034 Site Analysis’ Importance as a Valid Factor in Building Design
Authors: Mekwa Eme, Anya chukwuma
Abstract:
The act of evaluating a particular site physically and socially in order to create a good design solution that addresses the physical and interior environment of the location is known as architectural site analysis. This essay describes site analysis as a useful design component. According to the introduction and supporting research, site evaluation and analysis are crucial to good design in terms of topography, orientation, site size, accessibility, rainfall, wind direction, and times of sunrise and sunset. Methodology: Both quantitative and qualitative analyses are used in this paper, drawing on primary and secondary data collection. The information was gathered via the case study approach, already published literature, journals, the internet, a local poll, oral interviews, inquiries, and in-person interviews. The purpose of this is to clarify the benefits of site analysis for the design process and its implications for the working or building stage. Results: Each site's criteria are unique in terms of soil, plants, trees, accessibility, topography, and security. This makes it easier for the architect and environmentalist to decide on the idea, shape, and supporting structures of the design. Site analysis is crucial because, before any design work is done, the nature of the target location is determined through site visits and research. The location, contours, site features, and accessibility are just a few of the topics included in this site study. Site analysis is also a key component of architectural education, enabling students and working architects to understand the nature of the site they will be working on. The building's orientation, the site's circulation, and the sustainability of the site may all be determined with thorough research of the site's features. Keywords: analysis, climate, statistics, design
Procedia PDF Downloads 249
18033 Developing a Viral Artifact to Improve Employees’ Security Behavior
Authors: Stefan Bauer, Josef Frysak
Abstract:
According to the scientific information management literature, the improper use of information technology (e.g. personal computers) by employees is one of the main causes of operational and information security loss events. Therefore, organizations implement information security awareness programs to increase employees' awareness and thereby prevent loss events. However, in many cases these information security awareness programs consist of conventional delivery methods like posters, leaflets, or internal messages to make employees aware of information security policies. We assume that a viral information security awareness video might be a more effective medium than the conventional methods commonly used by organizations. The purpose of this research is to develop a viral video artifact to improve employee security behavior concerning information technology. Keywords: information security awareness, delivery methods, viral videos, employee security behavior
Procedia PDF Downloads 542
18032 Effectiveness of Online Language Learning
Authors: Shazi Shah Jabeen, Ajay Jesse Thomas
Abstract:
The study is aimed at understanding the learning trends of students who opt for online language courses and to assess the effectiveness of the same. Multiple factors including use of the latest available technology and the skills that are trained by these online methods have been assessed. An attempt has been made to answer how each of the various language skills is trained online and how effective the online methods are compared to the classroom methods when students interact with peers and instructor. A mixed method research design was followed for collecting information for the study where a survey by means of a questionnaire and in-depth interviews with a number of respondents were undertaken across the various institutes and study centers located in the United Arab Emirates. The questionnaire contained 19 questions which included 7 sub-questions. The study revealed that the students find learning with an instructor to be a lot more effective than learning alone in an online environment. They prefer classroom environment more than the online setting for language learning.Keywords: effectiveness, language, online learning, skills
Procedia PDF Downloads 589
18031 Starting Order Eight Method Accurately for the Solution of First Order Initial Value Problems of Ordinary Differential Equations
Authors: James Adewale, Joshua Sunday
Abstract:
In this paper, we developed a linear multistep method, which is implemented in predictor-corrector mode. The corrector is developed by the method of collocation and interpolation of power series approximate solutions at some selected grid points to give a continuous linear multistep method, which is evaluated at some selected grid points to give a discrete linear multistep method. The predictors were also developed by the method of collocation and interpolation of power series approximate solutions to give a continuous linear multistep method. The continuous linear multistep method is then solved for the independent solution to give a continuous block formula, which is evaluated at some selected grid points to give a discrete block method. Basic properties of the corrector were investigated, and it was found to be zero-stable, consistent, and convergent. The efficiency of the method was tested on some linear, non-linear, oscillatory, and stiff first-order initial value problems of ordinary differential equations. The results were found to be better in terms of computer time and error bound when compared with the existing methods. Keywords: predictor, corrector, collocation, interpolation, approximate solution, independent solution, zero stable, consistent, convergent
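The abstract's order-eight block method is not reproduced here, but the predict-evaluate-correct structure it refers to can be illustrated with a much smaller scheme. The Python sketch below pairs a two-step Adams-Bashforth predictor with a two-step Adams-Moulton corrector on the test problem y' = -y; the starting step, step size, and test problem are illustrative choices.

```python
import numpy as np

def f(t, y):
    return -y                        # test problem y' = -y, y(0) = 1, exact solution exp(-t)

def adams_pc(f, t0, y0, h, n_steps):
    """Two-step Adams-Bashforth predictor with a two-step Adams-Moulton corrector."""
    t = t0 + h * np.arange(n_steps + 1)
    y = np.zeros(n_steps + 1)
    y[0] = y0
    # one RK2 step to supply the second starting value the multistep method needs
    k1 = f(t[0], y[0]); k2 = f(t[0] + h, y[0] + h * k1)
    y[1] = y[0] + 0.5 * h * (k1 + k2)
    for i in range(1, n_steps):
        fp, fpp = f(t[i], y[i]), f(t[i - 1], y[i - 1])
        y_pred = y[i] + h * (1.5 * fp - 0.5 * fpp)                                   # predictor (AB2)
        y[i + 1] = y[i] + h * (5/12 * f(t[i + 1], y_pred) + 8/12 * fp - 1/12 * fpp)   # corrector (AM2)
    return t, y

t, y = adams_pc(f, 0.0, 1.0, h=0.1, n_steps=50)
print(f"max error vs exact solution: {np.max(np.abs(y - np.exp(-t))):.2e}")
```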
Procedia PDF Downloads 501
18030 Comparison of Methods of Estimation for Use in Goodness of Fit Tests for Binary Multilevel Models
Authors: I. V. Pinto, M. R. Sooriyarachchi
Abstract:
It can frequently be observed that data arising in our environment have a hierarchical or nested structure. Multilevel modelling is a modern approach to handle this kind of data. When multilevel modelling is combined with a binary response, the estimation methods become complex in nature, and the usual techniques are derived from the quasi-likelihood method. The estimation methods compared in this study are marginal quasi-likelihood (order 1 and order 2) (MQL1, MQL2) and penalized quasi-likelihood (order 1 and order 2) (PQL1, PQL2). A statistical model is of no use if it does not reflect the given dataset. Therefore, checking the adequacy of the fitted model through a goodness-of-fit (GOF) test is an essential stage in any modelling procedure. However, prior to usage, it is also equally important to confirm that the GOF test performs well and is suitable for the given model. This study assesses the suitability of the GOF test developed for binary response multilevel models with respect to the method used in model estimation. An extensive set of simulations was conducted using MLwiN (v 2.19) with varying numbers of clusters, cluster sizes, and intra-cluster correlations. The test maintained the desirable Type-I error for models estimated using PQL2, and it failed for almost all the combinations of MQL. The power of the test was adequate for most of the combinations in all estimation methods except MQL1. Moreover, models were fitted using the four methods to a real-life dataset, and the performance of the test was compared for each model. Keywords: goodness-of-fit test, marginal quasi-likelihood, multilevel modelling, penalized quasi-likelihood, power, quasi-likelihood, type-I error
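The simulation design described above (varying numbers of clusters, cluster sizes, and intra-cluster correlations for a binary response) can be sketched as a data-generating step. The Python example below simulates a random-intercept logistic model with a target latent-scale intra-cluster correlation; it does not reproduce the MLwiN estimation or the goodness-of-fit test itself, and all parameter values are illustrative.

```python
import numpy as np

def simulate_multilevel_binary(n_clusters=50, cluster_size=20, icc=0.2, beta=(0.5, 1.0), seed=1):
    """Simulate binary responses from a random-intercept logistic model.
    The intra-cluster correlation is set on the latent scale:
    icc = sigma_u^2 / (sigma_u^2 + pi^2 / 3)."""
    rng = np.random.default_rng(seed)
    sigma_u2 = icc * (np.pi ** 2 / 3) / (1 - icc)
    u = rng.normal(0.0, np.sqrt(sigma_u2), n_clusters)          # cluster random intercepts
    cluster = np.repeat(np.arange(n_clusters), cluster_size)
    x = rng.normal(size=cluster.size)                            # one covariate
    eta = beta[0] + beta[1] * x + u[cluster]
    p = 1.0 / (1.0 + np.exp(-eta))
    y = rng.binomial(1, p)
    return cluster, x, y

cluster, x, y = simulate_multilevel_binary()
print(f"{len(np.unique(cluster))} clusters, overall event rate = {y.mean():.2f}")
```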
Procedia PDF Downloads 142
18029 Delayed Contralateral Prophylactic Mastectomy (CPM): Reasons and Rationale for Patients with Unilateral Breast Cancer
Authors: C. Soh, S. Muktar, C. M. Malata, J. R. Benson
Abstract:
Introduction: Reasons for requesting CPM include prevention of recurrence, peace of mind, and moving on after breast cancer. Some women seek CPM as a delayed procedure, but factors influencing this are poorly understood. Methods: A retrospective analysis examined patients undergoing CPM as either an immediate or delayed procedure, with or without breast reconstruction (BR), between January 2009 and December 2019. A cross-sectional survey based on validated questionnaires (5-point Likert scale) explored patients' decision-making process in terms of timing of CPM and any BR. Results: A total of 123 patients with unilateral breast cancer underwent CPM, with 39 (32.5%) delayed procedures with or without BR. The response rate amongst patients receiving questionnaires (n=33) was 22/33 (66%). Within this delayed CPM cohort were three reconstructive scenarios: 1) unilateral immediate BR with CPM (n=12); 2) delayed CPM with concomitant bilateral BR (n=22); 3) delayed bilateral BR after delayed CPM (n=3). Two patients had delayed CPM without BR. The most common reason for delayed CPM was to complete all cancer treatments (including radiotherapy) before surgery on the unaffected breast (score 2.91). The second reason was unavailability of genetic test results at the time of therapeutic mastectomy (score 2.64), whilst the third most cited reason was a subsequent change in family cancer history. Conclusion: Factors for delayed CPM are patient-driven, with few women spontaneously changing their mind having initially decided against immediate CPM for reasons also including surgical duration. CPM should be offered as a potentially delayed option with informed discussion of risks and benefits. Keywords: breast cancer, CPM, prophylactic, rationale
Procedia PDF Downloads 112
18028 A Comparison Study of Different Methods Used in the Detection of Giardia lamblia on Fecal Specimen of Children
Authors: Muhammad Farooq Baig
Abstract:
Objective: The purpose of this study was to compare results obtained using a single fecal specimen for O&P examination, direct immunofluorescence assay (DFA), and two conventional staining methods. Design: One hundred and fifty fecal specimens from children were collected and examined by each method. The O&P and the DFA were used as the reference methods. Setting: The study was performed at the laboratory of the Basic Medical Science Institute, JPMC, Karachi. Patients or Other Participants: The fecal specimens were collected from children with a suspected Giardia lamblia infection. Main Outcome Measures: The amount of agreement and disagreement between methods: 1) presence of giardiasis in our population; 2) the sensitivity and specificity of each method. Results: There were 45 (30%) positive and 105 (70%) negative specimens on DFA, 41 (27.4%) positive and 109 (72.6%) negative on the iodine method, and 34 (22.6%) positive and 116 (77.4%) negative on the saline method. The sensitivity and specificity of DFA in comparison to iodine were 92.2% and 92.7%, respectively. The sensitivity and specificity of DFA in comparison to the saline method were 91.2% and 87.9%, respectively. The sensitivities of the iodine and saline methods in comparison to DFA were 82.2% and 68.8%, respectively. There is a marked difference between the sensitivity of DFA and that of the conventional methods. Conclusion: The study supported the findings of other investigators who concluded that the DFA method has greater sensitivity. The immunologic methods were more efficient and quicker than the conventional O&P method. Keywords: direct immunofluorescence assay (DFA), ova and parasite (O&P), Giardia lamblia, children, medical science
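The sensitivity and specificity figures quoted above come from cross-tabulating each staining method against a reference. The short sketch below shows the calculation on an invented 2x2 table; the abstract reports marginal totals rather than the full cross-tabulation, so the cell counts here are illustrative only.

```python
def sensitivity_specificity(tp, fp, fn, tn):
    """Sensitivity and specificity of a candidate test against a reference standard."""
    return tp / (tp + fn), tn / (tn + fp)

# Illustrative 2x2 cross-tabulation (invented counts, not the study's raw table):
# rows = candidate method result, columns = reference method result.
tp, fp = 38, 7       # candidate positive: reference positive / reference negative
fn, tn = 3, 102      # candidate negative: reference positive / reference negative
sens, spec = sensitivity_specificity(tp, fp, fn, tn)
print(f"sensitivity = {sens:.1%}, specificity = {spec:.1%}")
```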
Procedia PDF Downloads 423
18027 Evaluation of Simple, Effective and Affordable Processing Methods to Reduce Phytates in the Legume Seeds Used for Feed Formulations
Authors: N. A. Masevhe, M. Nemukula, S. S. Gololo, K. G. Kgosana
Abstract:
Background and Study Significance: Legume seeds are important in agriculture as they are used for feed formulations due to their nutrient-dense, low-cost, and easy accessibility. Although they are important sources of energy, proteins, carbohydrates, vitamins, and minerals, they contain abundant quantities of anti-nutritive factors that reduce the bioavailability of nutrients, digestibility of proteins, and mineral absorption in livestock. However, the removal of these factors is too costly as it requires expensive state-of-the-art techniques such as high pressure and thermal processing. Basic Methodologies: The aim of the study was to investigate cost-effective methods that can be used to reduce the inherent phytates as putative antinutrients in the legume seeds. The seeds of Arachis hypogaea, Pisum sativum and Vigna radiata L. were subjected to the single processing methods viz raw seeds plus dehulling (R+D), soaking plus dehulling (S+D), ordinary cooking plus dehulling (C+D), infusion plus dehulling (I+D), autoclave plus dehulling (A+D), microwave plus dehulling (M+D) and five combined methods (S+I+D; S+A+D; I+M+D; S+C+D; S+M+D). All the processed seeds were dried, ground into powder, extracted, and analyzed on a microplate reader to determine the percentage of phytates per dry mass of the legume seeds. Phytic acid was used as a positive control, and one-way ANOVA was used to determine the significant differences between the means of the processing methods at a threshold of 0.05. Major Findings: The results of the processing methods showed the percentage yield ranges of 39.1-96%, 67.4-88.8%, and 70.2-93.8% for V. radiata, A. hypogaea and P. sativum, respectively. Though the raw seeds contained the highest contents of phytates that ranged between 0.508 and 0.527%, as expected, the R+D resulted in a slightly lower phytate percentage range of 0.469-0.485%, while other processing methods resulted in phytate contents that were below 0.35%. The M+D and S+M+D methods showed low phytate percentage ranges of 0.276-0.296% and 0.272-0.294%, respectively, where the lowest percentage yield was determined in S+M+D of P. sativum. Furthermore, these results were found to be significantly different (p<0.05). Though phytates cause micronutrient deficits as they chelate important minerals such as calcium, zinc, iron, and magnesium, their reduction may enhance nutrient bioavailability since they cannot be digested by the ruminants. Concluding Statement: Despite the nutritive aspects of the processed legume seeds, which are still in progress, the M+D and S+M+D methods, which significantly reduced the phytates in the investigated legume seeds, may be recommended to the local farmers and feed-producing industries so as to enhance animal health and production at an affordable cost.Keywords: anti-nutritive factors, extraction, legume seeds, phytate
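The abstract reports that the differences between processing methods were significant at the 0.05 threshold. A minimal sketch of that comparison with a one-way ANOVA is shown below; the replicate phytate percentages are invented values placed near the ranges quoted in the abstract, not the study's measurements.

```python
from scipy import stats

# Illustrative phytate percentages (% dry mass) for three processing methods;
# values are made up to lie near the ranges reported in the abstract.
raw   = [0.510, 0.521, 0.526]      # untreated seeds
m_d   = [0.279, 0.288, 0.295]      # microwave + dehulling (M+D)
s_m_d = [0.273, 0.281, 0.292]      # soaking + microwave + dehulling (S+M+D)

f_stat, p_value = stats.f_oneway(raw, m_d, s_m_d)
print(f"F = {f_stat:.1f}, p = {p_value:.4f}")
if p_value < 0.05:
    print("phytate content differs significantly between processing methods (p < 0.05)")
```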
Procedia PDF Downloads 28
18026 Dependence of Free Fatty Acid and Chlorophyll Content on Thermal Stability of Extra Virgin Olive Oil
Authors: Yongjun Ahn, Sung Gyu Choi, Seung-Yeop Kwak
Abstract:
Selective removal of free fatty acids (FFA) and chlorophyll in extra virgin olive oil (EVOO) is necessary to enhance its thermal stability under deep-frying conditions. In this work, we demonstrated improvement of the thermal stability of EVOO by selective removal of free fatty acids and chlorophyll using (3-aminopropyl)trimethoxysilane (APTMS)-functionalized mesoporous silica with controlled pore size. The adsorption kinetics of free fatty acids and chlorophyll into the mesoporous silica were quantitatively analyzed with the Freundlich and Langmuir models. The highest chlorophyll adsorption efficiency was obtained at a pore size of 5 nm, suggesting that the interaction between the silica and the chlorophyll could be optimized at this point. The amino-functionalized mesoporous silica showed drastically higher FFA removal efficiency than the bare silica. Moreover, beneficial compounds like tocopherol and phenolic compounds were retained even after adsorptive removal. Extra virgin olive oil treated with the aminopropyl-functionalized silica had a smoke point high enough to be used as commercial frying oil. Based on these results, considerable interest is expected in a facile adsorptive refining process for EVOO using pore-size-controlled, amino-functionalized mesoporous silica. Keywords: mesoporous silica, extra virgin olive oil, selective adsorption, thermal stability
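The Freundlich and Langmuir analysis mentioned above amounts to fitting two standard isotherm equations to equilibrium adsorption data. The sketch below does this with scipy's curve_fit on invented data points (not the paper's measurements).

```python
import numpy as np
from scipy.optimize import curve_fit

# Illustrative equilibrium adsorption data (not the paper's measurements):
# C_e = equilibrium chlorophyll concentration, q_e = amount adsorbed per g silica.
C_e = np.array([0.5, 1.0, 2.0, 4.0, 8.0, 16.0])      # mg/L
q_e = np.array([2.1, 3.6, 5.8, 8.0, 9.9, 11.0])      # mg/g

def langmuir(C, q_max, K):
    return q_max * K * C / (1.0 + K * C)

def freundlich(C, K_f, n):
    return K_f * C ** (1.0 / n)

(qm, K), _ = curve_fit(langmuir, C_e, q_e, p0=[10.0, 1.0])
(Kf, n), _ = curve_fit(freundlich, C_e, q_e, p0=[2.0, 2.0])
print(f"Langmuir:   q_max = {qm:.2f} mg/g, K = {K:.2f} L/mg")
print(f"Freundlich: K_f = {Kf:.2f}, n = {n:.2f}")
```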
Procedia PDF Downloads 241
18025 Sensitivity Analysis for 14 Bus Systems in a Distribution Network with Distributed Generators
Authors: Lakshya Bhat, Anubhav Shrivastava, Shiva Rudraswamy
Abstract:
There has been formidable interest in the area of Distributed Generation in recent times. A wide range of loads can be served by Distributed Generators, with better efficiency too. The major disadvantage of Distributed Generation, namely voltage control, is highlighted in this paper. The paper addresses voltage control at buses in the IEEE 14-bus system by regulating reactive power. An analysis is carried out to select the optimum location for placing the Distributed Generators, using load flow analysis and observing where the voltage profile rises. MATLAB programming is used to simulate the voltage profile at the respective buses after the introduction of DGs. A tolerance limit of +/-5% of the base value has to be maintained. To maintain this tolerance limit, three methods are used, and a sensitivity analysis of the three voltage control methods is carried out to determine the priority among them. Keywords: distributed generators, distributed system, reactive power, voltage control, sensitivity analysis
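The tolerance check and placement ranking described above can be sketched as follows. The load-flow routine here is only a placeholder that returns canned per-unit voltages (the study used MATLAB load flow on the IEEE 14-bus system); the sketch illustrates just the +/-5% tolerance test and the ranking of candidate DG buses by remaining voltage deviation, with all numbers invented.

```python
import numpy as np

def run_load_flow(dg_bus=None):
    """Placeholder for a real load-flow solver (the paper used MATLAB on the
    IEEE 14-bus system); here it simply returns canned per-unit bus voltages."""
    base = np.array([1.05, 1.04, 1.01, 1.00, 0.99, 1.02, 0.98, 1.00,
                     0.97, 0.96, 0.95, 0.98, 0.96, 0.93])
    if dg_bus is not None:
        base = base.copy()
        base[dg_bus] = min(base[dg_bus] + 0.03, 1.05)   # crude local voltage support from the DG
    return base

def out_of_tolerance(v, tol=0.05):
    """Buses whose voltage leaves the +/-5% band around 1.0 p.u."""
    return np.where(np.abs(v - 1.0) > tol)[0]

print("violating buses without DG:", out_of_tolerance(run_load_flow()))

# Rank candidate DG locations by the worst remaining voltage deviation
ranking = sorted(range(14), key=lambda b: np.max(np.abs(run_load_flow(dg_bus=b) - 1.0)))
print("best candidate buses:", ranking[:3])
```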
Procedia PDF Downloads 703
18024 A Study of Indoor Comfort in Affordable Contemporary Courtyard Housing with Outdoor Welfare in Afghan Sustainable Neighborhoods
Authors: Mohammad Saraj Sharifzai, Keisuke Kitagawa, Ahmad Javid Habib Mohammad Kamil Halimee, Daishi Sakaguchi
Abstract:
The main purpose of this research is to recognize indoor comfort in contemporary Afghan courtyard house with outdoor welfare in housing layout and neighborhood design where sustainability is a local consideration. This research focuses on three new neighborhoods (Gawoond) in three different provinces of Afghanistan. Since 2001, the capital Kabul and major cities including Kandahar, which will be compared with Peshawar city in Pakistan, have faced a fast, rough-and-tumble process of urban innovation. The effects of this innovation necessitate reconsideration of the formation of sustainable urban environments and in-house thermal comfort. The lack of sustainable urban life in many newly developed Afghan neighborhoods can pose a major challenge to the process of sustainable urban development. Several factors can affect the success or failure of new neighborhoods in the context of urban life. For thermal analysis, we divide our research into three different climatic zones. This study is an evaluation of the environmental impacts of the interior comfort of contemporary courtyard housing with the exterior welfare of neighborhood sustainable design strategy in dry and cold, semi-hot and arid, and semi-humid and hot climates in Afghan cities and Peshawar.Keywords: Afghan contemporary courtyard house, neighbourhood, street pattern and housing layout, sustainability, welfare, comfort, climate zone, Afghanistan
Procedia PDF Downloads 429
18023 The Effects of Extraction Methods on Fat Content and Fatty Acid Profiles of Marine Fish Species
Authors: Yesim Özogul, Fethiye Takadaş, Mustafa Durmus, Yılmaz Ucar, Ali Rıza Köşker, Gulsun Özyurt, Fatih Özogul
Abstract:
It has been well documented that polyunsaturated fatty acids (PUFAs), especially eicosapentaenoic acid (EPA) and docosahexaenoic acid (DHA), have beneficial effects on health, including the prevention of cardiovascular diseases, cancer, and autoimmune disorders, the development of the brain and retina, and the treatment of major depressive disorder. Thus, an adequate intake of omega PUFAs is essential, and marine fish are generally the richest sources of PUFAs in the human diet. This study was therefore conducted to evaluate the efficiency of different extraction methods (Bligh and Dyer, Soxhlet, microwave, and ultrasonic) on the fat content and fatty acid profiles of marine fish species (Mullus barbatus, Upeneus moluccensis, Mullus surmuletus, Anguilla anguilla, Pagellus erythrinus and Saurida undosquamis). The fish were caught by trawl in the Mediterranean Sea and immediately iced. After that, the fish were transported to the laboratory on ice and stored at -18°C in a freezer until the day of analysis. After extracting lipids from the fish by the different methods, the lipid samples were converted to their constituent fatty acid methyl esters. The fatty acid composition was analysed with a GC Clarus 500 with an autosampler (Perkin Elmer, Shelton, CT, USA) equipped with a flame ionization detector and a fused silica capillary SGE column (30 m x 0.32 mm ID x 0.25 mm BP20 0.25 UM, USA). The results showed that there were significant differences (P < 0.05) in the fatty acids of all species, and that the extraction methods affected the fat contents and fatty acid profiles of the fish species. Keywords: extraction methods, fatty acids, marine fish, PUFA
Procedia PDF Downloads 267
18022 Row Detection and Graph-Based Localization in Tree Nurseries Using a 3D LiDAR
Authors: Ionut Vintu, Stefan Laible, Ruth Schulz
Abstract:
Agricultural robotics has been developing steadily over recent years, with the goal of reducing and even eliminating pesticides used in crops and to increase productivity by taking over human labor. The majority of crops are arranged in rows. The first step towards autonomous robots, capable of driving in fields and performing crop-handling tasks, is for robots to robustly detect the rows of plants. Recent work done towards autonomous driving between plant rows offers big robotic platforms equipped with various expensive sensors as a solution to this problem. These platforms need to be driven over the rows of plants. This approach lacks flexibility and scalability when it comes to the height of plants or distance between rows. This paper proposes instead an algorithm that makes use of cheaper sensors and has a higher variability. The main application is in tree nurseries. Here, plant height can range from a few centimeters to a few meters. Moreover, trees are often removed, leading to gaps within the plant rows. The core idea is to combine row detection algorithms with graph-based localization methods as they are used in SLAM. Nodes in the graph represent the estimated pose of the robot, and the edges embed constraints between these poses or between the robot and certain landmarks. This setup aims to improve individual plant detection and deal with exception handling, like row gaps, which are falsely detected as an end of rows. Four methods were developed for detecting row structures in the fields, all using a point cloud acquired with a 3D LiDAR as an input. Comparing the field coverage and number of damaged plants, the method that uses a local map around the robot proved to perform the best, with 68% covered rows and 25% damaged plants. This method is further used and combined with a graph-based localization algorithm, which uses the local map features to estimate the robot’s position inside the greater field. Testing the upgraded algorithm in a variety of simulated fields shows that the additional information obtained from localization provides a boost in performance over methods that rely purely on perception to navigate. The final algorithm achieved a row coverage of 80% and an accuracy of 27% damaged plants. Future work would focus on achieving a perfect score of 100% covered rows and 0% damaged plants. The main challenges that the algorithm needs to overcome are fields where the height of the plants is too small for the plants to be detected and fields where it is hard to distinguish between individual plants when they are overlapping. The method was also tested on a real robot in a small field with artificial plants. The tests were performed using a small robot platform equipped with wheel encoders, an IMU and an FX10 3D LiDAR. Over ten runs, the system achieved 100% coverage and 0% damaged plants. The framework built within the scope of this work can be further used to integrate data from additional sensors, with the goal of achieving even better results.Keywords: 3D LiDAR, agricultural robots, graph-based localization, row detection
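As a toy illustration of the row-detection step discussed above (not the methods developed in the paper), the Python sketch below projects a synthetic point cloud onto the ground plane and recovers row centerlines from a histogram of the cross-row coordinate, assuming the rows are already roughly aligned with one axis. Row spacing, noise levels, and thresholds are invented.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic ground-projected LiDAR points: three plant rows 1.5 m apart along x,
# plants scattered along y, plus background clutter.
rows_true = [0.0, 1.5, 3.0]
plants = np.vstack([np.column_stack((rng.normal(cx, 0.05, 200), rng.uniform(0, 10, 200)))
                    for cx in rows_true])
clutter = np.column_stack((rng.uniform(-1, 4, 100), rng.uniform(0, 10, 100)))
points = np.vstack([plants, clutter])

def detect_rows(points, bin_width=0.1, min_count=60):
    """Histogram the cross-row coordinate and return peaks as row centerlines.
    Assumes rows are roughly parallel to the y axis (e.g. after aligning the
    cloud with the dominant driving direction)."""
    x = points[:, 0]
    bins = np.arange(x.min(), x.max() + bin_width, bin_width)
    counts, edges = np.histogram(x, bins=bins)
    centers = 0.5 * (edges[:-1] + edges[1:])
    # keep local maxima with enough supporting points
    return [centers[i] for i in range(1, len(counts) - 1)
            if counts[i] >= min_count and counts[i] >= counts[i - 1] and counts[i] >= counts[i + 1]]

print("detected row centerlines (m):", [round(p, 2) for p in detect_rows(points)])
```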
Procedia PDF Downloads 139
18021 Expression-Based Learning as a Starting Point to Promote Students’ Creativity in K-12 Schools in China
Authors: Yanyue Yuan
Abstract:
In this paper, the author shares the findings of a pilot study that examines students’ creative expressions and their perceptions of creativity when engaged in project-based learning. The study is based on an elective course that the author co-designed and co-taught with a colleague to sixteen grade six and seven students over the spring semester in 2019. Using the Little Prince story as the main prompt, they facilitated students’ original creation of a storytelling concert that integrated script writing, music production, lyrics, songs, and visual design as a result of both individual and collaborative work. The author will share the specific challenges we met during the project, including learning cultures of the school, class management, teachers' and parents’ attitude, process-oriented versus product-oriented mindset, and facilities and logistical resources. The findings of this pilot study will inform the ongoing research initiative of exploring how we can foster creative learning in public schools in the Chinese context. While K-12 schools of China’s public education system are still dominated by exam-oriented and teacher-centered approaches, the author proposes that expression-based learning can be a starting point for promoting students’ creativity and can serve as experimental efforts to initiate incremental changes within the current education framework. The paper will also touch upon insights gained from collaborations between university and K-12 schools.Keywords: creativity, expression-based learning, K-12, incremental changes
Procedia PDF Downloads 103
18020 Force Measurement for E-Cadherin-Mediated Intercellular Adhesion Probed by Protein Micropattern and Traction Force Microscopy
Authors: Chieh-Chung Tsou, Chun-Min Lo, Yeh-Shiu Chu
Abstract:
Cell’s mechanical forces provide important physical cues in regulation of proper cellular functions, such as cell differentiation, proliferation and migration. It is believed that adhesive forces generated by cell-cell interaction are able to transmit to the interior of cell through filamentous cortical cytoskeleton. Prominent among other membrane receptors, Cadherins are prototypical adhesive molecules able to generate remarkable forces to regulate intercellular adhesion. However, the mechanistic steps of mechano-transduction in Cadherin-mediated adhesion remain very controversial. We are interested in understanding how Cadherin protein complexes enable force generation and transmission at cell-cell contact in the initial stage of intercellular adhesion. For providing a better control of time, space, and substrate stiffness, in this study, a combination of protein micropattern, micropipette manipulation, and traction force microscopy is used. Pair micropattern with different forms confines cell spreading area and the gaps in pairs varied from 2 to 8 microns are applied for monitoring the forces that cell pairs generated, measured by traction force microscopy. Moreover, cell clones obtained from epithelial cells undergone genome editing are used to score the importance for known components of Cadherin complexes in force generation. We believe that our results from this combinatory mechanobiological method will provide deep insights on understanding the biophysical principle governing mechano- transduction of Cadherin-mediated intercellular adhesion.Keywords: cadherin, intercellular adhesion, protein micropattern, traction force microscopy
Procedia PDF Downloads 251
18019 A Two-Stage Bayesian Variable Selection Method with the Extension of Lasso for Geo-Referenced Data
Authors: Georgiana Onicescu, Yuqian Shen
Abstract:
Due to the complex nature of geo-referenced data, multicollinearity of the risk factors in public health spatial studies is a commonly encountered issue, which leads to low parameter estimation accuracy because it inflates the variance in the regression analysis. To address this issue, we proposed a two-stage variable selection method by extending the least absolute shrinkage and selection operator (Lasso) to the Bayesian spatial setting, investigating the impact of risk factors to health outcomes. Specifically, in stage I, we performed the variable selection using Bayesian Lasso and several other variable selection approaches. Then, in stage II, we performed the model selection with only the selected variables from stage I and compared again the methods. To evaluate the performance of the two-stage variable selection methods, we conducted a simulation study with different distributions for the risk factors, using geo-referenced count data as the outcome and Michigan as the research region. We considered the cases when all candidate risk factors are independently normally distributed, or follow a multivariate normal distribution with different correlation levels. Two other Bayesian variable selection methods, Binary indicator, and the combination of Binary indicator and Lasso were considered and compared as alternative methods. The simulation results indicated that the proposed two-stage Bayesian Lasso variable selection method has the best performance for both independent and dependent cases considered. When compared with the one-stage approach, and the other two alternative methods, the two-stage Bayesian Lasso approach provides the highest estimation accuracy in all scenarios considered.Keywords: Lasso, Bayesian analysis, spatial analysis, variable selection
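The two-stage structure described above can be illustrated with a deliberately simplified, non-spatial and non-Bayesian stand-in: an ordinary cross-validated Lasso performs the stage-I screening, and a plain regression refit on the selected variables plays the role of stage II. The synthetic dataset and model choices below are assumptions for illustration only.

```python
import numpy as np
from sklearn.linear_model import LassoCV, LinearRegression
from sklearn.datasets import make_regression

# Stage I: shrinkage-based screening of candidate risk factors.
X, y = make_regression(n_samples=300, n_features=20, n_informative=5, noise=5.0, random_state=0)
lasso = LassoCV(cv=5, random_state=0).fit(X, y)
selected = np.flatnonzero(lasso.coef_ != 0)
print("stage I selected features:", selected)

# Stage II: refit a model using only the variables retained in stage I.
stage2 = LinearRegression().fit(X[:, selected], y)
print("stage II R^2 on the selected subset:", round(stage2.score(X[:, selected], y), 3))
```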
Procedia PDF Downloads 143
18018 A Quantitative Evaluation of Text Feature Selection Methods
Authors: B. S. Harish, M. B. Revanasiddappa
Abstract:
Due to the rapid growth of text documents in digital form, automated text classification has become an important research area in the last two decades. The major challenges of text document representation are high dimensionality, sparsity, volume, and semantics. Since terms are the only features that can be found in documents, the selection of good terms (features) plays a very important role. In text classification, feature selection is a strategy that can be used to improve classification effectiveness, computational efficiency, and accuracy. In this paper, we present a quantitative analysis of the most widely used feature selection (FS) methods, viz. Term Frequency-Inverse Document Frequency (tf-idf), Mutual Information (MI), Information Gain (IG), Chi-Square (χ2), Term Frequency-Relevance Frequency (tfrf), Term Strength (TS), Ambiguity Measure (AM), and Symbolic Feature Selection (SFS), to classify text documents. We evaluated all the feature selection methods on standard datasets such as 20 Newsgroups, the 4 University dataset, and Reuters-21578. Keywords: classifiers, feature selection, text classification
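Two of the criteria compared in the paper, chi-square and mutual information, can be scored on a tf-idf representation with a few lines of scikit-learn. The tiny two-class corpus below is invented so the snippet runs without downloading 20 Newsgroups or Reuters-21578.

```python
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.feature_selection import chi2, mutual_info_classif

# Tiny illustrative corpus (two classes); the paper applies the same kind of
# scoring to 20 Newsgroups, the 4 University dataset, and Reuters-21578.
docs = [
    "the rocket launch was delayed by engine trouble",
    "nasa plans another orbital rocket launch next year",
    "the satellite reached orbit after a clean launch",
    "the new car engine delivers better fuel economy",
    "dealers report strong sales of the hybrid car",
    "the car handled well on the wet road test",
]
labels = np.array([0, 0, 0, 1, 1, 1])     # 0 = space, 1 = autos

vec = TfidfVectorizer(stop_words="english")
X = vec.fit_transform(docs).toarray()      # dense tf-idf matrix (non-negative, as chi2 requires)
terms = np.array(vec.get_feature_names_out())

chi2_scores, _ = chi2(X, labels)                                  # chi-square statistic per term
mi_scores = mutual_info_classif(X, labels, random_state=0)        # mutual information per term

for name, scores in [("chi2", chi2_scores), ("mutual information", mi_scores)]:
    top = terms[np.argsort(scores)[::-1][:5]]
    print(f"top terms by {name}: {', '.join(top)}")
```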
Procedia PDF Downloads 458
18017 3D Steady and Transient Centrifugal Pump Flow within Ansys CFX and OpenFOAM
Authors: Clement Leroy, Guillaume Boitel
Abstract:
This paper presents a comparative benchmarking review of steady and transient three-dimensional (3D) flow computations in a centrifugal pump using commercial (Ansys CFX) and open-source (OpenFOAM) computational fluid dynamics (CFD) software. In a centrifugal rotordynamic pump, the fluid enters the impeller along the rotation axis and is accelerated to increase the pressure, flowing radially outward into another stage, a vaned diffuser, or a volute casing, from where it finally exits into a downstream pipe. Simulations are carried out at the best efficiency point (BEP) and at part load, for single-phase flow with several turbulence models. The results are compared with the overall performance report from experimental data. The use of CFD technology in industry is still limited by the high computational costs, and even more by the high cost of commercial CFD software and high-performance computing (HPC) licenses. The main objectives of the present study are to define an OpenFOAM methodology for high-quality 3D steady and transient turbomachinery CFD simulation and to conduct a thorough time-accurate performance analysis. In addition, a detailed comparison between the computational methods and features of the latest Ansys release 18 and OpenFOAM is carried out to assess the accuracy and industrial applicability of those solvers. Finally, an automated connected workflow (IoT) for turbine blade applications is presented. Keywords: benchmarking, CFX, internet of things, openFOAM, time-accurate, turbomachinery
Procedia PDF Downloads 205