Search results for: mathematical structure
773 Rain Gauges Network Optimization in Southern Peninsular Malaysia
Authors: Mohd Khairul Bazli Mohd Aziz, Fadhilah Yusof, Zulkifli Yusop, Zalina Mohd Daud, Mohammad Afif Kasno
Abstract:
Recently developed rainfall network design techniques have been discussed and compared by many researchers worldwide due to the demand for higher levels of accuracy from collected data. In many studies, rain-gauge networks are designed to provide good estimation of areal rainfall and to support flood modelling and prediction. One study showed that, even when lumped models are used for flood forecasting, a proper gauge network can significantly improve the results. Therefore, the existing rainfall network in Johor must be optimized and redesigned in order to meet the level of accuracy required by rainfall data users. The well-known geostatistical variance-reduction method, combined with simulated annealing, was used as the optimization algorithm in this study to obtain the optimal number and locations of the rain gauges. Rain gauge network structure does not depend only on station density; station location also plays an important role in determining whether information is acquired accurately. The existing network of 84 rain gauges in Johor was optimized and redesigned using rainfall, humidity, solar radiation, temperature and wind speed data during the monsoon season (November – February) for the period 1975 – 2008. Three different semivariogram models, namely Spherical, Gaussian and Exponential, were used, and their performances were compared. A cross-validation technique was applied to compute the errors, and the results showed that the exponential model is the best semivariogram. It was found that a network of 64 rain gauges satisfied the proposed method with the minimum estimated variance, and 20 of the existing gauges were removed and relocated. An existing network may contain redundant stations that make little or no contribution to the network's performance in providing quality data. Therefore, two different cases were considered in this study. In the first case, the removed stations were optimally relocated to new locations to investigate their influence on the calculated estimated variance; the second case explored the possibility of relocating all 84 existing stations to new locations to determine the optimal positions. The relocation of the stations in both cases showed that the new optimal locations reduced the estimated variance, proving that location plays an important role in determining the optimal network.
Keywords: geostatistics, simulated annealing, semivariogram, optimization
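A minimal Python sketch of the optimization idea described above, assuming hypothetical station coordinates, exponential-semivariogram parameters and a simplified variance proxy (none of these values come from the study): a simulated-annealing loop swaps one gauge at a time and keeps the 64-gauge subset with the lowest proxy variance.

```python
# Illustrative sketch, not code from the paper: selecting a rain-gauge subset by
# simulated annealing with an exponential semivariogram as the spatial model.
# Station coordinates, variogram parameters (nugget, sill, range) and the cost
# proxy are hypothetical placeholders.
import numpy as np

rng = np.random.default_rng(0)

def exponential_semivariogram(h, nugget=0.1, sill=1.0, a=40.0):
    """gamma(h) for the exponential model: nugget + (sill - nugget) * (1 - exp(-3h/a))."""
    return nugget + (sill - nugget) * (1.0 - np.exp(-3.0 * h / a))

def estimated_variance(selected, stations, grid):
    """Crude proxy for the network estimation variance: for every grid point,
    take the semivariogram value to the nearest selected gauge, then average."""
    d = np.linalg.norm(grid[:, None, :] - stations[selected][None, :, :], axis=2)
    return exponential_semivariogram(d.min(axis=1)).mean()

def anneal(stations, grid, k, n_iter=2000, t0=1.0, cooling=0.995):
    """Search for the k-gauge subset with the smallest variance proxy."""
    n = len(stations)
    current = list(rng.choice(n, size=k, replace=False))
    best, f_cur = current[:], estimated_variance(current, stations, grid)
    f_best, t = f_cur, t0
    for _ in range(n_iter):
        cand = current[:]
        cand[rng.integers(k)] = int(rng.choice([i for i in range(n) if i not in cand]))
        f_cand = estimated_variance(cand, stations, grid)
        # Accept improvements always, worse moves with a temperature-dependent probability.
        if f_cand < f_cur or rng.random() < np.exp((f_cur - f_cand) / t):
            current, f_cur = cand, f_cand
            if f_cur < f_best:
                best, f_best = current[:], f_cur
        t *= cooling
    return best, f_best

stations = rng.uniform(0, 100, size=(84, 2))   # 84 hypothetical gauge locations (km)
grid = rng.uniform(0, 100, size=(500, 2))      # points where areal rainfall is estimated
subset, variance = anneal(stations, grid, k=64)
print(f"64-gauge subset, variance proxy: {variance:.3f}")
```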
Procedia PDF Downloads 302
772 Residual Analysis and Ground Motion Prediction Equation Ranking Metrics for Western Balkan Strong Motion Database
Authors: Manuela Villani, Anila Xhahysa, Christopher Brooks, Marco Pagani
Abstract:
The geological structure of the Western Balkans is strongly affected by the collision between the Adria microplate and the southwestern Eurasia margin, resulting in a considerably active seismic region. The Harmonization of Seismic Hazard Maps in the Western Balkan Countries Project (BSHAP) (2007-2011, 2012-2015), supported by NATO, enabled the preparation of new seismic hazard maps of the Western Balkans; however, when inspecting the seismic hazard models later produced by these countries on a national scale, significant differences in design PGA values are observed at the borders, for instance, North Albania-Montenegro, South Albania-Greece, etc. Considering that the catalogues were unified and the seismic sources were defined within the BSHAP framework, the differences evidently arise from the selection of Ground Motion Prediction Equations, which are generally the component with the highest impact on seismic hazard assessment. At the time of the project, only a modest database was available, namely 672 three-component records, whereas nowadays this strong motion database has grown considerably, up to 20,939 records with Mw ranging in the interval 3.7-7 and an epicentral distance distribution from 0.47 km to 490 km. Statistical analysis of the strong motion database showed a lack of recordings in the moderate-to-large magnitude and short distance ranges; therefore, there is a need to re-evaluate the Ground Motion Prediction Equations in light of the recently updated database and the new generations of GMMs. In some cases, it was observed that some events were more extensively documented in one database than in the other, like the 1979 Montenegro earthquake, with a considerably larger number of records in the BSHAP Analogue SM database when compared to ESM23. Therefore, the strong motion flat-file provided by the Harmonization of Seismic Hazard Maps in the Western Balkan Countries Project was merged with the ESM23 database for the polygon studied in this project. After performing the preliminary residual analysis, the candidate GMPEs were identified. This process was done using the GMPE performance metrics available within the SMT in the OpenQuake Platform. The Likelihood Model and Euclidean Distance Based Ranking (EDR) were used. Finally, a GMPE logic tree was selected for this study and, following the selection of candidate GMPEs, model weights were assigned using the average sample log-likelihood approach of Scherbaum.
Keywords: residual analysis, GMPE, Western Balkan, strong motion, OpenQuake
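A minimal sketch of the weighting step mentioned at the end, assuming synthetic observations and three hypothetical candidate GMPEs whose medians and sigmas are placeholders rather than models from the study: each GMPE is scored with the average sample log-likelihood (LLH), and logic-tree weights are taken proportional to 2 raised to the negative LLH.

```python
# Illustrative sketch, not part of the study: average sample log-likelihood (LLH)
# scoring of candidate GMPEs and the corresponding logic-tree weights.
# The observations, median predictions and sigmas below are synthetic placeholders.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)
ln_obs = rng.normal(loc=-4.0, scale=0.8, size=200)    # ln(PGA) "observations"

# Each candidate GMPE is summarised by its median prediction and total sigma (ln units).
gmpes = {
    "GMPE_A": (ln_obs + rng.normal(0.0, 0.3, ln_obs.size), 0.65),
    "GMPE_B": (ln_obs + rng.normal(0.2, 0.5, ln_obs.size), 0.70),
    "GMPE_C": (ln_obs + rng.normal(-0.4, 0.6, ln_obs.size), 0.80),
}

def llh(observed, median, sigma):
    """Average negative log2-likelihood of the observations under the GMPE."""
    return -np.mean(np.log2(norm.pdf(observed, loc=median, scale=sigma)))

scores = {name: llh(ln_obs, med, sig) for name, (med, sig) in gmpes.items()}
raw = {name: 2.0 ** (-s) for name, s in scores.items()}      # lower LLH -> higher weight
total = sum(raw.values())
weights = {name: w / total for name, w in raw.items()}

for name in gmpes:
    print(f"{name}: LLH = {scores[name]:.3f}, weight = {weights[name]:.3f}")
```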
Procedia PDF Downloads 88
771 Psychotherapeutic Narratives and the Importance of Truth
Authors: Spencer Jay Knafelc
Abstract:
Some mental health practitioners and theorists have suggested that we approach remedying psychological problems by centering and intervening upon patients’ narrations. Such theorists and their corresponding therapeutic approaches see persons as narrators of their lives, where the stories they tell constitute and reflect their sense-making of the world. Psychological problems, according to these approaches to therapy, are often the result of problematic narratives. The solution is the construction of more salubrious narratives through therapy. There is trouble lurking within the history of these narrative approaches. These thinkers tend to denigrate the importance of truth, insisting that narratives are not to be thought of as aiming at truth, and thus the truth of our self-narratives is not important. There are multiple motivations for the tendency to eschew truth’s importance within the tradition of narrative approaches to therapy. The most plausible and interesting motivation comes from the observation that, in general, all dominant approaches to therapy are equally effective. The theoretical commitments of each approach are quite different and are often ostensibly incompatible (psychodynamic therapists see psychological problems as resulting from unconscious conflict and repressed desires, Cognitive-Behavioral approaches see them as resulting from distorted cognitions). This strongly suggests that there must be some cases in which therapeutic efficacy does not depend on truth and that insisting that patient’s therapeutic narratives be true in all instances is a mistake. Lewis’ solution is to suggest that narratives are metaphors. Lewis’ account appreciates that there are many ways to tell a story and that many different approaches to mental health treatment can be appropriate without committing us to any contradictions, providing us with an ostensibly coherent way to treat narratives as non-literal, instead of seeing them as tools that can be more or less apt. Here, it is argued that Lewis’ metaphor approach fails. Narratives do not have the right kind of structure to be metaphors. Still, another way to understand Lewis’ view might be that self-narratives, especially when articulated in the language of any specific approach, should not be taken literally. This is an idea at the core of the narrative theorists’ tendency to eschew the importance of the ordinary understanding of truth. This very tendency will be critiqued. The view defended in this paper more accurately captures the nature of self-narratives. The truth of one’s self-narrative is important. Not only do people care about having the right conception of their abilities, who they are, and the way the world is, but self-narratives are composed of beliefs, and the nature of belief is to aim at truth. This view also allows the recognition of the importance of developing accurate representations of oneself and reality for one’s psychological well-being. It is also argued that in many cases, truth factors in as a mechanism of change over the course of therapy. Therapeutic benefit can be achieved by coming to have a better understanding of the nature of oneself and the world. Finally, the view defended here allows for the recognition of the nature of the tension between values: truth and efficacy. It is better to recognize this tension and develop strategies to navigate it as opposed to insisting that it doesn’t exist.Keywords: philosophy, narrative, psychotherapy, truth
Procedia PDF Downloads 104
770 Principles for the Realistic Determination of the in-situ Concrete Compressive Strength under Consideration of Rearrangement Effects
Authors: Rabea Sefrin, Christian Glock, Juergen Schnell
Abstract:
The preservation of existing structures is of great economic interest because it contributes to higher sustainability and resource conservation. In the case of existing buildings, in addition to repair and maintenance, modernization or reconstruction works often take place in the course of adjustments or changes in use. Since the structural framework and the associated load level are usually changed in the course of the structural measures, the stability of the structure must be verified in accordance with the currently valid regulations. The compressive strength of the existing structure's concrete and the derived mechanical parameters are of central importance for the recalculation and verification. However, the compressive strength of the existing concrete is usually set comparatively low and thus underestimated. The reasons for this are the small number and large scatter of material properties of the drill cores, which are used for the experimental determination of the design value of the compressive strength. Within a structural component, the load is usually transferred over the area with higher stiffness and consequently with higher compressive strength. Therefore, existing strength variations within a component only play a subordinate role due to rearrangement effects. This paper deals with the experimental and numerical determination of such rearrangement effects in order to calculate the concrete compressive strength of existing structures more realistically and economically. The influence of individual parameters such as the specimen geometry (prism or cylinder) or the coefficient of variation of the concrete compressive strength is analyzed in experimental small-part tests. The coefficients of variation commonly used in practice are adjusted by dividing the test specimens into several layers consisting of different concretes, which are monolithically connected to each other. From each combination, a sufficient number of test specimens is produced and tested to enable evaluation on a statistical basis. Based on the experimental tests, FE simulations are carried out to validate the test results. In the framework of a subsequent parameter study, a large number of combinations is considered which had not yet been investigated in the experimental tests. Thus, the influence of individual parameters on the size and characteristic of the rearrangement effect is determined and described in more detail. Based on the parameter study and the experimental results, a calculation model for a more realistic determination of the in-situ concrete compressive strength is developed and presented. By considering rearrangement effects in concrete during recalculation, a higher number of existing structures can be maintained without structural measures. The preservation of existing structures is not only decisive from an economic, sustainable, and resource-saving point of view but also represents an added value for cultural and social aspects.
Keywords: existing structures, in-situ concrete compressive strength, rearrangement effects, recalculation
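A minimal sketch of the kind of statistical evaluation that recalculation typically starts from, assuming hypothetical drill-core strengths and a placeholder acceptance factor k (this is not the calculation model developed in the paper): the mean, the coefficient of variation and a characteristic in-situ value derived from a small core sample.

```python
# Illustrative sketch, not the paper's calculation model: basic statistical
# evaluation of drill-core strengths, showing how a small sample and a large
# scatter depress the characteristic in-situ strength.
# Core values and the factor k are hypothetical placeholders.
import statistics

core_strengths = [28.4, 31.2, 24.9, 35.6, 27.1, 30.3]   # MPa, hypothetical drill cores

mean = statistics.mean(core_strengths)
stdev = statistics.stdev(core_strengths)                 # sample standard deviation
cov = stdev / mean                                        # coefficient of variation

k = 1.48                                                  # placeholder factor for a small sample
f_ck_is = mean - k * stdev                                # characteristic in-situ strength

print(f"mean = {mean:.1f} MPa, CoV = {cov:.1%}, characteristic value = {f_ck_is:.1f} MPa")
```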
Procedia PDF Downloads 118
769 Assessment of Pedestrian Comfort in a Portuguese City Using Computational Fluid Dynamics Modelling and Wind Tunnel
Authors: Bruno Vicente, Sandra Rafael, Vera Rodrigues, Sandra Sorte, Sara Silva, Ana Isabel Miranda, Carlos Borrego
Abstract:
Wind comfort for pedestrians is an important condition in urban areas. In Portugal, a country with 900 km of coastline, the wind direction is predominantly North-Northwest, with an average speed of 2.3 m·s⁻¹ (at 2 m height). As a result, a number of city authorities have been requesting studies of pedestrian wind comfort for new urban areas/buildings, as well as measures to mitigate wind discomfort issues related to existing structures. This work covers the efficiency evaluation of a set of measures to reduce the wind speed in an outdoor auditorium (open space) located in a coastal Portuguese urban area. These measures include the construction of barriers, placed upstream and downstream of the auditorium, and the planting of trees upstream of the auditorium. The auditorium is constructed in the form of a porch, aligned with the north direction, which drives the wind flow within the auditorium, promoting channelling effects and increasing its speed, and causing discomfort to the users of this structure. To perform the wind comfort assessment, two approaches were used: i) a set of experiments using the wind tunnel (physical approach), with a representative mock-up of the study area; ii) application of the CFD (Computational Fluid Dynamics) model VADIS (numerical approach). Both approaches were used to simulate the baseline scenario and the scenarios considering the set of measures. The physical approach was conducted through a quantitative method, using a hot-wire anemometer, and through a qualitative analysis (visualizations), using laser technology and a fog machine. Both the numerical and physical approaches were performed for three different velocities (2, 4 and 6 m·s⁻¹) and two different directions (North-Northwest and South), corresponding to the prevailing wind speeds and directions of the study area. The numerical results show an effective reduction (with a maximum value of 80%) of the wind speed inside the auditorium through the application of the proposed measures. A wind speed reduction in the range of 20% to 40% was obtained around the audience area for a wind direction from North-Northwest. For southern winds, in the audience zone, the wind speed was reduced by 60% to 80%. Despite that, for southern winds, the design of the barriers generated additional hot spots (high wind speed), namely at the entrance to the auditorium. Thus, a change in the location of the entrance would minimize these effects. The results obtained in the wind tunnel compared well with the numerical data, also revealing the high efficiency of the proposed measures (for both wind directions).
Keywords: urban microclimate, pedestrian comfort, numerical modelling, wind tunnel experiments
Procedia PDF Downloads 230
768 Response Analysis of a Steel Reinforced Concrete High-Rise Building during the 2011 Tohoku Earthquake
Authors: Naohiro Nakamura, Takuya Kinoshita, Hiroshi Fukuyama
Abstract:
The 2011 off the Pacific Coast of Tohoku Earthquake caused considerable damage to wide areas of eastern Japan. A large number of earthquake observation records were obtained at various places. To design more earthquake-resistant buildings and improve earthquake disaster prevention, it is necessary to utilize these data to analyze and evaluate the behavior of a building during an earthquake. This paper presents an earthquake response simulation analysis (hereafter a seismic response analysis) that was conducted using data recorded during the main earthquake (hereafter the main shock) as well as the earthquakes before and after it. The data were obtained at a high-rise steel-reinforced concrete (SRC) building in the bay area of Tokyo. We first give an overview of the building, along with the characteristics of the earthquake motion and the building response during the main shock. The data indicate that there was a change in the natural period before and after the earthquake. Next, we present the results of our seismic response analysis. First, the analysis model and conditions are shown; then, the analysis result is compared with the observational records. Using the analysis result, we then study the effect of soil-structure interaction on the response of the building. By identifying the characteristics of the building during the earthquake (i.e., the 1st natural period and the 1st damping ratio) with the Auto-Regressive eXogenous (ARX) model, we compare the analysis result with the observational records so as to evaluate the accuracy of the response analysis. In this study, a lumped-mass system SR model was used to conduct a seismic response analysis using observational data as input waves. The main results of this study are as follows: 1) The observational records of the 3/11 main shock placed it between a level 1 and a level 2 earthquake. The result of the ground response analysis showed that the maximum shear strain in the ground was about 0.1% and that the possibility of liquefaction occurring was low. 2) During the 3/11 main shock, the observed wave showed that the eigenperiod of the building became longer; this behavior could be generally reproduced in the response analysis. This prolonged eigenperiod was due to the nonlinearity of the superstructure, and the effect of the nonlinearity of the ground seems to have been small. 3) As for the 4/11 aftershock, a continuous analysis was conducted in which the subject seismic wave was input after the 3/11 main shock. The analyzed values generally corresponded well with the observed values. This means that the effect of the nonlinearity of the main shock was retained by the building. It is important to consider this when conducting the response evaluation. 4) The 1st natural period and the damping ratio during a vibration were evaluated by an ARX model. Our results show that the response analysis model in this study is generally good at estimating a change in the response of the building during a vibration.
Keywords: ARX model, response analysis, SRC building, the 2011 off the Pacific Coast of Tohoku Earthquake
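A minimal sketch of ARX-based identification of the 1st natural period and damping ratio, assuming a synthetic single-degree-of-freedom record and sampling step (not the authors' data or code): an ARX(2,1) model is fitted by least squares and its discrete poles are converted back to a period and a damping ratio.

```python
# Illustrative sketch, not the authors' code: identifying the 1st natural period and
# damping ratio of a building from input/response records with a 2nd-order ARX model.
# The synthetic records and the sampling step are placeholders.
import numpy as np

dt = 0.01                                   # sampling interval (s)
t = np.arange(0, 60, dt)
rng = np.random.default_rng(2)
u = rng.normal(size=t.size)                 # "ground" acceleration (white-noise input)

# Synthetic SDOF "building": T1 = 2.0 s, damping = 3 %, simulated as a discrete filter.
wn, zeta = 2 * np.pi / 2.0, 0.03
wd = wn * np.sqrt(1 - zeta**2)
p = np.exp((-zeta * wn + 1j * wd) * dt)
a1_true, a2_true = -2 * p.real, abs(p) ** 2
y = np.zeros_like(u)
for k in range(2, t.size):
    y[k] = -a1_true * y[k - 1] - a2_true * y[k - 2] + u[k - 1]

# ARX(2,1) least-squares fit: y[k] = -a1*y[k-1] - a2*y[k-2] + b1*u[k-1]
Phi = np.column_stack([-y[1:-1], -y[:-2], u[1:-1]])
theta, *_ = np.linalg.lstsq(Phi, y[2:], rcond=None)
a1, a2, b1 = theta

# Convert the discrete poles back to the natural period and damping ratio.
z = np.roots([1.0, a1, a2])
s = np.log(z[0]) / dt
period = 2 * np.pi / abs(s)
damping = -s.real / abs(s)
print(f"identified T1 = {period:.2f} s, damping ratio = {damping:.1%}")
```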
Procedia PDF Downloads 164
767 Evaluation of Antarctic Bacteria as Potential Producers of Cellulolytic Enzymes of Industrial Interest
Authors: Claudio Lamilla, Andrés Santos, Vicente Llanquinao, Jocelyn Hermosilla, Leticia Barrientos
Abstract:
The industry in general is very interested in improving and optimizing industrial processes in order to reduce the costs involved in obtaining raw materials and in production. Thus, an interesting and cost-effective alternative is the incorporation of bioactive metabolites in such processes, an example of these being enzymes, which efficiently catalyze a large number of reactions of industrial and biotechnological interest. In the search for new sources of these active metabolites, Antarctica is one of the least explored places on our planet, where the most drastic conditions of cold, salinity, UVA-UVB radiation and availability of liquid water are present, features that have shaped all life in this very harsh environment, especially the bacteria that live in the different Antarctic ecosystems, which have had to develop different strategies to adapt to these conditions, producing unique biochemical strategies. In this work, the production of cellulolytic enzymes by seven bacterial strains isolated from marine sediments at different sites in Antarctica was evaluated. Isolation of the strains was performed using serial dilutions in the culture medium at M115°C. The identification of the strains was performed using universal primers (27F and 1492R). The enzyme activity assays were performed on R2A medium, with carboxymethyl cellulose (CMC) added as the substrate. Degradation of the substrate was revealed by adding Lugol. The results show that four of the tested strains produce enzymes which degrade the CMC substrate. The molecular identification showed that these bacteria belong to the genera Streptomyces and Pseudoalteromonas, with the Streptomyces strain showing the highest activity. Only some bacteria in marine sediments have the ability to produce these enzymes, perhaps due to their greater adaptability to degrade, at temperatures bordering zero degrees Celsius, some algae that are abundant in this environment and have cellulose as their main structural component. The discovery of new cold-adapted enzymes is of great industrial interest, especially for paper, textiles, detergents, biofuels, food and agriculture. These enzymes represent 8% of the industrial demand worldwide, and their demand is expected to increase in the coming years. They are required mainly in the paper and food industries for the extraction of starch, protein and juices, as well as in the animal feed industry, where treating vegetables and grains helps improve the nutritional value of the food. All this clearly positions Antarctic microorganisms, and their enzymes specifically, as a potential contribution to industry and to novel biotechnological applications.
Keywords: antarctic, bacteria, biotechnological, cellulolytic enzymes
Procedia PDF Downloads 297
766 An Effective Approach to Knowledge Capture in Whole Life Costing in Constructions Project
Authors: Ndibarafinia Young Tobin, Simon Burnett
Abstract:
In spite of the benefits of implementing whole life costing technique as a valuable approach for comparing alternative building designs allowing operational cost benefits to be evaluated against any initial cost increases and also as part of procurement in the construction industry, its adoption has been relatively slow due to the lack of tangible evidence, ‘know-how’ skills and knowledge of the practice, i.e. the lack of professionals in many establishments with knowledge and training on the use of whole life costing technique, this situation is compounded by the absence of available data on whole life costing from relevant projects, lack of data collection mechanisms and so on. This has proved to be very challenging to those who showed some willingness to employ the technique in a construction project. The knowledge generated from a project can be considered as best practices learned on how to carry out tasks in a more efficient way, or some negative lessons learned which have led to losses and slowed down the progress of the project and performance. Knowledge management in whole life costing practice can enhance whole life costing analysis execution in a construction project, as lessons learned from one project can be carried on to future projects, resulting in continuous improvement, providing knowledge that can be used in the operation and maintenance phases of an assets life span. Purpose: The purpose of this paper is to report an effective approach which can be utilised in capturing knowledge in whole life costing practice in a construction project. Design/methodology/approach: An extensive literature review was first conducted on the concept of knowledge management and whole life costing. This was followed by a semi-structured interview to explore the existing and good practice knowledge management in whole life costing practice in a construction project. The data gathered from the semi-structured interview was analyzed using content analysis and used to structure an effective knowledge capturing approach. Findings: From the results obtained in the study, it shows that the practice of project review is the common method used in the capturing of knowledge and should be undertaken in an organized and accurate manner, and results should be presented in the form of instructions or in a checklist format, forming short and precise insights. The approach developed advised that irrespective of how effective the approach to knowledge capture, the absence of an environment for sharing knowledge, would render the approach ineffective. Open culture and resources are critical for providing a knowledge sharing setting, and leadership has to sustain whole life costing knowledge capture, giving full support for its implementation. The knowledge capturing approach has been evaluated by practitioners who are experts in the area of whole life costing practice. The results have indicated that the approach to knowledge capture is suitable and efficient.Keywords: whole life costing, knowledge capture, project review, construction industry, knowledge management
Procedia PDF Downloads 260
765 Functional Dimension of Reuse: Use of Antalya Kaleiçi Traditional Dwellings as Hotel
Authors: Dicle Aydın, Süheyla Büyükşahin Sıramkaya
Abstract:
The concept of conservation gained importance especially in the 19th century, finding value with the changes and developments experienced globally. The basic values at the essence of the concept are important for the continuity of historical and cultural fabrics which have a character special to them. The reuse of settlements and spaces carrying historical and cultural values, within the frame of socio-cultural and socio-economic conditions, is related to functional value. The functional dimension of reuse signifies interrogation of the usage potential of a building for an aim other than its originally determined one. If a building carrying historical and cultural values cannot be used with its own function because of environmental, economic, structural or functional reasons, it is advantageous to maintain it through reuse from the point of view of environmental ecology. By giving it a new function, both a requirement of society is fulfilled and a cultural entity is conserved because of its functional value. In this study, the functional dimension of reuse is exemplified in Antalya Kaleiçi, which has a special location and importance with its natural, cultural and historical heritage characteristics. The Antalya Kaleiçi settlement preserves its liveliness as a touristic urban fabric with its almost fifty thousand years of past, traditional urban form, civil architectural examples of the 18th–19th century reflecting the lifestyle of the region, and monumental buildings. The civil architectural examples in the fabric have a special character formed according to the Mediterranean climate, with their outer sofa (open or closed), one, two or three storeys, courtyards and oriels. In the study, the reuse of five civil architectural examples as a boutique hotel, forming a whole with their environmental arrangements, is investigated, and it is analyzed how the spatial requirements of a boutique hotel are fulfilled in traditional dwellings. The usage of a cultural entity as a boutique hotel is evaluated under the headings of i. functional requirement, ii. satisfactoriness of spatial dimensions, iii. functional organization. There are closed and open restaurants, a kitchen, a pub, a lobby and administrative offices in the hotel, which has a 70-bed capacity and 28 rooms in total. On the second and third floors, there are expansions toward the urban space by means of oriels, as the hotel is surrounded by narrow streets in three directions. This boutique hotel, formed by five unique dwellings with a similar plan scheme in the traditional fabric, is distinctive with its structures opened to the outside and connected to each other by means of courtyards, and with its outdoor spaces, which gained mobility because of the elevation differences between the courtyards.
Keywords: reuse, adaptive reuse, functional dimension of reuse, traditional dwellings
Procedia PDF Downloads 319
764 Expanding the Atelier: Design Lead Academic Project Using Immersive User-Generated Mobile Images and Augmented Reality
Authors: David Sinfield, Thomas Cochrane, Marcos Steagall
Abstract:
While there is much hype around the potential and development of mobile virtual reality (VR), the two key critical success factors are the ease of user experience and the development of a simple user-generated content ecosystem. Educational technology history is littered with the debris of over-hyped revolutionary new technologies that failed to gain mainstream adoption or were quickly superseded. Examples include 3D television, interactive CDROMs, Second Life, and Google Glasses. However, we argue that this is the result of curriculum design that substitutes new technologies into pre-existing pedagogical strategies that are focused upon teacher-delivered content rather than exploring new pedagogical strategies that enable student-determined learning or heutagogy. Visual Communication design based learning such as Graphic Design, Illustration, Photography and Design process is heavily based on the traditional forms of the classroom environment whereby student interaction takes place both at peer level and indeed teacher based feedback. In doing so, this makes for a healthy creative learning environment, but does raise other issue in terms of student to teacher learning ratios and reduced contact time. Such issues arise when students are away from the classroom and cannot interact with their peers and teachers and thus we see a decline in creative work from the student. Using AR and VR as a means of stimulating the students and to think beyond the limitation of the studio based classroom this paper will discuss the outcomes of a student project considering the virtual classroom and the techniques involved. The Atelier learning environment is especially suited to the Visual Communication model as it deals with the creative processing of ideas that needs to be shared in a collaborative manner. This has proven to have been a successful model over the years, in the traditional form of design education, but has more recently seen a shift in thinking as we move into a more digital model of learning and indeed away from the classical classroom structure. This study focuses on the outcomes of a student design project that employed Augmented Reality and Virtual Reality technologies in order to expand the dimensions of the classroom beyond its physical limits. Augmented Reality when integrated into the learning experience can improve the learning motivation and engagement of students. This paper will outline some of the processes used and the findings from the semester-long project that took place.Keywords: augmented reality, blogging, design in community, enhanced learning and teaching, graphic design, new technologies, virtual reality, visual communications
Procedia PDF Downloads 238
763 Early Buddhist History in Architecture before Sui Dynasty
Authors: Yin Ruoxi
Abstract:
During the Eastern Han to Three Kingdoms period, Buddhism had not yet received comprehensive support from the ruling class, and its dissemination remained relatively limited. Based on existing evidence, Buddhist architecture was primarily concentrated in regions central to scripture translation and cultural exchange with the Western Regions, such as Luoyang, Pengcheng, and Guangling. The earliest Buddhist structures largely adhered to the traditional forms of ancient Indian architecture. The frequent wars of the late Western Jin and Sixteen Kingdoms periods compelled the Central Plains culture to interact with other civilizations. As a result, Buddhist architecture gradually integrated characteristics of Central Asian, ancient Indian, and native Chinese styles. In the Northern and Southern Dynasties, Buddhism gained formal support from rulers, leading to the establishment of numerous temples across the Central Plains. The prevalence of warfare, combined with the emergence of Wei-Jin reclusive thought and Buddhism’s own ascetic philosophy, gave rise to mountain temples. Additionally, the eastward spread of rock-cut cave architecture along the Silk Road accelerated the development of such mountain temples. Temple layouts also became increasingly complex with the deeper translation of Buddhist scriptures and the influence of traditional Chinese architectural concepts. From the earliest temples, where the only Buddhist structure was the temple itself, to layouts centered on the stupa with a "front stupa, rear hall" arrangement, and finally to Mahavira Halls becoming the sacred focal point, temple design evolved significantly. The grand halls eventually matched the scale of the central halls in imperial palaces, reflecting the growing deification of the Buddha in the public imagination. The multi-storied wooden pagoda exemplifies Buddhism’s remarkable adaptability during its early introduction to the Central Plains, while the dense- eaved pagoda represents a synthesis of Gandharan stupas, Central Asian temple shrines, ancient Indian devalaya, and Chinese multi-storied pavilions. This form demonstrates Buddhism’s ability to absorb features from diverse cultures during its dissemination. Through its continuous interaction with various cultures, Buddhist architecture achieved sustained development in both form and meaning, laying a solid foundation for the establishment and growth of Buddhism across different regions.Keywords: Buddhism, buddhist architecture, pagoda, temple, South Asian Buddhism, Chinese Buddhism
Procedia PDF Downloads 2
762 Epigenetic and Archeology: A Quest to Re-Read Humanity
Authors: Salma A. Mahmoud
Abstract:
Epigenetic, or alteration in gene expression influenced by extragenetic factors, has emerged as one of the most promising areas that will address some of the gaps in our current knowledge in understanding patterns of human variation. In the last decade, the research investigating epigenetic mechanisms in many fields has flourished and witnessed significant progress. It paved the way for a new era of integrated research especially between anthropology/archeology and life sciences. Skeletal remains are considered the most significant source of information for studying human variations across history, and by utilizing these valuable remains, we can interpret the past events, cultures and populations. In addition to archeological, historical and anthropological importance, studying bones has great implications in other fields such as medicine and science. Bones also can hold within them the secrets of the future as they can act as predictive tools for health, society characteristics and dietary requirements. Bones in their basic forms are composed of cells (osteocytes) that are affected by both genetic and environmental factors, which can only explain a small part of their variability. The primary objective of this project is to examine the epigenetic landscape/signature within bones of archeological remains as a novel marker that could reveal new ways to conceptualize chronological events, gender differences, social status and ecological variations. We attempted here to address discrepancies in common variants such as methylome as well as novel epigenetic regulators such as chromatin remodelers, which to our best knowledge have not yet been investigated by anthropologists/ paleoepigenetists using plethora of techniques (biological, computational, and statistical). Moreover, extracting epigenetic information from bones will highlight the importance of osseous material as a vector to study human beings in several contexts (social, cultural and environmental), and strengthen their essential role as model systems that can be used to investigate and construct various cultural, political and economic events. We also address all steps required to plan and conduct an epigenetic analysis from bone materials (modern and ancient) as well as discussing the key challenges facing researchers aiming to investigate this field. In conclusion, this project will serve as a primer for bioarcheologists/anthropologists and human biologists interested in incorporating epigenetic data into their research programs. Understanding the roles of epigenetic mechanisms in bone structure and function will be very helpful for a better comprehension of their biology and highlighting their essentiality as interdisciplinary vectors and a key material in archeological research.Keywords: epigenetics, archeology, bones, chromatin, methylome
Procedia PDF Downloads 108
761 A 1T1R Nonvolatile Memory with Al/TiO₂/Au and Sol-Gel Processed Barium Zirconate Nickelate Gate in Pentacene Thin Film Transistor
Authors: Ke-Jing Lee, Cheng-Jung Lee, Yu-Chi Chang, Li-Wen Wang, Yeong-Her Wang
Abstract:
To avoid the cross-talk issue of only resistive random access memory (RRAM) cell, one transistor and one resistor (1T1R) architecture with a TiO₂-based RRAM cell connected with solution barium zirconate nickelate (BZN) organic thin film transistor (OTFT) device is successfully demonstrated. The OTFT were fabricated on a glass substrate. Aluminum (Al) as the gate electrode was deposited via a radio-frequency (RF) magnetron sputtering system. The barium acetate, zirconium n-propoxide, and nickel II acetylacetone were synthesized by using the sol-gel method. After the BZN solution was completely prepared using the sol-gel process, it was spin-coated onto the Al/glass substrate as the gate dielectric. The BZN layer was baked at 100 °C for 10 minutes under ambient air conditions. The pentacene thin film was thermally evaporated on the BZN layer at a deposition rate of 0.08 to 0.15 nm/s. Finally, gold (Au) electrode was deposited using an RF magnetron sputtering system and defined through shadow masks as both the source and drain. The channel length and width of the transistors were 150 and 1500 μm, respectively. As for the manufacture of 1T1R configuration, the RRAM device was fabricated directly on drain electrodes of TFT device. A simple metal/insulator/metal structure, which consisting of Al/TiO₂/Au structures, was fabricated. First, Au was deposited to be a bottom electrode of RRAM device by RF magnetron sputtering system. Then, the TiO₂ layer was deposited on Au electrode by sputtering. Finally, Al was deposited as the top electrode. The electrical performance of the BZN OTFT was studied, showing superior transfer characteristics with the low threshold voltage of −1.1 V, good saturation mobility of 5 cm²/V s, and low subthreshold swing of 400 mV/decade. The integration of the BZN OTFT and TiO₂ RRAM devices was finally completed to form 1T1R configuration with low power consumption of 1.3 μW, the low operation current of 0.5 μA, and reliable data retention. Based on the I-V characteristics, the different polarities of bipolar switching are found to be determined by the compliance current with the different distribution of the internal oxygen vacancies used in the RRAM and 1T1R devices. Also, this phenomenon can be well explained by the proposed mechanism model. It is promising to make the 1T1R possible for practical applications of low-power active matrix flat-panel displays.Keywords: one transistor and one resistor (1T1R), organic thin-film transistor (OTFT), resistive random access memory (RRAM), sol-gel
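A minimal sketch of how transfer-curve figures such as the reported saturation mobility and threshold voltage are typically extracted, assuming a synthetic p-type transfer curve and a placeholder gate-dielectric capacitance (only the channel width and length are taken from the abstract): sqrt(Id) is fitted linearly against Vg in the saturation regime.

```python
# Illustrative sketch, not the authors' measurement code: extracting the saturation
# mobility and threshold voltage of a p-type OTFT from a synthetic transfer curve.
# The dielectric capacitance Ci and the I-V data are hypothetical placeholders.
import numpy as np

W, L = 1500e-6, 150e-6           # channel width/length (abstract values); only the ratio is used
Ci = 3.0e-8                       # gate-dielectric capacitance per area (F/cm^2), placeholder

Vg = np.linspace(0, -20, 201)     # gate-voltage sweep (V)
Vth_true, mu_true = -1.1, 5.0     # values used to generate the synthetic curve
Id = np.where(Vg < Vth_true,
              0.5 * mu_true * Ci * (W / L) * (Vg - Vth_true) ** 2,
              1e-12)

# Saturation regime: sqrt(|Id|) is linear in Vg; its slope gives the mobility.
mask = Vg < -5.0                                   # well past threshold
slope, intercept = np.polyfit(Vg[mask], np.sqrt(np.abs(Id[mask])), 1)
mu_sat = 2 * (L / W) * slope ** 2 / Ci              # cm^2/(V s) with Ci in F/cm^2
Vth = -intercept / slope

print(f"saturation mobility = {mu_sat:.2f} cm^2/Vs, Vth = {Vth:.2f} V")
```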
Procedia PDF Downloads 354
760 A Concept in Addressing the Singularity of the Emerging Universe
Authors: Mahmoud Reza Hosseini
Abstract:
The universe is in a continuous expansion process, resulting in the reduction of its density and temperature. Also, by extrapolating back from its current state, the universe at its early times has been studied, in what is known as the big bang theory. According to this theory, moments after creation, the universe was an extremely hot and dense environment. However, its rapid expansion due to nuclear fusion led to a reduction in its temperature and density. This is evidenced through the cosmic microwave background and the structure of the universe at large scales. However, extrapolating back further from this early state reaches a singularity, which cannot be explained by modern physics, and the big bang theory is no longer valid. In addition, one would expect a nonuniform energy distribution across the universe from a sudden expansion. However, highly accurate measurements reveal an equal temperature mapping across the universe, which is contradictory to the big bang principles. To resolve this issue, it is believed that cosmic inflation occurred at the very early stages of the birth of the universe. According to the cosmic inflation theory, the elements which formed the universe underwent a phase of exponential growth due to the existence of a large cosmological constant. The inflation phase allowed the uniform distribution of energy so that an equal maximum temperature could be achieved across the early universe. Also, the evidence of quantum fluctuations at this stage provides a means for studying the types of imperfections the universe would begin with. Although well-established theories such as cosmic inflation and the big bang together provide a comprehensive picture of the early universe and how it evolved into its current state, they are unable to address the singularity paradox at the time of the universe's creation. Therefore, a practical model capable of describing how the universe was initiated is needed. This research series aims at addressing the singularity issue by introducing an energy conversion mechanism. This is accomplished by establishing a state of energy called a “neutral state”, with an energy level referred to as “base energy”, which is capable of converting into other states. Although it follows the same principles, the unique quantum state of the base energy allows it to be distinguished from other states and to have a uniform distribution at the ground level. Although the concept of base energy can be utilized to address the singularity issue, to establish a complete picture, the origin of the base energy should also be identified. This matter is the subject of the first study in the series, “A Conceptual Study for Investigating the Creation of Energy and Understanding the Properties of Nothing”, where it is discussed in detail. Therefore, the concept proposed in this research series provides a road map for enhancing our understanding of the universe's creation from nothing and its evolution, and discusses the possibility of base energy being one of the main building blocks of this universe.
Keywords: big bang, cosmic inflation, birth of universe, energy creation
Procedia PDF Downloads 89
759 Identification of Natural Liver X Receptor Agonists as the Treatments or Supplements for the Management of Alzheimer and Metabolic Diseases
Authors: Hsiang-Ru Lin
Abstract:
Cholesterol plays an essential role in regulating the progression of numerous important diseases, including atherosclerosis and Alzheimer disease, so the development of suitable cholesterol-lowering reagents is urgent. Liver X receptor (LXR) is a ligand-activated transcription factor whose natural ligands are cholesterols, oxysterols and glucose. Once activated, LXR can transactivate the transcription of various genes, including CYP7A1, ABCA1, and SREBP1c, involved in lipid metabolism, glucose metabolism and the inflammatory pathway. Essentially, the upregulation of ABCA1 facilitates cholesterol efflux from the cells and attenuates the production of beta-amyloid (ABeta) 42 in the brain, so LXR is a promising target for developing cholesterol-lowering reagents and preventative treatments of Alzheimer disease. Engelhardia roxburghiana is a deciduous tree growing in India, China, and Taiwan; to date, its chemical constituents have only been reported to exhibit antitubercular and anti-inflammatory effects. In this study, four compounds, engelheptanoxides A and C and engelhardiols A and B, isolated from the root of Engelhardia roxburghiana, were evaluated for their agonistic activity against LXR by transient transfection reporter assays in HepG2 cells. Furthermore, their interaction modes with the LXR ligand-binding pocket were generated by molecular modeling programs. In the cell-based biological assays, engelheptanoxides A and C and engelhardiols A and B, which showed no cytotoxic effect against the proliferation of HepG2 cells, exerted obvious LXR agonistic effects with activity similar to that of T0901317, a novel synthetic LXR agonist. Further modeling studies, including docking and SAR (structure-activity relationship) analyses, showed that these compounds can locate in the LXR ligand-binding pocket in a manner similar to T0901317. Thus, LXR is one of the nuclear receptors targeted by the pharmaceutical industry for developing treatments for Alzheimer and atherosclerosis diseases. Importantly, the cell-based assays, together with molecular modeling studies suggesting a plausible binding mode, demonstrate that engelheptanoxides A and C and engelhardiols A and B function as LXR agonists. This is the first report to demonstrate that the extract of Engelhardia roxburghiana contains LXR agonists. As such, these active components of Engelhardia roxburghiana, or subsequent analogs, may show important therapeutic effects through selective modulation of the LXR pathway.
Keywords: Liver X receptor (LXR), Engelhardia roxburghiana, CYP7A1, ABCA1, SREBP1c, HepG2 cells
Procedia PDF Downloads 420
758 Tackling Inequalities in Regional Health Care: Accompanying an Inter-Sectoral Cooperation Project between University Medicine and Regional Care Structures
Authors: Susanne Ferschl, Peter Holzmüller, Elisabeth Wacker
Abstract:
Ageing populations, advances in medical sciences and digitalization, diversity and social disparities, as well as the increasing need for skilled healthcare professionals, are challenging healthcare systems around the globe. To address these challenges, future healthcare systems need to center on human needs taking into account the living environments that shape individuals’ knowledge of and opportunities to access healthcare. Moreover, health should be considered as a common good and an integral part of securing livelihoods for all people. Therefore, the adoption of a systems approach, as well as inter-disciplinary and inter-sectoral cooperation among healthcare providers, are essential. Additionally, the active engagement of target groups in the planning and design of healthcare structures is indispensable to understand and respect individuals’ health and livelihood needs. We will present the research project b4 – identifying needs | building bridges | developing health care in the social space, which is situated within this reasoning and accompanies the cross-sectoral cooperation project Brückenschlag (building bridges) in a Bavarian district. Brückenschlag seeks to explore effective ways of health care linking university medicine (Maximalversorgung | maximum care) with regional inpatient, outpatient, rehabilitative, and preventive care structures (Regionalversorgung | regional care). To create advantages for both (potential) patients and the involved cooperation partners, project b4 qualitatively assesses needs and motivations among professionals, population groups, and political stakeholders at individual and collective levels. Besides providing an overview of the project structure as well as of regional population and healthcare characteristics, the first results of qualitative interviews conducted with different health experts will be presented. Interviewed experts include managers of participating hospitals, nurses, medical specialists working in the hospital and registered doctors operating in practices in rural areas. At the end of the project life and based on the identified factors relevant to the success -and also for failure- of participatory cooperation in health care, the project aims at informing other districts embarking on similar systems-oriented and human-centered healthcare projects. Individuals’ health care needs in dependence on the social space in which they live will guide the development of recommendations.Keywords: cross-sectoral collaboration in health care, human-centered health care, regional health care, individual and structural health conditions
Procedia PDF Downloads 101
757 Fuzzy Multi-Objective Approach for Emergency Location Transportation Problem
Authors: Bidzina Matsaberidze, Anna Sikharulidze, Gia Sirbiladze, Bezhan Ghvaberidze
Abstract:
In the modern world, emergency management decision support systems are actively used by state organizations, which are interested in extreme and abnormal processes and provide optimal and safe management of the supplies needed for civil and military facilities in geographical areas affected by disasters, earthquakes, fires and other accidents, weapons of mass destruction, terrorist attacks, etc. Obviously, these kinds of extreme events cause significant losses and damage to the infrastructure. In such cases, the use of intelligent support technologies is very important for the quick and optimal location-transportation of emergency services in order to avoid new losses caused by these events. Timely servicing from emergency service centers to the affected disaster regions (response phase) is a key task of the emergency management system. Scientific research in this field occupies an important place in decision-making problems. Our goal was to create an expert knowledge-based intelligent support system, which will serve as an assistant tool providing optimal solutions for the above-mentioned problem. The inputs to the mathematical model of the system are objective data as well as expert evaluations. The outputs of the system are solutions of the Fuzzy Multi-Objective Emergency Location-Transportation Problem (FMOELTP) for disaster regions. The development and testing of the intelligent support system were done on the example of an experimental disaster region (a geographical zone of Georgia), which was generated using simulation modeling. Four objectives are considered in our model. The first objective is to minimize the expectation of the total transportation duration of needed products. The second objective is to minimize the total selection unreliability index of the opened humanitarian aid distribution centers (HADCs). The third objective minimizes the number of agents needed to operate the opened HADCs. The fourth objective minimizes the non-covered demand over all demand points. Possibility chance constraints and objective constraints were constructed based on objective-subjective data. The FMOELTP was constructed in a static and fuzzy environment, since the decisions to be made are taken immediately after the disaster (within a few hours) with the information available at that moment. It is assumed that the requests for products are estimated by homeland security organizations, or their experts, based upon their experience and their evaluation of the disaster's seriousness. Estimated transportation times take into account the routing access difficulty of the region and the infrastructure conditions. We propose an epsilon-constraint method for finding exact solutions to the problem. It is proved that this approach generates the exact Pareto front of the multi-objective location-transportation problem addressed. Sometimes, for large dimensions of the problem, the exact method requires long computing times. Thus, we propose an approximate method that imposes a number of stopping criteria on the exact method. For large dimensions of the FMOELTP, an Estimation of Distribution Algorithm (EDA) approach is developed.
Keywords: epsilon-constraint method, estimation of distribution algorithm, fuzzy multi-objective combinatorial programming problem, fuzzy multi-objective emergency location/transportation problem
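A minimal sketch of the epsilon-constraint idea on a toy two-objective linear program, with placeholder costs, demand and bounds (the actual FMOELTP has four objectives and fuzzy chance constraints): objective 1 is minimized while objective 2 is bounded by a sweep of epsilon values, tracing a small Pareto front.

```python
# Illustrative sketch, not the project's model: the epsilon-constraint method on a toy
# two-objective "location/transportation" LP. All coefficients are placeholders.
import numpy as np
from scipy.optimize import linprog

c1 = np.array([4.0, 6.0, 3.0])        # objective 1: transportation-duration coefficients
c2 = np.array([2.0, 1.0, 5.0])        # objective 2: unreliability coefficients
A_eq = np.array([[1.0, 1.0, 1.0]])    # total shipped amount must cover the demand
b_eq = np.array([10.0])
bounds = [(0, 8)] * 3

pareto = []
for eps in np.linspace(20, 50, 7):     # sweep the bound on objective 2
    res = linprog(c=c1,
                  A_ub=c2.reshape(1, -1), b_ub=[eps],   # epsilon-constraint on f2
                  A_eq=A_eq, b_eq=b_eq, bounds=bounds, method="highs")
    if res.success:
        pareto.append((float(c1 @ res.x), float(c2 @ res.x)))

for f1, f2 in pareto:
    print(f"f1 = {f1:.1f}, f2 = {f2:.1f}")
```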
Procedia PDF Downloads 321
756 Gas-Phase Nondestructive and Environmentally Friendly Covalent Functionalization of Graphene Oxide Paper with Amines
Authors: Natalia Alzate-Carvajal, Diego A. Acevedo-Guzman, Victor Meza-Laguna, Mario H. Farias, Luis A. Perez-Rey, Edgar Abarca-Morales, Victor A. Garcia-Ramirez, Vladimir A. Basiuk, Elena V. Basiuk
Abstract:
Direct covalent functionalization of prefabricated free-standing graphene oxide paper (GOP) is considered as the only approach suitable for systematic tuning of thermal, mechanical and electronic characteristics of this important class of carbon nanomaterials. At the same time, the traditional liquid-phase functionalization protocols can compromise physical integrity of the paper-like material up to its total disintegration. To avoid such undesirable effects, we explored the possibility of employing an alternative, solvent-free strategy for facile and nondestructive functionalization of GOP with two representative aliphatic amines, 1-octadecylamine (ODA) and 1,12-diaminododecane (DAD), as well as with two aromatic amines, 1-aminopyrene (AP) and 1,5-diaminonaphthalene (DAN). The functionalization was performed under moderate heating at 150-180 °C in vacuum. Under such conditions, it proceeds through both amidation and epoxy ring opening reactions. Comparative characterization of pristine and amine-functionalized GOP mats was carried out by using Fourier-transform infrared, Raman, and X-ray photoelectron spectroscopy (XPS), thermogravimetric (TGA) and differential thermal analysis, scanning electron and atomic force microscopy (SEM and AFM, respectively). Besides that, we compared the stability in water, wettability, electrical conductivity and elastic (Young's) modulus of GOP mats before and after amine functionalization. The highest content of organic species was obtained in the case of GOP-ODA, followed by GOP-DAD, GOP-AP and GOP-DAN samples. The covalent functionalization increased mechanical and thermal stability of GOP, as well as its electrical conductivity. The magnitude of each effect depends on the particular chemical structure of amine employed, which allows for tuning a given GOP property. Morphological characterization by using SEM showed that, compared to pristine graphene oxide paper, amine-modified GOP mats become relatively ordered layered assemblies, in which individual GO sheets are organized in a near-parallel pattern. Financial support from the National Autonomous University of Mexico (grants DGAPA-IN101118 and IN200516) and from the National Council of Science and Technology of Mexico (CONACYT, grant 250655) is greatly appreciated. The authors also thank David A. Domínguez (CNyN of UNAM) for XPS measurements and Dr. Edgar Alvarez-Zauco (Faculty of Science of UNAM) for the opportunity to use TGA equipment.Keywords: amines, covalent functionalization, gas-phase, graphene oxide paper
Procedia PDF Downloads 181
755 Aesthetics and Semiotics in Theatre Performance
Authors: Păcurar Diana Istina
Abstract:
Structured in three chapters, the article attempts an X-ray of theatrical aesthetics, correctly understood through the emotions generated in the intimate structure of the spectator that precede the triggering of the viewer's perception, and not through the common but unfortunate superposition of the notion of aesthetics with the style in which a theater show is built. The first chapter contains a brief history of the appearance of the word aesthetic, the formulation of definitions for this new term, as well as its connections with the notions of semiotics, in particular with the perception of the message transmitted. Starting with Aristotle and Plato, and reaching Magritte, their interventions should not be interpreted in the sense that the two scientific concepts can merge into one discipline. The perception that is the object of everyone's analysis, the understanding of meaning, the decoding of the messages sent, and the triggering of feelings that culminate in pleasure, shaping the aesthetic vision, are some elements that keep semiotics and aesthetics distinct, even though they share many methods of analysis. The compositional processes of aesthetic representation and symbolic formation are analyzed in the second part of the paper from perspectives that may or may not include historical, cultural, social, and political processes. Aesthetics and the organization of its symbolic process are treated taking into account expressive activity. The last part of the article explores the notion of aesthetics in applied theater, more specifically in the theater show. Taking the postmodern approach that aesthetics applies to both the creation of an artifact and the reception of that artifact, the intervention of these elements in the theatrical system must be emphasized, that is, the analysis of the problems arising in the stages of the creation, presentation, and reception of the theater performance by the public. The aesthetic process is triggered involuntarily, simultaneously with, or before the moment when people perceive the meaning of the messages transmitted by the work of art. This finding makes the mental process of aesthetics similar or related to that of semiotics. However beauty is perceived individually, the mechanism of its production can be reduced to two steps. The first step presents similarities to Peirce's model, but the process between signifier and signified additionally stimulates the related memory of the evaluation of beauty, adding to the meanings related to the signification itself. Then the second step, a process of comparison, follows, in which one examines whether the object being looked at matches the accumulated memory of beauty. Therefore, even though aesthetics is derived from the conceptual part, the judgment of beauty and, more than that, moral judgment come to be so important to the social activities of human beings that they evolve as a visible process independent of other conceptual contents.
Keywords: aesthetics, semiotics, symbolic composition, subjective joints, signifying, signified
Procedia PDF Downloads 109754 Horizontal Cooperative Game Theory in Hotel Revenue Management
Authors: Ririh Rahma Ratinghayu, Jayu Pramudya, Nur Aini Masruroh, Shi-Woei Lin
Abstract:
This research studies pricing strategy in a cooperative setting of a hotel duopoly selling a perishable product under a fixed capacity constraint, from the perspective of managers. In hotel revenue management, the competitor’s average room rate and occupancy rate should be taken into the manager’s consideration when determining a pricing strategy to generate optimum revenue. This information is not provided by business intelligence or available on the competitor’s website. Thus, Information Sharing (IS) among players might result in improved performance of the pricing strategy. IS is widely adopted in the logistics industry, but IS within the hospitality industry has not been well studied. This research treats IS as one of the cooperative game schemes, alongside the Mutual Price Setting (MPS) scheme. In the off-peak season, a hotel manager arranges the pricing strategy to offer promotion packages and various discounts of up to 60% of the full price to attract customers. A competitor selling a homogeneous product will react in the same way, which triggers a price war. A price war, which generates lower revenue, may be avoided by collaborating on pricing strategy to optimize the payoff for both players. In the MPS cooperative game, players collaborate to set a single room rate applied to both. A cooperative game may avoid the unfavorable payoffs caused by a price war. Research on horizontal cooperative games in logistics shows better performance and payoffs for the players; however, horizontal cooperative games in hotel revenue management have not been demonstrated. This paper aims to develop hotel revenue management models under duopoly cooperative schemes (IS & MPS), which are compared to models under a non-cooperative scheme as well. Each scheme has five models: Capacity Allocation Model, Demand Model, Revenue Model, Optimal Price Model, and Equilibrium Price Model. The Capacity Allocation and Demand Models employ the hotel’s own and the competitor’s full and discount prices as predictors under a non-linear relation. The optimal price is obtained by assuming a revenue maximization motive. The equilibrium price is observed by interacting the hotel’s own and the competitor’s optimal prices through reaction equations. Equilibrium is analyzed using a game theory approach. The same sequence applies to all three schemes. The MPS scheme, in contrast, aims to optimize the total payoff of both players. The case study to which the theoretical models are applied observes two hotels offering a homogeneous product in Indonesia over one year. The Capacity Allocation, Demand, and Revenue Models are built using multiple regression and statistically tested for validation. The case study data confirm that price behaves in a non-linear manner within the demand model. IS models can represent the actual demand and revenue data better than non-IS models. Furthermore, IS enables hotels to earn significantly higher revenue. Thus, duopoly hotel players in general might have reasonable incentives to share information horizontally. During the off-peak season, the MPS models are able to predict the optimal common price for both hotels. However, a Nash equilibrium may not always exist, depending on the actual payoffs of adhering to or betraying the mutual agreement. To optimize performance, a horizontal cooperative game may be chosen over a non-cooperative game. Mathematical models can be used to detect collusion among business players. Empirical testing can be used as policy input for the market regulator in preventing unethical business practices that potentially harm social welfare.Keywords: horizontal cooperative game theory, hotel revenue management, information sharing, mutual price setting
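The pricing logic behind the non-cooperative and MPS schemes can be illustrated with a short sketch. The demand form and every coefficient below are assumptions made only for illustration (the study estimates its own Capacity Allocation, Demand and Revenue Models by multiple regression); the sketch simply shows how a best-response iteration yields the equilibrium prices and how the MPS scheme instead maximizes the joint payoff.

```python
# Illustrative sketch of the two pricing schemes; all functional forms and
# coefficients are hypothetical placeholders, not the study's fitted models.
import numpy as np
from scipy.optimize import minimize_scalar

CAPACITY = 120          # hypothetical number of rooms (fixed capacity constraint)

def demand(p_own, p_comp, p_ref=100.0, k=0.05, d=0.5):
    """Assumed non-linear demand: occupancy falls as the own rate rises above a
    reference level and recovers when the competitor's rate rises as well."""
    return CAPACITY / (1.0 + np.exp(k * (p_own - p_ref - d * (p_comp - p_ref))))

def revenue(p_own, p_comp):
    return p_own * demand(p_own, p_comp)

def best_response(p_comp, bounds=(30.0, 300.0)):
    """Non-cooperative scheme: own optimal rate given the competitor's rate."""
    res = minimize_scalar(lambda p: -revenue(p, p_comp), bounds=bounds, method="bounded")
    return res.x

def nash_equilibrium(p0=100.0, tol=1e-4, max_iter=500):
    """Iterate the two reaction functions until the equilibrium rates stabilise."""
    p1 = p2 = p0
    for _ in range(max_iter):
        p1_new, p2_new = best_response(p2), best_response(p1)
        if abs(p1_new - p1) < tol and abs(p2_new - p2) < tol:
            break
        p1, p2 = p1_new, p2_new
    return p1, p2

def mutual_price_setting(bounds=(30.0, 300.0)):
    """MPS scheme: one shared rate maximising the total payoff of both hotels."""
    res = minimize_scalar(lambda p: -2.0 * revenue(p, p), bounds=bounds, method="bounded")
    return res.x

p1, p2 = nash_equilibrium()
p_mps = mutual_price_setting()
print(f"Non-cooperative equilibrium rates: {p1:.1f}, {p2:.1f}")
print(f"Mutual price setting (joint) rate: {p_mps:.1f}")
print(f"Total payoff, equilibrium vs. MPS: {revenue(p1, p2) + revenue(p2, p1):.0f}"
      f" vs. {2 * revenue(p_mps, p_mps):.0f}")
```

With this assumed demand, the best-response iteration drives both rates below the jointly optimal MPS rate, which mirrors the price-war versus collaboration contrast described in the abstract.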
Procedia PDF Downloads 289753 Microfluidic Plasmonic Bio-Sensing of Exosomes by Using a Gold Nano-Island Platform
Authors: Srinivas Bathini, Duraichelvan Raju, Simona Badilescu, Muthukumaran Packirisamy
Abstract:
A bio-sensing method, based on the plasmonic property of gold nano-islands, has been developed for the detection of exosomes in a clinical setting. The position of the gold plasmon band in the UV-Visible spectrum depends on the size and shape of the gold nanoparticles as well as on the surrounding environment. When various chemical entities are adsorbed or bound, the gold plasmon band shifts toward longer wavelengths, and the shift is proportional to their concentration. Exosomes transport cargoes of molecules and genetic materials to proximal and distal cells. Presently, the standard method for their isolation and quantification from body fluids is ultracentrifugation, which is not practical to implement in a clinical setting. Thus, a versatile and cutting-edge platform is required to selectively detect and isolate exosomes for further analysis at the clinical level. Instead of antibodies, the new sensing protocol makes use of a specially synthesized polypeptide (Vn96) to capture and quantify exosomes from different media by binding their heat shock proteins. The protocol has been established and optimized using a glass substrate, in order to facilitate the next stage, namely the transfer of the protocol to a microfluidic environment. After each step of the protocol, the UV-Vis spectrum was recorded and the position of the gold Localized Surface Plasmon Resonance (LSPR) band was measured. The sensing process was modelled, taking into account the characteristics of the nano-island structure, prepared by thermal convection and annealing. The optimal molar ratios of the most important chemical entities involved in the detection of exosomes were calculated as well. Indeed, it was found that the results of the sensing process depend on two major steps: the molar ratio of streptavidin to biotin-PEG-Vn96 and, in the final step, the capture of exosomes by the biotin-PEG-Vn96 complex. The microfluidic device designed for the sensing of exosomes consists of a glass substrate sealed by a PDMS layer that contains the channel and a collecting chamber. In the device, the solutions of linker, cross-linker, etc., are pumped over the gold nano-islands, and an Ocean Optics spectrometer is used to measure the position of the Au plasmon band at each step of the sensing. The experiments have shown that the shift of the Au LSPR band is proportional to the concentration of exosomes and, thereby, exosomes can be accurately quantified. An important advantage of the method is the ability to discriminate between exosomes having different origins.Keywords: exosomes, gold nano-islands, microfluidics, plasmonic biosensing
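As a rough illustration of how the measured LSPR shift could be turned into an exosome concentration, the sketch below fits a linear calibration curve and inverts it. The calibration points are hypothetical placeholders, not data from the study.

```python
# Hypothetical LSPR calibration sketch: the concentrations and shifts below are
# invented for illustration, not measured values from this work.
import numpy as np

conc = np.array([1e8, 2e8, 4e8, 6e8, 8e8, 1e9])      # exosomes per mL (assumed)
shift_nm = np.array([0.9, 1.8, 3.5, 5.4, 7.1, 8.8])  # LSPR band shift (assumed)

# The abstract reports a shift proportional to concentration, so fit a straight line.
slope, intercept = np.polyfit(conc, shift_nm, 1)

def quantify(measured_shift_nm):
    """Invert the calibration line to estimate concentration from a measured shift."""
    return (measured_shift_nm - intercept) / slope

print(f"Estimated concentration for a 3.0 nm shift: {quantify(3.0):.2e} exosomes/mL")
```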
Procedia PDF Downloads 172752 Predicting the Impact of Scope Changes on Project Cost and Schedule Using Machine Learning Techniques
Authors: Soheila Sadeghi
Abstract:
In the dynamic landscape of project management, scope changes are an inevitable reality that can significantly impact project performance. These changes, whether initiated by stakeholders, external factors, or internal project dynamics, can lead to cost overruns and schedule delays. Accurately predicting the consequences of these changes is crucial for effective project control and informed decision-making. This study aims to develop predictive models to estimate the impact of scope changes on project cost and schedule using machine learning techniques. The research utilizes a comprehensive dataset containing detailed information on project tasks, including the Work Breakdown Structure (WBS), task type, productivity rate, estimated cost, actual cost, duration, task dependencies, scope change magnitude, and scope change timing. Multiple machine learning models are developed and evaluated to predict the impact of scope changes on project cost and schedule. These models include Linear Regression, Decision Tree, Ridge Regression, Random Forest, Gradient Boosting, and XGBoost. The dataset is split into training and testing sets, and the models are trained using the preprocessed data. Cross-validation techniques are employed to assess the robustness and generalization ability of the models. The performance of the models is evaluated using metrics such as Mean Squared Error (MSE) and R-squared. Residual plots are generated to assess the goodness of fit and identify any patterns or outliers. Hyperparameter tuning is performed to optimize the XGBoost model and improve its predictive accuracy. The feature importance analysis reveals the relative significance of different project attributes in predicting the impact on cost and schedule. Key factors such as productivity rate, scope change magnitude, task dependencies, estimated cost, actual cost, duration, and specific WBS elements are identified as influential predictors. The study highlights the importance of considering both cost and schedule implications when managing scope changes. The developed predictive models provide project managers with a data-driven tool to proactively assess the potential impact of scope changes on project cost and schedule. By leveraging these insights, project managers can make informed decisions, optimize resource allocation, and develop effective mitigation strategies. The findings of this research contribute to improved project planning, risk management, and overall project success.Keywords: cost impact, machine learning, predictive modeling, schedule impact, scope changes
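A minimal sketch of the modelling workflow described above is given below, assuming a synthetic stand-in dataset. The column names, target definition and hyperparameters are illustrative only and do not reproduce the study's data or results.

```python
# Illustrative sketch of the scope-change impact modelling pipeline; the data and
# feature names are synthetic placeholders mirroring the attributes listed above.
import numpy as np
import pandas as pd
from sklearn.model_selection import train_test_split, cross_val_score
from sklearn.linear_model import LinearRegression, Ridge
from sklearn.tree import DecisionTreeRegressor
from sklearn.ensemble import RandomForestRegressor, GradientBoostingRegressor
from sklearn.metrics import mean_squared_error, r2_score

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "productivity_rate": rng.uniform(0.5, 1.5, n),
    "estimated_cost": rng.uniform(1e3, 1e5, n),
    "duration": rng.uniform(1, 60, n),
    "n_dependencies": rng.integers(0, 8, n),
    "scope_change_magnitude": rng.uniform(0, 0.5, n),
    "scope_change_timing": rng.uniform(0, 1, n),   # fraction of schedule elapsed
})
# Synthetic target: cost overrun fraction (stands in for the real impact variable)
df["cost_impact"] = (0.8 * df.scope_change_magnitude * (1 + df.scope_change_timing)
                     + 0.1 * df.n_dependencies / 8 + rng.normal(0, 0.02, n))

X = df.drop(columns="cost_impact")
y = df["cost_impact"]
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

models = {
    "Linear Regression": LinearRegression(),
    "Ridge Regression": Ridge(alpha=1.0),
    "Decision Tree": DecisionTreeRegressor(max_depth=6, random_state=42),
    "Random Forest": RandomForestRegressor(n_estimators=200, random_state=42),
    "Gradient Boosting": GradientBoostingRegressor(random_state=42),
    # XGBoost (xgboost.XGBRegressor) could be added here in the same way
}

for name, model in models.items():
    cv = cross_val_score(model, X_train, y_train, cv=5, scoring="r2")
    model.fit(X_train, y_train)
    pred = model.predict(X_test)
    print(f"{name:20s} CV R2={cv.mean():.3f}  test MSE={mean_squared_error(y_test, pred):.4f}"
          f"  test R2={r2_score(y_test, pred):.3f}")

# Feature importance from the fitted Random Forest (one way to rank predictors)
rf = models["Random Forest"]
for feat, imp in sorted(zip(X.columns, rf.feature_importances_), key=lambda t: -t[1]):
    print(f"{feat:25s} {imp:.3f}")
```

The same skeleton extends naturally to the schedule-impact target and to hyperparameter tuning (e.g., a grid search over the boosting model) as described in the abstract.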
Procedia PDF Downloads 40751 BiVO₄‑Decorated Graphite Felt as Highly Efficient Negative Electrode for All-Vanadium Redox Flow Batteries
Authors: Daniel Manaye Kabtamu, Anteneh Wodaje Bayeh
Abstract:
With the development and utilization of new energy technology, people’s demand for large-scale energy storage systems has become increasingly urgent. The vanadium redox flow battery (VRFB) is one of the most promising technologies for grid-scale energy storage applications because of its numerous attractive features, such as long cycle life, high safety, and flexible design. However, the relatively low energy efficiency and high production cost of the VRFB still limit its practical implementation. It is therefore of great interest to enhance its energy efficiency and reduce its cost. One of the main components of the VRFB that can strongly impact the efficiency and final cost is the electrode material, which provides the reaction sites for the redox couples (V²⁺/V³⁺ and VO²⁺/VO₂⁺). Graphite felt (GF) is a typical carbon-based material commonly employed as an electrode for VRFBs due to its low cost and good chemical and mechanical stability. However, pristine GF exhibits insufficient wettability, low specific surface area, and poor kinetic reversibility, leading to low energy efficiency of the battery. Therefore, it is crucial to further modify the GF electrode to improve its electrochemical performance towards the VRFB by employing active electrocatalysts, such as less expensive metal oxides. In this study, a low-cost plate-like bismuth vanadate (BiVO₄) material was successfully fabricated through a simple one-step hydrothermal route and employed as an electrocatalyst to decorate the GF for use as the negative electrode in a VRFB. The experimental results show that BiVO₄-3h exhibits the optimal electrocatalytic activity and reversibility for the vanadium redox couples among all samples. The energy efficiency of the VRFB cell assembled with BiVO₄-decorated GF as the negative electrode is found to be 75.42% at 100 mA cm⁻², which is about 10.24% more efficient than that of the cell assembled with the heat-treated graphite felt (HT-GF) electrode. The possible reasons for the activity enhancement can be ascribed to the existence of oxygen vacancies in the BiVO₄ lattice structure and the relatively high surface area of BiVO₄, which provide more active sites for facilitating the vanadium redox reactions. Furthermore, the BiVO₄-GF electrode obstructs the competitive irreversible hydrogen evolution reaction on the negative side of the cell, and it also has better wettability. Impressively, BiVO₄-GF as the negative electrode shows good stability over 100 cycles. Thus, BiVO₄-GF is a promising negative electrode candidate for practical VRFB applications.Keywords: BiVO₄ electrocatalyst, electrochemical energy storage, graphite felt, vanadium redox flow battery
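For reference, the cycling efficiencies quoted above are customarily defined as follows; the abstract does not state its exact expressions, so these are the standard flow-battery definitions rather than the authors' own formulas.

```latex
% Standard cycling-efficiency definitions for a flow battery cell
% (assumed here; the abstract does not spell out its expressions).
\begin{align}
  \mathrm{CE} &= \frac{\int I_{\mathrm{d}}\,\mathrm{d}t}{\int I_{\mathrm{c}}\,\mathrm{d}t}, \qquad
  \mathrm{EE} = \frac{\int V_{\mathrm{d}} I_{\mathrm{d}}\,\mathrm{d}t}{\int V_{\mathrm{c}} I_{\mathrm{c}}\,\mathrm{d}t}, \qquad
  \mathrm{VE} = \frac{\mathrm{EE}}{\mathrm{CE}},
\end{align}
% where the subscripts d and c denote the discharge and charge half-cycles.
```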
Procedia PDF Downloads 1573750 Structural and Biochemical Characterization of Red and Green Emitting Luciferase Enzymes
Authors: Wael M. Rabeh, Cesar Carrasco-Lopez, Juliana C. Ferreira, Pance Naumov
Abstract:
Bioluminescence, the emission of light from a biological process, is found in various living organisms including bacteria, fireflies, beetles, fungi and various marine organisms. Luciferase is an enzyme that catalyzes a two-step oxidation of luciferin in the presence of Mg²⁺ and ATP to produce oxyluciferin, releasing energy in the form of light. The luciferase assay is used in biological research and clinical applications for in vivo imaging, cell proliferation, and protein folding and secretion analysis. The luciferase enzyme consists of two domains: a large N-terminal domain (residues 1-436) connected to a small C-terminal domain (residues 440-544) by a flexible loop that functions as a hinge for opening and closing the active site. The two domains are separated by a large cleft housing the active site that closes after binding the substrates, luciferin and ATP. Even though all insect luciferases catalyze the same chemical reaction and share 50% to 90% sequence homology and high structural similarity, they emit light of different colors, from green at 560 nm to red at 640 nm. Currently, the majority of the structural and biochemical studies have been conducted on green-emitting firefly luciferases. To address the color emission mechanism, we expressed and purified two luciferase enzymes with blue-shifted green and red emission from the indigenous Brazilian species Amydetes fanestratus and Phrixothrix, respectively. The two enzymes naturally emit light of different colors, and they are an excellent system for studying the color-emission mechanism of luciferases, as the currently proposed mechanisms are based on mutagenesis studies. Using a vapor-diffusion method and a high-throughput approach, we crystallized and solved the crystal structure of both enzymes, at 1.7 Å and 3.1 Å resolution respectively, using X-ray crystallography. The free enzyme adopted two open conformations in the crystallographic unit cell that differ from those of the previously characterized firefly luciferase. The blue-shifted green luciferase crystallized as a monomer, similar to other luciferases reported in the literature, while the red luciferase crystallized as an octamer and was also purified as an octamer in solution. The octamer conformation is the first of its kind for any insect luciferase, and it might be related to the red color emission. Structurally designed mutations confirmed the importance of the transition between the open and closed conformations in the fine-tuning of the color, and the characterization of other interesting mutants is underway.Keywords: bioluminescence, enzymology, structural biology, x-ray crystallography
Procedia PDF Downloads 326749 Spectroscopic (Ir, Raman, Uv-Vis) and Biological Study of Copper and Zinc Complexes and Sodium Salt with Cichoric Acid
Authors: Renata Swislocka, Grzegorz Swiderski, Agata Jablonska-Trypuc, Wlodzimierz Lewandowski
Abstract:
Forming a complex of a phenolic compound with a metal alters not only the physicochemical properties of the ligand (including an increase in stability or changes in lipophilicity) but also its biological activity, including antioxidant, antimicrobial and many other properties. As part of our previous projects, we examined the physicochemical and antimicrobial properties of phenolic acids and their complexes with metals naturally occurring in foods. Previously, we studied the complexes of manganese(II), copper(II), cadmium(II) and alkali metals with ferulic, caffeic and p-coumaric acids. In the framework of this study, the physicochemical and biological properties of cichoric acid, its sodium salt, and its complexes with copper and zinc were investigated. Cichoric acid is a derivative of both caffeic acid and tartaric acid. It was first isolated from Cichorium intybus (chicory), but it also occurs in significant amounts in Echinacea, particularly E. purpurea, dandelion leaves, basil, lemon balm and aquatic plants, including algae and sea grasses. A variety of methods were used to study the spectroscopic and biological properties of cichoric acid, its sodium salt, and its complexes with zinc and copper. The antioxidant properties were studied in relation to selected stable radicals (the DPPH reduction and FRAP methods). As a result, the structure and spectroscopic properties of cichoric acid and its complexes with selected metals were determined in the solid state and in solution. The IR and Raman spectra of cichoric acid displayed a number of bands derived from vibrations of the caffeic and tartaric acid moieties. At 1746 and 1716 cm⁻¹, bands assigned to the vibrations of the carbonyl group of tartaric acid occurred. In the spectra of the metal complexes with cichoric acid, these bands disappeared, which indicated that the metal ion was coordinated by the carboxylic groups of tartaric acid. In the spectra of the sodium salt, characteristic broad bands of carboxylate anion vibrations occurred. In the spectra of cichoric acid and its salt and complexes, a number of bands derived from the vibrations of the aromatic ring (caffeic acid) were assigned. Upon metal-ligand attachment, the wavenumbers of these bands changed. The impact of metals on the antioxidant properties of cichoric acid was also examined. Cichoric acid has a high antioxidant potential. Complexation by metals (zinc, copper) did not significantly affect its antioxidant capacity. The work was supported by the National Science Centre, Poland (grant no. 2015/17/B/NZ9/03581).Keywords: chicoric acid, metal complexes, natural antioxidant, phenolic acids
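For reference, DPPH radical-scavenging activity is customarily expressed as shown below; the abstract does not give its exact formula, so this is the standard expression rather than the authors' own.

```latex
% Customary DPPH radical-scavenging expression (assumed; not stated in the abstract),
% where A denotes the absorbance of the DPPH solution with and without the sample.
\[
  \mathrm{Scavenging}\,(\%) = \frac{A_{\mathrm{control}} - A_{\mathrm{sample}}}{A_{\mathrm{control}}} \times 100
\]
```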
Procedia PDF Downloads 337748 Protective Effect of Ginger Root Extract on Dioxin-Induced Testicular Damage in Rats
Authors: Hamid Abdulroof Saleh
Abstract:
Background: Dioxins are among the most widely distributed environmental pollutants. Dioxins are formed as by-products in some industries, such as the paper industry, and can also be released into the atmosphere during the burning of garbage and waste, especially medical waste. Dioxins can be found in the adipose tissues of animals in the food chain as well as in human breast milk. 2,3,7,8-Tetrachlorodibenzo-p-dioxin (TCDD) is the most toxic member of the large group of dioxins. Humans are exposed to TCDD through contaminated food items such as meat, fish, milk products, and eggs. Recently, natural formulations for reducing or eliminating TCDD toxicity have been in focus. Ginger rhizome (Zingiber officinale R., family: Zingiberaceae) is used worldwide as a spice. Both antioxidative and androgenic activities of Z. officinale have been reported in animal models. Researchers have shown that ginger oil has a dominant protective effect against DNA damage, acting as a scavenger of oxygen radicals, and might be used as an antioxidant. Aim of the work: The present study was undertaken to evaluate the toxic effect of TCDD on the structure and histoarchitecture of the testis and the protective role of co-administration of ginger root extract in preventing this toxicity. Materials & Methods: Adult male rats of the Sprague-Dawley strain were assigned to four groups of eight rats each: a control group, a dioxin-treated group (given TCDD at a dose of 100 ng/kg Bwt/day by gavage), a ginger-treated group (given 50 mg/kg Bwt/day of ginger root extract by gavage), and a dioxin- and ginger-treated group (given TCDD at 100 ng/kg Bwt/day and ginger root extract at 50 mg/kg Bwt/day by gavage). After three weeks, the rats were weighed and sacrificed, and the testes were removed and weighed. The testes were processed for routine paraffin embedding and staining. Tissue sections were examined for different morphometric and histopathological changes. Results: Dioxin administration had harmful effects on body weight, testis weight and other morphometric parameters of the testis. In addition, it produced varying degrees of damage to the seminiferous tubules, which were shrunken and devoid of mature spermatids. The basement membrane was disorganized, with vacuolization and loss of germinal cells. The co-administration of ginger root extract produced an obvious improvement in the above changes, and the morphometric and histopathological changes of the seminiferous tubules were reversed. Conclusion: Ginger root extract treatment in this study was successful in reversing all morphometric and histological changes of dioxin-induced testicular damage. Therefore, it showed a protective effect on the testis against dioxin toxicity.Keywords: dioxin, ginger, rat, testis
Procedia PDF Downloads 418747 Healthy Architecture Applied to Inclusive Design for People with Cognitive Disabilities
Authors: Santiago Quesada-García, María Lozano-Gómez, Pablo Valero-Flores
Abstract:
The recent digital revolution, together with modern technologies, is changing the environment and the way people interact with inhabited space. However, in society, the elderly are a very broad and varied group that presents serious difficulties in understanding these modern technologies. Outpatients with cognitive disabilities, such as those suffering from Alzheimer's disease (AD), are distinguished within this cluster. This population group is growing constantly and has specific requirements for its inhabited space. Within architecture, which is one of the health humanities, environments are designed to promote well-being and improve the quality of life for all. Buildings, as well as the tools and technologies integrated into them, must be accessible and inclusive and must foster health. In this new digital paradigm, artificial intelligence (AI) appears as an innovative resource to help this population group improve their autonomy and quality of life. Some experiences and solutions, such as those that interact with users through chatbots and voicebots, show the potential of AI in its practical application. In the design of healthy spaces, the integration of AI in architecture will allow the living environment to become a kind of 'exo-brain' that can make up for certain cognitive deficiencies in this population. The objective of this paper is to address, from the discipline of neuroarchitecture, how modern technologies can be integrated into everyday environments and be an accessible resource for people with cognitive disabilities. For this, the methodology has a mixed structure. On the one hand, from an empirical point of view, the research carries out a review of the existing literature on the applications of AI to built space, following the foundations of the critical review. On the other hand, as an unconventional architectural research approach, an experimental analysis is proposed, based on people with AD as a source of data, to study how the environment in which they live influences their regular activities. The results presented in this communication are part of the progress achieved in the competitive R&D&I project ALZARQ (PID2020-115790RB-I00). These outcomes are aimed at the specific needs of people with cognitive disabilities, especially those with AD, although, due to the comfort and wellness that the solutions entail, they can also be extrapolated to society as a whole. As a provisional conclusion, it can be stated that, in the immediate future, AI will be an essential element in the design and construction of healthy new environments. Through this emerging technology, the discipline of architecture has the compositional resources to build an 'exo-brain' capable of becoming a personal assistant for the inhabitants, interacting with them proactively and contributing to their general well-being. The main objective of this work is to show how this is possible.Keywords: Alzheimer’s disease, artificial intelligence, healthy architecture, neuroarchitecture, architectural design
Procedia PDF Downloads 61746 In Vitro Evaluation of a Chitosan-Based Adhesive to Treat Bone Fractures
Authors: Francisco J. Cedano, Laura M. Pinzón, Camila I. Castro, Felipe Salcedo, Juan P. Casas, Juan C. Briceño
Abstract:
Complex fractures located in articular surfaces are challenging to treat, and their reduction with conventional treatments could compromise the functionality of the affected limb. An adhesive material to treat those fractures is desirable for orthopedic surgeons. This adhesive must be biocompatible and have high adhesion to the bone surface in an aqueous environment. The proposed adhesive is based on chitosan, given its adhesive and biocompatibility properties. Chitosan is mixed with calcium carbonate and hydroxyapatite, which contribute structural support and gel-like behavior, and glutaraldehyde is used as a cross-linking agent to maintain the mechanical performance of the adhesive in an aqueous environment. This work aims to evaluate the rheological properties, adhesion strength and biocompatibility of the proposed adhesive using in vitro tests. The gelification process of the adhesive was monitored by oscillatory rheometry in an ARG-2 TA Instruments rheometer, using a parallel plate geometry of 22 mm and a gap of 1 mm. Time sweep experiments were conducted at 1 Hz frequency, 1% strain and 37°C from 0 to 2400 s. Adhesion strength is measured using a butt joint test with bovine cancellous bone fragments as substrates. The test is conducted 5 min, 20 min and 24 hours after curing the adhesive under water at 37°C. Biocompatibility is evaluated by a cytotoxicity test in a fibroblast cell culture using the MTT assay and SEM. Rheological results showed that the average gelification time of the adhesive is 820 ± 107 s and that it reaches storage modulus magnitudes of up to 10⁶ Pa; the adhesive shows solid-like behavior. The butt joint test showed a tensile bond strength of 28.6 ± 9.2 kPa for the adhesive cured for 24 hours. There was also no significant difference in adhesion strength between 20 minutes and 24 hours. The MTT assay showed 70 ± 23% active cells on the sixth day of culture; this percentage is estimated with respect to a positive control (cells with culture medium and bovine serum only). High-vacuum SEM observation made it possible to localize and study the morphology of the fibroblasts present in the adhesive. All captured fibroblasts presented the typical flattened structure in SEM, with filopodia attached to the adhesive surface. This project reports a chitosan-based adhesive that is biocompatible, as shown by the high percentage of active cells in the MTT test, and these results were corroborated by SEM. It also has adhesive properties under conditions that model the clinical application, and the adhesion strength does not decrease between 5 minutes and 24 hours.Keywords: bioadhesive, bone adhesive, calcium carbonate, chitosan, hydroxyapatite, glutaraldehyde
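For reference, the two figures reported above are customarily obtained from the expressions below. These are assumed standard definitions (including the usual MTT absorbance reading around 570 nm), not formulas stated in the abstract.

```latex
% Customary definitions (assumed; not spelled out in the abstract):
\begin{align}
  \sigma_{\mathrm{bond}} &= \frac{F_{\mathrm{max}}}{A_{\mathrm{bonded}}}
    && \text{tensile bond strength from the butt joint test,}\\
  \mathrm{Viability}\,(\%) &= \frac{A_{570,\,\mathrm{sample}}}{A_{570,\,\mathrm{control}}} \times 100
    && \text{MTT assay relative to the positive control.}
\end{align}
```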
Procedia PDF Downloads 321745 Modeling Diel Trends of Dissolved Oxygen for Estimating the Metabolism in Pristine Streams in the Brazilian Cerrado
Authors: Wesley A. Saltarelli, Nicolas R. Finkler, Adriana C. P. Miwa, Maria C. Calijuri, Davi G. F. Cunha
Abstract:
Stream metabolism is an indicator of ecosystem disturbance due to the influence of the catchment on the structure of the water bodies. The study of respiration and photosynthesis allows the estimation of energy fluxes through food webs and the analysis of autotrophic and heterotrophic processes. We aimed to evaluate the metabolism of streams located in the Brazilian savannah, the Cerrado (Sao Carlos, SP), by determining and modeling the daily changes of dissolved oxygen (DO) in the water over one year. Three water bodies with minimal anthropogenic interference in their surroundings were selected: Espraiado (ES), Broa (BR) and Canchim (CA). Every two months, water temperature, pH and conductivity are measured with a multiparameter probe. Nitrogen and phosphorus forms are determined according to standard methods. Also, canopy cover percentages are estimated in situ with a spherical densitometer. Stream flows are quantified through the conservative tracer (NaCl) method. For the metabolism study, DO (PME-MiniDOT) and light (Odyssey Photosynthetic Active Radiation) sensors log data every ten minutes for at least three consecutive days. The reaeration coefficient (k2) is estimated through the tracer gas (SF6) method. Finally, we model the variations in DO concentrations and calculate the rates of gross and net primary production (GPP and NPP) and respiration based on the one-station method described in the literature. Three sampling campaigns were carried out in October and December 2015 and February 2016 (the next will be in April, June and August 2016). The results from the first two periods are already available. The mean water temperatures in the streams were 20.0 ± 0.8 °C (Oct) and 20.7 ± 0.5 °C (Dec). In general, electrical conductivity values were low (ES: 20.5 ± 3.5 µS/cm; BR: 5.5 ± 0.7 µS/cm; CA: 33 ± 1.4 µS/cm). The mean pH values were 5.0 (BR), 5.7 (ES) and 6.4 (CA). The mean concentrations of total phosphorus were 8.0 µg/L (BR), 66.6 µg/L (ES) and 51.5 µg/L (CA), whereas soluble reactive phosphorus concentrations were always below 21.0 µg/L. The BR stream had the lowest concentration of total nitrogen (0.55 mg/L) compared to CA (0.77 mg/L) and ES (1.57 mg/L). The average discharges were 8.8 ± 6 L/s (ES), 11.4 ± 3 L/s (BR) and 2.4 ± 0.5 L/s (CA). The average percentages of canopy cover were 72% (ES), 75% (BR) and 79% (CA). Significant daily changes were observed in the DO concentrations, reflecting predominantly heterotrophic conditions (respiration exceeded the gross primary production, with negative net primary production). GPP varied from 0 to 0.4 g/m²·d (in Oct and Dec), and R varied from 0.9 to 22.7 g/m²·d (Oct) and from 0.9 to 7 g/m²·d (Dec). The predominance of heterotrophic conditions suggests increased vulnerability of the ecosystems to artificial inputs of organic matter that would demand oxygen. The investigation of the metabolism in the pristine streams can help define natural reference conditions of trophic state.Keywords: low-order streams, metabolism, net primary production, trophic state
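The one-station method referred to above can be sketched as a simple diel DO balance, dDO/dt = GPP(t) − ER + k2·(DOsat − DO). The parameter values and light series below are illustrative placeholders, and volumetric rates are used for simplicity (the study reports areal rates), so this is only a sketch of the approach, not the study's calculation.

```python
# Sketch of a single-station diel DO model; all numbers are illustrative placeholders.
import numpy as np

dt_min = 10.0                        # logging interval (minutes), as in the study
t = np.arange(0, 24 * 60, dt_min)    # one day, in minutes

# Hypothetical photosynthetically active radiation (daylight between 6 h and 18 h)
par = np.clip(np.sin(np.pi * (t / 60.0 - 6.0) / 12.0), 0.0, None)

# Assumed daily-scale parameters (volumetric, g O2 m^-3 d^-1, and d^-1), not measured values
gpp_daily = 0.3        # gross primary production
er_daily = 5.0         # ecosystem respiration
k2 = 15.0              # reaeration coefficient (from the SF6 tracer method)
do_sat = 8.2           # saturation DO at the observed temperature (g m^-3)

# Forward-Euler integration of dDO/dt = GPP(t) - ER + k2*(DOsat - DO)
dt_day = dt_min / (60.0 * 24.0)
do = np.empty_like(t, dtype=float)
do[0] = 7.5
gpp_t = gpp_daily * par / (par.sum() * dt_day)   # distribute daily GPP over daylight hours
for i in range(1, len(t)):
    ddo = gpp_t[i - 1] - er_daily + k2 * (do_sat - do[i - 1])
    do[i] = do[i - 1] + ddo * dt_day

nep = gpp_daily - er_daily
print(f"Net ecosystem production: {nep:.2f} g O2 m^-3 d^-1 (negative = heterotrophic)")
print(f"Simulated DO range over the day: {do.min():.2f} - {do.max():.2f} g m^-3")
```

In practice, the metabolism parameters are fitted so that the modeled diel curve matches the logged DO series; the sketch runs the balance forward with assumed values to show the heterotrophic signature (respiration exceeding production) described above.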
Procedia PDF Downloads 258744 An Evolutionary Approach for QAOA for Max-Cut
Authors: Francesca Schiavello
Abstract:
This work aims to create a hybrid algorithm that combines the Quantum Approximate Optimization Algorithm (QAOA) with an Evolutionary Algorithm (EA) in place of traditional gradient-based optimization processes. QAOAs were first introduced in 2014, when the algorithm performed better than the best known classical algorithm for Max-Cut graphs. Whilst classical algorithms have since improved and returned to being faster and more efficient, this was a huge milestone for quantum computing, and that work is often used as a benchmark and a foundation for exploring QAOA variants. This, alongside other famous algorithms like Grover’s or Shor’s, highlights the potential that quantum computing holds. It also points to the prospect of a real quantum advantage which, if the hardware continues to improve, could constitute a revolutionary era. Given that the hardware is not there yet, many scientists are working on the software side of things in the hopes of future progress. Some of the major limitations holding back quantum computing are the quality of qubits and the noisy interference they generate when creating solutions, the barren plateaus that effectively hinder the optimization search in the latent space, and the limited number of available qubits, which restricts the scale of the problem that can be solved. These three issues are intertwined and are part of the motivation for using EAs in this work. Firstly, EAs are not based on gradient or linear optimization methods for the search in the latent space, and because of their freedom from gradients, they should suffer less from barren plateaus. Secondly, given that this algorithm performs a search in the solution space through a population of solutions, it can also be parallelized to speed up the search and the optimization. The evaluation of the cost function, as in many other algorithms, is notoriously slow, and the ability to parallelize it can drastically improve the competitiveness of QAOAs with respect to purely classical algorithms. Thirdly, because of the nature and structure of EAs, solutions can be carried forward in time, making them more robust to noise and uncertainty. Preliminary results show that the EA attached to QAOA can perform on par with the traditional QAOA using a COBYLA optimizer, which is a linear-approximation-based method, and in some instances it can even produce a better Max-Cut. Whilst the final objective of the work is to create an algorithm that can consistently beat the original QAOA, or its variants, through either speedups or quality of the solution, this initial result is promising and shows the potential of EAs in this field. Further tests need to be performed on an array of different graphs, with the parallelization aspect of the work commencing in October 2023 and tests on real hardware scheduled for early 2024.Keywords: evolutionary algorithm, max cut, parallel simulation, quantum optimization
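The hybrid idea can be illustrated with a toy sketch: a small numpy statevector simulation of QAOA for Max-Cut whose angles are tuned by a simple elitist evolutionary loop instead of a gradient-based or COBYLA optimizer. The graph, population size and mutation scale below are arbitrary choices for illustration and do not reflect the study's actual setup.

```python
# Toy sketch: evolutionary search over QAOA angles (gamma, beta) for Max-Cut,
# evaluated with a plain numpy statevector simulator. All settings are illustrative.
import numpy as np
from itertools import product

rng = np.random.default_rng(1)
n = 5
edges = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 0), (0, 2)]   # small test graph
p_layers = 2                                               # QAOA depth

# Pre-compute the Max-Cut value of every basis state z in {0,1}^n
states = np.array(list(product([0, 1], repeat=n)))
cut_values = np.array([sum(z[i] != z[j] for i, j in edges) for z in states], dtype=float)

def qaoa_expectation(params):
    """Expected cut value of the QAOA state for angles [gamma_1..p, beta_1..p]."""
    gammas, betas = params[:p_layers], params[p_layers:]
    psi = np.full(2 ** n, 1 / np.sqrt(2 ** n), dtype=complex)   # |+>^n
    for gamma, beta in zip(gammas, betas):
        psi = psi * np.exp(-1j * gamma * cut_values)            # cost layer (diagonal)
        psi = psi.reshape((2,) * n)
        for q in range(n):                                      # mixer: RX(2*beta) on each qubit
            psi = np.moveaxis(psi, q, 0)
            a, b = psi[0].copy(), psi[1].copy()
            psi[0] = np.cos(beta) * a - 1j * np.sin(beta) * b
            psi[1] = np.cos(beta) * b - 1j * np.sin(beta) * a
            psi = np.moveaxis(psi, 0, q)
        psi = psi.reshape(2 ** n)
    return float(np.real(np.sum(np.abs(psi) ** 2 * cut_values)))

# Simple elitist evolutionary loop: mutate the fittest angle vectors, keep the best
pop_size, n_gen, sigma = 20, 60, 0.15
pop = rng.uniform(0, np.pi, size=(pop_size, 2 * p_layers))
for gen in range(n_gen):
    fitness = np.array([qaoa_expectation(ind) for ind in pop])
    parents = pop[np.argsort(fitness)[-pop_size // 2:]]          # top half survive
    children = parents + rng.normal(0, sigma, size=parents.shape)
    pop = np.vstack([parents, children])

best = pop[np.argmax([qaoa_expectation(ind) for ind in pop])]
print(f"Best expected cut ~ {qaoa_expectation(best):.3f} (max possible = {cut_values.max():.0f})")
```

Because each individual in the population is evaluated independently, the fitness loop is the natural place to parallelize, which is the aspect of the work described above as commencing next.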
Procedia PDF Downloads 60