Search results for: radiation processing
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 4913

3023 Influence of 50 Hz, 1 mT Electromagnetic Fields on Serum Sex Hormones of Male Rats

Authors: Randa M. Mostafa, Y. Moustafa

Abstract:

During daily life, we are continuously exposed to extremely low frequency electromagnetic fields (ELF-EMFs) generated by electric appliances. The possible relation between exposure to ELF-EMFs and adverse health effects has been the subject of long debate. Extremely low frequency describes radiation frequencies below 300 hertz (Hz); it is of particular public health relevance because of the widespread use of electrical power at 50-60 Hz in most countries. This study set out to investigate the impact of chronic exposure of male rats to a 50 Hz, 1 mT ELF-EMF over periods of 1, 2, and 4 weeks on the serum concentrations of FSH, LH, and testosterone. Sixty male albino rats were divided into six groups and continuously exposed to a 50 Hz, 1 mT ELF-EMF generated by a magnetic field chamber for 1, 2, or 4 weeks. For each experimental point, a sham-treated group served as control. Serum testosterone, LH, and FSH were assayed. Serum testosterone showed no significant changes. FSH showed a significant increase over the sham-exposed group after 1 week of field exposure; LH showed a significant increase over the sham-exposed group only after 4 weeks of field exposure. Detailed molecular studies should be carried out in the future to explain the possible interactions between ELF-EMFs and the hypothalamic-pituitary-gonadal axis.

Keywords: extremely low frequency electromagnetic fields, testosterone, follicle stimulating hormone, LH

Procedia PDF Downloads 444
3022 Vascular Targeted Photodynamic Therapy Monitored by Real-Time Laser Speckle Imaging

Authors: Ruth Goldschmidt, Vyacheslav Kalchenko, Lilah Agemy, Rachel Elmoalem, Avigdor Scherz

Abstract:

Vascular Targeted Photodynamic therapy (VTP) is a new modality for selective cancer treatment that can lead to complete tumor ablation. A photosensitizer, a bacteriochlorophyll derivative in our case, is first administered to the patient; the tumor area is then illuminated by a near-IR laser for photoactivation. The photoactivated drug releases reactive oxygen species (ROS) in the circulation, which react with blood cells and the endothelium, leading to occlusion of the blood vasculature. If the blood vessels are only partially closed, the tumor may recover and cancer cells could survive; on the other hand, excessive treatment may lead to toxicity of nearby healthy tissues. Simultaneous VTP monitoring and image processing independent of the photoexcitation laser has not yet been reported, to our knowledge. Here we present a method for blood flow monitoring using real-time laser speckle imaging (RTLSI) of the tumor during VTP. We have synthesized over the years a library of bacteriochlorophyll derivatives, among them WST11 and STL-6014, both water-soluble derivatives that are retained in the blood vasculature through their partial binding to HSA. WST11 has been approved in Mexico for VTP treatment of prostate cancer at a certain drug dose and time/intensity of illumination. Application to other bacteriochlorophyll derivatives or other cancers may require different treatment parameters (such as light/drug administration). VTP parameters for STL-6014 are still under study. This new derivative differs from WST11 mainly by its lack of the central palladium and by its conjugation to an Arg-Gly-Asp (RGD) sequence. RGD is a tumor-specific ligand used for targeting necrotic tumor domains through its affinity for αVβ3 integrin receptors, which enables the study of cell-targeted VTP. We developed a special RTLSI module, based on the Labview software environment, for data processing.
The new module acquires raw laser speckle images and calculates the temporal statistics of time-integrated speckles in real time, without additional off-line processing. Using RTLSI, we could monitor the tumor's blood flow following VTP in a CT26 colon carcinoma ear model. VTP with WST11 induced an immediate slowdown of blood flow within the tumor and a complete final flow arrest, after some sporadic reperfusions. If irradiation continued further, blood flow also stopped in the vessels of the surrounding healthy tissue, which emphasizes the significance of light dose control. Using our RTLSI system, we could prevent additional healthy-tissue damage by controlling the illumination time and restricting blood flow arrest to the tumor only. In addition, we found that VTP with STL-6014 was most effective, in terms of tumor ablation success in vivo and blood vessel flow arrest, when photoactivation was conducted 4 h post-injection. In conclusion, RTLSI application should allow optimization of VTP efficacy vs. toxicity in both the preclinical and clinical arenas.
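The temporal speckle statistic behind RTLSI can be illustrated with a short sketch; this is a minimal numpy analogue on synthetic frames, not the authors' Labview module:

```python
import numpy as np

def temporal_speckle_contrast(stack):
    """Per-pixel temporal speckle contrast K = sigma_t / mean_t over a
    stack of raw speckle frames (shape: time x height x width).
    Moving scatterers blur the speckle over time (low K); static tissue
    keeps a fully developed, strongly fluctuating pattern (high K)."""
    mean_t = stack.mean(axis=0)
    std_t = stack.std(axis=0)
    return std_t / np.maximum(mean_t, 1e-12)  # guard against division by zero

# Synthetic illustration: "static" pixels fluctuate strongly frame to frame,
# "flowing" pixels are averaged out by motion during integration.
rng = np.random.default_rng(0)
static = rng.uniform(0.0, 2.0, size=(64, 8, 8))
flowing = 1.0 + 0.05 * rng.standard_normal((64, 8, 8))
k_static = temporal_speckle_contrast(static).mean()
k_flowing = temporal_speckle_contrast(flowing).mean()
```

A drop in K therefore signals flow, and K rising back toward its static value signals the flow arrest the therapy aims for.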

Keywords: blood vessel occlusion, cancer treatment, photodynamic therapy, real time imaging

Procedia PDF Downloads 210
3021 The Design and Implementation of a Calorimeter for Evaluation of the Thermal Performance of Materials: The Case of Phase Change Materials

Authors: Ebrahim Solgi, Zahra Hamedani, Behrouz Mohammad Kari, Ruwan Fernando, Henry Skates

Abstract:

The use of thermal energy storage (TES) as part of a passive design strategy can reduce a building's energy demand. TES materials do this by increasing the lag between energy consumption and energy supply, absorbing, storing, and releasing energy in a controlled manner. The increase of lightweight construction in the building industry has made it harder to utilize thermal mass. Consequently, Phase Change Materials (PCMs) are a promising alternative, as they can be manufactured in thin layers and used with lightweight construction to store latent heat. This research investigates utilizing PCMs, with the first step being measuring their performance under experimental conditions. Doing so requires three components: the first is a calorimeter for measuring indoor thermal conditions, the second is a pyranometer for recording solar conditions (global, diffuse, and direct radiation), and the third is a data-logger for recording temperature and humidity over the studied period. This paper reports on the design and implementation of an experimental setup used to measure the thermal characteristics of PCMs as part of a wall construction. The experimental model has been simulated in EnergyPlus to create a reliable simulation model that warrants further investigation.

Keywords: phase change materials, EnergyPlus, experimental evaluation, night ventilation

Procedia PDF Downloads 239
3020 Investigating the Effect of Advertising Photo Editing on the Virtual Purchase Decision Based on Quantitative Electroencephalogram (EEG) Parameters

Authors: Parya Tabei, Maryam Habibifar

Abstract:

Decision-making is an important cognitive function that can be defined as the process of choosing an option among available options to achieve a specific goal. Consumer 'need' is the main driver of purchasing decisions. Human decision-making while buying products online is subject to various factors, one of which is the quality and effect of advertising photos. Advertising photo editing can have a significant impact on people's virtual purchase decisions: the technique improves the quality and overall appearance of photos by adjusting aspects such as brightness, contrast, colors, cropping, resizing, and added filters. This study investigates the effect of editing advertising photos on the virtual purchase decisions of customers using EEG data. A group of 30 participants was asked to react to 24 edited and unedited images while their EEG was recorded. Analysis of the EEG data revealed increased alpha wave activity in the occipital regions (O1, O2) for both edited and unedited images, which is related to visual processing and attention. There was also an increase in beta wave activity in the frontal regions (FP1, FP2, F4, F8) when participants viewed edited images, suggesting involvement of cognitive processes such as decision-making and evaluation of advertising content. Gamma wave activity also increased in various regions, especially the frontal and parietal regions, which are associated with higher cognitive functions such as attention, memory, and perception, when viewing the edited images. While the visual processing reflected by alpha waves remained consistent across visual conditions, editing advertising photos appeared to boost neural activity in frontal and parietal regions associated with decision-making processes.
These findings suggest that photo editing could influence consumer perceptions during virtual shopping experiences by modulating brain activity related to product assessment and purchase decisions.
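The band-power quantities underlying such an analysis can be sketched with a simple periodogram; the signal and sampling rate below are synthetic assumptions, not the study's recordings:

```python
import numpy as np

def band_power(signal, fs, f_lo, f_hi):
    """Absolute power in the [f_lo, f_hi] Hz band, from a one-sided
    FFT periodogram (a minimal estimator; real pipelines typically
    use Welch averaging over epochs)."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / (fs * len(signal))
    mask = (freqs >= f_lo) & (freqs <= f_hi)
    return psd[mask].sum() * (freqs[1] - freqs[0])

fs = 256                       # Hz, an assumed EEG sampling rate
t = np.arange(0, 4, 1 / fs)    # one 4-second epoch
# Synthetic channel: strong 10 Hz "alpha" plus weaker 20 Hz "beta"
eeg = 2.0 * np.sin(2 * np.pi * 10 * t) + 0.5 * np.sin(2 * np.pi * 20 * t)
alpha = band_power(eeg, fs, 8, 13)    # dominated by the 10 Hz component
beta = band_power(eeg, fs, 13, 30)    # dominated by the 20 Hz component
```

Comparing such band powers per electrode between edited and unedited image conditions is the kind of quantitative comparison the abstract describes.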

Keywords: virtual purchase decision, advertising photo, EEG parameters, decision making

Procedia PDF Downloads 22
3019 Mg AZ31B Alloy Processed through ECASD

Authors: P. Fernández-Morales, D. Peláez, C. Isaza, J. M. Meza, E. Mendoza

Abstract:

Mg AZ31B alloy sheets were processed through the equal-channel angular sheet drawing (ECASD) process, following routes A and C at room temperature and varying the processing speed. SEM was used to analyze the microstructure; the grain size was refined and the presence of twins was observed. Vickers microhardness and tensile testing were carried out to evaluate the mechanical properties, showing, in general, a remarkable increase after the first pass and slight increases during subsequent passes, and that route C produces a more uniform property distribution through the thickness of the samples.

Keywords: ECASD, Mg Alloy, mechanical properties, microstructure

Procedia PDF Downloads 347
3018 Mechanical Properties of Poly(Propylene)-Based Graphene Nanocomposites

Authors: Luiza Melo De Lima, Tito Trindade, Jose M. Oliveira

Abstract:

The development of thermoplastic-based graphene nanocomposites has been of great interest not only to the scientific community but also to different industrial sectors. Due to their possible performance improvement and weight reduction, thermoplastic nanocomposites hold great promise as a new class of materials. These nanocomposites are of relevance for the automotive industry, namely because the CO2 emission limits imposed by European Commission (EC) regulations can be met without compromising the car's performance, by reducing its weight. Thermoplastic polymers have some advantages over thermosetting polymers, such as higher productivity, lower density, and recyclability. In the automotive industry, for example, poly(propylene) (PP) is a common thermoplastic polymer, representing more than half of the polymeric raw material used in automotive parts. Graphene-based materials (GBM) are potential nanofillers that can improve the properties of polymer matrices at very low loading; in comparison to other composites, such as fiber-based composites, the weight reduction can positively affect their processing and future applications. However, the properties and performance of GBM/polymer nanocomposites depend on the type of GBM and polymer matrix, the degree of dispersion, and especially the type of interactions between the fillers and the polymer matrix. In order to take advantage of the superior mechanical strength of GBM, strong interfacial strength between GBM and the polymer matrix is required for efficient stress transfer from GBM to the polymer. Thus, chemical compatibilizers and physicochemical modifications have been reported as important tools during the processing of these nanocomposites. In this study, PP-based nanocomposites were obtained by a simple melt blending technique, using a Brabender-type mixer. Graphene nanoplatelets (GnPs) were applied as structural reinforcement.
Two compatibilizers were used to improve the interaction between the PP matrix and the GnPs: PP grafted with maleic anhydride (PPgMA) and PPgMA modified with a tertiary amine alcohol (PPgDM). The samples for tensile and Charpy impact tests were obtained by injection molding. The results suggested that the presence of GnPs can increase the mechanical strength of the polymer; however, GnPs can also reduce impact resistance, making the nanocomposites more brittle than neat PP. Incorporating the compatibilizers increases the impact resistance, suggesting that they enhance the adhesion between PP and GnPs. Compared to neat PP, the increase in Young's modulus of the non-compatibilized nanocomposite demonstrated that GnP incorporation can improve the stiffness of the polymer; this trend can be related to the several physical crosslinking points between the PP matrix and the GnPs. Furthermore, the decrease in yield strain of PP/GnPs, together with the enhancement of Young's modulus, confirms that GnP incorporation led to an increase in stiffness but a decrease in toughness. The results also demonstrated that incorporating the compatibilizers did not affect Young's modulus or yield strain compared to the non-compatibilized nanocomposite. The incorporation of these compatibilizers improved the nanocomposites' mechanical properties compared both to the non-compatibilized nanocomposite and to a neat PP reference sample.
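The stiffness trend described above can be roughly anticipated with the Halpin-Tsai micromechanics model, a standard estimate that is not part of the paper itself; the property values below are illustrative assumptions:

```python
def halpin_tsai_modulus(e_matrix, e_filler, aspect_ratio, vol_frac):
    """Halpin-Tsai estimate of a platelet-reinforced composite's Young's
    modulus. xi = 2 * aspect_ratio is the common choice for platelet
    fillers aligned with the load direction."""
    xi = 2.0 * aspect_ratio
    ratio = e_filler / e_matrix
    eta = (ratio - 1.0) / (ratio + xi)
    return e_matrix * (1.0 + xi * eta * vol_frac) / (1.0 - eta * vol_frac)

# Illustrative values (assumed): PP ~1.5 GPa, graphene platelets ~1000 GPa,
# aspect ratio ~100, 1 vol% loading.
e_composite = halpin_tsai_modulus(e_matrix=1.5, e_filler=1000.0,
                                  aspect_ratio=100, vol_frac=0.01)
```

Even at 1 vol%, the model predicts a noticeable modulus increase over the neat matrix, consistent with the stiffening trend the abstract reports.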

Keywords: graphene nanoplatelets, mechanical properties, melt blending processing, poly(propylene)-based nanocomposites

Procedia PDF Downloads 172
3017 Password Cracking on Graphics Processing Unit Based Systems

Authors: N. Gopalakrishna Kini, Ranjana Paleppady, Akshata K. Naik

Abstract:

Password authentication is one of the most widely used methods of authenticating legal users of computers and defending against attackers. There are many ways to authenticate users of a system, and many password cracking methods have also been developed. This paper proposes how password cracking can best be performed on a CPU-GPGPU based system. The main objective of this work is to show how quickly a password can be cracked, given some knowledge of computer security and password cracking, if sufficient security is not incorporated into the system.
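The core idea of exhaustive password cracking, which a GPGPU accelerates by hashing many candidates in parallel, can be sketched serially; MD5 is used purely for illustration:

```python
import hashlib
import itertools
import string

def crack_md5(target_hash, charset=string.ascii_lowercase, max_len=4):
    """Exhaustively hash candidate passwords until one matches the target
    MD5 digest. A GPGPU implementation runs this same test on thousands
    of candidates concurrently; the serial loop shows the logic."""
    for length in range(1, max_len + 1):
        for combo in itertools.product(charset, repeat=length):
            candidate = "".join(combo)
            if hashlib.md5(candidate.encode()).hexdigest() == target_hash:
                return candidate
    return None  # not found within this charset / length budget

# A weak 3-letter lowercase password falls almost instantly even on a CPU
target = hashlib.md5(b"key").hexdigest()
found = crack_md5(target)
```

The search space grows as charset_size**length, which is exactly why GPU parallelism, and conversely longer passwords with larger character sets, matter.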

Keywords: GPGPU, password cracking, secret key, user authentication

Procedia PDF Downloads 269
3016 A Novel Machine Learning Approach to Aid Agrammatism in Non-fluent Aphasia

Authors: Rohan Bhasin

Abstract:

Agrammatism in non-fluent aphasia can be defined as a language disorder wherein a patient can only use content words (nouns, verbs, and adjectives) for communication; their speech is devoid of functional word types like conjunctions and articles, generating speech with extremely rudimentary grammar. Past approaches involve speech therapy of some order, with conversation analysis used to analyse pre-therapy speech patterns and qualitative changes in conversational behaviour after therapy. We describe a novel method to generate functional words (prepositions, articles, etc.) around content words (nouns, verbs, and adjectives) using a combination of natural language processing and deep learning algorithms, which can be applied to assist communication. The approach the paper investigates is a sequence-to-sequence (seq2seq) model: an LSTM-based seq2seq model takes in a sequence of inputs and outputs a sequence. This approach needs a significant amount of training data, with each training example being a pair such as (content words, complete sentence). We generate such data by starting with complete sentences from a text source and removing the functional words, leaving just the content words. However, this approach requires a lot of training data to produce coherent output. The assumption of this approach is that the content words received in the input are preserved, i.e., their order does not change after the functional grammar is slotted in. This is a potential limitation in cases of severe agrammatism, where such order might not be inherently correct; the approach is therefore suited to assisting communication in mild agrammatism in non-fluent aphasia. By generating these function words around the content words, we can provide meaningful sentence options to the patient for articulate conversation.
Our project thus translates the use case of generating sentences from content words into an assistive technology for non-fluent aphasia patients.
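The data-generation step described above, removing function words from complete sentences to form (content words, complete sentence) pairs, can be sketched as follows; the function-word list is an illustrative assumption, since a real system would use a POS tagger:

```python
# Hypothetical function-word list for illustration only; a production
# pipeline would identify function words with a part-of-speech tagger.
FUNCTION_WORDS = {"a", "an", "the", "is", "are", "was", "were", "to",
                  "of", "in", "on", "at", "and", "but", "or", "with"}

def make_training_pair(sentence):
    """Strip function words from a complete sentence, yielding a
    (content_words, full_sentence) pair for seq2seq training: the model
    learns to map the first element back to the second."""
    tokens = sentence.lower().split()
    content = [w for w in tokens if w not in FUNCTION_WORDS]
    return " ".join(content), sentence.lower()

src, tgt = make_training_pair("The dog is in the garden")
```

Here the source side becomes the content-word telegram ("dog garden") and the target side is the grammatical sentence the seq2seq model must reconstruct.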

Keywords: aphasia, expressive aphasia, assistive algorithms, neurology, machine learning, natural language processing, language disorder, behaviour disorder, sequence to sequence, LSTM

Procedia PDF Downloads 149
3015 Detection of Phoneme [S] Mispronunciation for Sigmatism Diagnosis in Adults

Authors: Michal Krecichwost, Zauzanna Miodonska, Pawel Badura

Abstract:

The diagnosis of sigmatism is mostly based on the observation of the articulatory organs. It is, however, not always possible to precisely observe the vocal apparatus, in particular inside the oral cavity of the patient. Speech processing can help objectify the therapy and simplify the verification of its progress. In the described study, a methodology for the classification of the incorrectly pronounced phoneme [s] is proposed. The recordings come from adults and were registered with a speech recorder at a sampling rate of 44.1 kHz and a resolution of 16 bits. A database of pathological and normative speech was collected for the study, including reference assessments provided by speech therapy experts. Ten adult subjects were asked to simulate a certain type of sigmatism under the supervision of a speech therapy expert. In the recordings, the analyzed phone [s] was surrounded by vowels, viz. ASA, ESE, ISI, SPA, USU, YSY. Thirteen mel-frequency cepstral coefficients (MFCC) and the root mean square (RMS) value are calculated within each frame of the analyzed phoneme. Additionally, three fricative formants, along with their corresponding amplitudes, are determined for the entire segment. In order to aggregate the information within the segment, the average value of each MFCC coefficient is calculated; all features of other types are aggregated by means of their 75th percentile. This method of feature aggregation reduces the size of the feature vector used in classification. A binary SVM (support vector machine) classifier is employed at the phoneme recognition stage: the first class consists of pathological phones, the other of normative ones. The proposed feature vector yields classification sensitivity and specificity above the 90% level for individual logatomes. Employing the fricative formant-based information improves the sole-MFCC classification results by an average of 5 percentage points.
The study shows that employing parameters specific to the selected phones improves the efficiency of pathology detection compared to traditional methods of speech signal parameterization.
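The segment-level aggregation described above (per-coefficient mean for MFCCs, 75th percentile for everything else) can be sketched on synthetic frame data; the frame counts and feature layout below are illustrative assumptions:

```python
import numpy as np

def aggregate_segment_features(mfcc_frames, other_frames):
    """Aggregate frame-level features over one phoneme segment:
    MFCCs by their per-coefficient mean, all other feature types
    (RMS, fricative formant amplitudes, ...) by their 75th percentile,
    yielding one fixed-length vector per segment for the SVM."""
    mfcc_agg = mfcc_frames.mean(axis=0)                # 13 values
    other_agg = np.percentile(other_frames, 75, axis=0)
    return np.concatenate([mfcc_agg, other_agg])

rng = np.random.default_rng(1)
mfcc = rng.standard_normal((40, 13))   # e.g. 40 frames x 13 MFCCs
other = rng.standard_normal((40, 4))   # e.g. RMS + 3 formant amplitudes
vec = aggregate_segment_features(mfcc, other)
```

Whatever the segment length, every phoneme is reduced to the same 17-dimensional vector, which is what makes a standard binary SVM applicable.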

Keywords: computer-aided pronunciation evaluation, sibilants, sigmatism diagnosis, speech processing

Procedia PDF Downloads 267
3014 Effect of Birks Constant and Defocusing Parameter on Triple-to-Double Coincidence Ratio Parameter in Monte Carlo Simulation-GEANT4

Authors: Farmesk Abubaker, Francesco Tortorici, Marco Capogni, Concetta Sutera, Vincenzo Bellini

Abstract:

This project concerns the detection efficiency of the portable triple-to-double coincidence ratio (TDCR) system at the National Institute of Metrology of Ionizing Radiation (INMRI-ENEA), which allows direct activity measurement and radionuclide standardization for pure beta-emitting or pure electron-capture radionuclides. The dependence of the simulated detection efficiency of the TDCR system, computed with the Geant4 Monte Carlo code, on the Birks factor (kB) and the defocusing parameter has been examined, especially for low-energy beta-emitting radionuclides such as 3H and 14C, for which this dependence is relevant. The results of this analysis can be used to select the best kB factor and defocusing parameter for computing the theoretical TDCR parameter value. The theoretical results were compared with available measurements from the ENEA portable TDCR detector for some pure beta-emitting radionuclides. This analysis improved knowledge of the characteristics of the ENEA TDCR detector, which can be used as a traveling instrument for in-situ measurements, with particular benefits in many applications in nuclear medicine and the nuclear energy industry.

Keywords: Birks constant, defocusing parameter, GEANT4 code, TDCR parameter

Procedia PDF Downloads 134
3013 From Industry 4.0 to Agriculture 4.0: A Framework to Manage Product Data in Agri-Food Supply Chain for Voluntary Traceability

Authors: Angelo Corallo, Maria Elena Latino, Marta Menegoli

Abstract:

The agri-food value chain involves various stakeholders with different roles. All of them abide by national and international rules and leverage marketing strategies to advance their products. Food products and their processing phases carry a large amount of data that are often not used to inform the final customer. Some of these data, if fittingly identified and used, can benefit the single company and/or the whole supply chain, creating a match between marketing techniques and voluntary traceability strategies. Moreover, buying models have lately changed: customers care about wellbeing and food quality. Food citizenship and food democracy were born, leveraging transparency, sustainability, and food information needs. The Internet of Things (IoT) and analytics, some of the innovative technologies of Industry 4.0, have a significant impact on the market and will act as a main thrust towards a genuine '4.0 change' for agriculture. However, realizing a traceability system is not simple because of the complexity of the agri-food supply chain: many actors involved, different business models, environmental variations impacting products and/or processes, and extraordinary climate changes. In order to support companies on a traceability path, starting from business model analysis and the related business processes, a Framework to Manage Product Data in Agri-Food Supply Chain for Voluntary Traceability was conceived. Studying each process task and leveraging modeling techniques makes it possible to identify the information held by different actors along the agri-food supply chain. IoT technologies for data collection and analytics techniques for data processing supply information useful for increasing intra-company efficiency and competitiveness in the market.
All the recovered information can be shown through IT solutions and mobile applications, making it accessible to the company, the entire supply chain, and the consumer, with a view to guaranteeing transparency and quality.
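One way to picture the per-task information such a framework collects is a minimal traceability record; the field names and values below are illustrative assumptions, not the framework's actual schema:

```python
from dataclasses import dataclass, asdict
from datetime import datetime, timezone

@dataclass
class TraceEvent:
    """A hypothetical traceability record one IoT node might emit for a
    single process task along the supply chain."""
    batch_id: str
    actor: str          # e.g. farm, processor, distributor, retailer
    task: str
    timestamp: str      # ISO 8601, UTC
    sensor_data: dict   # readings from IoT probes attached to the task

event = TraceEvent(
    batch_id="LOT-2019-001",
    actor="processor",
    task="pasteurization",
    timestamp=datetime(2019, 5, 1, tzinfo=timezone.utc).isoformat(),
    sensor_data={"temp_C": 72.0, "duration_s": 15},
)
record = asdict(event)  # serializable form for IT / mobile solutions
```

Chaining such records per batch across actors is what lets the final consumer see where and under which conditions a product was handled.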

Keywords: agriculture 4.0, agri-food supply chain, industry 4.0, voluntary traceability

Procedia PDF Downloads 134
3012 Acoustic Analysis of Ball Bearings to Identify Localised Race Defect

Authors: M. Solairaju, Nithin J. Thomas, S. Ganesan

Abstract:

Almost every rotating machine element contains bearings within its structure; in particular, rolling element bearings such as cylindrical roller bearings and deep groove ball bearings are frequently used. Improper handling, excessive loading, and improper lubrication and sealing cause bearing damage. Hence, health monitoring of bearings is an important aspect of prolonging the life of machinery and automotives. This paper presents modeling and analysis of the acoustic response of a deep groove ball bearing with localized race defects. Most ball bearings, especially in machine tool spindles and high-speed applications, are preloaded in the axial direction, and the present study is carried out with axial preload. Based on the vibration response, the orbital motion of the inner race is studied, and it was found that the oscillation takes place predominantly in the axial direction. A simplified acoustic radiation pattern of the bearing vibration is computed using the dipole model, and the sound pressure level is estimated for the defect-free and race-defect cases. The acoustic response gives a better indication for identifying the defective bearing. The computed sound signal is visualized diagrammatically using the Symmetrised Dot Pattern (SDP), which gives better visual distinction between the defective and defect-free bearings.
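The Symmetrised Dot Pattern visualization mentioned above maps each signal sample to mirrored polar points; a minimal sketch, with typical (assumed) lag, gain, and symmetry-arm choices:

```python
import numpy as np

def sdp_points(x, lag=1, gain=60.0, arms=6):
    """Map a 1-D signal to Symmetrised Dot Pattern polar points.
    Radius: normalised amplitude of sample i. Angle: arm angle plus
    (minus, for the mirror image) the normalised amplitude of sample
    i+lag scaled by `gain` degrees. The lag/gain/arm values here are
    common illustrative choices, not the paper's settings."""
    x = np.asarray(x, dtype=float)
    r = (x - x.min()) / (x.max() - x.min())   # normalise to [0, 1]
    ri = r[:-lag]
    dtheta = gain * r[lag:]
    pts = []
    for k in range(arms):
        base = k * 360.0 / arms
        pts.append(np.column_stack([ri, base + dtheta]))  # arm
        pts.append(np.column_stack([ri, base - dtheta]))  # mirrored arm
    return np.vstack(pts)  # rows of (radius, angle in degrees)

sig = np.sin(np.linspace(0, 4 * np.pi, 200))
pts = sdp_points(sig)
```

Plotting these points in polar coordinates yields the snowflake-like pattern whose shape changes visibly when a race defect modulates the sound signal.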

Keywords: bearing, dipole, noise, sound

Procedia PDF Downloads 280
3011 Numerical Modeling of a Hybrid Photovoltaic-Thermoelectric Solar Unit by Applying Various Cross-Sections of Cooling Ducts

Authors: Ziba Khalili, Mohsen Sheikholeslami, Ladan Momayez

Abstract:

Combining photovoltaic/thermal (PVT) systems with a thermoelectric (TE) module can raise energy yields, since the TE module boosts the system's energy conversion efficiency. In the current study, a PVT system integrated with a TE module was designed and simulated in ANSYS Fluent 19.2. A copper heat transfer tube (HTT) was employed for cooling the photovoltaic (PV) cells. Four shapes of HTT cross-section, i.e., circular, square, elliptical, and triangular, with equal cross-section areas, were investigated. Also investigated were the influence of a Cu-Al2O3/water hybrid nanofluid (0.024% volume concentration), the fluid inlet velocity (uᵢ), and the amount of solar radiation (G) on the PV temperature (Tₚᵥ) and system performance. The ambient temperature (Tₐ), wind speed (u𝓌), and fluid inlet temperature (Tᵢ) were considered to be 25°C, 1 m/s, and 27°C, respectively. According to the obtained data, the triangular case had the greatest impact on reducing Tₚᵥ compared to the other cases. In the triangular case, examination of the effect of the hybrid nanofluid showed that its use at 800 W/m² led to a reduction of Tₚᵥ by 0.6% compared to water, at an inlet velocity of 0.19 m/s. Moreover, the thermal efficiency and the overall electrical efficiency of the system improved by 0.93% and 0.22%, respectively, at 0.19 m/s. In the triangular case where G was 800 W/m² and the inlet velocity 0.19 m/s, the highest thermal efficiency, thermal power (Eₜ), and overall electrical efficiency were obtained: 72.76%, 130.84 W, and 12.03%, respectively.
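The thermal efficiency quoted above follows the standard collector definition (useful heat gained by the coolant over incident solar power); a minimal sketch with assumed, illustrative values rather than the study's geometry:

```python
def thermal_efficiency(m_dot, cp, t_out, t_in, g, area):
    """Thermal efficiency of a solar collector:
    useful heat  Q = m_dot * cp * (t_out - t_in)   [W]
    divided by incident solar power  G * A          [W]."""
    return m_dot * cp * (t_out - t_in) / (g * area)

# Illustrative numbers (assumed, not from the study): water coolant,
# 0.01 kg/s flow, 4 K temperature rise, 800 W/m2 on a 0.25 m2 panel.
eta_th = thermal_efficiency(m_dot=0.01, cp=4180.0,
                            t_out=31.0, t_in=27.0,
                            g=800.0, area=0.25)
```

A cross-section that improves heat extraction raises the coolant temperature gain at a given flow rate, which is how duct shape feeds directly into this efficiency.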

Keywords: electrical performance, photovoltaic/thermal, thermoelectric, hybrid nanofluid, thermal efficiency

Procedia PDF Downloads 62
3010 Creating Energy Sustainability in an Enterprise

Authors: John Lamb, Robert Epstein, Vasundhara L. Bhupathi, Sanjeev Kumar Marimekala

Abstract:

As we enter the new era of Artificial Intelligence (AI) and Cloud Computing, we rely heavily on the machine learning and natural language processing capabilities of AI, and on energy-efficient hardware and software devices, in almost every industry sector. In these sectors, much emphasis is on developing new and innovative methods for producing and conserving energy and slowing the depletion of natural resources. The core pillars of sustainability are economic, environmental, and social, informally referred to as the 3 P's (People, Planet, and Profits). The 3 P's play a vital role in creating a core sustainability model in the enterprise. Natural resources are continually being depleted, so there is a growing focus on, and demand for, renewable energy. With this growing demand, there is also growing concern in many industries about how to reduce carbon emissions and conserve natural resources while adopting sustainability in corporate business models and policies. In this paper, we discuss driving forces such as climate change, natural disasters, pandemics, disruptive technologies, corporate policies, scaled business models, and emerging social media and AI platforms that influence the three main pillars of sustainability (3 P's). Through this paper, we aim to bring an overall perspective on enterprise strategies, with a primary focus on the cultural shifts involved in adopting energy-efficient operational models. Overall, many industries across the globe are incorporating core sustainability principles such as reducing energy costs, reducing greenhouse gas (GHG) emissions, reducing waste and increasing recycling, adopting advanced monitoring and metering infrastructure, and reducing server footprint and compute resources (shared IT services, cloud computing, and application modernization), with the vision of a sustainable environment.

Keywords: climate change, pandemic, disruptive technology, government policies, business model, machine learning and natural language processing, AI, social media platform, cloud computing, advanced monitoring, metering infrastructure

Procedia PDF Downloads 91
3009 Automatic Identification of Pectoral Muscle

Authors: Ana L. M. Pavan, Guilherme Giacomini, Allan F. F. Alves, Marcela De Oliveira, Fernando A. B. Neto, Maria E. D. Rosa, Andre P. Trindade, Diana R. De Pina

Abstract:

Mammography is a worldwide imaging modality used to diagnose breast cancer, even in asymptomatic women. Due to their wide availability, mammograms can be used to measure breast density and to predict cancer development. Women with increased mammographic density have a four- to six-fold increase in their risk of developing breast cancer. Therefore, studies have aimed to accurately quantify mammographic breast density. In clinical routine, radiologists perform image evaluations through BIRADS (Breast Imaging Reporting and Data System) assessment; however, this method has inter- and intra-individual variability. An automatic, objective method to measure breast density could relieve the radiologist's workload by providing a first opinion. However, the pectoral muscle is a high-density tissue with characteristics similar to fibroglandular tissue, which makes it hard to automatically quantify mammographic breast density; therefore, pre-processing is needed to segment the pectoral muscle, which may otherwise be erroneously quantified as fibroglandular tissue. The aim of this work was to develop an automatic algorithm to segment and extract the pectoral muscle in digital mammograms. The database consisted of thirty medio-lateral oblique digital mammograms from São Paulo Medical School. This study was developed with ethical approval from the authors' institutions and national review panels under protocol number 3720-2010. An algorithm was developed, on the Matlab® platform, for the pre-processing of images; it uses image processing tools to automatically segment and extract the pectoral muscle from mammograms. First, a thresholding technique was applied to remove non-biological information from the image. Then, the Hough transform was applied to find the boundary of the pectoral muscle, followed by the active contour method, whose seed is placed at the pectoral muscle boundary found by the Hough transform.
An experienced radiologist also performed the pectoral muscle segmentation manually. The two methods, manual and automatic, were compared using the Jaccard index and Bland-Altman statistics. The comparison presented a Jaccard similarity coefficient greater than 90% for all analyzed images, showing the efficiency and accuracy of the proposed segmentation method. The Bland-Altman statistics compared both methods with respect to the area (mm²) of the segmented pectoral muscle and showed data within the 95% confidence interval, supporting the accuracy of the segmentation relative to the manual method. Thus, the method proved accurate and robust, segmenting rapidly and free from intra- and inter-observer variability. It is concluded that the proposed method may be used reliably to segment the pectoral muscle in digital mammography in clinical routine. Segmentation of the pectoral muscle is very important for further quantification of the fibroglandular tissue volume present in the breast.
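The Jaccard index used to compare the manual and automatic masks is straightforward to compute; a minimal numpy sketch on toy binary masks (the masks below are illustrative, not mammogram data):

```python
import numpy as np

def jaccard(mask_a, mask_b):
    """Jaccard similarity |A ∩ B| / |A ∪ B| between two binary masks."""
    a = mask_a.astype(bool)
    b = mask_b.astype(bool)
    union = np.logical_or(a, b).sum()
    if union == 0:
        return 1.0  # both masks empty: define as perfect agreement
    return np.logical_and(a, b).sum() / union

# Toy example: automatic segmentation covers 25 pixels, manual covers
# a 20-pixel subset of them -> Jaccard = 20 / 25 = 0.8
auto = np.zeros((10, 10), dtype=int)
auto[:5, :5] = 1
manual = np.zeros((10, 10), dtype=int)
manual[:5, :4] = 1
score = jaccard(auto, manual)
```

A score above 0.9, as reported in the abstract, means the two masks overlap on at least 90% of their combined area.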

Keywords: active contour, fibroglandular tissue, hough transform, pectoral muscle

Procedia PDF Downloads 337
3008 Occupational Heat Stress Condition According to Wet Bulb Globe Temperature Index in Textile Processing Unit: A Case Study of Surat, Gujarat, India

Authors: Dharmendra Jariwala, Robin Christian

Abstract:

Thermal exposure is a common problem in every manufacturing industry where heat is used in the manufacturing process. In developing countries like India, a lack of awareness regarding proper work environment conditions is observed among workers. Improper planning of the factory building and of the arrangement of machinery, ventilation systems, etc. plays a vital role in the rise of temperature within manufacturing areas. Due to uncontrolled thermal stress, workers may suffer various heat illnesses, from mild disorders to heat stroke. Heat stress is responsible for health risks and for reduced production. The Wet Bulb Globe Temperature (WBGT) index and relative humidity are used to evaluate heat stress conditions. The WBGT index is a weighted average of the natural wet bulb, globe, and dry bulb temperatures, which were measured with a standard QuestTemp 36 area heat stress monitor. In this study, textile processing units were selected in the industrial estate of Surat city. Based on the manufacturing process, six locations were identified within each plant where processes run at 120°C to 180°C: the jet dyeing machine area, stenter machine area, printing machine area, looping machine area, and washing area, all of which generate process heat, plus the office area, selected as a sixth location for comparison. The study was conducted in the winter and summer seasons for both day and night shifts. The results show that the average WBGT index was above the Threshold Limit Value (TLV) during the summer season for day and night shifts in all three units, except in the office area. During the summer season, the highest WBGT index was 32.8°C during the day shift and 31.5°C during the night shift, both at the printing machine area. During the winter season, the highest WBGT index was 30°C and 29.5°C at the printing machine area during the day and night shifts, respectively.
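The weighted average the instrument reports is, in its conventional ISO 7243 form, a fixed linear combination of the three temperatures; a minimal sketch (assuming the monitor follows this standard convention):

```python
def wbgt_outdoor(t_nwb, t_globe, t_dry):
    """Outdoor (solar-load) WBGT in deg C: weighted average of natural
    wet bulb, globe and dry bulb temperatures (ISO 7243 weighting)."""
    return 0.7 * t_nwb + 0.2 * t_globe + 0.1 * t_dry

def wbgt_indoor(t_nwb, t_globe):
    """Indoor (no direct solar load) form: dry bulb term is dropped."""
    return 0.7 * t_nwb + 0.3 * t_globe

# e.g. near a stenter machine: Tnwb 30 C, Tglobe 40 C, Tdry 35 C
index = wbgt_outdoor(30.0, 40.0, 35.0)  # 32.5 C, above typical TLVs
```

The computed index is then compared against work/rest TLVs for the relevant metabolic workload category.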

Keywords: relative humidity, textile industry, thermal stress, WBGT

Procedia PDF Downloads 160
3007 Emotional Processing Difficulties in Recovered Anorexia Nervosa Patients: State or Trait

Authors: Telma Fontao de Castro, Kylee Miller, Maria Xavier Araújo, Isabel Brandao, Sandra Torres

Abstract:

Objective: There is a dearth of research investigating the long-term emotional functioning of individuals recovered from anorexia nervosa (AN). This 15-year longitudinal study aimed to examine whether difficulties in the cognitive processing of emotions persist after long-term AN recovery and how they are linked to anxiety and depression. Method: Twenty-four females, tested longitudinally during their acute and recovered AN phases, and 24 healthy control (HC) women were screened for anxiety, depression, alexithymia, and emotion regulation (ER) difficulties (the latter assessed only in the recovery phase). Results: Anxiety, depression, and alexithymia levels decreased significantly with AN recovery. However, scores on anxiety and on difficulty identifying feelings (an alexithymia factor) remained high compared with the HC group. Scores on emotion regulation difficulties were also lower in the HC group. The abovementioned differences between the recovered AN group and the HC group in difficulty identifying and accepting feelings and in lack of emotional clarity were no longer present once the effects of anxiety and depression were controlled. Conclusions: The findings suggest that emotional dysfunction tends to decrease in the recovered AN phase. However, relative to the HC group, several emotional difficulties remain elevated after long-term AN recovery, in particular limited access to emotion regulation strategies and difficulty controlling impulses and engaging in goal-directed behavior, suggesting a trait vulnerability. In turn, competencies related to emotional clarity and acceptance of emotional responses appear to be state-dependent phenomena linked to anxiety and depression. In sum, managing emotions remains a challenge for individuals recovered from AN, and under this circumstance maladaptive eating behavior can serve an affect-regulatory function, increasing the risk of relapse. Emotional education and stabilization of depressive and anxious symptomatology after recovery emerge as important avenues for protecting against long-term AN relapse.

Keywords: alexithymia, anorexia nervosa, emotion recognition, emotion regulation

Procedia PDF Downloads 113
3006 Solar Calculations of Modified Arch (Semi-Spherical) Type Greenhouse System for Bayburt City

Authors: Uğur Çakir, Erol Şahin, Kemal Çomakli, Ayşegül Çokgez Kuş

Abstract:

Solar energy is regarded as the primary source of all energy on Earth, and it can be used directly or indirectly in many applications, such as agriculture, heating, cooling, or electricity production. Greenhouse cultivation is one of the first agricultural activities in which solar energy is used directly. Greenhouses offer easily controlled conditions suitable for plant growth and are built with a covering material that allows sunlight to enter the system; the covering can be glass, fiberglass, plastic, or another transparent element. This study investigates the solar energy usability and benefit rates of a semi-spherical (modified arch) greenhouse for different orientations and positions under the climatic conditions of Bayburt. The aim is to determine the best orientation and dimensions of a semi-spherical greenhouse so as to obtain the greatest benefit from the sun. To achieve this, a modeling study was carried out in MATLAB. Although the model was run for certain predetermined shapes, it can also be applied to differently shaped greenhouses or buildings. The basic parameters are the greenhouse azimuth angle, the ratio of the long edge to the short edge, and the seasonal solar energy gain of the greenhouse.
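One building block any such model needs is the incidence angle between beam radiation and an oriented surface element of the greenhouse envelope. A hedged sketch of the standard relation (parameter names and the example values are ours, not the authors'):

```python
import math

def cos_incidence(zenith, surf_tilt, sun_azimuth, surf_azimuth):
    """Cosine of the angle between beam radiation and the surface normal.
    All angles in degrees; both azimuths measured from the same reference.
    Negative results mean the sun is behind the surface."""
    z = math.radians(zenith)
    b = math.radians(surf_tilt)
    g = math.radians(sun_azimuth - surf_azimuth)
    return math.cos(z) * math.cos(b) + math.sin(z) * math.sin(b) * math.cos(g)

# Beam irradiance collected per unit area is I_beam * max(cos_incidence, 0),
# summed over the envelope and over the day for a seasonal energy gain.
```

Sweeping `surf_azimuth` over candidate greenhouse orientations and integrating over time is, in essence, the optimization loop such a MATLAB model performs.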

Keywords: greenhousing, solar energy, direct radiation, renewable energy

Procedia PDF Downloads 464
3005 Developing Manufacturing Process for the Graphene Sensors

Authors: Abdullah Faqihi, John Hedley

Abstract:

Biosensors play a significant role in the healthcare sector and in scientific and technological progress. Developing electrodes that are easy to manufacture and deliver better electrochemical performance is advantageous for diagnostics and biosensing. They can be implemented extensively in various analytical tasks such as drug discovery, food safety, medical diagnostics, process control, security and defence, as well as environmental monitoring. A biosensor is a device that inspects the biological and chemical reactions generated by a biological sample: it carries out biological detection via a linked transducer and converts the biological response into an electrical signal. Stability, selectivity, and sensitivity are the dynamic and static characteristics that dictate the quality and performance of biosensors. In this research, an experimental study of the laser scribing technique for processing graphene oxide (GO) inside a vacuum chamber is presented. The effect of laser scribing on the reduction of GO was investigated under two conditions: atmosphere and vacuum. A GO solution was coated onto a LightScribe DVD, and the laser scribing technique was applied to reduce the GO layers and generate rGO. The morphological microstructures of rGO and GO were examined using scanning electron microscopy (SEM) and Raman spectroscopy. The first electrode was a traditional graphene-based electrode made under normal atmospheric conditions, whereas the second was a graphene electrode fabricated under vacuum in a vacuum chamber. The purpose was to control the vacuum conditions, such as air pressure and temperature, during the fabrication process. The parameters assessed include the layer thickness and the processing environment. The results show high accuracy and repeatability, achieved at low production cost.

Keywords: laser scribing, lightscribe DVD, graphene oxide, scanning electron microscopy

Procedia PDF Downloads 100
3004 Molecular Approach for the Detection of Lactic Acid Bacteria in the Kenyan Spontaneously Fermented Milk, Mursik

Authors: John Masani Nduko, Joseph Wafula Matofari

Abstract:

Many spontaneously fermented milk products are produced in Kenya, where they are integral to the human diet and play a central role in enhancing food security and income generation via small-scale enterprises. Fermentation enhances product properties such as taste, aroma, shelf-life, safety, texture, and nutritional value. Some of these products have demonstrated therapeutic and probiotic effects, although recent reports have linked some of them to deaths, biotoxin infections, and esophageal cancer. These products are mostly processed from poor-quality raw materials under unhygienic conditions, resulting in inconsistent product quality and limited shelf-lives. Though the products are very popular, research on their processing technologies is scarce, and none of them has been produced under controlled conditions using starter cultures. To modernize the processing technologies for these products, our study aims to describe the microbiology, biochemistry, and chemical composition of a representative Kenyan spontaneously fermented milk product, Mursik, using modern biotechnology (DNA sequencing). Moreover, co-creation processes reflecting stakeholders' experiences of traditional fermented milk production technologies and utilization, and their ideals and senses of value, will be discussed, allowing products to be developed on common ground for rapid progress. The value of clean starting raw material will be emphasized, the need to define fermentation parameters highlighted, and the use of standard equipment to attain controlled fermentation discussed. This presentation reviews the available information on traditional fermented milk (Mursik) and highlights our current research on the application of molecular approaches (metagenomics) to the valorization of the Mursik production process through the isolation and identification of starter culture/probiotic strains, as well as the quality and safety aspects of the product. The importance of the research and future research areas on the same subject will also be highlighted.

Keywords: lactic acid bacteria, high throughput biotechnology, spontaneous fermentation, Mursik

Procedia PDF Downloads 273
3003 Electrodynamic Principles for Generation and Wireless Transfer of Energy

Authors: Steven D. P. Moore

Abstract:

An electrical discharge in the air induces an electromagnetic (EM) wave capable of wireless transfer, reception, and conversion back into an electrical discharge at a distant location. Following Norton's ground wave principles, EM wave radiation (EMR) propagates parallel to the Earth's surface. Energy in an EMR wave can move through the air and be focused by a receiver to generate a local electrical discharge at a distant location. This local discharge can be amplified and stored, but it also has the propensity to initiate another EMR wave. In addition to typical EM waves, lightning is associated with trans-ionospheric pulse pairs, the most powerful natural EMR signals on the planet: each lightning strike, regardless of global position, emits naturally occurring pulse pairs toward space within a narrow cone. An EMR wave can self-propagate, travels at the speed of light, and, if polarized, has vector properties. If this reflected pulse could be directed by design through structures with an increased probability of lightning strikes, it could theoretically travel near the surface of the Earth at light speed toward a selected receiver for local transformation into electrical energy. Research identifies several influencing parameters that could be modified to model, test, and increase the potential for adopting this technology, toward the goal of developing a global grid that utilizes natural sources of energy.

Keywords: electricity, sparkgap, wireless, electromagnetic

Procedia PDF Downloads 174
3002 Geological Structure Identification in the Semilir Formation: Correlated Geological and Geophysical (Very Low Frequency) Data for Disaster Zonation Using Current Density Parameters and Surface Geological Information

Authors: E. M. Rifqi Wilda Pradana, Bagus Bayu Prabowo, Meida Riski Pujiyati, Efraim Maykhel Hagana Ginting, Virgiawan Arya Hangga Reksa

Abstract:

The VLF (Very Low Frequency) method is an electromagnetic method that uses low frequencies between 10 and 30 kHz, which results in fairly deep penetration. In this study, the VLF method was used for the zonation of disaster-prone areas by identifying geological structures in the form of faults. Data acquisition was carried out in the Trimulyo region, Jetis District, Bantul Regency, Special Region of Yogyakarta, Indonesia, along 8 measurement paths. The study uses wave transmitters in Japan and Australia to obtain tilt and ellipticity values, from which RAE (Rapat Arus Ekuivalen, or equivalent current density) sections are built to identify areas that are easily traversed by electric current. Such a section indicates the existence of a geological structure, a fault, in the study area, characterized by a high RAE value. Processing of the VLF data, performed in Matlab, produced Tilt vs. Ellipticity and Moving Average (MA) Tilt vs. MA Ellipticity graphs for each path, which show a fluctuating pattern without any intersections. Areas with low RAE values of 0%–6% indicate a medium of low conductivity and high resistivity, interpreted as the sandstone, claystone, and tuff lithologies that form part of the Semilir Formation. High RAE values of 10%–16% indicate a medium of high conductivity and low resistivity, interpreted as a fluid-filled fault zone. The existence of the fault zone is supported by the discovery at the surface of a normal fault with strike N55°W and dip 63°E at coordinates X = 433256 and Y = 9127722, so that residents' activities in the zone, such as housing, mining, and other activities, can be avoided to reduce the risk of natural disasters.
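Equivalent current density sections of this kind are conventionally computed by running a short linear filter along the tilt profile; a minimal sketch using the six-point Karous–Hjelt filter (coefficients as published by Karous and Hjelt, 1983; this is an illustration of the general technique, not the authors' Matlab code):

```python
# Karous-Hjelt (1983) filter coefficients; they sum to zero, so a constant
# tilt background maps to zero equivalent current density.
KH = [-0.205, 0.323, -1.446, 1.446, -0.323, 0.205]

def equivalent_current_density(tilt):
    """Apply the six-point Karous-Hjelt filter to a profile of tilt values
    measured at equally spaced stations; each output value is the relative
    equivalent current density at the midpoint of a six-station window."""
    width = len(KH)
    return [sum(c * tilt[i + j] for j, c in enumerate(KH))
            for i in range(len(tilt) - width + 1)]
```

Stacking such filtered profiles for successively stronger smoothing (i.e., for increasing effective depth) is what produces the RAE pseudo-section in which fault zones appear as high-value anomalies.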

Keywords: current density, faults, very low frequency, zonation

Procedia PDF Downloads 156
3001 A Single-Channel BSS-Based Method for Structural Health Monitoring of Civil Infrastructure under Environmental Variations

Authors: Yanjie Zhu, André Jesus, Irwanda Laory

Abstract:

Structural Health Monitoring (SHM), involving data acquisition, data interpretation, and a decision-making system, aims to continuously monitor the structural performance of civil infrastructure under various in-service circumstances. The main value and purpose of SHM is identifying damage through the data interpretation system. Research on SHM has expanded in recent decades, and a large volume of data is recorded every day owing to the dramatic development of sensor techniques and progress in signal processing. However, efficient and reliable data interpretation for damage detection under environmental variations remains a major challenge: structural damage may be masked because variations in the measured data can themselves be the result of environmental variations. This research reports a novel method based on single-channel Blind Signal Separation (BSS), which extracts environmental effects from measured data directly, without any prior knowledge of the structural loading or environmental conditions. Despite successful applications in audio processing and biomedical research, BSS has never been used to detect damage under varying environmental conditions. The proposed method optimizes and combines Ensemble Empirical Mode Decomposition (EEMD), Principal Component Analysis (PCA), and Independent Component Analysis (ICA) to separate, from a single-channel input signal, the structural responses due to the different loading conditions; ICA is applied to the dimension-reduced output of EEMD. A numerical simulation of a truss bridge, inspired by the New Joban Line Arakawa Railway Bridge, is used to validate the method. All results demonstrate that the single-channel BSS-based method can recover temperature effects from the mixed structural response recorded by a single sensor with convincing accuracy. This lays the foundation for further research on direct damage detection under varying environments.
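PCA and ICA require a multichannel input, so single-channel BSS must first recast the one sensor record as a matrix of pseudo-channels. The paper uses EEMD for this step; a common alternative device in the single-channel BSS literature, delay (Hankel) embedding, is simple enough to sketch with the standard library only (illustrative, not the authors' implementation):

```python
def delay_embed(signal, n_channels, step=1):
    """Build a Hankel-style matrix whose rows are lagged copies of the
    single-sensor record, turning one channel into n_channels pseudo-
    channels for subsequent PCA/ICA stages."""
    n = len(signal) - (n_channels - 1) * step
    if n < 1:
        raise ValueError("signal too short for this embedding")
    return [[signal[r * step + c] for c in range(n)]
            for r in range(n_channels)]

# A 5-sample record becomes a 3 x 3 pseudo-multichannel matrix:
M = delay_embed([0, 1, 2, 3, 4], n_channels=3)
```

Each row is the same record shifted by one station in time; PCA then reduces this matrix, and ICA separates the reduced components, mirroring the EEMD-PCA-ICA chain described above.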

Keywords: damage detection, ensemble empirical mode decomposition (EEMD), environmental variations, independent component analysis (ICA), principal component analysis (PCA), structural health monitoring (SHM)

Procedia PDF Downloads 290
3000 Fuel Oxidation Reactions: Pathways and Reactive Intermediates Characterization via Synchrotron Photoionization Mass Spectrometry

Authors: Giovanni Meloni

Abstract:

Recent results are presented from experiments carried out with multiplexed synchrotron photoionization mass spectrometry at the Chemical Dynamics Beamline of the Advanced Light Source (ALS) at Lawrence Berkeley National Laboratory. The reaction mixture and a buffer gas (He) are introduced through individually calibrated mass flow controllers into a quartz slow-flow reactor held at constant pressure and temperature. The gaseous mixture effuses through a 650 μm pinhole into a 1.5 mm skimmer, forming a molecular beam that enters a differentially pumped ionizing chamber. The molecular beam is intersected orthogonally by tunable synchrotron radiation produced by the ALS in the 8–11 eV energy range. The resultant ions are accelerated, collimated, and focused into an orthogonal time-of-flight mass spectrometer. Reaction species are identified by their mass-to-charge ratios and photoionization (PI) spectra; comparison of experimental PI spectra with literature and/or simulated curves is routinely performed to confirm the identity of a given species. With the aid of electronic structure calculations, potential energy surface scans are performed and Franck–Condon spectral simulations are obtained. Examples of these experiments are discussed, ranging from the characterization of new intermediates to the elucidation of reaction mechanisms and the identification of biofuel oxidation pathways.

Keywords: mass spectrometry, reaction intermediates, synchrotron photoionization, oxidation reactions

Procedia PDF Downloads 56
2999 Effect on Surface Temperature Reduction of Asphalt Pavements with Cement–Based Materials Containing Ceramic Waste Powder

Authors: H. Higashiyama, M. Sano, F. Nakanishi, M. Sugiyama, O. Takahashi, S. Tsukuma

Abstract:

The heat island phenomenon has become a significant environmental problem. As a countermeasure in the field of road engineering, cool pavements such as water-retaining pavements and solar-radiation-reflective pavements have been developed to reduce the surface temperature of asphalt pavements in the hot summer climate of Japan. The authors have studied water-retaining pavements with cement-based grouting materials consisting of cement, ceramic waste powder, and natural zeolite; the ceramic waste powder is collected during the recycling of electric porcelain insulators. In this study, the mixing ratio between the ceramic waste powder and the natural zeolite, and the type of cement in the cement-based grouting materials, are investigated by measuring the surface temperature of asphalt pavements outdoors. All of the developed cement-based grouting materials were confirmed to reduce the surface temperature of the asphalt pavements effectively. In particular, the grouting material using ultra-rapid-hardening cement with a 0.7:0.3 mixing ratio of ceramic waste powder to natural zeolite reduced the surface temperature by as much as 20 °C or more.

Keywords: ceramic waste powder, natural zeolite, road surface temperature, water retaining pavements

Procedia PDF Downloads 400
2998 Automatic Furrow Detection for Precision Agriculture

Authors: Manpreet Kaur, Cheol-Hong Min

Abstract:

The increasing advancement of robotics equipped with machine vision sensors applied to precision agriculture offers a promising solution to various problems on agricultural farms. An important issue for the machine vision system is crop row and weed detection. This paper proposes an automatic furrow detection system based on real-time processing for identifying crop rows in maize fields in the presence of weeds. The vision system is designed to be installed on farming vehicles and is therefore subject to vibration and other undesired movements; the images are captured in perspective and affected by these undesired effects. The goal is to identify crop rows for vehicle navigation, including weed removal, where weeds are identified as plants outside the crop rows. Image quality is affected by different lighting conditions and by gaps along the crop rows due to lack of germination and faulty planting. The proposed image processing method consists of four processes. First, image segmentation is based on an HSV (Hue, Saturation, Value) decision tree: the algorithm uses the HSV color space to discriminate crops, weeds, and soil, and the region of interest is defined by filtering each HSV channel between maximum and minimum threshold values. Then, noise in the images is eliminated by means of a hybrid median filter. Next, mathematical morphological operations are applied, i.e., erosion to remove smaller objects followed by dilation to gradually enlarge the boundaries of the foreground regions, which enhances the image contrast. Finally, to accurately detect the position of the crop rows, the region of interest is defined by creating a binary mask, and edge detection and the Hough transform are applied to detect lines represented in polar coordinates, with furrow directions appearing as accumulations along the angle axis in Hough space. The experimental results show that the method is effective.
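The first stage, HSV thresholding, can be sketched with only the standard library; the hue/saturation/value window below is an illustrative guess at a "vegetation" box, since the paper does not give its thresholds:

```python
import colorsys

def classify_pixel(r, g, b):
    """Classify an RGB pixel (0-255 channels) as 'plant' or 'background'
    using a min/max threshold box in HSV space. The greenish hue window
    and the saturation/value floors are illustrative, not the paper's."""
    h, s, v = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
    hue_deg = h * 360.0
    if 60.0 <= hue_deg <= 180.0 and s >= 0.25 and v >= 0.15:
        return "plant"
    return "background"

# A green maize-leaf pixel passes the box; a brown soil pixel does not.
leaf = classify_pixel(40, 180, 50)
soil = classify_pixel(120, 85, 60)
```

Applying this test per pixel yields the binary vegetation mask that the median filter, morphology, and Hough stages then refine into crop-row lines.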

Keywords: furrow detection, morphological, HSV, Hough transform

Procedia PDF Downloads 222
2997 Genetic Data of Deceased People: Solving the Gordian Knot

Authors: Inigo de Miguel Beriain

Abstract:

Genetic data of deceased persons are of great interest for both biomedical research and clinical use. This is due to several reasons. On the one hand, many of our diseases have a genetic component; on the other hand, we share genes with a good part of our biological family. Therefore, it would be possible to improve our response considerably to these pathologies if we could use these data. Unfortunately, at the present moment, the status of data on the deceased is far from being satisfactorily resolved by the EU data protection regulation. Indeed, the General Data Protection Regulation has explicitly excluded these data from the category of personal data. This decision has given rise to a fragmented legal framework on this issue. Consequently, each EU member state offers very different solutions. For instance, Denmark considers the data as personal data of the deceased person for a set period of time while some others, such as Spain, do not consider this data as such, but have introduced some specifically focused regulations on this type of data and their access by relatives. This is an extremely dysfunctional scenario from multiple angles, not least of which is scientific cooperation at the EU level. This contribution attempts to outline a solution to this dilemma through an alternative proposal. Its main hypothesis is that, in reality, health data are, in a sense, a rara avis within data in general because they do not refer to one person but to several. Hence, it is possible to think that all of them can be considered data subjects (although not all of them can exercise the corresponding rights in the same way). When the person from whom the data were obtained dies, the data remain as personal data of his or her biological relatives. Hence, the general regime provided for in the GDPR may apply to them. 
As these are personal data, we could go back to thinking in terms of a general prohibition of data processing, with the exceptions provided for in Article 9(2) and the legal bases included in Article 6. This may be complicated in practice, given that, since we are dealing with data that refer to several data subjects, it may be complex to rely on some of these bases, such as consent. Furthermore, there are theoretical arguments that may oppose this hypothesis. This contribution shows, however, that none of these objections is of sufficient substance to delegitimize the argument presented. The conclusion is therefore that we can indeed build a general framework for the processing of personal data of deceased persons in the context of the GDPR. This would constitute a considerable improvement over the current regulatory framework, although some clarifications will be necessary for its practical application.

Keywords: collective data conceptual issues, data from deceased people, genetic data protection issues, GDPR and deceased people

Procedia PDF Downloads 140
2996 Design Criteria for Achieving Acceptable Indoor Radon Concentration

Authors: T. Valdbjørn Rasmussen

Abstract:

Design criteria for achieving an acceptable indoor radon concentration are presented in this paper. Three criteria are suggested, which must be considered at an early stage of the building design phase to meet the latest recommendations from the World Health Organization in most countries: first, establishing a radon barrier facing the ground; second, lowering the air pressure in the lower zone of the ground-facing slab; and third, diluting the indoor air with outdoor air. The first two criteria prevent radon from infiltrating from the ground, and the third dilutes the indoor air. By combining the three, the indoor radon concentration can be lowered to an acceptable level. In addition, a cheap and reliable method for measuring the radon concentration in indoor air is described. The provision on radon in the Danish Building Regulations complies with the latest recommendations from the World Health Organization. Radon can cause lung cancer, and it is not known whether there is a threshold below which it is harmless to human beings; it is therefore important to reduce the radon concentration in buildings as much as possible. Airtightness is an important factor in buildings: air leakage must be avoided both in the envelope facing the atmosphere, e.g. to comply with energy requirements, and in the envelope facing the ground, to ensure and control the indoor environment. Infiltration of air from the ground underneath a building is the main source of radon in the indoor air.
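The third criterion, dilution with outdoor air, follows a simple well-mixed single-zone mass balance; a hedged sketch (the single-zone assumption and the example numbers are ours, for illustration only):

```python
import math

# Rn-222 decay constant per hour, from its ~3.82-day half-life
RADON_DECAY_PER_H = math.log(2) / (3.82 * 24)

def steady_state_radon(entry_rate_bq_h, volume_m3, ach):
    """Steady-state indoor radon concentration (Bq/m^3) in a well-mixed
    zone: entry from the ground is balanced by ventilation (ach = air
    changes per hour) plus radioactive decay."""
    return entry_rate_bq_h / (volume_m3 * (ach + RADON_DECAY_PER_H))

# Example: 1000 Bq/h entering a 100 m^3 zone at 1.0 ach -> about 10 Bq/m^3;
# doubling the ventilation rate roughly halves the concentration.
c = steady_state_radon(1000.0, 100.0, 1.0)
```

Because decay is slow compared with typical air change rates, ventilation dominates the denominator, which is why dilution works as a third line of defence once the barrier and depressurization limit entry.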

Keywords: radon, natural radiation, barrier, pressure lowering, ventilation

Procedia PDF Downloads 337
2995 The Explanation for Dark Matter and Dark Energy

Authors: Richard Lewis

Abstract:

The following assumptions of the Big Bang theory are challenged and found to be false: the cosmological principle, the assumption that all matter formed at the same time, and the assumption regarding the cause of the cosmic microwave background radiation. The evolution of the universe is described based on the conclusion that the universe is finite with a space boundary. This conclusion is reached by ruling out the possibility of an infinite universe or a universe which is finite with no boundary. In a finite universe, the centre of the universe can be located with reference to our home galaxy (the Milky Way) using the speed relative to the Cosmic Microwave Background (CMB) rest frame and Hubble's law. This places our home galaxy at a distance of approximately 26 million light years from the centre of the universe. Because we are making observations from a point relatively close to the centre of the universe, the universe appears to be isotropic and homogeneous, but this is not the case. The CMB is coming from a source located within the event horizon of the universe; there is sufficient mass in the universe to create an event horizon at the Schwarzschild radius. Galaxies form over time due to the energy released by the expansion of space. Conservation of energy must consider total energy, which is mass (+ve) plus energy (+ve) plus spacetime curvature (-ve), so that the total energy of the universe is always zero. The predominant site of galaxy formation moves over time from the centre of the universe towards the boundary, so that today the majority of new galaxy formation is taking place beyond our horizon of observation at 14 billion light years.

Keywords: cosmology, dark energy, dark matter, evolution of the universe

Procedia PDF Downloads 128
2994 Solar Building Design Using GaAs PV Cells for Optimum Energy Consumption

Authors: Hadis Pouyafar, D. Matin Alaghmandan

Abstract:

Gallium arsenide (GaAs) solar cells are widely used in applications like spacecraft and satellites because they have a high absorption coefficient and efficiency and can withstand high-energy particles such as electrons and protons. With the energy crisis, there is a growing need for efficient and cost-effective solar cells. GaAs cells, with an efficiency of 46% compared with 23% for silicon cells, can be utilized in buildings to achieve nearly zero emissions; in this way, more of the incident solar irradiation can be converted into electricity. The III-V semiconductors used in these cells offer superior performance compared with the other available technologies. Despite these advantages, however, Si cells dominate the market because of their lower price. In our study, we took a software-based approach from the start to gather all the necessary information, aiming to design an optimal building that harnesses the full potential of solar energy. Our modeling results suggest a promising future for GaAs cells. We utilized the Grasshopper plugin for modeling and optimization, and the Ladybug and Honeybee plugins to assess irradiation, weather data, solar energy levels, and other factors. We have shown that silicon solar cells may not always be the best choice for meeting electricity demands, particularly when higher power output is required. Therefore, when power consumption and the surface area available for photovoltaic (PV) installation are considered, it may be necessary to choose more efficient solar cells, such as GaAs. By considering the building requirements and utilizing GaAs technology, we were able to optimize the PV surface area.
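The efficiency-versus-roof-area trade-off behind this argument can be illustrated with a rough sizing relation (the load, sun-hour, and performance-ratio figures below are our illustrative assumptions, not the paper's data; the efficiencies are the ones quoted above):

```python
def pv_area_m2(daily_load_kwh, peak_sun_hours, efficiency, pr=0.8):
    """Rough array area needed to cover a daily load: load divided by the
    daily irradiation captured per m^2 (1 kW/m^2 at peak sun) times cell
    efficiency and a system performance ratio. Illustrative sizing only."""
    return daily_load_kwh / (peak_sun_hours * 1.0 * efficiency * pr)

# For a 30 kWh/day building at 5 peak sun hours, the quoted efficiencies
# imply GaAs needs half the area of silicon for the same output:
area_si   = pv_area_m2(30, 5, 0.23)  # ~32.6 m^2
area_gaas = pv_area_m2(30, 5, 0.46)  # ~16.3 m^2
```

This halving of required area is what makes the higher-efficiency cell attractive when the available facade or roof surface, rather than cost, is the binding constraint.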

Keywords: gallium arsenide (GaAs), optimization, sustainable building, GaAs solar cells

Procedia PDF Downloads 68