Nov. 27, 2023, 2:12 p.m. | Added | 35 | {"external_links": []}
Nov. 20, 2023, 2:03 p.m. | Added | 35 | {"external_links": []}
Nov. 13, 2023, 1:33 p.m. | Added | 35 | {"external_links": []}
Nov. 6, 2023, 1:31 p.m. | Added | 35 | {"external_links": []}
Aug. 14, 2023, 1:30 p.m. | Added | 35 | {"external_links": []}
Aug. 7, 2023, 1:31 p.m. | Added | 35 | {"external_links": []}
July 31, 2023, 1:34 p.m. | Added | 35 | {"external_links": []}
July 24, 2023, 1:35 p.m. | Added | 35 | {"external_links": []}
July 17, 2023, 1:34 p.m. | Added | 35 | {"external_links": []}
July 10, 2023, 1:26 p.m. | Added | 35 | {"external_links": []}
July 3, 2023, 1:26 p.m. | Added | 35 | {"external_links": []}
June 26, 2023, 1:25 p.m. | Added | 35 | {"external_links": []}
June 19, 2023, 1:27 p.m. | Added | 35 | {"external_links": []}
June 12, 2023, 1:29 p.m. | Added | 35 | {"external_links": []}
June 5, 2023, 1:33 p.m. | Added | 35 | {"external_links": []}
May 29, 2023, 1:27 p.m. | Added | 35 | {"external_links": []}
May 22, 2023, 1:29 p.m. | Added | 35 | {"external_links": []}
May 15, 2023, 1:31 p.m. | Added | 35 | {"external_links": []}
May 8, 2023, 1:37 p.m. | Added | 35 | {"external_links": []}
May 1, 2023, 1:27 p.m. | Added | 35 | {"external_links": []}
April 24, 2023, 1:34 p.m. | Added | 35 | {"external_links": []}
April 17, 2023, 1:28 p.m. | Added | 35 | {"external_links": []}
April 10, 2023, 1:24 p.m. | Added | 35 | {"external_links": []}
April 3, 2023, 1:26 p.m. | Added | 35 | {"external_links": []}
Jan. 28, 2023, 11:08 a.m. | Created | 43 | [{"model": "core.projectfund", "pk": 26995, "fields": {"project": 4188, "organisation": 7, "amount": 545042, "start_date": "2008-02-01", "end_date": "2011-01-31", "raw_data": 42845}}]
Jan. 28, 2023, 10:52 a.m. | Added | 35 | {"external_links": []}
April 11, 2022, 3:46 a.m. | Created | 43 | [{"model": "core.projectfund", "pk": 19103, "fields": {"project": 4188, "organisation": 7, "amount": 545042, "start_date": "2008-02-01", "end_date": "2011-01-31", "raw_data": 19650}}]
April 11, 2022, 3:46 a.m. | Created | 41 | [{"model": "core.projectorganisation", "pk": 72850, "fields": {"project": 4188, "organisation": 5632, "role": "COLLAB_ORG"}}]
April 11, 2022, 3:46 a.m. | Created | 41 | [{"model": "core.projectorganisation", "pk": 72849, "fields": {"project": 4188, "organisation": 23, "role": "LEAD_ORG"}}]
April 11, 2022, 3:46 a.m. | Created | 40 | [{"model": "core.projectperson", "pk": 44817, "fields": {"project": 4188, "person": 6090, "role": "RESEARCH_COI_PER"}}]
April 11, 2022, 3:46 a.m. | Created | 40 | [{"model": "core.projectperson", "pk": 44816, "fields": {"project": 4188, "person": 5967, "role": "PI_PER"}}]
April 11, 2022, 1:47 a.m. | Updated | 35 | {"title": ["", "How does the Drosophila brain compute and see visual motion?"], "description": ["", "\nAnimals have neural mechanisms for detecting visual motion, enabling them to infer the speed and direction of objects that move in visual scenes. With its perceptual advantages the ability to detect motion has shaped the organisation and function of visual systems. However, the way in which the visual systems process and route motion information has proven to be a difficult problem to decipher. This proposal aims to elucidate the functional organisation of neural networks responsible for encoding visual motion in the brain of the fruit fly, Drosophila. We have recently developed an extremely versatile Drosophila preparation that enables us to visualise in real time how a specialised web of motion sensitive neurones (LPTCs) in the brains of transgenic flies translate moving images in the scene into neural activity patterns (calcium and voltage signals). These flies have genetically engineered eyes that are sensitive to ultraviolet (UV) light and brains that express green-sensitive fluorescence proteins (optical reporters) that react to changes in the neural activity (here calcium changes) in LPTCs. Since the spectral sensitivities of the eye and optical reporters do not overlap, we can visualise neural activity in the LPTCs when such a fly looks at moving objects, being oblivious of us simultaneously scanning its brain. In order to fully utilise this novel preparation requires the generation of a unique hybrid experimental apparatus that can visualise calcium signals and measure voltage responses with sharp microelectrodes simultaneously. For live imaging the flies will be placed in this apparatus in which they are presented with moving UV-light patterns while calcium and voltage signals are monitored from the LPTCs. Using this system, together with further genetic modifications in the eyes and the brain of the flies, we wish to investigate how visual motion signals are routed and processed by the fly's visual system. Here we plan to find answers to two important open questions. What is the contribution of different photoreceptor types in routing visual information to the brain so that the speed and direction of objects moving in the scene can be inferred? and what is the contribution of attentional signals from the brain in the visual motion processing? These questions will be studied by monitoring changes in calcium and voltage signals of LPTCs in transgenic flies in which selective neural pathways from the eyes or from the brain can be turned on and off, using temperature-sensitive genetic switches. Furthermore, the visual behaviour of the same flies will be characterised in a flight simulator system running in our laboratory. In this way we shall be able to correlate the genetically targeted changes in the routing and processing of visual motion information to the animal behaviour and cognitive phenomena. In a parallel approach, the results from these experiments will be analysed and modelled mathematically to find answers to the open question of how moving visual objects in the scene are encoded into moving neural images, as represented by activity patterns of networks of interconnected neurones in the brain.\n\n"], "extra_text": ["", "\nTechnical Abstract:\nWe wish to study how motion-sensitive neurones (LPTCs) in the Drosophila visual ganglia perform visual motion computations. The relative simplicity and the genetically malleable connections to LPTCs from the eyes and the brain can help us to make sense of the coding strategies of these neurones. We have recently generated transgenic flies that are predominantly UV-sensitive and in which a sub-group of LPTCs express Ca2+-reporters, and with these flies established a way to record intracellular voltage and Ca2+ signals in LTPCs to moving patterns in vivo. By genetically shifting the sensitivity of photoreceptors to UV-range their excitation spectra do not overlap with the fluorescence of the genetic reporters; live-imaging of Ca2+ signalling in LPTCs does not influence the visual functions of these flies. Building on these efforts, we wish to now dissect the bottom-up and top-down connections to LPTCs, and their respective roles in computing neural representations of motion signals. This we plan to do by recording changes in voltage and Ca2+ signals of LPTCs to moving UV-stimuli while switching on/off different synaptic inputs to them, using temperature-sensitive GAL4-UASshiTS1 system. A central requirement for this project is the ability to simultaneously monitor and correlate changes in Ca2+ and voltage signals of individual LPTCs with changes in the network environment. This cannot be sufficiently achieved with current instrumental set-ups, hence, we propose to construct a hybrid instrument that combines 2-photon scanning microscopy and electrophysiology, capable of simultaneous measurements of voltage and Ca2+ signals in the tissue depths and spatial resolution that are required here. Furthermore, to correlate the changes in the network activity to the fly's optomotor behaviour, the same flies will be tested in a flight simulator system. The results and analysis of these experiments will be used for making realistic mathematical models of motion detection\n\n\n\n"], "status": ["", "Closed"]}
April 11, 2022, 1:47 a.m. | Added | 35 | {"external_links": [15897]}
April 11, 2022, 1:47 a.m. | Created | 35 | [{"model": "core.project", "pk": 4188, "fields": {"owner": null, "is_locked": false, "coped_id": "6a902a4f-a365-4abb-b18a-79c0549dc683", "title": "", "description": "", "extra_text": "", "status": "", "start": null, "end": null, "raw_data": 19565, "created": "2022-04-11T01:38:06.124Z", "modified": "2022-04-11T01:38:06.124Z", "external_links": []}}]
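The "Created" rows above store their payloads in Django's fixture serialisation format: a JSON list of objects with "model", "pk", and "fields" keys (the "Updated" row instead stores {field: [old, new]} pairs). As a minimal sketch only, the snippet below parses one such payload copied verbatim from the Jan. 28, 2023 entry; the variable names are illustrative assumptions and not part of the log itself.

```python
import json

# A "Created" payload copied from the history above. Assumption: payloads are
# plain JSON text and can be parsed directly with json.loads.
payload = (
    '[{"model": "core.projectfund", "pk": 26995, "fields": {"project": 4188, '
    '"organisation": 7, "amount": 545042, "start_date": "2008-02-01", '
    '"end_date": "2011-01-31", "raw_data": 42845}}]'
)

# Each record describes one object: which model it belongs to, its primary key,
# and the field values it was created with.
for record in json.loads(payload):
    fields = record["fields"]
    print(f'{record["model"]} #{record["pk"]}: project {fields["project"]}, '
          f'amount {fields["amount"]}, {fields["start_date"]} to {fields["end_date"]}')
```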